Sample records for kernel black point

  1. Hyperspectral imaging for detection of black tip damage in wheat kernels

    NASA Astrophysics Data System (ADS)

    Delwiche, Stephen R.; Yang, I.-Chang; Kim, Moon S.

    2009-05-01

    A feasibility study was conducted on the use of hyperspectral imaging to differentiate sound wheat kernels from those with the fungal condition called black point or black tip. Individual kernels of hard red spring wheat were loaded in indented slots on a blackened machined aluminum plate. Damage conditions, determined by official (USDA) inspection, were either sound (no damage) or damaged by the black tip condition alone. Hyperspectral imaging was separately performed under modes of reflectance from white light illumination and fluorescence from UV light (~380 nm) illumination. By cursory inspection of wavelength images, one fluorescence wavelength (531 nm) was selected for image processing and classification analysis. Results indicated that with this one wavelength alone, classification accuracy can be as high as 95% when kernels are oriented with their dorsal side toward the camera. It is suggested that improvement in classification can be made through the inclusion of multiple wavelength images.

  2. Ambered kernels in stenospermocarpic fruit of eastern black walnut

    Treesearch

    Michele R. Warmund; J.W. Van Sambeek

    2014-01-01

    "Ambers" is a term used to describe poorly filled, shriveled eastern black walnut (Juglans nigra L.) kernels with a dark brown or black-colored pellicle that are unmarketable. Studies were conducted to determine the incidence of ambered black walnut kernels and to ascertain when symptoms were apparent in specific tissues. The occurrence of...

  3. A comparison of skyshine computational methods.

    PubMed

    Hertel, Nolan E; Sweezy, Jeremy E; Shultis, J Kenneth; Warkentin, J Karl; Rose, Zachary J

    2005-01-01

    A variety of methods employing radiation transport and point-kernel codes have been used to model two skyshine problems. The first problem is a 1 MeV point source of photons on the surface of the earth inside a 2 m tall and 1 m radius silo having black walls. The skyshine radiation downfield from the point source was estimated with and without a 30-cm-thick concrete lid on the silo. The second benchmark problem is to estimate the skyshine radiation downfield from 12 cylindrical canisters emplaced in a low-level radioactive waste trench. The canisters are filled with ion-exchange resin with a representative radionuclide loading, largely 60Co, 134Cs and 137Cs. The solution methods include use of the MCNP code to solve the problem by directly employing variance reduction techniques, the single-scatter point kernel code GGG-GP, the QADMOD-GP point kernel code, the COHORT Monte Carlo code, the NAC International version of the SKYSHINE-III code, the KSU hybrid method and the associated KSU skyshine codes.

  4. Vis- and NIR-based instruments for detection of black-tip damaged wheat kernels: A comparative study

    USDA-ARS's Scientific Manuscript database

    Black-tip (BT) is a non-mycotoxic fungal condition of wheat kernels in which any of a number of molds forms a dark brown or black sooty growth at the tip of the kernel. Three spectrometers covering the spectral ranges 950-1636 nm (Spec1), 600-1045 nm (Spec2), and 380-780 nm (S...

  5. Detecting and Segregating Black Tip-Damaged Wheat Kernels Using Visible and Near Infrared Spectroscopy

    USDA-ARS's Scientific Manuscript database

    Detection of individual wheat kernels with black tip symptom (BTS) and black tip damage (BTD) was demonstrated using near infrared reflectance spectroscopy (NIRS) and silicon light-emitting-diode (LED) based instruments. The two instruments tested, a single kernel near-infrared spectroscopy instrume...

  6. 7 CFR 810.202 - Definition of other terms.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... GRAIN United States Standards for Barley Terms Defined § 810.202 Definition of other terms. (a) Black barley. Barley with black hulls. (b) Broken kernels. Barley with more than 1/4 of the kernel removed. (c... barley kernels, other grains, and wild oats that are badly shrunken and distinctly discolored black or...

  7. Evaluation of various carbon blacks and dispersing agents for use in the preparation of uranium microspheres with carbon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, Rodney Dale; Johnson, Jared A.; Collins, Jack Lee

    A comparison study on carbon blacks and dispersing agents was performed to determine their impacts on the final properties of uranium fuel kernels with carbon. The main target compositions in this internal gelation study were 10 and 20 mol % uranium dicarbide (UC2), which is UC1.86, with the balance uranium dioxide. After heat treatment at 1900 K in flowing carbon monoxide in argon for 12 h, the density of the kernels produced using an X-energy proprietary carbon suspension, which is commercially available, ranged from 96% to 100% of theoretical density (TD), with full conversion of UC to UC2 at both carbon concentrations. However, higher carbon concentrations such as a 2.5 mol ratio of carbon to uranium in the feed solutions failed to produce gel spheres with the proprietary carbon suspension. The kernels using our former baseline of Mogul L carbon black and Tamol SN were 90–92% of TD with full conversion of UC to UC2 at a variety of carbon levels. Raven 5000 carbon black and Tamol SN were used to produce 10 mol % UC2 kernels with 95% of TD. However, an increase in the Raven 5000 concentration led to a kernel density below 90% of TD. Raven 3500 carbon black and Tamol SN were used to make very dense kernels without complete conversion to UC2. Lastly, the selection of the carbon black and dispersing agent is highly dependent on the desired final properties of the target kernels.

  8. Evaluation of various carbon blacks and dispersing agents for use in the preparation of uranium microspheres with carbon

    NASA Astrophysics Data System (ADS)

    Hunt, R. D.; Johnson, J. A.; Collins, J. L.; McMurray, J. W.; Reif, T. J.; Brown, D. R.

    2018-01-01

    A comparison study on carbon blacks and dispersing agents was performed to determine their impacts on the final properties of uranium fuel kernels with carbon. The main target compositions in this internal gelation study were 10 and 20 mol % uranium dicarbide (UC2), which is UC1.86, with the balance uranium dioxide. After heat treatment at 1900 K in flowing carbon monoxide in argon for 12 h, the density of the kernels produced using an X-energy proprietary carbon suspension, which is commercially available, ranged from 96% to 100% of theoretical density (TD), with full conversion of UC to UC2 at both carbon concentrations. However, higher carbon concentrations such as a 2.5 mol ratio of carbon to uranium in the feed solutions failed to produce gel spheres with the proprietary carbon suspension. The kernels using our former baseline of Mogul L carbon black and Tamol SN were 90-92% of TD with full conversion of UC to UC2 at a variety of carbon levels. Raven 5000 carbon black and Tamol SN were used to produce 10 mol % UC2 kernels with 95% of TD. However, an increase in the Raven 5000 concentration led to a kernel density below 90% of TD. Raven 3500 carbon black and Tamol SN were used to make very dense kernels without complete conversion to UC2. The selection of the carbon black and dispersing agent is highly dependent on the desired final properties of the target kernels.

  9. Evaluation of various carbon blacks and dispersing agents for use in the preparation of uranium microspheres with carbon

    DOE PAGES

    Hunt, Rodney Dale; Johnson, Jared A.; Collins, Jack Lee; ...

    2017-10-12

    A comparison study on carbon blacks and dispersing agents was performed to determine their impacts on the final properties of uranium fuel kernels with carbon. The main target compositions in this internal gelation study were 10 and 20 mol % uranium dicarbide (UC2), which is UC1.86, with the balance uranium dioxide. After heat treatment at 1900 K in flowing carbon monoxide in argon for 12 h, the density of the kernels produced using an X-energy proprietary carbon suspension, which is commercially available, ranged from 96% to 100% of theoretical density (TD), with full conversion of UC to UC2 at both carbon concentrations. However, higher carbon concentrations such as a 2.5 mol ratio of carbon to uranium in the feed solutions failed to produce gel spheres with the proprietary carbon suspension. The kernels using our former baseline of Mogul L carbon black and Tamol SN were 90–92% of TD with full conversion of UC to UC2 at a variety of carbon levels. Raven 5000 carbon black and Tamol SN were used to produce 10 mol % UC2 kernels with 95% of TD. However, an increase in the Raven 5000 concentration led to a kernel density below 90% of TD. Raven 3500 carbon black and Tamol SN were used to make very dense kernels without complete conversion to UC2. Lastly, the selection of the carbon black and dispersing agent is highly dependent on the desired final properties of the target kernels.

  10. Brachypodium distachyon-Cochliobolus sativus pathosystem is a new model for studying plant-fungal interactions in cereal crops

    USDA-ARS's Scientific Manuscript database

    Cochliobolus sativus (anamorph: Bipolaris sorokiniana) causes three major diseases in barley and wheat, including spot blotch, common root rot and kernel blight or black point. These diseases significantly reduce the yield and quality of the two most important cereal crops in the US and other region...

  11. 7 CFR 810.202 - Definition of other terms.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... barley kernels, other grains, and wild oats that are badly shrunken and distinctly discolored black or... kernels. Kernels and pieces of barley kernels that are distinctly indented, immature or shrunken in...

  12. 7 CFR 810.202 - Definition of other terms.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... barley kernels, other grains, and wild oats that are badly shrunken and distinctly discolored black or... kernels. Kernels and pieces of barley kernels that are distinctly indented, immature or shrunken in...

  13. 7 CFR 810.202 - Definition of other terms.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... barley kernels, other grains, and wild oats that are badly shrunken and distinctly discolored black or... kernels. Kernels and pieces of barley kernels that are distinctly indented, immature or shrunken in...

  14. Analysis and Implementation of Particle-to-Particle (P2P) Graphics Processor Unit (GPU) Kernel for Black-Box Adaptive Fast Multipole Method

    DTIC Science & Technology

    2015-06-01

    5110P and 16 dx360M4 nodes each with one NVIDIA Kepler K20M/K40M GPU. Each node contained dual Intel Xeon E5-2670 (Sandy Bridge) central processing...kernel and as such does not employ multiple processors. This work makes use of a single processing core and a single NVIDIA Kepler K40 GK110...bandwidth (2 × 16 slot), 7.877 GFloat/s; Kepler K40 peak, 4,290 × 1 billion floating-point operations (GFLOPs), and 288 GB/s Kepler K40 memory

  15. 7 CFR 51.1450 - Serious damage.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...; (c) Decay affecting any portion of the kernel; (d) Insects, web, or frass or any distinct evidence of insect feeding on the kernel; (e) Internal discoloration which is dark gray, dark brown, or black and...) Dark kernel spots when more than three are on the kernel, or when any dark kernel spot or the aggregate...

  16. 7 CFR 51.1450 - Serious damage.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...; (c) Decay affecting any portion of the kernel; (d) Insects, web, or frass or any distinct evidence of insect feeding on the kernel; (e) Internal discoloration which is dark gray, dark brown, or black and...) Dark kernel spots when more than three are on the kernel, or when any dark kernel spot or the aggregate...

  17. 7 CFR 51.1450 - Serious damage.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...; (c) Decay affecting any portion of the kernel; (d) Insects, web, or frass or any distinct evidence of insect feeding on the kernel; (e) Internal discoloration which is dark gray, dark brown, or black and...) Dark kernel spots when more than three are on the kernel, or when any dark kernel spot or the aggregate...

  18. 7 CFR 51.2560 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... are excessively thin kernels and can have black, brown or gray surface with a dark interior color and the immaturity has adversely affected the flavor of the kernel. (2) Kernel spotting refers to dark brown or dark gray spots aggregating more than one-eighth of the surface of the kernel. (g) Serious...

  19. 7 CFR 51.2560 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... are excessively thin kernels and can have black, brown or gray surface with a dark interior color and the immaturity has adversely affected the flavor of the kernel. (2) Kernel spotting refers to dark brown or dark gray spots aggregating more than one-eighth of the surface of the kernel. (g) Serious...

  20. 7 CFR 51.2560 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... have black, brown or gray surface with a dark interior color and the immaturity has adversely affected the flavor of the kernel. (2) Kernel spotting refers to dark brown or dark gray spots aggregating more... the kernel shows conspicuous evidence of feeding. (3) Insect damage is an insect, insect fragment, web...

  21. 7 CFR 51.2560 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... have black, brown or gray surface with a dark interior color and the immaturity has adversely affected the flavor of the kernel. (2) Kernel spotting refers to dark brown or dark gray spots aggregating more... the kernel shows conspicuous evidence of feeding. (3) Insect damage is an insect, insect fragment, web...

  22. 7 CFR 51.2560 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... have black, brown or gray surface with a dark interior color and the immaturity has adversely affected the flavor of the kernel. (2) Kernel spotting refers to dark brown or dark gray spots aggregating more... the kernel shows conspicuous evidence of feeding. (3) Insect damage is an insect, insect fragment, web...

  23. 7 CFR 51.1450 - Serious damage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... not be classed as rancidity; (c) Decay affecting any portion of the kernel; (d) Insects, web, or frass or any distinct evidence of insect feeding on the kernel; (e) Internal discoloration which is dark gray, dark brown, or black and extends more than one-third the length of the half-kernel or piece; (f...

  24. 7 CFR 51.1450 - Serious damage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... not be classed as rancidity; (c) Decay affecting any portion of the kernel; (d) Insects, web, or frass or any distinct evidence of insect feeding on the kernel; (e) Internal discoloration which is dark gray, dark brown, or black and extends more than one-third the length of the half-kernel or piece; (f...

  25. Oil point and mechanical behaviour of oil palm kernels in linear compression

    NASA Astrophysics Data System (ADS)

    Kabutey, Abraham; Herak, David; Choteborsky, Rostislav; Mizera, Čestmír; Sigalingging, Riswanti; Akangbe, Olaosebikan Layi

    2017-07-01

    The study described the oil point and mechanical properties of roasted and unroasted bulk oil palm kernels under compression loading, a topic on which very little information is available in the literature. A universal compression testing machine with a 60 mm diameter vessel and plunger was used, applying a maximum force of 100 kN at speeds ranging from 5 to 25 mm min-1. The initial pressing height of the bulk kernels was 40 mm. The oil point was determined by a litmus test at each deformation level of 5, 10, 15, 20, and 25 mm at the minimum speed of 5 mm min-1. The measured parameters were the deformation, deformation energy, oil yield, oil point strain and oil point pressure. The roasted bulk kernels required less deformation energy than the unroasted kernels to recover the kernel oil; however, neither was permanently deformed. The average oil point strain was 0.57. The study contributes to the pursuit of innovative methods for processing palm kernel oil in rural areas of developing countries.

  26. Effect of dietary fiber on the activity of intestinal and fecal beta-glucuronidase activity during 1,2-dimethylhydrazine induced colon carcinogenesis.

    PubMed

    Manoj, G; Thampi, B S; Leelamma, S; Menon, P V

    2001-01-01

    The effects of fiber isolated from black gram (Phaseolus mungo) and coconut (Cocos nucifera) kernel on intestinal and fecal beta-glucuronidase activity during 1,2-dimethylhydrazine-induced colon carcinogenesis were studied. The results indicated that the inclusion of fiber from black gram and coconut kernel generally supported lower specific activities and less fecal output of beta-glucuronidase than did the fiber-free diet. This study suggests that fibers isolated from coconut or black gram may play a role in preventing the formation of colon tumors induced by the carcinogen 1,2-dimethylhydrazine by reducing the activity of intestinal as well as fecal beta-glucuronidase.

  27. Anthocyanin composition and oxygen radical scavenging capacity (ORAC) of milled and pearled purple, black, and common barley.

    PubMed

    Bellido, Guillermo G; Beta, Trust

    2009-02-11

    The importance of anthocyanins to the total antioxidant capacity of various fruits and vegetables has been well established, but less attention has been focused on cereal grains. This study investigated the antioxidant capacity and anthocyanin composition of a bran-rich pearling fraction (10% outer kernel layers) and whole kernel flour of purple (CI-1248), black (PERU-35), and yellow (EX-83) barley genotypes. HPLC analysis showed that as much as 6 times more anthocyanin per unit weight (microg/g) was present in the bran-rich fractions of yellow and purple barley (1587 and 3534, respectively) than in their corresponding whole kernel flours (210 and 573, respectively). Delphinidin 3-glucoside, delphinidin 3-rutinoside, cyanidin 3-glucoside, petunidin 3-glucoside, and cyanidin chloride were positively identified in barley, with as many as 9 and 15 anthocyanins being detected in yellow and purple barley, respectively. Antioxidant activity analysis showed that the ORAC values for the bran-rich fractions were significantly (p < 0.05) higher than for the whole kernel flour.

  28. Kernel K-Means Sampling for Nyström Approximation.

    PubMed

    He, Li; Zhang, Hong

    2018-05-01

    A fundamental problem in Nyström-based kernel matrix approximation is the sampling method by which the training set is built. In this paper, we suggest using kernel k-means sampling, which is shown in our works to minimize the upper bound of a matrix approximation error. We first propose a unified kernel matrix approximation framework, which is able to describe most existing Nyström approximations under many popular kernels, including the Gaussian kernel and the polynomial kernel. We then show that the matrix approximation error upper bound, in terms of the Frobenius norm, is equal to the k-means error of data points in kernel space plus a constant. Thus, the k-means centers of data in kernel space, or the kernel k-means centers, are the optimal representative points with respect to the Frobenius norm error upper bound. Experimental results, with both Gaussian kernel and polynomial kernel, on real-world data sets and image segmentation tasks show the superiority of the proposed method over the state-of-the-art methods.
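
    For readers who want to experiment with the idea, here is a minimal sketch of a Nystrom approximation whose landmark points are chosen by k-means; for the Gaussian kernel, ordinary k-means centers in input space are used as a stand-in for the kernel k-means centers discussed above. This is an illustration under those assumptions, not the authors' code, and the function names and the use of NumPy/scikit-learn are ours.

      # Sketch: Nystrom approximation of an RBF kernel matrix with k-means landmarks
      # (a stand-in for kernel k-means sampling; illustrative only).
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics.pairwise import rbf_kernel

      def nystrom_rbf(X, n_landmarks=50, gamma=0.1, seed=0):
          # Landmarks = k-means centers of the data (an approximation to the
          # kernel k-means centers when the kernel is Gaussian).
          km = KMeans(n_clusters=n_landmarks, n_init=10, random_state=seed).fit(X)
          L = km.cluster_centers_
          C = rbf_kernel(X, L, gamma=gamma)       # n x m cross-kernel
          W = rbf_kernel(L, L, gamma=gamma)       # m x m landmark kernel
          return C @ np.linalg.pinv(W) @ C.T      # K is approximated by C W^+ C^T

      X = np.random.RandomState(0).randn(500, 10)
      K_exact = rbf_kernel(X, X, gamma=0.1)
      K_approx = nystrom_rbf(X)
      print(np.linalg.norm(K_exact - K_approx, "fro") / np.linalg.norm(K_exact, "fro"))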

  29. Out-of-Sample Extensions for Non-Parametric Kernel Methods.

    PubMed

    Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang

    2017-02-01

    Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies were devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information of the data, which may potentially characterize the data similarity better. The kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension for the out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make the nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over the state-of-the-art parametric kernel methods.

  30. LoCoH: Non-parametric kernel methods for constructing home ranges and utilization distributions

    USGS Publications Warehouse

    Getz, Wayne M.; Fortmann-Roe, Scott; Cross, Paul C.; Lyons, Andrew J.; Ryan, Sadie J.; Wilmers, Christopher C.

    2007-01-01

    Parametric kernel methods currently dominate the literature regarding the construction of animal home ranges (HRs) and utilization distributions (UDs). These methods frequently fail to capture the kinds of hard boundaries common to many natural systems. Recently a local convex hull (LoCoH) nonparametric kernel method, which generalizes the minimum convex polygon (MCP) method, was shown to be more appropriate than parametric kernel methods for constructing HRs and UDs, because of its ability to identify hard boundaries (e.g., rivers, cliff edges) and convergence to the true distribution as sample size increases. Here we extend the LoCoH in two ways: "fixed sphere-of-influence," or r-LoCoH (kernels constructed from all points within a fixed radius r of each reference point), and an "adaptive sphere-of-influence," or a-LoCoH (kernels constructed from all points within a radius a such that the distances of all points within the radius to the reference point sum to a value less than or equal to a), and compare them to the original "fixed-number-of-points," or k-LoCoH (all kernels constructed from k-1 nearest neighbors of root points). We also compare these nonparametric LoCoH to parametric kernel methods using manufactured data and data collected from GPS collars on African buffalo in the Kruger National Park, South Africa. Our results demonstrate that LoCoH methods are superior to parametric kernel methods in estimating areas used by animals, excluding unused areas (holes) and, generally, in constructing UDs and HRs arising from the movement of animals influenced by hard boundaries and irregular structures (e.g., rocky outcrops). We also demonstrate that a-LoCoH is generally superior to k- and r-LoCoH (with software for all three methods available at http://locoh.cnr.berkeley.edu).
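
    As a concrete illustration of the k-LoCoH construction described above, the following is a simplified sketch, not the authors' implementation: shapely and SciPy are assumed to be available, and isopleth handling is reduced to a single coverage threshold.

      # Sketch: simplified k-LoCoH home-range estimate (union of local convex hulls).
      import numpy as np
      from scipy.spatial import cKDTree
      from shapely.geometry import MultiPoint, Point
      from shapely.ops import unary_union

      def k_locoh(points, k=10, isopleth=0.95):
          tree = cKDTree(points)
          _, idx = tree.query(points, k=k)            # each root point plus k-1 neighbours
          hulls = sorted((MultiPoint(points[i]).convex_hull for i in idx),
                         key=lambda h: h.area)        # smallest local hulls first
          union = None
          for h in hulls:
              union = h if union is None else unary_union([union, h])
              covered = sum(union.intersects(Point(p[0], p[1])) for p in points)
              if covered / len(points) >= isopleth:   # stop at the requested isopleth
                  break
          return union

      pts = np.random.RandomState(1).rand(300, 2)
      print(k_locoh(pts, k=8).area)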

  31. Improvements to the kernel function method of steady, subsonic lifting surface theory

    NASA Technical Reports Server (NTRS)

    Medan, R. T.

    1974-01-01

    The application of a kernel function lifting surface method to three dimensional, thin wing theory is discussed. A technique for determining the influence functions is presented. The technique is shown to require fewer quadrature points, while still calculating the influence functions accurately enough to guarantee convergence with an increasing number of spanwise quadrature points. The method also treats control points on the wing leading and trailing edges. The report introduces and employs an aspect of the kernel function method which apparently has never been used before and which significantly enhances the efficiency of the kernel function approach.

  32. Predicting complex traits using a diffusion kernel on genetic markers with an application to dairy cattle and wheat data

    PubMed Central

    2013-01-01

    Background: Arguably, genotypes and phenotypes may be linked in functional forms that are not well addressed by the linear additive models that are standard in quantitative genetics. Therefore, developing statistical learning models for predicting phenotypic values from all available molecular information that are capable of capturing complex genetic network architectures is of great importance. Bayesian kernel ridge regression is a non-parametric prediction model proposed for this purpose. Its essence is to create a spatial distance-based relationship matrix called a kernel. Although the set of all single nucleotide polymorphism genotype configurations on which a model is built is finite, past research has mainly used a Gaussian kernel. Results: We sought to investigate the performance of a diffusion kernel, which was specifically developed to model discrete marker inputs, using Holstein cattle and wheat data. This kernel can be viewed as a discretization of the Gaussian kernel. The predictive ability of the diffusion kernel was similar to that of non-spatial distance-based additive genomic relationship kernels in the Holstein data, but outperformed the latter in the wheat data. However, the difference in performance between the diffusion and Gaussian kernels was negligible. Conclusions: It is concluded that the ability of a diffusion kernel to capture the total genetic variance is not better than that of a Gaussian kernel, at least for these data. Although the diffusion kernel as a choice of basis function may have potential for use in whole-genome prediction, our results imply that embedding genetic markers into a non-Euclidean metric space has very small impact on prediction. Our results suggest that use of the black box Gaussian kernel is justified, given its connection to the diffusion kernel and its similar predictive performance. PMID:23763755
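
    A minimal sketch of the kind of kernel-based whole-genome prediction compared here is plain kernel ridge regression (the non-Bayesian analogue of the RKHS regression such studies use) with a Gaussian kernel on marker genotypes; the 0/1/2 coding, bandwidth and variable names are illustrative assumptions, not the paper's settings.

      # Sketch: kernel ridge regression on marker genotypes with a Gaussian kernel.
      import numpy as np

      def gaussian_kernel(X, Z, bandwidth):
          d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)   # squared distances
          return np.exp(-d2 / (2.0 * bandwidth ** 2))

      def krr_fit_predict(X_train, y_train, X_test, bandwidth=25.0, lam=1.0):
          K = gaussian_kernel(X_train, X_train, bandwidth)
          alpha = np.linalg.solve(K + lam * np.eye(len(y_train)), y_train)
          return gaussian_kernel(X_test, X_train, bandwidth) @ alpha

      rng = np.random.default_rng(0)
      X = rng.integers(0, 3, size=(200, 500)).astype(float)     # 0/1/2 genotype codes
      y = X[:, :10].sum(1) + rng.normal(0, 1, 200)              # toy phenotype
      print(krr_fit_predict(X[:150], y[:150], X[150:]).shape)   # predictions for the last 50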

  33. Mycoflora and mycotoxins in Brazilian black pepper, white pepper and Brazil nuts.

    PubMed

    Freire, F C; Kozakiewicz, Z; Paterson, R R

    2000-01-01

    A wide range of field and storage fungi were isolated from black pepper, white pepper and Brazil nut kernels from Amazonia. A total of 42 species were isolated from both peppers. Aspergillus flavus and A. niger were isolated more frequently from black than from white pepper. Other potential mycotoxigenic species isolated included: A. ochraceus, A. tamarii, A. versicolor, Emericella nidulans and Chaetomium globosum, Penicillium brevicompactum, P. citrinum, P. islandicum and P. glabrum. Species isolated from pepper for the first time were Acrogenospora sphaerocephala, Cylindrocarpon lichenicola, Lacellinopsis sacchari, Microascus cinereus, Petriella setifera and Sporormiella minima. Seventeen species were isolated from Brazil nut kernels. A. flavus was the dominant species followed by A. niger. P. citrinum and P. glabrum were the only penicillia isolated. Species isolated for the first time included Acremonium curvulum, Cunninghamella elegans, Exophiala sp., Fusarium oxysporum, Pseudoallescheria boydii, Rhizopus oryzae, Scopulariopsis sp., Thielavia terricola and Trichoderma citrinoviride. Considerably more metabolites were detected from black than from white pepper in qualitative analyses. Chaetocin, penitrem A, and xanthocillin were identified only from black pepper, and tenuazonic acid was identified from both black and white pepper. Aflatoxin G2, chaetoglobosin C, and spinulosin were identified from poor-quality Brazil nuts. Aflatoxins B1 and B2 were detected only in poor-quality Brazil nuts, at concentrations of 27.1 micrograms kg-1 and 2.1 micrograms kg-1 respectively (total 29.2 micrograms kg-1).

  34. A method of smoothed particle hydrodynamics using spheroidal kernels

    NASA Technical Reports Server (NTRS)

    Fulbright, Michael S.; Benz, Willy; Davies, Melvyn B.

    1995-01-01

    We present a new method of three-dimensional smoothed particle hydrodynamics (SPH) designed to model systems dominated by deformation along a preferential axis. These systems cause severe problems for SPH codes using spherical kernels, which are best suited for modeling systems which retain rough spherical symmetry. Our method allows the smoothing length in the direction of the deformation to evolve independently of the smoothing length in the perpendicular plane, resulting in a kernel with a spheroidal shape. As a result the spatial resolution in the direction of deformation is significantly improved. As a test case we present the one-dimensional homologous collapse of a zero-temperature, uniform-density cloud, which serves to demonstrate the advantages of spheroidal kernels. We also present new results on the problem of the tidal disruption of a star by a massive black hole.

  35. Proximate Nutritional Evaluation of Gamma Irradiated Black Rice (Oryza sativa L. cv. Cempo ireng)

    NASA Astrophysics Data System (ADS)

    Riyatun; Suharyana; Ramelan, A. H.; Sutarno; Saputra, O. A.; Suryanti, V.

    2018-03-01

    Black rice is a type of pigmented rice with black bran covering the endosperm of the rice kernel. The main objective of the present study was to provide detailed information on the proximate composition of the third generation of gamma-irradiated black rice (Oryza sativa L. cv. Cempo ireng). Relative to the control, no significant changes in moisture, lipid, protein, carbohydrate or fiber content were observed for either of the gamma-irradiated black rice samples. However, 200-BR had slightly better nutritional value than 300-BR and the control, and the mineral content of 200-BR increased significantly, by about 35%, compared with the non-irradiated black rice.

  36. General methodology for nonlinear modeling of neural systems with Poisson point-process inputs.

    PubMed

    Marmarelis, V Z; Berger, T W

    2005-07-01

    This paper presents a general methodological framework for the practical modeling of neural systems with point-process inputs (sequences of action potentials or, more broadly, identical events) based on the Volterra and Wiener theories of functional expansions and system identification. The paper clarifies the distinctions between Volterra and Wiener kernels obtained from Poisson point-process inputs. It shows that only the Wiener kernels can be estimated via cross-correlation, but must be defined as zero along the diagonals. The Volterra kernels can be estimated far more accurately (and from shorter data-records) by use of the Laguerre expansion technique adapted to point-process inputs, and they are independent of the mean rate of stimulation (unlike their P-W counterparts that depend on it). The Volterra kernels can also be estimated for broadband point-process inputs that are not Poisson. Useful applications of this modeling approach include cases where we seek to determine (model) the transfer characteristics between one neuronal axon (a point-process 'input') and another axon (a point-process 'output') or some other measure of neuronal activity (a continuous 'output', such as population activity) with which a causal link exists.

  37. Point kernel calculations of skyshine exposure rates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roseberry, M.L.; Shultis, J.K.

    1982-02-01

    A simple point kernel model is presented for the calculation of skyshine exposure rates arising from the atmospheric reflection of gamma radiation produced by a vertically collimated or a shielded point source. This model is shown to be in good agreement with benchmark experimental data from a 60Co source for distances out to 700 m.
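
    For context, the basic point-kernel expression such models build on gives the dose (or exposure) rate at a distance r from an isotropic point photon source as, in generic textbook form (not the report's exact model),

      \dot{D}(r) = \frac{S \, E \, (\mu_{en}/\rho)}{4\pi r^{2}} \, B(\mu r) \, e^{-\mu r}

    where S is the photon emission rate, E the photon energy, \mu the linear attenuation coefficient of the intervening medium, \mu_{en}/\rho the mass energy-absorption coefficient at the detector, and B(\mu r) a buildup factor. Skyshine point-kernel codes combine kernels of this type with air-scatter data to estimate the exposure rate downfield of a collimated or shielded source.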

  38. An Approximate Approach to Automatic Kernel Selection.

    PubMed

    Ding, Lizhong; Liao, Shizhong

    2016-02-02

    Kernel selection is a fundamental problem of kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection, and develop two approximate kernel selection algorithms by exploiting the computational virtues of multilevel circulant matrices. The complexity of the proposed algorithms is quasi-linear in the number of data points. Then, we prove an approximation error bound to measure the effect of the approximation in kernel matrices by multilevel circulant matrices on the hypothesis and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.

  39. Single kernel ionomic profiles are highly heritable indicators of genetic and environmental influences on elemental accumulation in maize grain (Zea mays)

    USDA-ARS's Scientific Manuscript database

    The ionome, or elemental profile, of a maize kernel represents at least two distinct ideas. First, the collection of elements within the kernel are food, feed and feedstocks for people, animals and industrial processes. Second, the ionome of the kernel represents a developmental end point that can s...

  40. Antidiabetic and antioxidant functionality associated with phenolic constituents from fruit parts of indigenous black jamun (Syzygium cumini L.) landraces.

    PubMed

    Gajera, H P; Gevariya, Shila N; Hirpara, Darshna G; Patel, S V; Golakiya, B A

    2017-09-01

    Fruit phenolics are important dietary antioxidant and antidiabetic constituents. The fruit parts (pulp, seed, seed coat, kernel) of six underutilized indigenous black jamun landraces (Syzygium cumini L.) found in the Gir forest region of India, which differ in fruit size, shape and weight, were evaluated and their antidiabetic and DPPH radical scavenging activities correlated with phenolic constituents. α-Amylase inhibitors offer an efficient antidiabetic strategy, lowering postprandial hyperglycemia by restraining starch breakdown. Soxhlet extraction by the hot percolation method was performed with a sequence of solvents of ascending polarity (petroleum ether, ethyl acetate, methanol and water), and the extractive yield was highest for the methanolic extracts of fruit parts of the six landraces. The methanolic extracts also showed the highest antidiabetic activity and hence were used for further characterization. Among the six landraces, the pulp and kernel of BJLR-6 (very small, oblong fruits) showed maximum inhibition of α-amylase activity, at 53.8 and 98.2% respectively. The inhibitory activity of the seed was mostly contributed by the kernel fraction. DPPH radical scavenging (inhibition) was positively correlated with the phenolic constituents. An HPLC-PDA technique was used to quantify seven individual phenolics. The seed and kernel of BJLR-6 were higher in the individual phenolics gallic acid, catechin, ellagic acid, ferulic acid and quercetin, whereas the pulp was higher in gallic acid and catechin as α-amylase inhibitors. The IC50 value indicates the concentration of fruit extract giving ≥50% inhibition of porcine pancreatic α-amylase (PPA) activity. The kernel fraction of BJLR-6 had the lowest IC50 value (8.3 µg ml-1), followed by seed (12.9 µg ml-1), seed coat (50.8 µg ml-1) and pulp (270 µg ml-1). The seed and kernel of BJLR-6 inhibited PPA at much lower concentrations than standard acarbose (24.7 µg ml-1), making them good candidates for antidiabetic herbal formulations.

  41. Application of the matrix exponential kernel

    NASA Technical Reports Server (NTRS)

    Rohach, A. F.

    1972-01-01

    A point matrix kernel for radiation transport, developed by the transmission matrix method, has been used to develop buildup factors and energy spectra through slab layers of different materials for a point isotropic source. Combinations of lead-water slabs were chosen for examples because of the extreme differences in shielding properties of these two materials.

  42. Stochastic Gravity: Theory and Applications.

    PubMed

    Hu, Bei Lok; Verdaguer, Enric

    2004-01-01

    Whereas semiclassical gravity is based on the semiclassical Einstein equation with sources given by the expectation value of the stress-energy tensor of quantum fields, stochastic semiclassical gravity is based on the Einstein-Langevin equation, which has in addition sources due to the noise kernel. The noise kernel is the vacuum expectation value of the (operator-valued) stress-energy bi-tensor which describes the fluctuations of quantum matter fields in curved spacetimes. In the first part, we describe the fundamentals of this new theory via two approaches: the axiomatic and the functional. The axiomatic approach is useful to see the structure of the theory from the framework of semiclassical gravity, showing the link from the mean value of the stress-energy tensor to their correlation functions. The functional approach uses the Feynman-Vernon influence functional and the Schwinger-Keldysh closed-time-path effective action methods which are convenient for computations. It also brings out the open systems concepts and the statistical and stochastic contents of the theory such as dissipation, fluctuations, noise, and decoherence. We then focus on the properties of the stress-energy bi-tensor. We obtain a general expression for the noise kernel of a quantum field defined at two distinct points in an arbitrary curved spacetime as products of covariant derivatives of the quantum field's Green function. In the second part, we describe three applications of stochastic gravity theory. First, we consider metric perturbations in a Minkowski spacetime. We offer an analytical solution of the Einstein-Langevin equation and compute the two-point correlation functions for the linearized Einstein tensor and for the metric perturbations. Second, we discuss structure formation from the stochastic gravity viewpoint, which can go beyond the standard treatment by incorporating the full quantum effect of the inflaton fluctuations. Third, we discuss the backreaction of Hawking radiation in the gravitational background of a quasi-static black hole (enclosed in a box). We derive a fluctuation-dissipation relation between the fluctuations in the radiation and the dissipative dynamics of metric fluctuations.

  43. Black Ink of Activated Carbon Derived From Palm Kernel Cake (PKC)

    NASA Astrophysics Data System (ADS)

    Selamat, M. H.; Ahmad, A. H.

    2009-06-01

    Recycling waste from natural plants to produce useful end products benefits many industries and helps preserve the environment. The research reported in this paper investigates the use of the natural waste of palm kernel cake (PKC) to produce carbon residue as a black carbon pigment source by means of a pyrolysis process. The activated carbon (AC) is produced in powder form using a ball milling process. Rheological characterization of an ink is one of the quality-control steps for determining its performance properties. Findings from this study will help expand the scientific knowledge base for black ink production and formulation based on PKC. Inks with various weight percentage compositions of AC were made and tested for their rheological properties in order to determine an ideal ink printing system. The formulation comprised organic and bio-waste materials with an added additive to improve the quality of the black ink. Modified polyurethane was used as the binder; its properties make it an ideal vehicle for good black ink opacity performance. Rheological behaviour provides a general foundation for ink characterization, and different wt% of AC-PKC resulted in different pseudoplastic behaviours, including Newtonian behaviour. The Newtonian regime was found to lie between 2 wt% and 10 wt% AC-PKC composition with binder. Mass spectroscopy results showed that the carbon content in PKC is high and very suitable for black performance. In the ageing test, the PKC pigment performed fairly well against the standard carbon black (CB) and iron oxide pigments. Contact angle measurements of the ink system showed good substrate wettability, and the ink proved to be a water-resistant coating on paper substrates, an advantage of the PKC ink pigment.

  44. Protein Analysis Meets Visual Word Recognition: A Case for String Kernels in the Brain

    ERIC Educational Resources Information Center

    Hannagan, Thomas; Grainger, Jonathan

    2012-01-01

    It has been recently argued that some machine learning techniques known as Kernel methods could be relevant for capturing cognitive and neural mechanisms (Jakel, Scholkopf, & Wichmann, 2009). We point out that "String kernels," initially designed for protein function prediction and spam detection, are virtually identical to one contending proposal…

  45. Quality changes in macadamia kernel between harvest and farm-gate.

    PubMed

    Walton, David A; Wallace, Helen M

    2011-02-01

    Macadamia integrifolia, Macadamia tetraphylla and their hybrids are cultivated for their edible kernels. After harvest, nuts-in-shell are partially dried on-farm and sorted to eliminate poor-quality kernels before consignment to a processor. During these operations, kernel quality may be lost. In this study, macadamia nuts-in-shell were sampled at five points of an on-farm postharvest handling chain from dehusking to the final storage silo to assess quality loss prior to consignment. Shoulder damage, weight of pieces and unsound kernel were assessed for raw kernels, and colour, mottled colour and surface damage for roasted kernels. Shoulder damage, weight of pieces and unsound kernel for raw kernels increased significantly between the dehusker and the final silo. Roasted kernels displayed a significant increase in dark colour, mottled colour and surface damage during on-farm handling. Significant loss of macadamia kernel quality occurred on a commercial farm during sorting and storage of nuts-in-shell before nuts were consigned to a processor. Nuts-in-shell should be dried as quickly as possible and on-farm handling minimised to maintain optimum kernel quality. 2010 Society of Chemical Industry.

  46. Alternative Derivations for the Poisson Integral Formula

    ERIC Educational Resources Information Center

    Chen, J. T.; Wu, C. S.

    2006-01-01

    Poisson integral formula is revisited. The kernel in the Poisson integral formula can be derived in a series form through the direct BEM free of the concept of image point by using the null-field integral equation in conjunction with the degenerate kernels. The degenerate kernels for the closed-form Green's function and the series form of Poisson…
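
    For reference, the classical Poisson integral formula for the unit disk, whose kernel the abstract refers to, reads in its standard form (not quoted from the article)

      u(r,\theta) = \frac{1}{2\pi} \int_{0}^{2\pi} \frac{1-r^{2}}{1 - 2r\cos(\theta-\varphi) + r^{2}} \, f(\varphi) \, d\varphi, \qquad 0 \le r < 1,

    which recovers the harmonic function u inside the disk from its boundary values f; the derivation discussed above obtains this kernel as a series of degenerate kernels rather than by introducing an image point.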

  47. A 3D Ginibre Point Field

    NASA Astrophysics Data System (ADS)

    Kargin, Vladislav

    2018-06-01

    We introduce a family of three-dimensional random point fields using the concept of the quaternion determinant. The kernel of each field is an n-dimensional orthogonal projection on a linear space of quaternionic polynomials. We find explicit formulas for the basis of the orthogonal quaternion polynomials and for the kernel of the projection. As the number of particles n → ∞, we calculate the scaling limits of the point field in the bulk and at the center of coordinates. We compare our construction with the previously introduced Fermi-sphere point field process.

  48. Initial Simulations of RF Waves in Hot Plasmas Using the FullWave Code

    NASA Astrophysics Data System (ADS)

    Zhao, Liangji; Svidzinski, Vladimir; Spencer, Andrew; Kim, Jin-Soo

    2017-10-01

    FullWave is a simulation tool that models RF fields in hot inhomogeneous magnetized plasmas. The wave equations with a linearized hot plasma dielectric response are solved in configuration space on an adaptive cloud of computational points. The nonlocal hot plasma dielectric response is formulated by calculating the plasma conductivity kernel based on the solution of the linearized Vlasov equation in an inhomogeneous magnetic field. In an RF field, the hot plasma dielectric response is limited to a distance of a few particle Larmor radii from the magnetic field line passing through the test point. This localization of the dielectric response yields a sparse problem matrix, significantly reducing the size of the problem and making the simulations faster. We will present initial results of modeling RF waves using the FullWave code, including calculation of the nonlocal conductivity kernel in 2D tokamak geometry; interpolation of the conductivity kernel from test points to the adaptive cloud of computational points; and results of self-consistent simulations of 2D RF fields using the calculated hot plasma conductivity kernel in a tokamak plasma with reduced parameters. Work supported by the US DOE SBIR program.

  49. Kernel-Based Sensor Fusion With Application to Audio-Visual Voice Activity Detection

    NASA Astrophysics Data System (ADS)

    Dov, David; Talmon, Ronen; Cohen, Israel

    2016-12-01

    In this paper, we address the problem of multiple view data fusion in the presence of noise and interferences. Recent studies have approached this problem using kernel methods, by relying particularly on a product of kernels constructed separately for each view. From a graph theory point of view, we analyze this fusion approach in a discrete setting. More specifically, based on a statistical model for the connectivity between data points, we propose an algorithm for the selection of the kernel bandwidth, a parameter, which, as we show, has important implications on the robustness of this fusion approach to interferences. Then, we consider the fusion of audio-visual speech signals measured by a single microphone and by a video camera pointed to the face of the speaker. Specifically, we address the task of voice activity detection, i.e., the detection of speech and non-speech segments, in the presence of structured interferences such as keyboard taps and office noise. We propose an algorithm for voice activity detection based on the audio-visual signal. Simulation results show that the proposed algorithm outperforms competing fusion and voice activity detection approaches. In addition, we demonstrate that a proper selection of the kernel bandwidth indeed leads to improved performance.
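
    The product-of-kernels fusion that this work builds on can be sketched in a few lines; the placeholder features, bandwidths and the diffusion-style embedding below are illustrative assumptions rather than the authors' algorithm, which additionally selects the kernel bandwidth from a statistical connectivity model.

      # Sketch: fuse two views by multiplying their affinity kernels, then embed.
      import numpy as np

      def rbf_affinity(X, bandwidth):
          d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
          return np.exp(-d2 / bandwidth)

      rng = np.random.default_rng(0)
      audio = rng.normal(size=(100, 13))        # placeholder audio features (e.g. MFCCs)
      video = rng.normal(size=(100, 32))        # placeholder visual features
      K = rbf_affinity(audio, 13.0) * rbf_affinity(video, 32.0)   # product of kernels
      P = K / K.sum(axis=1, keepdims=True)      # row-normalized diffusion operator
      vals, vecs = np.linalg.eig(P)
      order = np.argsort(-vals.real)
      embedding = vecs[:, order[1:4]].real      # fused low-dimensional representation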

  50. Absorbed dose kernel and self-shielding calculations for a novel radiopaque glass microsphere for transarterial radioembolization.

    PubMed

    Church, Cody; Mawko, George; Archambault, John Paul; Lewandowski, Robert; Liu, David; Kehoe, Sharon; Boyd, Daniel; Abraham, Robert; Syme, Alasdair

    2018-02-01

    Radiopaque microspheres may provide intraprocedural and postprocedural feedback during transarterial radioembolization (TARE). Furthermore, the potential to use higher resolution x-ray imaging techniques as opposed to nuclear medicine imaging suggests that significant improvements in the accuracy and precision of radiation dosimetry calculations could be realized for this type of therapy. This study investigates the absorbed dose kernel for novel radiopaque microspheres including contributions of both short and long-lived contaminant radionuclides while concurrently quantifying the self-shielding of the glass network. Monte Carlo simulations using EGSnrc were performed to determine the dose kernels for all monoenergetic electron emissions and all beta spectra for radionuclides reported in a neutron activation study of the microspheres. Simulations were benchmarked against an accepted 90Y dose point kernel. Self-shielding was quantified for the microspheres by simulating an isotropically emitting, uniformly distributed source, in glass and in water. The ratio of the absorbed doses was scored as a function of distance from a microsphere. The absorbed dose kernel for the microspheres was calculated for (a) two bead formulations following (b) two different durations of neutron activation, at (c) various time points following activation. Self-shielding varies with time postremoval from the reactor. At early time points, it is less pronounced due to the higher energies of the emissions. It is on the order of 0.4-2.8% at a radial distance of 5.43 mm with increased size from 10 to 50 μm in diameter during the time that the microspheres would be administered to a patient. At long time points, self-shielding is more pronounced and can reach values in excess of 20% near the end of the range of the emissions. Absorbed dose kernels for 90Y, 90mY, 85mSr, 85Sr, 87mSr, 89Sr, 70Ga, 72Ga, and 31Si are presented and used to determine an overall kernel for the microspheres based on weighted activities. The shapes of the absorbed dose kernels are dominated at short times postactivation by the contributions of 70Ga and 72Ga. Following decay of the short-lived contaminants, the absorbed dose kernel is effectively that of 90Y. After approximately 1000 h postactivation, the contributions of 85Sr and 89Sr become increasingly dominant, though the absorbed dose-rate around the beads drops by roughly four orders of magnitude. The introduction of high atomic number elements for the purpose of increasing radiopacity necessarily leads to the production of radionuclides other than 90Y in the microspheres. Most of the radionuclides in this study are short-lived and are likely not of any significant concern for this therapeutic agent. The presence of small quantities of longer lived radionuclides will change the shape of the absorbed dose kernel around a microsphere at long time points postadministration when activity levels are significantly reduced. © 2017 American Association of Physicists in Medicine.

  51. The partial replacement of palm kernel shell by carbon black and halloysite nanotubes as fillers in natural rubber composites

    NASA Astrophysics Data System (ADS)

    Daud, Shuhairiah; Ismail, Hanafi; Bakar, Azhar Abu

    2017-07-01

    The effect of the partial replacement of palm kernel shell (PKS) powder by carbon black (CB) and halloysite nanotubes (HNT) on the tensile properties, rubber-filler interaction, thermal properties and morphology of natural rubber (NR) composites was investigated. Five compositions of NR/PKS/CB and NR/PKS/HNT composites, i.e. 20/0, 15/5, 10/10, 5/15 and 0/20 parts per hundred rubber (phr), were prepared on a two-roll mill. The results showed that the tensile strength and the moduli at 100% elongation (M100) and 300% elongation (M300) were higher for the NR/PKS/CB composites than for the NR/PKS/HNT composites. The NR/PKS/CB composites had the lowest elongation at break (Eb). The effect of the commercial fillers on the tensile properties of the NR/PKS composites was confirmed by the rubber-filler interaction and scanning electron microscopy (SEM) studies. The thermal stability of the PKS-filled NR composites partially replaced by commercial fillers was also determined by thermogravimetric analysis (TGA).

  52. Noise kernels of stochastic gravity in conformally-flat spacetimes

    NASA Astrophysics Data System (ADS)

    Cho, H. T.; Hu, B. L.

    2015-03-01

    The central object in the theory of semiclassical stochastic gravity is the noise kernel, which is the symmetric two point correlation function of the stress-energy tensor. Using the corresponding Wightman functions in Minkowski, Einstein and open Einstein spaces, we construct the noise kernels of a conformally coupled scalar field in these spacetimes. From them we show that the noise kernels in conformally-flat spacetimes, including the Friedmann-Robertson-Walker universes, can be obtained in closed analytic forms by using a combination of conformal and coordinate transformations.

  53. A robust, high-throughput method for computing maize ear, cob, and kernel attributes automatically from images.

    PubMed

    Miller, Nathan D; Haase, Nicholas J; Lee, Jonghyun; Kaeppler, Shawn M; de Leon, Natalia; Spalding, Edgar P

    2017-01-01

    Grain yield of the maize plant depends on the sizes, shapes, and numbers of ears and the kernels they bear. An automated pipeline that can measure these components of yield from easily-obtained digital images is needed to advance our understanding of this globally important crop. Here we present three custom algorithms designed to compute such yield components automatically from digital images acquired by a low-cost platform. One algorithm determines the average space each kernel occupies along the cob axis using a sliding-window Fourier transform analysis of image intensity features. A second counts individual kernels removed from ears, including those in clusters. A third measures each kernel's major and minor axis after a Bayesian analysis of contour points identifies the kernel tip. Dimensionless ear and kernel shape traits that may interrelate yield components are measured by principal components analysis of contour point sets. Increased objectivity and speed compared to typical manual methods are achieved without loss of accuracy as evidenced by high correlations with ground truth measurements and simulated data. Millimeter-scale differences among ear, cob, and kernel traits that ranged more than 2.5-fold across a diverse group of inbred maize lines were resolved. This system for measuring maize ear, cob, and kernel attributes is being used by multiple research groups as an automated Web service running on community high-throughput computing and distributed data storage infrastructure. Users may create their own workflow using the source code that is staged for download on a public repository. © 2016 The Authors. The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
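
    The first algorithm's idea, recovering the average space a kernel occupies along the cob axis from the periodicity of an image intensity profile, can be illustrated with a single-window version of the Fourier analysis; this is a simplification of the sliding-window approach in the paper, and the function name and pixel scale are assumptions.

      # Sketch: average kernel spacing from the dominant frequency of an intensity profile.
      import numpy as np

      def mean_kernel_spacing(intensity_profile, pixels_per_mm):
          profile = intensity_profile - intensity_profile.mean()
          spectrum = np.abs(np.fft.rfft(profile))
          freqs = np.fft.rfftfreq(profile.size, d=1.0)   # cycles per pixel
          peak = freqs[1:][np.argmax(spectrum[1:])]      # skip the DC component
          return (1.0 / peak) / pixels_per_mm            # kernel spacing in mm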

  54. Small convolution kernels for high-fidelity image restoration

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Park, Stephen K.

    1991-01-01

    An algorithm is developed for computing the mean-square-optimal values for small, image-restoration kernels. The algorithm is based on a comprehensive, end-to-end imaging system model that accounts for the important components of the imaging process: the statistics of the scene, the point-spread function of the image-gathering device, sampling effects, noise, and display reconstruction. Subject to constraints on the spatial support of the kernel, the algorithm generates the kernel values that restore the image with maximum fidelity, that is, the kernel minimizes the expected mean-square restoration error. The algorithm is consistent with the derivation of the spatially unconstrained Wiener filter, but leads to a small, spatially constrained kernel that, unlike the unconstrained filter, can be efficiently implemented by convolution. Simulation experiments demonstrate that for a wide range of imaging systems these small kernels can restore images with fidelity comparable to images restored with the unconstrained Wiener filter.

  55. Richardson-Lucy deblurring for the star scene under a thinning motion path

    NASA Astrophysics Data System (ADS)

    Su, Laili; Shao, Xiaopeng; Wang, Lin; Wang, Haixin; Huang, Yining

    2015-05-01

    This paper focuses on how to model and correct image blur that arises from a camera's ego motion while observing a distant star scene. Because accurate estimation of the point spread function (PSF) is critical, a new method is employed to obtain the blur kernel by thinning the star motion path. In particular, we present how the blurred star image can be corrected to reconstruct the clear scene using a thinned motion-path blur model that describes the camera's path. Building the blur kernel from the thinned motion path is more effective at modeling the spatially varying motion blur introduced by the camera's ego motion than conventional blind estimation with a parameterized PSF. To obtain the reconstructed image, an improved thinning algorithm is first used to extract the star point trajectory, and hence the blur kernel, from the motion-blurred star image. We then detail how this motion blur model is incorporated into the Richardson-Lucy (RL) deblurring algorithm and demonstrate its overall effectiveness. Compared with a conventionally estimated blur kernel, experimental results show that using the thinning algorithm to obtain the motion blur kernel has lower complexity, higher efficiency and better accuracy, which contributes to better restoration of motion-blurred star images.
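
    For readers unfamiliar with the deconvolution step, the classic Richardson-Lucy iteration into which the extracted blur kernel is fed looks roughly as follows; this is a generic sketch with SciPy assumed, not the paper's implementation.

      # Sketch: Richardson-Lucy deconvolution given an estimated blur kernel (PSF).
      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(blurred, psf, iterations=30, eps=1e-12):
          estimate = np.full_like(blurred, blurred.mean())
          psf_mirror = psf[::-1, ::-1]
          for _ in range(iterations):
              reblurred = fftconvolve(estimate, psf, mode="same")
              ratio = blurred / (reblurred + eps)
              estimate *= fftconvolve(ratio, psf_mirror, mode="same")
          return estimate

      # Toy usage with an assumed horizontal motion-path kernel.
      rng = np.random.default_rng(0)
      scene = np.zeros((64, 64))
      scene[rng.integers(0, 64, 20), rng.integers(0, 64, 20)] = 1.0   # sparse star field
      psf = np.zeros((9, 9)); psf[4, :] = 1.0 / 9.0                   # motion path
      restored = richardson_lucy(fftconvolve(scene, psf, mode="same"), psf)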

  16. Integrating the Gradient of the Thin Wire Kernel

    NASA Technical Reports Server (NTRS)

    Champagne, Nathan J.; Wilton, Donald R.

    2008-01-01

    A formulation for integrating the gradient of the thin wire kernel is presented. This approach employs a new expression for the gradient of the thin wire kernel derived from a recent technique for numerically evaluating the exact thin wire kernel. This approach should provide essentially arbitrary accuracy and may be used with higher-order elements and basis functions using the procedure described in [4]. When the source and observation points are close, the potential integrals over wire segments involving the wire kernel are split into parts to handle the singular behavior of the integrand [1]. The singularity characteristics of the gradient of the wire kernel are different from those of the wire kernel, and the axial and radial components have different singularities. The characteristics of the gradient of the wire kernel are discussed in [2]. To evaluate the near electric and magnetic fields of a wire, the integration of the gradient of the wire kernel needs to be calculated over the source wire. Since the vector bases for current have constant direction on linear wire segments, these integrals reduce to integrals of the form

  17. Direct Measurement of Wave Kernels in Time-Distance Helioseismology

    NASA Technical Reports Server (NTRS)

    Duvall, T. L., Jr.

    2006-01-01

    Solar f-mode waves are surface-gravity waves which propagate horizontally in a thin layer near the photosphere with a dispersion relation approximately that of deep water waves. At the power maximum near 3 mHz, the wavelength of 5 Mm is large enough for various wave scattering properties to be observable. Gizon and Birch (2002, ApJ, 571, 966) have calculated kernels, in the Born approximation, for the sensitivity of wave travel times to local changes in damping rate and source strength. In this work, using isolated small magnetic features as approximate point-source scatterers, such a kernel has been measured. The observed kernel contains features similar to a theoretical damping kernel but not to a source kernel. A full understanding of the effect of small magnetic features on the waves will require more detailed modeling.

  18. Explaining Support Vector Machines: A Color Based Nomogram

    PubMed Central

    Van Belle, Vanya; Van Calster, Ben; Van Huffel, Sabine; Suykens, Johan A. K.; Lisboa, Paulo

    2016-01-01

    Problem setting Support vector machines (SVMs) are very popular tools for classification, regression and other problems. Due to the large choice of kernels they can be applied with, a large variety of data can be analysed using these tools. Machine learning owes its popularity to the good performance of the resulting models. However, interpreting the models is far from obvious, especially when non-linear kernels are used. Hence, the methods are used as black boxes. As a consequence, the use of SVMs is less supported in areas where interpretability is important and where people are held responsible for the decisions made by models. Objective In this work, we investigate whether SVMs using linear, polynomial and RBF kernels can be explained such that interpretations for model-based decisions can be provided. We further indicate when SVMs can be explained and in which situations interpretation of SVMs is (hitherto) not possible. Here, explainability is defined as the ability to produce the final decision based on a sum of contributions which depend on one single or at most two input variables. Results Our experiments on simulated and real-life data show that explainability of an SVM depends on the chosen parameter values (degree of polynomial kernel, width of RBF kernel and regularization constant). When several combinations of parameter values yield the same cross-validation performance, combinations with a lower polynomial degree or a larger kernel width have a higher chance of being explainable. Conclusions This work summarizes SVM classifiers obtained with linear, polynomial and RBF kernels in a single plot. Linear and polynomial kernels up to the second degree are represented exactly. For other kernels an indication of the reliability of the approximation is presented. The complete methodology is available as an R package and two apps and a movie are provided to illustrate the possibilities offered by the method. PMID:27723811
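
    The simplest case of the explainability notion above is the linear kernel, where the decision value decomposes exactly into one additive contribution per input variable, which is what a nomogram-style plot visualises. The scikit-learn sketch below illustrates only that linear case; the data and parameter values are arbitrary and this is not the paper's R implementation.

```python
# Sketch: for a linear-kernel SVM the decision value is an exact sum of
# per-variable contributions w_j * x_j plus a bias.
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = SVC(kernel='linear', C=1.0).fit(X, y)

w, b = clf.coef_.ravel(), clf.intercept_[0]
x = X[0]
contributions = w * x                       # one additive term per input variable
print(contributions)
print(contributions.sum() + b, clf.decision_function([x])[0])   # identical values
```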

  19. Exploring microwave resonant multi-point ignition using high-speed schlieren imaging

    NASA Astrophysics Data System (ADS)

    Liu, Cheng; Zhang, Guixin; Xie, Hong; Deng, Lei; Wang, Zhi

    2018-03-01

    Microwave plasma offers a potential method to achieve rapid combustion in a high-speed combustor. In this paper, microwave resonant multi-point ignition and its control method have been studied via high-speed schlieren imaging. The experiment was conducted with the microwave resonant ignition system and the schlieren optical system. A microwave pulse at 2.45 GHz with 2 ms width and 3 kW peak power was employed as an ignition energy source to produce initial flame kernels in the combustion chamber. A reflective schlieren method was designed to illustrate the flame development process with a high-speed camera. The bottom of the combustion chamber was made of a quartz glass coated with indium tin oxide, which ensures sufficient microwave reflection and light penetration. Ignition experiments were conducted at 2 bar in stoichiometric methane-air mixtures. Schlieren images show that flame kernels were generated at more than one location simultaneously and that the flames propagated at different speeds from different kernels. Ignition kernels were classified into three types according to their appearance. Pressure curves and combustion duration also show that multi-point ignition plays a significant role in accelerating combustion.

  20. Exploring microwave resonant multi-point ignition using high-speed schlieren imaging.

    PubMed

    Liu, Cheng; Zhang, Guixin; Xie, Hong; Deng, Lei; Wang, Zhi

    2018-03-01

    Microwave plasma offers a potential method to achieve rapid combustion in a high-speed combustor. In this paper, microwave resonant multi-point ignition and its control method have been studied via high-speed schlieren imaging. The experiment was conducted with the microwave resonant ignition system and the schlieren optical system. A microwave pulse at 2.45 GHz with 2 ms width and 3 kW peak power was employed as an ignition energy source to produce initial flame kernels in the combustion chamber. A reflective schlieren method was designed to illustrate the flame development process with a high-speed camera. The bottom of the combustion chamber was made of a quartz glass coated with indium tin oxide, which ensures sufficient microwave reflection and light penetration. Ignition experiments were conducted at 2 bar in stoichiometric methane-air mixtures. Schlieren images show that flame kernels were generated at more than one location simultaneously and that the flames propagated at different speeds from different kernels. Ignition kernels were classified into three types according to their appearance. Pressure curves and combustion duration also show that multi-point ignition plays a significant role in accelerating combustion.

  1. Standardising Home Range Studies for Improved Management of the Critically Endangered Black Rhinoceros

    PubMed Central

    Plotz, Roan D.; Grecian, W. James; Kerley, Graham I.H.; Linklater, Wayne L.

    2016-01-01

    Comparisons of recent estimations of home range sizes for the critically endangered black rhinoceros in Hluhluwe-iMfolozi Park (HiP), South Africa, with historical estimates led to reports of a substantial (54%) increase, attributed to over-stocking and habitat deterioration, which has far-reaching implications for rhino conservation. Other reports, however, suggest the increase is more likely an artefact caused by applying various home range estimators to non-standardised datasets. We collected 1939 locations of 25 black rhino over six years (2004–2009) to estimate annual home ranges and evaluate the hypothesis that they have increased in size. A minimum of 30 and 25 locations were required for accurate 95% MCP estimation of home range of adult rhinos during the dry and wet seasons, respectively. Forty and 55 locations were required for adult female and male annual MCP home ranges, respectively, and 30 locations were necessary for estimating 90% bivariate kernel home ranges accurately. Average annual 95% bivariate kernel home ranges were 20.4 ± 1.2 km², 53 ± 1.9% larger than 95% MCP ranges (9.8 ± 0.9 km²). When home range techniques used during the late-1960s in HiP were applied to our dataset, estimates were similar, indicating that ranges have not changed substantially in 50 years. Inaccurate, non-standardised home range estimates and their comparison have the potential to mislead black rhino population management. We recommend that more care be taken to collect adequate numbers of rhino locations within standardized time periods (i.e., season or year) and that the comparison of home ranges estimated using dissimilar procedures be avoided. Home range studies of black rhino have been data deficient and procedurally inconsistent. Standardisation of methods is required. PMID:27028728

  2. Standardising Home Range Studies for Improved Management of the Critically Endangered Black Rhinoceros.

    PubMed

    Plotz, Roan D; Grecian, W James; Kerley, Graham I H; Linklater, Wayne L

    2016-01-01

    Comparisons of recent estimations of home range sizes for the critically endangered black rhinoceros in Hluhluwe-iMfolozi Park (HiP), South Africa, with historical estimates led to reports of a substantial (54%) increase, attributed to over-stocking and habitat deterioration, which has far-reaching implications for rhino conservation. Other reports, however, suggest the increase is more likely an artefact caused by applying various home range estimators to non-standardised datasets. We collected 1939 locations of 25 black rhino over six years (2004-2009) to estimate annual home ranges and evaluate the hypothesis that they have increased in size. A minimum of 30 and 25 locations were required for accurate 95% MCP estimation of home range of adult rhinos during the dry and wet seasons, respectively. Forty and 55 locations were required for adult female and male annual MCP home ranges, respectively, and 30 locations were necessary for estimating 90% bivariate kernel home ranges accurately. Average annual 95% bivariate kernel home ranges were 20.4 ± 1.2 km², 53 ± 1.9% larger than 95% MCP ranges (9.8 ± 0.9 km²). When home range techniques used during the late-1960s in HiP were applied to our dataset, estimates were similar, indicating that ranges have not changed substantially in 50 years. Inaccurate, non-standardised home range estimates and their comparison have the potential to mislead black rhino population management. We recommend that more care be taken to collect adequate numbers of rhino locations within standardized time periods (i.e., season or year) and that the comparison of home ranges estimated using dissimilar procedures be avoided. Home range studies of black rhino have been data deficient and procedurally inconsistent. Standardisation of methods is required.

  3. Gaussian processes with optimal kernel construction for neuro-degenerative clinical onset prediction

    NASA Astrophysics Data System (ADS)

    Canas, Liane S.; Yvernault, Benjamin; Cash, David M.; Molteni, Erika; Veale, Tom; Benzinger, Tammie; Ourselin, Sébastien; Mead, Simon; Modat, Marc

    2018-02-01

    Gaussian Processes (GP) are a powerful tool for capturing the complex time-variations of a dataset. In the context of medical imaging analysis, they allow robust modelling even for highly uncertain or incomplete datasets. Predictions from a GP depend on the covariance kernel function selected to explain the data variance. To overcome this limitation, we propose a framework to identify the optimal covariance kernel function for the data. The optimal kernel is defined as a composition of base kernel functions used to identify correlation patterns between data points. Our approach includes a modified version of the Compositional Kernel Learning (CKL) algorithm, in which we score the kernel families using a new energy function that depends on both the Bayesian Information Criterion (BIC) and the explained variance score. We applied the proposed framework to model the progression of neurodegenerative diseases over time, in particular autosomal dominantly-inherited Alzheimer's disease, and used it to predict the time to clinical onset for subjects carrying the genetic mutation.
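
    A bare-bones version of the compositional idea is sketched below with scikit-learn: several candidate compositions of base kernels are fitted and ranked with a BIC-style score built from the GP log marginal likelihood. The paper's energy function also includes an explained-variance term, which is omitted here, and the candidate kernels and data are illustrative assumptions.

```python
# Sketch of compositional kernel selection: fit a GP for each candidate kernel
# composition and rank them by a BIC-style score (explained-variance term of
# the paper's energy function omitted for brevity).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, DotProduct, WhiteKernel

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 80)[:, None]
y = 0.3 * t.ravel() + np.sin(t.ravel()) + 0.1 * rng.standard_normal(80)

candidates = {
    "RBF":          RBF() + WhiteKernel(),
    "Linear":       DotProduct() + WhiteKernel(),
    "Linear + RBF": DotProduct() + RBF() + WhiteKernel(),
    "Linear * RBF": DotProduct() * RBF() + WhiteKernel(),
}
for name, kernel in candidates.items():
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(t, y)
    lml = gp.log_marginal_likelihood_value_
    k = len(gp.kernel_.theta)                 # number of fitted hyperparameters
    bic = -2.0 * lml + k * np.log(len(y))
    print(f"{name:12s}  BIC = {bic:7.1f}")    # lower is better
```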

  4. On the Floating Point Performance of the i860 Microprocessor

    NASA Technical Reports Server (NTRS)

    Lee, King; Kutler, Paul (Technical Monitor)

    1997-01-01

    The i860 microprocessor is a pipelined processor that can deliver two double precision floating point results every clock. It is being used in the Touchstone project to develop a teraflop computer by the year 2000. With such high computational capabilities it was expected that memory bandwidth would limit performance on many kernels. Measured performance of three kernels showed performance is less than what memory bandwidth limitations would predict. This paper develops a model that explains the discrepancy in terms of memory latencies and points to some problems involved in moving data from memory to the arithmetic pipelines.

  5. SU-F-SPS-09: Parallel MC Kernel Calculations for VMAT Plan Improvement

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chamberlain, S; Roswell Park Cancer Institute, Buffalo, NY; French, S

    Purpose: Adding kernels (small perturbations in leaf positions) to the existing apertures of VMAT control points may improve plan quality. We investigate the calculation of kernel doses using a parallelized Monte Carlo (MC) method. Methods: A clinical prostate VMAT DICOM plan was exported from Eclipse. An arbitrary control point and leaf were chosen, and a modified MLC file was created, corresponding to the leaf position offset by 0.5 cm. The additional dose produced by this 0.5 cm × 0.5 cm kernel was calculated using the DOSXYZnrc component module of BEAMnrc. A range of particle history counts were run (varying from 3 × 10⁶ to 3 × 10⁷); each job was split among 1, 10, or 100 parallel processes. A particle count of 3 × 10⁶ was established as the lower end of the range because it provided the minimal accuracy level. Results: As expected, an increase in particle counts linearly increases run time. For the lowest particle count, the time varied from 30 hours for the single-processor run to 0.30 hours for the 100-processor run. Conclusion: Parallel processing of MC calculations in the EGS framework significantly decreases the time necessary for each kernel dose calculation. Particle counts lower than 1 × 10⁶ have too large an error to output accurate dose for a Monte Carlo kernel calculation. Future work will investigate increasing the number of parallel processes and optimizing run times for multiple kernel calculations.

  6. Heterogeneity in Schooling Rates of Return

    ERIC Educational Resources Information Center

    Henderson, Daniel J.; Polachek, Solomon W.; Wang, Le

    2011-01-01

    This paper relaxes the assumption of homogeneous rates of return to schooling by employing nonparametric kernel regression. This approach allows us to examine the differences in rates of return to education both across and within groups. Similar to previous studies, we find that, on average, blacks have higher returns to education than whites,…
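
    As a toy illustration of how a nonparametric kernel regression lets the return to schooling vary across the sample instead of fixing one coefficient, the sketch below fits a Nadaraya-Watson estimator to synthetic wage data and reads off local returns. The data, bandwidth, and variable names are assumptions for illustration only.

```python
# Minimal Nadaraya-Watson kernel regression on synthetic wage data: the local
# slope of the fitted curve gives a return to schooling that differs across
# the schooling range, unlike a single homogeneous coefficient.
import numpy as np

def nw_regression(x_grid, x, y, bandwidth=1.0):
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w * y).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(1)
schooling = rng.uniform(8, 20, 500)
log_wage = 1.0 + 0.05 * schooling + 0.003 * schooling**2 + 0.2 * rng.standard_normal(500)

grid = np.linspace(9, 19, 11)
fit = nw_regression(grid, schooling, log_wage, bandwidth=1.0)
returns = np.gradient(fit, grid)            # local return to an extra year
print(np.round(returns, 3))
```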

  7. Biochemical studies of some non-conventional sources of proteins. Part 7. Effect of detoxification treatments on the nutritional quality of apricot kernels.

    PubMed

    el-Adawy, T A; Rahma, E H; el-Badawey, A A; Gomaa, M A; Lásztity, R; Sarkadi, L

    1994-01-01

    Detoxification of apricot kernels by soaking in distilled water and ammonium hydroxide for 30 h at 47 °C decreased the total protein, non-protein nitrogen, total ash, glucose, sucrose, minerals, non-essential amino acids, polar amino acids, acidic amino acids, aromatic amino acids, antinutritional factors, hydrocyanic acid, tannins and phytic acid. On the other hand, removal of toxic and bitter compounds from apricot kernels increased the relative content of crude fibre, starch and total essential amino acids. Higher in-vitro protein digestibility and biological value were also observed. Generally, the detoxified apricot kernels were nutritionally well balanced. Utilization and incorporation of detoxified apricot kernel flours in food products is completely safe from the toxicity point of view.

  8. NARMER-1: a photon point-kernel code with build-up factors

    NASA Astrophysics Data System (ADS)

    Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence

    2017-09-01

    This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at the CEA Saclay Center. After a short introduction giving some historical background and the current development context of the code, the paper describes the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, source description, etc. Specific and recent features are also detailed: exclusion sphere, tetrahedral meshes, and parallel operations. Some points on verification and validation are then presented. Finally, we present tools that help the user with operations such as visualization and pre-treatment.
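
    For orientation, the generic point-kernel estimate that codes of this kind build on sums, over point sources, an attenuated inverse-square term corrected by a build-up factor. The sketch below is a minimal numpy version of that formula only; the attenuation coefficient, the crude linear build-up factor, and all source data are placeholders, not NARMER-1's models or data.

```python
# Generic point-kernel estimate (not NARMER-1 itself): for each point source,
# dose rate ~ S * B(mu*r) * exp(-mu*r) / (4*pi*r^2), with a crude linear
# build-up factor standing in for tabulated/fitted factors.
import numpy as np

def point_kernel_dose(sources, strengths, detector, mu, k_dose=1.0):
    sources = np.asarray(sources, float)
    r = np.linalg.norm(sources - detector, axis=1)           # distance, cm
    mfp = mu * r                                             # mean free paths
    buildup = 1.0 + mfp                                      # placeholder B(mu*r)
    flux = strengths * buildup * np.exp(-mfp) / (4 * np.pi * r**2)
    return k_dose * flux.sum()

sources = [[0, 0, 0], [10, 0, 0], [0, 10, 0]]                # cm
strengths = np.array([1e9, 5e8, 5e8])                        # photons / s
print(point_kernel_dose(sources, strengths, np.array([100.0, 0, 0]), mu=0.06))
```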

  9. New Fukui, dual and hyper-dual kernels as bond reactivity descriptors.

    PubMed

    Franco-Pérez, Marco; Polanco-Ramírez, Carlos-A; Ayers, Paul W; Gázquez, José L; Vela, Alberto

    2017-06-21

    We define three new linear response indices with promising applications for bond reactivity using the mathematical framework of τ-CRT (finite temperature chemical reactivity theory). The τ-Fukui kernel is defined as the ratio between the fluctuations of the average electron density at two different points in the space and the fluctuations in the average electron number and is designed to integrate to the finite-temperature definition of the electronic Fukui function. When this kernel is condensed, it can be interpreted as a site-reactivity descriptor of the boundary region between two atoms. The τ-dual kernel corresponds to the first order response of the Fukui kernel and is designed to integrate to the finite temperature definition of the dual descriptor; it indicates the ambiphilic reactivity of a specific bond and enriches the traditional dual descriptor by allowing one to distinguish between the electron-accepting and electron-donating processes. Finally, the τ-hyper dual kernel is defined as the second-order derivative of the Fukui kernel and is proposed as a measure of the strength of ambiphilic bonding interactions. Although these quantities have not been proposed before, our results for the τ-Fukui kernel and for the τ-dual kernel can be derived in the zero-temperature formulation of the chemical reactivity theory with, among other things, the widely-used parabolic interpolation model.
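
    As a reminder of the zero-temperature quantities these kernels are designed to integrate to, the standard conceptual-DFT definitions of the Fukui function and the dual descriptor, together with their common finite-difference (condensed-use) approximations, are shown below. These are the textbook definitions, not the paper's finite-temperature kernels.

```latex
% Standard zero-temperature definitions (not the paper's finite-temperature kernels):
\[
  f(\mathbf{r}) = \left(\frac{\partial \rho(\mathbf{r})}{\partial N}\right)_{v(\mathbf{r})},
  \qquad
  \Delta f(\mathbf{r}) = \left(\frac{\partial^{2} \rho(\mathbf{r})}{\partial N^{2}}\right)_{v(\mathbf{r})} .
\]
\[
  f^{+}(\mathbf{r}) \approx \rho_{N+1}(\mathbf{r}) - \rho_{N}(\mathbf{r}),
  \quad
  f^{-}(\mathbf{r}) \approx \rho_{N}(\mathbf{r}) - \rho_{N-1}(\mathbf{r}),
  \quad
  \Delta f(\mathbf{r}) \approx f^{+}(\mathbf{r}) - f^{-}(\mathbf{r}).
\]
```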

  10. Flexibly imposing periodicity in kernel independent FMM: A multipole-to-local operator approach

    NASA Astrophysics Data System (ADS)

    Yan, Wen; Shelley, Michael

    2018-02-01

    An important but missing component in the application of the kernel independent fast multipole method (KIFMM) is the capability for flexibly and efficiently imposing singly, doubly, and triply periodic boundary conditions. In most popular packages such periodicities are imposed with the hierarchical repetition of periodic boxes, which may give an incorrect answer due to the conditional convergence of some kernel sums. Here we present an efficient method to properly impose periodic boundary conditions using a near-far splitting scheme. The near-field contribution is directly calculated with the KIFMM method, while the far-field contribution is calculated with a multipole-to-local (M2L) operator which is independent of the source and target point distribution. The M2L operator is constructed with the far-field portion of the kernel function to generate the far-field contribution with the downward equivalent source points in KIFMM. This method guarantees that the sum of the near-field and far-field contributions converges pointwise to results satisfying periodicity and compatibility conditions. The computational cost of the far-field calculation observes the same O(N) complexity as FMM and is designed to be small by reusing the data computed by KIFMM for the near-field. The far-field calculations require no additional control parameters and observe the same theoretical error bound as KIFMM. We present accuracy and timing test results for the Laplace kernel in singly periodic domains and the Stokes velocity kernel in doubly and triply periodic domains.

  11. Kernel-PCA data integration with enhanced interpretability

    PubMed Central

    2014-01-01

    Background Nowadays, combining the different sources of information to improve the biological knowledge available is a challenge in bioinformatics. One of the most powerful methods for integrating heterogeneous data types is the use of kernel-based methods. Kernel-based data integration approaches consist of two basic steps: firstly the right kernel is chosen for each data set; secondly the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to any dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher/lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of biological knowledge. PMID:25032747
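
    The two-step integration described above can be sketched very compactly: build one kernel per data source, combine them (here a simple sum), and run kernel PCA on the combined Gram matrix. The scikit-learn example below assumes made-up "expression" and "clinical" matrices and equal kernel weights; it does not reproduce the paper's variable-representation plots.

```python
# Two-step kernel integration in miniature: one RBF kernel per data source,
# summed into a single Gram matrix, then kernel PCA on the combined kernel.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(0)
expr = rng.standard_normal((100, 50))        # e.g. expression data (assumed)
clin = rng.standard_normal((100, 8))         # e.g. clinical data (assumed)

K = rbf_kernel(expr, gamma=1.0 / 50) + rbf_kernel(clin, gamma=1.0 / 8)
embedding = KernelPCA(n_components=2, kernel='precomputed').fit_transform(K)
print(embedding.shape)                       # (100, 2) joint sample representation
```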

  12. Locally-Based Kernel PLS Smoothing to Non-Parametric Regression Curve Fitting

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Trejo, Leonard J.; Wheeler, Kevin; Korsmeyer, David (Technical Monitor)

    2002-01-01

    We present a novel smoothing approach to non-parametric regression curve fitting. This is based on kernel partial least squares (PLS) regression in reproducing kernel Hilbert space. Our aim is to apply the methodology to the smoothing of experimental data where some level of knowledge about the approximate shape, local inhomogeneities or points where the desired function changes its curvature is known a priori or can be derived from the observed noisy data. We propose locally-based kernel PLS regression that extends the previous kernel PLS methodology by incorporating this knowledge. We compare our approach with existing smoothing splines, hybrid adaptive splines and wavelet shrinkage techniques on two generated data sets.

  13. Optimization of light source parameters in the photodynamic therapy of heterogeneous prostate

    NASA Astrophysics Data System (ADS)

    Li, Jun; Altschuler, Martin D.; Hahn, Stephen M.; Zhu, Timothy C.

    2008-08-01

    The three-dimensional (3D) heterogeneous distributions of optical properties in a patient prostate can now be measured in vivo. Such data can be used to obtain a more accurate light-fluence kernel. (For specified sources and points, the kernel gives the fluence delivered to a point by a source of unit strength.) In turn, the kernel can be used to solve the inverse problem that determines the source strengths needed to deliver a prescribed photodynamic therapy (PDT) dose (or light-fluence) distribution within the prostate (assuming uniform drug concentration). We have developed and tested computational procedures to use the new heterogeneous data to optimize delivered light-fluence. New problems arise, however, in quickly obtaining an accurate kernel following the insertion of interstitial light sources and data acquisition. (1) The light-fluence kernel must be calculated in 3D and separately for each light source, which increases kernel size. (2) An accurate kernel for light scattering in a heterogeneous medium requires ray tracing and volume partitioning, thus significant calculation time. To address these problems, two different kernels were examined and compared for speed of creation and accuracy of dose. Kernels derived more quickly involve simpler algorithms. Our goal is to achieve optimal dose planning with patient-specific heterogeneous optical data applied through accurate kernels, all within clinical times. The optimization process is restricted to accepting the given (interstitially inserted) sources, and determining the best source strengths with which to obtain a prescribed dose. The Cimmino feasibility algorithm is used for this purpose. The dose distribution and source weights obtained for each kernel are analyzed. In clinical use, optimization will also be performed prior to source insertion to obtain initial source positions, source lengths and source weights, but with the assumption of homogeneous optical properties. For this reason, we compare the results from heterogeneous optical data with those obtained from average homogeneous optical properties. The optimized treatment plans are also compared with the reference clinical plan, defined as the plan with sources of equal strength, distributed regularly in space, which delivers a mean value of prescribed fluence at detector locations within the treatment region. The study suggests that comprehensive optimization of source parameters (i.e. strengths, lengths and locations) is feasible, thus allowing acceptable dose coverage in a heterogeneous prostate PDT within the time constraints of the PDT procedure.
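
    The Cimmino feasibility step mentioned above can be written as a simple simultaneous-projection iteration. The sketch below is a bare-bones Cimmino-style solver for nonnegative source strengths given a fluence kernel matrix and a prescription vector; the random matrix stands in for the patient-specific light-fluence kernel, and the relaxation, iteration count, and nonnegativity clipping are illustrative choices, not the clinical implementation.

```python
# Bare-bones Cimmino-style iteration: adjust nonnegative source strengths so
# that kernel-predicted fluence at detector points approaches the prescription.
import numpy as np

def cimmino(A, b, n_iter=500, relax=1.0):
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)          # ||a_i||^2 for each detector row
    m = A.shape[0]
    for _ in range(n_iter):
        residual = b - A @ x
        x += relax / m * (A.T @ (residual / row_norms))
        np.clip(x, 0.0, None, out=x)         # source strengths must be >= 0
    return x

rng = np.random.default_rng(0)
A = rng.uniform(0.1, 1.0, (40, 12))          # fluence kernel: detectors x sources
b = np.full(40, 5.0)                         # prescribed fluence at detectors
weights = cimmino(A, b)
print(np.round(A @ weights, 2)[:5])          # delivered fluence at first detectors
```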

  14. Assessing opportunities for physical activity in the built environment of children: interrelation between kernel density and neighborhood scale.

    PubMed

    Buck, Christoph; Kneib, Thomas; Tkaczick, Tobias; Konstabel, Kenn; Pigeot, Iris

    2015-12-22

    Built environment studies provide broad evidence that urban characteristics influence physical activity (PA). However, findings are still difficult to compare, due to inconsistent measures assessing urban point characteristics and varying definitions of spatial scale. Both were found to influence the strength of the association between the built environment and PA. We simultaneously evaluated the effect of kernel approaches and network-distances to investigate the association between urban characteristics and physical activity depending on spatial scale and intensity measure. We assessed urban measures of point characteristics such as intersections, public transit stations, and public open spaces in ego-centered network-dependent neighborhoods based on geographical data of one German study region of the IDEFICS study. We calculated point intensities using the simple intensity and kernel approaches based on fixed bandwidths, cross-validated bandwidths including isotropic and anisotropic kernel functions, and adaptive bandwidths that adjust for residential density. We distinguished six network-distances from 500 m up to 2 km to calculate each intensity measure. A log-gamma regression model was used to investigate the effect of each urban measure on moderate-to-vigorous physical activity (MVPA) of 400 2- to 9.9-year-old children who participated in the IDEFICS study. Models were stratified by sex and age groups, i.e. pre-school children (2 to <6 years) and school children (6-9.9 years), and were adjusted for age, body mass index (BMI), education and safety concerns of parents, season and valid weartime of accelerometers. The association between intensity measures and MVPA differed strongly by network-distance, with stronger effects found for larger network-distances. The simple intensity measure yielded smaller effect estimates and poorer goodness-of-fit compared to the kernel approaches. The smallest variation in effect estimates over network-distances was found for kernel intensity measures based on isotropic and anisotropic cross-validated bandwidth selection. We found a strong variation in the association between the built environment and PA of children based on the choice of intensity measure and network-distance. Kernel intensity measures provided stable results over various scales and improved the assessment compared to the simple intensity measure. Considering different spatial scales and kernel intensity methods might reduce methodological limitations in assessing opportunities for PA in the built environment.

  15. Implementation of kernels on the Maestro processor

    NASA Astrophysics Data System (ADS)

    Suh, Jinwoo; Kang, D. I. D.; Crago, S. P.

    Currently, most microprocessors use multiple cores to increase performance while limiting power usage. Some processors use not just a few cores, but tens of cores or even 100 cores. One such many-core microprocessor is the Maestro processor, which is based on Tilera's TILE64 processor. The Maestro chip is a 49-core, general-purpose, radiation-hardened processor designed for space applications. The Maestro processor, unlike the TILE64, has a floating point unit (FPU) in each core for improved floating point performance. The Maestro processor runs at 342 MHz clock frequency. On the Maestro processor, we implemented several widely used kernels: matrix multiplication, vector add, FIR filter, and FFT. We measured and analyzed the performance of these kernels. The achieved performance was up to 5.7 GFLOPS, and the speedup compared to single tile was up to 49 using 49 tiles.

  16. Blending of palm oil, palm stearin and palm kernel oil in the preparation of table and pastry margarine.

    PubMed

    Norlida, H M; Md Ali, A R; Muhadhir, I

    1996-01-01

    Palm oil (PO; iodine value = 52), palm stearin (POs1; i.v. = 32 and POs2; i.v. = 40) and palm kernel oil (PKO; i.v. = 17) were blended in ternary systems. The blends were then studied for their physical properties such as melting point (m.p.), solid fat content (SFC), and cooling curve. Results showed that palm stearin increased the blends' melting point while palm kernel oil reduced it. To produce table margarine with m.p. below 40 °C, POs1 should be added at a level of ≤16%, and POs2 at a level of ≤20%. At 10 °C, a eutectic interaction occurs between PO and PKO, reaching its maximum at about a 60:40 blending ratio. Within the eutectic region, to keep the SFC at 10 °C at ≤50%, POs1 may be added at a level of ≤7% and POs2 at a level of ≤12%. The addition of palm stearin increased the blends' solidification Tmin and Tmax values, while PKO reduced them. Blends containing a high amount of palm stearin showed melting points and cooling curves quite similar to those of pastry margarine.

  17. SOME ENGINEERING PROPERTIES OF SHELLED AND KERNEL TEA (Camellia sinensis) SEEDS.

    PubMed

    Altuntas, Ebubekir; Yildiz, Merve

    2017-01-01

    Camellia sinensis is the source of tea leaves and is an economic crop now grown around the world. Tea seed oil has been used for cooking in China and other Asian countries for more than a thousand years. Tea is the most widely consumed beverage after water in the world. It is mainly produced in Asia and central Africa and exported throughout the world. Some engineering properties (size dimensions, sphericity, volume, bulk and true densities, friction coefficient, colour characteristics and mechanical behaviour as rupture force) of shelled and kernel tea (Camellia sinensis) seeds were determined in this study. The shelled tea seeds used in this study were obtained from the East-Black Sea Tea Cooperative Institution in the city of Rize, Turkey. Shelled and kernel tea seeds were characterized as large and small sizes. The average geometric mean diameter and seed mass of the shelled tea seeds were 15.8 mm and 1.47 g for the large size and 10.7 mm and 0.49 g for the small size, while those of the kernel tea seeds were 11.8 mm and 0.97 g for the large size and 8 mm and 0.31 g for the small size, respectively. The sphericity, surface area and volume values were found to be higher for the large size than for the small size for both the shelled and kernel tea samples. The colour intensity (chroma) of the shelled tea seeds was found to be between 59.31 and 64.22 for the large size, while the chroma values of the kernel tea seeds were between 56.04 and 68.34 for the large size. The rupture force values of kernel tea seeds were higher than those of shelled tea seeds for the large size along the X axis, whereas for large shelled tea seeds the rupture force along the X axis was higher than along the Y axis. The static coefficients of friction of shelled and kernel tea seeds, for both the large and small sizes, showed higher values on rubber than on the other friction surfaces. These engineering properties, such as geometric mean diameter, sphericity, volume, bulk and true densities, coefficient of friction, L*, a*, b* colour characteristics and rupture force of shelled and kernel tea (Camellia sinensis) seeds, will serve in the design of equipment used in postharvest treatments.

  18. Image quality of mixed convolution kernel in thoracic computed tomography.

    PubMed

    Neubauer, Jakob; Spira, Eva Maria; Strube, Juliane; Langer, Mathias; Voss, Christian; Kotter, Elmar

    2016-11-01

    The mixed convolution kernel alters its properties spatially according to the depicted organ structure, especially for the lung. Therefore, we compared the image quality of the mixed convolution kernel to standard soft and hard kernel reconstructions for different organ structures in thoracic computed tomography (CT) images. Our Ethics Committee approved this prospective study. In total, 31 patients who underwent contrast-enhanced thoracic CT studies were included after informed consent. Axial reconstructions were performed with hard, soft, and mixed convolution kernels. Three independent and blinded observers rated the image quality according to the European Guidelines for Quality Criteria of Thoracic CT for 13 organ structures. The observers rated the depiction of the structures in all reconstructions on a 5-point Likert scale. Statistical analysis was performed with the Friedman Test and post hoc analysis with the Wilcoxon rank-sum test. Compared to the soft convolution kernel, the mixed convolution kernel was rated with a higher image quality for lung parenchyma, segmental bronchi, and the border between the pleura and the thoracic wall (P < 0.03). Compared to the hard convolution kernel, the mixed convolution kernel was rated with a higher image quality for aorta, anterior mediastinal structures, paratracheal soft tissue, hilar lymph nodes, esophagus, pleuromediastinal border, large and medium sized pulmonary vessels and abdomen (P < 0.004) but a lower image quality for trachea, segmental bronchi, lung parenchyma, and skeleton (P < 0.001). The mixed convolution kernel cannot fully substitute the standard CT reconstructions. Hard and soft convolution kernel reconstructions still seem to be mandatory for thoracic CT.

  19. Investigating the Impact of Aerosol Deposition on Snow Melt over the Greenland Ice Sheet Using a New Kernel

    NASA Astrophysics Data System (ADS)

    Li, Y.; Flanner, M.

    2017-12-01

    Accelerating surface melt on the Greenland Ice Sheet (GrIS) has led to a doubling of Greenland's contribution to global sea level rise during recent decades. The darkening effect due to black carbon (BC), dust, and other light absorbing impurities (LAI) enhances snow melt by boosting its absorption of solar energy. It is therefore important for coupled aerosol-climate and ice sheet models to include snow darkening effects from LAI, and yet most do not. In this study, we develop an aerosol deposition—snow melt kernel based on the Community Earth System Model (CESM) to investigate changes in melt flux due to variations in the amount and timing of aerosol deposition on the GrIS. The Community Land Model (CLM) component of CESM is driven with a large range of aerosol deposition fluxes to determine non-linear relationships between melt perturbation and deposition amount occurring in different months and locations (thereby capturing variations in base state associated with elevation and latitude). The kernel product will include climatological-mean effects and standard deviations associated with interannual variability. Finally, the kernel will allow aerosol deposition fluxes from any global or regional aerosol model to be translated into surface melt perturbations of the GrIS, thus extending the utility of state-of-the-art aerosol models.

  20. Mathematical inference in one point microrheology

    NASA Astrophysics Data System (ADS)

    Hohenegger, Christel; McKinley, Scott

    2016-11-01

    Pioneered by the work of Mason and Weitz, one point passive microrheology has been successfully applied to obtain estimates of the loss and storage modulus of viscoelastic fluids when the mean-square displacement obeys a local power law. Using numerical simulations of a fluctuating viscoelastic fluid model, we study the problem of recovering the mechanical parameters of the fluid's memory kernel using statistics such as mean-square displacements and increment auto-correlation functions. Seeking a better understanding of the influence of the assumptions made in the inversion process, we mathematically quantify the uncertainty in traditional one point microrheology for simulated data and demonstrate that a large family of memory kernels yields the same statistical signature. We consider simulated data obtained both from a full viscoelastic fluid simulation of the unsteady Stokes equations with fluctuations and from a Generalized Langevin Equation for the particle's motion with the same memory kernel. From the theory of inverse problems, we propose an alternative method that can be used to recover information about the loss and storage modulus and discuss its limitations and uncertainties. NSF-DMS 1412998.
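
    The two trajectory statistics named above are short computations. The sketch below evaluates the mean-square displacement and the increment autocorrelation for a simulated 1-D path; a plain Brownian trajectory is used as a stand-in for output of a Generalized Langevin Equation, so the numerical values are only for illustration.

```python
# Mean-square displacement and increment autocorrelation from a simulated
# 1-D trajectory (plain Brownian stand-in for a GLE-driven particle path).
import numpy as np

def msd(x, max_lag):
    return np.array([np.mean((x[lag:] - x[:-lag]) ** 2) for lag in range(1, max_lag)])

def increment_autocorr(x, max_lag):
    dx = np.diff(x)
    dx = dx - dx.mean()
    n, var = len(dx), np.diff(x).var()
    return np.array([np.mean(dx[lag:] * dx[:n - lag]) / var for lag in range(max_lag)])

rng = np.random.default_rng(2)
x = np.cumsum(rng.standard_normal(100_000))
print(np.round(msd(x, 5), 2))                  # ~ [1, 2, 3, 4] for Brownian motion
print(np.round(increment_autocorr(x, 4), 3))   # ~ [1, 0, 0, 0]: uncorrelated steps
```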

  1. Effect of black point on accuracy of LCD displays colorimetric characterization

    NASA Astrophysics Data System (ADS)

    Li, Tong; Xie, Kai; He, Nannan; Ye, Yushan

    2018-03-01

    Black point is the point at which the digital drive values of the RGB channels are all 0. Due to light leakage in liquid-crystal displays (LCDs), the luminance at the black point is not 0; this phenomenon introduces errors into the colorimetric characterization of LCDs, especially at low luminance drive values, where the effect on sampling is greater. This paper describes the accuracy of the polynomial-model characterization method and the effect of the black point on that accuracy, reported as color difference. When the black point is taken into account in the characterization equations, the maximum color difference is 3.246, which is 2.36 lower than when the black point is not considered. The experimental results show that the accuracy of LCD colorimetric characterization can be improved if the effect of the black point is eliminated properly.
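
    A common way to account for the black point, sketched below, is to subtract the measured black-point tristimulus values from every measurement before fitting the characterization model. The example uses a linear model and synthetic data for brevity; a real characterization would use a higher-order polynomial of the RGB drive values, and all matrices and values here are made up.

```python
# Minimal illustration of black-point handling in display characterization:
# subtract the measured black-point XYZ before fitting, so residual light
# leakage does not bias the fitted model.  (Synthetic data, linear model.)
import numpy as np

rng = np.random.default_rng(0)
rgb = rng.uniform(0, 1, (50, 3))                      # normalized drive values
M_true = np.array([[41.2, 35.8, 18.0],
                   [21.3, 71.5,  7.2],
                   [ 1.9, 11.9, 95.0]])               # assumed primary matrix
black = np.array([0.4, 0.42, 0.6])                    # leakage XYZ at drive (0,0,0)
xyz_meas = rgb @ M_true.T + black                     # measured tristimulus values

M_corr, *_ = np.linalg.lstsq(rgb, xyz_meas - black, rcond=None)   # black removed
M_naive, *_ = np.linalg.lstsq(rgb, xyz_meas, rcond=None)          # black ignored
print(np.abs(M_corr.T - M_true).max(), np.abs(M_naive.T - M_true).max())
```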

  2. Scanning Apollo Flight Films and Reconstructing CSM Trajectories

    NASA Astrophysics Data System (ADS)

    Speyerer, E.; Robinson, M. S.; Grunsfeld, J. M.; Locke, S. D.; White, M.

    2006-12-01

    Over thirty years ago, the astronauts of the Apollo program made the journey from the Earth to the Moon and back. To record their historic voyages and collect scientific observations, many thousands of photographs were acquired with handheld and automated cameras. After returning to Earth, these films were developed and stored at the film archive at Johnson Space Center (JSC), where they still reside. Due to the historical significance of the original flight films, typically only duplicate (2nd or 3rd generation) film products are studied and used to make prints. To allow full access to the original flight films for both researchers and the general public, JSC and Arizona State University are scanning and creating an online digital archive. A Leica photogrammetric scanner is being used to ensure geometric and radiometric fidelity. Scanning resolution will preserve the grain of the film. Color frames are being scanned and archived as 48 bit pixels to ensure capture of the full dynamic range of the film (16 bit for BW). The raw scans will consist of 70 Terabytes of data (approximately 10,000 B&W Hasselblad, 10,000 color Hasselblad, 10,000 metric, 4,500 panoramic, and 620 35 mm frames; counts are estimates). All the scanned films will be made available for download through a searchable database. Special tools are being developed to locate images based on various search parameters. To geolocate metric and panoramic frames acquired during Apollos 15-17, prototype SPICE kernels are being generated from existing photographic support data by entering state vectors and timestamps from multiple points throughout each orbit into the NAIF toolkit to create a type 9 Spacecraft and Planet Ephemeris Kernel (SPK), a nadir-pointing C-matrix Kernel (CK), and a Spacecraft Clock Kernel (SCLK). These SPICE kernels, in addition to the Instrument Kernel (IK) and Frames Kernel (FK) that are also under development, will be archived along with the scanned images. From the generated kernels, several IDL programs have been designed to display orbital tracks, produce footprint plots, and create image projections. Using the output from these SPICE-based programs enables accurate geolocation of SIM bay photography as well as providing potential data for lunar gravitational studies.

  3. Fruit position within the canopy affects kernel lipid composition of hazelnuts.

    PubMed

    Pannico, Antonio; Cirillo, Chiara; Giaccone, Matteo; Scognamiglio, Pasquale; Romano, Raffaele; Caporaso, Nicola; Sacchi, Raffaele; Basile, Boris

    2017-11-01

    The aim of this research was to study the variability in kernel composition within the canopy of hazelnut trees. Kernel fresh and dry weight increased linearly with fruit height above the ground. Fat content decreased, while protein and ash content increased, from the bottom to the top layers of the canopy. The level of unsaturation of fatty acids decreased from the bottom to the top of the canopy. Thus, the kernels located in the bottom layers of the canopy appear to be more interesting from a nutritional point of view, but their lipids may be more exposed to oxidation. The content of different phytosterols increased progressively from bottom to top canopy layers. Most of these effects correlated with the pattern in light distribution inside the canopy. The results of this study indicate that fruit position within the canopy is an important factor in determining hazelnut kernel growth and composition. © 2017 Society of Chemical Industry.

  4. Many Molecular Properties from One Kernel in Chemical Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Raghunathan; von Lilienfeld, O. Anatole

    We introduce property-independent kernels for machine learning modeling of arbitrarily many molecular properties. The kernels encode molecular structures for training sets of varying size, as well as similarity measures sufficiently diffuse in chemical space to sample over all training molecules. Corresponding molecular reference properties provided, they enable the instantaneous generation of ML models which can systematically be improved through the addition of more data. This idea is exemplified for single-kernel based modeling of internal energy, enthalpy, free energy, heat capacity, polarizability, electronic spread, zero-point vibrational energy, energies of frontier orbitals, HOMO-LUMO gap, and the highest fundamental vibrational wavenumber. Models of these properties are trained and tested using 112 thousand organic molecules of similar size. Resulting models are discussed, as well as the kernels' use for generating and using other property models.

  5. Computing black hole partition functions from quasinormal modes

    DOE PAGES

    Arnold, Peter; Szepietowski, Phillip; Vaman, Diana

    2016-07-07

    We propose a method of computing one-loop determinants in black hole space-times (with emphasis on asymptotically anti-de Sitter black holes) that may be used for numerics when completely-analytic results are unattainable. The method utilizes the expression for one-loop determinants in terms of quasinormal frequencies determined by Denef, Hartnoll and Sachdev in [1]. A numerical evaluation must face the fact that the sum over the quasinormal modes, indexed by momentum and overtone numbers, is divergent. A necessary ingredient is then a regularization scheme to handle the divergent contributions of individual fixed-momentum sectors to the partition function. To this end, we formulate an effective two-dimensional problem in which a natural refinement of standard heat kernel techniques can be used to account for contributions to the partition function at fixed momentum. We test our method in a concrete case by reproducing the scalar one-loop determinant in the BTZ black hole background. Furthermore, we then discuss the application of such techniques to more complicated spacetimes.

  6. Numerical method for solving the nonlinear four-point boundary value problems

    NASA Astrophysics Data System (ADS)

    Lin, Yingzhen; Lin, Jinnan

    2010-12-01

    In this paper, a new reproducing kernel space is constructed skillfully in order to solve a class of nonlinear four-point boundary value problems. The exact solution of the linear problem can be expressed in the form of a series, and the approximate solution of the nonlinear problem is given by an iterative formula. Compared with known investigations, the advantages of our method are that the representation of the exact solution is obtained in a new reproducing kernel Hilbert space and that the accuracy of the numerical computation is higher. Meanwhile, we present the convergence theorem, complexity analysis and error estimation. The performance of the new method is illustrated with several numerical examples.

  7. A dose assessment method for arbitrary geometries with virtual reality in the nuclear facilities decommissioning

    NASA Astrophysics Data System (ADS)

    Chao, Nan; Liu, Yong-kuo; Xia, Hong; Ayodeji, Abiodun; Bai, Lu

    2018-03-01

    During the decommissioning of nuclear facilities, a large number of cutting and demolition activities are performed, which results in frequent changes in the structure and produces many irregular objects. In order to assess dose rates during the cutting and demolition process, a flexible dose assessment method for arbitrary geometries and radiation sources was proposed based on virtual reality technology and the Point-Kernel method. The initial geometry is designed with three-dimensional computer-aided design tools. An approximate model is built automatically in the geometric modeling process via three procedures, namely space division, rough modeling of the body, and fine modeling of the surface, all in combination with the collision detection of virtual reality technology. Point kernels are then generated by sampling within the approximate model, and once the material and radiometric attributes are input, dose rates can be calculated with the Point-Kernel method. To account for radiation scattering effects, buildup factors are calculated with the geometric-progression fitting formula. The effectiveness and accuracy of the proposed method were verified by simulations using different geometries, and the dose rate results were compared with those derived from the CIDEC code, the MCNP code and experimental measurements.
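
    The sampling step described above can be sketched very simply: scatter point kernels inside the approximate source volume and sum their attenuated, buildup-corrected contributions at a dose point. In the toy version below the volume is a box, and the attenuation coefficient, linear buildup form, and activity are placeholders, not the paper's models or data.

```python
# Sketch of the sampling step: generate point kernels inside an approximate
# source volume (a box here) and sum their contributions at a dose point.
import numpy as np

def dose_rate_from_volume(total_activity, box_min, box_max, detector,
                          mu=0.06, n_kernels=2000, seed=0):
    rng = np.random.default_rng(seed)
    pts = rng.uniform(box_min, box_max, (n_kernels, 3))      # sampled point kernels
    s = total_activity / n_kernels                           # activity per kernel
    r = np.linalg.norm(pts - detector, axis=1)               # distance, cm
    buildup = 1.0 + mu * r                                   # placeholder B(mu*r)
    return float(np.sum(s * buildup * np.exp(-mu * r) / (4 * np.pi * r**2)))

print(dose_rate_from_volume(1e10, [0, 0, 0], [50, 50, 100], np.array([200.0, 25, 50])))
```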

  8. Approximate l-fold cross-validation with Least Squares SVM and Kernel Ridge Regression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, Richard E; Zhang, Hao; Parker, Lynne Edwards

    2013-01-01

    Kernel methods have difficulties scaling to large modern data sets. The scalability issues are based on computational and memory requirements for working with a large matrix. These requirements have been addressed over the years by using low-rank kernel approximations or by improving the solvers' scalability. However, Least Squares Support Vector Machines (LS-SVM), a popular SVM variant, and Kernel Ridge Regression still have several scalability issues. In particular, the O(n^3) computational complexity for solving a single model, and the overall computational complexity associated with tuning hyperparameters, are still major problems. We address these problems by introducing an O(n log n) approximate l-fold cross-validation method that uses a multi-level circulant matrix to approximate the kernel. In addition, we prove our algorithm's computational complexity and present empirical runtimes on data sets with approximately 1 million data points. We also validate our approximate method's effectiveness at selecting hyperparameters on real world and standard benchmark data sets. Lastly, we provide experimental results on using a multi-level circulant kernel approximation to solve LS-SVM problems with hyperparameters selected using our method.

  9. Electron beam lithographic modeling assisted by artificial intelligence technology

    NASA Astrophysics Data System (ADS)

    Nakayamada, Noriaki; Nishimura, Rieko; Miura, Satoru; Nomura, Haruyuki; Kamikubo, Takashi

    2017-07-01

    We propose a new concept of tuning a point-spread function (a "kernel" function) in the modeling of electron beam lithography using the machine learning scheme. Normally in artificial intelligence work, researchers focus on the output of a neural network, such as the success ratio in image recognition or improved production yield. In this work, we put more focus on the weights connecting the nodes in a convolutional neural network, which are naturally the fractions of a point-spread function, and take out those weighted fractions after learning to be utilized as a tuned kernel. Proof-of-concept of the kernel tuning has been demonstrated using the examples of proximity effect correction with a 2-layer network and charging effect correction with a 3-layer network. This new type of tuning method can give researchers more insight for building a better model, although it may be too early to deploy it to production to deliver better critical dimension (CD) and positional accuracy immediately.

  10. Multiple kernel SVR based on the MRE for remote sensing water depth fusion detection

    NASA Astrophysics Data System (ADS)

    Wang, Jinjin; Ma, Yi; Zhang, Jingyu

    2018-03-01

    Remote sensing is an important means of water depth detection in coastal shallow waters and reef areas. Support vector regression (SVR) is a machine learning method widely used for data regression. In this paper, SVR is applied to multispectral remote sensing bathymetry. Since the single-kernel SVR method shows large errors in shallow water depth inversion, the mean relative error (MRE) at different water depths is used as a decision fusion factor for the single-kernel SVR models, and a multi-kernel SVR fusion method based on the MRE is put forward. Taking the North Island of the Xisha Islands in China as the experimental area, comparison experiments with the single-kernel SVR method and the traditional multi-band bathymetric method were carried out. The results show that: 1) in the range of 0 to 25 m, the mean absolute error (MAE) of the multi-kernel SVR fusion method is 1.5 m and the MRE is 13.2%; 2) compared with the four single-kernel SVR methods, the MRE of the fusion method is reduced by 1.2% (1.9%) to 3.4% (1.8%), and compared with the traditional multi-band method the MRE is reduced by 1.9%; 3) in the 0-5 m depth section, compared with the single-kernel methods and the multi-band method, the MRE of the fusion method is reduced by 13.5% to 44.4%, and the distribution of points is more concentrated around the line y = x.
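
    The fusion idea above can be sketched compactly: train SVRs with different kernels, score each by its mean relative error on a validation split, and combine the predictions with weights proportional to the inverse MRE. The scikit-learn example below uses synthetic reflectance/depth data, global (not depth-binned) MREs, and arbitrary hyperparameters, so it only illustrates the weighting scheme, not the paper's exact procedure.

```python
# Compact sketch of MRE-based fusion: several single-kernel SVRs, each scored
# by mean relative error on a validation split, then fused with 1/MRE weights.
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
bands = rng.uniform(0.01, 0.3, (400, 4))                      # multispectral bands
depth = 25 * bands[:, 0] / (bands[:, 1] + 0.05) + rng.normal(0, 0.5, 400)
Xtr, Xval, ytr, yval = train_test_split(bands, depth, random_state=0)

models, mres = [], []
for kern in ("rbf", "linear", "poly", "sigmoid"):
    m = SVR(kernel=kern, C=10.0).fit(Xtr, ytr)
    pred = m.predict(Xval)
    mres.append(np.mean(np.abs(pred - yval) / np.maximum(np.abs(yval), 1e-6)))
    models.append(m)

w = 1.0 / np.array(mres); w /= w.sum()                         # inverse-MRE weights
fused = sum(wi * m.predict(Xval) for wi, m in zip(w, models))
fused_mre = np.mean(np.abs(fused - yval) / np.maximum(np.abs(yval), 1e-6))
print(np.round(mres, 3), round(float(fused_mre), 3))
```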

  11. Effects of sample size on KERNEL home range estimates

    USGS Publications Warehouse

    Seaman, D.E.; Millspaugh, J.J.; Kernohan, Brian J.; Brundige, Gary C.; Raedeke, Kenneth J.; Gitzen, Robert A.

    1999-01-01

    Kernel methods for estimating home range are being used increasingly in wildlife research, but the effect of sample size on their accuracy is not known. We used computer simulations of 10-200 points/home range and compared accuracy of home range estimates produced by fixed and adaptive kernels with the reference (REF) and least-squares cross-validation (LSCV) methods for determining the amount of smoothing. Simulated home ranges varied from simple to complex shapes created by mixing bivariate normal distributions. We used the size of the 95% home range area and the relative mean squared error of the surface fit to assess the accuracy of the kernel home range estimates. For both measures, the bias and variance approached an asymptote at about 50 observations/home range. The fixed kernel with smoothing selected by LSCV provided the least-biased estimates of the 95% home range area. All kernel methods produced similar surface fit for most simulations, but the fixed kernel with LSCV had the lowest frequency and magnitude of very poor estimates. We reviewed 101 papers published in The Journal of Wildlife Management (JWM) between 1980 and 1997 that estimated animal home ranges. A minority of these papers used nonparametric utilization distribution (UD) estimators, and most did not adequately report sample sizes. We recommend that home range studies using kernel estimates use LSCV to determine the amount of smoothing, obtain a minimum of 30 observations per animal (but preferably ≥50), and report sample sizes in published results.
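
    The sample-size sensitivity discussed above is easy to reproduce in miniature: estimate a 95% fixed-kernel home range from n simulated locations and watch the area estimate stabilise as n grows. The sketch below uses scipy's Gaussian KDE with its default reference-style bandwidth rather than LSCV, and all locations, grid sizes, and units are assumptions for illustration.

```python
# Illustration of sample-size sensitivity: 95% fixed-kernel home-range area
# from n simulated locations, using a Gaussian KDE with its default
# (reference-style) bandwidth -- not LSCV.
import numpy as np
from scipy.stats import gaussian_kde

def kernel_home_range_area(locs, half_width=6.0, n_grid=200):
    kde = gaussian_kde(locs.T)
    g = np.linspace(-half_width, half_width, n_grid)
    xx, yy = np.meshgrid(g, g)
    dens = kde(np.vstack([xx.ravel(), yy.ravel()]))
    cell = (g[1] - g[0]) ** 2
    order = np.argsort(dens)[::-1]                  # highest-density cells first
    cum = np.cumsum(dens[order]) * cell             # cumulative probability
    n_cells = np.searchsorted(cum, 0.95) + 1
    return n_cells * cell                           # area of the 95% isopleth (km^2)

rng = np.random.default_rng(0)
for n in (10, 30, 50, 200):
    locs = rng.multivariate_normal([0, 0], [[1, 0.3], [0.3, 1]], size=n)
    print(n, round(kernel_home_range_area(locs), 2))
```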

  12. Transient and asymptotic behaviour of the binary breakage problem

    NASA Astrophysics Data System (ADS)

    Mantzaris, Nikos V.

    2005-06-01

    The general binary breakage problem with power-law breakage functions and two families of symmetric and asymmetric breakage kernels is studied in this work. A useful transformation leads to an equation that predicts self-similar solutions in its asymptotic limit and offers explicit knowledge of the mean size and particle density at each point in dimensionless time. A novel moving boundary algorithm in the transformed coordinate system is developed, allowing the accurate prediction of the full transient behaviour of the system from the initial condition up to the point where self-similarity is achieved, and beyond if necessary. The numerical algorithm is very rapid and its results are in excellent agreement with known analytical solutions. In the case of the symmetric breakage kernels only unimodal, self-similar number density functions are obtained asymptotically for all parameter values and independent of the initial conditions, while in the case of asymmetric breakage kernels, bimodality appears for high degrees of asymmetry and sharp breakage functions. For symmetric and discrete breakage kernels, self-similarity is not achieved. The solution exhibits sustained oscillations with amplitude that depends on the initial condition and the sharpness of the breakage mechanism, while the period is always fixed and equal to ln 2 with respect to dimensionless time.

  13. Suitability of point kernel dose calculation techniques in brachytherapy treatment planning

    PubMed Central

    Lakshminarayanan, Thilagam; Subbaiah, K. V.; Thayalan, K.; Kannan, S. E.

    2010-01-01

    A brachytherapy treatment planning system (TPS) is necessary to estimate the dose to the target volume and organs at risk (OAR). A TPS is always recommended to account for the effects of the tissue, applicator and shielding material heterogeneities that exist in applicators. However, most brachytherapy TPS software packages estimate the absorbed dose at a point, taking care of only the contributions of individual sources and the source distribution, neglecting the dose perturbations arising from the applicator design and construction. There is some degree of uncertainty in dose rate estimation under realistic clinical conditions. In this regard, an attempt is made to explore the suitability of point kernels for brachytherapy dose rate calculations and to develop a new interactive brachytherapy package, named BrachyTPS, to suit clinical conditions. BrachyTPS is an interactive point kernel code package developed to perform independent dose rate calculations by taking into account the effect of these heterogeneities, using the two-region build-up factors proposed by Kalos. The primary aim of this study is to validate the developed point kernel code package, integrated with treatment planning computational systems, against Monte Carlo (MC) results. In the present work, three brachytherapy applicators commonly used in the treatment of uterine cervical carcinoma, namely (i) the Board of Radiation Isotope and Technology (BRIT) low dose rate (LDR) applicator, (ii) the Fletcher Green type LDR applicator and (iii) the Fletcher-Williamson high dose rate (HDR) applicator, are studied to test the accuracy of the software. Dose rates computed using the developed code are compared with the relevant results of the MC simulations. Further, attempts are also made to study the dose rate distribution around the commercially available shielded vaginal applicator set (Nucletron). The percentage deviations of BrachyTPS-computed dose rate values from the MC results are observed to be within ±5.5% for the BRIT LDR applicator, vary from 2.6 to 5.1% for the Fletcher Green type LDR applicator, and are up to −4.7% for the Fletcher-Williamson HDR applicator. The isodose distribution plots also show good agreement with results from the previous literature. The isodose distributions around the shielded vaginal cylinder computed using the BrachyTPS code show better agreement (less than two per cent deviation) with MC results in the unshielded region compared to the shielded region, where deviations of up to five per cent are observed. The present study implies that accurate and fast validation of complicated treatment planning calculations is possible with the point kernel code package. PMID:20589118

  14. SU-E-T-423: Fast Photon Convolution Calculation with a 3D-Ideal Kernel On the GPU

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moriya, S; Sato, M; Tachibana, H

    Purpose: The calculation time is a trade-off for improving the accuracy of convolution dose calculation with fine calculation spacing of the KERMA kernel. We investigated accelerating the convolution calculation using an ideal kernel on a graphics processing unit (GPU). Methods: The calculation was performed on AMD graphics hardware (dual FirePro D700), and our algorithm was implemented using Aparapi, which converts Java bytecode to OpenCL. The dose calculation was separated into TERMA and KERMA steps, and the dose deposited at each coordinate (x, y, z) was determined in the process. In the dose calculation running on the central processing unit (CPU), an Intel Xeon E5, the calculation loops were performed over all calculation points. In the GPU computation, all of the calculation processes for the points were sent to the GPU and multi-threaded computation was performed. In this study, the dose calculation was performed in a water-equivalent homogeneous phantom with 150³ voxels (2 mm calculation grid); the calculation speed on the GPU was compared with that on the CPU, and the accuracy of the PDD was assessed. Results: The calculation times for the GPU and the CPU were 3.3 s and 4.4 hours, respectively; the GPU was thus about 4800 times faster than the CPU. The PDD curve for the GPU matched that for the CPU exactly. Conclusion: The convolution calculation with the ideal kernel on the GPU was clinically acceptable in terms of time and may be more accurate in inhomogeneous regions. Intensity modulated arc therapy needs dose calculations for different gantry angles at many control points, so it would be more practical to use a coarser kernel spacing if the calculation is faster while keeping accuracy similar to that of a current treatment planning system.
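    A minimal CPU-side sketch of the superposition/convolution step described above, assuming a homogeneous phantom, a toy TERMA distribution and a made-up isotropic kernel; the study's ideal kernel and GPU implementation are not reproduced here.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    shape = (60, 60, 60)                       # coarse stand-in for the 150^3, 2 mm grid
    terma = np.zeros(shape)
    depth_atten = np.exp(-0.005 * 2.0 * np.arange(shape[0]))
    terma[:, 28:32, 28:32] = depth_atten[:, None, None]   # crude narrow beam along axis 0

    z, y, x = np.indices((21, 21, 21)) - 10
    r = np.sqrt(x**2 + y**2 + z**2) + 0.5      # avoid division by zero at the origin
    kernel = np.exp(-0.8 * r) / r**2           # hypothetical energy-deposition kernel
    kernel /= kernel.sum()

    dose = fftconvolve(terma, kernel, mode="same")   # dose = TERMA (*) kernel
    ```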

  15. Power Series Approximation for the Correlation Kernel Leading to Kohn-Sham Methods Combining Accuracy, Computational Efficiency, and General Applicability

    NASA Astrophysics Data System (ADS)

    Erhard, Jannis; Bleiziffer, Patrick; Görling, Andreas

    2016-09-01

    A power series approximation for the correlation kernel of time-dependent density-functional theory is presented. Using this approximation in the adiabatic-connection fluctuation-dissipation (ACFD) theorem leads to a new family of Kohn-Sham methods. The new methods yield reaction energies and barriers of unprecedented accuracy and enable a treatment of static (strong) correlation with an accuracy of high-level multireference configuration interaction methods but are single-reference methods allowing for a black-box-like handling of static correlation. The new methods exhibit a better scaling of the computational effort with the system size than rivaling wave-function-based electronic structure methods. Moreover, the new methods do not suffer from the problem of singularities in response functions plaguing previous ACFD methods and therefore are applicable to any type of electronic system.

  16. Coronary Stent Artifact Reduction with an Edge-Enhancing Reconstruction Kernel - A Prospective Cross-Sectional Study with 256-Slice CT.

    PubMed

    Tan, Stéphanie; Soulez, Gilles; Diez Martinez, Patricia; Larrivée, Sandra; Stevens, Louis-Mathieu; Goussard, Yves; Mansour, Samer; Chartrand-Lefebvre, Carl

    2016-01-01

    Metallic artifacts can result in an artificial thickening of the coronary stent wall which can significantly impair computed tomography (CT) imaging in patients with coronary stents. The objective of this study is to assess in vivo visualization of coronary stent wall and lumen with an edge-enhancing CT reconstruction kernel, as compared to a standard kernel. This is a prospective cross-sectional study involving the assessment of 71 coronary stents (24 patients), with blinded observers. After 256-slice CT angiography, image reconstruction was done with medium-smooth and edge-enhancing kernels. Stent wall thickness was measured with both orthogonal and circumference methods, averaging thickness from diameter and circumference measurements, respectively. Image quality was assessed quantitatively using objective parameters (noise, signal to noise (SNR) and contrast to noise (CNR) ratios), as well as visually using a 5-point Likert scale. Stent wall thickness was decreased with the edge-enhancing kernel in comparison to the standard kernel, either with the orthogonal (0.97 ± 0.02 versus 1.09 ± 0.03 mm, respectively; p<0.001) or the circumference method (1.13 ± 0.02 versus 1.21 ± 0.02 mm, respectively; p = 0.001). The edge-enhancing kernel generated less overestimation from nominal thickness compared to the standard kernel, both with the orthogonal (0.89 ± 0.19 versus 1.00 ± 0.26 mm, respectively; p<0.001) and the circumference (1.06 ± 0.26 versus 1.13 ± 0.31 mm, respectively; p = 0.005) methods. The edge-enhancing kernel was associated with lower SNR and CNR, as well as higher background noise (all p < 0.001), in comparison to the medium-smooth kernel. Stent visual scores were higher with the edge-enhancing kernel (p<0.001). In vivo 256-slice CT assessment of coronary stents shows that the edge-enhancing CT reconstruction kernel generates thinner stent walls, less overestimation from nominal thickness, and better image quality scores than the standard kernel.

  17. An Experimental Study of the Growth of Laser Spark and Electric Spark Ignited Flame Kernels.

    NASA Astrophysics Data System (ADS)

    Ho, Chi Ming

    1995-01-01

    Better ignition sources are constantly in demand for enhancing the spark ignition in practical applications such as automotive and liquid rocket engines. In response to this practical challenge, the present experimental study was conducted with the major objective of obtaining a better understanding of how spark formation, and hence spark characteristics, affect flame kernel growth. Two laser sparks and one electric spark were studied in air, propane-air, propane-air-nitrogen, methane-air, and methane-oxygen mixtures that were initially at ambient pressure and temperature. The growth of the kernels was monitored by imaging the kernels with shadowgraph systems, and by imaging the planar laser-induced fluorescence of the hydroxyl radicals inside the kernels. Characteristic dimensions and kernel structures were obtained from these images. Because different energy transfer mechanisms are involved in the formation of a laser spark as compared to an electric spark, a laser spark is insensitive to changes in mixture ratio and mixture type, while an electric spark is sensitive to changes in both. The detailed structures of the kernels in air and propane-air mixtures primarily depend on the spark characteristics. But the combustion heat released rapidly in methane-oxygen mixtures significantly modifies the kernel structure. Uneven spark energy distribution causes remarkably asymmetric kernel structure. The breakdown energy of a spark creates a blast wave that shows good agreement with the numerical point blast solution, and a succeeding complex spark-induced flow that agrees reasonably well with a simple puff model. The transient growth rates of the propane-air, propane-air-nitrogen, and methane-air flame kernels can be interpreted in terms of spark effects, flame stretch, and preferential diffusion. For a given mixture, a spark with higher breakdown energy produces a greater and longer-lasting enhancing effect on the kernel growth rate. By comparing the growth rates of the appropriate mixtures, the positive and negative effects of preferential diffusion and flame stretch on the developing flame are clearly demonstrated.
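    For context, the classical point-blast similarity scaling is recalled below as a hedged reference form; the study itself compares against a numerical point-blast solution, so this closed-form Sedov-Taylor scaling is assumed here only as an order-of-magnitude check.

    ```latex
    % Point-blast similarity scaling for the spark-driven blast wave radius R(t),
    % with breakdown energy E_b, ambient density rho_0 and an O(1) constant xi_0:
    R(t) \simeq \xi_{0} \left( \frac{E_{\mathrm{b}}\, t^{2}}{\rho_{0}} \right)^{1/5}
    ```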

  18. Observation of a 3D Magnetic Null Point

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Romano, P.; Falco, M.; Guglielmino, S. L.

    2017-03-10

    We describe high-resolution observations of a GOES B-class flare characterized by a circular ribbon at the chromospheric level, corresponding to the network at the photospheric level. We interpret the flare as a consequence of a magnetic reconnection event that occurred at a three-dimensional (3D) coronal null point located above the supergranular cell. The potential field extrapolation of the photospheric magnetic field indicates that the circular chromospheric ribbon is cospatial with the fan footpoints, while the ribbons of the inner and outer spines look like compact kernels. We found new interesting observational aspects that need to be explained by models: (1) a loop corresponding to the outer spine became brighter a few minutes before the onset of the flare; (2) the circular ribbon was formed by several adjacent compact kernels characterized by a size of 1″–2″; (3) the kernels with a stronger intensity emission were located at the outer footpoint of the darker filaments, departing radially from the center of the supergranular cell; (4) these kernels started to brighten sequentially in clockwise direction; and (5) the site of the 3D null point and the shape of the outer spine were detected by RHESSI in the low-energy channel between 6.0 and 12.0 keV. Taking into account all these features and the length scales of the magnetic systems involved in the event, we argue that the low intensity of the flare may be ascribed to the low amount of magnetic flux and to its symmetric configuration.

  19. Introducing etch kernels for efficient pattern sampling and etch bias prediction

    NASA Astrophysics Data System (ADS)

    Weisbuch, François; Lutich, Andrey; Schatz, Jirka

    2018-01-01

    Successful patterning requires good control of the photolithography and etch processes. While compact litho models, mainly based on rigorous physics, can predict very well the contours printed in photoresist, purely empirical etch models are less accurate and more unstable. Compact etch models are based on geometrical kernels to compute the litho-etch biases that measure the distance between litho and etch contours. The definition of the kernels, as well as the choice of calibration patterns, is critical to obtaining a robust etch model. This work proposes to define a set of independent and anisotropic etch kernels ("internal", "external", "curvature", "Gaussian", "z_profile") designed to represent the finest details of the resist geometry and to characterize precisely the etch bias at any point along a resist contour. By evaluating the etch kernels on various structures, it is possible to map their etch signatures in a multidimensional space and analyze them to find an optimal sampling of structures. The etch kernels evaluated on these structures were combined with experimental etch bias derived from scanning electron microscope contours to train artificial neural networks to predict etch bias. The method applied to contact and line/space layers shows an improvement in etch model prediction accuracy over a standard etch model. This work emphasizes the importance of the etch kernel definition to characterize and predict complex etch effects.

  20. Proteome analysis of the almond kernel (Prunus dulcis).

    PubMed

    Li, Shugang; Geng, Fang; Wang, Ping; Lu, Jiankang; Ma, Meihu

    2016-08-01

    Almond (Prunus dulcis) is a popular tree nut worldwide and offers many benefits to human health. However, the importance of almond kernel proteins in nutrition and human health requires further evaluation. The present study presents a systematic evaluation of the proteins in the almond kernel using proteomic analysis. The nutrient and amino acid content in almond kernels from Xinjiang is similar to that of American varieties; however, Xinjiang varieties have a higher protein content. Two-dimensional electrophoresis analysis demonstrated a wide distribution of molecular weights and isoelectric points of almond kernel proteins. A total of 434 proteins were identified by LC-MS/MS, and most were proteins that were experimentally confirmed for the first time. Gene ontology (GO) analysis of the 434 proteins indicated that they are mainly involved in metabolic processes (67.5%), cellular processes (54.1%), and single-organism processes (43.4%); that their main molecular functions are catalytic activity (48.0%), binding (45.4%), and structural molecule activity (11.9%); and that they are primarily distributed in the cell (59.9%), organelle (44.9%), and membrane (22.8%). The almond kernel is a source of a wide variety of proteins. This study provides important information contributing to the screening and identification of almond proteins, the understanding of almond protein function, and the development of almond protein products. © 2015 Society of Chemical Industry.

  1. Assessment of the microbiological safety of edible roasted nut kernels on retail sale in England, with a focus on Salmonella.

    PubMed

    Little, C L; Jemmott, W; Surman-Lee, S; Hucklesby, L; de Pinnal, E

    2009-04-01

    There is little published information on the prevalence of Salmonella in edible nut kernels. A study in early 2008 of edible roasted nut kernels on retail sale in England was undertaken to assess the microbiological safety of this product. A total of 727 nut kernel samples of different varieties were examined. Overall, Salmonella and Escherichia coli were detected in 0.2 and 0.4% of edible roasted nut kernels, respectively. Of the nut varieties examined, Salmonella Havana was detected in 1 (4.0%) sample of pistachio nuts, indicating a risk to health. The United Kingdom Food Standards Agency was immediately informed, and full investigations were undertaken. Further examination established the contamination to be associated with the pistachio kernels and not the partly opened shells. Salmonella was not detected in other varieties tested (almonds, Brazils, cashews, hazelnuts, macadamia, peanuts, pecans, pine nuts, and walnuts). E. coli was found at low levels (range of 3.6 to 4/g) in walnuts (1.4%), almonds (1.2%), and Brazils (0.5%). The presence of Salmonella is unacceptable in edible nut kernels. Prevention of microbial contamination in these products lies in the application of good agricultural, manufacturing, and storage practices together with a hazard analysis and critical control points system that encompasses all stages of production, processing, and distribution.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin, Zheming; Yoshii, Kazutomo; Finkel, Hal

    Open Computing Language (OpenCL) is a high-level language that enables software programmers to explore Field Programmable Gate Arrays (FPGAs) for application acceleration. The Intel FPGA software development kit (SDK) for OpenCL allows a user to specify applications at a high level and explore the performance of low-level hardware acceleration. In this report, we present the FPGA performance and power consumption results of the single-precision floating-point vector add OpenCL kernel using the Intel FPGA SDK for OpenCL on the Nallatech 385A FPGA board. The board features an Arria 10 FPGA. We evaluate the FPGA implementations using the compute unit duplication and kernel vectorization optimization techniques. On the Nallatech 385A FPGA board, the maximum compute kernel bandwidth we achieve is 25.8 GB/s, approximately 76% of the peak memory bandwidth. The power consumption of the FPGA device when running the kernels ranges from 29W to 42W.

  3. Sensitivity Kernels for the Cross-Convolution Measure: Eliminate the Source in Waveform Tomography

    NASA Astrophysics Data System (ADS)

    Menke, W. H.

    2017-12-01

    We use the adjoint method to derive sensitivity kernels for the cross-convolution measure, a goodness-of-fit criterion that is applicable to seismic data containing closely spaced multiple arrivals, such as reverberating compressional waves and split shear waves. In addition to a general formulation, specific expressions for sensitivity with respect to density, Lamé parameter and shear modulus are derived for an isotropic elastic solid. As is typical of adjoint methods, the kernels depend upon an adjoint field, the source of which, in this case, is the reference displacement field pre-multiplied by a matrix of cross-correlations of components of the observed field. We use a numerical simulation to evaluate the resolving power of a tomographic inversion that employs the cross-convolution measure. The estimated resolving kernel is point-like, indicating that the cross-convolution measure will perform well in waveform tomography settings.

  4. Kernel-Phase Interferometry for Super-Resolution Detection of Faint Companions

    NASA Astrophysics Data System (ADS)

    Factor, Samuel M.; Kraus, Adam L.

    2017-01-01

    Direct detection of close-in companions (exoplanets or binary systems) is notoriously difficult. While coronagraphs and point spread function (PSF) subtraction can be used to reduce contrast and dig out signals of companions under the PSF, there are still significant limitations in separation and contrast. Non-redundant aperture masking (NRM) interferometry can be used to detect companions well inside the PSF of a diffraction limited image, though the mask discards ~95% of the light gathered by the telescope and thus the technique is severely flux limited. Kernel-phase analysis applies interferometric techniques similar to NRM to a diffraction limited image utilizing the full aperture. Instead of non-redundant closure-phases, kernel-phases are constructed from a grid of points on the full aperture, simulating a redundant interferometer. I have developed my own faint companion detection pipeline which utilizes a Bayesian analysis of kernel-phases. I have used this pipeline to search for new companions in archival images from HST/NICMOS in order to constrain planet and binary formation models at separations inaccessible to previous techniques. Using this method, it is possible to detect a companion well within the classical λ/D Rayleigh diffraction limit using a fraction of the telescope time needed for NRM. This technique can easily be applied to archival data as no mask is needed and will thus make the detection of close-in companions cheap and simple as no additional observations are needed. Since the James Webb Space Telescope (JWST) will be able to perform NRM observations, further development and characterization of kernel-phase analysis will allow efficient use of highly competitive JWST telescope time.
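    A toy numpy sketch of the kernel-phase construction referred to above: build the redundant baseline phase-transfer matrix A of a small pupil model, take the left null space of A as the kernel operator K, and check that K projects out pupil-plane (instrumental) phase errors. The pupil model, tolerances and sizes are illustrative assumptions, not the pipeline's actual discretization.

    ```python
    import numpy as np

    # toy 1-D pupil: 5 sub-apertures on a regular grid
    n_ap = 5
    pairs = [(i, j) for i in range(n_ap) for j in range(i + 1, n_ap)]

    # phase-transfer matrix A: baseline phase ~ phi_j - phi_i
    A = np.zeros((len(pairs), n_ap))
    for k, (i, j) in enumerate(pairs):
        A[k, i], A[k, j] = -1.0, 1.0

    # kernel operator: rows spanning the left null space of A (so K @ A = 0)
    U, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))
    K = U[:, rank:].T

    # kernel phases reject pupil-plane (instrumental) phase errors
    phi_pupil = 0.1 * np.random.randn(n_ap)      # aberrations across the pupil
    baseline_phases = A @ phi_pupil              # what the Fourier phases would see
    print(np.allclose(K @ baseline_phases, 0.0)) # True: errors are projected out
    ```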

  5. Kernel-Phase Interferometry for Super-Resolution Detection of Faint Companions

    NASA Astrophysics Data System (ADS)

    Factor, Samuel

    2016-10-01

    Direct detection of close-in companions (binary systems or exoplanets) is notoriously difficult. While coronagraphs and point spread function (PSF) subtraction can be used to reduce contrast and dig out signals of companions under the PSF, there are still significant limitations in separation and contrast. While non-redundant aperture masking (NRM) interferometry can be used to detect companions well inside the PSF of a diffraction limited image, the mask discards 95% of the light gathered by the telescope and thus the technique is severely flux limited. Kernel-phase analysis applies interferometric techniques similar to NRM while utilizing the full aperture. Instead of closure-phases, kernel-phases are constructed from a grid of points on the full aperture, simulating a redundant interferometer. I propose to develop my own faint companion detection pipeline which utilizes an MCMC analysis of kernel-phases. I will search for new companions in archival images from NIC1 and ACS/HRC in order to constrain binary and planet formation models at separations inaccessible to previous techniques. Using this method, it is possible to detect a companion well within the classical λ/D Rayleigh diffraction limit using a fraction of the telescope time needed for NRM. This technique can easily be applied to archival data as no mask is needed and will thus make the detection of close-in companions cheap and simple as no additional observations are needed. Since the James Webb Space Telescope (JWST) will be able to perform NRM observations, further development and characterization of kernel-phase analysis will allow efficient use of highly competitive JWST telescope time.

  6. Calculation of plasma dielectric response in inhomogeneous magnetic field near electron cyclotron resonance

    NASA Astrophysics Data System (ADS)

    Evstatiev, Evstati; Svidzinski, Vladimir; Spencer, Andy; Galkin, Sergei

    2014-10-01

    Full wave 3-D modeling of RF fields in a hot magnetized nonuniform plasma requires calculation of the nonlocal conductivity kernel describing the dielectric response of such plasma to the RF field. In many cases, the conductivity kernel is a localized function near the test point, which significantly simplifies the numerical solution of the full wave 3-D problem. Preliminary results of a feasibility analysis of the numerical calculation of the conductivity kernel in a 3-D hot nonuniform magnetized plasma in the electron cyclotron frequency range will be reported. This case is relevant to modeling of ECRH in ITER. The kernel is calculated by integrating the linearized Vlasov equation along the unperturbed particle orbits. Particle orbits in the nonuniform equilibrium magnetic field are calculated numerically with a Runge-Kutta method. The RF electric field is interpolated on a specified grid, on which the conductivity kernel is discretized. The resulting integrals over the particle's initial velocity and time are then evaluated numerically. Different optimization approaches for the integration are tested in this feasibility analysis. Work is supported by the U.S. DOE SBIR program.
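    A minimal sketch of the orbit-integration step mentioned above: integrate the non-relativistic Lorentz force for an electron in a hypothetical nonuniform magnetic field with a Runge-Kutta integrator. The field model, time span and tolerances are placeholders, not the study's configuration.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    Q_OVER_M = -1.759e11          # electron charge-to-mass ratio (C/kg)

    def b_field(r):
        """Hypothetical nonuniform field: straight field lines, 1/(R0 + x) falloff."""
        x, _, _ = r
        b0, r0 = 2.0, 1.0
        return np.array([0.0, 0.0, b0 * r0 / (r0 + x)])

    def lorentz(t, y):
        r, v = y[:3], y[3:]
        a = Q_OVER_M * np.cross(v, b_field(r))     # E-field omitted in this sketch
        return np.concatenate([v, a])

    y0 = np.array([0.0, 0.0, 0.0, 1.0e6, 0.0, 1.0e5])   # position (m), velocity (m/s)
    sol = solve_ivp(lorentz, (0.0, 1.0e-8), y0, method="RK45",
                    rtol=1e-8, max_step=1e-11)
    orbit = sol.y[:3]                                   # sampled unperturbed orbit
    ```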

  7. On the logarithmic-singularity correction in the kernel function method of subsonic lifting-surface theory

    NASA Technical Reports Server (NTRS)

    Lan, C. E.; Lamar, J. E.

    1977-01-01

    A logarithmic-singularity correction factor is derived for use in kernel function methods associated with Multhopp's subsonic lifting-surface theory. Because of the form of the factor, a relation was formulated between the numbers of chordwise and spanwise control points needed for good accuracy. This formulation is developed and discussed. Numerical results are given to show the improvement of the computation with the new correction factor.

  8. Evaluation of the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels using particle and heavy ion transport code system: PHITS.

    PubMed

    Shiiba, Takuro; Kuga, Naoya; Kuroiwa, Yasuyoshi; Sato, Tatsuhiko

    2017-10-01

    We assessed the accuracy of mono-energetic electron and beta-emitting isotope dose-point kernels (DPKs) calculated using the particle and heavy ion transport code system (PHITS) for patient-specific dosimetry in targeted radionuclide treatment (TRT) and compared our data with published data. All mono-energetic and beta-emitting isotope DPKs calculated using PHITS, both in water and compact bone, were in good agreement with those in literature using other MC codes. PHITS provided reliable mono-energetic electron and beta-emitting isotope scaled DPKs for patient-specific dosimetry. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Effect of Local TOF Kernel Miscalibrations on Contrast-Noise in TOF PET

    NASA Astrophysics Data System (ADS)

    Clementel, Enrico; Mollet, Pieter; Vandenberghe, Stefaan

    2013-06-01

    TOF PET imaging requires specific calibrations: accurate characterization of the system timing resolution and timing offset is required to achieve the full potential image quality. Current system models used in image reconstruction assume a spatially uniform timing resolution kernel. Furthermore, although timing offset errors are often pre-corrected, this correction becomes less accurate over time because, especially in older scanners, the timing offsets are often calibrated only at installation, as the procedure is time-consuming. In this study, we investigate and compare the effects of local mismatch of timing resolution when a uniform kernel is applied to systems with local variations in timing resolution and the effects of uncorrected time offset errors on image quality. A ring-like phantom was acquired on a Philips Gemini TF scanner and timing histograms were obtained from coincidence events to measure timing resolution along all sets of LORs crossing the scanner center. In addition, multiple acquisitions of a cylindrical phantom, 20 cm in diameter with spherical inserts, and a point source were simulated. A location-dependent timing resolution was simulated, with a median value of 500 ps and increasingly large local variations, and timing offset errors ranging from 0 to 350 ps were also simulated. Images were reconstructed with TOF MLEM with a uniform kernel corresponding to the effective timing resolution of the data, as well as with purposefully mismatched kernels. CRC versus noise curves were measured over the simulated cylinder realizations, while the simulated point source was processed to generate timing histograms of the data. Results show that timing resolution is not uniform over the FOV of the considered scanner. The simulated phantom data indicate that CRC is moderately reduced in data sets with locally varying timing resolution reconstructed with a uniform kernel, while still performing better than non-TOF reconstruction. On the other hand, uncorrected offset errors in our setup have a larger potential for decreasing image quality and can lead to a reduction of CRC of up to 15% and an increase in the measured timing resolution kernel of up to 40%. However, in realistic conditions in frequently calibrated systems, using a larger effective timing kernel in image reconstruction can compensate for uncorrected offset errors.

  10. Temporal Effects on Internal Fluorescence Emissions Associated with Aflatoxin Contamination from Corn Kernel Cross-Sections Inoculated with Toxigenic and Atoxigenic Aspergillus flavus.

    PubMed

    Hruska, Zuzana; Yao, Haibo; Kincaid, Russell; Brown, Robert L; Bhatnagar, Deepak; Cleveland, Thomas E

    2017-01-01

    Non-invasive, easy to use and cost-effective technology offers a valuable alternative for rapid detection of carcinogenic fungal metabolites, namely aflatoxins, in commodities. One relatively recent development in this area is the use of spectral technology. Fluorescence hyperspectral imaging, in particular, offers a potential rapid and non-invasive method for detecting the presence of aflatoxins in maize infected with the toxigenic fungus Aspergillus flavus. Earlier studies have shown that whole maize kernels contaminated with aflatoxins exhibit different spectral signatures from uncontaminated kernels based on the external fluorescence emission of the whole kernels. Here, the effect of time on the internal fluorescence spectral emissions from cross-sections of kernels infected with toxigenic and atoxigenic A. flavus was examined in order to elucidate the interaction between the fluorescence signals emitted by some aflatoxin contaminated maize kernels and the fungal invasion resulting in the production of aflatoxins. First, the difference in internal fluorescence emissions between cross-sections of kernels incubated in toxigenic and atoxigenic inoculum was assessed. Kernels were inoculated with each strain for 5, 7, and 9 days before cross-sectioning and imaging. There were 270 kernels (540 halves) imaged, including controls. Second, in a different set of kernels (15 kernels/group; 135 total), the germ of each kernel was separated from the endosperm to determine the major areas of aflatoxin accumulation and progression over nine growth days. Kernels were inoculated with toxigenic and atoxigenic fungal strains for 5, 7, and 9 days before the endosperm and germ were separated, followed by fluorescence hyperspectral imaging and chemical aflatoxin determination. A marked difference in fluorescence intensity was shown between the toxigenic and atoxigenic strains on day nine post-inoculation, which may be a useful indicator of the location of aflatoxin contamination. This finding suggests that both the fluorescence peak shift and intensity, as well as timing, may be essential in distinguishing toxigenic and atoxigenic fungi based on spectral features. Results also reveal a possible preferential difference in the internal colonization of maize kernels between the toxigenic and atoxigenic strains of A. flavus, suggesting a potential window for differentiating the strains based on fluorescence spectra at specific time points.

  11. Temporal Effects on Internal Fluorescence Emissions Associated with Aflatoxin Contamination from Corn Kernel Cross-Sections Inoculated with Toxigenic and Atoxigenic Aspergillus flavus

    PubMed Central

    Hruska, Zuzana; Yao, Haibo; Kincaid, Russell; Brown, Robert L.; Bhatnagar, Deepak; Cleveland, Thomas E.

    2017-01-01

    Non-invasive, easy to use and cost-effective technology offers a valuable alternative for rapid detection of carcinogenic fungal metabolites, namely aflatoxins, in commodities. One relatively recent development in this area is the use of spectral technology. Fluorescence hyperspectral imaging, in particular, offers a potential rapid and non-invasive method for detecting the presence of aflatoxins in maize infected with the toxigenic fungus Aspergillus flavus. Earlier studies have shown that whole maize kernels contaminated with aflatoxins exhibit different spectral signatures from uncontaminated kernels based on the external fluorescence emission of the whole kernels. Here, the effect of time on the internal fluorescence spectral emissions from cross-sections of kernels infected with toxigenic and atoxigenic A. flavus was examined in order to elucidate the interaction between the fluorescence signals emitted by some aflatoxin contaminated maize kernels and the fungal invasion resulting in the production of aflatoxins. First, the difference in internal fluorescence emissions between cross-sections of kernels incubated in toxigenic and atoxigenic inoculum was assessed. Kernels were inoculated with each strain for 5, 7, and 9 days before cross-sectioning and imaging. There were 270 kernels (540 halves) imaged, including controls. Second, in a different set of kernels (15 kernels/group; 135 total), the germ of each kernel was separated from the endosperm to determine the major areas of aflatoxin accumulation and progression over nine growth days. Kernels were inoculated with toxigenic and atoxigenic fungal strains for 5, 7, and 9 days before the endosperm and germ were separated, followed by fluorescence hyperspectral imaging and chemical aflatoxin determination. A marked difference in fluorescence intensity was shown between the toxigenic and atoxigenic strains on day nine post-inoculation, which may be a useful indicator of the location of aflatoxin contamination. This finding suggests that both the fluorescence peak shift and intensity, as well as timing, may be essential in distinguishing toxigenic and atoxigenic fungi based on spectral features. Results also reveal a possible preferential difference in the internal colonization of maize kernels between the toxigenic and atoxigenic strains of A. flavus, suggesting a potential window for differentiating the strains based on fluorescence spectra at specific time points. PMID:28966606

  12. Generalized and efficient algorithm for computing multipole energies and gradients based on Cartesian tensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Dejun, E-mail: dejun.lin@gmail.com

    2015-09-21

    Accurate representation of intermolecular forces has been the central task of classical atomic simulations, known as molecular mechanics. Recent advancements in molecular mechanics models have put forward the explicit representation of permanent and/or induced electric multipole (EMP) moments. The formulas developed so far to calculate EMP interactions tend to have complicated expressions, especially in Cartesian coordinates, which can only be applied to a specific kernel potential function. For example, one needs to develop a new formula each time a new kernel function is encountered. The complication of these formalisms arises from an intriguing and yet obscured mathematical relation between the kernel functions and the gradient operators. Here, I uncover this relation via rigorous derivation and find that the formula to calculate EMP interactions is basically invariant to the potential kernel functions as long as they are of the form f(r), i.e., any Green’s function that depends on inter-particle distance. I provide an algorithm for efficient evaluation of EMP interaction energies, forces, and torques for any kernel f(r) up to any arbitrary rank of EMP moments in Cartesian coordinates. The working equations of this algorithm are essentially the same for any kernel f(r). Recently, a few recursive algorithms were proposed to calculate EMP interactions. Depending on the kernel functions, the algorithm here is about 4–16 times faster than these algorithms in terms of the required number of floating point operations and is much more memory efficient. I show that it is even faster than a theoretically ideal recursion scheme, i.e., one that requires 1 floating point multiplication and 1 addition per recursion step. This algorithm has a compact vector-based expression that is optimal for computer programming. The Cartesian nature of this algorithm makes it fit easily into modern molecular simulation packages as compared with spherical coordinate-based algorithms. A software library based on this algorithm has been implemented in C++11 and has been released.

  13. 3D local feature BKD to extract road information from mobile laser scanning point clouds

    NASA Astrophysics Data System (ADS)

    Yang, Bisheng; Liu, Yuan; Dong, Zhen; Liang, Fuxun; Li, Bijun; Peng, Xiangyang

    2017-08-01

    Extracting road information from point clouds obtained through mobile laser scanning (MLS) is essential for autonomous vehicle navigation, and has hence garnered a growing amount of research interest in recent years. However, the performance of such systems is seriously affected by varying point density and noise. This paper proposes a novel three-dimensional (3D) local feature called the binary kernel descriptor (BKD) to extract road information from MLS point clouds. The BKD consists of Gaussian kernel density estimation and binarization components to encode the shape and intensity information of the 3D point clouds that are fed to a random forest classifier to extract curbs and markings on the road. These are then used to derive road information, such as the number of lanes, the lane width, and intersections. In experiments, the precision and recall of the proposed feature for the detection of curbs and road markings on an urban dataset and a highway dataset were as high as 90%, thus showing that the BKD is accurate and robust against varying point density and noise.
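    A highly simplified sketch of the pipeline described above: a Gaussian kernel density of a local point neighborhood evaluated on a fixed grid, binarized, and fed to a random forest. The grid, bandwidth, threshold and labels are illustrative assumptions, and the real BKD also encodes intensity information.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def bkd_descriptor(neighbors_xyz, grid, sigma=0.1):
        """Toy BKD-style descriptor: Gaussian kernel density of a 3-D point
        neighborhood sampled on a fixed grid, then binarized at the mean."""
        d = np.linalg.norm(grid[:, None, :] - neighbors_xyz[None, :, :], axis=2)
        density = np.exp(-0.5 * (d / sigma) ** 2).sum(axis=1)
        return (density > density.mean()).astype(np.uint8)

    rng = np.random.default_rng(0)
    grid = rng.uniform(-0.5, 0.5, size=(64, 3))                 # fixed sampling grid
    X = np.stack([bkd_descriptor(rng.normal(scale=0.3, size=(200, 3)), grid)
                  for _ in range(100)])
    y = rng.integers(0, 2, size=100)                            # toy curb / non-curb labels
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
    ```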

  14. Flood susceptibility mapping using a novel ensemble weights-of-evidence and support vector machine models in GIS

    NASA Astrophysics Data System (ADS)

    Tehrany, Mahyat Shafapour; Pradhan, Biswajeet; Jebur, Mustafa Neamah

    2014-05-01

    Flooding is one of the most devastating natural disasters and occurs frequently in Terengganu, Malaysia. Recently, ensemble-based techniques have become extremely popular in flood modeling. In this paper, the weights-of-evidence (WoE) model was first used to assess the impact of the classes of each conditioning factor on flooding through bivariate statistical analysis (BSA). These factors were then reclassified using the acquired weights and entered into the support vector machine (SVM) model to evaluate the correlation between flood occurrence and each conditioning factor. Through this integration, the weakness of WoE is addressed and the performance of the SVM is enhanced. The spatial database included flood inventory, slope, stream power index (SPI), topographic wetness index (TWI), altitude, curvature, distance from the river, geology, rainfall, land use/cover (LULC), and soil type. Four SVM kernel types (linear (LN), polynomial (PL), radial basis function (RBF), and sigmoid (SIG)) were used to investigate the performance of each kernel type. The efficiency of the new ensemble WoE and SVM method was tested using the area under the curve (AUC), which measured the prediction and success rates. The validation results proved the strength and efficiency of the ensemble method over the individual methods. The best results were obtained with the RBF kernel compared with the other kernel types. The success rate and prediction rate for the ensemble WoE and RBF-SVM method were 96.48% and 95.67%, respectively. The proposed ensemble flood susceptibility mapping method could assist researchers and local governments in flood mitigation strategies.
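    A minimal scikit-learn sketch of the kernel comparison described above, scored by AUC. The synthetic feature matrix stands in for the WoE-reclassified conditioning factors; the real study uses the spatial database listed in the record.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    X = rng.normal(size=(500, 10))          # placeholder for reclassified conditioning factors
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for kernel in ("linear", "poly", "rbf", "sigmoid"):     # LN, PL, RBF, SIG
        clf = SVC(kernel=kernel, probability=True).fit(X_tr, y_tr)
        auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
        print(f"{kernel:8s} AUC = {auc:.3f}")
    ```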

  15. Fast Gaussian kernel learning for classification tasks based on specially structured global optimization.

    PubMed

    Zhong, Shangping; Chen, Tianshun; He, Fengying; Niu, Yuzhen

    2014-09-01

    For a practical pattern classification task solved by kernel methods, the computing time is mainly spent on kernel learning (or training). However, the current kernel learning approaches are based on local optimization techniques and struggle to achieve good runtime performance, especially for large datasets. Thus the existing algorithms cannot easily be extended to large-scale tasks. In this paper, we present a fast Gaussian kernel learning method by solving a specially structured global optimization (SSGO) problem. We optimize the Gaussian kernel function by using the formulated kernel target alignment criterion, which is a difference of increasing (d.i.) functions. Using a power-transformation-based convexification method, the objective criterion can be represented as a difference of convex (d.c.) functions with a fixed power-transformation parameter. The objective programming problem can then be converted into an SSGO problem: globally minimizing a concave function over a convex set. The SSGO problem is classical and has good solvability. Thus, to find the globally optimal solution efficiently, we can adopt the improved Hoffman's outer approximation method, which does not need to repeat the search procedure with different starting points to locate the best local minimum. Also, the proposed method can be proven to converge to the global solution for any classification task. We evaluate the proposed method on twenty benchmark datasets, and compare it with four other Gaussian kernel learning methods. Experimental results show that the proposed method consistently achieves both good time efficiency and good classification performance. Copyright © 2014 Elsevier Ltd. All rights reserved.
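    A small sketch of the kernel-target alignment criterion mentioned above for a Gaussian kernel. Note that the paper optimizes this criterion globally via the SSGO/outer-approximation machinery; the naive grid search here is only meant to show what quantity is being optimized, and the data and gamma range are placeholders.

    ```python
    import numpy as np

    def kernel_target_alignment(X, y, gamma):
        """Alignment A(K, yy^T) = <K, yy^T>_F / (||K||_F * ||yy^T||_F) for a
        Gaussian kernel; labels y are assumed to be in {-1, +1}."""
        sq = np.sum(X ** 2, axis=1)
        K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))
        Y = np.outer(y, y)
        return float(np.sum(K * Y) / (np.linalg.norm(K) * np.linalg.norm(Y)))

    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 5))
    y = np.where(X[:, 0] > 0, 1, -1)
    gammas = np.logspace(-3, 2, 20)
    best_gamma = max(gammas, key=lambda g: kernel_target_alignment(X, y, g))
    print(best_gamma)
    ```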

  16. Spatial patterns of aflatoxin levels in relation to ear-feeding insect damage in pre-harvest corn.

    PubMed

    Ni, Xinzhi; Wilson, Jeffrey P; Buntin, G David; Guo, Baozhu; Krakowsky, Matthew D; Lee, R Dewey; Cottrell, Ted E; Scully, Brian T; Huffaker, Alisa; Schmelz, Eric A

    2011-07-01

    Key impediments to increased corn yield and quality in the southeastern US coastal plain region are damage by ear-feeding insects and aflatoxin contamination caused by infection of Aspergillus flavus. Key ear-feeding insects are corn earworm, Helicoverpa zea, fall armyworm, Spodoptera frugiperda, maize weevil, Sitophilus zeamais, and brown stink bug, Euschistus servus. In 2006 and 2007, aflatoxin contamination and insect damage were sampled before harvest in three 0.4-hectare corn fields using a grid sampling method. The feeding damage by each of ear/kernel-feeding insects (i.e., corn earworm/fall armyworm damage on the silk/cob, and discoloration of corn kernels by stink bugs), and maize weevil population were assessed at each grid point with five ears. The spatial distribution pattern of aflatoxin contamination was also assessed using the corn samples collected at each sampling point. Aflatoxin level was correlated to the number of maize weevils and stink bug-discolored kernels, but not closely correlated to either husk coverage or corn earworm damage. Contour maps of the maize weevil populations, stink bug-damaged kernels, and aflatoxin levels exhibited an aggregated distribution pattern with a strong edge effect on all three parameters. The separation of silk- and cob-feeding insects from kernel-feeding insects, as well as chewing (i.e., the corn earworm and maize weevil) and piercing-sucking insects (i.e., the stink bugs) and their damage in relation to aflatoxin accumulation is economically important. Both theoretic and applied ramifications of this study were discussed by proposing a hypothesis on the underlying mechanisms of the aggregated distribution patterns and strong edge effect of insect damage and aflatoxin contamination, and by discussing possible management tactics for aflatoxin reduction by proper management of kernel-feeding insects. Future directions on basic and applied research related to aflatoxin contamination are also discussed.

  17. Spatial Patterns of Aflatoxin Levels in Relation to Ear-Feeding Insect Damage in Pre-Harvest Corn

    PubMed Central

    Ni, Xinzhi; Wilson, Jeffrey P.; Buntin, G. David; Guo, Baozhu; Krakowsky, Matthew D.; Lee, R. Dewey; Cottrell, Ted E.; Scully, Brian T.; Huffaker, Alisa; Schmelz, Eric A.

    2011-01-01

    Key impediments to increased corn yield and quality in the southeastern US coastal plain region are damage by ear-feeding insects and aflatoxin contamination caused by infection of Aspergillus flavus. Key ear-feeding insects are corn earworm, Helicoverpa zea, fall armyworm, Spodoptera frugiperda, maize weevil, Sitophilus zeamais, and brown stink bug, Euschistus servus. In 2006 and 2007, aflatoxin contamination and insect damage were sampled before harvest in three 0.4-hectare corn fields using a grid sampling method. The feeding damage by each of ear/kernel-feeding insects (i.e., corn earworm/fall armyworm damage on the silk/cob, and discoloration of corn kernels by stink bugs), and maize weevil population were assessed at each grid point with five ears. The spatial distribution pattern of aflatoxin contamination was also assessed using the corn samples collected at each sampling point. Aflatoxin level was correlated to the number of maize weevils and stink bug-discolored kernels, but not closely correlated to either husk coverage or corn earworm damage. Contour maps of the maize weevil populations, stink bug-damaged kernels, and aflatoxin levels exhibited an aggregated distribution pattern with a strong edge effect on all three parameters. The separation of silk- and cob-feeding insects from kernel-feeding insects, as well as chewing (i.e., the corn earworm and maize weevil) and piercing-sucking insects (i.e., the stink bugs) and their damage in relation to aflatoxin accumulation is economically important. Both theoretic and applied ramifications of this study were discussed by proposing a hypothesis on the underlying mechanisms of the aggregated distribution patterns and strong edge effect of insect damage and aflatoxin contamination, and by discussing possible management tactics for aflatoxin reduction by proper management of kernel-feeding insects. Future directions on basic and applied research related to aflatoxin contamination are also discussed. PMID:22069748

  18. Efficient protein structure search using indexing methods

    PubMed Central

    2013-01-01

    Understanding functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, thus finding structurally similar proteins accurately and efficiently from a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, thus it is hard to efficiently process many simultaneous requests of structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively. PMID:23691543

  19. Efficient protein structure search using indexing methods.

    PubMed

    Kim, Sungchul; Sael, Lee; Yu, Hwanjo

    2013-01-01

    Understanding functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, thus finding structurally similar proteins accurately and efficiently from a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, thus it is hard to efficiently process many simultaneous requests of structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively.
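    A brute-force numpy sketch of the two-stage retrieval described above: prune with the first few 3DZD attributes, then rerank the top-10 × k survivors with the full descriptor. The real methods use iDistance/iKernel index structures rather than linear scans, and the prefix length and descriptor dimensionality here are assumptions.

    ```python
    import numpy as np

    def top_k_similar(db, query, k, prefix_dims=12):
        """db: (N, D) matrix of 3DZD vectors; query: (D,) vector."""
        d_red = np.linalg.norm(db[:, :prefix_dims] - query[:prefix_dims], axis=1)
        cand = np.argsort(d_red)[: 10 * k]                  # coarse stage on the reduced index
        d_full = np.linalg.norm(db[cand] - query, axis=1)   # exact stage on the survivors
        return cand[np.argsort(d_full)[:k]]

    rng = np.random.default_rng(0)
    db = rng.normal(size=(10_000, 121))                     # assumed 121-D descriptors
    print(top_k_similar(db, db[42], k=5))                   # the query itself ranks first
    ```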

  20. Optimal algorithm for automatic detection of microaneurysms based on receiver operating characteristic curve

    NASA Astrophysics Data System (ADS)

    Xu, Lili; Luo, Shuqian

    2010-11-01

    Microaneurysms (MAs) are the first manifestations of diabetic retinopathy (DR) as well as an indicator of its progression. Their automatic detection plays a key role for both mass screening and monitoring and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm basically comprises the following stages: candidate detection aiming at extracting the patterns possibly corresponding to MAs based on a mathematical morphological black top hat, feature extraction to characterize these candidates, and classification based on a support vector machine (SVM) to validate MAs. The selection of the feature vector and of the SVM kernel function is very important to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the distinguishing performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM with a combination of features as the input shows the best discriminating performance.

  1. Optimal algorithm for automatic detection of microaneurysms based on receiver operating characteristic curve.

    PubMed

    Xu, Lili; Luo, Shuqian

    2010-01-01

    Microaneurysms (MAs) are the first manifestations of diabetic retinopathy (DR) as well as an indicator of its progression. Their automatic detection plays a key role for both mass screening and monitoring and is therefore at the core of any system for computer-assisted diagnosis of DR. The algorithm basically comprises the following stages: candidate detection aiming at extracting the patterns possibly corresponding to MAs based on a mathematical morphological black top hat, feature extraction to characterize these candidates, and classification based on a support vector machine (SVM) to validate MAs. The selection of the feature vector and of the SVM kernel function is very important to the algorithm. We use the receiver operating characteristic (ROC) curve to evaluate the distinguishing performance of different feature vectors and different SVM kernel functions. The ROC analysis indicates that the quadratic polynomial SVM with a combination of features as the input shows the best discriminating performance.
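    A minimal scikit-learn sketch of the classification/ROC step described above: a quadratic polynomial SVM scored with a ROC curve on synthetic candidate features. The actual MA candidate features and retinal data are not reproduced here.

    ```python
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import roc_curve, auc

    rng = np.random.default_rng(0)
    X = rng.normal(size=(400, 8))                           # stand-in candidate features
    y = (X[:, 0] * X[:, 1] + rng.normal(scale=0.3, size=400) > 0).astype(int)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = SVC(kernel="poly", degree=2, probability=True).fit(X_tr, y_tr)  # quadratic polynomial SVM
    scores = clf.predict_proba(X_te)[:, 1]
    fpr, tpr, _ = roc_curve(y_te, scores)
    print("AUC =", auc(fpr, tpr))
    ```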

  2. Refinement of Methods for Evaluation of Near-Hypersingular Integrals in BEM Formulations

    NASA Technical Reports Server (NTRS)

    Fink, Patricia W.; Khayat, Michael A.; Wilton, Donald R.

    2006-01-01

    In this paper, we present advances in singularity cancellation techniques applied to integrals in BEM formulations that are nearly hypersingular. Significant advances have been made recently in singularity cancellation techniques applied to 1/R-type kernels [M. Khayat, D. Wilton, IEEE Trans. Antennas and Prop., 53, pp. 3180-3190, 2005], as well as to the gradients of these kernels [P. Fink, D. Wilton, and M. Khayat, Proc. ICEAA, pp. 861-864, Torino, Italy, 2005] on curved subdomains. In these approaches, the source triangle is divided into three tangent subtriangles with a common vertex at the normal projection of the observation point onto the source element or the extended surface containing it. The geometry of a typical tangent subtriangle and its local rectangular coordinate system with origin at the projected observation point is shown in Fig. 1. Whereas singularity cancellation techniques for 1/R-type kernels are now nearing maturity, the efficient handling of near-hypersingular kernels still needs attention. For example, in the gradient reference above, techniques are presented for computing the normal component of the gradient relative to the plane containing the tangent subtriangle. These techniques, summarized in the transformations in Table 1, are applied at the sub-triangle level and correspond particularly to the case in which the normal projection of the observation point lies within the boundary of the source element. They are found to be highly efficient as z approaches zero. Here, we extend the approach to cover two instances not previously addressed. First, we consider the case in which the normal projection of the observation point lies external to the source element. For such cases, we find that simple modifications to the transformations of Table 1 permit significant savings in computational cost. Second, we present techniques that permit accurate computation of the tangential components of the gradient; i.e., tangent to the plane containing the source element.

  3. Comparison of GATE/GEANT4 with EGSnrc and MCNP for electron dose calculations at energies between 15 keV and 20 MeV.

    PubMed

    Maigne, L; Perrot, Y; Schaart, D R; Donnarieix, D; Breton, V

    2011-02-07

    The GATE Monte Carlo simulation platform based on the GEANT4 toolkit has come into widespread use for simulating positron emission tomography (PET) and single photon emission computed tomography (SPECT) imaging devices. Here, we explore its use for calculating electron dose distributions in water. Mono-energetic electron dose point kernels and pencil beam kernels in water are calculated for different energies between 15 keV and 20 MeV by means of GATE 6.0, which makes use of the GEANT4 version 9.2 Standard Electromagnetic Physics Package. The results are compared to the well-validated codes EGSnrc and MCNP4C. It is shown that recent improvements made to the GEANT4/GATE software result in significantly better agreement with the other codes. We furthermore illustrate several issues of general interest to GATE and GEANT4 users who wish to perform accurate simulations involving electrons. Provided that the electron step size is sufficiently restricted, GATE 6.0 and EGSnrc dose point kernels are shown to agree to within less than 3% of the maximum dose between 50 keV and 4 MeV, while pencil beam kernels are found to agree to within less than 4% of the maximum dose between 15 keV and 20 MeV.

  4. Critical environmental and genotypic factors for Fusarium verticillioides infection, fungal growth and fumonisin contamination in maize grown in northwestern Spain.

    PubMed

    Cao, Ana; Santiago, Rogelio; Ramos, Antonio J; Souto, Xosé C; Aguín, Olga; Malvar, Rosa Ana; Butrón, Ana

    2014-05-02

    In northwestern Spain, where weather is rainy and mild throughout the year, Fusarium verticillioides is the most prevalent fungus in kernels and a significant risk of fumonisin contamination has been exposed. In this study, detailed information about environmental and maize genotypic factors affecting F. verticillioides infection, fungal growth and fumonisin content in maize kernels was obtained in order to establish control points to reduce fumonisin contamination. Evaluations were conducted in a total of 36 environments and factorial regression analyses were performed to determine the contribution of each factor to variability among environments, genotypes, and genotype × environment interactions for F. verticillioides infection, fungal growth and fumonisin content. Flowering and kernel drying were the most critical periods throughout the growing season for F. verticillioides infection and fumonisin contamination. Around flowering, wetter and cooler conditions limited F. verticillioides infection and growth, and high temperatures increased fumonisin contents. During kernel drying, increased damaged kernels favored fungal growth, and higher ear damage by corn borers and hard rainfall favored fumonisin accumulation. Later planting dates and especially earlier harvest dates reduced the risk of fumonisin contamination, possibly due to reduced incidence of insects and accumulation of rainfall during the kernel drying period. The use of maize varieties resistant to Sitotroga cerealella, with good husk coverage and non-excessive pericarp thickness could also be useful to reduce fumonisin contamination of maize kernels. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Relationship between processing score and kernel-fraction particle size in whole-plant corn silage.

    PubMed

    Dias Junior, G S; Ferraretto, L F; Salvati, G G S; de Resende, L C; Hoffman, P C; Pereira, M N; Shaver, R D

    2016-04-01

    Kernel processing increases starch digestibility in whole-plant corn silage (WPCS). Corn silage processing score (CSPS), the percentage of starch passing through a 4.75-mm sieve, is widely used to assess the degree of kernel breakage in WPCS. However, the geometric mean particle size (GMPS) of the kernel fraction that passes through the 4.75-mm sieve has not been well described. Therefore, the objectives of this study were (1) to evaluate particle size distribution and digestibility of kernels cut to varied particle sizes; (2) to propose a method to measure GMPS in WPCS kernels; and (3) to evaluate the relationship between CSPS and GMPS of the kernel fraction in WPCS. Composite samples of unfermented, dried kernels from 110 corn hybrids commonly used for silage production were kept whole (WH) or manually cut in 2, 4, 8, 16, 32 or 64 pieces (2P, 4P, 8P, 16P, 32P, and 64P, respectively). Dry sieving to determine GMPS, surface area, and particle size distribution, using 9 sieves with nominal square apertures of 9.50, 6.70, 4.75, 3.35, 2.36, 1.70, 1.18, and 0.59 mm and a pan, as well as ruminal in situ dry matter (DM) digestibility measurements, were performed for each kernel particle number treatment. Incubation times were 0, 3, 6, 12, and 24 h. The ruminal in situ DM disappearance of unfermented kernels increased with the reduction in particle size of corn kernels. Kernels kept whole had the lowest ruminal DM disappearance at all time points, with a maximum DM disappearance of 6.9% at 24 h, whereas the greatest disappearance was observed for 64P, followed by 32P and 16P. Samples of WPCS (n=80) from 3 studies representing varied theoretical length of cut settings and processor types and settings were also evaluated. Each WPCS sample was divided in two and then dried at 60°C for 48 h. The CSPS was determined in duplicate on 1 of the split samples, whereas on the other split sample the kernel and stover fractions were separated using a hydrodynamic separation procedure. After separation, the kernel fraction was redried at 60°C for 48 h in a forced-air oven and dry sieved to determine GMPS and surface area. Linear relationships between CSPS from WPCS (n=80) and kernel fraction GMPS, surface area, and proportion passing through the 4.75-mm screen were poor. Strong quadratic relationships between the proportion of the kernel fraction passing through the 4.75-mm screen and kernel fraction GMPS and surface area were observed. These findings suggest that hydrodynamic separation and dry sieving of the kernel fraction may provide a better assessment of kernel breakage in WPCS than CSPS. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
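    The abstract does not spell out the GMPS formula; the sketch below uses the log-mean formula common in ASABE S319-type dry-sieving calculations, with assumed nominal upper and pan bounds, so it is an illustration rather than the study's exact procedure.

    ```python
    import numpy as np

    def gmps(apertures_mm, mass_g, top_mm=13.0, pan_mm=0.3):
        """Geometric mean particle size (mm) from dry sieving: log-mean of the
        nominal size of each fraction, taken as sqrt(upper * lower aperture),
        weighted by the mass retained. top_mm and pan_mm are assumed bounds."""
        edges = np.array([top_mm] + list(apertures_mm) + [pan_mm])
        d_mid = np.sqrt(edges[:-1] * edges[1:])          # nominal size per fraction
        w = np.asarray(mass_g, dtype=float)
        return float(np.exp(np.sum(w * np.log(d_mid)) / np.sum(w)))

    # sieve stack used in the study (mm), plus a pan; masses are made-up numbers
    sieves = [9.50, 6.70, 4.75, 3.35, 2.36, 1.70, 1.18, 0.59]
    retained = [2.0, 5.0, 10.0, 20.0, 25.0, 18.0, 12.0, 6.0, 2.0]   # grams per fraction
    print(round(gmps(sieves, retained), 2))
    ```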

  6. GPU Acceleration of Mean Free Path Based Kernel Density Estimators for Monte Carlo Neutronics Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, TImothy P.; Kiedrowski, Brian C.; Martin, William R.

    Kernel Density Estimators (KDEs) are a non-parametric density estimation technique that has recently been applied to Monte Carlo radiation transport simulations. Kernel density estimators are an alternative to histogram tallies for obtaining global solutions in Monte Carlo tallies. With KDEs, a single event, either a collision or particle track, can contribute to the score at multiple tally points with the uncertainty at those points being independent of the desired resolution of the solution. Thus, KDEs show potential for obtaining estimates of a global solution with reduced variance when compared to a histogram. Previously, KDEs have been applied to neutronics for one-group reactor physics problems and fixed source shielding applications. However, little work was done to obtain reaction rates using KDEs. This paper introduces a new form of the MFP KDE that is capable of handling general geometries. Furthermore, extending the MFP KDE to 2-D problems in continuous energy introduces inaccuracies to the solution. An ad-hoc solution to these inaccuracies is introduced that produces errors smaller than 4% at material interfaces.
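    As background on how a KDE tally differs from a histogram tally, the following minimal 1-D sketch scores every sampled collision onto all nearby tally points with an Epanechnikov kernel. The bandwidth, kernel choice and the flat toy source are assumptions; this is not the mean-free-path-based kernel introduced in the paper.

```python
import numpy as np

def kde_tally(collision_x, weights, tally_x, bandwidth):
    """Score every collision onto all tally points with an Epanechnikov kernel."""
    scores = np.zeros_like(tally_x, dtype=float)
    for x, w in zip(collision_x, weights):
        u = (tally_x - x) / bandwidth
        k = np.where(np.abs(u) <= 1.0, 0.75 * (1.0 - u**2), 0.0)  # Epanechnikov
        scores += w * k / bandwidth
    return scores / len(collision_x)

rng = np.random.default_rng(0)
collisions = rng.uniform(0.0, 10.0, size=5000)   # toy collision sites
weights = np.ones_like(collisions)               # unit statistical weights
grid = np.linspace(0.0, 10.0, 101)
flux_estimate = kde_tally(collisions, weights, grid, bandwidth=0.5)
```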

  7. Data-based diffraction kernels for surface waves from convolution and correlation processes through active seismic interferometry

    NASA Astrophysics Data System (ADS)

    Chmiel, Malgorzata; Roux, Philippe; Herrmann, Philippe; Rondeleux, Baptiste; Wathelet, Marc

    2018-05-01

    We investigated the construction of diffraction kernels for surface waves using two-point convolution and/or correlation from land active seismic data recorded in the context of exploration geophysics. The high density of controlled sources and receivers, combined with the application of the reciprocity principle, allows us to retrieve two-dimensional phase-oscillation diffraction kernels (DKs) of surface waves between any two source or receiver points in the medium at each frequency (up to 15 Hz, at least). These DKs are purely data-based as no model calculations and no synthetic data are needed. They naturally emerge from the interference patterns of the recorded wavefields projected on the dense array of sources and/or receivers. The DKs are used to obtain multi-mode dispersion relations of Rayleigh waves, from which near-surface shear velocity can be extracted. Using convolution versus correlation with a grid of active sources is an important step in understanding the physics of the retrieval of surface wave Green's functions. This provides the foundation for future studies based on noise sources or active sources with a sparse spatial distribution.

  8. Accurately estimating PSF with straight lines detected by Hough transform

    NASA Astrophysics Data System (ADS)

    Wang, Ruichen; Xu, Liangpeng; Fan, Chunxiao; Li, Yong

    2018-04-01

    This paper presents an approach to estimating the point spread function (PSF) from low resolution (LR) images. Existing techniques usually rely on accurate detection of the ending points of the profile normal to edges. In practice, however, it is often a great challenge to accurately localize profiles of edges from an LR image, which leads to a poor estimate of the PSF of the lens taking the LR image. For precisely estimating the PSF, this paper proposes first estimating a 1-D PSF kernel with straight lines, and then robustly obtaining the 2-D PSF from the 1-D kernel by least squares techniques and random sample consensus. The Canny operator is applied to the LR image to obtain edges and then the Hough transform is utilized to extract straight lines of all orientations. Estimating the 1-D PSF kernel with straight lines effectively alleviates the influence of inaccurate edge detection on PSF estimation. The proposed method is investigated on both natural and synthetic images for estimating the PSF. Experimental results show that the proposed method outperforms the state-of-the-art and does not rely on accurate edge detection.
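    A rough sketch of the edge-and-line extraction stage is given below using OpenCV's Canny detector and standard Hough transform; the synthetic test image, the thresholds and the omitted 1-D kernel fit are placeholders rather than the authors' settings.

```python
import cv2
import numpy as np

# Synthetic low-resolution test image with a bright oriented bar (stand-in for an LR photo)
img = np.zeros((200, 200), dtype=np.uint8)
cv2.line(img, (20, 180), (180, 20), color=255, thickness=3)
img = cv2.GaussianBlur(img, (9, 9), 2.0)            # blur plays the role of the unknown PSF

edges = cv2.Canny(img, 50, 150)                     # Canny edge map
lines = cv2.HoughLines(edges, 1, np.pi / 180, 80)   # (rho, theta) for detected lines

if lines is not None:
    for rho, theta in lines[:, 0]:
        # Intensity profiles sampled perpendicular to each detected line would
        # feed the 1-D PSF estimate; only line parameters are extracted here.
        print(f"line: rho={rho:.1f}, theta={np.degrees(theta):.1f} deg")
```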

  9. Kernel-Phase Interferometry for Super-Resolution Detection of Faint Companions

    NASA Astrophysics Data System (ADS)

    Factor, Samuel M.; Kraus, Adam L.

    2017-06-01

    Direct detection of close-in companions (exoplanets or binary systems) is notoriously difficult. While coronagraphs and point spread function (PSF) subtraction can be used to reduce contrast and dig out signals of companions under the PSF, there are still significant limitations in separation and contrast near λ/D. Non-redundant aperture masking (NRM) interferometry can be used to detect companions well inside the PSF of a diffraction limited image, though the mask discards ~95% of the light gathered by the telescope and thus the technique is severely flux limited. Kernel-phase analysis applies interferometric techniques similar to NRM to a diffraction limited image utilizing the full aperture. Instead of non-redundant closure-phases, kernel-phases are constructed from a grid of points on the full aperture, simulating a redundant interferometer. I have developed a new, easy to use, faint companion detection pipeline which analyzes kernel-phases utilizing Bayesian model comparison. I demonstrate this pipeline on archival images from HST/NICMOS, searching for new companions in order to constrain binary formation models at separations inaccessible to previous techniques. Using this method, it is possible to detect a companion well within the classical λ/D Rayleigh diffraction limit using a fraction of the telescope time needed for NRM. Since the James Webb Space Telescope (JWST) will be able to perform NRM observations, further development and characterization of kernel-phase analysis will allow efficient use of highly competitive JWST telescope time. As no mask is needed, this technique can easily be applied to archival data and even target acquisition images (e.g. from JWST), making the detection of close-in companions cheap and simple as no additional observations are needed.

  10. Home range, habitat selection, and movements of California Black Rails at tidal marshes at San Francisco Bay, California

    USGS Publications Warehouse

    Tsao, Danika C.; Takekawa, John Y.; Woo, Isa; Yee, Julie L.; Evens, Jules G.

    2009-01-01

    Little is known about the movements and habitat selection of California Black Rails (Laterallus jamaicensis coturniculus) in coastal California. We captured 130 Black Rails, of which we radio-marked 48, in tidal marshes in San Francisco Bay during 2005 and 2006. Our objective was to examine their home ranges, movements, and habitat selection to improve the species' conservation. The mean fixed-kernel home range was 0.59 ha, and the mean core area was 0.14 ha. Home ranges and core areas did not differ by year or site. Males had significantly larger home ranges and core areas than did females. All sites combined, Black Rails used areas with ≥94% total vegetative cover, with perennial pickleweed (Sarcocornia pacifica) the dominant plant. The rails' habitat selection varied by year and site but not by sex. A multivariate analysis of variance indicated that Black Rails selected areas with pickleweed taller and denser than average, greater cover and height of alkali bulrush (Bolboschoenus maritimus) and common saltgrass (Distichlis spicata), more stems between 20 and 30 cm above the ground, greater maximum vegetation height, and shorter distance to refugia. On average, Black Rails moved 27.6 ± 1.8 (SE) m daily and 38.4 ± 5.5 m during extreme high tides. Understanding the California Black Rail's movements, home range, and habitat use is critical for management to benefit the species.

  11. On Quantile Regression in Reproducing Kernel Hilbert Spaces with Data Sparsity Constraint

    PubMed Central

    Zhang, Chong; Liu, Yufeng; Wu, Yichao

    2015-01-01

    For spline regressions, it is well known that the choice of knots is crucial for the performance of the estimator. As a general learning framework covering the smoothing splines, learning in a Reproducing Kernel Hilbert Space (RKHS) has a similar issue. However, the selection of training data points for kernel functions in the RKHS representation has not been carefully studied in the literature. In this paper we study quantile regression as an example of learning in a RKHS. In this case, the regular squared norm penalty does not perform training data selection. We propose a data sparsity constraint that imposes thresholding on the kernel function coefficients to achieve a sparse kernel function representation. We demonstrate that the proposed data sparsity method can have competitive prediction performance for certain situations, and have comparable performance in other cases compared to that of the traditional squared norm penalty. Therefore, the data sparsity method can serve as a competitive alternative to the squared norm penalty method. Some theoretical properties of our proposed method using the data sparsity constraint are obtained. Both simulated and real data sets are used to demonstrate the usefulness of our data sparsity constraint. PMID:27134575

  12. Convolution kernels for multi-wavelength imaging

    NASA Astrophysics Data System (ADS)

    Boucaud, A.; Bocchio, M.; Abergel, A.; Orieux, F.; Dole, H.; Hadj-Youcef, M. A.

    2016-12-01

    Astrophysical images issued from different instruments and/or spectral bands often require to be processed together, either for fitting or comparison purposes. However each image is affected by an instrumental response, also known as point-spread function (PSF), that depends on the characteristics of the instrument as well as the wavelength and the observing strategy. Given the knowledge of the PSF in each band, a straightforward way of processing images is to homogenise them all to a target PSF using convolution kernels, so that they appear as if they had been acquired by the same instrument. We propose an algorithm that generates such PSF-matching kernels, based on Wiener filtering with a tunable regularisation parameter. This method ensures all anisotropic features in the PSFs to be taken into account. We compare our method to existing procedures using measured Herschel/PACS and SPIRE PSFs and simulated JWST/MIRI PSFs. Significant gains up to two orders of magnitude are obtained with respect to the use of kernels computed assuming Gaussian or circularised PSFs. A software to compute these kernels is available at https://github.com/aboucaud/pypher
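    The following sketch illustrates the general idea of building a PSF-matching kernel by regularised Wiener filtering in Fourier space. The toy Gaussian PSFs and the scalar regularisation constant are assumptions; the published pypher implementation differs in details such as apodisation and the tunable regularisation parameter.

```python
import numpy as np

def psf_matching_kernel(psf_source, psf_target, reg=1e-4):
    """Kernel K such that psf_source convolved with K approximates psf_target."""
    F_s = np.fft.fft2(np.fft.ifftshift(psf_source))
    F_t = np.fft.fft2(np.fft.ifftshift(psf_target))
    wiener = np.conj(F_s) / (np.abs(F_s) ** 2 + reg)   # regularised inverse filter
    kernel = np.fft.fftshift(np.real(np.fft.ifft2(F_t * wiener)))
    return kernel / kernel.sum()

# Toy PSFs: narrow and broad Gaussians on the same pixel grid
y, x = np.mgrid[-32:32, -32:32]
gauss = lambda s: np.exp(-(x**2 + y**2) / (2 * s**2)) / (2 * np.pi * s**2)
K = psf_matching_kernel(gauss(2.0), gauss(4.0))
```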

  13. A Precise Drunk Driving Detection Using Weighted Kernel Based on Electrocardiogram.

    PubMed

    Wu, Chung Kit; Tsang, Kim Fung; Chi, Hao Ran; Hung, Faan Hei

    2016-05-09

    Globally, 1.2 million people die and 50 million people are injured annually due to traffic accidents. These traffic accidents cost $500 billion. Drunk drivers are found in 40% of the traffic crashes. Existing drunk driving detection (DDD) systems do not provide accurate detection and pre-warning concurrently. Electrocardiogram (ECG) is a proven biosignal that accurately and simultaneously reflects a human's biological status. In this letter, a classifier for DDD based on ECG is investigated in an attempt to reduce traffic accidents caused by drunk drivers. At this point, it appears that there is no known research or literature found on an ECG classifier for DDD. To identify drunk syndromes, the ECG signals from drunk drivers are studied and analyzed. As such, a precise ECG-based DDD (ECG-DDD) using a weighted kernel is developed. From the measurements, 10 key features of ECG signals were identified. To incorporate the important features, the feature vectors are weighted in the customization of kernel functions. Four commonly adopted kernel functions are studied. Results reveal that weighted feature vectors improve the accuracy by 11% compared to the computation using the prime kernel. Evaluation shows that ECG-DDD improved the accuracy by 8% to 18% compared to prevailing methods.
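    One plausible reading of a weighted kernel is a feature-weighted RBF kernel in which each ECG feature is rescaled by an importance weight before the kernel is evaluated. The sketch below demonstrates this with scikit-learn's SVC on synthetic data; the features, labels and weights are invented for illustration and do not reproduce the paper's scheme.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))                 # 10 synthetic ECG-derived features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)  # toy drunk / sober labels
w = np.ones(10)
w[[0, 3]] = 3.0                                # heavier weights on assumed key features

def weighted_rbf(A, B, gamma=0.1, weights=w):
    """RBF kernel evaluated on feature vectors rescaled by per-feature weights."""
    Aw, Bw = A * weights, B * weights
    d2 = ((Aw[:, None, :] - Bw[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

clf = SVC(kernel=weighted_rbf).fit(X, y)
print("training accuracy:", clf.score(X, y))
```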

  14. BSD Portals for LINUX 2.0

    NASA Technical Reports Server (NTRS)

    McNab, A. David; woo, Alex (Technical Monitor)

    1999-01-01

    Portals, an experimental feature of 4.4BSD, extend the file system name space by exporting certain open () requests to a user-space daemon. A portal daemon is mounted into the file name space as if it were a standard file system. When the kernel resolves a pathname and encounters a portal mount point, the remainder of the path is passed to the portal daemon. Depending on the portal "pathname" and the daemon's configuration, some type of open (2) is performed. The resulting file descriptor is passed back to the kernel which eventually returns it to the user, to whom it appears that a "normal" open has occurred. A proxy portalfs file system is responsible for kernel interaction with the daemon. The overall effect is that the portal daemon performs an open (2) on behalf of the kernel, possibly hiding substantial complexity from the calling process. One particularly useful application is implementing a connection service that allows simple scripts to open network sockets. This paper describes the implementation of portals for LINUX 2.0.

  15. Accurate interatomic force fields via machine learning with covariant kernels

    NASA Astrophysics Data System (ADS)

    Glielmo, Aldo; Sollich, Peter; De Vita, Alessandro

    2017-06-01

    We present a novel scheme to accurately predict atomic forces as vector quantities, rather than sets of scalar components, by Gaussian process (GP) regression. This is based on matrix-valued kernel functions, on which we impose the requirements that the predicted force rotates with the target configuration and is independent of any rotations applied to the configuration database entries. We show that such covariant GP kernels can be obtained by integration over the elements of the rotation group SO(d) for the relevant dimensionality d. Remarkably, in specific cases the integration can be carried out analytically and yields a conservative force field that can be recast into a pair interaction form. Finally, we show that restricting the integration to a summation over the elements of a finite point group relevant to the target system is sufficient to recover an accurate GP. The accuracy of our kernels in predicting quantum-mechanical forces in real materials is investigated by tests on pure and defective Ni, Fe, and Si crystalline systems.

  16. Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization

    PubMed Central

    Zhu, Qingxin; Niu, Xinzheng

    2016-01-01

    By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain a better generalization ability. However, the previous kernel-based LSTD algorithms do not consider regularization and their sparsification processes are batch or offline, which hinder their widespread applications in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms. PMID:27436996

  17. Kernel Recursive Least-Squares Temporal Difference Algorithms with Sparsification and Regularization.

    PubMed

    Zhang, Chunyuan; Zhu, Qingxin; Niu, Xinzheng

    2016-01-01

    By combining with sparse kernel methods, least-squares temporal difference (LSTD) algorithms can construct the feature dictionary automatically and obtain a better generalization ability. However, the previous kernel-based LSTD algorithms do not consider regularization and their sparsification processes are batch or offline, which hinder their widespread applications in online learning problems. In this paper, we combine the following five techniques and propose two novel kernel recursive LSTD algorithms: (i) online sparsification, which can cope with unknown state regions and be used for online learning, (ii) L2 and L1 regularization, which can avoid overfitting and eliminate the influence of noise, (iii) recursive least squares, which can eliminate matrix-inversion operations and reduce computational complexity, (iv) a sliding-window approach, which can avoid caching all history samples and reduce the computational cost, and (v) the fixed-point subiteration and online pruning, which can make L1 regularization easy to implement. Finally, simulation results on two 50-state chain problems demonstrate the effectiveness of our algorithms.

  18. Efficient High Performance Collective Communication for Distributed Memory Environments

    ERIC Educational Resources Information Center

    Ali, Qasim

    2009-01-01

    Collective communication allows efficient communication and synchronization among a collection of processes, unlike point-to-point communication that only involves a pair of communicating processes. Achieving high performance for both kernels and full-scale applications running on a distributed memory system requires an efficient implementation of…

  19. Norm overlap between many-body states: Uncorrelated overlap between arbitrary Bogoliubov product states

    NASA Astrophysics Data System (ADS)

    Bally, B.; Duguet, T.

    2018-02-01

    Background: State-of-the-art multi-reference energy density functional calculations require the computation of norm overlaps between different Bogoliubov quasiparticle many-body states. It is only recently that the efficient and unambiguous calculation of such norm kernels has become available under the form of Pfaffians [L. M. Robledo, Phys. Rev. C 79, 021302 (2009), 10.1103/PhysRevC.79.021302]. Recently developed particle-number-restored Bogoliubov coupled-cluster (PNR-BCC) and particle-number-restored Bogoliubov many-body perturbation (PNR-BMBPT) ab initio theories [T. Duguet and A. Signoracci, J. Phys. G 44, 015103 (2017), 10.1088/0954-3899/44/1/015103] make use of generalized norm kernels incorporating explicit many-body correlations. In PNR-BCC and PNR-BMBPT, the Bogoliubov states involved in the norm kernels differ specifically via a global gauge rotation. Purpose: The goal of this work is threefold. We wish (i) to propose and implement an alternative to the Pfaffian method to compute unambiguously the norm overlap between arbitrary Bogoliubov quasiparticle states, (ii) to extend the first point to explicitly correlated norm kernels, and (iii) to scrutinize the analytical content of the correlated norm kernels employed in PNR-BMBPT. Point (i) constitutes the purpose of the present paper while points (ii) and (iii) are addressed in a forthcoming paper. Methods: We generalize the method used in another work [T. Duguet and A. Signoracci, J. Phys. G 44, 015103 (2017), 10.1088/0954-3899/44/1/015103] in such a way that it is applicable to kernels involving arbitrary pairs of Bogoliubov states. The formalism is presently explicated in detail in the case of the uncorrelated overlap between arbitrary Bogoliubov states. The power of the method is numerically illustrated and benchmarked against known results on the basis of toy models of increasing complexity. Results: The norm overlap between arbitrary Bogoliubov product states is obtained under a closed-form expression allowing its computation without any phase ambiguity. The formula is physically intuitive, accurate, and versatile. It equally applies to norm overlaps between Bogoliubov states of even or odd number parity. Numerical applications illustrate these features and provide a transparent representation of the content of the norm overlaps. Conclusions: The complex norm overlap between arbitrary Bogoliubov states is computed, without any phase ambiguity, via elementary linear algebra operations. The method can be used in any configuration mixing of orthogonal and non-orthogonal product states. Furthermore, the closed-form expression extends naturally to correlated overlaps at play in PNR-BCC and PNR-BMBPT. As such, the straight overlap between Bogoliubov states is the zero-order reduction of more involved norm kernels to be studied in a forthcoming paper.

  20. On Pfaffian Random Point Fields

    NASA Astrophysics Data System (ADS)

    Kargin, V.

    2014-02-01

    We study Pfaffian random point fields by using the Moore-Dyson quaternion determinants. First, we give sufficient conditions that ensure that a self-dual quaternion kernel defines a valid random point field, and then we prove a CLT for Pfaffian point fields. The proofs are based on a new quaternion extension of the Cauchy-Binet determinantal identity. In addition, we derive the Fredholm determinantal formulas for the Pfaffian point fields which use the quaternion determinant.

  1. StreamMap: Smooth Dynamic Visualization of High-Density Streaming Points.

    PubMed

    Li, Chenhui; Baciu, George; Han, Yu

    2018-03-01

    Interactive visualization of streaming points for real-time scatterplots and linear blending of correlation patterns is increasingly becoming the dominant mode of visual analytics for both big data and streaming data from active sensors and broadcasting media. To better visualize and interact with inter-stream patterns, it is generally necessary to smooth out gaps or distortions in the streaming data. Previous approaches either animate the points directly or present a sampled static heat-map. We propose a new approach, called StreamMap, to smoothly blend high-density streaming points and create a visual flow that emphasizes the density pattern distributions. In essence, we present three new contributions for the visualization of high-density streaming points. The first contribution is a density-based method called super kernel density estimation that aggregates streaming points using an adaptive kernel to solve the overlapping problem. The second contribution is a robust density morphing algorithm that generates several smooth intermediate frames for a given pair of frames. The third contribution is a trend representation design that can help convey the flow directions of the streaming points. The experimental results on three datasets demonstrate the effectiveness of StreamMap when dynamic visualization and visual analysis of trend patterns on streaming points are required.
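    The paper's super kernel density estimation is not reproduced here, but the following sketch conveys the same general idea of an adaptive-bandwidth 2-D kernel density map, with each point's bandwidth set from its k-nearest-neighbour distance (an assumed rule, not the authors' formulation).

```python
import numpy as np
from scipy.spatial import cKDTree

def adaptive_density_map(points, grid_x, grid_y, k=16):
    """2-D Gaussian KDE whose bandwidth at each point is its k-NN distance."""
    tree = cKDTree(points)
    dist, _ = tree.query(points, k=k)
    h = dist[:, -1] + 1e-9                       # per-point adaptive bandwidth
    gx, gy = np.meshgrid(grid_x, grid_y)
    dens = np.zeros_like(gx, dtype=float)
    for (px, py), hp in zip(points, h):
        r2 = (gx - px) ** 2 + (gy - py) ** 2
        dens += np.exp(-0.5 * r2 / hp**2) / (2 * np.pi * hp**2)
    return dens / len(points)

pts = np.random.default_rng(2).normal(size=(500, 2))   # toy streaming points
density = adaptive_density_map(pts, np.linspace(-3, 3, 64), np.linspace(-3, 3, 64))
```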

  2. Development of FullWave : Hot Plasma RF Simulation Tool

    NASA Astrophysics Data System (ADS)

    Svidzinski, Vladimir; Kim, Jin-Soo; Spencer, J. Andrew; Zhao, Liangji; Galkin, Sergei

    2017-10-01

    A full-wave simulation tool modeling RF fields in hot inhomogeneous magnetized plasma is being developed. The wave equations with a linearized hot plasma dielectric response are solved in configuration space on an adaptive cloud of computational points. The nonlocal hot plasma dielectric response is formulated in configuration space without limiting approximations by calculating the plasma conductivity kernel based on the solution of the linearized Vlasov equation in an inhomogeneous magnetic field. This approach allows for better resolution of plasma resonances, antenna structures and complex boundaries. The formulation of FullWave and preliminary results will be presented: construction of the finite differences for approximation of derivatives on an adaptive cloud of computational points; model and results of nonlocal conductivity kernel calculation in tokamak geometry; results of 2-D full wave simulations in the cold plasma model in tokamak geometry using the formulated approach; results of self-consistent calculations of hot plasma dielectric response and RF fields in a 1-D mirror magnetic field; preliminary results of self-consistent simulations of 2-D RF fields in a tokamak using the calculated hot plasma conductivity kernel; development of an iterative solver for wave equations. Work is supported by the U.S. DOE SBIR program.

  3. Development of full wave code for modeling RF fields in hot non-uniform plasmas

    NASA Astrophysics Data System (ADS)

    Zhao, Liangji; Svidzinski, Vladimir; Spencer, Andrew; Kim, Jin-Soo

    2016-10-01

    FAR-TECH, Inc. is developing a full wave RF modeling code to model RF fields in fusion devices and in general plasma applications. As an important component of the code, an adaptive meshless technique is introduced to solve the wave equations, which allows resolving plasma resonances efficiently and adapting to the complexity of antenna geometry and device boundary. The computational points are generated using either a point elimination method or a force balancing method based on the monitor function, which is calculated by solving the cold plasma dispersion equation locally. Another part of the code is the conductivity kernel calculation, used for modeling the nonlocal hot plasma dielectric response. The conductivity kernel is calculated on a coarse grid of test points and then interpolated linearly onto the computational points. All the components of the code are parallelized using MPI and OpenMP libraries to optimize the execution speed and memory. The algorithm and the results of our numerical approach to solving 2-D wave equations in a tokamak geometry will be presented. Work is supported by the U.S. DOE SBIR program.

  4. Voronoi cell patterns: Theoretical model and applications

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Einstein, T. L.

    2011-11-01

    We use a simple fragmentation model to describe the statistical behavior of the Voronoi cell patterns generated by a homogeneous and isotropic set of points in 1D and in 2D. In particular, we are interested in the distribution of sizes of these Voronoi cells. Our model is completely defined by two probability distributions in 1D and again in 2D, the probability to add a new point inside an existing cell and the probability that this new point is at a particular position relative to the preexisting point inside this cell. In 1D the first distribution depends on a single parameter while the second distribution is defined through a fragmentation kernel; in 2D both distributions depend on a single parameter. The fragmentation kernel and the control parameters are closely related to the physical properties of the specific system under study. We use our model to describe the Voronoi cell patterns of several systems. Specifically, we study the island nucleation with irreversible attachment, the 1D car-parking problem, the formation of second-level administrative divisions, and the pattern formed by the Paris Métro stations.

  5. Voronoi Cell Patterns: theoretical model and application to submonolayer growth

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Einstein, T. L.

    2012-02-01

    We use a simple fragmentation model to describe the statistical behavior of the Voronoi cell patterns generated by a homogeneous and isotropic set of points in 1D and in 2D. In particular, we are interested in the distribution of sizes of these Voronoi cells. Our model is completely defined by two probability distributions in 1D and again in 2D, the probability to add a new point inside an existing cell and the probability that this new point is at a particular position relative to the preexisting point inside this cell. In 1D the first distribution depends on a single parameter while the second distribution is defined through a fragmentation kernel; in 2D both distributions depend on a single parameter. The fragmentation kernel and the control parameters are closely related to the physical properties of the specific system under study. We apply our model to describe the Voronoi cell patterns of island nucleation for critical island sizes i=0,1,2,3. Experimental results for the Voronoi cells of InAs/GaAs quantum dots are also described by our model.

  6. Classification of Microarray Data Using Kernel Fuzzy Inference System

    PubMed Central

    Kumar Rath, Santanu

    2014-01-01

    The DNA microarray classification technique has gained more popularity in both research and practice. In real data analysis, such as microarray data, the dataset contains a huge number of insignificant and irrelevant features that tend to lose useful information. Classes with high relevance and feature sets with high significance are generally referred for the selected features, which determine the samples classification into their respective classes. In this paper, kernel fuzzy inference system (K-FIS) algorithm is applied to classify the microarray data (leukemia) using t-test as a feature selection method. Kernel functions are used to map original data points into a higher-dimensional (possibly infinite-dimensional) feature space defined by a (usually nonlinear) function ϕ through a mathematical process called the kernel trick. This paper also presents a comparative study for classification using K-FIS along with support vector machine (SVM) for different set of features (genes). Performance parameters available in the literature such as precision, recall, specificity, F-measure, ROC curve, and accuracy are considered to analyze the efficiency of the classification model. From the proposed approach, it is apparent that K-FIS model obtains similar results when compared with SVM model. This is an indication that the proposed approach relies on kernel function. PMID:27433543

  7. [Effects of different rootstocks on the weak light tolerance ability of summer black grape based on 4 photo-response models].

    PubMed

    Han, Xiao; Wang, Hai Bo; Wang, Xiao di; Shi, Xiang Bin; Wang, Bao Liang; Zheng, Xiao Cui; Wang, Zhi Qiang; Liu, Feng Zhi

    2017-10-01

    The photo response curves of 11 rootstock-scion combinations including summer black/Beta, summer black/1103P, summer black/101-14, summer black/3309C, summer black/140Ru, summer black/5C, summer black/5BB, summer black/420A, summer black/SO4, summer black/Kangzhen No.1, summer black/Huapu No.1 were fitted by the rectangular hyperbola model, non-rectangular hyperbola model, modified rectangular hyperbola model and exponential model respectively, and the differences in fitting performance were analyzed using the coefficient of determination, light compensation point, light saturation point, initial quantum efficiency, maximum photosynthetic rate and dark respiration rate. The result showed that the fit coefficients of all four models were above 0.98, and there was no obvious difference in the fitted values of light compensation point among the four models. The modified rectangular hyperbola model fitted best on light saturation point, apparent quantum yield, maximum photosynthetic rate and dark respiration rate, and had the minimum AIC value based on the Akaike information criterion; therefore, the modified rectangular hyperbola model was the best one. The clustering analysis indicated that summer black/SO4 and summer black/420A combinations had low light compensation point, high apparent quantum yield and low dark respiration rate among 11 rootstock-scion combinations, suggesting that these two combinations could use weak light more efficiently due to their lower respiratory consumption and higher weak light tolerance. The TOPSIS comparison method ranked summer black/SO4 and summer black/420A combinations as No. 1 and No. 2 respectively in weak light tolerance ability, which was consistent with the cluster analysis. Consequently, summer black has the highest weak light tolerance when grafted on 420A or SO4, which could be the most suitable rootstock-scion combinations for protected cultivation.

  8. Entanglement in a model for Hawking radiation: An application of quadratic algebras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bambah, Bindu A., E-mail: bbsp@uohyd.ernet.in; Mukku, C., E-mail: mukku@iiit.ac.in; Shreecharan, T., E-mail: shreecharan@gmail.com

    2013-03-15

    Quadratic polynomially deformed su(1,1) and su(2) algebras are utilized in model Hamiltonians to show how the gravitational system consisting of a black hole, infalling radiation and outgoing (Hawking) radiation can be solved exactly. The models allow us to study the long-time behaviour of the black hole and its outgoing modes. In particular, we calculate the bipartite entanglement entropies of subsystems consisting of (a) infalling plus outgoing modes and (b) black hole modes plus the infalling modes, using the Janus-faced nature of the model. The long-time behaviour also gives us glimpses of modifications in the character of Hawking radiation. Finally, we study the phenomenon of superradiance in our model in analogy with atomic Dicke superradiance. - Highlights: • We examine a toy model for Hawking radiation with quantized black hole modes. • We use quadratic polynomially deformed su(1,1) algebras to study its entanglement properties. • We study the 'Dicke Superradiance' in black hole radiation using quadratically deformed su(2) algebras. • We study the modification of the thermal character of Hawking radiation due to quantized black hole modes.

  9. Analysis of the cable equation with non-local and non-singular kernel fractional derivative

    NASA Astrophysics Data System (ADS)

    Karaagac, Berat

    2018-02-01

    Recently a new concept of differentiation was introduced in the literature where the kernel was converted from non-local singular to non-local and non-singular. One of the great advantages of this new kernel is its ability to portray fading memory and also well defined memory of the system under investigation. In this paper the cable equation which is used to develop mathematical models of signal decay in submarine or underwater telegraphic cables will be analysed using the Atangana-Baleanu fractional derivative due to the ability of the new fractional derivative to describe non-local fading memory. The existence and uniqueness of the more generalized model is presented in detail via the fixed point theorem. A new numerical scheme is used to solve the new equation. In addition, stability, convergence and numerical simulations are presented.

  10. An orthogonal oriented quadrature hexagonal image pyramid

    NASA Technical Reports Server (NTRS)

    Watson, Andrew B.; Ahumada, Albert J., Jr.

    1987-01-01

    An image pyramid has been developed with basis functions that are orthogonal, self-similar, and localized in space, spatial frequency, orientation, and phase. The pyramid operates on a hexagonal sample lattice. The set of seven basis functions consists of three even high-pass kernels, three odd high-pass kernels, and one low-pass kernel. The three even kernels are identified when rotated by 60 or 120 deg, and likewise for the odd. The seven basis functions occupy a point and a hexagon of six nearest neighbors on a hexagonal sample lattice. At the lowest level of the pyramid, the input lattice is the image sample lattice. At each higher level, the input lattice is provided by the low-pass coefficients computed at the previous level. At each level, the output is subsampled in such a way as to yield a new hexagonal lattice with a spacing √7 larger than that of the previous level, so that the number of coefficients is reduced by a factor of 7 at each level. The relationship between this image code and the processing architecture of the primate visual cortex is discussed.

  11. Methods for compressible fluid simulation on GPUs using high-order finite differences

    NASA Astrophysics Data System (ADS)

    Pekkilä, Johannes; Väisälä, Miikka S.; Käpylä, Maarit J.; Käpylä, Petri J.; Anjum, Omer

    2017-08-01

    We focus on implementing and optimizing a sixth-order finite-difference solver for simulating compressible fluids on a GPU using third-order Runge-Kutta integration. Since graphics processing units perform well in data-parallel tasks, this makes them an attractive platform for fluid simulation. However, high-order stencil computation is memory-intensive with respect to both main memory and the caches of the GPU. We present two approaches for simulating compressible fluids using 55-point and 19-point stencils. We seek to reduce the requirements for memory bandwidth and cache size in our methods by using cache blocking and decomposing a latency-bound kernel into several bandwidth-bound kernels. Our fastest implementation is bandwidth-bound and integrates 343 million grid points per second on a Tesla K40t GPU, achieving a 3.6× speedup over a comparable hydrodynamics solver benchmarked on two Intel Xeon E5-2690v3 processors. Our alternative GPU implementation is latency-bound and achieves the rate of 168 million updates per second.
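    This is not the authors' CUDA implementation, but as a reference for the numerics, the sketch below applies the standard sixth-order central-difference stencil for a first derivative on a periodic 1-D grid.

```python
import numpy as np

# Sixth-order central-difference coefficients for the first derivative
C6 = np.array([-1.0, 9.0, -45.0, 0.0, 45.0, -9.0, 1.0]) / 60.0

def ddx_6th(f, dx):
    """Sixth-order first derivative of a periodic 1-D field."""
    out = np.zeros_like(f)
    for k, c in enumerate(C6, start=-3):
        out += c * np.roll(f, -k)   # np.roll(f, -k)[i] == f[i + k] (periodic)
    return out / dx

x = np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False)
err = np.max(np.abs(ddx_6th(np.sin(x), x[1] - x[0]) - np.cos(x)))
print(f"max error: {err:.2e}")
```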

  12. Genome-wide linkage mapping of QTL for black point reaction in bread wheat (Triticum aestivum L.).

    PubMed

    Liu, Jindong; He, Zhonghu; Wu, Ling; Bai, Bin; Wen, Weie; Xie, Chaojie; Xia, Xianchun

    2016-11-01

    Nine QTL for black point resistance in wheat were identified using a RIL population derived from a Linmai 2/Zhong 892 cross and 90K SNP assay. Black point, discoloration of the embryo end of the grain, downgrades wheat grain quality leading to significant economic losses to the wheat industry. The availability of molecular markers will accelerate improvement of black point resistance in wheat breeding. The aims of this study were to identify quantitative trait loci (QTL) for black point resistance and tightly linked molecular markers, and to search for candidate genes using a high-density genetic linkage map of wheat. A recombinant inbred line (RIL) population derived from the cross Linmai 2/Zhong 892 was evaluated for black point reaction during the 2011-2012, 2012-2013 and 2013-2014 cropping seasons, providing data for seven environments. A high-density linkage map was constructed by genotyping the RILs with the wheat 90K single nucleotide polymorphism (SNP) chip. Composite interval mapping detected nine QTL on chromosomes 2AL, 2BL, 3AL, 3BL, 5AS, 6A, 7AL (2) and 7BS, designated as QBp.caas-2AL, QBp.caas-2BL, QBp.caas-3AL, QBp.caas-3BL, QBp.caas-5AS, QBp.caas-6A, QBp.caas-7AL.1, QBp.caas-7AL.2 and QBp.caas-7BS, respectively. All resistance alleles, except for QBp.caas-7AL.1 from Linmai 2, were contributed by Zhong 892. QBp.caas-3BL, QBp.caas-5AS, QBp.caas-7AL.1, QBp.caas-7AL.2 and QBp.caas-7BS probably represent new loci for black point resistance. Sequences of tightly linked SNPs were used to survey wheat and related cereal genomes identifying three candidate genes for black point resistance. The tightly linked SNP markers can be used in marker-assisted breeding in combination with the kompetitive allele specific PCR technique to improve black point resistance.

  13. Calculation of electron Dose Point Kernel in water with GEANT4 for medical application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guimaraes, C. C.; Sene, F. F.; Martinelli, J. R.

    2009-06-03

    The rapid insertion of new technologies in medical physics in recent years, especially in nuclear medicine, has been followed by a great development of faster Monte Carlo algorithms. GEANT4 is a Monte Carlo toolkit that contains the tools to simulate the problems of particle transport through matter. In this work, GEANT4 was used to calculate the dose-point-kernel (DPK) for monoenergetic electrons in water, which is an important reference medium for nuclear medicine. The three different physical models of electromagnetic interactions provided by GEANT4 - Low Energy, Penelope and Standard - were employed. To verify the adequacy of these models, the results were compared with references from the literature. For all energies and physical models, the agreement between calculated DPKs and reported values is satisfactory.

  14. Total Ambient Dose Equivalent Buildup Factor Determination for Nbs04 Concrete.

    PubMed

    Duckic, Paulina; Hayes, Robert B

    2018-06-01

    Buildup factors are dimensionless multiplicative factors required by the point kernel method to account for scattered radiation through a shielding material. The accuracy of the point kernel method is strongly affected by the correspondence of the analyzed parameters to the experimental configuration, which we attempt to simplify here. The point kernel method has not found widespread practical use for neutron shielding calculations due to the complex neutron transport behavior through shielding materials (i.e., the variety of interaction mechanisms that neutrons may undergo while traversing the shield) as well as the non-linear energy dependence of the neutron total cross section. In this work, total ambient dose buildup factors for NBS04 concrete are calculated in terms of neutron and secondary gamma ray transmission factors. The neutron and secondary gamma ray transmission factors are calculated using the MCNP6™ code with updated cross sections. Both transmission factors and buildup factors are given in tabulated form. Practical use of neutron transmission and buildup factors warrants rigorously calculated results with all associated uncertainties. In this work, a sensitivity analysis of neutron transmission factors and total buildup factors with varying water content was conducted. The analysis showed a significant impact of varying water content in concrete on both neutron transmission factors and total buildup factors. Finally, support vector regression, a machine learning technique, was employed to build a model from the calculated data for predicting buildup factors. The developed model can predict most of the data to within 20% relative error.
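    For context, the point kernel method referred to above combines an uncollided exponential attenuation term with a multiplicative buildup factor. The sketch below shows the basic dose-rate estimate for a point isotropic source; the source strength, attenuation coefficient and the linear toy buildup function are purely illustrative and are not the NBS04 data.

```python
import numpy as np

def point_kernel_dose_rate(S, mu, r, buildup, flux_to_dose=1.0):
    """Point isotropic source: attenuated 1/(4*pi*r^2) flux times a buildup factor."""
    uncollided = S * np.exp(-mu * r) / (4.0 * np.pi * r**2)
    return flux_to_dose * buildup(mu * r) * uncollided

# Illustrative (not NBS04) buildup factor as a function of mean free paths mu*r
toy_buildup = lambda mfp: 1.0 + 1.2 * mfp          # placeholder linear form
rate = point_kernel_dose_rate(S=1e9, mu=0.06, r=150.0, buildup=toy_buildup)
print(f"dose-rate estimate (arbitrary units): {rate:.3e}")
```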

  15. Towards a Holistic Cortical Thickness Descriptor: Heat Kernel-Based Grey Matter Morphology Signatures.

    PubMed

    Wang, Gang; Wang, Yalin

    2017-02-15

    In this paper, we propose a heat kernel based regional shape descriptor that may be capable of better exploiting volumetric morphological information than other available methods, thereby improving statistical power on brain magnetic resonance imaging (MRI) analysis. The mechanism of our analysis is driven by the graph spectrum and the heat kernel theory, to capture the volumetric geometry information in the constructed tetrahedral meshes. In order to capture profound brain grey matter shape changes, we first use the volumetric Laplace-Beltrami operator to determine the point pair correspondence between white-grey matter and CSF-grey matter boundary surfaces by computing the streamlines in a tetrahedral mesh. Secondly, we propose multi-scale grey matter morphology signatures to describe the transition probability by random walk between the point pairs, which reflects the inherent geometric characteristics. Thirdly, a point distribution model is applied to reduce the dimensionality of the grey matter morphology signatures and generate the internal structure features. With the sparse linear discriminant analysis, we select a concise morphology feature set with improved classification accuracies. In our experiments, the proposed work outperformed the cortical thickness features computed by FreeSurfer software in the classification of Alzheimer's disease and its prodromal stage, i.e., mild cognitive impairment, on publicly available data from the Alzheimer's Disease Neuroimaging Initiative. The multi-scale and physics based volumetric structure feature may bring stronger statistical power than some traditional methods for MRI-based grey matter morphology analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Ming-Liang, E-mail: mingliang0301@163.com

    Dynamics of disentanglement as measured by the tripartite negativity and Bell nonlocality as measured by the extent of violation of the multipartite Bell-type inequalities are investigated in this work. It is shown definitively that for the initial three-qubit Greenberger-Horne-Zeilinger (GHZ) or W class state preparation, the Bell nonlocality suffers sudden death under the influence of thermal reservoirs. Moreover, all the Bell-nonlocal states are useful for nonclassical teleportation, while there are entangled states that do not violate any Bell-type inequalities, but still yield nonclassical teleportation fidelity. - Highlights: • Comparison of different aspects of quantum correlations. • Robustness of the initial tripartite GHZ and W class states against decoherence. • Bell-nonlocality sudden death under the influence of thermal reservoir. • A nonzero minimum tripartite negativity is needed for nonclassical teleportation. • All the Bell-nonlocal states yield nonclassical teleportation fidelity.

  17. A locally adaptive kernel regression method for facies delineation

    NASA Astrophysics Data System (ADS)

    Fernàndez-Garcia, D.; Barahona-Palomo, M.; Henri, C. V.; Sanchez-Vila, X.

    2015-12-01

    Facies delineation is defined as the separation of geological units with distinct intrinsic characteristics (grain size, hydraulic conductivity, mineralogical composition). A major challenge in this area stems from the fact that only a few scattered pieces of hydrogeological information are available to delineate geological facies. Several methods to delineate facies are available in the literature, ranging from those based only on existing hard data, to those including secondary data or external knowledge about sedimentological patterns. This paper describes a methodology to use kernel regression methods as an effective tool for facies delineation. The method uses both the spatial and the actual sampled values to produce, for each individual hard data point, a locally adaptive steering kernel function, self-adjusting the principal directions of the local anisotropic kernels to the direction of highest local spatial correlation. The method is shown to outperform the nearest neighbor classification method in a number of synthetic aquifers whenever the available number of hard data is small and randomly distributed in space. In the case of exhaustive sampling, the steering kernel regression method converges to the true solution. Simulations ran in a suite of synthetic examples are used to explore the selection of kernel parameters in typical field settings. It is shown that, in practice, a rule of thumb can be used to obtain suboptimal results. The performance of the method is demonstrated to significantly improve when external information regarding facies proportions is incorporated. Remarkably, the method allows for a reasonable reconstruction of the facies connectivity patterns, shown in terms of breakthrough curves performance.
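    A stripped-down illustration of kernel regression for facies probabilities at an unsampled location is given below, using an isotropic Gaussian kernel on synthetic hard data; the locally adaptive steering of the kernel along directions of spatial correlation, which is the core of the proposed method, is noted in the comments but not implemented.

```python
import numpy as np

def kernel_facies_probability(x_query, xy_data, facies, bandwidth=10.0):
    """Nadaraya-Watson estimate of the probability of each facies at x_query.

    Isotropic Gaussian weights are used here; the paper's method additionally
    steers and elongates each kernel along the local direction of correlation.
    """
    d2 = ((xy_data - x_query) ** 2).sum(axis=1)
    w = np.exp(-0.5 * d2 / bandwidth**2)
    labels = np.unique(facies)
    probs = np.array([w[facies == c].sum() for c in labels]) / w.sum()
    return labels, probs

rng = np.random.default_rng(3)
xy = rng.uniform(0, 100, size=(40, 2))                       # scattered hard data
fac = (xy[:, 0] + rng.normal(0, 10, 40) > 50).astype(int)    # two synthetic facies
print(kernel_facies_probability(np.array([60.0, 40.0]), xy, fac))
```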

  18. Improved scatter correction using adaptive scatter kernel superposition

    NASA Astrophysics Data System (ADS)

    Sun, M.; Star-Lack, J. M.

    2010-11-01

    Accurate scatter correction is required to produce high-quality reconstructions of x-ray cone-beam computed tomography (CBCT) scans. This paper describes new scatter kernel superposition (SKS) algorithms for deconvolving scatter from projection data. The algorithms are designed to improve upon the conventional approach whose accuracy is limited by the use of symmetric kernels that characterize the scatter properties of uniform slabs. To model scatter transport in more realistic objects, nonstationary kernels, whose shapes adapt to local thickness variations in the projection data, are proposed. Two methods are introduced: (1) adaptive scatter kernel superposition (ASKS) requiring spatial domain convolutions and (2) fast adaptive scatter kernel superposition (fASKS) where, through a linearity approximation, convolution is efficiently performed in Fourier space. The conventional SKS algorithm, ASKS, and fASKS, were tested with Monte Carlo simulations and with phantom data acquired on a table-top CBCT system matching the Varian On-Board Imager (OBI). All three models accounted for scatter point-spread broadening due to object thickening, object edge effects, detector scatter properties and an anti-scatter grid. Hounsfield unit (HU) errors in reconstructions of a large pelvis phantom with a measured maximum scatter-to-primary ratio over 200% were reduced from -90 ± 58 HU (mean ± standard deviation) with no scatter correction to 53 ± 82 HU with SKS, to 19 ± 25 HU with fASKS and to 13 ± 21 HU with ASKS. HU accuracies and measured contrast were similarly improved in reconstructions of a body-sized elliptical Catphan phantom. The results show that the adaptive SKS methods offer significant advantages over the conventional scatter deconvolution technique.
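    As a rough illustration of scatter kernel superposition, the sketch below estimates scatter by convolving the current primary estimate with a single stationary kernel and subtracting it iteratively. The toy projection and Gaussian kernel are assumptions; ASKS and fASKS additionally adapt the kernel shape to local object thickness.

```python
import numpy as np
from scipy.signal import fftconvolve

def sks_scatter_correction(projection, kernel, n_iter=5):
    """Iteratively estimate scatter as primary convolved with a kernel, then subtract."""
    primary = projection.copy()
    for _ in range(n_iter):
        scatter = fftconvolve(primary, kernel, mode="same")
        primary = np.clip(projection - scatter, 0.0, None)
    return primary, scatter

# Toy projection and a broad, low-amplitude Gaussian scatter kernel
proj = np.ones((128, 128))
yy, xx = np.mgrid[-32:32, -32:32]
k = np.exp(-(xx**2 + yy**2) / (2 * 12.0**2))
k *= 0.2 / k.sum()                                 # scatter-to-primary amplitude ~0.2
corrected, scatter_est = sks_scatter_correction(proj, k)
```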

  19. On the solution of integral equations with a generalized cauchy kernel

    NASA Technical Reports Server (NTRS)

    Kaya, A. C.; Erdogan, F.

    1986-01-01

    In this paper a certain class of singular integral equations that may arise from the mixed boundary value problems in nonhomogeneous materials is considered. The distinguishing feature of these equations is that in addition to the Cauchy singularity, the kernels contain terms that are singular only at the end points. In the form of the singular integral equations adopted, the density function is a potential or a displacement and consequently the kernel has strong singularities of the form (t-x)^-2 and x^(n-2)(t+x)^-n (n ≥ 2, 0 < x, t < b). The complex function theory is used to determine the fundamental function of the problem for the general case and a simple numerical technique is described to solve the integral equation. Two examples from the theory of elasticity are then considered to show the application of the technique.

  20. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction.

    PubMed

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical property of natural images, so its detection performance is not satisfying. PCA has been extended into kernel PCA in order to capture the higher-order statistics. However, thus far there have been no researchers who have definitely proposed kernel FKT (KFKT) and researched its detection performance. For accurately detecting potential small targets from infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and the proposed framework is competent to automatically detect and track infrared point targets.

  1. Automatically detect and track infrared small targets with kernel Fukunaga-Koontz transform and Kalman prediction

    NASA Astrophysics Data System (ADS)

    Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan

    2007-11-01

    Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical property of natural images, so its detection performance is not satisfying. PCA has been extended into kernel PCA in order to capture the higher-order statistics. However, thus far there have been no researchers who have definitely proposed kernel FKT (KFKT) and researched its detection performance. For accurately detecting potential small targets from infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and the proposed framework is competent to automatically detect and track infrared point targets.

  2. Pixel-based meshfree modelling of skeletal muscles.

    PubMed

    Chen, Jiun-Shyan; Basava, Ramya Rao; Zhang, Yantao; Csapo, Robert; Malis, Vadim; Sinha, Usha; Hodgson, John; Sinha, Shantanu

    2016-01-01

    This paper introduces the meshfree Reproducing Kernel Particle Method (RKPM) for 3D image-based modeling of skeletal muscles. This approach allows for construction of simulation model based on pixel data obtained from medical images. The material properties and muscle fiber direction obtained from Diffusion Tensor Imaging (DTI) are input at each pixel point. The reproducing kernel (RK) approximation allows a representation of material heterogeneity with smooth transition. A multiphase multichannel level set based segmentation framework is adopted for individual muscle segmentation using Magnetic Resonance Images (MRI) and DTI. The application of the proposed methods for modeling the human lower leg is demonstrated.

  3. Estimating average growth trajectories in shape-space using kernel smoothing.

    PubMed

    Hutton, Tim J; Buxton, Bernard F; Hammond, Peter; Potts, Henry W W

    2003-06-01

    In this paper, we show how a dense surface point distribution model of the human face can be computed and demonstrate the usefulness of the high-dimensional shape-space for expressing the shape changes associated with growth and aging. We show how average growth trajectories for the human face can be computed in the absence of longitudinal data by using kernel smoothing across a population. A training set of three-dimensional surface scans of 199 male and 201 female subjects of between 0 and 50 years of age is used to build the model.
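    A minimal sketch of the kernel-smoothing step follows, assuming shapes are already expressed as coordinate vectors in a common shape space: the average shape at a query age is a Gaussian-weighted mean over the cross-sectional sample. The bandwidth and the synthetic data are placeholders, not the study's model.

```python
import numpy as np

def kernel_smoothed_mean_shape(ages, shapes, query_age, bandwidth=2.0):
    """Weighted mean shape at a given age using a Gaussian kernel over age."""
    w = np.exp(-0.5 * ((ages - query_age) / bandwidth) ** 2)
    return (w[:, None] * shapes).sum(axis=0) / w.sum()

rng = np.random.default_rng(6)
ages = rng.uniform(0, 50, size=400)                          # cross-sectional sample
shapes = rng.normal(size=(400, 30)) + ages[:, None] * 0.02   # toy shape-space coordinates
trajectory = np.stack([kernel_smoothed_mean_shape(ages, shapes, a)
                       for a in range(0, 51, 5)])            # average growth trajectory
```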

  4. On the solution of integral equations with strongly singular kernels

    NASA Technical Reports Server (NTRS)

    Kaya, A. C.; Erdogan, F.

    1986-01-01

    Some useful formulas are developed to evaluate integrals having a singularity of the form (t-x)^-m, m ≥ 1. Interpreting the integrals with strong singularities in Hadamard sense, the results are used to obtain approximate solutions of singular integral equations. A mixed boundary value problem from the theory of elasticity is considered as an example. Particularly for integral equations where the kernel contains, in addition to the dominant term (t-x)^-m, terms which become unbounded at the end points, the present technique appears to be extremely effective to obtain rapidly converging numerical results.

  5. On the solution of integral equations with strong ly singular kernels

    NASA Technical Reports Server (NTRS)

    Kaya, A. C.; Erdogan, F.

    1985-01-01

    In this paper some useful formulas are developed to evaluate integrals having a singularity of the form (t-x)^-m, m ≥ 1. Interpreting the integrals with strong singularities in Hadamard sense, the results are used to obtain approximate solutions of singular integral equations. A mixed boundary value problem from the theory of elasticity is considered as an example. Particularly for integral equations where the kernel contains, in addition to the dominant term (t-x)^-m, terms which become unbounded at the end points, the present technique appears to be extremely effective to obtain rapidly converging numerical results.

  6. On the solution of integral equations with strongly singular kernels

    NASA Technical Reports Server (NTRS)

    Kaya, A. C.; Erdogan, F.

    1987-01-01

    Some useful formulas are developed to evaluate integrals having a singularity of the form (t-x)^-m, m ≥ 1. Interpreting the integrals with strong singularities in Hadamard sense, the results are used to obtain approximate solutions of singular integral equations. A mixed boundary value problem from the theory of elasticity is considered as an example. Particularly for integral equations where the kernel contains, in addition to the dominant term (t-x)^-m, terms which become unbounded at the end points, the present technique appears to be extremely effective to obtain rapidly converging numerical results.
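    For background, integrals of this type are often interpreted as Hadamard finite-part integrals; for m = 2 a commonly used identity relates the finite part to the derivative of a Cauchy principal value integral (stated here as general background, not quoted from the paper):

```latex
% Hadamard finite-part (f.p.) integral related to the Cauchy principal value (p.v.) for m = 2
\[
  \operatorname{f.p.}\!\int_a^b \frac{f(t)}{(t-x)^{2}}\,dt
  \;=\; \frac{d}{dx}\,\operatorname{p.v.}\!\int_a^b \frac{f(t)}{t-x}\,dt ,
  \qquad a < x < b .
\]
```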

  7. A shock-capturing SPH scheme based on adaptive kernel estimation

    NASA Astrophysics Data System (ADS)

    Sigalotti, Leonardo Di G.; López, Hender; Donoso, Arnaldo; Sira, Eloy; Klapp, Jaime

    2006-02-01

    Here we report a method that converts standard smoothed particle hydrodynamics (SPH) into a working shock-capturing scheme without relying on solutions to the Riemann problem. Unlike existing adaptive SPH simulations, the present scheme is based on an adaptive kernel estimation of the density, which combines intrinsic features of both the kernel and nearest neighbor approaches in a way that the amount of smoothing required in low-density regions is effectively controlled. Symmetrized SPH representations of the gas dynamic equations along with the usual kernel summation for the density are used to guarantee variational consistency. Implementation of the adaptive kernel estimation involves a very simple procedure and allows for a unique scheme that handles strong shocks and rarefactions the same way. Since it represents a general improvement of the integral interpolation on scattered data, it is also applicable to other fluid-dynamic models. When the method is applied to supersonic compressible flows with sharp discontinuities, as in the classical one-dimensional shock-tube problem and its variants, the accuracy of the results is comparable, and in most cases superior, to that obtained from high quality Godunov-type methods and SPH formulations based on Riemann solutions. The extension of the method to two- and three-space dimensions is straightforward. In particular, for the two-dimensional cylindrical Noh's shock implosion and Sedov point explosion problems the present scheme produces much better results than those obtained with conventional SPH codes.
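
    The density estimate at the heart of any SPH scheme is a kernel-weighted sum over neighbours, and adaptive variants tie the smoothing length to the local particle spacing. The 1D sketch below uses the standard cubic spline kernel and a fixed-point iteration on the smoothing length; it illustrates the generic adaptive-kernel idea rather than the specific estimator of the paper.

      import numpy as np

      def w_cubic(r, h):
          """Standard 1D cubic spline SPH kernel with support 2h."""
          q = np.abs(r) / h
          w = np.where(q < 1.0, 1.0 - 1.5 * q ** 2 + 0.75 * q ** 3,
              np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))
          return 2.0 / (3.0 * h) * w

      def sph_density(x, m, eta=1.2, n_iter=10):
          """Density with per-particle smoothing length h_i = eta * m_i / rho_i (1D)."""
          h = np.full(x.size, eta * (x.max() - x.min()) / x.size)     # initial guess
          for _ in range(n_iter):                                     # fixed-point iteration on h
              rho = np.array([np.sum(m * w_cubic(xi - x, hi)) for xi, hi in zip(x, h)])
              h = eta * m / rho                                       # smaller h where particles are dense
          return rho, h

      x = np.sort(np.concatenate([np.linspace(0.0, 1.0, 80), np.linspace(1.02, 2.0, 20)]))  # density jump
      m = np.full(x.size, 1.0 / x.size)
      rho, h = sph_density(x, m)
      print(rho[:3], rho[-3:])     # the finely spaced left region has higher density than the right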

  8. Providing the Fire Risk Map in Forest Area Using a Geographically Weighted Regression Model with Gaussian Kernel and MODIS Images, a Case Study: Golestan Province

    NASA Astrophysics Data System (ADS)

    Shah-Heydari pour, A.; Pahlavani, P.; Bigdeli, B.

    2017-09-01

    With the industrialization of cities and the marked increase in pollutants and greenhouse gases, the role of forests as the natural lungs of the earth in removing these pollutants is more important than ever. Every year a large part of the forest is destroyed because of the lack of timely action during fires. Knowing which areas are at high risk of fire, and equipping them with access routes and fire-fighting equipment, can help limit the destruction of the forest. In this research, the fire risk of the region was forecast and a risk map was produced from MODIS images by applying a geographically weighted regression model with a Gaussian kernel, as well as ordinary least squares, to the parameters that influence forest fire, including distance from residential areas, distance from the river, distance from the road, height, slope, aspect, soil type, land use, average temperature, wind speed, and rainfall. The evaluation showed that the geographically weighted regression model with a Gaussian kernel correctly forecast 93.4% of all fire points, whereas the ordinary least squares method correctly forecast only 66% of the fire points.
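
    Geographically weighted regression fits a separate weighted least-squares model at each location, with weights that decay with distance according to, here, a Gaussian kernel. A minimal sketch on synthetic data follows; the bandwidth, the single covariate and the fire-risk proxy are illustrative assumptions.

      import numpy as np

      def gwr_gaussian(coords, X, y, bandwidth):
          """Local weighted-least-squares coefficients at every observation point."""
          X1 = np.column_stack([np.ones(len(y)), X])            # add an intercept column
          betas = []
          for c in coords:
              d2 = np.sum((coords - c) ** 2, axis=1)
              w = np.exp(-d2 / (2.0 * bandwidth ** 2))          # Gaussian kernel weights
              W = np.diag(w)
              betas.append(np.linalg.solve(X1.T @ W @ X1, X1.T @ W @ y))
          return np.array(betas)

      rng = np.random.default_rng(1)
      coords = rng.uniform(0, 100, size=(200, 2))               # observation locations (km)
      slope = 0.5 + coords[:, 0] / 100.0                        # spatially varying true effect
      X = rng.normal(size=200)                                  # a standardized covariate, e.g. distance from roads
      y = 2.0 + slope * X + rng.normal(scale=0.1, size=200)     # fire-risk proxy
      betas = gwr_gaussian(coords, X, y, bandwidth=20.0)
      print(betas[:3])       # local [intercept, slope] pairs vary smoothly across the region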

  9. Fast dose kernel interpolation using Fourier transform with application to permanent prostate brachytherapy dosimetry.

    PubMed

    Liu, Derek; Sloboda, Ron S

    2014-05-01

    Boyer and Mok proposed a fast calculation method employing the Fourier transform (FT), for which calculation time is independent of the number of seeds but seed placement is restricted to calculation grid points. Here an interpolation method is described enabling unrestricted seed placement while preserving the computational efficiency of the original method. The Iodine-125 seed dose kernel was sampled and selected values were modified to optimize interpolation accuracy for clinically relevant doses. For each seed, the kernel was shifted to the nearest grid point via convolution with a unit impulse, implemented in the Fourier domain. The remaining fractional shift was performed using a piecewise third-order Lagrange filter. Implementation of the interpolation method greatly improved FT-based dose calculation accuracy. The dose distribution was accurate to within 2% beyond 3 mm from each seed. Isodose contours were indistinguishable from explicit TG-43 calculation. Dose-volume metric errors were negligible. Computation time for the FT interpolation method was essentially the same as Boyer's method. A FT interpolation method for permanent prostate brachytherapy TG-43 dose calculation was developed which expands upon Boyer's original method and enables unrestricted seed placement. The proposed method substantially improves the clinically relevant dose accuracy with negligible additional computation cost, preserving the efficiency of the original method.
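
    The underlying idea is that a dose kernel pre-sampled on a grid can be relocated to an arbitrary seed position by a convolution carried out in the Fourier domain. The 1D sketch below shifts a sampled kernel by a non-grid amount using the FFT shift theorem; the Gaussian stand-in kernel and the grid are placeholders, and the paper's actual fractional shift uses a piecewise third-order Lagrange filter rather than a pure phase ramp.

      import numpy as np

      n, dx = 256, 0.5                                  # grid size and spacing in mm (illustrative)
      x = (np.arange(n) - n // 2) * dx
      kernel = np.exp(-0.5 * (x / 5.0) ** 2)            # stand-in for a sampled seed dose kernel

      def shift_kernel_fourier(kernel, shift_mm, dx):
          """Shift a sampled kernel by an arbitrary (off-grid) distance via the FFT shift theorem."""
          k = np.fft.fftfreq(kernel.size, d=dx)         # spatial frequencies (cycles/mm)
          phase = np.exp(-2j * np.pi * k * shift_mm)    # shift theorem: multiply the spectrum by a phase
          return np.real(np.fft.ifft(np.fft.fft(kernel) * phase))

      shifted = shift_kernel_fourier(kernel, shift_mm=1.3, dx=dx)   # seed sitting 1.3 mm off a grid point
      print(x[np.argmax(kernel)], x[np.argmax(shifted)])            # peak moves from 0.0 to the sample nearest 1.3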

  10. Rediscovering the Kernels of Truth in the Urban Legends of the Freshman Composition Classroom

    ERIC Educational Resources Information Center

    Lovoy, Thomas

    2004-01-01

    English teachers, as well as teachers within other disciplines, often boil down abstract principles to easily explainable bullet points. Students often pick up and retain these points but fail to grasp the broader contexts that make them relevant. It is therefore sometimes helpful to revisit some of the more common of these "rules of thumb" to…

  11. Feasibility study, software design, layout and simulation of a two-dimensional fast Fourier transform machine for use in optical array interferometry

    NASA Technical Reports Server (NTRS)

    Boriakoff, Valentin; Chen, Wei

    1990-01-01

    The NASA-Cornell Univ.-Worcester Polytechnic Institute Fast Fourier Transform (FFT) chip based on the architecture of the systolic FFT computation as presented by Boriakoff is implemented into an operating device design. The kernel of the system, a systolic inner product floating point processor, was designed to be assembled into a systolic network that would take incoming data streams in pipeline fashion and provide an FFT output at the same rate, word by word. It was thoroughly simulated for proper operation, and it has passed a comprehensive set of tests showing no operational errors. The black box specifications of the chip, which conform to the initial requirements of the design as specified by NASA, are given. The five subcells are described and their high level function description, logic diagrams, and simulation results are presented. Some modifications of the Read Only Memory (ROM) design were made, since some errors were found in it. Because a four-stage pipeline structure was used, simulating such a structure is more difficult than simulating an ordinary structure. Simulation methods are discussed. Chip signal protocols and chip pinout are explained.

  12. Blur kernel estimation with algebraic tomography technique and intensity profiles of object boundaries

    NASA Astrophysics Data System (ADS)

    Ingacheva, Anastasia; Chukalina, Marina; Khanipov, Timur; Nikolaev, Dmitry

    2018-04-01

    Motion blur caused by camera vibration is a common source of degradation in photographs. In this paper we study the problem of finding the point spread function (PSF) of a blurred image using the tomography technique. The PSF reconstruction result strongly depends on the particular tomography technique used. We present a tomography algorithm with regularization adapted specifically for this task. We use the algebraic reconstruction technique (ART algorithm) as the starting algorithm and introduce regularization. We use the conjugate gradient method for numerical implementation of the proposed approach. The algorithm is tested using a dataset which contains 9 kernels extracted from real photographs by the Adobe corporation, for which the point spread function is known. We also investigate the influence of noise on the quality of image reconstruction and how the number of projections influences the reconstruction error.
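
    The ART algorithm referred to above updates the reconstruction one projection equation at a time (a Kaczmarz sweep), and regularization is layered on top of this basic iteration. A minimal unregularized sketch for a generic linear system A x = b follows; the random test system standing in for the tomographic projections is purely illustrative.

      import numpy as np

      def art(A, b, n_sweeps=50, relax=1.0):
          """Algebraic Reconstruction Technique (Kaczmarz): cycle through the rows of A."""
          x = np.zeros(A.shape[1])
          row_norms = np.sum(A * A, axis=1)
          for _ in range(n_sweeps):
              for i in range(A.shape[0]):
                  residual = b[i] - A[i] @ x
                  x += relax * residual / row_norms[i] * A[i]    # project onto the i-th hyperplane
          return x

      rng = np.random.default_rng(2)
      psf_true = rng.random(16)                   # stand-in for a flattened blur kernel (PSF)
      A = rng.random((64, 16))                    # stand-in projection (tomography) matrix
      b = A @ psf_true                            # noiseless "projections"
      psf_rec = art(A, b)
      print(np.max(np.abs(psf_rec - psf_true)))   # small for this consistent, overdetermined system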

  13. Extremal black holes, Stueckelberg scalars and phase transitions

    NASA Astrophysics Data System (ADS)

    Marrani, Alessio; Miskovic, Olivera; Leon, Paula Quezada

    2018-02-01

    We calculate the entropy of a static extremal black hole in 4D gravity, non-linearly coupled to a massive Stueckelberg scalar. We find that the scalar field does not allow the black hole to be magnetically charged. We also show that the system can exhibit a phase transition due to electric charge variations. For spherical and hyperbolic horizons, the critical point exists only in presence of a cosmological constant, and if the scalar is massive and non-linearly coupled to electromagnetic field. On one side of the critical point, two extremal solutions coexist: Reissner-Nordström (A)dS black hole and the charged hairy (A)dS black hole, while on the other side of the critical point the black hole does not have hair. A near-critical analysis reveals that the hairy black hole has larger entropy, thus giving rise to a zero temperature phase transition. This is characterized by a discontinuous second derivative of the entropy with respect to the electric charge at the critical point. The results obtained here are analytical and based on the entropy function formalism and the second law of thermodynamics.

  14. Thermodynamics sheds light on black hole dynamics

    NASA Astrophysics Data System (ADS)

    Cárdenas, Marcela; Julié, Félix-Louis; Deruelle, Nathalie

    2018-06-01

    We propose to unify two a priori distinct aspects of black hole physics: their thermodynamics, and their description as point particles, which is an essential starting point in the post-Newtonian approach to their dynamics. We will find that, when reducing a black hole to a point particle endowed with its specific effective mass, one in fact describes a black hole satisfying the first law of thermodynamics, such that its global charges, and hence its entropy, remain constant. This gives a thermodynamical interpretation of its effective mass, thus opening a promising synergy between black hole thermodynamics and the analytical approaches to the two-body problems in gravity theories. To illustrate this relationship, the Einstein-Maxwell-dilaton theory, which contains simple examples of asymptotically flat, hairy black hole solutions, will serve as a laboratory.

  15. Water Quality Sensing and Spatio-Temporal Monitoring Structure with Autocorrelation Kernel Methods.

    PubMed

    Vizcaíno, Iván P; Carrera, Enrique V; Muñoz-Romero, Sergio; Cumbal, Luis H; Rojo-Álvarez, José Luis

    2017-10-16

    Pollution on water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniform spatio-temporal sampled data structure to characterize complex dynamics phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatial-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information of the quality parameter measurements through Support Vector Regression (SVR) based on Mercer's kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatial-temporal maps. The best interpolation performance in terms of mean absolute error was the SVR with Mercer's kernel given by either the Mahalanobis spatial-temporal covariance matrix or by the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which points out the relevance of including a priori knowledge of the problem.
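
    Plugging a problem-specific Mercer kernel into SVR can be done by passing a precomputed Gram matrix. The sketch below shows the mechanics on synthetic spatio-temporal samples; the separable space-time Gaussian kernel stands in for the Mahalanobis and autocorrelation kernels of the paper, and the lengthscales and toy data are illustrative assumptions.

      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(3)
      Xtr = np.column_stack([rng.uniform(0, 10, 150),          # position along the river (km)
                             rng.uniform(0, 30, 150)])         # sampling day
      ytr = np.sin(Xtr[:, 0]) + 0.1 * Xtr[:, 1] + rng.normal(scale=0.05, size=150)   # toy quality variable

      def st_kernel(A, B, ls_space=1.0, ls_time=5.0):
          """Separable space-time Gaussian kernel (a valid Mercer kernel)."""
          ds = (A[:, None, 0] - B[None, :, 0]) / ls_space
          dt = (A[:, None, 1] - B[None, :, 1]) / ls_time
          return np.exp(-0.5 * (ds ** 2 + dt ** 2))

      model = SVR(kernel="precomputed", C=10.0, epsilon=0.01)
      model.fit(st_kernel(Xtr, Xtr), ytr)                       # Gram matrix between training samples

      Xte = np.array([[2.5, 12.0], [7.0, 25.0]])                # unsampled space-time locations
      print(model.predict(st_kernel(Xte, Xtr)))                 # statistically interpolated estimates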

  16. Water Quality Sensing and Spatio-Temporal Monitoring Structure with Autocorrelation Kernel Methods

    PubMed Central

    Vizcaíno, Iván P.; Muñoz-Romero, Sergio; Cumbal, Luis H.

    2017-01-01

    Pollution on water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniform spatio-temporal sampled data structure to characterize complex dynamics phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatial-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information of the quality parameter measurements through Support Vector Regression (SVR) based on Mercer’s kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatial-temporal maps. The best interpolation performance in terms of mean absolute error was the SVR with Mercer’s kernel given by either the Mahalanobis spatial-temporal covariance matrix or by the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which points out the relevance of including a priori knowledge of the problem. PMID:29035333

  17. Lossy Wavefield Compression for Full-Waveform Inversion

    NASA Astrophysics Data System (ADS)

    Boehm, C.; Fichtner, A.; de la Puente, J.; Hanzich, M.

    2015-12-01

    We present lossy compression techniques, tailored to the inexact computation of sensitivity kernels, that significantly reduce the memory requirements of adjoint-based minimization schemes. Adjoint methods are a powerful tool to solve tomography problems in full-waveform inversion (FWI). Yet they face the challenge of massive memory requirements caused by the opposite directions of forward and adjoint simulations and the necessity to access both wavefields simultaneously during the computation of the sensitivity kernel. Thus, storage, I/O operations, and memory bandwidth become key topics in FWI. In this talk, we present strategies for the temporal and spatial compression of the forward wavefield. This comprises re-interpolation with coarse time steps and an adaptive polynomial degree of the spectral element shape functions. In addition, we predict the projection errors on a hierarchy of grids and re-quantize the residuals with an adaptive floating-point accuracy to improve the approximation. Furthermore, we use the first arrivals of adjoint waves to identify "shadow zones" that do not contribute to the sensitivity kernel at all. Updating and storing the wavefield within these shadow zones is skipped, which reduces memory requirements and computational costs at the same time. Compared to check-pointing, our approach has only a negligible computational overhead, utilizing the fact that a sufficiently accurate sensitivity kernel does not require a fully resolved forward wavefield. Furthermore, we use adaptive compression thresholds during the FWI iterations to ensure convergence. Numerical experiments on the reservoir scale and for the Western Mediterranean prove the high potential of this approach with an effective compression factor of 500-1000. Furthermore, it is computationally cheap and easy to integrate into both finite-difference and finite-element wave propagation codes.
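
    One simple way to picture the "coarse prediction plus re-quantized residual" idea is to store a subsampled copy of a trace together with an integer-quantized residual whose step size is set by an error budget. The sketch below is only a toy illustration of that principle; the synthetic trace, the factor-of-4 coarsening and the int8 residual are assumptions, not the authors' scheme.

      import numpy as np

      def compress(field, factor=4):
          """Store a coarse copy plus an int8-quantized residual of a 1D trace."""
          coarse = field[::factor]                                           # temporal subsampling
          grid = np.arange(field.size)
          predicted = np.interp(grid, grid[::factor], coarse)                # cheap re-interpolation
          residual = field - predicted
          step = 2.0 * np.max(np.abs(residual)) / 254.0                      # quantization step from the residual range
          q = np.round(residual / step).astype(np.int8)                      # one byte per sample
          return coarse, q, step

      def decompress(coarse, q, step, factor=4):
          grid = np.arange(q.size)
          return np.interp(grid, grid[::factor], coarse) + q.astype(float) * step

      t = np.linspace(0, 1, 4000)
      trace = np.sin(40 * t) * np.exp(-3 * t)                                # stand-in wavefield trace
      coarse, q, step = compress(trace)
      rec = decompress(coarse, q, step)
      print(np.max(np.abs(rec - trace)) / np.max(np.abs(trace)))             # bounded relative error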

  18. Dynamics of nonautonomous rogue waves in Bose-Einstein condensate

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Li-Chen, E-mail: zhaolichen3@163.com

    2013-02-15

    We study rogue waves of Bose-Einstein condensate (BEC) analytically in a time-dependent harmonic trap with a complex potential. Properties of the nonautonomous rogue waves are investigated analytically. It is reported that there are possibilities to 'catch' rogue waves through properly manipulating the nonlinear interaction. The results provide many possibilities to manipulate rogue waves experimentally in a BEC system. - Highlights: ► More generalized rogue wave solutions are presented. ► Present one possible way to catch a rogue wave. ► Properties of rogue waves are investigated analytically for the first time. ► Provide many possibilities to manipulate rogue waves in BEC.

  19. Dynamic recrystallization in friction surfaced austenitic stainless steel coatings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Puli, Ramesh, E-mail: rameshpuli2000@gmail.com; Janaki Ram, G.D.

    2012-12-15

    Friction surfacing involves complex thermo-mechanical phenomena. In this study, the nature of dynamic recrystallization in friction surfaced austenitic stainless steel AISI 316L coatings was investigated using electron backscattered diffraction and transmission electron microscopy. The results show that the alloy 316L undergoes discontinuous dynamic recrystallization under conditions of moderate Zener-Hollomon parameter during friction surfacing. - Highlights: ► Dynamic recrystallization in alloy 316L friction surfaced coatings is examined. ► Friction surfacing leads to discontinuous dynamic recrystallization in alloy 316L. ► Strain rates in friction surfacing exceed 400 s⁻¹. ► Estimated grain size matches well with experimental observations in 316L coatings.

  20. The intriguing enhancement of chloroperoxidase mediated one-electron oxidations by azide, a known active-site ligand

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrew, Daniel; Hager, Lowell; Manoj, Kelath Murali, E-mail: muralimanoj@vit.ac.in

    2011-12-02

    Highlights: ► Azide is a well known heme-enzyme active site ligand and inhibitor. ► Herein, azide is reported to enhance a set of heme-enzyme mediated reactions. ► This effect is disconnected from native enzyme-azide binding. ► Azide could enhance heme-enzyme reactions via a newly proposed mechanism. ► Azide contained in reagents could impact reaction outcomes in redox biochemistry. -- Abstract: Azide is a well-known inhibitor of heme-enzymes. Herein, we report the counter-intuitive observation that at some concentration regimes, incorporation of azide in the reaction medium enhances chloroperoxidase (CPO, a heme-enzyme) mediated one-electron abstractions from several substrates. A diffusible azidyl radical based mechanism is proposed for explaining the phenomenon. Further, it is projected that the finding could have significant impact on routine in situ or in vitro biochemistry studies involving heme-enzyme systems and azide.

  1. Biologically-Inspired Spike-Based Automatic Speech Recognition of Isolated Digits Over a Reproducing Kernel Hilbert Space

    PubMed Central

    Li, Kan; Príncipe, José C.

    2018-01-01

    This paper presents a novel real-time dynamic framework for quantifying time-series structure in spoken words using spikes. Audio signals are converted into multi-channel spike trains using a biologically-inspired leaky integrate-and-fire (LIF) spike generator. These spike trains are mapped into a function space of infinite dimension, i.e., a Reproducing Kernel Hilbert Space (RKHS) using point-process kernels, where a state-space model learns the dynamics of the multidimensional spike input using gradient descent learning. This kernelized recurrent system is very parsimonious and achieves the necessary memory depth via feedback of its internal states when trained discriminatively, utilizing the full context of the phoneme sequence. A main advantage of modeling nonlinear dynamics using state-space trajectories in the RKHS is that it imposes no restriction on the relationship between the exogenous input and its internal state. We are free to choose the input representation with an appropriate kernel, and changing the kernel does not impact the system nor the learning algorithm. Moreover, we show that this novel framework can outperform both traditional hidden Markov model (HMM) speech processing and neuromorphic implementations based on spiking neural networks (SNN), yielding accurate and ultra-low power word spotters. As a proof of concept, we demonstrate its capabilities using the benchmark TI-46 digit corpus for isolated-word automatic speech recognition (ASR) or keyword spotting. Compared to HMM using Mel-frequency cepstral coefficient (MFCC) front-end without time-derivatives, our MFCC-KAARMA offered improved performance. For the spike-train front-end, spike-KAARMA also outperformed state-of-the-art SNN solutions. Furthermore, compared to MFCCs, spike trains provided enhanced noise robustness in certain low signal-to-noise ratio (SNR) regimes. PMID:29666568
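
    The front-end step, converting a signal into spike trains with a bank of leaky integrate-and-fire units, can be sketched in a few lines. The following is a generic LIF encoder with illustrative gains, threshold and time constant, not the authors' tuned spike generator.

      import numpy as np

      def lif_spike_train(signal, dt=1e-3, tau=0.02, threshold=1.0, gain=100.0):
          """Leaky integrate-and-fire encoding of one channel; returns spike times in seconds."""
          v, spikes = 0.0, []
          for i, s in enumerate(signal):
              v += dt * (-v / tau + gain * s)      # leaky integration of the input
              if v >= threshold:
                  spikes.append(i * dt)            # emit a spike and reset the membrane
                  v = 0.0
          return np.array(spikes)

      t = np.arange(0.0, 0.5, 1e-3)
      envelope = 0.5 * (1.0 + np.sin(2 * np.pi * 4 * t))                   # stand-in for one filter-bank channel
      channels = [lif_spike_train(envelope * g) for g in (1.0, 1.5, 2.0)]  # multi-channel spike trains
      print([len(c) for c in channels])            # stronger drive produces more spikes per channel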

  2. Biologically-Inspired Spike-Based Automatic Speech Recognition of Isolated Digits Over a Reproducing Kernel Hilbert Space.

    PubMed

    Li, Kan; Príncipe, José C

    2018-01-01

    This paper presents a novel real-time dynamic framework for quantifying time-series structure in spoken words using spikes. Audio signals are converted into multi-channel spike trains using a biologically-inspired leaky integrate-and-fire (LIF) spike generator. These spike trains are mapped into a function space of infinite dimension, i.e., a Reproducing Kernel Hilbert Space (RKHS) using point-process kernels, where a state-space model learns the dynamics of the multidimensional spike input using gradient descent learning. This kernelized recurrent system is very parsimonious and achieves the necessary memory depth via feedback of its internal states when trained discriminatively, utilizing the full context of the phoneme sequence. A main advantage of modeling nonlinear dynamics using state-space trajectories in the RKHS is that it imposes no restriction on the relationship between the exogenous input and its internal state. We are free to choose the input representation with an appropriate kernel, and changing the kernel does not impact the system nor the learning algorithm. Moreover, we show that this novel framework can outperform both traditional hidden Markov model (HMM) speech processing and neuromorphic implementations based on spiking neural networks (SNN), yielding accurate and ultra-low power word spotters. As a proof of concept, we demonstrate its capabilities using the benchmark TI-46 digit corpus for isolated-word automatic speech recognition (ASR) or keyword spotting. Compared to HMM using Mel-frequency cepstral coefficient (MFCC) front-end without time-derivatives, our MFCC-KAARMA offered improved performance. For the spike-train front-end, spike-KAARMA also outperformed state-of-the-art SNN solutions. Furthermore, compared to MFCCs, spike trains provided enhanced noise robustness in certain low signal-to-noise ratio (SNR) regimes.

  3. On- and off-axis spectral emission features from laser-produced gas breakdown plasmas

    NASA Astrophysics Data System (ADS)

    Harilal, S. S.; Skrodzki, P. J.; Miloshevsky, A.; Brumfield, B. E.; Phillips, M. C.; Miloshevsky, G.

    2017-06-01

    Laser-heated gas breakdown plasmas or sparks emit profoundly in the ultraviolet and visible region of the electromagnetic spectrum with contributions from ionic, atomic, and molecular species. Laser created kernels expand into a cold ambient with high velocities during their early lifetime followed by confinement of the plasma kernel and eventually collapse. However, the plasma kernels produced during laser breakdown of gases are also capable of exciting and ionizing the surrounding ambient medium. Two mechanisms can be responsible for excitation and ionization of the surrounding ambient: photoexcitation and ionization by intense ultraviolet emission from the sparks produced during the early times of their creation and/or heating by strong shocks generated by the kernel during its expansion into the ambient. In this study, an investigation is made on the spectral features of on- and off-axis emission of laser-induced plasma breakdown kernels generated in atmospheric pressure conditions with an aim to elucidate the mechanisms leading to ambient excitation and emission. Pulses from an Nd:YAG laser emitting at 1064 nm with a pulse duration of 6 ns are used to generate plasma kernels. Laser sparks were generated in air, argon, and helium gases to provide different physical properties of expansion dynamics and plasma chemistry considering the differences in laser absorption properties, mass density, and speciation. Point shadowgraphy and time-resolved imaging were used to evaluate the shock wave and spark self-emission morphology at early and late times, while space and time resolved spectroscopy is used for evaluating the emission features and for inferring plasma physical conditions at on- and off-axis positions. The structure and dynamics of the plasma kernel obtained using imaging techniques are also compared to numerical simulations using the computational fluid dynamics code. The emission from the kernel showed that spectral features from ions, atoms, and molecules are separated in time with early time temperatures and densities in excess of 35 000 K and 4 × 10¹⁸ cm⁻³ with an existence of thermal equilibrium. However, the emission from the off-kernel positions from the breakdown plasmas showed enhanced ultraviolet radiation with the presence of N2 bands and is represented by non-local thermodynamic equilibrium (non-LTE) conditions. Our results also highlight that the ultraviolet radiation emitted during the early time of spark evolution is the predominant source of the photo-excitation of the surrounding medium.

  4. On- and off-axis spectral emission features from laser-produced gas breakdown plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harilal, S. S.; Skrodzki, P. J.; Miloshevsky, A.

    Laser-heated gas breakdown plasmas or sparks emit profoundly in the ultraviolet and visible region of the electromagnetic spectrum with contributions from ionic, atomic, and molecular species. Laser created kernels expand into a cold ambient with high velocities during their early lifetime followed by confinement of the plasma kernel and eventually collapse. However, the plasma kernels produced during laser breakdown of gases are also capable of exciting and ionizing the surrounding ambient medium. Two mechanisms can be responsible for excitation and ionization of the surrounding ambient: viz., photoexcitation and ionization by intense ultraviolet emission from the sparks produced during the early times of their creation and/or heating by strong shocks generated by the kernel during its expansion into the ambient. In this study, an investigation is made on the spectral features of on- and off-axis emission of laser-induced plasma breakdown kernels generated in atmospheric pressure conditions with an aim to elucidate the mechanisms leading to ambient excitation and emission. Pulses from an Nd:YAG laser emitting at 1064 nm with 6 ns pulse duration are used to generate plasma kernels. Laser sparks were generated in air, argon, and helium gases to provide different physical properties of expansion dynamics and plasma chemistry considering the differences in laser absorption properties, mass density and speciation. Point shadowgraphy and time-resolved imaging were used to evaluate the shock wave and spark self-emission morphology at early and late times while space and time resolved spectroscopy is used for evaluating the emission features as well as for inferring plasma fundamentals at on- and off-axis positions. Structure and dynamics of the plasma kernel obtained using imaging techniques are also compared to numerical simulations using computational fluid dynamics code. The emission from the kernel showed that spectral features from ions, atoms and molecules are separated in time with early-time temperatures and densities in excess of 35 000 K and 4 × 10¹⁸ cm⁻³ with an existence of thermal equilibrium. However, the emission from the off-kernel positions from the breakdown plasmas showed enhanced ultraviolet radiation with the presence of N2 bands and is represented by non-LTE conditions. Finally, our results also highlight that the ultraviolet radiation emitted during the early time of spark evolution is the predominant source of the photo-excitation of the surrounding medium.

  5. On- and off-axis spectral emission features from laser-produced gas breakdown plasmas

    DOE PAGES

    Harilal, S. S.; Skrodzki, P. J.; Miloshevsky, A.; ...

    2017-06-01

    Laser-heated gas breakdown plasmas or sparks emit profoundly in the ultraviolet and visible region of the electromagnetic spectrum with contributions from ionic, atomic, and molecular species. Laser created kernels expand into a cold ambient with high velocities during their early lifetime followed by confinement of the plasma kernel and eventually collapse. However, the plasma kernels produced during laser breakdown of gases are also capable of exciting and ionizing the surrounding ambient medium. Two mechanisms can be responsible for excitation and ionization of the surrounding ambient: viz., photoexcitation and ionization by intense ultraviolet emission from the sparks produced during the early times of their creation and/or heating by strong shocks generated by the kernel during its expansion into the ambient. In this study, an investigation is made on the spectral features of on- and off-axis emission of laser-induced plasma breakdown kernels generated in atmospheric pressure conditions with an aim to elucidate the mechanisms leading to ambient excitation and emission. Pulses from an Nd:YAG laser emitting at 1064 nm with 6 ns pulse duration are used to generate plasma kernels. Laser sparks were generated in air, argon, and helium gases to provide different physical properties of expansion dynamics and plasma chemistry considering the differences in laser absorption properties, mass density and speciation. Point shadowgraphy and time-resolved imaging were used to evaluate the shock wave and spark self-emission morphology at early and late times while space and time resolved spectroscopy is used for evaluating the emission features as well as for inferring plasma fundamentals at on- and off-axis positions. Structure and dynamics of the plasma kernel obtained using imaging techniques are also compared to numerical simulations using computational fluid dynamics code. The emission from the kernel showed that spectral features from ions, atoms and molecules are separated in time with early-time temperatures and densities in excess of 35 000 K and 4 × 10¹⁸ cm⁻³ with an existence of thermal equilibrium. However, the emission from the off-kernel positions from the breakdown plasmas showed enhanced ultraviolet radiation with the presence of N2 bands and is represented by non-LTE conditions. Our results also highlight that the ultraviolet radiation emitted during the early time of spark evolution is the predominant source of the photo-excitation of the surrounding medium.

  6. Zero-Point Calibration for AGN Black-Hole Mass Estimates

    NASA Technical Reports Server (NTRS)

    Peterson, B. M.; Onken, C. A.

    2004-01-01

    We discuss the measurement and associated uncertainties of AGN reverberation-based black-hole masses, since these provide the zero-point calibration for scaling relationships that allow black-hole mass estimates for quasars. We find that reverberation-based mass estimates appear to be accurate to within a factor of about 3.

  7. Davies Critical Point and Tunneling

    NASA Astrophysics Data System (ADS)

    La, Hoseong

    2012-04-01

    From the point of view of tunneling, the physical meaning of the Davies critical point of a second-order phase transition in the black hole thermodynamics is clarified. At the critical point, the nonthermal contribution vanishes so that the black hole radiation is entirely thermal. It separates two phases: one with radiation enhanced by the nonthermal contribution, the other suppressed by the nonthermal contribution. We show this in both charged and rotating black holes. The phase transition is also analyzed in the cases in which emissions of charges and angular momenta are incorporated.

  8. The Unified Floating Point Vector Coprocessor for Reconfigurable Hardware

    NASA Astrophysics Data System (ADS)

    Kathiara, Jainik

    There has recently been increased interest in using embedded cores on FPGAs. Many of the applications that make use of these cores involve floating point operations. Due to the complexity and expense of floating point hardware, these algorithms are usually converted to fixed point operations or implemented using floating-point emulation in software. As the technology advances, more and more homogeneous computational resources and fixed function embedded blocks are added to FPGAs, and hence the implementation of floating point hardware becomes a feasible option. In this research we have implemented a high performance, autonomous floating point vector coprocessor (FPVC) that works independently within an embedded processor system. We have presented a unified approach to vector and scalar computation, using a single register file for both scalar operands and vector elements. The hybrid vector/SIMD computational model of the FPVC results in greater overall performance for most applications along with improved peak performance compared to other approaches. By parameterizing vector length and the number of vector lanes, we can design an application specific FPVC and take optimal advantage of the FPGA fabric. For this research we have also begun designing a software library for various computational kernels, each of which adapts the FPVC's configuration to provide maximal performance. The kernels implemented are from the area of linear algebra and include matrix multiplication and QR and Cholesky decompositions. We have demonstrated the operation of the FPVC on a Xilinx Virtex 5 using the embedded PowerPC.

  9. Black holes as critical point of quantum phase transition.

    PubMed

    Dvali, Gia; Gomez, Cesar

    We reformulate the quantum black hole portrait in the language of modern condensed matter physics. We show that black holes can be understood as a graviton Bose-Einstein condensate at the critical point of a quantum phase transition, identical to what has been observed in systems of cold atoms. The Bogoliubov modes that become degenerate and nearly gapless at this point are the holographic quantum degrees of freedom responsible for the black hole entropy and the information storage. They have no (semi)classical counterparts and become inaccessible in this limit. These findings indicate a deep connection between the seemingly remote systems and suggest a new quantum foundation of holography. They also open an intriguing possibility of simulating black hole information processing in table-top labs.

  10. Black holes and beyond

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mathur, Samir D., E-mail: mathur.16@osu.edu

    The black hole information paradox forces us into a strange situation: we must find a way to break the semiclassical approximation in a domain where no quantum gravity effects would normally be expected. Traditional quantizations of gravity do not exhibit any such breakdown, and this forces us into a difficult corner: either we must give up quantum mechanics or we must accept the existence of troublesome 'remnants'. In string theory, however, the fundamental quanta are extended objects, and it turns out that the bound states of such objects acquire a size that grows with the number of quanta in the bound state. The interior of the black hole gets completely altered to a 'fuzzball' structure, and information is able to escape in radiation from the hole. The semiclassical approximation can break at macroscopic scales due to the large entropy of the hole: the measure in the path integral competes with the classical action, instead of giving a subleading correction. Putting this picture of black hole microstates together with ideas about entangled states leads to a natural set of conjectures on many long-standing questions in gravity: the significance of Rindler and de Sitter entropies, the notion of black hole complementarity, and the fate of an observer falling into a black hole. - Highlights: ► The information paradox is a serious problem. ► To solve it we need to find 'hair' on black holes. ► In string theory we find 'hair' by the fuzzball construction. ► Fuzzballs help to resolve many other issues in gravity.

  11. Preparation of UC0.07-0.10N0.90-0.93 spheres for TRISO coated fuel particles

    NASA Astrophysics Data System (ADS)

    Hunt, R. D.; Silva, C. M.; Lindemer, T. B.; Johnson, J. A.; Collins, J. L.

    2014-05-01

    The US Department of Energy is considering a new nuclear fuel that would be less susceptible to ruptures during a loss-of-coolant accident. The fuel would consist of tristructural isotropic coated particles with dense uranium nitride (UN) kernels with diameters of 650 or 800 μm. The objectives of this effort are to make uranium oxide microspheres with adequately dispersed carbon nanoparticles and to convert these microspheres into UN spheres, which could then be sintered into kernels. Recent improvements to the internal gelation process were successfully applied to the production of uranium gel spheres with different concentrations of carbon black. After the spheres were washed and dried, a simple two-step heat profile was used to produce porous microspheres with a chemical composition of UC0.07-0.10N0.90-0.93. The first step involved heating the microspheres to 2023 K in a vacuum, and in the second step, the microspheres were held at 1873 K for 6 h in flowing nitrogen.

  12. Optimized formulas for the gravitational field of a tesseroid

    NASA Astrophysics Data System (ADS)

    Grombein, Thomas; Seitz, Kurt; Heck, Bernhard

    2013-07-01

    Various tasks in geodesy, geophysics, and related geosciences require precise information on the impact of mass distributions on gravity field-related quantities, such as the gravitational potential and its partial derivatives. Using forward modeling based on Newton's integral, mass distributions are generally decomposed into regular elementary bodies. In classical approaches, prisms or point mass approximations are mostly utilized. Considering the effect of the sphericity of the Earth, alternative mass modeling methods based on tesseroid bodies (spherical prisms) should be taken into account, particularly in regional and global applications. Expressions for the gravitational field of a point mass are relatively simple when formulated in Cartesian coordinates. In the case of integrating over a tesseroid volume bounded by geocentric spherical coordinates, it will be shown that it is also beneficial to represent the integral kernel in terms of Cartesian coordinates. This considerably simplifies the determination of the tesseroid's potential derivatives in comparison with previously published methodologies that make use of integral kernels expressed in spherical coordinates. Based on this idea, optimized formulas for the gravitational potential of a homogeneous tesseroid and its derivatives up to second-order are elaborated in this paper. These new formulas do not suffer from the polar singularity of the spherical coordinate system and can, therefore, be evaluated for any position on the globe. Since integrals over tesseroid volumes cannot be solved analytically, the numerical evaluation is achieved by means of expanding the integral kernel in a Taylor series with fourth-order error in the spatial coordinates of the integration point. As the structure of the Cartesian integral kernel is substantially simplified, Taylor coefficients can be represented in a compact and computationally attractive form. Thus, the use of the optimized tesseroid formulas particularly benefits from a significant decrease in computation time by about 45 % compared to previously used algorithms. In order to show the computational efficiency and to validate the mathematical derivations, the new tesseroid formulas are applied to two realistic numerical experiments and are compared to previously published tesseroid methods and the conventional prism approach.

  13. Reassessing the Status of Black English (Review Article).

    ERIC Educational Resources Information Center

    Spears, Arthur K.

    1992-01-01

    Summarizes the main points presented in the 1989 book "The Death of Black English" by R.R. Butters. Butters's book presents the most important research of the last 20 years and subjects the results to variation analysis. It is concluded that the history of linguistic assimilation points to the eventual disappearance of Black English in…

  14. Leptokurtic portfolio theory

    NASA Astrophysics Data System (ADS)

    Kitt, R.; Kalda, J.

    2006-03-01

    The question of the optimal portfolio is addressed. The conventional Markowitz portfolio optimisation is discussed and the shortcomings due to non-Gaussian security returns are outlined. A method is proposed to minimise the likelihood of extreme non-Gaussian drawdowns of the portfolio value. The theory is called Leptokurtic, because it minimises the effects from “fat tails” of returns. The leptokurtic portfolio theory provides an optimal portfolio for investors, who define their risk-aversion as unwillingness to experience sharp drawdowns in asset prices. Two types of risks in asset returns are defined: a fluctuation risk, which has a Gaussian distribution, and a drawdown risk, which deals with distribution tails. These risks are quantitatively measured by defining the “noise kernel”, an ellipsoidal cloud of points in the space of asset returns. The size of the ellipse is controlled with the threshold parameter: the larger the threshold parameter, the larger the returns that are accepted as normal fluctuations. The return vectors falling into the kernel are used for the calculation of fluctuation risk. Analogously, the data points falling outside the kernel are used for the calculation of drawdown risks. As a result, the portfolio optimisation problem becomes three-dimensional: in addition to the return, there are two types of risks involved. The optimal portfolio for drawdown-averse investors is the one minimising variance outside the noise kernel. The theory has been tested with MSCI North America, Europe and Pacific total return stock indices.
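
    The noise-kernel split can be reproduced by classifying return vectors by their Mahalanobis distance and estimating the two risks from the two subsets. A minimal sketch on synthetic fat-tailed returns follows; the threshold value, the Student-t returns and the simple minimum-tail-variance weights are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(4)
      returns = rng.standard_t(df=3, size=(2500, 3)) * 0.01        # fat-tailed daily returns, 3 assets

      def noise_kernel_mask(returns, threshold=2.0):
          """Flag return vectors inside the ellipsoidal 'noise kernel' (Mahalanobis distance <= threshold)."""
          centered = returns - returns.mean(axis=0)
          inv = np.linalg.inv(np.cov(returns, rowvar=False))
          d2 = np.einsum("ij,jk,ik->i", centered, inv, centered)   # squared Mahalanobis distance
          return d2 <= threshold ** 2

      inside = noise_kernel_mask(returns)
      fluctuation_cov = np.cov(returns[inside], rowvar=False)      # Gaussian-like fluctuation risk
      drawdown_cov = np.cov(returns[~inside], rowvar=False)        # tail (drawdown) risk
      w = np.linalg.solve(drawdown_cov, np.ones(3))                # minimum tail-variance weights,
      w /= w.sum()                                                 # ignoring any return target
      print(inside.mean(), w)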

  15. Re-examining the XMM-Newton spectrum of the black hole candidate XTE J1652-453

    NASA Astrophysics Data System (ADS)

    Chiang, Chia-Ying; Reis, R. C.; Walton, D. J.; Fabian, A. C.

    2012-10-01

    The XMM-Newton spectrum of the black hole candidate XTE J1652-453 shows a broad and strong Fe Kα emission line, generally believed to originate from reflection off the inner accretion disc. These data have been analysed by Hiemstra et al. using a variety of phenomenological models. We re-examine the spectrum with a self-consistent relativistic reflection model. A narrow absorption line near 7.2 keV may be present, which if real is likely the Fe XXVI absorption line arising from a highly ionized, rapidly outflowing disc wind. The blueshift of this feature corresponds to a velocity of about 11 100 km s⁻¹, which is much larger than the typical values seen in stellar mass black holes. Given that we also find the source to have a low inclination (i ≲ 32°; close to face-on), we would therefore be seeing through the very base of the outflow. This could be a possible explanation for the unusually high velocity. We use a reflection model combined with a relativistic convolution kernel which allows for both prograde and retrograde black hole spin, and treat the potential absorption feature with a physical model for a photoionized plasma. In this manner, assuming the disc is not truncated, we could only constrain the spin of the black hole in XTE J1652-453 to be less than ~0.5 Jc/GM² at the 90 per cent confidence limit.

  16. Strategic environmental assessment in tourism planning - Extent of application and quality of documentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carvalho Lemos, Clara, E-mail: clara@sc.usp.br; Fischer, Thomas B., E-mail: fischer@liverpool.ac.uk; Pereira Souza, Marcelo, E-mail: mps@usp.br

    Strategic environmental assessment (SEA) has been applied throughout the world in different sectors and in various ways. This paper reports on the results of PhD research on SEA applied to tourism development planning, reflecting the situation in mid-2010. First, the extent of tourism specific SEA application world-wide is established. Then, based on a review of the quality of 10 selected SEA reports, good practice, as well as challenges, trends and opportunities for tourism specific SEA are identified. Shortcomings of SEA in tourism planning are established and implications for future research are outlined. - Highlights: ► The extent of tourism specific SEA practice is identified. ► Selected SEA/Tourism reports are evaluated. ► SEA application to tourism planning is still limited. ► A number of shortcomings can be pointed out.

  17. Separation anxiety: Stress, tension and cytokinesis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mohan, Krithika; Iglesias, Pablo A., E-mail: pi@jhu.edu; Robinson, Douglas N., E-mail: dnr@jhmi.edu

    Cytokinesis, the physical separation of a mother cell into two daughter cells, progresses through a series of well-defined changes in morphology. These changes involve distinct biochemical and mechanical processes. Here, we review the mechanical features of cells during cytokinesis, discussing both the material properties as well as sources of stresses, both active and passive, which lead to the observed changes in morphology. We also describe a mechanosensory feedback control system that regulates protein localization and shape progression during cytokinesis. -- Highlights: ► Cytokinesis progresses through three distinct mechanical phases. ► Cortical tension initially resists deformation of mother cell. ► Late in cytokinesis, cortical tension provides stress, enabling furrow ingression. ► A mechanosensory feedback control system regulates cytokinesis.

  18. Tcl as a Software Environment for a TCS

    NASA Astrophysics Data System (ADS)

    Terrett, David L.

    2002-12-01

    This paper describes how the Tcl scripting language and C API have been used as the software environment for a telescope pointing kernel so that new pointing algorithms and software architectures can be developed and tested without needing a real-time operating system or real-time software environment. It has enabled development to continue outside the framework of a specific telescope project while continuing to build a system that is sufficiently complete to control real hardware, while expending minimum effort on replacing the services that would normally be provided by a real-time software environment. Tcl is used as a scripting language for configuring the system at startup and then as the command interface for controlling the running system; the Tcl C language API is used to provide a system-independent interface to file and socket I/O and other operating system services. The pointing algorithms themselves are implemented as a set of C++ objects calling C library functions that implement the algorithms described in [2]. Although originally designed as a test and development environment, the system, running as a soft real-time process on Linux, has been used to test the SOAR mount control system and will be used as the pointing kernel of the SOAR telescope control system.

  19. Infrared microspectroscopic imaging of plant tissues: spectral visualization of Triticum aestivum kernel and Arabidopsis leaf microstructure

    PubMed Central

    Warren, Frederick J; Perston, Benjamin B; Galindez-Najera, Silvia P; Edwards, Cathrina H; Powell, Prudence O; Mandalari, Giusy; Campbell, Grant M; Butterworth, Peter J; Ellis, Peter R

    2015-01-01

    Infrared microspectroscopy is a tool with potential for studies of the microstructure, chemical composition and functionality of plants at a subcellular level. Here we present the use of high-resolution bench top-based infrared microspectroscopy to investigate the microstructure of Triticum aestivum L. (wheat) kernels and Arabidopsis leaves. Images of isolated wheat kernel tissues and whole wheat kernels following hydrothermal processing and simulated gastric and duodenal digestion were generated, as well as images of Arabidopsis leaves at different points during a diurnal cycle. Individual cells and cell walls were resolved, and large structures within cells, such as starch granules and protein bodies, were clearly identified. Contrast was provided by converting the hyperspectral image cubes into false-colour images using either principal component analysis (PCA) overlays or by correlation analysis. The unsupervised PCA approach provided a clear view of the sample microstructure, whereas the correlation analysis was used to confirm the identity of different anatomical structures using the spectra from isolated components. It was then demonstrated that gelatinized and native starch within cells could be distinguished, and that the loss of starch during wheat digestion could be observed, as well as the accumulation of starch in leaves during a diurnal period. PMID:26400058
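
    The false-colour contrast mechanism described above, projecting each pixel spectrum onto the leading principal components and mapping the scores to colour channels, can be sketched as follows. The synthetic cube and the embedded structure are placeholders for a real hyperspectral image.

      import numpy as np

      rng = np.random.default_rng(5)
      h, w, bands = 64, 64, 200
      cube = rng.normal(size=(h, w, bands))                    # placeholder hyperspectral image cube
      cube[16:48, 16:48, :] += np.linspace(0.0, 1.0, bands)    # a structure with a distinct spectrum

      def pca_false_colour(cube, n_components=3):
          """Project each pixel spectrum onto the leading principal components and scale to [0, 1]."""
          X = cube.reshape(-1, cube.shape[-1])
          Xc = X - X.mean(axis=0)
          _, _, Vt = np.linalg.svd(Xc, full_matrices=False)    # principal axes of the pixel spectra
          scores = Xc @ Vt[:n_components].T
          lo, hi = scores.min(axis=0), scores.max(axis=0)
          rgb = (scores - lo) / (hi - lo)
          return rgb.reshape(cube.shape[0], cube.shape[1], n_components)

      rgb = pca_false_colour(cube)
      print(rgb.shape, rgb[32, 32], rgb[2, 2])   # the embedded structure separates from the background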

  20. Modeling RF Fields in Hot Plasmas with Parallel Full Wave Code

    NASA Astrophysics Data System (ADS)

    Spencer, Andrew; Svidzinski, Vladimir; Zhao, Liangji; Galkin, Sergei; Kim, Jin-Soo

    2016-10-01

    FAR-TECH, Inc. is developing a suite of full wave RF plasma codes. It is based on a meshless formulation in configuration space with adapted cloud of computational points (CCP) capability and using the hot plasma conductivity kernel to model the nonlocal plasma dielectric response. The conductivity kernel is calculated by numerically integrating the linearized Vlasov equation along unperturbed particle trajectories. Work has been done on the following calculations: 1) the conductivity kernel in hot plasmas, 2) a monitor function based on analytic solutions of the cold-plasma dispersion relation, 3) an adaptive CCP based on the monitor function, 4) stencils to approximate the wave equations on the CCP, 5) the solution to the full wave equations in the cold-plasma model in tokamak geometry for ECRH and ICRH range of frequencies, and 6) the solution to the wave equations using the calculated hot plasma conductivity kernel. We will present results on using a meshless formulation on adaptive CCP to solve the wave equations and on implementing the non-local hot plasma dielectric response to the wave equations. The presentation will include numerical results of wave propagation and absorption in the cold and hot tokamak plasma RF models, using DIII-D geometry and plasma parameters. Work is supported by the U.S. DOE SBIR program.

  1. Factors affecting cadmium absorbed by pistachio kernel in calcareous soils, southeast of Iran.

    PubMed

    Shirani, H; Hosseinifard, S J; Hashemipour, H

    2018-03-01

    Cadmium (Cd), which does not have a biological role, is one of the most toxic heavy metals for organisms. This metal enters the environment through industrial processes and fertilizers. The main objective of this study was to determine the relationships between the Cd absorbed by pistachio kernels and some soil physical and chemical characteristics, using stepwise regression and Artificial Neural Network (ANN) modeling, in calcareous soils in the Rafsanjan region, southeast of Iran. For these purposes, 220 pistachio orchards were selected, and soil samples were taken from two depths of 0-40 and 40-80 cm. In addition, fruit and leaf samples from branches with and without fruit were taken at each sampling point. The results showed that the factors affecting the Cd absorbed by pistachio kernels obtained by the regression method (pH and clay percentage) were not interpretable, and considering the poor values of the coefficient of determination (R²) and Root Mean Square Error (RMSE), the model did not have sufficient validity. However, ANN modeling was highly accurate and reliable. Based on its results, soil available P and Zn and soil salinity were the most important factors affecting the concentration of Cd in pistachio kernels in the pistachio growing areas of Rafsanjan. Copyright © 2017 Elsevier B.V. All rights reserved.
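
    The ANN modelling step can be reproduced with a standard multilayer-perceptron regressor. The synthetic soil features, the toy relationship to kernel Cd and the network size below are illustrative assumptions, not the study's data or architecture.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import mean_squared_error

      rng = np.random.default_rng(6)
      n = 220                                                   # one row per orchard
      X = np.column_stack([rng.uniform(5, 60, n),               # available P (mg/kg), assumed range
                           rng.uniform(0.5, 10, n),             # available Zn (mg/kg), assumed range
                           rng.uniform(1, 20, n)])              # salinity (dS/m), assumed range
      kernel_cd = 0.02 + 0.004 * X[:, 2] - 0.001 * X[:, 1] + rng.normal(scale=0.005, size=n)  # toy target

      Xtr, Xte, ytr, yte = train_test_split(X, kernel_cd, test_size=0.3, random_state=0)
      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
      model.fit(Xtr, ytr)
      rmse = mean_squared_error(yte, model.predict(Xte)) ** 0.5
      print(rmse)                                               # error on held-out orchards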

  2. Spatiotemporal Domain Decomposition for Massive Parallel Computation of Space-Time Kernel Density

    NASA Astrophysics Data System (ADS)

    Hohl, A.; Delmelle, E. M.; Tang, W.

    2015-07-01

    Accelerated processing capabilities are deemed critical when conducting analysis on spatiotemporal datasets of increasing size, diversity and availability. High-performance parallel computing offers the capacity to solve computationally demanding problems in a limited timeframe, but likewise poses the challenge of preventing processing inefficiency due to workload imbalance between computing resources. Therefore, when designing new algorithms capable of implementing parallel strategies, careful spatiotemporal domain decomposition is necessary to account for heterogeneity in the data. In this study, we perform octree-based adaptive decomposition of the spatiotemporal domain for parallel computation of space-time kernel density. In order to avoid edge effects near subdomain boundaries, we establish spatiotemporal buffers to include adjacent data-points that are within the spatial and temporal kernel bandwidths. Then, we quantify the computational intensity of each subdomain to balance workloads among processors. We illustrate the benefits of our methodology using a space-time epidemiological dataset of Dengue fever, an infectious vector-borne disease that poses a severe threat to communities in tropical climates. Our parallel implementation of kernel density reaches substantial speedup compared to sequential processing, and achieves high levels of workload balance among processors due to great accuracy in quantifying computational intensity. Our approach is portable to other space-time analytical tests.
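
    The quantity being parallelized, the space-time kernel density at an evaluation point, is a product of a spatial and a temporal kernel summed over nearby events; only events within both bandwidths (hence the buffers) contribute. A minimal serial sketch follows; the Epanechnikov kernels and the bandwidths are common choices rather than necessarily those of the study.

      import numpy as np

      def stkde(eval_xyt, events, hs=1.0, ht=7.0):
          """Space-time kernel density at evaluation points; events are (x, y, t) rows."""
          out = np.zeros(len(eval_xyt))
          for i, (x, y, t) in enumerate(eval_xyt):
              ds = np.hypot(events[:, 0] - x, events[:, 1] - y) / hs
              dt = np.abs(events[:, 2] - t) / ht
              m = (ds < 1.0) & (dt < 1.0)                   # only events inside both bandwidths contribute
              ks = 2.0 / np.pi * (1.0 - ds[m] ** 2)         # 2D Epanechnikov kernel
              kt = 0.75 * (1.0 - dt[m] ** 2)                # 1D Epanechnikov kernel
              out[i] = np.sum(ks * kt) / (len(events) * hs ** 2 * ht)
          return out

      rng = np.random.default_rng(7)
      events = np.column_stack([rng.normal(0, 2, 500),       # x (km)
                                rng.normal(0, 2, 500),       # y (km)
                                rng.uniform(0, 90, 500)])    # onset day
      grid = np.array([[0.0, 0.0, 45.0], [5.0, 5.0, 45.0]])  # two space-time voxels to evaluate
      print(stkde(grid, events))    # density is higher at the cluster centre than at its edge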

  3. Speeding Up the Bilateral Filter: A Joint Acceleration Way.

    PubMed

    Dai, Longquan; Yuan, Mengke; Zhang, Xiaopeng

    2016-06-01

    The computational complexity of the brute-force implementation of the bilateral filter (BF) depends on its filter kernel size. To achieve a constant-time BF whose complexity is independent of the kernel size, many techniques have been proposed, such as 2D box filtering, dimension promotion, and the shiftability property. Although each of these techniques suffers from accuracy and efficiency problems, previous algorithm designers typically adopted only one of them to assemble fast implementations, owing to the difficulty of combining them. Hence, no joint exploitation of these techniques has been proposed to construct a new cutting-edge implementation that solves these problems. Jointly employing five techniques (kernel truncation, best N-term approximation, as well as the previous 2D box filtering, dimension promotion, and shiftability property), we propose a unified framework to transform BF with arbitrary spatial and range kernels into a set of 3D box filters that can be computed in linear time. To the best of our knowledge, our algorithm is the first method that can integrate all these acceleration techniques and, therefore, can draw upon their strong points to overcome their deficiencies. The strength of our method has been corroborated by several carefully designed experiments. In particular, the filtering accuracy is significantly improved without sacrificing run-time efficiency.
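
    For reference, the brute-force bilateral filter whose cost grows with the kernel size looks like the sketch below; the acceleration techniques discussed in the paper reorganise exactly this computation. The image and the kernel parameters are illustrative.

      import numpy as np

      def bilateral_brute_force(img, radius=5, sigma_s=3.0, sigma_r=0.1):
          """O(radius^2) work per pixel: weight neighbours by spatial and range (intensity) proximity."""
          h, w = img.shape
          pad = np.pad(img, radius, mode="reflect")
          ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
          spatial = np.exp(-(xs ** 2 + ys ** 2) / (2 * sigma_s ** 2))     # fixed spatial kernel
          out = np.empty_like(img)
          for i in range(h):
              for j in range(w):
                  patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
                  range_w = np.exp(-(patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))  # range kernel
                  weights = spatial * range_w
                  out[i, j] = np.sum(weights * patch) / np.sum(weights)
          return out

      img = np.zeros((64, 64))
      img[:, 32:] = 1.0
      img += np.random.default_rng(8).normal(scale=0.05, size=img.shape)  # noisy step edge
      smoothed = bilateral_brute_force(img)
      print(smoothed[0, 28:36].round(2))     # noise is smoothed while the step edge is preserved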

  4. 75 FR 47751 - Proposed Flood Elevation Determinations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-09

    ... Ferry Road and Black Point Road. Along the shoreline at +8 +24 the intersection of Black Point Road and.... Knight, Deputy Federal Insurance and Mitigation Administrator, Mitigation, Department of Homeland...

  5. Ford Motor Company NDE facility shielding design.

    PubMed

    Metzger, Robert L; Van Riper, Kenneth A; Jones, Martin H

    2005-01-01

    Ford Motor Company proposed the construction of a large non-destructive evaluation laboratory for radiography of automotive power train components. The authors were commissioned to design the shielding and to survey the completed facility for compliance with radiation doses for occupationally and non-occupationally exposed personnel. The two X-ray sources are Varian Linatron 3000 accelerators operating at 9-11 MV. One performs computed tomography of automotive transmissions, while the other does real-time radiography of operating engines and transmissions. The shield thicknesses for the primary barrier and all secondary barriers were determined by point-kernel techniques. Point-kernel techniques did not work well for skyshine calculations and locations where multiple sources (e.g. tube head leakage and various scatter fields) impacted doses. Shielding for these areas was determined using transport calculations. A number of MCNP [Briesmeister, J. F. MCNP: a general Monte Carlo N-particle transport code, version 4B. Los Alamos National Laboratory Manual (1997)] calculations focused on skyshine estimates and the office areas. Measurements on the operational facility confirmed the shielding calculations.

  6. Singularities, swallowtails and Dirac points. An analysis for families of Hamiltonians and applications to wire networks, especially the Gyroid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaufmann, Ralph M., E-mail: rkaufman@math.purdue.edu; Khlebnikov, Sergei, E-mail: skhleb@physics.purdue.edu; Wehefritz-Kaufmann, Birgit, E-mail: ebkaufma@math.purdue.edu

    2012-11-15

    Motivated by the Double Gyroid nanowire network we develop methods to detect Dirac points and classify level crossings, a.k.a. singularities, in the spectrum of a family of Hamiltonians. The approach we use is singularity theory. Using this language, we obtain a characterization of Dirac points and also show that the branching behavior of the level crossings is given by an unfolding of A_n type singularities. Which type of singularity occurs can be read off a characteristic region inside the miniversal unfolding of an A_k singularity. We then apply these methods in the setting of families of graph Hamiltonians, such as those for wire networks. In the particular case of the Double Gyroid we analytically classify its singularities and show that it has Dirac points. This indicates that nanowire systems of this type should have very special physical properties. - Highlights: • New method for analytically finding Dirac points. • Novel relation of level crossings to singularity theory. • More precise version of the von Neumann-Wigner theorem for arbitrary smooth families of Hamiltonians of fixed size. • Analytical proof of the existence of Dirac points for the Gyroid wire network.

  7. Evolution and End Point of the Black String Instability: Large D Solution.

    PubMed

    Emparan, Roberto; Suzuki, Ryotaku; Tanabe, Kentaro

    2015-08-28

    We derive a simple set of nonlinear, (1+1)-dimensional partial differential equations that describe the dynamical evolution of black strings and branes to leading order in the expansion in the inverse of the number of dimensions D. These equations are easily solved numerically. Their solution shows that thin enough black strings are unstable to developing inhomogeneities along their length, and at late times they asymptote to stable nonuniform black strings. This proves an earlier conjecture about the end point of the instability of black strings in a large enough number of dimensions. If the initial black string is very thin, the final configuration is highly nonuniform and resembles a periodic array of localized black holes joined by short necks. We also present the equations that describe the nonlinear dynamics of anti-de Sitter black branes at large D.

  8. Thermodynamics of hairy black holes in Lovelock gravity

    NASA Astrophysics Data System (ADS)

    Hennigar, Robie A.; Tjoa, Erickson; Mann, Robert B.

    2017-02-01

    We perform a thorough study of the thermodynamic properties of a class of Lovelock black holes with conformal scalar hair arising from coupling of a real scalar field to the dimensionally extended Euler densities. We study the linearized equations of motion of the theory and describe constraints under which the theory is free from ghosts/tachyons. We then consider, within the context of black hole chemistry, the thermodynamics of the hairy black holes in the Gauss-Bonnet and cubic Lovelock theories. We clarify the connection between isolated critical points and thermodynamic singularities, finding a one-parameter family of these critical points which occur for well-defined thermodynamic parameters. We also report on a number of novel results, including 'virtual triple points' and the first example of a 'λ-line' (a line of second-order phase transitions) in black hole thermodynamics.

  9. Evaluating the intersection of a regional wildlife connectivity network with highways.

    PubMed

    Cushman, Samuel A; Lewis, Jesse S; Landguth, Erin L

    2013-01-01

    Reliable predictions of regional-scale population connectivity are needed to prioritize conservation actions. However, there have been few examples of regional connectivity models that are empirically derived and validated. The central goals of this paper were to (1) evaluate the effectiveness of factorial least cost path corridor mapping on an empirical resistance surface in reflecting the frequency of highway crossings by American black bear, (2) predict the location and intensity of use of movement corridors for American black bear, and (3) identify where these corridors cross major highways and rank the intensity of these crossings. We used factorial least cost path modeling coupled with resistant kernel analysis to predict a network of movement corridors across a 30.2 million hectare analysis area in Montana and Idaho, USA. Factorial least cost path corridor mapping was associated with the locations of actual bear highway crossings. We identified corridor-highway intersections and ranked these based on corridor strength. We found that a major wildlife crossing overpass structure was located close to one of the most intense predicted corridors, and that the vast majority of the predicted corridor network was "protected" under federal management. However, narrow, linear corridors connecting the Greater Yellowstone Ecosystem to the rest of the analysis area had limited protection by federal ownership, making these additionally vulnerable to habitat loss and fragmentation. Factorial least cost path modeling coupled with resistant kernel analysis provides detailed, synoptic information about connectivity across populations that vary in distribution and density in complex landscapes. Specifically, our results could be used to quantify the structure of the connectivity network, identify critical linkage nodes and core areas, map potential barriers and fracture zones, and prioritize locations for mitigation, restoration and conservation actions.

  10. Diversity in Secondary Metabolites Including Mycotoxins from Strains of Aspergillus Section Nigri Isolated from Raw Cashew Nuts from Benin, West Africa.

    PubMed

    Lamboni, Yendouban; Nielsen, Kristian F; Linnemann, Anita R; Gezgin, Yüksel; Hell, Kerstin; Nout, Martinus J R; Smid, Eddy J; Tamo, Manuele; van Boekel, Martinus A J S; Hoof, Jakob Blæsbjerg; Frisvad, Jens Christian

    2016-01-01

    In a previous study, raw cashew kernels were assayed for fungal contamination, focusing on strains belonging to the genus Aspergillus and on aflatoxin producers. These samples showed high contamination with Aspergillus section Nigri species and an absence of aflatoxins. To investigate the diversity of secondary metabolites, including mycotoxins, that the species of A. section Nigri may produce and thus threaten to contaminate the raw cashew kernels, 150 strains were isolated from cashew samples and assayed for their production of secondary metabolites using liquid chromatography high resolution mass spectrometry (LC-HRMS). Seven species of black Aspergilli were isolated based on morphological and chemical identification: A. tubingensis (44%), A. niger (32%), A. brasiliensis (10%), A. carbonarius (8.7%), A. luchuensis (2.7%), A. aculeatus (2%) and A. aculeatinus (0.7%). From these, 45 metabolites and their isomers were identified. Aurasperone and pyranonigrin A, produced by all species excluding A. aculeatus and A. aculeatinus, were most prevalent and were encountered in 146 (97.3%) and 145 (95.7%) isolates, respectively. Three mycotoxin groups were detected: fumonisins (B2 and B4) (2.7%), ochratoxin A (13.3%), and secalonic acids (2%), indicating that these mycotoxins could occur in raw cashew nuts. Thirty strains of black Aspergilli were randomly sampled for verification of species identity based on sequences of β-tubulin and calmodulin genes. Among them, 27 isolates were positive to the primers used and 11 were identified as A. niger, 7 as A. tubingensis, 6 as A. carbonarius, 2 as A. luchuensis and 1 as A. welwitschiae, confirming the species names based on morphology and chemical features. These strains clustered in 5 clades in A. section Nigri. Chemical profile clustering also showed 5 groups, confirming the species-specific metabolite production.

  11. Diversity in Secondary Metabolites Including Mycotoxins from Strains of Aspergillus Section Nigri Isolated from Raw Cashew Nuts from Benin, West Africa

    PubMed Central

    Lamboni, Yendouban; Nielsen, Kristian F.; Linnemann, Anita R.; Gezgin, Yüksel; Hell, Kerstin; Nout, Martinus J. R.; Smid, Eddy J.; Tamo, Manuele; van Boekel, Martinus A. J. S.; Hoof, Jakob Blæsbjerg; Frisvad, Jens Christian

    2016-01-01

    In a previous study, raw cashew kernels were assayed for fungal contamination, focusing on strains belonging to the genus Aspergillus and on aflatoxin producers. These samples showed high contamination with Aspergillus section Nigri species and an absence of aflatoxins. To investigate the diversity of secondary metabolites, including mycotoxins, that the species of A. section Nigri may produce and thus threaten to contaminate the raw cashew kernels, 150 strains were isolated from cashew samples and assayed for their production of secondary metabolites using liquid chromatography high resolution mass spectrometry (LC-HRMS). Seven species of black Aspergilli were isolated based on morphological and chemical identification: A. tubingensis (44%), A. niger (32%), A. brasiliensis (10%), A. carbonarius (8.7%), A. luchuensis (2.7%), A. aculeatus (2%) and A. aculeatinus (0.7%). From these, 45 metabolites and their isomers were identified. Aurasperone and pyranonigrin A, produced by all species excluding A. aculeatus and A. aculeatinus, were most prevalent and were encountered in 146 (97.3%) and 145 (95.7%) isolates, respectively. Three mycotoxin groups were detected: fumonisins (B2 and B4) (2.7%), ochratoxin A (13.3%), and secalonic acids (2%), indicating that these mycotoxins could occur in raw cashew nuts. Thirty strains of black Aspergilli were randomly sampled for verification of species identity based on sequences of β-tubulin and calmodulin genes. Among them, 27 isolates were positive to the primers used and 11 were identified as A. niger, 7 as A. tubingensis, 6 as A. carbonarius, 2 as A. luchuensis and 1 as A. welwitschiae, confirming the species names based on morphology and chemical features. These strains clustered in 5 clades in A. section Nigri. Chemical profile clustering also showed 5 groups, confirming the species-specific metabolite production. PMID:27768708

  12. Intelligent Design of Metal Oxide Gas Sensor Arrays Using Reciprocal Kernel Support Vector Regression

    NASA Astrophysics Data System (ADS)

    Dougherty, Andrew W.

    Metal oxides are a staple of the sensor industry. The combination of their sensitivity to a number of gases and the electrical nature of their sensing mechanism makes them particularly attractive in solid state devices. The high temperature stability of the ceramic material also makes them ideal for detecting combustion byproducts where exhaust temperatures can be high. However, problems do exist with metal oxide sensors. They are not very selective, as they all tend to be sensitive to a number of reduction and oxidation reactions on the oxide's surface. This makes arrays with large numbers of sensors interesting to study as a method for introducing orthogonality to the system. Also, the sensors tend to suffer from long term drift for a number of reasons. In this thesis I will develop a system for intelligently modeling metal oxide sensors and determining their suitability for use in large arrays designed to analyze exhaust gas streams. It will introduce prior knowledge of the metal oxide sensors' response mechanisms in order to produce a response function for each sensor from sparse training data. The system will use the same technique to model and remove any long term drift from the sensor response. It will also provide an efficient means for determining the orthogonality of the sensors to determine whether they are useful in gas sensing arrays. The system is based on least squares support vector regression using the reciprocal kernel. The reciprocal kernel is introduced along with a method of optimizing the free parameters of the reciprocal kernel support vector machine. The reciprocal kernel is shown to be simpler and to perform better than an earlier kernel, the modified reciprocal kernel. Least squares support vector regression is chosen as it uses all of the training points, and an emphasis was placed throughout this research on extracting the maximum information from very sparse data. The reciprocal kernel is shown to be effective in modeling the sensor responses in the time, gas and temperature domains, and the dual representation of the support vector regression solution is shown to provide insight into the sensor's sensitivity and potential orthogonality. Finally, the dual weights of the support vector regression solution to the sensor's response are suggested as a fitness function for a genetic algorithm, or some other method for efficiently searching large parameter spaces.
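
    The mechanism of support vector regression with a custom kernel can be sketched as below. The inverse-multiquadric form is only a reciprocal-type stand-in (the thesis' exact reciprocal kernel expression is not reproduced here), scikit-learn's ε-SVR stands in for least squares SVR, and the sensor response data are synthetic.

    ```python
    # Sketch: SVR with a custom, reciprocal-type kernel on sparse sensor data.
    import numpy as np
    from sklearn.svm import SVR
    from scipy.spatial.distance import cdist

    def reciprocal_like_kernel(X, Y, c=1.0):
        """Gram matrix of an inverse-multiquadric (reciprocal-type) kernel."""
        return 1.0 / np.sqrt(cdist(X, Y, 'sqeuclidean') + c**2)

    # Toy sensor response versus time under sparse sampling
    t = np.linspace(0, 10, 15).reshape(-1, 1)
    resp = np.exp(-t.ravel() / 3.0) + 0.01 * np.random.default_rng(1).normal(size=15)

    model = SVR(kernel=reciprocal_like_kernel, C=100.0, epsilon=1e-3)
    model.fit(t, resp)
    # The dual weights are the quantities the thesis proposes as a fitness signal.
    print("dual weights (support vector coefficients):", model.dual_coef_.ravel())
    ```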

  13. Bait Preference of Free-Ranging Feral Swine for Delivery of a Novel Toxicant

    PubMed Central

    Snow, Nathan P.; Halseth, Joseph M.; Lavelle, Michael J.; Hanson, Thomas E.; Blass, Chad R.; Foster, Justin A.; Humphrys, Simon T.; Staples, Linton D.; Hewitt, David G.; VerCauteren, Kurt C.

    2016-01-01

    Invasive feral swine (Sus scrofa) cause extensive damage to agricultural and wildlife resources throughout the United States. Development of sodium nitrite as a new, orally delivered toxicant is underway to provide an additional tool to curtail growth and expansion of feral swine populations. A micro-encapsulation coating around sodium nitrite is used to minimize detection by feral swine and maximize stability for the reactive molecule. To maximize uptake of this toxicant by feral swine, development of a bait matrix is needed to 1) protect the micro-encapsulation coating so that sodium nitrite remains undetectable to feral swine, 2) achieve a high degree of acceptance by feral swine, and 3) be minimally appealing to non-target species. With these purposes in mind, a field evaluation at 88 sites in south-central Texas was conducted using remote cameras to evaluate preferences by feral swine for several oil-based bait matrices, including uncolored peanut paste, black-colored peanut paste, and a peanut-based slurry mixed onto whole-kernel corn. These placebo baits were compared to a reference food, whole-kernel corn, known to be readily taken by feral swine (i.e., control). The amount of bait consumed by feral swine was also estimated using remote cameras and grid boards at 5 additional sites. On initial exposure, feral swine showed reduced visitations to the uncolored peanut paste and peanut slurry treatments. This reduced visitation subsided by the end of the treatment period, suggesting that feral swine needed time to accept these bait types. The black-colored peanut paste was visited equally to the control throughout the study, and enough of this matrix was consumed to deliver lethal doses of micro-encapsulated sodium nitrite to most feral swine during 1–2 feeding events. None of the treatment matrices reduced visitations by non-target species, but feral swine dominated visitations for all matrices. It was concluded that black-colored peanut paste achieved satisfactory preference and consumption by feral swine, and no discernible preference by non-target species, compared to the other treatments. PMID:26812148

  14. Graviton 1-loop partition function for 3-dimensional massive gravity

    NASA Astrophysics Data System (ADS)

    Gaberdiel, Matthias R.; Grumiller, Daniel; Vassilevich, Dmitri

    2010-11-01

    The graviton 1-loop partition function in Euclidean topologically massive gravity (TMG) is calculated using heat kernel techniques. The partition function does not factorize holomorphically, and at the chiral point it has the structure expected from a logarithmic conformal field theory. This gives strong evidence for the proposal that the dual conformal field theory to TMG at the chiral point is indeed logarithmic. We also generalize our results to new massive gravity.

  15. Glycosylatable GFP as a compartment-specific membrane topology reporter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hunsang; Min, Jisoo; Heijne, Gunnar von

    2012-11-02

    Highlights: • An N-linked glycosylation site is introduced near the GFP fluorophore. • gGFP is not glycosylated and is fully fluorescent in the cytosol. • gGFP is glycosylated and non-fluorescent in the lumen of the ER. • gGFP is fused to membrane proteins of known topology. • Its applicability as a membrane topology reporter is demonstrated. -- Abstract: Determination of the membrane topology is an essential step in structural and functional studies of integral membrane proteins, yet the choices of membrane topology reporters are limited and the experimental analysis can be laborious, especially in eukaryotic cells. Here, we present a robust membrane topology reporter, glycosylatable green fluorescent protein (gGFP). gGFP is fully fluorescent in the yeast cytosol but becomes glycosylated and does not fluoresce in the lumen of the endoplasmic reticulum (ER). Thus, by assaying fluorescence and the glycosylation status of C-terminal fusions of gGFP to target membrane proteins in whole-cell lysates, the localization of the gGFP moiety (and hence the fusion joint) relative to the ER membrane can be unambiguously determined.

  16. Redox active molecules cytochrome c and vitamin C enhance heme-enzyme peroxidations by serving as non-specific agents for redox relay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gade, Sudeep Kumar; Bhattacharya, Subarna; Manoj, Kelath Murali, E-mail: satyamjayatu@yahoo.com

    2012-03-09

    Highlights: • At low concentrations, cytochrome c/vitamin C do not catalyze peroxidations. • But low levels of cytochrome c/vitamin C enhance diverse heme peroxidase activities. • Enhancement positively correlates to the concentration of peroxide in reaction. • Reducible additives serve as non-specific agents for redox relay in the system. • Insight into electron transfer processes in routine and oxidative-stress states. -- Abstract: We report that incorporation of very low concentrations of redox protein cytochrome c and redox active small molecule vitamin C impacted the outcome of one-electron oxidations mediated by structurally distinct plant/fungal heme peroxidases. Evidence suggests that cytochrome c and vitamin C function as a redox relay for diffusible reduced oxygen species in the reaction system, without invoking specific or affinity-based molecular interactions for electron transfers. The findings provide novel perspectives to understanding - (1) the promiscuous role of cytochrome b5 in the metabolism mediated by liver microsomal xenobiotic metabolizing systems and (2) the roles of antioxidant molecules in affording relief from oxidative stress.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, Stefanie; Sommer, Anja; Distel, Luitpold V.R.

    Highlights: • Ultrasmall citrate-coated SPIONs with γ-Fe₂O₃ and Fe₃O₄ structure were prepared. • SPIONs taken up by MCF-7 cells increase ROS production by about 240%. • The SPION-induced ROS production is due to released iron ions and catalytically active surfaces. • Released iron ions and SPION surfaces initiate the Fenton and Haber-Weiss reactions. • X-ray irradiation of internalized SPIONs leads to an increase of catalytically active surfaces. -- Abstract: Internalization of citrate-coated and uncoated superparamagnetic iron oxide nanoparticles by human breast cancer (MCF-7) cells was verified by transmission electron microscopy imaging. Cytotoxicity studies employing metabolic and trypan blue assays manifested their excellent biocompatibility. The production of reactive oxygen species in iron oxide nanoparticle loaded MCF-7 cells was explained to originate from both the release of iron ions and their catalytically active surfaces. Both initiate the Fenton and Haber-Weiss reaction. Additional oxidative stress caused by X-ray irradiation of MCF-7 cells was attributed to the increase of catalytically active iron oxide nanoparticle surfaces.

  18. Endotoxin-induced basal respiration alterations of renal HK-2 cells: A sign of pathologic metabolism down-regulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quoilin, C., E-mail: cquoilin@ulg.ac.be; Mouithys-Mickalad, A.; Duranteau, J.

    Highlights: • An HK-2 cell model of inflammation-induced acute kidney injury. • Two oximetry methods: high resolution respirometry and ESR spectroscopy. • Oxygen consumption rates of renal cells decrease when treated with LPS. • Cells do not recover normal respiration when the LPS treatment is removed. • This basal respiration alteration is a sign of pathologic metabolism down-regulation. -- Abstract: To study the mechanism of oxygen regulation in inflammation-induced acute kidney injury, we investigate the effects of a bacterial endotoxin (lipopolysaccharide, LPS) on the basal respiration of proximal tubular epithelial cells (HK-2) both by high-resolution respirometry and electron spin resonance spectroscopy. These two complementary methods have shown that HK-2 cells exhibit a decreased oxygen consumption rate when treated with LPS. Surprisingly, this cellular respiration alteration persists even after the stress factor was removed. We suggested that this irreversible decrease in renal oxygen consumption after LPS challenge is related to a pathologic metabolic down-regulation such as a lack of oxygen utilization by cells.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Dyke, Natalya; Chanchorn, Ekkawit; Van Dyke, Michael W., E-mail: mvandyke@email.wcu.edu

    Highlights: • Stm1p confers increased resistance to the macrolide starvation-mimic rapamycin. • Stm1p maintains 80S ribosome integrity during stationary phase-induced quiescence. • Stm1p facilitates polysome formation following quiescence exit. • Stm1p facilitates protein synthesis following quiescence exit. • Stm1p is a ribosome preservation factor under conditions of nutrient deprivation. -- Abstract: Once cells exhaust nutrients from their environment, they enter an alternative resting state known as quiescence, whereby proliferation ceases and essential nutrients are obtained through internal stores and through the catabolism of existing macromolecules and organelles. One example of this is ribophagy, the degradation of ribosomes through the process of autophagy. However, some ribosomes need to be preserved for an anticipated recovery from nutrient deprivation. We found that the ribosome-associated protein Stm1p greatly increases the quantity of 80S ribosomes present in quiescent yeast cells and that these ribosomes facilitate increased protein synthesis rates once nutrients are restored. These findings suggest that Stm1p can act as a ribosome preservation factor under conditions of nutrient deprivation and restoration.

  20. "Black Like Me": Reframing Blackness for Decolonial Politics

    ERIC Educational Resources Information Center

    Dei, George J. Sefa

    2018-01-01

    From a particular vantage point, as an African-born scholar with a politics to affirm my Black subjectivity and Indigeneity in a diasporic context, my article engages a (re)theorization of Blackness for decolonial politics. Building on existing works of how Black scholars, themselves, have theorized Blackness, and recognizing the fluid,…

  1. GIS-based support vector machine modeling of earthquake-triggered landslide susceptibility in the Jianjiang River watershed, China

    NASA Astrophysics Data System (ADS)

    Xu, Chong; Dai, Fuchu; Xu, Xiwei; Lee, Yuan Hsi

    2012-04-01

    Support vector machine (SVM) modeling is based on statistical learning theory. It involves a training phase with associated input and target output values. In recent years, the method has become increasingly popular. The main purpose of this study is to evaluate the mapping power of SVM modeling in earthquake-triggered landslide-susceptibility mapping for a section of the Jianjiang River watershed using Geographic Information System (GIS) software. The river was affected by the Wenchuan earthquake of May 12, 2008. Visual interpretation of colored aerial photographs of 1-m resolution and extensive field surveys provided a detailed landslide inventory map containing 3147 landslides related to the 2008 Wenchuan earthquake. Elevation, slope angle, slope aspect, distance from seismogenic faults, distance from drainages, and lithology were used as the controlling parameters. For modeling, three groups of positive and negative training samples were used in concert with four different kernel functions. Positive training samples include the centroids of 500 large landslides, those of all 3147 landslides, and 5000 randomly selected points in landslide polygons. Negative training samples include 500, 3147, and 5000 randomly selected points on slopes that remained stable during the Wenchuan earthquake. The four kernel functions are linear, polynomial, radial basis, and sigmoid. In total, 12 cases of landslide susceptibility were mapped. Comparative analyses of landslide-susceptibility probability and area relation curves show that both the polynomial and radial basis functions suitably classified the input data as either landslide positive or negative, though the radial basis function was more successful. The 12 generated landslide-susceptibility maps were compared with known landslide centroid locations and landslide polygons to verify the success rate and predictive accuracy of each model. The 12 results were further validated using area-under-curve analysis. Group 3, with 5000 randomly selected points in the landslide polygons and 5000 randomly selected points along stable slopes, gave the best results, with a success rate of 79.20% and predictive accuracy of 79.13% under the radial basis function. Of all the results, the sigmoid kernel function was the least skillful when used in concert with the centroid data of all 3147 landslides as positive training samples and the negative training samples of 3147 randomly selected points in regions of stable slope (success rate = 54.95%; predictive accuracy = 61.85%). This paper also provides suggestions and reference data for selecting appropriate training samples and kernel function types for earthquake-triggered landslide-susceptibility mapping using SVM modeling. Predictive landslide-susceptibility maps could be useful in hazard mitigation by helping planners understand the probability of landslides in different regions.
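
    The core modeling step can be sketched with an SVM classifier using a radial basis kernel. The predictor columns and synthetic positive/negative samples below are assumptions standing in for the GIS-derived training data, and the area-under-curve score is computed on a held-out split rather than on mapped susceptibility classes.

    ```python
    # Sketch: RBF-kernel SVM on hypothetical landslide vs. stable-slope samples.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    # Hypothetical predictors: elevation, slope, aspect, fault distance, drainage distance, lithology code
    X_pos = rng.normal(loc=1.0, size=(1000, 6))   # landslide points
    X_neg = rng.normal(loc=0.0, size=(1000, 6))   # stable-slope points
    X = np.vstack([X_pos, X_neg])
    y = np.r_[np.ones(1000), np.zeros(1000)]

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', probability=True))
    clf.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
    print("area under curve:", round(auc, 3))
    ```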

  2. Phase transition of charged-AdS black holes and quasinormal modes: A time domain analysis

    NASA Astrophysics Data System (ADS)

    Chabab, M.; El Moumni, H.; Iraoui, S.; Masmar, K.

    2017-10-01

    In this work, we investigate the time evolution of a massless scalar perturbation around small and large RN-AdS4 black holes for the purpose of probing the thermodynamic phase transition. We show that below the critical point the scalar perturbation decays faster with increasing black hole size, for both the small and the large black hole phase. Our analysis of the time profile of the quasinormal modes reveals a sharp distinction between the behaviors of the two phases, providing a reliable tool to probe the black hole phase transition. However, at the critical point P=Pc, as the black hole size increases, we note that the damping time increases and the perturbation decays faster, and the oscillation frequencies rise in both the small and large black hole phases. In this case the time evolution approach fails to track the AdS4 black hole phase transition.

  3. 40 CFR 458.10 - Applicability; description of the carbon black furnace process subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... carbon black furnace process subcategory. 458.10 Section 458.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Furnace Process Subcategory § 458.10 Applicability; description of the carbon black...

  4. 40 CFR 458.20 - Applicability: description of the carbon black thermal process subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... carbon black thermal process subcategory. 458.20 Section 458.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Thermal Process Subcategory § 458.20 Applicability: description of the carbon black...

  5. 40 CFR 458.30 - Applicability; description of the carbon black channel process subcategory.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... carbon black channel process subcategory. 458.30 Section 458.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Channel Process Subcategory § 458.30 Applicability; description of the carbon black...

  6. How can we deal with ANN in flood forecasting? As a simulation model or updating kernel!

    NASA Astrophysics Data System (ADS)

    Hassan Saddagh, Mohammad; Javad Abedini, Mohammad

    2010-05-01

    Flood forecasting and early warning, as a non-structural measure for flood control, is often considered to be the most effective and suitable alternative to mitigate the damage and human loss caused by floods. Forecast results, which are the output of hydrologic, hydraulic and/or black-box models, should secure the accuracy of flood values and timing, especially for long lead times. The application of the artificial neural network (ANN) in flood forecasting has received extensive attention in recent years due to its capability to capture the dynamics inherent in complex processes, including floods. However, results obtained from executing a plain ANN as a simulation model demonstrate a dramatic reduction in performance indices as lead time increases. This paper is intended to monitor the performance indices as they relate to flood forecasting and early warning using two different methodologies. While the first method employs a multilayer neural network trained using a back-propagation scheme to forecast the output hydrograph of a hypothetical river for various forecast lead times up to 6.0 hr, the second method uses the 1D hydrodynamic MIKE11 model as the forecasting model and a multilayer neural network as an updating kernel, to monitor and assess the performance indices compared to ANN alone as lead time increases. Results presented in both graphical and tabular format indicate the superiority of MIKE11 coupled with an ANN updating kernel compared to the ANN as a simulation model alone. While the plain ANN produces more accurate results for short lead times, the errors grow rapidly for longer lead times. The second methodology provides more accurate and reliable results for longer forecast lead times.
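
    The updating-kernel idea can be sketched as follows: a small network learns the forecasting model's recent errors and adds a correction to the raw forecast at the chosen lead time. The synthetic "observed" series, the biased "model forecast", the lead time and the network size below are illustrative assumptions, not MIKE11 output.

    ```python
    # Sketch: ANN as an error-correcting (updating) kernel on top of a forecast model.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    observed = np.sin(np.linspace(0, 20, 400)) + 0.05 * rng.normal(size=400)
    model_forecast = observed + 0.2 * np.sin(np.linspace(0, 20, 400) + 0.5)  # biased model

    lead = 6                                       # forecast lead, in time steps
    errors = observed - model_forecast
    # Features: last 4 forecast errors; target: the error at t + lead
    X = np.array([errors[i - 4:i] for i in range(4, len(errors) - lead)])
    y = errors[4 + lead:]

    ann = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0).fit(X, y)
    corrected = model_forecast[4 + lead:] + ann.predict(X)
    print("raw RMSE:", np.sqrt(np.mean((observed[4 + lead:] - model_forecast[4 + lead:])**2)))
    print("updated RMSE:", np.sqrt(np.mean((observed[4 + lead:] - corrected)**2)))
    ```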

  7. P- and S-wave Receiver Function Imaging with Scattering Kernels

    NASA Astrophysics Data System (ADS)

    Hansen, S. M.; Schmandt, B.

    2017-12-01

    Full waveform inversion provides a flexible approach to the seismic parameter estimation problem and can account for the full physics of wave propagation using numeric simulations. However, this approach requires significant computational resources due to the demanding nature of solving the forward and adjoint problems. This issue is particularly acute for temporary passive-source seismic experiments (e.g. PASSCAL) that have traditionally relied on teleseismic earthquakes as sources resulting in a global scale forward problem. Various approximation strategies have been proposed to reduce the computational burden such as hybrid methods that embed a heterogeneous regional scale model in a 1D global model. In this study, we focus specifically on the problem of scattered wave imaging (migration) using both P- and S-wave receiver function data. The proposed method relies on body-wave scattering kernels that are derived from the adjoint data sensitivity kernels which are typically used for full waveform inversion. The forward problem is approximated using ray theory yielding a computationally efficient imaging algorithm that can resolve dipping and discontinuous velocity interfaces in 3D. From the imaging perspective, this approach is closely related to elastic reverse time migration. An energy stable finite-difference method is used to simulate elastic wave propagation in a 2D hypothetical subduction zone model. The resulting synthetic P- and S-wave receiver function datasets are used to validate the imaging method. The kernel images are compared with those generated by the Generalized Radon Transform (GRT) and Common Conversion Point stacking (CCP) methods. These results demonstrate the potential of the kernel imaging approach to constrain lithospheric structure in complex geologic environments with sufficiently dense recordings of teleseismic data. This is demonstrated using a receiver function dataset from the Central California Seismic Experiment which shows several dipping interfaces related to the tectonic assembly of this region. Figure 1. Scattering kernel examples for three receiver function phases. A) direct P-to-s (Ps), B) direct S-to-p and C) free-surface PP-to-s (PPs).

  8. Experimental pencil beam kernels derivation for 3D dose calculation in flattening filter free modulated fields

    NASA Astrophysics Data System (ADS)

    Diego Azcona, Juan; Barbés, Benigno; Wang, Lilie; Burguete, Javier

    2016-01-01

    This paper presents a method to obtain the pencil-beam kernels that characterize a megavoltage photon beam generated in a flattening filter free (FFF) linear accelerator (linac) by deconvolution from experimental measurements at different depths. The formalism is applied to perform independent dose calculations in modulated fields. In our previous work a formalism was developed for ideal flat fluences exiting the linac's head. That framework could not deal with spatially varying energy fluences, so any deviation from the ideal flat fluence was treated as a perturbation. The present work addresses the necessity of implementing an exact analysis where any spatially varying fluence can be used, such as those encountered in FFF beams. A major improvement introduced here is to handle the actual fluence in the deconvolution procedure. We studied the uncertainties associated with the kernel derivation with this method. Several Kodak EDR2 radiographic films were irradiated with a 10 MV FFF photon beam from two linacs from different vendors, at depths of 5, 10, 15, and 20 cm in polystyrene (RW3 water-equivalent phantom, PTW Freiburg, Germany). The irradiation field was a 50 mm diameter circular field, collimated with a lead block. The 3D kernel for an FFF beam was obtained by deconvolution using the Hankel transform. A correction of the low-dose part of the kernel was performed to reproduce accurately the experimental output factors. Uncertainty in the kernel derivation procedure was estimated to be within 0.2%. Eighteen modulated fields used clinically in different treatment localizations were irradiated at four measurement depths (a total of fifty-four film measurements). Comparison through the gamma index with their corresponding calculated absolute dose distributions showed a number of passing points (3%, 3 mm) mostly above 99%. This new procedure is more reliable and robust than the previous one. Its ability to perform accurate independent dose calculations was demonstrated.

  9. Producing data-based sensitivity kernels from convolution and correlation in exploration geophysics.

    NASA Astrophysics Data System (ADS)

    Chmiel, M. J.; Roux, P.; Herrmann, P.; Rondeleux, B.

    2016-12-01

    Many studies have shown that seismic interferometry can be used to estimate surface wave arrivals by correlation of seismic signals recorded at a pair of locations. In the case of ambient noise sources, convergence towards the surface wave Green's functions is obtained with the criterion of equipartitioned energy. However, seismic acquisition with active, controlled sources gives more possibilities when it comes to interferometry. The use of controlled sources makes it possible to recover the surface wave Green's function between two points using either correlation or convolution. We investigate the convolutional and correlational approaches using land active-seismic data from exploration geophysics. The data were recorded on 10,710 vertical receivers using 51,808 sources (seismic vibrator trucks). The source spacing is the same in both the X and Y directions (30 m), which is known as "carpet shooting". The receivers are placed in parallel lines with a spacing of 150 m in the X direction and 30 m in the Y direction. Invoking spatial reciprocity between sources and receivers, correlation and convolution functions can thus be constructed between either pairs of receivers or pairs of sources. Benefiting from the dense acquisition, we extract sensitivity kernels from correlation and convolution measurements of the seismic data. These sensitivity kernels are subsequently used to produce phase-velocity dispersion curves between two points and to separate the higher mode from the fundamental mode for surface waves. Potential application to surface wave cancellation is also envisaged.
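
    The correlational idea can be illustrated with two synthetic traces: cross-correlating the recordings of a common wave at two receivers concentrates energy at the inter-receiver travel time. The traces, sampling rate and delay below are assumptions, not the survey data.

    ```python
    # Toy illustration: the cross-correlation peak recovers the inter-receiver delay.
    import numpy as np

    fs = 500.0                                   # sampling rate (Hz)
    t = np.arange(0, 4, 1 / fs)
    wavelet = lambda tau: np.exp(-((t - tau) * 20) ** 2)   # toy surface-wave arrival

    trace_a = wavelet(1.0)                       # arrival at receiver A
    trace_b = wavelet(1.6)                       # same wave arrives 0.6 s later at B
    xcorr = np.correlate(trace_b, trace_a, mode='full')
    lags = (np.arange(len(xcorr)) - (len(t) - 1)) / fs
    print("estimated inter-receiver travel time:", lags[np.argmax(xcorr)], "s")
    ```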

  10. Isolation of rat adrenocortical mitochondria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solinas, Paola; Department of Medicine, Center for Mitochondrial Disease, School of Medicine, Case Western Reserve University, Cleveland, OH 44106; Fujioka, Hisashi

    2012-10-12

    Highlights: • A method for isolation of adrenocortical mitochondria from the adrenal gland of rats is described. • The purified isolated mitochondria show excellent morphological integrity. • The properties of oxidative phosphorylation are excellent. • The method increases the opportunity of direct analysis of adrenal mitochondria from small animals. -- Abstract: This report describes a relatively simple and reliable method for isolating adrenocortical mitochondria from rats in good, reasonably pure yield. These organelles, which heretofore have been unobtainable in isolated form from small laboratory animals, are now readily accessible. A high degree of mitochondrial purity is shown by the electron micrographs, as well as the structural integrity of each mitochondrion. That these organelles have retained their functional integrity is shown by their high respiratory control ratios. In general, the biochemical performance of these adrenal cortical mitochondria closely mirrors that of typical hepatic or cardiac mitochondria.

  11. Establishing a Geologic Baseline Of Cape Canaveral's Natural Landscape: Black Point Drive

    NASA Technical Reports Server (NTRS)

    Parkinson, Randall W.

    2001-01-01

    The goal of this project is to identify the process responsible for the formation of geomorphic features in the Black Point Drive area of Merritt Island National Wildlife Refuge/Kennedy Space Center (MINWR/KSC), northwest Cape Canaveral. This study confirms the principal landscape components (geomorphology) of Black Point Drive reflect interaction between surficial sediments deposited in association with late-Quaternary sea-level highstands and the chemical evolution of late-Cenozoic subsurface limestone formations. The Black Point Drive landscape consists of an undulatory mesic terrain which dips westward into myriad circular and channel-like depression marshes and lakes. This geomorphic gradient may reflect: (1) spatial distinctions in the elevation, character or age of buried (pre-Miocene) limestone formations, (2) dissolution history of late-Quaternary coquina and/or (3) thickness of unconsolidated surface sediment. More detailed evaluation of subsurface data will be necessary before this uncertainty can be resolved.

  12. Establishing A Geologic Baseline of Cape Canaveral's Natural Landscape: Black Point Drive

    NASA Technical Reports Server (NTRS)

    Parkinson, Randall W.

    2002-01-01

    The goal of this project is to identify the process responsible for the formation of geomorphic features in the Black Point Drive area of Merritt Island National Wildlife Refuge/Kennedy Space Center (MINWR/KSC), northwest Cape Canaveral. This study confirms the principal landscape components (geomorphology) of Black Point Drive reflect interaction between surficial sediments deposited in association with late-Quaternary sea-level highstands and the chemical evolution of late-Cenozoic sub-surface limestone formations. The Black Point Drive landscape consists of an undulatory mesic terrain which dips westward into myriad circular and channel-like depression marshes and lakes. This geomorphic gradient may reflect: (1) spatial distinctions in the elevation, character or age of buried (pre-Miocene) limestone formations, (2) dissolution history of late-Quaternary coquina and/or (3) thickness of unconsolidated surface sediment. More detailed evaluation of subsurface data will be necessary before this uncertainty can be resolved.

  13. Friction surfaced Stellite6 coatings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, K. Prasad; Damodaram, R.; Rafi, H. Khalid, E-mail: khalidrafi@gmail.com

    2012-08-15

    Solid state Stellite6 coatings were deposited on a steel substrate by friction surfacing and compared with a Stellite6 cast rod and coatings deposited by gas tungsten arc and plasma transferred arc welding processes. Friction surfaced coatings exhibited finer and uniformly distributed carbides and were characterized by the absence of a solidification structure and by compositional homogeneity compared to the cast rod, gas tungsten arc and plasma transferred coatings. The friction surfaced coating showed relatively higher hardness. X-ray diffraction of samples showed only face centered cubic Co peaks, while the cold worked coating also showed hexagonally close packed Co. - Highlights: • Stellite6 used as coating material for friction surfacing. • Friction surfaced (FS) coatings compared with casting, GTA and PTA processes. • Finer and uniformly distributed carbides in friction surfaced coatings. • Absence of melting results in compositional homogeneity in FS Stellite6 coatings.

  14. Implementation of radiation shielding calculation methods. Volume 2: Seminar/Workshop notes

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.

    1971-01-01

    Detailed descriptions are presented of the input data for each of the MSFC computer codes applied to the analysis of a realistic nuclear propelled vehicle. The analytical techniques employed include cross section data, preparation, one and two dimensional discrete ordinates transport, point kernel, and single scatter methods.

  15. Virtual reality based adaptive dose assessment method for arbitrary geometries in nuclear facility decommissioning.

    PubMed

    Liu, Yong-Kuo; Chao, Nan; Xia, Hong; Peng, Min-Jun; Ayodeji, Abiodun

    2018-05-17

    This paper presents an improved and efficient virtual reality-based adaptive dose assessment method (VRBAM) applicable to the cutting and dismantling tasks in nuclear facility decommissioning. The method combines the modeling strength of virtual reality with the flexibility of adaptive technology. The initial geometry is designed with three-dimensional computer-aided design tools, and a hybrid model composed of cuboids and a point cloud is generated automatically according to the virtual model of the object. In order to improve the efficiency of the dose calculation while retaining accuracy, the hybrid model is converted to a weighted point-cloud model, and the point kernels are generated by adaptively simplifying the weighted point-cloud model according to the detector position, an approach that is suitable for arbitrary geometries. The dose rates are calculated with the Point-Kernel method. To account for radiation scattering effects, buildup factors are calculated with the Geometric-Progression fitting formula. The geometric modeling capability of VRBAM was verified by simulating basic geometries, which included a convex surface, a concave surface, a flat surface and their combination. The simulation results show that VRBAM is more flexible and superior to other approaches in modeling complex geometries. In this paper, the computation time and dose rate results obtained from the proposed method were also compared with those obtained using the MCNP code and an earlier virtual reality-based method (VRBM) developed by the same authors. © 2018 IOP Publishing Ltd.
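
    The Point-Kernel summation itself is straightforward; the sketch below shows the generic form with an attenuation term, an inverse-square term and a buildup factor. The linear buildup stand-in and all numbers are hypothetical, and the adaptive point-cloud simplification that is the paper's contribution is not shown.

    ```python
    # Generic point-kernel dose-rate summation (illustrative parameters only).
    import numpy as np

    def dose_rate(kernels, source_strengths, detector, mu, buildup):
        """Sum point-kernel contributions S * B * exp(-mu*d) / (4*pi*d^2) at one detector."""
        d = np.linalg.norm(kernels - detector, axis=1)      # kernel-detector distances
        flux = source_strengths * buildup(mu * d) * np.exp(-mu * d) / (4 * np.pi * d**2)
        return flux.sum()

    # Toy buildup factor, linear in the number of mean free paths (not the GP formula)
    linear_buildup = lambda mfp: 1.0 + 0.5 * mfp

    kernels = np.random.rand(500, 3)          # point kernels sampled over the source geometry
    strengths = np.full(500, 1.0 / 500)       # weights summing to unit source strength
    print(dose_rate(kernels, strengths, detector=np.array([2.0, 0.0, 0.0]),
                    mu=0.06, buildup=linear_buildup))
    ```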

  16. A voting-based statistical cylinder detection framework applied to fallen tree mapping in terrestrial laser scanning point clouds

    NASA Astrophysics Data System (ADS)

    Polewski, Przemyslaw; Yao, Wei; Heurich, Marco; Krzystek, Peter; Stilla, Uwe

    2017-07-01

    This paper introduces a statistical framework for detecting cylindrical shapes in dense point clouds. We target the application of mapping fallen trees in datasets obtained through terrestrial laser scanning. This is a challenging task due to the presence of ground vegetation, standing trees, DTM artifacts, as well as the fragmentation of dead trees into non-collinear segments. Our method shares the concept of voting in parameter space with the generalized Hough transform, however two of its significant drawbacks are improved upon. First, the need to generate samples on the shape's surface is eliminated. Instead, pairs of nearby input points lying on the surface cast a vote for the cylinder's parameters based on the intrinsic geometric properties of cylindrical shapes. Second, no discretization of the parameter space is required: the voting is carried out in continuous space by means of constructing a kernel density estimator and obtaining its local maxima, using automatic, data-driven kernel bandwidth selection. Furthermore, we show how the detected cylindrical primitives can be efficiently merged to obtain object-level (entire tree) semantic information using graph-cut segmentation and a tailored dynamic algorithm for eliminating cylinder redundancy. Experiments were performed on 3 plots from the Bavarian Forest National Park, with ground truth obtained through visual inspection of the point clouds. It was found that relative to sample consensus (SAC) cylinder fitting, the proposed voting framework can improve the detection completeness by up to 10 percentage points while maintaining the correctness rate.
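
    The continuous voting step can be sketched in one dimension: votes are smoothed with a Gaussian kernel density estimator using a data-driven bandwidth, and local maxima of the density are taken as candidate parameters. The radius values below are synthetic; the actual method votes in the full cylinder parameter space.

    ```python
    # Sketch: continuous-space voting via kernel density estimation and local maxima.
    import numpy as np
    from scipy.stats import gaussian_kde
    from scipy.signal import argrelmax

    rng = np.random.default_rng(0)
    votes = np.concatenate([rng.normal(0.15, 0.01, 300),    # votes near radius 0.15 m
                            rng.normal(0.32, 0.02, 200),    # votes near radius 0.32 m
                            rng.uniform(0.05, 0.5, 100)])   # clutter votes

    kde = gaussian_kde(votes)                 # data-driven bandwidth (Scott's rule)
    grid = np.linspace(0.05, 0.5, 500)
    density = kde(grid)
    peaks = grid[argrelmax(density)[0]]
    print("candidate cylinder radii:", np.round(peaks, 3))
    ```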

  17. The SPH consistency problem and some astrophysical applications

    NASA Astrophysics Data System (ADS)

    Klapp, Jaime; Sigalotti, Leonardo; Rendon, Otto; Gabbasov, Ruslan; Torres, Ayax

    2017-11-01

    We discuss the SPH kernel and particle consistency problem and demonstrate that SPH has a limiting second-order convergence rate. We also present a solution to the SPH consistency problem. We present examples of how SPH implementations that are not mathematically consistent may lead to erroneous results. The new formalism has been implemented into the Gadget 2 code, including an improved scheme for the artificial viscosity. We present results for the "Standard Isothermal Test Case" of gravitational collapse and fragmentation of protostellar molecular cores that produce a very different evolution than with the standard SPH theory. A further application of accretion onto a black hole is presented.

  18. Turning Point Instabilities for Relativistic Stars and Black Holes

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua; Wald, Robert

    2014-03-01

    In the light of recent results relating dynamic and thermodynamic stability of relativistic stars and black holes, we re-examine the relationship between "turning points" (i.e., extrema of thermodynamic variables along a one-parameter family of solutions) and instabilities. We give a proof of Sorkin's general result, showing the existence of a thermodynamic instability on one side of a turning point, that does not rely on heuristic arguments involving infinite-dimensional manifold structure. We use the turning point results to prove the existence of a dynamic instability of black rings in 5 spacetime dimensions in the region where cJ > 0, in agreement with a result of Figueras, Murata, and Reall.

  19. Control of Early Flame Kernel Growth by Multi-Wavelength Laser Pulses for Enhanced Ignition

    DOE PAGES

    Dumitrache, Ciprian; VanOsdol, Rachel; Limbach, Christopher M.; ...

    2017-08-31

    The present contribution examines the impact of plasma dynamics and plasma-driven fluid dynamics on the flame growth of laser ignited mixtures and shows that a new dual-pulse scheme can be used to control the kernel formation process in ways that extend the lean ignition limit. We do this by performing a comparative study between (conventional) single-pulse laser ignition (λ = 1064 nm) and a novel dual-pulse method based on combining an ultraviolet (UV) pre-ionization pulse (λ = 266 nm) with an overlapped near-infrared (NIR) energy addition pulse (λ = 1064 nm). We employ OH* chemiluminescence to visualize the evolution of the early flame kernel. For single-pulse laser ignition at lean conditions, the flame kernel separates through third lobe detachment, corresponding to high strain rates that extinguish the flame. In this work, we investigate the capabilities of the dual-pulse to control the plasma-driven fluid dynamics by adjusting the axial offset of the two focal points. In particular, we find there exists a beam waist offset whereby the resulting vorticity suppresses formation of the third lobe, consequently reducing flame stretch. With this approach, we demonstrate that the dual-pulse method enables reduced flame speeds (at early times), an extended lean limit, increased combustion efficiency, and decreased laser energy requirements.

  20. Infrared microspectroscopic imaging of plant tissues: spectral visualization of Triticum aestivum kernel and Arabidopsis leaf microstructure.

    PubMed

    Warren, Frederick J; Perston, Benjamin B; Galindez-Najera, Silvia P; Edwards, Cathrina H; Powell, Prudence O; Mandalari, Giusy; Campbell, Grant M; Butterworth, Peter J; Ellis, Peter R

    2015-11-01

    Infrared microspectroscopy is a tool with potential for studies of the microstructure, chemical composition and functionality of plants at a subcellular level. Here we present the use of high-resolution bench top-based infrared microspectroscopy to investigate the microstructure of Triticum aestivum L. (wheat) kernels and Arabidopsis leaves. Images of isolated wheat kernel tissues and whole wheat kernels following hydrothermal processing and simulated gastric and duodenal digestion were generated, as well as images of Arabidopsis leaves at different points during a diurnal cycle. Individual cells and cell walls were resolved, and large structures within cells, such as starch granules and protein bodies, were clearly identified. Contrast was provided by converting the hyperspectral image cubes into false-colour images using either principal component analysis (PCA) overlays or by correlation analysis. The unsupervised PCA approach provided a clear view of the sample microstructure, whereas the correlation analysis was used to confirm the identity of different anatomical structures using the spectra from isolated components. It was then demonstrated that gelatinized and native starch within cells could be distinguished, and that the loss of starch during wheat digestion could be observed, as well as the accumulation of starch in leaves during a diurnal period. © 2015 The Authors The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.
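
    The unsupervised PCA overlay can be sketched as follows: each pixel spectrum is projected onto the first three principal components, which are then scaled and mapped to RGB channels. The cube here is synthetic; its dimensions and wavelength count are assumptions.

    ```python
    # Sketch: false-colour overlay of a hyperspectral cube from its first three PCs.
    import numpy as np
    from sklearn.decomposition import PCA

    cube = np.random.rand(64, 64, 120)                  # rows x cols x wavelengths (toy data)
    pixels = cube.reshape(-1, cube.shape[2])            # one spectrum per pixel

    scores = PCA(n_components=3).fit_transform(pixels)  # unsupervised PCA on spectra
    scores -= scores.min(axis=0)
    scores /= scores.max(axis=0)                        # scale each component to [0, 1]
    false_colour = scores.reshape(cube.shape[0], cube.shape[1], 3)
    print(false_colour.shape)                           # (64, 64, 3) RGB overlay
    ```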

  1. Control of Early Flame Kernel Growth by Multi-Wavelength Laser Pulses for Enhanced Ignition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumitrache, Ciprian; VanOsdol, Rachel; Limbach, Christopher M.

    The present contribution examines the impact of plasma dynamics and plasma-driven fluid dynamics on the flame growth of laser ignited mixtures and shows that a new dual-pulse scheme can be used to control the kernel formation process in ways that extend the lean ignition limit. We do this by performing a comparative study between (conventional) single-pulse laser ignition (λ = 1064 nm) and a novel dual-pulse method based on combining an ultraviolet (UV) pre-ionization pulse (λ = 266 nm) with an overlapped near-infrared (NIR) energy addition pulse (λ = 1064 nm). We employ OH* chemiluminescence to visualize the evolution of the early flame kernel. For single-pulse laser ignition at lean conditions, the flame kernel separates through third lobe detachment, corresponding to high strain rates that extinguish the flame. In this work, we investigate the capabilities of the dual-pulse to control the plasma-driven fluid dynamics by adjusting the axial offset of the two focal points. In particular, we find there exists a beam waist offset whereby the resulting vorticity suppresses formation of the third lobe, consequently reducing flame stretch. With this approach, we demonstrate that the dual-pulse method enables reduced flame speeds (at early times), an extended lean limit, increased combustion efficiency, and decreased laser energy requirements.

  2. Initial Kernel Timing Using a Simple PIM Performance Model

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Block, Gary L.; Springer, Paul L.; Sterling, Thomas; Brockman, Jay B.; Callahan, David

    2005-01-01

    This presentation will describe some initial results of paper-and-pencil studies of 4 or 5 application kernels applied to a processor-in-memory (PIM) system roughly similar to the Cascade Lightweight Processor (LWP). The application kernels are: * Linked list traversal * Sum of leaf nodes on a tree * Bitonic sort * Vector sum * Gaussian elimination. The intent of this work is to guide and validate work on the Cascade project in the areas of compilers, simulators, and languages. We will first discuss the generic PIM structure. Then, we will explain the concepts needed to program a parallel PIM system (locality, threads, parcels). Next, we will present a simple PIM performance model that will be used in the remainder of the presentation. For each kernel, we will then present a set of codes, including codes for a single PIM node, and codes for multiple PIM nodes that move data to threads and move threads to data. These codes are written at a fairly low level, between assembly and C, but much closer to C than to assembly. For each code, we will present some hand-drafted timing forecasts, based on the simple PIM performance model. Finally, we will conclude by discussing what we have learned from this work, including what programming styles seem to work best, from the point-of-view of both expressiveness and performance.
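
    To make two of the listed kernels concrete, the snippet below gives plain reference versions in Python. These are illustrative host-side implementations only; they show what the kernels compute, not the low-level PIM codes or the locality/thread/parcel mechanisms discussed in the presentation.

        # Reference versions of two kernels from the list above (illustrative only).
        class Node:
            def __init__(self, value, next=None):
                self.value = value
                self.next = next

        def linked_list_traversal_sum(head):
            """Walk a singly linked list and accumulate its values."""
            total = 0
            while head is not None:
                total += head.value
                head = head.next
            return total

        def vector_sum(a, b):
            """Element-wise sum of two equal-length vectors."""
            return [x + y for x, y in zip(a, b)]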

  3. Control of Early Flame Kernel Growth by Multi-Wavelength Laser Pulses for Enhanced Ignition.

    PubMed

    Dumitrache, Ciprian; VanOsdol, Rachel; Limbach, Christopher M; Yalin, Azer P

    2017-08-31

    The present contribution examines the impact of plasma dynamics and plasma-driven fluid dynamics on the flame growth of laser ignited mixtures and shows that a new dual-pulse scheme can be used to control the kernel formation process in ways that extend the lean ignition limit. We perform a comparative study between (conventional) single-pulse laser ignition (λ = 1064 nm) and a novel dual-pulse method based on combining an ultraviolet (UV) pre-ionization pulse (λ = 266 nm) with an overlapped near-infrared (NIR) energy addition pulse (λ = 1064 nm). We employ OH* chemiluminescence to visualize the evolution of the early flame kernel. For single-pulse laser ignition at lean conditions, the flame kernel separates through third lobe detachment, corresponding to high strain rates that extinguish the flame. In this work, we investigate the capabilities of the dual-pulse to control the plasma-driven fluid dynamics by adjusting the axial offset of the two focal points. In particular, we find there exists a beam waist offset whereby the resulting vorticity suppresses formation of the third lobe, consequently reducing flame stretch. With this approach, we demonstrate that the dual-pulse method enables reduced flame speeds (at early times), an extended lean limit, increased combustion efficiency, and decreased laser energy requirements.

  4. Fault Network Reconstruction using Agglomerative Clustering: Applications to South Californian Seismicity

    NASA Astrophysics Data System (ADS)

    Kamer, Yavor; Ouillon, Guy; Sornette, Didier; Wössner, Jochen

    2014-05-01

    We present applications of a new clustering method for fault network reconstruction based on the spatial distribution of seismicity. Unlike common approaches that start from the simplest large scale and gradually increase the complexity trying to explain the small scales, our method uses a bottom-up approach, by an initial sampling of the small scales and then reducing the complexity. The new approach also exploits the location uncertainty associated with each event in order to obtain a more accurate representation of the spatial probability distribution of the seismicity. For a given dataset, we first construct an agglomerative hierarchical cluster (AHC) tree based on Ward's minimum variance linkage. Such a tree starts out with one cluster and progressively branches out into an increasing number of clusters. To atomize the structure into its constitutive protoclusters, we initialize a Gaussian Mixture Modeling (GMM) at a given level of the hierarchical clustering tree. We then let the GMM converge using an Expectation Maximization (EM) algorithm. The kernels that become ill defined (less than 4 points) at the end of the EM are discarded. By incrementing the number of initialization clusters (by atomizing at increasingly populated levels of the AHC tree) and repeating the procedure above, we are able to determine the maximum number of Gaussian kernels the structure can hold. The kernels in this configuration constitute our protoclusters. In this setting, merging of any pair will lessen the likelihood (calculated over the pdf of the kernels) but in turn will reduce the model's complexity. The information loss/gain of any possible merging can thus be quantified based on the Minimum Description Length (MDL) principle. Similar to an inter-distance matrix, where the matrix element di,j gives the distance between points i and j, we can construct a MDL gain/loss matrix where mi,j gives the information gain/loss resulting from the merging of kernels i and j. Based on this matrix, merging events resulting in MDL gain are performed in descending order until no gainful merging is possible anymore. We envision that the results of this study could lead to a better understanding of the complex interactions within the Californian fault system and hopefully use the acquired insights for earthquake forecasting.
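
    As a rough illustration of the initialization stage described above, the sketch below cuts a Ward linkage tree at a chosen level, seeds a Gaussian mixture with the resulting cluster centroids, fits it by EM, and discards kernels supported by fewer than 4 events. The MDL-based merging stage is only summarized in a comment, and the library calls, threshold, and variable names are assumptions made for this example, not the authors' code.

        # Sketch: AHC-seeded Gaussian mixture for seismicity "protoclusters".
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.mixture import GaussianMixture

        def protoclusters(events, n_init_clusters):
            """events: (n_events, 3) array of hypocenter coordinates."""
            tree = linkage(events, method="ward")                        # AHC tree (Ward linkage)
            labels = fcluster(tree, t=n_init_clusters, criterion="maxclust")
            means = np.array([events[labels == k].mean(axis=0) for k in np.unique(labels)])
            gmm = GaussianMixture(n_components=len(means), means_init=means).fit(events)  # EM
            counts = np.bincount(gmm.predict(events), minlength=gmm.n_components)
            keep = counts >= 4                                            # drop ill-defined kernels
            # A further stage would merge pairs of surviving kernels whenever the merge
            # reduces the description length (MDL gain), until no gainful merge remains.
            return gmm, keep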

  5. Experiences modeling ocean circulation problems on a 30 node commodity cluster with 3840 GPU processor cores.

    NASA Astrophysics Data System (ADS)

    Hill, C.

    2008-12-01

    Low-cost graphics cards today use many, relatively simple, compute cores to deliver memory bandwidth of more than 100 GB/s and theoretical floating point performance of more than 500 GFlop/s. Right now this performance is, however, only accessible to highly parallel algorithm implementations that (i) can use a hundred or more, 32-bit floating point, concurrently executing cores, (ii) can work with graphics memory that resides on the graphics card side of the graphics bus and (iii) can be partially expressed in a language that can be compiled by a graphics programming tool. In this talk we describe our experiences implementing a complete, but relatively simple, time dependent shallow-water equations simulation targeting a cluster of 30 computers each hosting one graphics card. The implementation takes into account the considerations (i), (ii) and (iii) listed previously. We code our algorithm as a series of numerical kernels. Each kernel is designed to be executed by multiple threads of a single process. Kernels are passed memory blocks to compute over which can be persistent blocks of memory on a graphics card. Each kernel is individually implemented using the NVidia CUDA language but driven from a higher level supervisory code that is almost identical to a standard model driver. The supervisory code controls the overall simulation timestepping, but is written to minimize data transfer between main memory and graphics memory (a massive performance bottleneck on current systems). Using the recipe outlined we can boost the performance of our cluster by nearly an order of magnitude, relative to the same algorithm executing only on the cluster CPUs. Achieving this performance boost requires that many threads are available to each graphics processor for execution within each numerical kernel and that the simulation's working set of data can fit into the graphics card memory. As we describe, this puts interesting upper and lower bounds on the problem sizes for which this technology is currently most useful. However, many interesting problems fit within this envelope. Looking forward, we extrapolate our experience to estimate full-scale ocean model performance and applicability. Finally we describe preliminary hybrid mixed 32-bit and 64-bit experiments with graphics cards that support 64-bit arithmetic, albeit at a lower performance.
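
    The data-placement recipe described above (keep the working set resident in graphics memory, drive the time stepping from a thin supervisory loop, and copy results back to main memory only when output is needed) can be sketched as follows. This is a hypothetical Python/CuPy illustration, not the authors' CUDA implementation, and the update lines are placeholders rather than a correct shallow-water solver.

        import cupy as cp   # assumes a CUDA-capable GPU with CuPy installed

        def run_simulation(h0, u0, v0, n_steps, dt, g=9.81, output_every=100):
            # Move the working set to device memory once, up front.
            h, u, v = cp.asarray(h0), cp.asarray(u0), cp.asarray(v0)
            outputs = []
            for step in range(n_steps):
                # Placeholder "kernels": each line stands in for a device-side update step.
                h = h - dt * (cp.gradient(u, axis=1) + cp.gradient(v, axis=0))
                u = u - dt * g * cp.gradient(h, axis=1)
                v = v - dt * g * cp.gradient(h, axis=0)
                if step % output_every == 0:
                    outputs.append(cp.asnumpy(h))   # the only device-to-host transfer
            return outputs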

  6. EGCG debilitates the persistence of EBV latency by reducing the DNA binding potency of nuclear antigen 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Ya-Lin; Tsai, Hsing-Lyn; Peng, Chih-Wen, E-mail: pengcw@mail.tcu.edu.tw

    Highlights: • Two cell-based reporter platforms were established for screening of EBNA1 inhibitors. • EGCG acts as an inhibitor to block EBNA1 binding with the cognate oriP sequence. • EGCG debilitates EBNA1-dependent transcription enhancement and episome maintenance. • EGCG impairs persistence of EBV latency. • EGCG is a potent anti-EBV agent for targeting the latent cascade of EBV. -- Abstract: Because the expression of EBNA1 is prevalent in all EBV-associated tumors, it has become one of the most attractive drug targets for the discovery of anti-EBV compounds. In a cell-based reporter system, EBNA1 consistently upregulated the transcription of an oriP-Luc mini-EBV episome by 6- to 8-fold. The treatment of cells with 50 μM EGCG effectively blocked the binding of EBNA1 to oriP-DNA both in vivo and in vitro, which led to the abrogation of EBNA1-dependent episome maintenance and transcriptional enhancement. Importantly, the anti-EBNA1 effects caused by EGCG ultimately impaired the persistence of EBV latent infection. Our data suggest that the inhibition of EBNA1 activity by EGCG could be a promising starting point for the development of new protocols for anti-EBV therapy.

  7. Livermore Compiler Analysis Loop Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hornung, R. D.

    2013-03-01

    LCALS is designed to evaluate compiler optimizations and performance of a variety of loop kernels and loop traversal software constructs. Some of the loop kernels are pulled directly from "Livermore Loops Coded in C", developed at LLNL (see item 11 below for details of earlier code versions). The older suites were used to evaluate floating-point performance of hardware platforms prior to porting larger application codes. The LCALS suite is geared toward assessing C++ compiler optimizations and platform performance related to SIMD vectorization, OpenMP threading, and advanced C++ language features. LCALS contains 20 of 24 loop kernels from the older Livermore Loop suites, plus various others representative of loops found in current production application codes at LLNL. The latter loops emphasize more diverse loop constructs and data access patterns than the others, such as multi-dimensional difference stencils. The loops are included in a configurable framework, which allows control of compilation, loop sampling for execution timing, and which loops are run and their lengths. It generates timing statistics for analysis and comparing variants of individual loops. Also, it is easy to add loops to the suite as desired.

  8. Delimiting Areas of Endemism through Kernel Interpolation

    PubMed Central

    Oliveira, Ubirajara; Brescovit, Antonio D.; Santos, Adalberto J.

    2015-01-01

    We propose a new approach for identification of areas of endemism, the Geographical Interpolation of Endemism (GIE), based on kernel spatial interpolation. This method differs from others in being independent of grid cells. This new approach is based on estimating the overlap between the distribution of species through a kernel interpolation of centroids of species distribution and areas of influence defined from the distance between the centroid and the farthest point of occurrence of each species. We used this method to delimit areas of endemism of spiders from Brazil. To assess the effectiveness of GIE, we analyzed the same data using Parsimony Analysis of Endemism and NDM and compared the areas identified through each method. The analyses using GIE identified 101 areas of endemism of spiders in Brazil. GIE was demonstrated to be effective in identifying areas of endemism at multiple scales, with fuzzy edges and supported by more synendemic species than in the other methods. The areas of endemism identified with GIE were generally congruent with those identified for other taxonomic groups, suggesting that common processes can be responsible for the origin and maintenance of these biogeographic units. PMID:25611971
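
    To make the per-species quantities concrete, the sketch below computes, for one species, the centroid of its occurrence points and the "area of influence" radius given by the distance from the centroid to the farthest occurrence; the kernel interpolation of overlapping centroids onto a map is not reproduced here. Variable names and the coordinate convention are assumptions for the example, not the published implementation.

        # Sketch: per-species centroid and area-of-influence radius for a GIE-style analysis.
        import numpy as np

        def centroid_and_radius(points):
            """points: (n_occurrences, 2) array of (longitude, latitude) for one species."""
            centroid = points.mean(axis=0)
            radius = np.linalg.norm(points - centroid, axis=1).max()   # farthest occurrence
            return centroid, radius

        # Usage over a dict of species -> occurrence arrays (hypothetical data):
        # summaries = {name: centroid_and_radius(pts) for name, pts in occurrences.items()}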

  9. Exploring the Brighter-fatter Effect with the Hyper Suprime-Cam

    NASA Astrophysics Data System (ADS)

    Coulton, William R.; Armstrong, Robert; Smith, Kendrick M.; Lupton, Robert H.; Spergel, David N.

    2018-06-01

    The brighter-fatter effect has been postulated to arise due to the build up of a transverse electric field, produced as photocharges accumulate in the pixels’ potential wells. We investigate the brighter-fatter effect in the Hyper Suprime-Cam by examining flat fields and moments of stars. We observe deviations from the expected linear relation in the photon transfer curve (PTC), luminosity-dependent correlations between pixels in flat-field images, and a luminosity-dependent point-spread function (PSF) in stellar observations. Under the key assumptions of translation invariance and Maxwell’s equations in the quasi-static limit, we give a first-principles proof that the effect can be parameterized by a translationally invariant scalar kernel. We describe how this kernel can be estimated from flat fields and discuss how this kernel has been used to remove the brighter-fatter distortions in Hyper Suprime-Cam images. We find that our correction restores the expected linear relation in the PTCs and significantly reduces, but does not completely remove, the luminosity dependence of the PSF over a wide range of magnitudes.
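
    As background to the estimate-the-kernel-from-flat-fields step mentioned above, the sketch below measures flux-dependent pixel correlations from a pair of flat-field exposures, the kind of measurement such a kernel is typically built from. It is not the authors' pipeline; the array names, lag range, and normalization are assumptions for this example.

        # Sketch: nearest-neighbour pixel correlations from a pair of flat fields.
        import numpy as np

        def flat_pair_correlations(flat1, flat2, max_lag=4):
            """Difference two equal-exposure flats to remove fixed-pattern structure, then
            measure the spatial autocorrelation of the residual noise; brighter-fatter
            charge redistribution shows up as correlations that grow with flux."""
            diff = flat1.astype(float) - flat2.astype(float)
            diff -= diff.mean()
            var = diff.var()
            corr = np.empty((max_lag + 1, max_lag + 1))
            for dy in range(max_lag + 1):
                for dx in range(max_lag + 1):
                    a = diff[dy:, dx:]
                    b = diff[:diff.shape[0] - dy, :diff.shape[1] - dx]
                    corr[dy, dx] = (a * b).mean() / var
            return corr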

  10. The coordinate coherent states approach revisited

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miao, Yan-Gang, E-mail: miaoyg@nankai.edu.cn; Zhang, Shao-Jun, E-mail: sjzhang@mail.nankai.edu.cn

    2013-02-15

    We revisit the coordinate coherent states approach through two different quantization procedures in the quantum field theory on the noncommutative Minkowski plane. The first procedure, which is based on the normal commutation relation between annihilation and creation operators, deduces that a point mass can be described by a Gaussian function instead of the usual Dirac delta function. However, we question this specific quantization by adopting the canonical one (based on the canonical commutation relation between a field and its conjugate momentum) and show that a point mass should still be described by the Dirac delta function, which implies that the concept of point particles is still valid when we deal with the noncommutativity by following the coordinate coherent states approach. In order to investigate the dependence on quantization procedures, we apply the two quantization procedures to the Unruh effect and Hawking radiation and find that they give rise to significantly different results. Under the first quantization procedure, the Unruh temperature and Unruh spectrum are not deformed by noncommutativity, but the Hawking temperature is deformed by noncommutativity while the radiation spectrum is intact. However, under the second quantization procedure, the Unruh temperature and Hawking temperature are intact, but both spectra are modified by an effective greybody (deformed) factor. - Highlights: • Suggest a canonical quantization in the coordinate coherent states approach. • Prove the validity of the concept of point particles. • Apply the canonical quantization to the Unruh effect and Hawking radiation. • Find no deformations in the Unruh temperature and Hawking temperature. • Provide the modified spectra of the Unruh effect and Hawking radiation.

  11. Revisiting the Cramér Rao Lower Bound for Elastography: Predicting the Performance of Axial, Lateral and Polar Strain Elastograms.

    PubMed

    Verma, Prashant; Doyley, Marvin M

    2017-09-01

    We derived the Cramér Rao lower bound for 2-D estimators employed in quasi-static elastography. To illustrate the theory, we modeled the 2-D point spread function as a sinc-modulated sine pulse in the axial direction and as a sinc function in the lateral direction. We compared theoretical predictions of the variance incurred in displacements and strains when quasi-static elastography was performed under varying conditions (different scanning methods, different configurations of conventional linear array imaging, and different-size kernels) with those measured from simulated or experimentally acquired data. We performed studies to illustrate the application of the derived expressions when performing vascular elastography with plane wave and compounded plane wave imaging. Standard deviations in lateral displacements were an order of magnitude higher than those in the axial direction. Additionally, the derived expressions predicted that peak performance should occur when 2% strain is applied, the same order of magnitude as observed in simulations (1%) and experiments (1%-2%). We assessed how different configurations of conventional linear array imaging (number of active reception and transmission elements) influenced the quality of axial and lateral strain elastograms. The theoretical expressions predicted that 2-D echo tracking should be performed with wide kernels, but the length of the kernels should be selected using knowledge of the magnitude of the applied strain: specifically, longer kernels for small strains (<5%) and shorter kernels for larger strains. Although the general trends of theoretical predictions and experimental observations were similar, biases incurred during beamforming and subsample displacement estimation produced noticeable differences. Copyright © 2017 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
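
    For reference, the general unbiased-estimator form of the bound that such an analysis builds on can be written as below; the notation is generic (Fisher information matrix I(θ) for data x), not the paper's.

        \operatorname{cov}(\hat{\boldsymbol{\theta}}) \succeq \mathbf{I}(\boldsymbol{\theta})^{-1},
        \qquad
        \bigl[\mathbf{I}(\boldsymbol{\theta})\bigr]_{ij}
          = \mathbb{E}\!\left[\frac{\partial \ln p(\mathbf{x};\boldsymbol{\theta})}{\partial \theta_i}\,
                              \frac{\partial \ln p(\mathbf{x};\boldsymbol{\theta})}{\partial \theta_j}\right]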

  12. Validation of Born Traveltime Kernels

    NASA Astrophysics Data System (ADS)

    Baig, A. M.; Dahlen, F. A.; Hung, S.

    2001-12-01

    Most inversions for Earth structure using seismic traveltimes rely on linear ray theory to translate observed traveltime anomalies into seismic velocity anomalies distributed throughout the mantle. However, ray theory is not an appropriate tool to use when velocity anomalies have scale lengths less than the width of the Fresnel zone. In the presence of these structures, we need to turn to a scattering theory in order to adequately describe all of the features observed in the waveform. By coupling the Born approximation to ray theory, the first-order dependence of the cross-correlated traveltimes on heterogeneity (described by the Fréchet derivative or, more colourfully, the banana-doughnut kernel) may be determined. To determine for what range of parameters these banana-doughnut kernels outperform linear ray theory, we generate several random media specified by their statistical properties, namely the RMS slowness perturbation and the scale length of the heterogeneity. Acoustic waves are numerically generated from a point source using a 3-D pseudo-spectral wave propagation code. These waves are then recorded at a variety of propagation distances from the source, introducing a third parameter to the problem: the number of wavelengths traversed by the wave. When all of the heterogeneity has scale lengths larger than the width of the Fresnel zone, ray theory does as good a job at predicting the cross-correlated traveltime as the banana-doughnut kernels do. Below this limit, wavefront healing becomes a significant effect and ray theory ceases to be effective even though the kernels remain relatively accurate provided the heterogeneity is weak. The study of wave propagation in random media is of a more general interest and we will also show how our measurements of the velocity shift and the variance of traveltime compare to various theoretical predictions in a given regime.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ren; Trindade, Alexandre; Instituto Gulbenkian de Ciencia, Oeiras

    Highlights: • Low dose Dll4-Fc increases vascular proliferation and overall perfusion. • Low dose Dll4-Fc helps vascular injury recovery in hindlimb ischemia model. • Low dose Dll4-Fc helps vascular injury recovery in skin flap model. • Dll4 heterozygous deletion promotes vascular injury recovery. • Dll4 overexpression delays vascular injury recovery. -- Abstract: Notch pathway regulates vessel development and maturation. Dll4, a high-affinity ligand for Notch, is expressed predominantly in the arterial endothelium and is induced by hypoxia among other factors. Inhibition of Dll4 has paradoxical effects of reducing the maturation and perfusion in newly forming vessels while increasing the density of vessels. We hypothesized that partial and/or intermittent inhibition of Dll4 may lead to increased vascular response and still allow vascular maturation to occur. Thus tissue perfusion can be restored rapidly, allowing quicker recovery from ischemia or tissue injury. Our studies in two different models (hindlimb ischemia and skin flap) show that inhibition of Dll4 at low dose allows faster recovery from vascular and tissue injury. This opens a new possibility for Dll4 blockade's therapeutic application in promoting recovery from vascular injury and restoring blood supply to ischemic tissues.

  14. Two stage fluid bed-plasma gasification process for solid waste valorisation: Technical review and preliminary thermodynamic modelling of sulphur emissions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morrin, Shane, E-mail: shane.morrin@ucl.ac.uk; Advanced Plasma Power, South Marston Business park, Swindon, SN3 4DE; Lettieri, Paola, E-mail: p.lettieri@ucl.ac.uk

    2012-04-15

    Highlights: • We investigate sulphur during MSW gasification within a fluid bed-plasma process. • We review the literature on the feed, sulphur and process principles therein. • The need for research in this area was identified. • We perform thermodynamic modelling of the fluid bed stage. • Initial findings indicate the prominence of solid phase sulphur. - Abstract: Gasification of solid waste for energy has significant potential given an abundant feed supply and strong policy drivers. Nonetheless, significant ambiguities in the knowledge base are apparent. Consequently this study investigates sulphur mechanisms within a novel two stage fluid bed-plasma gasification process. This paper includes a detailed review of gasification and plasma fundamentals in relation to the specific process, along with insight on MSW based feedstock properties and sulphur pollutant therein. As a first step to understanding sulphur partitioning and speciation within the process, thermodynamic modelling of the fluid bed stage has been performed. Preliminary findings, supported by plant experience, indicate the prominence of solid phase sulphur species (as opposed to H2S) - Na and K based species in particular. Work is underway to further investigate and validate this.

  15. VCC-1 over-expression inhibits cisplatin-induced apoptosis in HepG2 cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Zhitao; Lu, Xiao; Zhu, Ping

    Highlights: • VCC-1 is hypothesized to be associated with carcinogenesis. • Levels of VCC-1 are increased significantly in HCC. • Over-expression of VCC-1 promotes the cellular proliferation rate. • Over-expression of VCC-1 inhibits cisplatin-provoked apoptosis in HepG2 cells. • VCC-1 plays an important role in controlling tumor growth and apoptosis. -- Abstract: Vascular endothelial growth factor-correlated chemokine 1 (VCC-1), a recently described chemokine, is hypothesized to be associated with carcinogenesis. However, the molecular mechanisms by which aberrant VCC-1 expression determines poor outcomes of cancers are unknown. In this study, we found that VCC-1 was highly expressed in hepatocellular carcinoma (HCC) tissue. It was also associated with proliferation of HepG2 cells, and inhibition of cisplatin-induced apoptosis of HepG2 cells. Conversely, down-regulation of VCC-1 in HepG2 cells increased cisplatin-induced apoptosis of HepG2 cells. In summary, these results suggest that VCC-1 is involved in cisplatin-induced apoptosis of HepG2 cells, and also provides some evidence for VCC-1 as a potential cellular target for chemotherapy.

  16. Single Particle Tracking reveals two distinct environments for CD4 receptors at the surface of living T lymphocytes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mascalchi, Patrice; Lamort, Anne Sophie; Salome, Laurence

    2012-01-06

    Highlights: • We studied the diffusion of single CD4 receptors on living lymphocytes. • This study reveals that CD4 receptors have either a random or confined diffusion. • The dynamics of unconfined CD4 receptors was accelerated by a temperature raise. • The dynamics of confined CD4 receptors was unchanged by a temperature raise. • Our results suggest the existence of two different environments for CD4 receptors. -- Abstract: We investigated the lateral diffusion of the HIV receptor CD4 at the surface of T lymphocytes at 20 °C and 37 °C by Single Particle Tracking using Quantum Dots. We found that the receptors presented two major distinct behaviors that were not equally affected by temperature changes. About half of the receptors showed a random diffusion with a diffusion coefficient increasing upon raising the temperature. The other half of the receptors was permanently or transiently confined with unchanged dynamics on raising the temperature. These observations suggest that two distinct subpopulations of CD4 receptors with different environments are present at the surface of living T lymphocytes.

  17. Platelets to rings: Influence of sodium dodecyl sulfate on Zn-Al layered double hydroxide morphology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Ceren; Unal, Ugur; Koc University, Chemistry Department, Rumelifeneri yolu, Sariyer 34450, Istanbul

    2012-03-15

    In the current study, the influence of sodium dodecyl sulfate (SDS) on the crystallization of Zn-Al layered double hydroxide (LDH) was investigated. Depending on the SDS concentration, coral-like and, for the first time, ring-like morphologies were obtained using a urea-hydrolysis method. It was revealed that the surfactant level in the starting solution plays an important role in the morphology. Concentration of surfactant equal to or above the anion exchange capacity of the LDH is influential in creating different morphologies. Another important parameter was the critical micelle concentration (CMC) of the surfactant. Surfactant concentrations well above the CMC value resulted in ring-like structures. The crystallization mechanism was discussed. - Graphical abstract: Dependence of ZnAl LDH Morphology on SDS concentration. Highlights: • In-situ intercalation of SDS in ZnAl LDH was achieved via urea hydrolysis method. • Morphology of ZnAl LDH intercalated with SDS depended on the SDS concentration. • Ring like morphology for SDS intercalated ZnAl LDH was obtained for the first time. • Growth mechanism was discussed. • Template assisted growth of ZnAl LDH was proposed.

  18. Conservation laws and stress-energy-momentum tensors for systems with background fields

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gratus, Jonathan, E-mail: j.gratus@lancaster.ac.uk; The Cockcroft Institute, Daresbury Laboratory, Warrington WA4 4AD; Obukhov, Yuri N., E-mail: yo@thp.uni-koeln.de

    2012-10-15

    This article attempts to delineate the roles played by non-dynamical background structures and Killing symmetries in the construction of stress-energy-momentum tensors generated from a diffeomorphism invariant action density. An intrinsic coordinate independent approach puts into perspective a number of spurious arguments that have historically led to the main contenders, viz the Belinfante-Rosenfeld stress-energy-momentum tensor derived from a Noether current and the Einstein-Hilbert stress-energy-momentum tensor derived in the context of Einstein's theory of general relativity. Emphasis is placed on the role played by non-dynamical background (phenomenological) structures that discriminate between properties of these tensors particularly in the context of electrodynamics in media. These tensors are used to construct conservation laws in the presence of Killing Lie-symmetric background fields. - Highlights: • The role of background fields in diffeomorphism invariant actions is demonstrated. • Interrelations between different stress-energy-momentum tensors are emphasised. • The Abraham and Minkowski electromagnetic tensors are discussed in this context. • Conservation laws in the presence of nondynamic background fields are formulated. • The discussion is facilitated by the development of a new variational calculus.

  19. CD147 and AGR2 expression promote cellular proliferation and metastasis of head and neck squamous cell carcinoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweeny, Larissa, E-mail: larissasweeny@gmail.com; Liu, Zhiyong; Bush, Benjamin D.

    2012-08-15

    The signaling pathways facilitating metastasis of head and neck squamous cell carcinoma (HNSCC) cells are not fully understood. CD147 is a transmembrane glycoprotein known to induce cell migration and invasion. AGR2 is a secreted peptide also known to promote cell metastasis. Here we describe their importance in the migration and invasion of HNSCC cells (FADU and OSC-19) in vitro and in vivo. In vitro, knockdown of CD147 or AGR2 decreased cellular proliferation, migration and invasion. In vivo, knockdown of CD147 or AGR2 expression decreased primary tumor growth as well as regional and distant metastasis. -- Highlights: • We investigated AGR2 in head and neck squamous cell carcinoma for the first time. • We explored the relationship between AGR2 and CD147 for the first time. • AGR2 and CD147 appear to co-localize in head and neck squamous cell carcinoma samples. • Knockdown of both AGR2 and CD147 reduced migration and invasion in vitro. • Knockdown of both AGR2 and CD147 decreased metastasis in vivo.

  20. Triple points and phase diagrams in the extended phase space of charged Gauss-Bonnet black holes in AdS space

    NASA Astrophysics Data System (ADS)

    Wei, Shao-Wen; Liu, Yu-Xiao

    2014-08-01

    We study the triple points and phase diagrams in the extended phase space of the charged Gauss-Bonnet black holes in d-dimensional anti-de Sitter space, where the cosmological constant appears as a dynamical pressure of the system and its conjugate quantity is the thermodynamic volume of the black holes. Employing the equation of state T=T(v,P), we demonstrate that the information about the phase transition and the behavior of the Gibbs free energy is potentially encoded in the T-v (T-rh) curve at fixed pressure P. We obtain the phase diagrams for the charged Gauss-Bonnet black holes with different values of the charge Q and dimension d. The result shows that the small/large black hole phase transitions appear for any d, which is reminiscent of the liquid/gas transition of a Van der Waals type. Moreover, the interesting thermodynamic phenomena, i.e., the triple points and the small/intermediate/large black hole phase transitions, are observed for d=6 and Q ∈(0.1705,0.1946).

  1. Multiple reentrant phase transitions and triple points in Lovelock thermodynamics

    NASA Astrophysics Data System (ADS)

    Frassino, Antonia M.; Kubizňák, David; Mann, Robert B.; Simovic, Fil

    2014-09-01

    We investigate the effects of higher curvature corrections from Lovelock gravity on the phase structure of asymptotically AdS black holes, treating the cosmological constant as a thermodynamic pressure. We examine how various thermodynamic phenomena, such as Van der Waals behaviour, reentrant phase transitions (RPT), and tricritical points are manifest for U(1) charged black holes in Gauss-Bonnet and 3rd-order Lovelock gravities. We furthermore observe a new phenomenon of `multiple RPT' behaviour, in which for fixed pressure the small/large/small/large black hole phase transition occurs as the temperature of the system increases. We also find that when the higher-order Lovelock couplings are related in a particular way, a peculiar isolated critical point emerges for hyperbolic black holes and is characterized by non-standard critical exponents.

  2. 40 CFR 458.41 - Specialized definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... AND STANDARDS CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp Process Subcategory... apply to this subpart. (b) The term “product” shall mean carbon black manufactured by the lamp process. ...

  3. 40 CFR 458.41 - Specialized definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp Process... shall apply to this subpart. (b) The term “product” shall mean carbon black manufactured by the lamp...

  4. 40 CFR 458.41 - Specialized definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp Process... shall apply to this subpart. (b) The term “product” shall mean carbon black manufactured by the lamp...

  5. 40 CFR 458.41 - Specialized definitions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp Process... shall apply to this subpart. (b) The term “product” shall mean carbon black manufactured by the lamp...

  6. 40 CFR 458.41 - Specialized definitions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... AND STANDARDS CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp Process Subcategory... apply to this subpart. (b) The term “product” shall mean carbon black manufactured by the lamp process. ...

  7. Systemic Growth of F. graminearum in Wheat Plants and Related Accumulation of Deoxynivalenol

    PubMed Central

    Moretti, Antonio; Panzarini, Giuseppe; Somma, Stefania; Campagna, Claudio; Ravaglia, Stefano; Logrieco, Antonio F.; Solfrizzo, Michele

    2014-01-01

    Fusarium head blight (FHB) is an important disease of wheat worldwide caused mainly by Fusarium graminearum (syn. Gibberella zeae). This fungus can be highly aggressive and can produce several mycotoxins such as deoxynivalenol (DON), a well-known harmful metabolite for humans, animals, and plants. The fungus can overwinter on wheat residues and on the soil, and usually attacks the wheat plant at its point of flowering, infecting the heads and contaminating the kernels at maturity. Contaminated kernels can sometimes be used as seed for the cultivation of the following year. Little is known about the ability of F. graminearum strains occurring on wheat seeds to be transmitted to the plant and to contribute to the final DON contamination of kernels. Therefore, this study had the goals of evaluating: (a) the capability of F. graminearum causing FHB of wheat to be transmitted from the seeds or soil to the kernels at maturity and the progress of the fungus within the plant at different growth stages; (b) the levels of DON contamination in both plant tissues and kernels. The study was carried out for two years in a climatic chamber. The F. graminearum strain selected for the inoculation was followed within the plant by using the vegetative compatibility technique, and quantified by Real-Time PCR. Chemical analyses of DON were carried out by using immunoaffinity cleanup and HPLC/UV/DAD. The study showed that F. graminearum originating from seeds or soil can grow systemically in the plant tissues, with the exception of kernels and heads. There seems to be a barrier that inhibits the colonization of the heads by the fungus. High levels of DON and F. graminearum were found in crowns, stems, and straw, whereas low levels of DON and no detectable levels of F. graminearum were found in both heads and kernels. Finally, in all parts of the plant (heads, crowns, and stems at milk and vitreous ripening stages, and straw at vitreous ripening), the accumulation of significant quantities of DON-3-glucoside (DON-3G), a product of DON glycosylation, was also detected, with decreasing levels in straw, crown, stems and kernels. The presence of DON and DON-3G in heads and kernels without the occurrence of F. graminearum may be explained by their water solubility, which could facilitate their translocation from stem to heads and kernels. The presence of DON-3G at levels 23 times higher than DON in the heads at milk stage without the occurrence of F. graminearum may indicate that an active glycosylation of DON also occurs in the head tissues. Finally, the high levels of DON accumulated in straw are worrisome since they represent additional sources of mycotoxin for livestock. PMID:24727554

  8. Kernel abortion in maize : I. Carbohydrate concentration patterns and Acid invertase activity of maize kernels induced to abort in vitro.

    PubMed

    Hanft, J M; Jones, R J

    1986-06-01

    Kernels cultured in vitro were induced to abort by high temperature (35 degrees C) and by culturing six kernels/cob piece. Aborting kernels failed to enter a linear phase of dry mass accumulation and had a final mass that was less than 6% of nonaborting field-grown kernels. Kernels induced to abort by high temperature failed to synthesize starch in the endosperm and had elevated sucrose concentrations and low fructose and glucose concentrations in the pedicel during early growth compared to nonaborting kernels. Kernels induced to abort by high temperature also had much lower pedicel soluble acid invertase activities than did nonaborting kernels. These results suggest that high temperature during the lag phase of kernel growth may impair the process of sucrose unloading in the pedicel by indirectly inhibiting soluble acid invertase activity and prevent starch synthesis in the endosperm. Kernels induced to abort by culturing six kernels/cob piece had reduced pedicel fructose, glucose, and sucrose concentrations compared to kernels from field-grown ears. These aborting kernels also had a lower pedicel soluble acid invertase activity compared to nonaborting kernels from the same cob piece and from field-grown ears. The low invertase activity in pedicel tissue of the aborting kernels was probably caused by a lack of substrate (sucrose) for the invertase to cleave due to the intense competition for available assimilates. In contrast to kernels cultured at 35 degrees C, aborting kernels from cob pieces containing all six kernels accumulated starch in a linear fashion. These results indicate that kernels cultured six/cob piece abort because of an inadequate supply of sugar and are similar to apical kernels from field-grown ears that often abort prior to the onset of linear growth.

  9. Phases of higher spin black holes: Hawking-Page, transitions between black holes, and a critical point

    NASA Astrophysics Data System (ADS)

    Bañados, Máximo; Düring, Gustavo; Faraggi, Alberto; Reyes, Ignacio A.

    2017-08-01

    We study the thermodynamic phase diagram of three-dimensional s l (N ;R ) higher spin black holes. By analyzing the semiclassical partition function we uncover a rich structure that includes Hawking-Page transitions to the AdS3 vacuum, first order phase transitions among black hole states, and a second order critical point. Our analysis is explicit for N =4 but we extrapolate some of our conclusions to arbitrary N . In particular, we argue that even N is stable in the ensemble under consideration but odd N is not.

  10. Financial Impact of Breast Cancer in Black Versus White Women.

    PubMed

    Wheeler, Stephanie B; Spencer, Jennifer C; Pinheiro, Laura C; Carey, Lisa A; Olshan, Andrew F; Reeder-Hayes, Katherine E

    2018-04-18

    Purpose: Racial variation in the financial impact of cancer may contribute to observed differences in the use of guideline-recommended treatments. We describe racial differences with regard to the financial impact of breast cancer in a large population-based prospective cohort study. Methods: The Carolina Breast Cancer Study oversampled black women and women younger than age 50 years with incident breast cancer in North Carolina from 2008 to 2013. Participants provided medical records and data regarding demographics, socioeconomic status, and financial impact of cancer at 5 and 25 months postdiagnosis. We report unadjusted and adjusted financial impact at 25 months postdiagnosis by race. Results: The sample included 2,494 women who completed follow-up surveys (49% black, 51% white). Since diagnosis, 58% of black women reported any adverse financial impact of cancer (vs 39% of white women; P < .001). In models adjusted for age, stage at diagnosis, and treatment received, black women were more likely to report adverse financial impact attributable to cancer (adjusted risk difference [aRD], +14 percentage points; P < .001), including income loss (aRD, +10 percentage points; P < .001), health care-related financial barriers (aRD, +10 percentage points; P < .001), health care-related transportation barriers (aRD, +10 percentage points; P < .001), job loss (aRD, 6 percentage points; P < .001), and loss of health insurance (aRD, +3 percentage points; P < .001). The effect of race was attenuated when socioeconomic factors were included but remained significant for job loss, transportation barriers, income loss, and overall financial impact. Conclusion: Compared with white women, black women with breast cancer experience a significantly worse financial impact. Disproportionate financial strain may contribute to higher stress, lower treatment compliance, and worse outcomes by race. Policies that help to limit the effect of cancer-related financial strain are needed.

  11. 7 CFR 810.602 - Definition of other terms.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Damaged kernels. Kernels and pieces of flaxseed kernels that are badly ground-damaged, badly weather... instructions. Also, underdeveloped, shriveled, and small pieces of flaxseed kernels removed in properly... recleaning. (c) Heat-damaged kernels. Kernels and pieces of flaxseed kernels that are materially discolored...

  12. Kernel Abortion in Maize 1

    PubMed Central

    Hanft, Jonathan M.; Jones, Robert J.

    1986-01-01

    Kernels cultured in vitro were induced to abort by high temperature (35°C) and by culturing six kernels/cob piece. Aborting kernels failed to enter a linear phase of dry mass accumulation and had a final mass that was less than 6% of nonaborting field-grown kernels. Kernels induced to abort by high temperature failed to synthesize starch in the endosperm and had elevated sucrose concentrations and low fructose and glucose concentrations in the pedicel during early growth compared to nonaborting kernels. Kernels induced to abort by high temperature also had much lower pedicel soluble acid invertase activities than did nonaborting kernels. These results suggest that high temperature during the lag phase of kernel growth may impair the process of sucrose unloading in the pedicel by indirectly inhibiting soluble acid invertase activity and prevent starch synthesis in the endosperm. Kernels induced to abort by culturing six kernels/cob piece had reduced pedicel fructose, glucose, and sucrose concentrations compared to kernels from field-grown ears. These aborting kernels also had a lower pedicel soluble acid invertase activity compared to nonaborting kernels from the same cob piece and from field-grown ears. The low invertase activity in pedicel tissue of the aborting kernels was probably caused by a lack of substrate (sucrose) for the invertase to cleave due to the intense competition for available assimilates. In contrast to kernels cultured at 35°C, aborting kernels from cob pieces containing all six kernels accumulated starch in a linear fashion. These results indicate that kernels cultured six/cob piece abort because of an inadequate supply of sugar and are similar to apical kernels from field-grown ears that often abort prior to the onset of linear growth. PMID:16664846

  13. Entanglement Entropy of Black Holes.

    PubMed

    Solodukhin, Sergey N

    2011-01-01

    The entanglement entropy is a fundamental quantity, which characterizes the correlations between sub-systems in a larger quantum-mechanical system. For two sub-systems separated by a surface the entanglement entropy is proportional to the area of the surface and depends on the UV cutoff, which regulates the short-distance correlations. The geometrical nature of entanglement-entropy calculation is particularly intriguing when applied to black holes when the entangling surface is the black-hole horizon. I review a variety of aspects of this calculation: the useful mathematical tools such as the geometry of spaces with conical singularities and the heat kernel method, the UV divergences in the entropy and their renormalization, the logarithmic terms in the entanglement entropy in four and six dimensions and their relation to the conformal anomalies. The focus in the review is on the systematic use of the conical singularity method. The relations to other known approaches such as 't Hooft's brick-wall model and the Euclidean path integral in the optical metric are discussed in detail. The puzzling behavior of the entanglement entropy due to fields, which non-minimally couple to gravity, is emphasized. The holographic description of the entanglement entropy of the black-hole horizon is illustrated on the two- and four-dimensional examples. Finally, I examine the possibility to interpret the Bekenstein-Hawking entropy entirely as the entanglement entropy.
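
    For orientation, the two area laws the review relates can be written schematically as follows; these are standard textbook forms (the Bekenstein-Hawking entropy, and the leading UV behaviour of the entanglement entropy across a surface of area A with short-distance cutoff epsilon in four dimensions), not equations quoted from the review.

        S_{\mathrm{BH}} = \frac{k_{B}\, c^{3} A}{4 G \hbar} = \frac{A}{4\,\ell_{P}^{2}},
        \qquad
        S_{\mathrm{ent}} \sim \frac{A}{\epsilon^{2}} + \dots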

  14. Entanglement Entropy of Black Holes

    NASA Astrophysics Data System (ADS)

    Solodukhin, Sergey N.

    2011-10-01

    The entanglement entropy is a fundamental quantity, which characterizes the correlations between sub-systems in a larger quantum-mechanical system. For two sub-systems separated by a surface the entanglement entropy is proportional to the area of the surface and depends on the UV cutoff, which regulates the short-distance correlations. The geometrical nature of entanglement-entropy calculation is particularly intriguing when applied to black holes when the entangling surface is the black-hole horizon. I review a variety of aspects of this calculation: the useful mathematical tools such as the geometry of spaces with conical singularities and the heat kernel method, the UV divergences in the entropy and their renormalization, the logarithmic terms in the entanglement entropy in four and six dimensions and their relation to the conformal anomalies. The focus in the review is on the systematic use of the conical singularity method. The relations to other known approaches such as 't Hooft's brick-wall model and the Euclidean path integral in the optical metric are discussed in detail. The puzzling behavior of the entanglement entropy due to fields, which non-minimally couple to gravity, is emphasized. The holographic description of the entanglement entropy of the black-hole horizon is illustrated on the two- and four-dimensional examples. Finally, I examine the possibility to interpret the Bekenstein-Hawking entropy entirely as the entanglement entropy.

  15. Antioxidant potential of Juglans nigra, black walnut, husks extracted using supercritical carbon dioxide with an ethanol modifier.

    PubMed

    Wenzel, Jonathan; Storer Samaniego, Cheryl; Wang, Lihua; Burrows, Laron; Tucker, Evan; Dwarshuis, Nathan; Ammerman, Michelle; Zand, Ali

    2017-03-01

    The black walnut, Juglans nigra, is indigenous to eastern North America, and abscission of its fruit occurs around October. The fruit consists of a husk, a hard shell, and a kernel. The husk is commonly discarded in processing, though it contains phenolic compounds that exhibit antioxidant and antimicrobial properties. For this study, black walnut husks were extracted using supercritical carbon dioxide with an ethanol modifier. The effects of temperature, ethanol concentration, and drying of walnut husks prior to extraction upon antioxidant potential were evaluated using a factorial design of experiments. The solvent density was held constant at 0.75 g/mL. The optimal extraction conditions were found to be 68°C and 20 wt-% ethanol in supercritical carbon dioxide. At these conditions, the antioxidant potential as measured by the ferric reducing ability of plasma (FRAP) assay was 0.027 mmol trolox equivalent/g (mmol TE/g) for dried walnut husk and 0.054 mmol TE/g for walnut husks that were not dried. Antioxidant potential was also evaluated using the total phenolic content (TPC) and 1,1-diphenyl-2-picryl-hydrazyl (DPPH) assays, and the FRAP assay was found to correlate linearly with the TPC assay.

  16. 7 CFR 810.1202 - Definition of other terms.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... kernels. Kernels, pieces of rye kernels, and other grains that are badly ground-damaged, badly weather.... Also, underdeveloped, shriveled, and small pieces of rye kernels removed in properly separating the...-damaged kernels. Kernels, pieces of rye kernels, and other grains that are materially discolored and...

  17. Selective document image data compression technique

    DOEpatents

    Fu, C.Y.; Petrich, L.I.

    1998-05-19

    A method of storing information from filled-in form-documents comprises extracting the unique user information in the foreground from the document form information in the background. The contrast of the pixels is enhanced by a gamma correction on an image array, and then the color value of each pixel is enhanced. The color pixels lying on edges of an image are converted to black and an adjacent pixel is converted to white. The distance between black pixels and other pixels in the array is determined, and a filled-edge array of pixels is created. User information is then converted to a two-color format by creating a first two-color image of the scanned image by converting all pixels darker than a threshold color value to black and all pixels lighter than the threshold color value to white. Then a second two-color image of the filled-edge file is generated by converting all pixels darker than a second threshold value to black and all pixels lighter than the second threshold color value to white. The first two-color image and the second two-color image are then combined and filtered to smooth the edges of the image. The image may be compressed with a unique Huffman coding table for that image. The image file is also decimated to create a decimated-image file which can later be interpolated back to produce a reconstructed image file using a bilinear interpolation kernel. 10 figs.
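
    Two of the steps described above, the two-colour thresholding and the decimation with bilinear reconstruction, can be sketched as follows. The edge detection, image combination, and Huffman coding stages are omitted, and the function names and fixed decimation factor are assumptions for this example rather than the patented implementation.

        # Sketch: threshold to a two-colour image, decimate, and reconstruct by bilinear interpolation.
        import numpy as np

        def to_two_colour(image, threshold):
            """Pixels darker than `threshold` become black (0); all others become white (255)."""
            return np.where(image < threshold, 0, 255).astype(np.uint8)

        def decimate(image, factor=2):
            return image[::factor, ::factor]

        def bilinear_upsample(image, factor=2):
            """Interpolate a decimated image back to a larger grid with a bilinear kernel."""
            rows, cols = image.shape
            y = np.linspace(0, rows - 1, rows * factor)
            x = np.linspace(0, cols - 1, cols * factor)
            y0, x0 = np.floor(y).astype(int), np.floor(x).astype(int)
            y1, x1 = np.minimum(y0 + 1, rows - 1), np.minimum(x0 + 1, cols - 1)
            wy, wx = (y - y0)[:, None], (x - x0)[None, :]
            img = image.astype(float)
            top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
            bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
            return top * (1 - wy) + bot * wy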

  18. Selective document image data compression technique

    DOEpatents

    Fu, Chi-Yung; Petrich, Loren I.

    1998-01-01

    A method of storing information from filled-in form-documents comprises extracting the unique user information in the foreground from the document form information in the background. The contrast of the pixels is enhanced by a gamma correction on an image array, and then the color value of each pixel is enhanced. The color pixels lying on edges of an image are converted to black and an adjacent pixel is converted to white. The distance between black pixels and other pixels in the array is determined, and a filled-edge array of pixels is created. User information is then converted to a two-color format by creating a first two-color image of the scanned image by converting all pixels darker than a threshold color value to black and all pixels lighter than the threshold color value to white. Then a second two-color image of the filled-edge file is generated by converting all pixels darker than a second threshold value to black and all pixels lighter than the second threshold color value to white. The first two-color image and the second two-color image are then combined and filtered to smooth the edges of the image. The image may be compressed with a unique Huffman coding table for that image. The image file is also decimated to create a decimated-image file which can later be interpolated back to produce a reconstructed image file using a bilinear interpolation kernel.

  19. Kernel Ada Programming Support Environment (KAPSE) Interface Team: Public Report. Volume II.

    DTIC Science & Technology

    1982-10-28

    essential I parameters from our work so far in this area and, using trade-offs concerning these, construct the KIT’s recommended alternative. 1145...environment that are also in the development states. At this point in development it is essential for the KITEC to provide a forum and act as a focal...standardization in this area. Moreover, this is an area with considerable divergence in proposed approaches. On the other hand, an essential tool from the point of

  20. The Genetic Basis of Natural Variation in Kernel Size and Related Traits Using a Four-Way Cross Population in Maize.

    PubMed

    Chen, Jiafa; Zhang, Luyan; Liu, Songtao; Li, Zhimin; Huang, Rongrong; Li, Yongming; Cheng, Hongliang; Li, Xiantang; Zhou, Bo; Wu, Suowei; Chen, Wei; Wu, Jianyu; Ding, Junqiang

    2016-01-01

    Kernel size is an important component of grain yield in maize breeding programs. To extend the understanding of the genetic basis of kernel size traits (i.e., kernel length, kernel width and kernel thickness), we developed a four-way cross mapping population derived from four maize inbred lines with varied kernel sizes. In the present study, we investigated the genetic basis of natural variation in seed size and other components of maize yield (e.g., hundred kernel weight, number of rows per ear, number of kernels per row). In total, ten QTL affecting kernel size were identified, three of which (two for kernel length and one for kernel width) had stable expression in other components of maize yield. The possible genetic mechanism behind the trade-off of kernel size and yield components was discussed.

  1. The Genetic Basis of Natural Variation in Kernel Size and Related Traits Using a Four-Way Cross Population in Maize

    PubMed Central

    Liu, Songtao; Li, Zhimin; Huang, Rongrong; Li, Yongming; Cheng, Hongliang; Li, Xiantang; Zhou, Bo; Wu, Suowei; Chen, Wei; Wu, Jianyu; Ding, Junqiang

    2016-01-01

    Kernel size is an important component of grain yield in maize breeding programs. To extend the understanding of the genetic basis of kernel size traits (i.e., kernel length, kernel width and kernel thickness), we developed a four-way cross mapping population derived from four maize inbred lines with varied kernel sizes. In the present study, we investigated the genetic basis of natural variation in seed size and other components of maize yield (e.g., hundred kernel weight, number of rows per ear, number of kernels per row). In total, ten QTL affecting kernel size were identified, three of which (two for kernel length and one for kernel width) had stable expression in other components of maize yield. The possible genetic mechanism behind the trade-off of kernel size and yield components was discussed. PMID:27070143

  2. Intestinal stem cells in the adult Drosophila midgut

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Huaqi, E-mail: Huaqi.Jiang@UTSouthwestern.edu; Edgar, Bruce A., E-mail: b.edgar@dkfz.de; Division of Basic Sciences, Fred Hutchinson Cancer Research Center, 1100 Fairview Ave. N., Seattle, WA 98109

    Drosophila has long been an excellent model organism for studying stem cell biology. Notably, studies of Drosophila's germline stem cells have been instrumental in developing the stem cell niche concept. The recent discovery of somatic stem cells in adult Drosophila, particularly the intestinal stem cells (ISCs) of the midgut, has established Drosophila as an exciting model to study stem cell-mediated adult tissue homeostasis and regeneration. Here, we review the major signaling pathways that regulate the self-renewal, proliferation and differentiation of Drosophila ISCs, discussing how this regulation maintains midgut homeostasis and mediates regeneration of the intestinal epithelium after injury. -- Highlights: ▶ The homeostasis and regeneration of adult fly midguts are mediated by ISCs. ▶ Damaged enterocytes induce the proliferation of intestinal stem cells (ISC). ▶ EGFR and Jak/Stat signalings mediate compensatory ISC proliferation. ▶ Notch signaling regulates ISC self-renewal and differentiation.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corvellec, Herve, E-mail: herve.corvellec@ism.lu.se; Bramryd, Torleif

    Highlights: ▶ Swedish municipally owned waste management companies are active on political, material, technical, and commercial markets. ▶ These markets differ in kind and their demands follow different logics. ▶ These markets affect the public service, processing, and marketing of Swedish waste management. ▶ Articulating these markets is a strategic challenge for Swedish municipally owned waste management. - Abstract: This paper describes how the business model of two leading Swedish municipally owned solid waste management companies exposes them to four different but related markets: a political market in which their legitimacy as an organization is determined; a waste-as-material market that determines their access to waste as a process input; a technical market in which these companies choose what waste processing technique to use; and a commercial market in which they market their products. Each of these markets has a logic of its own. Managing these logics and articulating the interrelationships between these markets is a key strategic challenge for these companies.

  4. Extracellular ATP inhibits Schwann cell dedifferentiation and proliferation in an ex vivo model of Wallerian degeneration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shin, Youn Ho; Lee, Seo Jin; Jung, Junyang, E-mail: jjung@khu.ac.kr

    Highlights: ▶ ATP-treated sciatic explants shows the decreased expression of p75NGFR. ▶ Extracellular ATP inhibits the expression of phospho-ERK1/2. ▶ Lysosomal exocytosis is involved in Schwann cell dedifferentiation. ▶ Extracellular ATP blocks Schwann cell proliferation in sciatic explants. -- Abstract: After nerve injury, Schwann cells proliferate and revert to a phenotype that supports nerve regeneration. This phenotype-changing process can be viewed as Schwann cell dedifferentiation. Here, we investigated the role of extracellular ATP in Schwann cell dedifferentiation and proliferation during Wallerian degeneration. Using several markers of Schwann cell dedifferentiation and proliferation in sciatic explants, we found that extracellular ATP inhibits Schwann cell dedifferentiation and proliferation during Wallerian degeneration. Furthermore, the blockage of lysosomal exocytosis in ATP-treated sciatic explants is sufficient to induce Schwann cell dedifferentiation. Together, these findings suggest that ATP-induced lysosomal exocytosis may be involved in Schwann cell dedifferentiation.

  5. Promoting crystallisation of the Salmonella enteritidis fimbriae 14 pilin SefD using deuterium oxide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Bing; Garnett, James A.; Lee, Wei-chao

    Highlights: ▶ The benefits of D₂O in screening for crystallisation was explored. ▶ The crystal structures of the SefD pilin in both H₂O and D₂O reveal differences. ▶ Crystallisation improvements are explained by altered interactions in D₂O crystals. ▶ D₂O is useful additive in sparse-matrix screening for crystallisation. -- Abstract: The use of heavy water (D₂O) as a solvent is commonplace in many spectroscopic techniques for the study of biological macromolecules. A significant deuterium isotope effect exists where hydrogen-bonding is important, such as in protein stability, dynamics and assembly. Here we illustrate the use of D₂O in additive screening for the production of reproducible diffraction-quality crystals for the Salmonella enteritidis fimbriae 14 (SEF14) putative tip adhesin, SefD.

  6. IMPLEMENTATION OF THE SMOKE EMISSION DATA PROCESSOR AND SMOKE TOOL INPUT DATA PROCESSOR IN MODELS-3

    EPA Science Inventory

    The U.S. Environmental Protection Agency has implemented Version 1.3 of SMOKE (Sparse Matrix Object Kernel Emission) processor for preparation of area, mobile, point, and biogenic sources emission data within Version 4.1 of the Models-3 air quality modeling framework. The SMOK...

  7. 40 CFR 458.45 - Standards of performance for new sources.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) EFFLUENT GUIDELINES AND STANDARDS CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp... paragraph, which may be discharged from the carbon black lamp process by a new source subject to the...

  8. [Spatial analysis of road traffic accidents with fatalities in Spain, 2008-2011].

    PubMed

    Gómez-Barroso, Diana; López-Cuadrado, Teresa; Llácer, Alicia; Palmera Suárez, Rocío; Fernández-Cuenca, Rafael

    2015-09-01

    To estimate the areas of greatest density of road traffic accidents with fatalities within 24 hours, per km² per year, in Spain from 2008 to 2011, using a geographic information system. Accidents were geocoded using the road and kilometer points where they occurred. The average nearest neighbor distance was calculated to detect possible clusters and to obtain the bandwidth for kernel density estimation. A total of 4,775 accidents were analyzed, of which 73.3% occurred on conventional roads. The estimated average distance between accidents was 1,242 meters, and the average expected distance was 10,738 meters. The nearest neighbor index was 0.11, indicating that there were spatial aggregations of accidents. A kernel density map was obtained with a resolution of 1 km², identifying the areas of highest density. This methodology allowed a better approximation to the location of accident risk by taking kilometer points into account. The map shows areas with a greater density of accidents, which could aid decision-making by the relevant authorities. Copyright © 2014 SESPAS. Published by Elsevier España. All rights reserved.
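
    For readers who want to reproduce the general approach, the following is a minimal NumPy/SciPy sketch of the nearest-neighbour and kernel density steps described above. It works on projected point coordinates rather than the GIS layers actually used in the study, and the bandwidth choice (the mean observed nearest-neighbour distance) is an assumption made here for illustration.

      import numpy as np
      from scipy.spatial import cKDTree

      def accident_density(xy, cell=1000.0):
          """Nearest-neighbour index and Gaussian kernel density on a regular grid.

          xy   : (n, 2) array of projected accident coordinates in metres
          cell : output resolution in metres (1 km cells, as in the study)
          """
          # Mean observed nearest-neighbour distance (used here as the bandwidth).
          tree = cKDTree(xy)
          d, _ = tree.query(xy, k=2)            # k=2: the first hit is the point itself
          bandwidth = d[:, 1].mean()

          # Nearest-neighbour index: observed / expected mean distance for a random
          # pattern over the bounding-box area (values well below 1 indicate clustering).
          area = np.ptp(xy[:, 0]) * np.ptp(xy[:, 1])
          expected = 0.5 * np.sqrt(area / len(xy))
          nn_index = bandwidth / expected

          # Gaussian kernel density evaluated on the grid (fine for a few thousand points).
          xs = np.arange(xy[:, 0].min(), xy[:, 0].max(), cell)
          ys = np.arange(xy[:, 1].min(), xy[:, 1].max(), cell)
          gx, gy = np.meshgrid(xs, ys)
          grid = np.column_stack([gx.ravel(), gy.ravel()])
          d2 = ((grid[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
          density = np.exp(-0.5 * d2 / bandwidth**2).sum(1) / (2 * np.pi * bandwidth**2)
          return nn_index, density.reshape(gx.shape)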

  9. GRAYSKY-A new gamma-ray skyshine code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Witts, D.J.; Twardowski, T.; Watmough, M.H.

    1993-01-01

    This paper describes a new prototype gamma-ray skyshine code GRAYSKY (Gamma-RAY SKYshine) that has been developed at BNFL, as part of an industrially based master of science course, to overcome the problems encountered with SKYSHINEII and RANKERN. GRAYSKY is a point kernel code based on the use of a skyshine response function. The scattering within source or shield materials is accounted for by the use of buildup factors. This is an approximate method of solution but one that has been shown to produce results that are acceptable for dose rate predictions on operating plants. The novel features of GRAYSKY are as follows: 1. The code is fully integrated with a semianalytical point kernel shielding code, currently under development at BNFL, which offers powerful solid-body modeling capabilities. 2. The geometry modeling also allows the skyshine response function to be used in a manner that accounts for the shielding of air-scattered radiation. 3. Skyshine buildup factors calculated using the skyshine response function have been used as well as dose buildup factors.
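
    As background for the point-kernel terminology used in this record, the sketch below shows the generic point-kernel estimate of photon flux with a buildup factor. It is not GRAYSKY's skyshine response function, and the buildup coefficients in the example are invented purely for illustration.

      import numpy as np

      def point_kernel_flux(S, mu, r, buildup):
          """Uncollided flux from an isotropic point source, scaled by a buildup factor.

          S       : source strength (photons/s)
          mu      : linear attenuation coefficient of the medium (1/cm)
          r       : source-to-detector distance (cm)
          buildup : callable B(mu*r) approximating the contribution of scattered photons
          """
          mfp = mu * r                                   # optical thickness in mean free paths
          uncollided = S * np.exp(-mfp) / (4.0 * np.pi * r ** 2)
          return buildup(mfp) * uncollided

      # Taylor-form buildup factor with invented coefficients (illustration only).
      taylor = lambda x: 1.5 * np.exp(0.05 * x) - 0.5 * np.exp(-0.10 * x)
      flux = point_kernel_flux(S=1.0e10, mu=0.06, r=500.0, buildup=taylor)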

  10. Evaluation of the black light test for screening aflatoxin-contaminated maize in the Brazilian food industry.

    PubMed

    Gloria, E M; Fonseca, H; Calori-Domingues, M A; Souza, I M

    1998-01-01

    The results of the black light test for aflatoxin-contaminated maize carried out in a large food factory in the State of São Paulo were evaluated against bi-directional thin layer chromatography (TLC) analysis for 286 samples of maize. All 286 samples were accepted by the black light test (< 7 fluorescent points); however, the results of the TLC analysis showed that 96 samples were contaminated and 14 showed aflatoxin B1 contamination levels higher than 20 micrograms/kg. There were 14 false negative results and no false positives, and of the 14 samples, six did not show visible fluorescent points. If the rejection criterion of one or more fluorescent points were applied, the six samples would still be accepted by the black light test. But, in this case, 95 samples would be rejected and 87 results would be false positives because they did not have contamination levels over 20 micrograms/kg, which is the acceptance limit of the black light test. The results indicate that the black light test, as utilized by this factory, was not able to indicate lots with possible contamination, and the black light test, as recommended in the literature, would produce a high number of false positives. Further studies are needed on the use of black light as a screening test for possible aflatoxin B1-contaminated maize.

  11. 7 CFR 810.802 - Definition of other terms.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) Damaged kernels. Kernels and pieces of grain kernels for which standards have been established under the.... (d) Heat-damaged kernels. Kernels and pieces of grain kernels for which standards have been...

  12. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... purposes of determining inedible kernels, pieces, or particles of almond kernels. [59 FR 39419, Aug. 3...

  13. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... purposes of determining inedible kernels, pieces, or particles of almond kernels. [59 FR 39419, Aug. 3...

  14. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... purposes of determining inedible kernels, pieces, or particles of almond kernels. [59 FR 39419, Aug. 3...

  15. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... purposes of determining inedible kernels, pieces, or particles of almond kernels. [59 FR 39419, Aug. 3...

  16. The Vanishing Black Family: Myth or Reality?

    ERIC Educational Resources Information Center

    Banks, Ivan W.

    Recent media reports have characterized the black family as vanishing and being replaced by institutionalized chaos, indifference, and "benign neglect." Journalists point to the high rate of teen pregnancies and the increased proportions of black families headed by single women, and describe black men as having little regard for the role of…

  17. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach.

    PubMed

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-06-19

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Being different from the existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of base kernels are regarded as external parameters of single-hidden layer feedforward neural networks (SLFNs). The combination coefficients of base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is also compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results have demonstrated that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision, but also in efficiency for gas classification.
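
    To make the composite-kernel idea concrete, here is a minimal Python sketch of a kernel ELM whose Gram matrix is a weighted sum of base kernels. In the paper the weights, kernel parameters and regularization constant are tuned by QPSO, whereas in this sketch they are simply fixed inputs (an assumption), and only two base kernels are shown.

      import numpy as np

      def rbf(X, Y, gamma=0.5):
          d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      def poly(X, Y, degree=2, c=1.0):
          return (X @ Y.T + c) ** degree

      class CompositeKernelELM:
          """Kernel ELM with a weighted sum of base kernels (weights fixed, not QPSO-tuned)."""

          def __init__(self, kernels, weights, C=10.0):
              self.kernels, self.weights, self.C = kernels, np.asarray(weights), C

          def _gram(self, X, Y):
              return sum(w * k(X, Y) for w, k in zip(self.weights, self.kernels))

          def fit(self, X, T):
              # Closed-form KELM solution: beta = (K + I/C)^-1 T, with T one-hot targets.
              self.X = X
              self.beta = np.linalg.solve(self._gram(X, X) + np.eye(len(X)) / self.C, T)
              return self

          def predict(self, X_new):
              return self._gram(X_new, self.X) @ self.beta

      # Usage sketch: clf = CompositeKernelELM([rbf, poly], weights=[0.7, 0.3]).fit(X_train, T_onehot)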

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Guofeng; Xu, Jingren; Li, Zengchun, E-mail: lizc.2007@yahoo.com.cn

    Highlights: ▶ RAGE overexpression suppresses cell proliferation in MC3T3-E1 cells. ▶ RAGE overexpression decreases Wnt/β-catenin signaling. ▶ RAGE overexpression decreases ERK and PI3K signaling. ▶ Inhibition of Wnt signaling abolishes PI3K signaling restored by RAGE blockade. ▶ Inhibition of Wnt signaling abolishes ERK signaling restored by RAGE blockade. -- Abstract: Expression of receptor for advanced glycation end products (RAGE) plays a crucial role in bone metabolism. However, the role of RAGE in the control of osteoblast proliferation is not yet evaluated. In the present study, we demonstrate that RAGE overexpression inhibits osteoblast proliferation in vitro. The negative regulation of RAGE on cell proliferation results from suppression of Wnt, PI3K and ERK signaling, and is restored by RAGE neutralizing antibody. Prevention of Wnt signaling using Sfrp1 or DKK1 rescues RAGE-decreased PI3K and ERK signaling and cell proliferation, indicating that the altered cell growth in RAGE overexpressing cells is in part secondary to alterations in Wnt signaling. Consistently, RAGE overexpression inhibits the expression of Wnt targets cyclin D1 and c-myc, which is partially reversed by RAGE blockade. Overall, these results suggest that RAGE inhibits osteoblast proliferation via suppression of Wnt, PI3K and ERK signaling, which provides novel mechanisms by which RAGE regulates osteoblast growth.

  19. Formulating a VET roadmap for the waste and recycling sector: A case study from Queensland, Australia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, G., E-mail: gudavis@cytanet.com.cy

    2012-10-15

    Highlights: ▶ Existing qualifications do not meet the needs of the sector in Queensland. ▶ Businesses may not be best positioned to identify training needs. ▶ Companies are developing training internally to meet their own specific needs. ▶ Smaller companies that lack the resources to develop internal training are disadvantaged. ▶ There is industry support for an entry-level, minimum industry qualification. - Abstract: Vocational Education and Training (VET) is an essential tool for providing waste management and recycling workers with the necessary skills and knowledge needed to beneficially influence their own employment and career development, and to ensure productivity and safe working conditions within the organisations in which they are employed. Current training opportunities within Queensland for the sector are limited and not widely communicated or marketed, with other States, particularly Victoria and New South Wales, realising higher numbers of VET enrolments for waste management courses. This paper presents current VET opportunities and trends for the Queensland waste management sector. Results from a facilitated workshop to identify workforce requirements and future training needs, organised by the Waste Contractors and Recyclers Association of Queensland (WCRAQ), are also presented, and discussion follows on the future training needs of the industry within Queensland.

  20. Crystal structure of Helicobacter pylori neutrophil-activating protein with a di-nuclear ferroxidase center in a zinc or cadmium-bound form

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yokoyama, Hideshi, E-mail: h-yokoya@u-shizuoka-ken.ac.jp; Tsuruta, Osamu; Akao, Naoya

    2012-06-15

    Highlights: ▶ Structures of a metal-bound Helicobacter pylori neutrophil-activating protein were determined. ▶ Two zinc ions were tetrahedrally coordinated by ferroxidase center (FOC) residues. ▶ Two cadmium ions were coordinated in a trigonal-bipyramidal and octahedral manner. ▶ The second metal ion was more weakly coordinated than the first at the FOC. ▶ A zinc ion was found in one negatively-charged pore suitable as an ion path. -- Abstract: Helicobacter pylori neutrophil-activating protein (HP-NAP) is a Dps-like iron storage protein forming a dodecameric shell, and promotes adhesion of neutrophils to endothelial cells. The crystal structure of HP-NAP in a Zn²⁺- or Cd²⁺-bound form reveals the binding of two zinc or two cadmium ions and their bridged water molecule at the ferroxidase center (FOC). The two zinc ions are coordinated in a tetrahedral manner to the conserved residues among HP-NAP and Dps proteins. The two cadmium ions are coordinated in a trigonal-bipyramidal and distorted octahedral manner. In both structures, the second ion is more weakly coordinated than the first. Another zinc ion is found inside of the negatively-charged threefold-related pore, which is suitable for metal ions to pass through.

  1. Reversible hydration and aqueous exfoliation of the acetate-intercalated layered double hydroxide of Ni and Al: Observation of an ordered interstratified phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manohara, G.V.; Vishnu Kamath, P., E-mail: vishnukamath8@hotmail.com; Milius, Wolfgang

    2012-12-15

    Acetate-intercalated layered double hydroxides (LDHs) of Ni and Al undergo reversible hydration in the solid state in response to the ambient humidity. The LDH with a high layer charge (0.33/formula unit) undergoes facile hydration in a single step, whereas the LDH with a lower layer charge (0.24/formula unit) exhibits an ordered interstratified intermediate, comprising the hydrated and dehydrated layers stacked alternatively. This phase, also known as the staged S-2 phase, coexists with the end members suggesting the existence of a solution-type equilibrium between the S-2 phase and the end members of the hydration cycle. These LDHs also undergo facile aqueous exfoliation into 2-5 nm-thick tactoids with a radial dimension of 0.2-0.5 μm. - Graphical abstract: Schematic of the hydrated, dehydrated and interstratified phases observed during the hydration-dehydration of Ni/Al-CH₃COO LDH. Highlights: ▶ Ni/Al-acetate LDHs were synthesized by HPFS method by hydrolysis of acetamide. ▶ Intercalated acetate ion shows reversible hydration with variation in humidity. ▶ An ordered interstratified phase was observed during hydration/dehydration cycle. ▶ A solution type equilibrium is observed between hydration-dehydration phases. ▶ These LDHs undergo facile aqueous exfoliation.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Horie, Masafumi; Saito, Akira, E-mail: asaitou-tky@umin.ac.jp; Mikami, Yu

    Highlights: ▶ We established three patient-paired sets of CAFs and NFs. ▶ CAFs and NFs were analyzed using three-dimensional co-culture experiments. ▶ CAFs clearly enhanced collagen gel contraction. ▶ CAFs showed higher α-SMA expression than NFs. ▶ CAFs were implicated in invasion and differentiation of lung cancer cells. -- Abstract: Lung cancer is the most common cause of cancer-related death worldwide. Stromal cancer-associated fibroblasts (CAFs) play crucial roles in carcinogenesis, proliferation, invasion, and metastasis of non-small cell lung carcinoma, and targeting of CAFs could be a novel strategy for cancer treatment. However, the characteristics of human CAFs still remain to be better defined. In this study, we established patient-matched CAFs and normal fibroblasts (NFs), from tumoral and non-tumoral portions of resected lung tissue from lung cancer patients. CAFs showed higher α-smooth muscle actin (α-SMA) expression than NFs, and CAFs clearly enhanced collagen gel contraction. Furthermore, we employed three-dimensional co-culture assay with A549 lung cancer cells, where CAFs were more potent in inducing collagen gel contraction. Hematoxylin and eosin staining of co-cultured collagen gel revealed that CAFs had the potential to increase invasion of A549 cells compared to NFs. These observations provide evidence that lung CAFs have the tumor-promoting capacity distinct from NFs.

  3. A critical review of seven selected neighborhood sustainability assessment tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharifi, Ayyoob, E-mail: sharifi.ayyoob@a.mbox.nagoya-u.ac.jp; Murayama, Akito, E-mail: murayama@corot.nuac.nagoya-u.ac.jp

    2013-01-15

    Neighborhood sustainability assessment tools have become widespread since the turn of 21st century and many communities, mainly in the developed world, are utilizing these tools to measure their success in approaching sustainable development goals. In this study, seven tools from Australia, Europe, Japan, and the United States are selected and analyzed with the aim of providing insights into the current situations; highlighting the strengths, weaknesses, successes, and failures; and making recommendations for future improvements. Using a content analysis, the issues of sustainability coverage, pre-requisites, local adaptability, scoring and weighting, participation, reporting, and applicability are discussed in this paper. The results of this study indicate that most of the tools are not doing well regarding the coverage of social, economic, and institutional aspects of sustainability; there are ambiguities and shortcomings in the weighting, scoring, and rating; in most cases, there is no mechanism for local adaptability and participation; and only those tools which are embedded within the broader planning framework are doing well with regard to applicability. - Highlights: ▶ Seven widely used assessment tools were analyzed. ▶ There is a lack of balanced assessment of sustainability dimensions. ▶ Tools are not doing well regarding the applicability. ▶ Refinements are needed to make the tools more effective. ▶ Assessment tools must be integrated into the planning process.

  4. Refuse derived soluble bio-organics enhancing tomato plant growth and productivity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sortino, Orazio; Dipasquale, Mauro; Montoneri, Enzo, E-mail: enzo.montoneri@unito.it

    2012-10-15

    Highlights: ▶ Municipal bio-wastes are a sustainable source of bio-based products. ▶ Refuse derived soluble bio-organics promote chlorophyll synthesis. ▶ Refuse derived soluble bio-organics enhance plant growth and fruit ripening rate. ▶ Sustainable chemistry exploiting urban refuse allows sustainable development. ▶ Chemistry, agriculture and the environment benefit from biowaste technology. - Abstract: Municipal bio-refuse (CVD), containing kitchen wastes, home gardening residues and public park trimmings, was treated with alkali to yield a soluble bio-organic fraction (SBO) and an insoluble residue. These materials were characterized using elemental analysis, potentiometric titration, and 13C NMR spectroscopy, and then applied as organic fertilizers to soil for tomato greenhouse cultivation. Their performance was compared with a commercial product obtained from animal residues. Plant growth, fruit yield and quality, and soil and leaf chemical composition were the selected performance indicators. The SBO exhibited the best performance by enhancing leaf chlorophyll content, improving plant growth and fruit ripening rate and yield. No product performance-chemical composition relationship could be assessed. Solubility could be one reason for the superior performance of SBO as a tomato growth promoter. The enhancement of leaf chlorophyll content is discussed to identify a possible link with the SBO photosensitizing properties that have been demonstrated in other work, and thus with photosynthetic performance.

  5. Modeling of the reburning process using sewage sludge-derived syngas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Werle, Sebastian, E-mail: sebastian.werle@polsl.pl

    2012-04-15

    Highlights: ▶ Gasification provides an attractive method for sewage sludge treatment. ▶ Gasification generates a fuel gas (syngas) which can be used as a reburning fuel. ▶ The reburning potential of sewage sludge gasification gases was defined. ▶ Numerical simulation of co-combustion of syngases in a coal-fired boiler has been done. ▶ Calculation shows that the analysed syngases can provide higher than 80% reduction of NOₓ. - Abstract: Gasification of sewage sludge can provide clean and effective reburning fuel for combustion applications. The motivation of this work was to define the reburning potential of the sewage sludge gasification gas (syngas). A numerical simulation of the co-combustion process of syngas in a hard coal-fired boiler was done. All calculations were performed using the Chemkin programme and a plug-flow reactor model was used. The calculations were modelled using the GRI-Mech 2.11 mechanism. The highest conversions for nitric oxide (NO) were obtained at temperatures of approximately 1000-1200 K. The combustion of hard coal with sewage sludge-derived syngas reduces NO emissions. The highest reduction efficiency (>90%) was achieved when the molar flow ratio of the syngas was 15%. Calculations show that the analysed syngas can provide better results than advanced reburning (connected with ammonia injection), which is a more complicated process.

  6. Xanthorrhizol induced DNA fragmentation in HepG2 cells involving Bcl-2 family proteins

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tee, Thiam-Tsui, E-mail: thiamtsu@yahoo.com; Cheah, Yew-Hoong; Bioassay Unit, Herbal Medicine Research Center, Institute for Medical Research, Jalan Pahang, Kuala Lumpur

    Highlights: ▶ We isolated xanthorrhizol, a sesquiterpenoid compound from Curcuma xanthorrhiza. ▶ Xanthorrhizol induced apoptosis in HepG2 cells as observed using SEM. ▶ Apoptosis in xanthorrhizol-treated HepG2 cells involved Bcl-2 family proteins. ▶ DNA fragmentation was observed in xanthorrhizol-treated HepG2 cells. ▶ DNA fragmentation may be due to cleavage of PARP and DFF45/ICAD proteins. -- Abstract: Xanthorrhizol is a plant-derived pharmacologically active sesquiterpenoid compound isolated from Curcuma xanthorrhiza. Previously, we have reported that xanthorrhizol inhibited the proliferation of HepG2 human hepatoma cells by inducing apoptotic cell death via caspase activation. Here, we attempt to further elucidate the mode of action of xanthorrhizol. Apoptosis in xanthorrhizol-treated HepG2 cells as observed by scanning electron microscopy was accompanied by truncation of BID; reduction of both anti-apoptotic Bcl-2 and Bcl-XL expression; cleavage of PARP and DFF45/ICAD proteins and DNA fragmentation. Taken together, these results suggest xanthorrhizol as a potent antiproliferative agent on HepG2 cells by inducing apoptosis via Bcl-2 family members. Hence we proposed that xanthorrhizol could be used as an anti-liver cancer drug for future studies.

  7. Transmembrane myosin chitin synthase involved in mollusc shell formation produced in Dictyostelium is active

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schoenitzer, Veronika; Universitaet Regensburg, Biochemie I, Universitaetsstrasse 31, D-93053 Regensburg; Eichner, Norbert

    Highlights: ▶ Dictyostelium produces the 264 kDa myosin chitin synthase of bivalve mollusc Atrina. ▶ Chitin synthase activity releases chitin, partly associated with the cell surface. ▶ Membrane extracts of transgenic slime molds produce radiolabeled chitin in vitro. ▶ Chitin producing Dictyostelium cells can be characterized by atomic force microscopy. ▶ This model system enables us to study initial processes of chitin biomineralization. -- Abstract: Several mollusc shells contain chitin, which is formed by a transmembrane myosin motor enzyme. This protein could be involved in sensing mechanical and structural changes of the forming, mineralizing extracellular matrix. Here we report the heterologous expression of the transmembrane myosin chitin synthase Ar-CS1 of the bivalve mollusc Atrina rigida (2286 amino acid residues, M.W. 264 kDa/monomer) in Dictyostelium discoideum, a model organism for myosin motor proteins. Confocal laser scanning immunofluorescence microscopy (CLSM), chitin binding GFP detection of chitin on cells and released to the cell culture medium, and a radiochemical activity assay of membrane extracts revealed expression and enzymatic activity of the mollusc chitin synthase in transgenic slime mold cells. First high-resolution atomic force microscopy (AFM) images of Ar-CS1 transformed cellulose synthase deficient D. discoideum dcsA⁻ cell lines are shown.

  8. Novel application of stem cell-derived factors for periodontal regeneration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inukai, Takeharu, E-mail: t-inukai@med.nagoya-u.ac.jp; Katagiri, Wataru, E-mail: w-kat@med.nagoya-u.ac.jp; Yoshimi, Ryoko, E-mail: lianzi@med.nagoya-u.ac.jp

    Highlights: ▶ Mesenchymal stem cells (MSCs) secrete a variety of cytokines. ▶ Cytokines were detected in conditioned medium from cultured MSCs (MSC-CM). ▶ MSC-CM enhanced activation of dog MSCs and periodontal ligament cells. ▶ MSC-CM significantly promoted alveolar bone and cementum regeneration. ▶ Multiple cytokines contained in MSC-CM promote periodontal regeneration. -- Abstract: The effect of conditioned medium from cultured mesenchymal stem cells (MSC-CM) on periodontal regeneration was evaluated. In vitro, MSC-CM stimulated migration and proliferation of dog MSCs (dMSCs) and dog periodontal ligament cells (dPDLCs). Cytokines such as insulin-like growth factor, vascular endothelial growth factor, transforming growth factor-β1, and hepatocyte growth factor were detected in MSC-CM. In vivo, one-wall critical-size, intrabony periodontal defects were surgically created in the mandible of dogs. Dogs with these defects were divided into three groups that received MSC-CM, PBS, or no implants. Absorbable atelo-collagen sponges (TERUPLUG®) were used as a scaffold material. Based on radiographic and histological observation 4 weeks after transplantation, the defect sites in the MSC-CM group displayed significantly greater alveolar bone and cementum regeneration than the other groups. These findings suggest that MSC-CM enhanced periodontal regeneration due to multiple cytokines contained in MSC-CM.

  9. Performance evaluation of an anaerobic/aerobic landfill-based digester using yard waste for energy and compost production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yazdani, Ramin, E-mail: ryazdani@sbcglobal.net; Civil and Environmental Engineering, University of California, One Shields Avenue, Ghausi Hall, Davis, CA 95616; Barlaz, Morton A., E-mail: barlaz@eos.ncsu.edu

    2012-05-15

    Highlights: ▶ Biochemical methane potential decreased by 83% during the two-stage operation. ▶ Net energy produced was 84.3 MWh, or 46 kWh per wet metric ton (Mg). ▶ The average removal efficiency of volatile organic compounds (VOCs) was 96-99%. ▶ The average removal efficiency of non-methane organic compounds (NMOCs) was 68-99%. ▶ The two-stage batch digester proved to be simple to operate and cost-effective. - Abstract: The objective of this study was to evaluate a new alternative for yard waste management by constructing, operating and monitoring a landfill-based two-stage batch digester (anaerobic/aerobic) with the recovery of energy and compost. The system was initially operated under anaerobic conditions for 366 days, after which the yard waste was aerated for an additional 191 days. Off gas generated from the aerobic stage was treated by biofilters. Net energy recovery was 84.3 MWh, or 46 kWh per wet metric ton (Mg) of waste as received, and the biochemical methane potential of the treated waste decreased by 83% during the two-stage operation. The average removal efficiencies of volatile organic compounds and non-methane organic compounds in the biofilters were 96-99% and 68-99%, respectively.

  10. Controlled drug release on amine functionalized spherical MCM-41

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Szegedi, Agnes, E-mail: szegedi@chemres.hu; Popova, Margarita; Goshev, Ivan

    2012-10-15

    MCM-41 silica with spherical morphology and small particle sizes (100 nm) was synthesized and modified by a post-synthesis method with different amounts of 3-aminopropyltriethoxysilane (APTES). A comparative study of the adsorption and release of a model drug, ibuprofen, was carried out. The modified and drug loaded mesoporous materials were characterized by XRD, TEM, N₂ physisorption, elemental analysis, thermal analysis and FT-IR spectroscopy. A new method was developed for the quantitative determination of amino groups in surface modified mesoporous materials by the ninhydrin reaction. Good correlation was found between the amino content of the MCM-41 materials determined by the ninhydrin method and their ibuprofen adsorption capacity. Amino modification resulted in a high degree of ibuprofen loading and a slow release rate in comparison to the parent non-modified MCM-41. - Graphical abstract: Determination of surface amino groups by the ninhydrin method. Highlights: ▶ Spherical MCM-41 modified by different amounts of APTES was studied. ▶ Ibuprofen (IBU) adsorption and release characteristics was tested. ▶ The ninhydrin reaction was used for the quantitative determination of amino groups. ▶ Stoichiometric amount of APTES is enough for totally covering the surface with amino groups. ▶ Good correlation was found between the amino content and IBU adsorption capacity.

  11. Microstructures and microhardness evolutions of melt-spun Al-8Ni-5Nd-4Si alloy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karakoese, Ercan, E-mail: ekarakose@karatekin.edu.tr; Keskin, Mustafa

    2012-03-15

    Al-Ni-Nd-Si alloy with a nominal composition of Al-8 wt.% Ni-5 wt.% Nd-4 wt.% Si was rapidly solidified by using the melt-spinning technique to examine the influence of the cooling rate/conditions on microstructure and mechanical properties. The resulting conventional cast (ingot) and melt-spun ribbons were characterized by X-ray diffraction, optical microscopy, scanning electron microscopy together with energy dispersive spectroscopy, differential scanning calorimetry, differential thermal analysis and a Vickers microhardness tester. The ingot alloy consists of four phases, namely α-Al, intermetallic Al₃Ni, Al₁₁Nd₃ and fcc Si. Melt-spun ribbons are completely composed of the α-Al phase. The optical microscopy and scanning electron microscopy results show that the microstructures of rapidly solidified ribbons are clearly different from their ingot alloy. The change in microhardness is discussed based on the microstructural observations. - Highlights: ▶ Rapid solidification allows a reduction in grain size and extended solid solution ranges. ▶ We observed the matrix lattice parameter increases with increasing wheel speed. ▶ Melt-spun ribbons consist of partly amorphous phases embedded in crystalline phases. ▶ The solidification rate is high enough to retain most of the alloying elements in the Al matrix. ▶ Rapid solidification has an effect on the phase constitution.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gwyther, Ceri L.; Jones, David L.; Golyshin, Peter N.

    Highlights: ▶ Bioreduction is a novel on-farm storage option for livestock carcasses. ▶ Legislation demands that pathogens are contained and do not proliferate during carcass storage. ▶ We examined the survival of key pathogens in lab-scale bioreduction vessels. ▶ Pathogen numbers reduced in the resulting liquor waste and bioaerosols. ▶ The results indicate that bioreduction should be validated for industry use. - Abstract: The EU Animal By-Products Regulations generated the need for novel methods of storage and disposal of dead livestock. Bioreduction prior to rendering or incineration has been proposed as a practical and potentially cost-effective method; however, its biosecurity characteristics need to be elucidated. To address this, Salmonella enterica (serovars Senftenberg and Poona), Enterococcus faecalis, Campylobacter jejuni, Campylobacter coli and a lux-marked strain of Escherichia coli O157 were inoculated into laboratory-scale bioreduction vessels containing sheep carcass constituents. Numbers of all pathogens and the metabolic activity of E. coli O157 decreased significantly within the liquor waste over time, and only E. faecalis remained detectable after 3 months. Only very low numbers of Salmonella spp. and E. faecalis were detected in bioaerosols, and only at initial stages of the trial. These results further indicate that bioreduction represents a suitable method of storing and reducing the volume of livestock carcasses prior to ultimate disposal.

  13. Classification With Truncated Distance Kernel.

    PubMed

    Huang, Xiaolin; Suykens, Johan A K; Wang, Shuning; Hornegger, Joachim; Maier, Andreas

    2018-05-01

    This brief proposes a truncated distance (TL1) kernel, which results in a classifier that is nonlinear in the global region but is linear in each subregion. With this kernel, the subregion structure can be trained using all the training data and local linear classifiers can be established simultaneously. The TL1 kernel has good adaptiveness to nonlinearity and is suitable for problems which require different nonlinearities in different areas. Though the TL1 kernel is not positive semidefinite, some classical kernel learning methods are still applicable, which means that the TL1 kernel can be directly used in standard toolboxes by replacing the kernel evaluation. In numerical experiments, the TL1 kernel with a pregiven parameter achieves similar or better performance than the radial basis function kernel with the parameter tuned by cross validation, implying that the TL1 kernel is a promising nonlinear kernel for classification tasks.
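
    The TL1 kernel itself is simple to write down: K(x, y) = max(ρ - ||x - y||₁, 0), which is linear inside the L1 ball of radius ρ around each point and exactly zero outside it. The sketch below computes the Gram matrix in NumPy so it can be passed to any toolbox that accepts precomputed kernels; the value of ρ in the commented usage lines is an assumed heuristic shown only for illustration.

      import numpy as np

      def tl1_kernel(X, Y, rho):
          """Truncated distance kernel: K(x, y) = max(rho - ||x - y||_1, 0)."""
          d1 = np.abs(X[:, None, :] - Y[None, :, :]).sum(-1)   # pairwise L1 distances
          return np.maximum(rho - d1, 0.0)

      # Usage with a toolbox that accepts a precomputed Gram matrix, e.g. scikit-learn:
      # from sklearn.svm import SVC
      # rho = 0.7 * X_train.shape[1]                            # assumed heuristic choice
      # clf = SVC(kernel="precomputed").fit(tl1_kernel(X_train, X_train, rho), y_train)
      # y_pred = clf.predict(tl1_kernel(X_test, X_train, rho))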

  14. Quintessence Reissner Nordström Anti de Sitter Black Holes and Joule Thomson Effect

    NASA Astrophysics Data System (ADS)

    Ghaffarnejad, H.; Yaraie, E.; Farsam, M.

    2018-06-01

    In this work we investigate corrections from the quintessence regime of dark energy to the Joule-Thomson (JT) effect of the Reissner-Nordström anti-de Sitter (RN-AdS) black hole. The quintessence dark energy has the equation of state p_q = ω ρ_q with -1 < ω < -1/3. Our calculations are restricted to the ansätze ω = -1 (the cosmological-constant regime) and ω = -2/3 (quintessence dark energy). To study the JT expansion of the AdS gas at constant black hole mass, we calculate the inversion temperature T_i of the quintessence RN-AdS black hole, at which its cooling phase changes to a heating phase at a particular inversion pressure P_i. The position of the inversion point {T_i, P_i} is determined by crossing the inversion curves with the corresponding Gibbons-Hawking temperature on the T-P plane. We determine the position of the inversion point for different numerical values of the mass M and charge Q of the quintessence RN-AdS black hole. The cooling-heating phase transition (JT effect) occurs for M > Q, for which the causal singularity is still covered by the horizon. Our calculations show that the position of the inversion point {T_i, P_i} on the T-P plane is sensitive to the presence of quintessence dark energy only for large numerical values of the RN-AdS black hole charge Q. In other words, the quintessence dark energy does not affect the position of the inversion point when the RN-AdS black hole carries a small charge.
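
    For orientation, the quantities mentioned in this abstract are the standard ones used in Joule-Thomson studies of AdS black holes, where the mass M plays the role of enthalpy and the expansion is taken at constant M. The relations below are the generic definitions only, written here for reference, and are not results specific to this paper.

      % quintessence equation of state, JT coefficient, and inversion temperature
      \begin{align}
        p_q &= \omega\,\rho_q, \qquad -1 < \omega < -\tfrac{1}{3}, \\
        \mu_{\mathrm{JT}} &= \left(\frac{\partial T}{\partial P}\right)_{M}
          = \frac{1}{C_P}\left[\,T\left(\frac{\partial V}{\partial T}\right)_{P} - V\right], \\
        T_i &= V\left(\frac{\partial T}{\partial V}\right)_{P}
          \qquad \text{(inversion point, where } \mu_{\mathrm{JT}} = 0\text{)}.
      \end{align}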

  15. Analysis of the spatial distribution of dengue cases in the city of Rio de Janeiro, 2011 and 2012

    PubMed Central

    Carvalho, Silvia; Magalhães, Mônica de Avelar Figueiredo Mafra; Medronho, Roberto de Andrade

    2017-01-01

    OBJECTIVE: Analyze the spatial distribution of classical dengue and severe dengue cases in the city of Rio de Janeiro. METHODS: Exploratory study considering cases of classical dengue and severe dengue with laboratory confirmation of the infection in the city of Rio de Janeiro during 2011 and 2012. Cases notified in the Notification Increase Information System in 2011 and 2012 were georeferenced using the “street” and “number” fields, through the automatic process of the Geocoding tool of the ArcGIS 10 program. The spatial analysis was done through the kernel density estimator. RESULTS: Kernel density pointed out hotspots for classic dengue that did not coincide geographically with severe dengue and were in or near favelas. The kernel ratio did not show a notable change in the spatial distribution pattern observed in the kernel density analysis. The georeferencing process showed a loss of 41% of classic dengue records and 17% of severe dengue records due to the address field in the Notification Increase Information System form. CONCLUSIONS: The hotspots near the favelas suggest that the social vulnerability of these localities can be an influencing factor in the occurrence of this disease, since there is a deficient supply of, and access to, essential goods and services for the population. To reduce this vulnerability, interventions must be related to macroeconomic policies. PMID:28832752

  16. Comparative analysis of genetic architectures for nine developmental traits of rye.

    PubMed

    Masojć, Piotr; Milczarski, P; Kruszona, P

    2017-08-01

    Genetic architectures of plant height, stem thickness, spike length, awn length, heading date, thousand-kernel weight, kernel length, leaf area and chlorophyll content were aligned on the DArT-based high-density map of the 541 × Ot1-3 RILs population of rye using the genes interaction assorting by divergent selection (GIABDS) method. Complex sets of QTL for particular traits contained 1-5 loci of the epistatic D class and 10-28 loci of the hypostatic, mostly R and E, classes controlling trait variation through D-E or D-R types of two-locus interactions. QTL were distributed on each of the seven rye chromosomes in unique positions or as coinciding loci for 2-8 traits. Detection of considerable numbers of the reversed (D', E' and R') classes of QTL might be attributed to the transgression effects observed for most of the studied traits. The first examples of the E* and F QTL classes defined in the model are reported for awn length, leaf area, thousand-kernel weight and kernel length. The results of this study extend experimental data to 11 quantitative traits (together with pre-harvest sprouting and alpha-amylase activity) for which genetic architectures fit the model of the mechanism underlying allele distribution within the tails of bi-parental populations. They are also a valuable starting point for map-based searches for genes underlying the detected QTL and for planning advanced marker-assisted multi-trait breeding strategies.

  17. Acceleration of GPU-based Krylov solvers via data transfer reduction

    DOE PAGES

    Anzt, Hartwig; Tomov, Stanimire; Luszczek, Piotr; ...

    2015-04-08

    Krylov subspace iterative solvers are often the method of choice when solving large sparse linear systems. At the same time, hardware accelerators such as graphics processing units continue to offer significant floating point performance gains for matrix and vector computations through easy-to-use libraries of computational kernels. However, as these libraries are usually composed of a well optimized but limited set of linear algebra operations, applications that use them often fail to reduce certain data communications, and hence fail to leverage the full potential of the accelerator. In this study, we target the acceleration of Krylov subspace iterative methods for graphics processing units, and in particular the Biconjugate Gradient Stabilized solver. We show that significant improvement can be achieved by reformulating the method to reduce data communications through application-specific kernels instead of using the generic BLAS kernels, e.g. as provided by NVIDIA's cuBLAS library, and by designing a graphics processing unit-specific sparse matrix-vector product kernel that is able to more efficiently use the graphics processing unit's computing power. Furthermore, we derive a model estimating the performance improvement, and use experimental data to validate the expected runtime savings. Finally, considering that the derived implementation achieves significantly higher performance, we assert that similar optimizations addressing algorithm structure, as well as the sparse matrix-vector product, are crucial for the subsequent development of high-performance graphics processing unit-accelerated Krylov subspace iterative methods.

  18. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach

    PubMed Central

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-01-01

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Being different from the existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of base kernels are regarded as external parameters of single-hidden layer feedforward neural networks (SLFNs). The combination coefficients of base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is also compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results have demonstrated that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision, but also in efficiency for gas classification. PMID:28629202

  19. Gabor-based kernel PCA with fractional power polynomial models for face recognition.

    PubMed

    Liu, Chengjun

    2004-05-01

    This paper presents a novel Gabor-based kernel Principal Component Analysis (PCA) method by integrating the Gabor wavelet representation of face images and the kernel PCA method for face recognition. Gabor wavelets first derive desirable facial features characterized by spatial frequency, spatial locality, and orientation selectivity to cope with the variations due to illumination and facial expression changes. The kernel PCA method is then extended to include fractional power polynomial models for enhanced face recognition performance. A fractional power polynomial, however, does not necessarily define a kernel function, as it might not define a positive semidefinite Gram matrix. Note that the sigmoid kernels, one of the three classes of widely used kernel functions (polynomial kernels, Gaussian kernels, and sigmoid kernels), do not actually define a positive semidefinite Gram matrix either. Nevertheless, the sigmoid kernels have been successfully used in practice, such as in building support vector machines. In order to derive real kernel PCA features, we apply only those kernel PCA eigenvectors that are associated with positive eigenvalues. The feasibility of the Gabor-based kernel PCA method with fractional power polynomial models has been successfully tested on both frontal and pose-angled face recognition, using two data sets from the FERET database and the CMU PIE database, respectively. The FERET data set contains 600 frontal face images of 200 subjects, while the PIE data set consists of 680 images across five poses (left and right profiles, left and right half profiles, and frontal view) with two different facial expressions (neutral and smiling) of 68 subjects. The effectiveness of the Gabor-based kernel PCA method with fractional power polynomial models is shown in terms of both absolute performance indices and comparative performance against the PCA method, the kernel PCA method with polynomial kernels, the kernel PCA method with fractional power polynomial models, the Gabor wavelet-based PCA method, and the Gabor wavelet-based kernel PCA method with polynomial kernels.
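
    As a companion to this record, here is a minimal NumPy sketch of kernel PCA with a fractional power polynomial "kernel" that keeps only the eigenvectors associated with positive eigenvalues, as the paper does. The sign(x·y)|x·y|^d form used to keep the fractional power real for negative inner products, and the parameter values, are assumptions for illustration; the paper's actual input would be Gabor wavelet features of face images rather than raw vectors.

      import numpy as np

      def fpp_kernel_pca(X, d=0.8, n_components=10):
          """Kernel PCA with a fractional power polynomial similarity (0 < d < 1)."""
          # Fractional power polynomial "kernel"; not guaranteed positive semidefinite.
          G = X @ X.T
          K = np.sign(G) * np.abs(G) ** d

          # Centre the kernel matrix in feature space.
          n = len(X)
          one = np.ones((n, n)) / n
          Kc = K - one @ K - K @ one + one @ K @ one

          # Keep only eigenpairs with positive eigenvalues (as in the paper),
          # sorted in decreasing order of eigenvalue.
          vals, vecs = np.linalg.eigh(Kc)
          keep = vals > 1e-10
          vals, vecs = vals[keep][::-1], vecs[:, keep][:, ::-1]
          alphas = vecs[:, :n_components] / np.sqrt(vals[:n_components])
          return Kc @ alphas        # projections of the training data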

  20. Hairy black holes in cubic quasi-topological gravity

    NASA Astrophysics Data System (ADS)

    Dykaar, Hannah; Hennigar, Robie A.; Mann, Robert B.

    2017-05-01

    We construct a class of five dimensional black hole solutions to cubic quasi-topological gravity with conformal scalar hair and study their thermodynamics. We find these black holes provide the second example of black hole λ-lines: a line of second order (continuous) phase transitions, akin to the fluid/superfluid transition of ⁴He. Examples of isolated critical points are found for spherical black holes, marking the first in the literature to date. We also find various novel and interesting phase structures, including an isolated critical point occurring in conjunction with a double reentrant phase transition. The AdS vacua of the theory are studied, finding ghost-free configurations where the scalar field takes on a non-zero constant value, in notable contrast to the five dimensional Lovelock case.

  1. A multi-label learning based kernel automatic recommendation method for support vector machine.

    PubMed

    Zhang, Xueying; Song, Qinbao

    2015-01-01

    Choosing an appropriate kernel is very important and critical when classifying a new problem with a Support Vector Machine. So far, more attention has been paid to constructing new kernels and choosing suitable parameter values for a specific kernel function, but less to kernel selection. Furthermore, most current kernel selection methods focus on seeking the best kernel with the highest classification accuracy via cross-validation; they are time-consuming and ignore the differences among the number of support vectors and the CPU time of SVM with different kernels. Considering the tradeoff between classification success ratio and CPU time, there may be multiple kernel functions performing equally well on the same classification problem. Aiming to automatically select those appropriate kernel functions for a given data set, we propose a multi-label learning based kernel recommendation method built on the data characteristics. For each data set, the meta-knowledge data base is first created by extracting the feature vector of data characteristics and identifying the corresponding applicable kernel set. Then the kernel recommendation model is constructed on the generated meta-knowledge data base with the multi-label classification method. Finally, the appropriate kernel functions are recommended to a new data set by the recommendation model according to the characteristics of the new data set. Extensive experiments over 132 UCI benchmark data sets, with five different types of data set characteristics, eleven typical kernels (Linear, Polynomial, Radial Basis Function, Sigmoidal function, Laplace, Multiquadric, Rational Quadratic, Spherical, Spline, Wave and Circular), and five multi-label classification methods demonstrate that, compared with the existing kernel selection methods and the most widely used RBF kernel function, SVM with the kernel function recommended by our proposed method achieved the highest classification performance.
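
    The recommendation pipeline described here (meta-features of a data set in, applicable kernels out) can be sketched briefly. The meta-feature set, the random forest base learner, and the helper names below are illustrative assumptions, not the measures or classifiers actually used in the paper.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.multioutput import MultiOutputClassifier

      def meta_features(X, y):
          """A few illustrative data-characteristic measures (not the paper's exact set)."""
          n, p = X.shape
          return np.array([
              np.log(n),                                      # sample size
              np.log(p),                                      # dimensionality
              len(np.unique(y)),                              # number of classes
              np.mean(np.std(X, axis=0)),                     # average feature spread
              np.mean(np.abs(np.corrcoef(X, rowvar=False))),  # mean absolute feature correlation
          ])

      def train_recommender(meta_X, meta_Y):
          # meta_X: one meta-feature row per historical data set;
          # meta_Y: binary matrix marking which kernels performed acceptably on it.
          return MultiOutputClassifier(RandomForestClassifier(n_estimators=200)).fit(meta_X, meta_Y)

      def recommend(model, X_new, y_new, kernel_names):
          flags = model.predict(meta_features(X_new, y_new)[None, :])[0]
          return [name for name, keep in zip(kernel_names, flags) if keep == 1]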

  2. A Multi-Label Learning Based Kernel Automatic Recommendation Method for Support Vector Machine

    PubMed Central

    Zhang, Xueying; Song, Qinbao

    2015-01-01

    Choosing an appropriate kernel is very important and critical when classifying a new problem with a Support Vector Machine. So far, more attention has been paid to constructing new kernels and choosing suitable parameter values for a specific kernel function, but less to kernel selection. Furthermore, most current kernel selection methods focus on seeking the best kernel with the highest classification accuracy via cross-validation; they are time-consuming and ignore the differences among the number of support vectors and the CPU time of SVM with different kernels. Considering the tradeoff between classification success ratio and CPU time, there may be multiple kernel functions performing equally well on the same classification problem. Aiming to automatically select those appropriate kernel functions for a given data set, we propose a multi-label learning based kernel recommendation method built on the data characteristics. For each data set, the meta-knowledge data base is first created by extracting the feature vector of data characteristics and identifying the corresponding applicable kernel set. Then the kernel recommendation model is constructed on the generated meta-knowledge data base with the multi-label classification method. Finally, the appropriate kernel functions are recommended to a new data set by the recommendation model according to the characteristics of the new data set. Extensive experiments over 132 UCI benchmark data sets, with five different types of data set characteristics, eleven typical kernels (Linear, Polynomial, Radial Basis Function, Sigmoidal function, Laplace, Multiquadric, Rational Quadratic, Spherical, Spline, Wave and Circular), and five multi-label classification methods demonstrate that, compared with the existing kernel selection methods and the most widely used RBF kernel function, SVM with the kernel function recommended by our proposed method achieved the highest classification performance. PMID:25893896

  3. 40 CFR 458.20 - Applicability: description of the carbon black thermal process subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Applicability: description of the carbon black thermal process subcategory. 458.20 Section 458.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Thermal...

  4. 40 CFR 458.10 - Applicability; description of the carbon black furnace process subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Applicability; description of the carbon black furnace process subcategory. 458.10 Section 458.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Furnace...

  5. 40 CFR 458.30 - Applicability; description of the carbon black channel process subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Applicability; description of the carbon black channel process subcategory. 458.30 Section 458.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Channel...

  6. 40 CFR 458.10 - Applicability; description of the carbon black furnace process subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Applicability; description of the carbon black furnace process subcategory. 458.10 Section 458.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Furnace...

  7. 40 CFR 458.20 - Applicability: description of the carbon black thermal process subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Applicability: description of the carbon black thermal process subcategory. 458.20 Section 458.20 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Thermal...

  8. 40 CFR 458.40 - Applicability; description of the carbon black lamp process subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Applicability; description of the carbon black lamp process subcategory. 458.40 Section 458.40 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp Process...

  9. 40 CFR 458.30 - Applicability; description of the carbon black channel process subcategory.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Applicability; description of the carbon black channel process subcategory. 458.30 Section 458.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Channel...

  10. 40 CFR 458.40 - Applicability; description of the carbon black lamp process subcategory.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Applicability; description of the carbon black lamp process subcategory. 458.40 Section 458.40 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp Process...

  11. 7 CFR 981.7 - Edible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Edible kernel. 981.7 Section 981.7 Agriculture... Regulating Handling Definitions § 981.7 Edible kernel. Edible kernel means a kernel, piece, or particle of almond kernel that is not inedible. [41 FR 26852, June 30, 1976] ...

  12. A conformal approach for the analysis of the non-linear stability of radiation cosmologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luebbe, Christian, E-mail: c.luebbe@ucl.ac.uk; Department of Mathematics, University of Leicester, University Road, LE1 8RH; Valiente Kroon, Juan Antonio, E-mail: j.a.valiente-kroon@qmul.ac.uk

    2013-01-15

    The conformal Einstein equations for a trace-free (radiation) perfect fluid are derived in terms of the Levi-Civita connection of a conformally rescaled metric. These equations are used to provide a non-linear stability result for de Sitter-like trace-free (radiation) perfect fluid Friedmann-Lemaître-Robertson-Walker cosmological models. The solutions thus obtained exist globally towards the future and are future geodesically complete. - Highlights: ► We study the Einstein-Euler system in General Relativity using conformal methods. ► We analyze the structural properties of the associated evolution equations. ► We establish the non-linear stability of pure radiation cosmological models.

  13. Forced Ignition Study Based On Wavelet Method

    NASA Astrophysics Data System (ADS)

    Martelli, E.; Valorani, M.; Paolucci, S.; Zikoski, Z.

    2011-05-01

    The control of ignition in a rocket engine is a critical problem for combustion chamber design. Therefore it is essential to fully understand the mechanism of ignition during its earliest stages. In this paper the characteristics of flame kernel formation and initial propagation in a hydrogen-argon-oxygen mixing layer are studied using 2D direct numerical simulations with detailed chemistry and transport properties. The flame kernel is initiated by adding an energy deposition source term in the energy equation. The effect of unsteady strain rate is studied by imposing a 2D turbulence velocity field, which is initialized by means of a synthetic field. An adaptive wavelet method, based on interpolating wavelets is used in this study to solve the compressible reactive Navier-Stokes equations. This method provides an alternative means to refine the computational grid points according to local demands of the physical solution. The present simulations show that in the very early instants the kernel perturbed by the turbulent field is characterized by an increased burning area and a slightly increased radical formation. In addition, the calculations show that the wavelet technique yields a significant reduction in the number of degrees of freedom necessary to achieve a prescribed solution accuracy.
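
    As a rough illustration of the grid-adaptation idea, interpolating-wavelet detail coefficients decide which points are kept. The following one-dimensional sketch is a minimal analogue under assumed tolerances, not the authors' solver: a point survives only where its value cannot be predicted from the next-coarser level to within a tolerance, so points cluster around steep fronts.

      import numpy as np

      def significant_points(f, eps=1e-3):
          """Indices of points on a dyadic 1-D grid whose detail coefficient exceeds eps."""
          n = f.size                      # assumes n = 2**J + 1
          keep = {0, n - 1}
          level = (n - 1) // 2
          while level >= 1:
              for i in range(level, n - 1, 2 * level):
                  # predict the midpoint from its two coarser-level neighbours
                  pred = 0.5 * (f[i - level] + f[i + level])
                  if abs(f[i] - pred) > eps:          # detail coefficient is significant
                      keep.add(i)
              level //= 2
          return np.array(sorted(keep))

      x = np.linspace(0.0, 1.0, 2**10 + 1)
      f = np.tanh((x - 0.5) / 0.02)       # a sharp "flame-front-like" profile
      idx = significant_points(f, eps=1e-3)
      print(f"kept {idx.size} of {x.size} points")    # retained points cluster at the front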

  14. Novel procedure for characterizing nonlinear systems with memory: 2017 update

    NASA Astrophysics Data System (ADS)

    Nuttall, Albert H.; Katz, Richard A.; Hughes, Derke R.; Koch, Robert M.

    2017-05-01

    The present article discusses novel improvements in nonlinear signal processing made by the prime algorithm developer, Dr. Albert H. Nuttall and co-authors, a consortium of research scientists from the Naval Undersea Warfare Center Division, Newport, RI. The algorithm, called the Nuttall-Wiener-Volterra or 'NWV' algorithm, is named for its principal contributors [1], [2], [3]. The NWV algorithm significantly reduces the computational workload for characterizing nonlinear systems with memory. Following this formulation, two measurement waveforms are required in order to characterize a specified nonlinear system under consideration: (1) an excitation input waveform, x(t) (the transmitted signal); and, (2) a response output waveform, z(t) (the received signal). Given these two measurement waveforms for a given propagation channel, a 'kernel' or 'channel response', h = [h0, h1, h2, h3] between the two measurement points, is computed via a least squares approach that optimizes modeled kernel values by performing a best fit between measured response z(t) and a modeled response y(t). New techniques significantly diminish the exponential growth of the number of computed kernel coefficients at second and third order and alleviate the Curse of Dimensionality (COD) in order to realize practical nonlinear solutions of scientific and engineering interest.
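
    A minimal numerical sketch of the least-squares kernel fit described above (illustrative only; the memory length, model order, and signals are assumptions, and the actual NWV algorithm adds techniques to curb the growth of higher-order coefficients): delayed products of the input x(t) form a regressor matrix, and the kernel coefficients are those that best reproduce the measured output z(t).

      import numpy as np

      rng = np.random.default_rng(0)
      N, M = 2000, 4                      # samples, memory length (assumed)
      x = rng.standard_normal(N)

      def regressors(x, M):
          """Columns: constant, linear lags, and unique quadratic lag products."""
          cols = [np.ones_like(x)]
          lags = [np.concatenate([np.zeros(k), x[:x.size - k]]) for k in range(M)]
          cols += lags
          cols += [lags[k] * lags[l] for k in range(M) for l in range(k, M)]
          return np.column_stack(cols)

      Phi = regressors(x, M)

      # synthetic "measured" response z(t) from a known nonlinear system plus noise
      h_true = rng.standard_normal(Phi.shape[1])
      z = Phi @ h_true + 0.01 * rng.standard_normal(N)

      h_hat, *_ = np.linalg.lstsq(Phi, z, rcond=None)   # modeled response y(t) = Phi @ h_hat
      print("max kernel-coefficient error:", np.abs(h_hat - h_true).max())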

  15. Rocksalt or cesium chloride: Investigating the relative stability of the cesium halide structures with random phase approximation based methods

    NASA Astrophysics Data System (ADS)

    Nepal, Niraj K.; Ruzsinszky, Adrienn; Bates, Jefferson E.

    2018-03-01

    The ground state structural and energetic properties for rocksalt and cesium chloride phases of the cesium halides were explored using the random phase approximation (RPA) and beyond-RPA methods to benchmark the nonempirical SCAN meta-GGA and its empirical dispersion corrections. The importance of nonadditivity and higher-order multipole moments of dispersion in these systems is discussed. RPA generally predicts the equilibrium volume for these halides within 2.4% of the experimental value, while beyond-RPA methods utilizing the renormalized adiabatic LDA (rALDA) exchange-correlation kernel are typically within 1.8%. The zero-point vibrational energy is small and shows that the stability of these halides is purely due to electronic correlation effects. The rAPBE kernel as a correction to RPA overestimates the equilibrium volume and could not predict the correct phase ordering in the case of cesium chloride, while the rALDA kernel consistently predicted results in agreement with the experiment for all of the halides. However, due to its reasonable accuracy with lower computational cost, SCAN+rVV10 proved to be a good alternative to the RPA-like methods for describing the properties of these ionic solids.

  16. Bandlimited computerized improvements in characterization of nonlinear systems with memory

    NASA Astrophysics Data System (ADS)

    Nuttall, Albert H.; Katz, Richard A.; Hughes, Derke R.; Koch, Robert M.

    2016-05-01

    The present article discusses some inroads in nonlinear signal processing made by the prime algorithm developer, Dr. Albert H. Nuttall and co-authors, a consortium of research scientists from the Naval Undersea Warfare Center Division, Newport, RI. The algorithm, called the Nuttall-Wiener-Volterra 'NWV' algorithm, is named for its principal contributors [1], [2], [3] over many years of developmental research. The NWV algorithm significantly reduces the computational workload for characterizing nonlinear systems with memory. Following this formulation, two measurement waveforms on the system are required in order to characterize a specified nonlinear system under consideration: (1) an excitation input waveform, x(t) (the transmitted signal); and, (2) a response output waveform, z(t) (the received signal). Given these two measurement waveforms for a given propagation channel, a 'kernel' or 'channel response', h = [h0, h1, h2, h3] between the two measurement points, is computed via a least squares approach that optimizes modeled kernel values by performing a best fit between measured response z(t) and a modeled response y(t). New techniques significantly diminish the exponential growth of the number of computed kernel coefficients at second and third order in order to combat and reasonably alleviate the curse of dimensionality.

  17. Comptonization in Ultra-Strong Magnetic Fields: Numerical Solution to the Radiative Transfer Problem

    NASA Technical Reports Server (NTRS)

    Ceccobello, C.; Farinelli, R.; Titarchuk, L.

    2014-01-01

    We consider the radiative transfer problem in a plane-parallel slab of thermal electrons in the presence of an ultra-strong magnetic field (B ≳ B_c ≈ 4.4 × 10^13 G). Under these conditions, the magnetic field behaves like a birefringent medium for the propagating photons, and the electromagnetic radiation is split into two polarization modes, ordinary and extraordinary, that have different cross-sections. When the optical depth of the slab is large, the ordinary-mode photons are strongly Comptonized and the photon field is dominated by an isotropic component. Aims. The radiative transfer problem in strong magnetic fields presents many mathematical issues and analytical or numerical solutions can be obtained only under some given approximations. We investigate this problem both from the analytical and numerical point of view, provide a test of the previous analytical estimates, and extend these results with numerical techniques. Methods. We consider here the case of low temperature black-body photons propagating in a sub-relativistic temperature plasma, which allows us to deal with a semi-Fokker-Planck approximation of the radiative transfer equation. The problem can then be treated with the variable separation method, and we use a numerical technique to find solutions to the eigenvalue problem in the case of a singular kernel of the space operator. The singularity of the space kernel is the result of the strong angular dependence of the electron cross-section in the presence of a strong magnetic field. Results. We provide the numerical solution obtained for eigenvalues and eigenfunctions of the space operator, and the emerging Comptonization spectrum of the ordinary-mode photons for any eigenvalue of the space equation and for energies significantly less than the cyclotron energy, which is on the order of MeV for the intensity of the magnetic field here considered. Conclusions. We derived the specific intensity of the ordinary photons, under the approximation of large angle and large optical depth. These assumptions allow the equation to be treated using a diffusion-like approximation.

  18. Exploiting graph kernels for high performance biomedical relation extraction.

    PubMed

    Panyam, Nagesh C; Verspoor, Karin; Cohn, Trevor; Ramamohanarao, Kotagiri

    2018-01-30

    Relation extraction from biomedical publications is an important task in the area of semantic mining of text. Kernel methods for supervised relation extraction are often preferred over manual feature engineering methods when classifying highly ordered structures such as trees and graphs obtained from syntactic parsing of a sentence. Tree kernels such as the Subset Tree Kernel and Partial Tree Kernel have been shown to be effective for classifying constituency parse trees and basic dependency parse graphs of a sentence. Graph kernels such as the All Path Graph kernel (APG) and Approximate Subgraph Matching (ASM) kernel have been shown to be suitable for classifying general graphs with cycles, such as the enhanced dependency parse graph of a sentence. In this work, we present a high-performance Chemical-Induced Disease (CID) relation extraction system. We present a comparative study of kernel methods for the CID task and also extend our study to the Protein-Protein Interaction (PPI) extraction task, an important biomedical relation extraction task. We discuss novel modifications to the ASM kernel to boost its performance and a method to apply graph kernels for extracting relations expressed in multiple sentences. Our system for CID relation extraction attains an F-score of 60%, without using external knowledge sources or task-specific heuristics or rules. In comparison, the state-of-the-art Chemical-Disease Relation Extraction system achieves an F-score of 56% using an ensemble of multiple machine learning methods, which is then boosted to 61% with a rule-based system employing task-specific post-processing rules. For the CID task, graph kernels outperform tree kernels substantially, and the best performance is obtained with the APG kernel, which attains an F-score of 60%, followed by the ASM kernel at 57%. The performance difference between the ASM and APG kernels for CID sentence-level relation extraction is not significant. In our evaluation of ASM for the PPI task, ASM performed better than the APG kernel on the BioInfer dataset in the Area Under the Curve (AUC) measure (74% vs. 69%). However, for all the other PPI datasets, namely AIMed, HPRD50, IEPA and LLL, ASM is substantially outperformed by the APG kernel in F-score and AUC measures. We demonstrate high-performance Chemical-Induced Disease relation extraction without employing external knowledge sources or task-specific heuristics. Our work shows that graph kernels are effective in extracting relations that are expressed in multiple sentences. We also show that the graph kernels, namely the ASM and APG kernels, substantially outperform the tree kernels. Among the graph kernels, we showed the ASM kernel to be effective for biomedical relation extraction, with comparable performance to the APG kernel for datasets such as CID sentence-level relation extraction and BioInfer in PPI. Overall, the APG kernel is shown to be significantly more accurate than the ASM kernel, achieving better performance on most datasets.
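
    Whatever graph kernel is used (APG, ASM, or otherwise), the practical recipe is to precompute a Gram matrix of pairwise graph similarities and hand it to an SVM with a precomputed kernel. The sketch below is a stand-in only: the toy shared-edge kernel and the fictitious labelled "sentence graphs" are assumptions, not the kernels or data discussed in the paper.

      import numpy as np
      from sklearn.svm import SVC

      def toy_graph_kernel(g1, g2):
          """Similarity = number of labelled edges shared by two dependency-style graphs."""
          return len(g1 & g2)

      graphs = [ {("drug", "induces", "disease"), ("drug", "nsubj", "treats")},
                 {("drug", "induces", "disease"), ("protein", "binds", "drug")},
                 {("protein", "binds", "drug")},
                 {("gene", "expressed_in", "tissue")} ]
      labels = np.array([1, 1, 0, 0])     # 1 = relation present (illustrative labels)

      # Gram matrix over the training graphs
      K = np.array([[toy_graph_kernel(a, b) for b in graphs] for a in graphs], float)

      clf = SVC(kernel="precomputed").fit(K, labels)
      print(clf.predict(K))               # at test time, K_test[i, j] = k(test_i, train_j)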

  19. MO-FG-CAMPUS-TeP1-05: Rapid and Efficient 3D Dosimetry for End-To-End Patient-Specific QA of Rotational SBRT Deliveries Using a High-Resolution EPID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Y M; Han, B; Xing, L

    2016-06-15

    Purpose: EPID-based patient-specific quality assurance provides verification of the planning setup and delivery process that phantomless QA and log-file based virtual dosimetry methods cannot achieve. We present a method for EPID-based QA utilizing spatially-variant EPID response kernels that allows for direct calculation of the entrance fluence and 3D phantom dose. Methods: An EPID dosimetry system was utilized for 3D dose reconstruction in a cylindrical phantom for the purposes of end-to-end QA. Monte Carlo (MC) methods were used to generate pixel-specific point-spread functions (PSFs) characterizing the spatially non-uniform EPID portal response in the presence of phantom scatter. The spatially-variant PSFs were decomposed into spatially-invariant basis PSFs with the symmetric central-axis kernel as the primary basis kernel and off-axis representing orthogonal perturbations in pixel-space. This compact and accurate characterization enables the use of a modified Richardson-Lucy deconvolution algorithm to directly reconstruct entrance fluence from EPID images without iterative scatter subtraction. High-resolution phantom dose kernels were cogenerated in MC with the PSFs enabling direct recalculation of the resulting phantom dose by rapid forward convolution once the entrance fluence was calculated. A Delta4 QA phantom was used to validate the dose reconstructed in this approach. Results: The spatially-invariant representation of the EPID response accurately reproduced the entrance fluence with >99.5% fidelity with a simultaneous reduction of >60% in computational overhead. 3D dose for 10⁶ voxels was reconstructed for the entire phantom geometry. A 3D global gamma analysis demonstrated a >95% pass rate at 3%/3mm. Conclusion: Our approach demonstrates the capabilities of an EPID-based end-to-end QA methodology that is more efficient than traditional EPID dosimetry methods. Displacing the point of measurement external to the QA phantom reduces the necessary complexity of the phantom itself while offering a method that is highly scalable and inherently generalizable to rotational and trajectory based deliveries. This research was partially supported by Varian.
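
    For orientation only, the following sketch shows the generic Richardson-Lucy deconvolution step that the abstract builds on, here with a single spatially invariant point-spread function rather than the paper's basis expansion of spatially variant EPID kernels; the PSF, image size, and iteration count are illustrative assumptions.

      import numpy as np
      from scipy.signal import fftconvolve

      def richardson_lucy(image, psf, n_iter=25):
          """Iteratively recover the un-blurred fluence from a blurred, non-negative image."""
          psf_mirror = psf[::-1, ::-1]
          estimate = np.full_like(image, image.mean())
          for _ in range(n_iter):
              blurred = fftconvolve(estimate, psf, mode="same")
              ratio = image / np.maximum(blurred, 1e-12)
              estimate *= fftconvolve(ratio, psf_mirror, mode="same")
          return estimate

      # toy example: a rectangular "beamlet" blurred by a Gaussian-like PSF
      fluence = np.zeros((64, 64)); fluence[20:28, 30:40] = 1.0
      yy, xx = np.mgrid[-7:8, -7:8]
      psf = np.exp(-(xx**2 + yy**2) / 8.0); psf /= psf.sum()
      epid = fftconvolve(fluence, psf, mode="same")     # stand-in for the measured EPID image

      recovered = richardson_lucy(epid, psf)
      print("peak reconstruction error:", np.abs(recovered - fluence).max())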

  20. 7 CFR 810.2202 - Definition of other terms.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... kernels, foreign material, and shrunken and broken kernels. The sum of these three factors may not exceed... the removal of dockage and shrunken and broken kernels. (g) Heat-damaged kernels. Kernels, pieces of... sample after the removal of dockage and shrunken and broken kernels. (h) Other grains. Barley, corn...

  1. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Inedible kernel. 981.8 Section 981.8 Agriculture... Regulating Handling Definitions § 981.8 Inedible kernel. Inedible kernel means a kernel, piece, or particle of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or...

  2. 7 CFR 51.1415 - Inedible kernels.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Inedible kernels. 51.1415 Section 51.1415 Agriculture... Standards for Grades of Pecans in the Shell 1 Definitions § 51.1415 Inedible kernels. Inedible kernels means that the kernel or pieces of kernels are rancid, moldy, decayed, injured by insects or otherwise...

  3. Solidification observations and sliding wear behavior of vacuum arc melting processed Ni-Al-TiC composites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karantzalis, A.E., E-mail: akarantz@cc.uoi.gr; Lekatou, A.; Tsirka, K.

    2012-07-15

    Monolithic Ni₃Al and Ni-25 at.% Al intermetallic matrix TiC-reinforced composites were successfully produced by vacuum arc melting. TiC crystals were formed through a dissolution-reprecipitation mechanism and their final morphology is explained by means of a) Jackson's classical nucleation and growth phenomena and b) solidification rate considerations. The TiC presence altered the matrix microconstituents, most likely due to specific melt-particle interactions and crystal plane epitaxial matching. TiC particles caused a significant decrease in the specific wear rate of the monolithic Ni₃Al alloy, and the possible wear mechanisms are approached by means of a) surface oxidation, b) crack/flaw formation, c) material detachment and d) debris-counter surface interactions. - Highlights: ► Vacuum arc melting (VAM) of Ni-Al based intermetallic matrix composite materials. ► Solidification phenomena examination. ► TiC crystal formation and growth mechanisms. ► Sliding wear examination.

  4. Bioluminescent system for dynamic imaging of cell and animal behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hara-Miyauchi, Chikako; Laboratory for Cell Function Dynamics, Brain Science Institute, RIKEN, Saitama 351-0198; Department of Biophysics and Biochemistry, Graduate School of Health Care Sciences, Tokyo Medical and Dental University, Tokyo 113-8510

    2012-03-09

    Highlights: ► We combined a yellow variant of GFP and firefly luciferase to make ffLuc-cp156. ► ffLuc-cp156 showed improved photon yield in cultured cells and transgenic mice. ► ffLuc-cp156 enabled video-rate bioluminescence imaging of freely-moving animals. ► ffLuc-cp156 mice enabled tracking real-time drug delivery in conscious animals. -- Abstract: The current utility of bioluminescence imaging is constrained by a low photon yield that limits temporal sensitivity. Here, we describe an imaging method that uses a chemiluminescent/fluorescent protein, ffLuc-cp156, which consists of a yellow variant of Aequorea GFP and firefly luciferase. We report an improvement in photon yield by over three orders of magnitude over current bioluminescent systems. We imaged cellular movement at high resolution including neuronal growth cones and microglial cell protrusions. Transgenic ffLuc-cp156 mice enabled video-rate bioluminescence imaging of freely moving animals, which may provide a reliable assay for drug distribution in behaving animals for pre-clinical studies.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, Qi; Wang, Xuedi; Zhang, Hanguang

    Highlights: ► Cat S is highly expressed in HCC cells with high metastatic potential. ► Knockdown of Cat S inhibits growth and invasion of HCC cells. ► Knockdown of Cat S inhibits HCC-associated angiogenesis. ► Cat S might be a potential target for HCC therapy. -- Abstract: Cathepsin S (Cat S) plays an important role in tumor invasion and metastasis by its ability to degrade extracellular matrix (ECM). Our previous study suggested there could be a potential association between Cat S and hepatocellular carcinoma (HCC) metastasis. The present study was designed to determine the role of Cat S in HCC cell growth, invasion and angiogenesis, using RNA interference technology. Small interfering RNA (siRNA) sequences for the Cat S gene were synthesized and transfected into the human HCC cell line MHCC97-H. The siRNA targeting the Cat S gene mediated knockdown of Cat S expression, leading to potent suppression of MHCC97-H cell proliferation, invasion and angiogenesis. These data suggest that Cat S might be a potential target for HCC therapy.

  6. A spectroscopic study on the interaction between gold nanoparticles and hemoglobin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garabagiu, Sorina, E-mail: sgarabagiu@itim-cj.ro

    2011-12-15

    Highlights: ► The interaction was studied using UV-vis and fluorescence spectroscopy. ► Gold nanoparticles quench the fluorescence emission of hemoglobin solution. ► The binding and thermodynamic constants were calculated. ► Major impact: electrochemical applications of the complex onto a substrate. -- Abstract: The interaction between horse hemoglobin and gold nanoparticles was studied using optical spectroscopy. UV-vis and fluorescence spectra show that a spontaneous binding process occurred between hemoglobin and gold nanoparticles. The Soret band of hemoglobin in the presence of gold nanoparticles does not show significant changes, which proves that the protein retained its biological function. A shift to longer wavelengths appears in the plasmonic band of gold nanoparticles upon the attachment of hemoglobin molecules. Gold nanoparticles quench the fluorescence emission of tryptophan residues in the structure of hemoglobin. The Stern-Volmer quenching constant, the binding constant and the number of binding sites were also calculated. Thermodynamic parameters indicate that the binding was mainly due to hydrophobic interactions.

  7. Reduction of nuclear encoded enzymes of mitochondrial energy metabolism in cells devoid of mitochondrial DNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, Edith E., E-mail: ed.mueller@salk.at; Mayr, Johannes A., E-mail: h.mayr@salk.at; Zimmermann, Franz A., E-mail: f.zimmermann@salk.at

    2012-01-20

    Highlights: ► We examined OXPHOS and citrate synthase enzyme activities in HEK293 cells devoid of mtDNA. ► Enzymes partially encoded by mtDNA show reduced activities. ► Also the entirely nuclear encoded complex II and citrate synthase exhibit reduced activities. ► Loss of mtDNA induces a feedback mechanism that downregulates complex II and citrate synthase. -- Abstract: Mitochondrial DNA (mtDNA) depletion syndromes are generally associated with reduced activities of oxidative phosphorylation (OXPHOS) enzymes that contain subunits encoded by mtDNA. Conversely, entirely nuclear encoded mitochondrial enzymes in these syndromes, such as the tricarboxylic acid cycle enzyme citrate synthase (CS) and OXPHOS complex II, usually exhibit normal or compensatory enhanced activities. Here we report that a human cell line devoid of mtDNA (HEK293 ρ⁰ cells) has diminished activities of both complex II and CS. This finding indicates the existence of a feedback mechanism in ρ⁰ cells that downregulates the expression of entirely nuclear encoded components of mitochondrial energy metabolism.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hongfen, E-mail: wanghongfen11@163.com; Wang, Zhiqi; Chen, Shougang

    Molybdenum carbides with surfactants as carbon sources were prepared using the carbothermal reduction of the appropriate precursors (molybdenum oxides deposited on surfactant micelles) at 1023 K under hydrogen gas. The carburized products were characterized using scanning electron microscopy (SEM), X-ray diffraction and BET surface area measurements. From the SEM images, hollow microspherical and rod-like molybdenum carbides were observed. X-ray diffraction patterns showed that the annealing time of carburization had a large effect on the conversion of molybdenum oxides to molybdenum carbides. BET surface area measurements indicated that the choice of carbon source produced a large difference in the specific surface areas of the molybdenum carbides. - Graphical abstract: Molybdenum carbides having hollow microspherical and hollow rod-like morphologies that are different from the conventional monodispersed platelet-like morphologies. Highlights: ► Molybdenum carbides were prepared using surfactants as carbon sources. ► The kinds of surfactants affected the morphologies of molybdenum carbides. ► The time of heat preservation at 1023 K affected the carburization process. ► Molybdenum carbides with hollow structures had larger specific surface areas.

  9. Coupling individual kernel-filling processes with source-sink interactions into GREENLAB-Maize.

    PubMed

    Ma, Yuntao; Chen, Youjia; Zhu, Jinyu; Meng, Lei; Guo, Yan; Li, Baoguo; Hoogenboom, Gerrit

    2018-02-13

    Failure to account for the variation of kernel growth in a cereal crop simulation model may cause serious deviations in the estimates of crop yield. The goal of this research was to revise the GREENLAB-Maize model to incorporate source- and sink-limited allocation approaches to simulate the dry matter accumulation of individual kernels of an ear (GREENLAB-Maize-Kernel). The model used potential individual kernel growth rates to characterize the individual potential sink demand. The remobilization of non-structural carbohydrates from reserve organs to kernels was also incorporated. Two years of field experiments were conducted to determine the model parameter values and to evaluate the model using two maize hybrids with different plant densities and pollination treatments. Detailed observations were made on the dimensions and dry weights of individual kernels and other above-ground plant organs throughout the seasons. Three basic traits characterizing an individual kernel were compared on simulated and measured individual kernels: (1) final kernel size; (2) kernel growth rate; and (3) duration of kernel filling. Simulations of individual kernel growth closely corresponded to experimental data. The model was able to reproduce the observed dry weight of plant organs well. Then, the source-sink dynamics and the remobilization of carbohydrates for kernel growth were quantified to show that remobilization processes accompanied source-sink dynamics during the kernel-filling process. We conclude that the model may be used to explore options for optimizing plant kernel yield by matching maize management to the environment, taking into account responses at the level of individual kernels. © The Author(s) 2018. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
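
    The source- and sink-limited allocation idea can be caricatured in a few lines. The sketch below is not GREENLAB-Maize itself; the growth rates, assimilate supply, and reserve values are invented solely to show how potential kernel demand is either met in full (sink-limited) or scaled back to the available assimilate plus remobilized reserves (source-limited).

      import numpy as np

      def fill_kernels(potential_rates, supply, reserves):
          """One day's dry-matter allocation to individual kernels (mg)."""
          demand = potential_rates.sum()
          available = supply + reserves                      # assimilate + remobilization
          if demand <= available:
              return potential_rates, available - demand     # sink-limited: demand fully met
          return potential_rates * available / demand, 0.0   # source-limited: proportional cut

      rng = np.random.default_rng(2)
      rates = rng.uniform(0.5, 1.5, size=500)        # potential growth per kernel, mg/day
      growth, leftover = fill_kernels(rates, supply=550.0, reserves=60.0)
      print(f"mean kernel growth {growth.mean():.2f} mg/day, unused supply {leftover:.1f} mg")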

  10. Unconventional protein sources: apricot seed kernels.

    PubMed

    Gabrial, G N; El-Nahry, F I; Awadalla, M Z; Girgis, S M

    1981-09-01

    Hamawy apricot seed kernels (sweet), Amar apricot seed kernels (bitter) and treated Amar apricot kernels (bitterness removed) were evaluated biochemically. All kernels were found to be high in fat (42.2--50.91%), protein (23.74--25.70%) and fiber (15.08--18.02%). Phosphorus, calcium, and iron were determined in all experimental samples. The three different apricot seed kernels were used for extensive study including the qualitative determination of the amino acid constituents by acid hydrolysis, quantitative determination of some amino acids, and biological evaluation of the kernel proteins in order to use them as new protein sources. Weanling albino rats failed to grow on diets containing the Amar apricot seed kernels due to low food consumption because of their bitterness. There was no loss in weight in that case. The Protein Efficiency Ratio data and blood analysis results showed the Hamawy apricot seed kernels to be higher in biological value than treated apricot seed kernels. The Net Protein Ratio data, which account for both weight maintenance and growth, showed the treated apricot seed kernels to be higher in biological value than both Hamawy and Amar kernels. The Net Protein Ratio values for the last two kernels were nearly equal.

  11. A spatial analysis of health-related resources in three diverse metropolitan areas

    PubMed Central

    Smiley, Melissa J.; Diez Roux, Ana V.; Brines, Shannon J.; Brown, Daniel G.; Evenson, Kelly R.; Rodriguez, Daniel A.

    2010-01-01

    Few studies have investigated the spatial clustering of multiple health-related resources. We constructed 0.5-mile kernel densities of resources for census areas in New York City, NY (n=819 block groups), Baltimore, MD (n=737), and Winston-Salem, NC (n=169). Three of the four resource densities (supermarkets/produce stores, retail areas, and recreational facilities) tended to be correlated with each other, whereas park density was less consistently and sometimes negatively correlated with the others. Blacks were more likely to live in block groups with multiple low resource densities. Spatial regression models showed that block groups with higher proportions of black residents tended to have lower supermarket/produce, retail, and recreational facility densities, although these associations did not always achieve statistical significance. A measure that combined local and neighboring block group racial composition was often a stronger predictor of resources than the local measure alone. Overall, our results from three diverse U.S. cities show that health-related resources are not randomly distributed across space and that disadvantage in multiple domains often clusters with residential racial patterning. PMID:20478737
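
    As a rough illustration of the kernel-density measure described above (not the study's actual GIS workflow), the sketch below evaluates a quartic (biweight) kernel density of resource points at block-group centroids with a 0.5-mile bandwidth; the coordinates, kernel choice, and units are assumptions.

      import numpy as np

      def kernel_density(centroids, resources, bandwidth=0.5):
          """Quartic (biweight) kernel density of resource points at each centroid."""
          d = np.linalg.norm(centroids[:, None, :] - resources[None, :, :], axis=2)
          u = np.clip(d / bandwidth, 0.0, 1.0)               # contributions vanish beyond bandwidth
          k = (3.0 / (np.pi * bandwidth**2)) * (1.0 - u**2) ** 2
          return k.sum(axis=1)                               # resources per square mile

      rng = np.random.default_rng(3)
      centroids = rng.uniform(0, 5, size=(10, 2))    # block-group centroids (miles, fictitious)
      stores = rng.uniform(0, 5, size=(40, 2))       # supermarket/produce store locations
      print(kernel_density(centroids, stores))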

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinho, Graca; Pires, Ana, E-mail: ana.lourenco.pires@gmail.com; Saraiva, Luanha

    Highlights: ► The article shows WEEE plastics characterization from a recycling unit in Portugal. ► The recycling unit has low machinery, with hand sorting of plastics elements. ► Most common polymers are PS, ABS, PC/ABS, HIPS and PP. ► Most plastics found have no identification of plastic type or flame retardants. ► Ecodesign is still not practiced for EEE, with repercussions in end of life stage. - Abstract: This paper describes a direct analysis study carried out in a recycling unit for waste electrical and electronic equipment (WEEE) in Portugal to characterize the plastic constituents of WEEE. Approximately 3400 items, including cooling appliances, small WEEE, printers, copying equipment, central processing units, cathode ray tube (CRT) monitors and CRT televisions were characterized, with the analysis finding around 6000 kg of plastics with several polymer types. The most common polymers are polystyrene, acrylonitrile-butadiene-styrene, polycarbonate blends, high-impact polystyrene and polypropylene. Additives to darken color are common contaminants in these plastics when used in CRT televisions and small WEEE. These additives can make plastic identification difficult, along with missing polymer identification and flame retardant identification marks. These drawbacks contribute to the inefficiency of manual dismantling of WEEE, which is the typical recycling process in Portugal. The information found here can be used to set a baseline for the plastics recycling industry and provide information for ecodesign in electrical and electronic equipment production.

  13. Human Nanog pseudogene8 promotes the proliferation of gastrointestinal cancer cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uchino, Keita, E-mail: uchino13@intmed1.med.kyushu-u.ac.jp; Hirano, Gen; Hirahashi, Minako

    2012-09-10

    There is emerging evidence that human solid tumor cells originate from cancer stem cells (CSCs). In cancer cell lines, tumor-initiating CSCs are mainly found in the side population (SP) that has the capacity to extrude dyes such as Hoechst 33342. We found that Nanog is expressed specifically in SP cells of human gastrointestinal (GI) cancer cells. Nucleotide sequencing revealed that NanogP8 but not Nanog was expressed in GI cancer cells. Transfection of NanogP8 into GI cancer cell lines promoted cell proliferation, while its inhibition by anti-Nanog siRNA suppressed the proliferation. Immunohistochemical staining of primary GI cancer tissues revealed NanogP8 protein to be strongly expressed in 3 out of 60 cases. In these cases, NanogP8 was found especially in an infiltrative part of the tumor, in proliferating cells with Ki67 expression. These data suggest that NanogP8 is involved in GI cancer development in a fraction of patients, in whom it presumably acts by supporting CSC proliferation. -- Highlights: ► Nanog maintains pluripotency by regulating embryonic stem cells differentiation. ► Nanog is expressed in cancer stem cells of human gastrointestinal cancer cells. ► Nucleotide sequencing revealed that Nanog pseudogene8 but not Nanog was expressed. ► Nanog pseudogene8 promotes cancer stem cells proliferation. ► Nanog pseudogene8 is involved in gastrointestinal cancer development.

  14. Experimental and numerical analysis of metal leaching from fly ash-amended highway bases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cetin, Bora; Aydilek, Ahmet H., E-mail: aydilek@umd.edu; Li, Lin

    2012-05-15

    Highlights: ► This study is the evaluation of leaching potential of fly ash-lime mixed soils. ► This objective is met with experimental and numerical analysis. ► Zn leaching decreases with increase in fly ash content while Ba, B, Cu increases. ► Decrease in lime content promoted leaching of Ba, B and Cu while Zn increases. ► Numerical analysis predicted lower field metal concentrations. - Abstract: A study was conducted to evaluate the leaching potential of unpaved road materials (URM) mixed with lime activated high carbon fly ashes and to evaluate groundwater impacts of barium, boron, copper, and zinc leaching. This objective was met by a combination of batch water leach tests, column leach tests, and computer modeling. The laboratory tests were conducted on soil alone, fly ash alone, and URM-fly ash-lime kiln dust mixtures. The results indicated that an increase in fly ash and lime content has significant effects on leaching behavior of heavy metals from URM-fly ash mixture. An increase in fly ash content and a decrease in lime content promoted leaching of Ba, B and Cu whereas Zn leaching was primarily affected by the fly ash content. Numerically predicted field metal concentrations were significantly lower than the peak metal concentrations obtained in laboratory column leach tests, and field concentrations decreased with time and distance due to dispersion in soil vadose zone.

  15. Bio-processing of solid wastes and secondary resources for metal extraction - A review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jae-chun; Pandey, Banshi Dhar, E-mail: bd_pandey@yahoo.co.uk; CSIR - National Metallurgical Laboratory, Jamshedpur 831007

    2012-01-15

    Highlights: ► Review focuses on bio-extraction of metals from solid wastes of industries and consumer goods. ► Bio-processing of certain effluents/wastewaters with metals is also included in brief. ► Quantity/composition of wastes are assessed, and microbes used and leaching conditions included. ► Bio-recovery using bacteria, fungi and archaea is highlighted for resource recycling. ► Process methodology/mechanism, R&D direction and scope of large scale use are briefly included. - Abstract: Metal containing wastes/byproducts of various industries, used consumer goods, and municipal waste are potential pollutants, if not treated properly. They may also be important secondary resources if processed in an eco-friendly manner for secured supply of contained metals/materials. Bio-extraction of metals from such resources with microbes such as bacteria, fungi and archaea is being increasingly explored to meet the twin objectives of resource recycling and pollution mitigation. This review focuses on the bio-processing of solid wastes/byproducts of metallurgical and manufacturing industries, chemical/petrochemical plants, electroplating and tanning units, besides sewage sludge and fly ash of municipal incinerators, electronic wastes (e-wastes/PCBs), used batteries, etc. An assessment has been made to quantify the wastes generated and their compositions, microbes used, metal leaching efficiency etc. Processing of certain effluents and wastewaters containing metals is also included in brief. Future directions of research are highlighted.

  16. A new approach to criteria for health risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spickett, Jeffery, E-mail: J.Spickett@curtin.edu.au; Faculty of Health Sciences, School of Public Health, Curtin University, Perth, Western Australia; Katscherian, Dianne

    2012-01-15

    Health Impact Assessment (HIA) is a developing component of the overall impact assessment process and as such needs access to procedures that can enable more consistent approaches to the stepwise process that is now generally accepted in both EIA and HIA. The guidelines developed during this project provide a structured process, based on risk assessment procedures which use consequences and likelihood, as a way of ranking risks to adverse health outcomes from activities subjected to HIA or HIA as part of EIA. The aim is to assess the potential for both acute and chronic health outcomes. The consequences component also identifies a series of consequences for the health care system, depicted as expressions of financial expenditure and the capacity of the health system. These more specific health risk assessment characteristics should provide for a broader consideration of health consequences and a more consistent estimation of the adverse health risks of a proposed development at both the scoping and risk assessment stages of the HIA process. - Highlights: ► A more objective approach to health risk assessment is provided. ► An objective set of criteria for the consequences for chronic and acute impacts. ► An objective set of criteria for the consequences on the health care system. ► An objective set of criteria for event frequency that could impact on health. ► The approach presented is currently being trialled in Australia.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oh, Yohan; Chung, Kwang Chul, E-mail: kchung@yonsei.ac.kr

    Highlights: ► ZNF131 directly interacts with ERα. ► The binding affinity of ZNF131 to ERα increases upon E2 stimulation. ► ZNF131 inhibits ERα-mediated trans-activation by suppressing its homo-dimerization. ► ZNF131 inhibits ERα-dimerization and E2-induced breast cancer cell proliferation. ► ZNF131 inhibits estrogen signaling by acting as an ERα-co-repressor. -- Abstract: Steroid hormone estrogen elicits various physiological functions, many of which are mediated through two structurally and functionally distinct estrogen receptors, ERα and ERβ. The functional role of zinc finger protein 131 (ZNF131) is poorly understood, but it is assumed to possess transcriptional regulation activity due to the presence of a DNA binding motif. A few recent reports, including ours, revealed that ZNF131 acts as a negative regulator of ERα and that SUMO modification potentiates the negative effect of ZNF131 on estrogen signaling. However, its molecular mechanism for ERα inhibition has not been elucidated in detail. Here, we demonstrate that ZNF131 directly interacts with ERα, which consequently inhibits ERα-mediated trans-activation by suppressing its homo-dimerization. Moreover, we show that the C-terminal region of ZNF131 containing the SUMOylation site is necessary for its inhibition of estrogen signaling. Taken together, these data suggest that ZNF131 inhibits estrogen signaling by acting as an ERα-co-repressor.

  18. Study of structural, elastic, electronic and optical properties of seven SrZrO{sub 3} phases: First-principles calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Qi-Jun, E-mail: dianerliu@yahoo.com.cn; Liu, Zheng-Tang; Feng, Li-Ping

    2012-12-15

    Using the plane-wave ultrasoft pseudopotential technique based on first-principles density functional theory (DFT), we calculated the structural, elastic, electronic and optical properties of seven different phases of SrZrO₃. The obtained ground-state properties are in good agreement with previous experiments and calculations, and indicate that the most stable phase is the orthorhombic Pnma structure. The seven phases of SrZrO₃, with cubic, tetragonal and orthorhombic structures, are mechanically stable. The mechanical and thermodynamic properties have been obtained using the Voigt-Reuss-Hill approach and the Debye-Grüneisen model. The electronic structures and optical properties are obtained and compared with the available experimental and theoretical data. - Graphical abstract: Energy versus volume for the seven SrZrO₃ phases shows that the Pnma phase has the minimum ground-state energy. Highlights: ► We calculated the physical and chemical properties of seven SrZrO₃ polymorphs. ► The order of stability is Pnma > Imma > Cmcm > I4/mcm > P4/mbm > P4mm > Pm-3m. ► The most stable phase is the orthorhombic Pnma structure. ► The seven phases of SrZrO₃ are mechanically stable. ► The relationship between n and ρ_m is n = 1 + 0.18ρ_m.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doi, Keiko; Central Research Institute for Advanced Molecular Medicine, Fukuoka University, Fukuoka; Central Research Institute of Life Sciences for the Next Generation of Women Scientists, Fukuoka University, Fukuoka

    Highlights: ► We generated Cd4-Cre-mediated T cell-specific Zfat-deficient mice. ► Zfat-deficiency leads to reduction in the number of the peripheral T cells. ► Impaired T cell receptor-mediated response in Zfat-deficient peripheral T cells. ► Decreased expression of IL-7Rα, IL-2Rα and IL-2 in Zfat-deficient peripheral T cells. ► Zfat plays critical roles in peripheral T cell homeostasis. -- Abstract: ZFAT, originally identified as a candidate susceptibility gene for autoimmune thyroid disease, has been reported to be involved in apoptosis, development and primitive hematopoiesis. Zfat is highly expressed in T- and B-cells in the lymphoid tissues; however, its physiological function in the immune system remains totally unknown. Here, we generated the T cell-specific Zfat-deficient mice and demonstrated that Zfat-deficiency leads to a remarkable reduction in the number of the peripheral T cells. Intriguingly, a reduced expression of IL-7Rα and the impaired responsiveness to IL-7 for the survival were observed in the Zfat-deficient T cells. Furthermore, a severe defect in proliferation and increased apoptosis in the Zfat-deficient T cells following T cell receptor (TCR) stimulation was observed with a reduced IL-2Rα expression as well as a reduced IL-2 production. Thus, our findings reveal that Zfat is a critical regulator in peripheral T cell homeostasis and its TCR-mediated response.

  20. Knowledge, data and interests: Challenges in participation of diverse stakeholders in HIA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Negev, Maya, E-mail: negevm@bgu.ac.il

    2012-02-15

    Stakeholder participation is considered an integral part of HIA. However, the challenges that participation implies in a multi-disciplinary and multi-ethnic society are less studied. This paper presents the manifestations of the multiplicity of sectors and population groups in HIA and discusses the challenges that such diversity imposes. Specifically, there is no common ground between participants, as their positions entail contradictory knowledge regarding the current situation, reliance on distinct data and conflicting interests. This entails usage of multiple professional and ethnic languages, disagreements regarding the definition of health and prioritizing health issues in HIA, and divergent perceptions of risk. These differences between participants are embedded culturally, socially, individually and, maybe most importantly, professionally. This complex picture of diverse stakeholder attributes is grounded in a case study of stakeholder participation in HIA, regarding zoning of a hazardous industry site in Israel. The implication is that participatory HIAs should address the multiplicity of stakeholders and types of knowledge, data and interests in a more comprehensive way. - Highlights: ► This paper analyses challenges in participation of diverse stakeholders in HIA. ► The multiplicity of disciplines and population groups raises fundamental challenges. ► Stakeholders possess distinct and often contradictory knowledge, data and interests. ► They speak different languages, and differ on approaches to health and risk perceptions. ► Substantial amendments to diverse participation are needed, in HIA and generally.

  1. Rapid dimerization of quercetin through an oxidative mechanism in the presence of serum albumin decreases its ability to induce cytotoxicity in MDA-MB-231 cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pham, Anh; Bortolazzo, Anthony; White, J. Brandon, E-mail: Brandon.White@sjsu.edu

    Highlights: ► Quercetin cannot be detected intracellularly despite killing MDA-MB-231 cells. ► Quercetin forms a heterodimer through oxidation in media with serum. ► The quercetin heterodimer does not kill MDA-MB-231 cells. ► Ascorbic acid stabilizes quercetin, increasing cell death in quercetin-treated cells. ► Quercetin, and not a modified form, is responsible for apoptosis and cell death. -- Abstract: Quercetin is a member of the flavonoid family and has been previously shown to have a variety of anti-cancer activities. We and others have reported anti-proliferation, cell cycle arrest, and induction of apoptosis of cancer cells after treatment with quercetin. Quercetin has also been shown to undergo oxidation. However, it is unclear if quercetin or one of its oxidized forms is responsible for cell death. Here we report that quercetin rapidly oxidized in cell culture media to form a dimer. The quercetin dimer is identical to a dimer that is naturally produced by onions. The quercetin dimer and quercetin-3-O-glucopyranoside are unable to cross the cell membrane and do not kill MDA-MB-231 cells. Finally, supplementing the media with ascorbic acid increases quercetin's ability to induce cell death, probably by reducing oxidative dimerization. Our results suggest that an unmodified quercetin is the compound that elicits cell death.

  2. Expression and immunogenicity of novel subunit enterovirus 71 VP1 antigens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Juan; Department of Microbiology and Immunology, Nanjing Medical University; Wang, Shixia

    Highlights: ► EV71 is a major emerging infectious disease in many Asian countries. ► Inactivated EV71 vaccines are in clinical studies but their safety and efficacy are unknown. ► Developing subunit based EV71 vaccines is significant and novel antigen design is needed. ► DNA immunization is an efficient tool to test the immunogenicity of VP1 based EV71 vaccines. ► Multiple VP1 antigens are developed showing immunogenic potential. -- Abstract: Hand, foot, and mouth disease (HFMD) is a common viral illness in young children. HFMD is caused by viruses belonging to the enterovirus genus of the picornavirus family. Recently, enterovirus 71 (EV71) has emerged as a virulent agent for HFMD with severe clinical outcomes. In the current report, we conducted a pilot antigen engineering study to optimize the expression and immunogenicity of subunit VP1 antigen for the design of EV71 vaccines. DNA immunization was adopted as a simple technical approach to test different designs of VP1 antigens without the need to express VP1 protein in vitro first. Our studies indicated that the expression and immunogenicity of VP1 protein can be improved with alternated VP1 antigen designs. Data presented in the current report revealed novel pathways to optimize the design of VP1 antigen-based EV71 vaccines.

  3. Reversible immortalization of Nestin-positive precursor cells from pancreas and differentiation into insulin-secreting cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Pei; Li, Li; Qi, Hui

    2012-02-10

    Highlights: ► The NPPCs from mouse pancreas were isolated. ► A Tet-on system for SV40 large T-antigen in NPPCs was used to obtain RINPPCs. ► The RINPPCs can undergo at least 80 population doublings without senescence. ► The RINPPCs can be induced to differentiate into insulin-producing cells. ► The combination of GLP-1 and sodium butyrate promoted the differentiation process. -- Abstract: Pancreatic stem cells or progenitor cells possess the ability of directed differentiation into pancreatic β cells. However, these cells usually have limited proliferative capacity and finite lifespan in vitro. In the present study, Nestin-positive progenitor cells (NPPCs), which express the pancreatic stem/progenitor cell marker Nestin, were isolated from mouse pancreas to obtain a sufficient number of differentiated pancreatic β cells. A Tet-on system for SV40 large T-antigen expression in NPPCs was used to achieve reversible immortalization. The reversible immortal Nestin-positive progenitor cells (RINPPCs) can undergo at least 80 population doublings without senescence in vitro while maintaining their biological and genetic characteristics. RINPPCs can be efficiently induced to differentiate into insulin-producing cells with a combination of glucagon-like peptide-1 (GLP-1) and sodium butyrate. The results of the present study can be used to explore transplantation therapy of type I diabetes mellitus.

  4. Comparing urban solid waste recycling from the viewpoint of urban metabolism based on physical input-output model: A case of Suzhou in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang Sai, E-mail: liangsai09@gmail.com; Zhang Tianzhu, E-mail: zhangtz@mail.tsinghua.edu.cn

    Highlights: ► Impacts of solid waste recycling on Suzhou's urban metabolism in 2015 are analyzed. ► Sludge recycling for biogas is regarded as an accepted method. ► Technical levels of reusing scrap tires and food wastes should be improved. ► Other fly ash utilization methods should be exploited. ► Secondary wastes from reusing food wastes and sludge should be a concern. - Abstract: Investigating the impacts of urban solid waste recycling on urban metabolism contributes to sustainable urban solid waste management and urban sustainability. Using a physical input-output model and scenario analysis, the urban metabolism of Suzhou in 2015 is predicted and the impacts of four categories of solid waste recycling on urban metabolism are illustrated: scrap tire recycling, food waste recycling, fly ash recycling and sludge recycling. Sludge recycling has positive effects on reducing all material flows. Thus, sludge recycling for biogas is regarded as an accepted method. Moreover, the technical levels of scrap tire recycling and food waste recycling should be improved to produce positive effects on reducing more material flows. Fly ash recycling for cement production has negative effects on reducing all material flows except solid wastes. Thus, other fly ash utilization methods should be exploited. In addition, the utilization and treatment of secondary wastes from food waste recycling and sludge recycling should be given attention.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu Qin, E-mail: zhuqin@fudan.edu.cn; Peng Xizhe, E-mail: xzpeng@fudan.edu.cn

    This study examines the impacts of population size, population structure, and consumption level on carbon emissions in China from 1978 to 2008. To this end, we expanded the stochastic impacts by regression on population, affluence, and technology model and used the ridge regression method, which overcomes the negative influences of multicollinearity among independent variables under acceptable bias. Results reveal that changes in consumption level and population structure were the major impact factors, not changes in population size. Consumption level and carbon emissions were highly correlated. In terms of population structure, urbanization, population age, and household size had distinct effects on carbon emissions. Urbanization increased carbon emissions, while the effect of age acted primarily through the expansion of the labor force and consequent overall economic growth. Shrinking household size increased residential consumption, resulting in higher carbon emissions. Households, rather than individuals, are a more reasonable explanation for the demographic impact on carbon emissions. Potential social policies for low carbon development are also discussed. - Highlights: ► We examine the impacts of population change on carbon emissions in China. ► We expand the STIRPAT model by containing population structure factors in the model. ► The population structure includes age structure, urbanization level, and household size. ► The ridge regression method is used to estimate the model with multicollinearity. ► The population structure plays a more important role compared with the population size.

  6. Inverted repeats in the promoter as an autoregulatory sequence for TcrX in Mycobacterium tuberculosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhattacharya, Monolekha; Das, Amit Kumar, E-mail: amitk@hijli.iitkgp.ernet.in

    Highlights: ► The regulatory sequences recognized by TcrX have been identified. ► The regulatory region comprises inverted repeats separated by a ~30 bp region. ► The mode of binding of TcrX to the regulatory sequence is unique. ► The in silico TcrX-DNA docked model binds one of the inverted repeats. ► Both phosphorylated and unphosphorylated TcrX bind the regulatory sequence in vitro. -- Abstract: TcrY, a histidine kinase, and TcrX, a response regulator, constitute a two-component system in Mycobacterium tuberculosis. tcrX, which is expressed during iron scarcity, is instrumental in the survival of iron-dependent M. tuberculosis. However, the regulator of tcrX/Y has not been fully characterized. Crosslinking studies of TcrX reveal that it can form oligomers in vitro. Electrophoretic mobility shift assays (EMSAs) show that TcrX recognizes two regions in the promoter that are composed of inverted repeats separated by ~30 bp. The dimeric in silico model of TcrX predicts binding to one of these inverted repeat regions. Site-directed mutagenesis and radioactive phosphorylation indicate that D54 of TcrX is phosphorylated by H256 of TcrY. However, phosphorylated and unphosphorylated TcrX bind the regulatory sequence with equal efficiency, which was shown with an EMSA using the D54A TcrX mutant.

  7. SIRT1 interacts with and protects glyceraldehyde-3-phosphate dehydrogenase (GAPDH) from nuclear translocation: Implications for cell survival after irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joo, Hyun-Yoo; Laboratory of Biochemistry, School of Life Sciences and Biotechnology, Korea University, Seoul 136-713; Woo, Seon Rang

    2012-08-10

    Highlights: ► SIRT1 serves to retain GAPDH in the cytosol, preventing GAPDH nuclear translocation. ► When SIRT1 is depleted, GAPDH translocation occurs even in the absence of stress. ► Upon irradiation, SIRT1 interacts with GAPDH. ► SIRT1 prevents irradiation-induced nuclear translocation of GAPDH. ► SIRT1 presence rather than activity is essential for inhibiting GAPDH translocation. -- Abstract: Upon apoptotic stimulation, glyceraldehyde-3-phosphate dehydrogenase (GAPDH), a cytosolic enzyme normally active in glycolysis, translocates into the nucleus and activates an apoptotic cascade therein. In the present work, we show that SIRT1 prevents nuclear translocation of GAPDH via interaction with GAPDH. SIRT1 depletion triggered nuclear translocation of cytosolic GAPDH even in the absence of apoptotic stress. Such translocation was not, however, observed when SIRT1 enzymatic activity was inhibited, indicating that the SIRT1 protein per se, rather than the deacetylase activity of the protein, is required to inhibit GAPDH translocation. Upon irradiation, SIRT1 prevented irradiation-induced nuclear translocation of GAPDH, accompanied by interaction of SIRT1 and GAPDH. Thus, SIRT1 functions to retain GAPDH in the cytosol, protecting the enzyme from nuclear translocation via the interaction between the two proteins. This serves as a mechanism whereby SIRT1 regulates cell survival upon induction of apoptotic stress by means that include irradiation.

  8. The effect of inertia on the Dirac electron, the spin Hall current and the momentum space Berry curvature

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chowdhury, Debashree, E-mail: debashreephys@gmail.com; Basu, B., E-mail: sribbasu@gmail.com

    2013-02-15

    We have studied the spin dependent force and the associated momentum space Berry curvature in an accelerating system. The results are derived by taking into consideration the non-relativistic limit of a generally covariant Dirac equation with an electromagnetic field present, where the methodology of the Foldy-Wouthuysen transformation is applied to achieve the non-relativistic limit. Spin currents appear due to the combined action of the external electric field, the crystal field and the induced inertial electric field via the total effective spin-orbit interaction. In an accelerating frame, the crucial role of momentum space Berry curvature in the spin dynamics has also been addressed from the perspective of spin Hall conductivity. For time dependent acceleration, the expression for the spin polarization has been derived. - Highlights: ► We study the effect of acceleration on the Dirac electron in the presence of an electromagnetic field, where the acceleration induces an electric field. ► Spin currents appear due to the total effective electric field via the total spin-orbit interaction. ► We derive the expression for the spin dependent force and the spin Hall current, which is zero for a particular acceleration. ► The role of the momentum space Berry curvature in an accelerating system is discussed. ► An expression for the spin polarization for time dependent acceleration is derived.

  9. Microscopic heat pulses induce contraction of cardiomyocytes without calcium transients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oyama, Kotaro; Mizuno, Akari; Shintani, Seine A.

    Highlights: ► An infra-red laser beam generates microscopic heat pulses. ► Heat pulses induce contraction of cardiomyocytes. ► Ca²⁺ transients during the contraction were not detected. ► Skinned cardiomyocytes in Ca²⁺-free solution also contracted. ► Heat pulses regulated the contractions without Ca²⁺ dynamics. -- Abstract: It was recently demonstrated that laser irradiation can control the beating of cardiomyocytes and hearts; however, the precise mechanism remains to be clarified. Among the effects induced by laser irradiation on biological tissues, temperature change is one possible effect which can alter physiological functions. Therefore, we investigated the mechanism by which heat pulses, produced by infra-red laser light under an optical microscope, induce contractions of cardiomyocytes. Here we show that microscopic heat pulses induce contraction of rat adult cardiomyocytes. The temperature increase, ΔT, required for inducing contraction of cardiomyocytes was dependent upon the ambient temperature; that is, ΔT at physiological temperature was lower than that at room temperature. Ca²⁺ transients, which are usually coupled to contraction, were not detected. We confirmed that the contractions of skinned cardiomyocytes were induced by the heat pulses even in Ca²⁺-free solution. This heat pulse-induced Ca²⁺-decoupled contraction technique has the potential to stimulate heart and skeletal muscles in a manner different from the conventional electrical stimulations.

  10. Cathodoluminescence microscopy and petrographic image analysis of aggregates in concrete pavements affected by alkali-silica reaction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stastna, A., E-mail: astastna@gmail.com; Sachlova, S.; Pertold, Z.

    2012-03-15

    Various microscopic techniques (cathodoluminescence, polarizing and electron microscopy) were combined with image analysis with the aim of determining a) the modal composition and degradation features within the concrete, and b) the petrographic characteristics and the geological types (rocks, and their provenance) of the aggregates. Concrete samples were taken from five different portions of Highway Nos. D1, D11, and D5 (the Czech Republic). Coarse and fine aggregates were found to be primarily composed of volcanic, plutonic, metamorphic and sedimentary rocks, as well as of quartz and feldspar aggregates of variable origins. The alkali-silica reaction was observed to be the main degradation mechanism, based upon the presence of microcracks and alkali-silica gels in the concrete. Use of cathodoluminescence enabled the identification of the source materials of the quartz aggregates, based upon their CL characteristics (i.e., color, intensity, microfractures, deformation, and zoning), which is difficult to achieve employing polarizing and electron microscopy alone. - Highlights: ► ASR in concrete pavements on Highway Nos. D1, D5 and D11 (Czech Republic). ► Cathodoluminescence was combined with various microscopic techniques and image analysis. ► ASR was attributed to aggregates. ► Source materials of aggregates were identified based on cathodoluminescence characteristics. ► Quartz comes from different volcanic, plutonic and metamorphic parent rocks.

  11. The Use of Image-Spectroscopy Technology as a Diagnostic Method for Seed Health Testing and Variety Identification

    PubMed Central

    Vrešak, Martina; Halkjaer Olesen, Merete; Gislum, René; Bavec, Franc; Ravn Jørgensen, Johannes

    2016-01-01

    Application of rapid and time-efficient health diagnostic and identification technology in the seed industry chain could accelerate required analysis, characteristic description and also ultimately availability of new desired varieties. The aim of the study was to evaluate the potential of multispectral imaging and single kernel near-infrared spectroscopy (SKNIR) for determination of seed health and variety separation of winter wheat (Triticum aestivum L.) and winter triticale (Triticosecale Wittm. & Camus). The analysis, carried out in autumn 2013 at AU-Flakkebjerg, Denmark, included nine winter triticale varieties and 27 wheat varieties provided by the Faculty of Agriculture and Life Sciences Maribor, Slovenia. Fusarium sp. and black point disease-infected parts of the seed surface could successfully be distinguished from uninfected parts with use of a multispectral imaging device (405–970 nm wavelengths). SKNIR was applied in this research to differentiate all 36 involved varieties based on spectral differences due to variation in the chemical composition. The study thus successfully distinguished infected from uninfected parts of the seed surface and, furthermore, was able to distinguish between varieties. Together these components could be used in further studies for the development of a sorting model by combining data from multispectral imaging and SKNIR for identifying disease(s) and varieties. PMID:27010656

  12. An introduction to kernel-based learning algorithms.

    PubMed

    Müller, K R; Mika, S; Rätsch, G; Tsuda, K; Schölkopf, B

    2001-01-01

    This paper provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis, as examples for successful kernel-based learning methods. We first give a short background about Vapnik-Chervonenkis theory and kernel feature spaces and then proceed to kernel based learning in supervised and unsupervised scenarios including practical and algorithmic considerations. We illustrate the usefulness of kernel algorithms by discussing applications such as optical character recognition and DNA analysis.
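
    A minimal, runnable sketch (not from the paper) of two of the surveyed methods, kernel PCA for unsupervised feature extraction and an RBF-kernel support vector machine for classification; the toy dataset and hyperparameters are assumptions.

```python
# Hedged sketch: kernel PCA and an RBF-kernel SVM on a synthetic two-moons
# dataset, illustrating the kernel-based learning methods surveyed in the record.
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_moons(n_samples=400, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Kernel PCA: nonlinear feature extraction in an RBF kernel feature space.
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=5.0)
Z_tr = kpca.fit_transform(X_tr)
print("kernel PCA coordinates:", Z_tr.shape)

# SVM with the same kernel for supervised classification.
svm = SVC(kernel="rbf", gamma=5.0, C=1.0).fit(X_tr, y_tr)
print("SVM test accuracy:", svm.score(X_te, y_te))
```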

  13. Inequalities in the Educational Experiences of Black and White Americans, Background Paper.

    ERIC Educational Resources Information Center

    Chadima, Steven; Wabnick, Richard

    There are inequalities in the educational experiences of blacks and whites. Black students tend to have lower grade point averages than do white students. Also, they are suspended more often and for longer spells than whites. Fewer blacks remain in secondary school beyond the compulsory attendance age, fewer graduate from high school, and fewer…

  14. Social Class, School and Non-School Environments, and Black/White Inequalities in Children's Learning

    ERIC Educational Resources Information Center

    Condron, Dennis J.

    2009-01-01

    As social and economic stratification between black and white Americans persists at the dawn of the twenty-first century, disparities in educational outcomes remain an especially formidable barrier. Recent research on the black/white achievement gap points to a perplexing pattern in this regard. Schools appear to exacerbate black/white disparities…

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bhaskar,; Kumari, Neeti; Goyal, Neena, E-mail: neenacdri@yahoo.com

    Highlights: ► The study presents cloning and characterization of the TCP1γ gene from L. donovani. ► TCP1γ is a subunit of T-complex protein-1 (TCP1), a chaperonin class of protein. ► LdTCP1γ exhibited differential expression in different stages of promastigotes. ► LdTCP1γ co-localized with actin, a cytoskeleton protein. ► The data suggest that this gene may have a role in differentiation/biogenesis. ► First report on this chaperonin in Leishmania. -- Abstract: The T-complex protein-1 (TCP1) complex, a chaperonin class of protein, ubiquitous in all genera of life, is involved in the intracellular assembly and folding of various proteins. The gamma subunit of the TCP1 complex (TCP1γ) plays a pivotal role in the folding and assembly of cytoskeleton protein(s), either individually or complexed with other subunits. Here, we report for the first time the cloning, characterization and expression of the TCP1γ of Leishmania donovani (LdTCP1γ), the causative agent of Indian Kala-azar. Primary sequence analysis of LdTCP1γ revealed the presence of all the characteristic features of TCP1γ. However, leishmanial TCP1γ represents a distinct kinetoplastid group, clustered in a separate branch of the phylogenetic tree. LdTCP1γ exhibited differential expression in different stages of promastigotes. The non-dividing stationary phase promastigotes exhibited 2.5-fold less expression of LdTCP1γ as compared to rapidly dividing log phase parasites. The sub-cellular distribution of LdTCP1γ was studied in log phase promastigotes by employing indirect immunofluorescence microscopy. The protein was present not only in the cytoplasm but was also localized in the nucleus, peri-nuclear region, flagella, flagellar pocket and apical region. Co-localization of LdTCP1γ with actin suggests that this gene may have a role in maintaining the structural dynamics of the cytoskeleton of the parasite.

  16. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    § 981.408 Inedible kernel. Pursuant to § 981.8, the definition of inedible kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as...

  17. Design of CT reconstruction kernel specifically for clinical lung imaging

    NASA Astrophysics Data System (ADS)

    Cody, Dianna D.; Hsieh, Jiang; Gladish, Gregory W.

    2005-04-01

    In this study we developed a new reconstruction kernel specifically for chest CT imaging. An experimental flat-panel CT scanner was used on large dogs to produce "ground-truth" reference chest CT images. These dogs were also examined using a clinical 16-slice CT scanner. We concluded from the dog images acquired on the clinical scanner that the loss of subtle lung structures was due mostly to the presence of the background noise texture when using currently available reconstruction kernels. This qualitative evaluation of the dog CT images prompted the design of a new recon kernel. This new kernel consisted of the combination of a low-pass and a high-pass kernel to produce a new reconstruction kernel, called the "Hybrid" kernel. The performance of this Hybrid kernel fell between the two kernels on which it was based, as expected. This Hybrid kernel was also applied to a set of 50 patient data sets; the analysis of these clinical images is underway. We are hopeful that this Hybrid kernel will produce clinical images with an acceptable tradeoff of lung detail, reliable HU, and image noise.
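
    An illustrative sketch only, not the authors' filter design: the "Hybrid" kernel idea of blending a low-pass (smoothing) kernel with a high-pass (sharpening) kernel, with the blend weight and kernel shapes below chosen arbitrarily.

```python
# Hedged sketch: blend a Gaussian low-pass kernel with an unsharp-mask high-pass
# kernel and apply the result to a 1-D profile. All parameters are assumptions.
import numpy as np

def gaussian_kernel(sigma, half_width=10):
    x = np.arange(-half_width, half_width + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()                      # low-pass (smoothing) kernel

def sharpening_kernel(sigma, alpha=1.5, half_width=10):
    g = gaussian_kernel(sigma, half_width)
    delta = np.zeros_like(g)
    delta[half_width] = 1.0                 # discrete identity (delta) kernel
    return delta + alpha * (delta - g)      # simple unsharp-mask high-pass kernel

w = 0.5                                     # blend weight (assumed)
hybrid = w * gaussian_kernel(2.0) + (1 - w) * sharpening_kernel(2.0)

profile = np.random.default_rng(0).normal(size=256)   # stand-in projection profile
filtered = np.convolve(profile, hybrid, mode="same")
```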

  18. A new discriminative kernel from probabilistic models.

    PubMed

    Tsuda, Koji; Kawanabe, Motoaki; Rätsch, Gunnar; Sonnenburg, Sören; Müller, Klaus-Robert

    2002-10-01

    Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel derived from tangent vectors of posterior log-odds. Furthermore, we develop a theoretical framework on feature extractors from probabilistic models and use it for analyzing the TOP kernel. In experiments, our new discriminative TOP kernel compares favorably to the Fisher kernel.
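
    The Fisher-kernel construction that the TOP kernel builds on can be sketched as follows: kernel values are inner products of score vectors, the gradients of log p(x | theta). A one-dimensional Gaussian model and an identity metric (in place of the Fisher information matrix) are simplifying assumptions here; the TOP kernel itself, built from posterior log-odds, is not reproduced.

```python
# Hedged sketch of a Fisher kernel for a fitted univariate Gaussian model.
import numpy as np

mu, sigma = 0.0, 1.0   # fitted model parameters (assumed)

def score(x):
    # Gradient of log N(x | mu, sigma^2) with respect to (mu, sigma).
    d_mu = (x - mu) / sigma**2
    d_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3
    return np.array([d_mu, d_sigma])

def fisher_kernel(x1, x2):
    # Inner product of score vectors (identity metric instead of Fisher information).
    return float(score(x1) @ score(x2))

print(fisher_kernel(0.5, -1.2))
```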

  19. Two-Dimensional Dirac Fermions Protected by Space-Time Inversion Symmetry in Black Phosphorus

    NASA Astrophysics Data System (ADS)

    Kim, Jimin; Baik, Seung Su; Jung, Sung Won; Sohn, Yeongsup; Ryu, Sae Hee; Choi, Hyoung Joon; Yang, Bohm-Jung; Kim, Keun Su

    2017-12-01

    We report the realization of novel symmetry-protected Dirac fermions in a surface-doped two-dimensional (2D) semiconductor, black phosphorus. The widely tunable band gap of black phosphorus by the surface Stark effect is employed to achieve a surprisingly large band inversion up to ˜0.6 eV . High-resolution angle-resolved photoemission spectra directly reveal the pair creation of Dirac points and their movement along the axis of the glide-mirror symmetry. Unlike graphene, the Dirac point of black phosphorus is stable, as protected by space-time inversion symmetry, even in the presence of spin-orbit coupling. Our results establish black phosphorus in the inverted regime as a simple model system of 2D symmetry-protected (topological) Dirac semimetals, offering an unprecedented opportunity for the discovery of 2D Weyl semimetals.

  20. On non-linear magnetic-charged black hole surrounded by quintessence

    NASA Astrophysics Data System (ADS)

    Nam, Cao H.

    2018-06-01

    We derive a non-linear magnetic-charged black hole surrounded by quintessence, which behaves asymptotically like the Schwarzschild black hole surrounded by quintessence but at short distances like the dS geometry. The horizon properties of this black hole are investigated in detail. The thermodynamics of the black hole is studied from both local and global viewpoints. Finally, by calculating the heat capacity and the free energy, we point out that the black hole may undergo a thermal phase transition, between a larger unstable black hole and a smaller stable black hole, at a critical temperature.

  1. Implementing Kernel Methods Incrementally by Incremental Nonlinear Projection Trick.

    PubMed

    Kwak, Nojun

    2016-05-20

    Recently, the nonlinear projection trick (NPT) was introduced, enabling direct computation of the coordinates of samples in a reproducing kernel Hilbert space. With NPT, any machine learning algorithm can be extended to a kernel version without relying on the so-called kernel trick. However, NPT is inherently difficult to implement incrementally because an ever-increasing kernel matrix must be treated as additional training samples are introduced. In this paper, an incremental version of the NPT (INPT) is proposed based on the observation that the centerization step in NPT is unnecessary. Because the proposed INPT does not change the coordinates of the old data, the coordinates obtained by INPT can directly be used in any incremental methods to implement a kernel version of the incremental methods. The effectiveness of the INPT is shown by applying it to implement incremental versions of kernel methods such as kernel singular value decomposition, kernel principal component analysis, and kernel discriminant analysis, which are utilized for problems of kernel matrix reconstruction, letter classification, and face image retrieval, respectively.
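
    A rough sketch of the batch nonlinear projection trick under an assumed RBF kernel and synthetic data: explicit sample coordinates in the kernel feature space are obtained from the kernel matrix itself, so any linear method can run on them, and a new sample's coordinates follow from its kernel values against the old data. The incremental update proposed in the paper is not reproduced.

```python
# Hedged sketch: explicit feature-space coordinates from a kernel matrix.
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

X = np.random.default_rng(1).normal(size=(50, 3))
K = rbf_kernel(X, X)

# Eigendecomposition K = V diag(lam) V^T; coordinates Y = V diag(sqrt(lam)),
# so that Y @ Y.T reproduces K.
lam, V = np.linalg.eigh(K)
keep = lam > 1e-10
Y = V[:, keep] * np.sqrt(lam[keep])          # n x r explicit feature coordinates

# A new sample's coordinates follow from its kernel values against the old data.
x_new = np.random.default_rng(2).normal(size=(1, 3))
k_new = rbf_kernel(x_new, X)                 # 1 x n
y_new = k_new @ V[:, keep] / np.sqrt(lam[keep])
```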

  2. Graph embedding and extensions: a general framework for dimensionality reduction.

    PubMed

    Yan, Shuicheng; Xu, Dong; Zhang, Benyu; Zhang, Hong-Jiang; Yang, Qiang; Lin, Stephen

    2007-01-01

    Over the past few decades, a large family of algorithms - supervised or unsupervised; stemming from statistics or geometry theory - has been designed to provide different solutions to the problem of dimensionality reduction. Despite the different motivations of these algorithms, we present in this paper a general formulation known as graph embedding to unify them within a common framework. In graph embedding, each algorithm can be considered as the direct graph embedding or its linear/kernel/tensor extension of a specific intrinsic graph that describes certain desired statistical or geometric properties of a data set, with constraints from scale normalization or a penalty graph that characterizes a statistical or geometric property that should be avoided. Furthermore, the graph embedding framework can be used as a general platform for developing new dimensionality reduction algorithms. By utilizing this framework as a tool, we propose a new supervised dimensionality reduction algorithm called Marginal Fisher Analysis in which the intrinsic graph characterizes the intraclass compactness and connects each data point with its neighboring points of the same class, while the penalty graph connects the marginal points and characterizes the interclass separability. We show that MFA effectively overcomes the limitations of the traditional Linear Discriminant Analysis algorithm due to data distribution assumptions and available projection directions. Real face recognition experiments show the superiority of our proposed MFA in comparison to LDA, also for corresponding kernel and tensor extensions.
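
    A compact sketch of the direct graph-embedding formulation described above, with a tiny made-up intrinsic graph rather than the MFA graphs of the paper: the embedding solves the generalized eigenproblem L y = λ B y, where L is the intrinsic graph Laplacian and B encodes the scale-normalization or penalty constraint.

```python
# Hedged sketch: direct graph embedding as a generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh

W = np.array([[0, 1, 1, 0],      # intrinsic graph adjacency (assumed toy graph)
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], float)
L = np.diag(W.sum(1)) - W         # graph Laplacian of the intrinsic graph

B = np.eye(4)                     # scale-normalization constraint matrix (assumed)

vals, vecs = eigh(L, B)           # generalized eigenproblem L y = lambda B y
embedding = vecs[:, 1:3]          # smallest non-trivial eigenvectors give the embedding
```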

  3. An evaluation of potential sampling locations in a reservoir with emphasis on conserved spatial correlation structure.

    PubMed

    Yenilmez, Firdes; Düzgün, Sebnem; Aksoy, Aysegül

    2015-01-01

    In this study, kernel density estimation (KDE) was coupled with ordinary two-dimensional kriging (OK) to reduce the number of sampling locations in the measurement and kriging of dissolved oxygen (DO) concentrations in Porsuk Dam Reservoir (PDR). Conservation of the spatial correlation structure in the DO distribution was a target. KDE was used as a tool to aid in identification of the sampling locations that would be removed from the sampling network in order to decrease the total number of samples. Accordingly, several networks were generated in which sampling locations were reduced from 65 to 10 in increments of 4 or 5 points at a time based on kernel density maps. DO variograms were constructed, and DO values in PDR were kriged. The performance of the networks in DO estimation was evaluated through various error metrics, standard error maps (SEM), and whether the spatial correlation structure was conserved or not. Results indicated that a smaller number of sampling points resulted in loss of information regarding the spatial correlation structure of DO. The minimum number of representative sampling points for PDR was 35. The efficacy of the sampling location selection method was tested against networks generated by experts. It was shown that the evaluation approach proposed in this study provided a better sampling network design in which the spatial correlation structure of DO was sustained for kriging.
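
    A hedged illustration of the KDE step only, on synthetic station coordinates: stations lying in the densest parts of the network are flagged as the most redundant candidates for removal. The variogram fitting, kriging and error-map checks of the study are not reproduced.

```python
# Hedged sketch: rank sampling stations by local kernel density and drop the
# densest ones first. Station coordinates and counts are assumptions.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
stations = rng.uniform(0, 10, size=(65, 2))      # 65 assumed station locations

kde = gaussian_kde(stations.T)                   # 2-D kernel density estimate
density = kde(stations.T)                        # density at each station

# Remove the 5 stations sitting in the densest parts of the network first.
drop = np.argsort(density)[-5:]
keep = np.setdiff1d(np.arange(len(stations)), drop)
reduced_network = stations[keep]
```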

  4. Applications of the line-of-response probability density function resolution model in PET list mode reconstruction.

    PubMed

    Jian, Y; Yao, R; Mulnix, T; Jin, X; Carson, R E

    2015-01-07

    Resolution degradation in PET image reconstruction can be caused by inaccurate modeling of the physical factors in the acquisition process. Resolution modeling (RM) is a common technique that takes into account the resolution degrading factors in the system matrix. Our previous work has introduced a probability density function (PDF) method of deriving the resolution kernels from Monte Carlo simulation and parameterizing the LORs to reduce the number of kernels needed for image reconstruction. In addition, LOR-PDF allows different PDFs to be applied to LORs from different crystal layer pairs of the HRRT. In this study, a thorough test was performed with this new model (LOR-PDF) applied to two PET scanners, the HRRT and Focus-220. A more uniform resolution distribution was observed in point source reconstructions by replacing the spatially-invariant kernels with the spatially-variant LOR-PDF. Specifically, from the center to the edge of the radial field of view (FOV) of the HRRT, the measured in-plane FWHMs of point sources in a warm background varied slightly from 1.7 mm to 1.9 mm in LOR-PDF reconstructions. In Minihot and contrast phantom reconstructions, LOR-PDF resulted in up to 9% higher contrast at any given noise level than the image-space resolution model. LOR-PDF also has the advantage of performing crystal-layer-dependent resolution modeling. The contrast improvement by using LOR-PDF was verified statistically by replicate reconstructions. In addition, [(11)C]AFM rats imaged on the HRRT and [(11)C]PHNO rats imaged on the Focus-220 were utilized to demonstrate the advantage of the new model. Higher contrast between high-uptake regions of only a few millimeters in diameter and the background was observed in LOR-PDF reconstruction than in other methods.

  5. Applications of the line-of-response probability density function resolution model in PET list mode reconstruction

    PubMed Central

    Jian, Y; Yao, R; Mulnix, T; Jin, X; Carson, R E

    2016-01-01

    Resolution degradation in PET image reconstruction can be caused by inaccurate modeling of the physical factors in the acquisition process. Resolution modeling (RM) is a common technique that takes into account the resolution degrading factors in the system matrix. Our previous work has introduced a probability density function (PDF) method of deriving the resolution kernels from Monte Carlo simulation and parameterizing the LORs to reduce the number of kernels needed for image reconstruction. In addition, LOR-PDF allows different PDFs to be applied to LORs from different crystal layer pairs of the HRRT. In this study, a thorough test was performed with this new model (LOR-PDF) applied to two PET scanners - the HRRT and Focus-220. A more uniform resolution distribution was observed in point source reconstructions by replacing the spatially-invariant kernels with the spatially-variant LOR-PDF. Specifically, from the center to the edge of the radial field of view (FOV) of the HRRT, the measured in-plane FWHMs of point sources in a warm background varied slightly from 1.7 mm to 1.9 mm in LOR-PDF reconstructions. In Minihot and contrast phantom reconstructions, LOR-PDF resulted in up to 9% higher contrast at any given noise level than the image-space resolution model. LOR-PDF also has the advantage of performing crystal-layer-dependent resolution modeling. The contrast improvement by using LOR-PDF was verified statistically by replicate reconstructions. In addition, [11C]AFM rats imaged on the HRRT and [11C]PHNO rats imaged on the Focus-220 were utilized to demonstrate the advantage of the new model. Higher contrast between high-uptake regions of only a few millimeters in diameter and the background was observed in LOR-PDF reconstruction than in other methods. PMID:25490063

  6. Alaska/Yukon Geoid Improvement by a Data-Driven Stokes's Kernel Modification Approach

    NASA Astrophysics Data System (ADS)

    Li, Xiaopeng; Roman, Daniel R.

    2015-04-01

    Geoid modeling over Alaska (USA) and Yukon (Canada), being a trans-national issue, faces a great challenge primarily due to the inhomogeneous surface gravity data (Saleh et al, 2013) and the dynamic geology (Freymueller et al, 2008) as well as its complex geological rheology. A previous study (Roman and Li 2014) used updated satellite models (Bruinsma et al 2013) and newly acquired aerogravity data from the GRAV-D project (Smith 2007) to capture the gravity field changes in the target areas, primarily at middle-to-long wavelengths. In CONUS, the geoid model was largely improved. However, the precision of the resulting geoid model in Alaska was still at the decimeter level: 19 cm at the 32 tide bench marks and 24 cm at the 202 GPS/Leveling bench marks, giving a total of 23.8 cm at all of these calibrated surface control points, where the datum bias was removed. Conventional kernel modification methods in this area (Li and Wang 2011) had limited effects on improving the precision of the geoid models. To compensate for the geoid misfits, a new Stokes's kernel modification method based on a data-driven technique is presented in this study. First, the method was tested on simulated data sets (Fig. 1), where the geoid errors were reduced by 2 orders of magnitude (Fig. 2). For the real data sets, some iteration steps are required to overcome the rank deficiency problem caused by the limited control data that are irregularly distributed in the target area. For instance, after 3 iterations, the standard deviation dropped about 2.7 cm (Fig. 3). Modification at other critical degrees can further minimize the geoid model misfits caused either by the gravity error or the remaining datum error in the control points.

  7. Lagged kernel machine regression for identifying time windows of susceptibility to exposures of complex mixtures.

    PubMed

    Liu, Shelley H; Bobb, Jennifer F; Lee, Kyu Ha; Gennings, Chris; Claus Henn, Birgit; Bellinger, David; Austin, Christine; Schnaas, Lourdes; Tellez-Rojo, Martha M; Hu, Howard; Wright, Robert O; Arora, Manish; Coull, Brent A

    2018-07-01

    The impact of neurotoxic chemical mixtures on children's health is a critical public health concern. It is well known that during early life, toxic exposures may impact cognitive function during critical time intervals of increased vulnerability, known as windows of susceptibility. Knowledge on time windows of susceptibility can help inform treatment and prevention strategies, as chemical mixtures may affect a developmental process that is operating at a specific life phase. There are several statistical challenges in estimating the health effects of time-varying exposures to multi-pollutant mixtures, such as: multi-collinearity among the exposures both within time points and across time points, and complex exposure-response relationships. To address these concerns, we develop a flexible statistical method, called lagged kernel machine regression (LKMR). LKMR identifies critical exposure windows of chemical mixtures, and accounts for complex non-linear and non-additive effects of the mixture at any given exposure window. Specifically, LKMR estimates how the effects of a mixture of exposures change with the exposure time window using a Bayesian formulation of a grouped, fused lasso penalty within a kernel machine regression (KMR) framework. A simulation study demonstrates the performance of LKMR under realistic exposure-response scenarios, and demonstrates large gains over approaches that consider each time window separately, particularly when serial correlation among the time-varying exposures is high. Furthermore, LKMR demonstrates gains over another approach that inputs all time-specific chemical concentrations together into a single KMR. We apply LKMR to estimate associations between neurodevelopment and metal mixtures in Early Life Exposures in Mexico and Neurotoxicology, a prospective cohort study of child health in Mexico City.

  8. Increasing accuracy of dispersal kernels in grid-based population models

    USGS Publications Warehouse

    Slone, D.H.

    2011-01-01

    Dispersal kernels in grid-based population models specify the proportion, distance and direction of movements within the model landscape. Spatial errors in dispersal kernels can have large compounding effects on model accuracy. Circular Gaussian and Laplacian dispersal kernels at a range of spatial resolutions were investigated, and methods for minimizing errors caused by the discretizing process were explored. Kernels of progressively smaller sizes relative to the landscape grid size were calculated using cell-integration and cell-center methods. These kernels were convolved repeatedly, and the final distribution was compared with a reference analytical solution. For large Gaussian kernels (σ > 10 cells), the total kernel error was <10⁻¹¹ compared to analytical results. Using an invasion model that tracked the time a population took to reach a defined goal, the discrete model results were comparable to the analytical reference. With Gaussian kernels that had σ ≤ 0.12 using the cell integration method, or σ ≤ 0.22 using the cell center method, the kernel error was greater than 10%, which resulted in invasion times that were orders of magnitude different than theoretical results. A goal-seeking routine was developed to adjust the kernels to minimize overall error. With this, corrections for small kernels were found that decreased overall kernel error to <10⁻¹¹ and invasion time error to <5%.
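
    A small sketch, under assumed σ and grid spacing, of the two discretization schemes compared above for a one-dimensional Gaussian dispersal kernel on unit cells (the circular two-dimensional case is analogous).

```python
# Hedged sketch: cell-center vs cell-integration discretization of a Gaussian kernel.
import numpy as np
from scipy.stats import norm

sigma = 0.8
centers = np.arange(-5, 6)                     # cell centers, unit cell width

# Cell-center method: sample the density at each cell center.
k_center = norm.pdf(centers, scale=sigma)
k_center /= k_center.sum()

# Cell-integration method: integrate the density over each cell.
edges = np.arange(-5.5, 6.5)
k_integrated = np.diff(norm.cdf(edges, scale=sigma))
k_integrated /= k_integrated.sum()

print(np.abs(k_center - k_integrated).max())   # discrepancy grows as sigma shrinks
```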

  9. On randomized algorithms for numerical solution of applied Fredholm integral equations of the second kind

    NASA Astrophysics Data System (ADS)

    Voytishek, Anton V.; Shipilov, Nikolay M.

    2017-11-01

    In this paper, a systematization of numerical (computer-implemented) randomized functional algorithms for approximating the solution of a Fredholm integral equation of the second kind is carried out. Three types of such algorithms are distinguished: the projection, the mesh and the projection-mesh methods. The possibilities of using these algorithms for the solution of practically important problems are investigated in detail. The disadvantages of the mesh algorithms, related to the necessity of calculating values of the kernels of the integral equations at fixed points, are identified. In practice, these kernels have integrable singularities, and calculation of their values is impossible. Thus, for applied problems related to solving the Fredholm integral equation of the second kind, it is expedient to use not the mesh but the projection and projection-mesh randomized algorithms.
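
    To make the noted drawback of mesh-type methods concrete, here is a deterministic Nystrom (mesh) sketch for a Fredholm equation of the second kind, phi(x) = f(x) + lambda * integral over [0, 1] of K(x, y) phi(y) dy: the kernel must be evaluated at fixed quadrature nodes, which is exactly what fails when K has singularities there. The kernel, right-hand side and lambda are toy choices.

```python
# Hedged sketch: Nystrom (mesh) discretization of a second-kind Fredholm equation.
import numpy as np

lam = 0.5
n = 200
nodes = (np.arange(n) + 0.5) / n          # midpoint quadrature nodes on [0, 1]
w = np.full(n, 1.0 / n)                   # quadrature weights

K = np.exp(-np.abs(nodes[:, None] - nodes[None, :]))   # smooth toy kernel K(x, y)
f = np.sin(np.pi * nodes)

# Discretized equation: (I - lam * K * diag(w)) phi = f
A = np.eye(n) - lam * K * w[None, :]
phi = np.linalg.solve(A, f)
```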

  10. UAV remote sensing atmospheric degradation image restoration based on multiple scattering APSF estimation

    NASA Astrophysics Data System (ADS)

    Qiu, Xiang; Dai, Ming; Yin, Chuan-li

    2017-09-01

    Unmanned aerial vehicle (UAV) remote imaging is affected by bad weather, and the obtained images have the disadvantages of low contrast, complex texture and blurring. In this paper, we propose a blind deconvolution model based on multiple scattering atmosphere point spread function (APSF) estimation to recover the remote sensing image. According to Narasimhan's analytical theory, a new multiple scattering restoration model is established based on the improved dichromatic model. Then, using the L0-norm sparse priors of the gradient and dark channel to estimate the APSF blur kernel, the fast Fourier transform is used to recover the original clear image by Wiener filtering. Compared with other state-of-the-art methods, the proposed method can correctly estimate the blur kernel, effectively remove atmospheric degradation phenomena, preserve image detail information and increase the quality evaluation indexes.
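
    Only the final restoration step is sketched here, assuming the blur kernel has already been estimated: Wiener filtering in the Fourier domain. The APSF estimation with L0 gradient and dark-channel priors is not reproduced, and the PSF and noise-to-signal constant are placeholders.

```python
# Hedged sketch: frequency-domain Wiener deconvolution with a known (assumed) PSF.
import numpy as np

def wiener_deconvolve(blurred, psf, nsr=1e-2):
    H = np.fft.fft2(psf, s=blurred.shape)            # transfer function of the blur
    G = np.fft.fft2(blurred)
    F_hat = np.conj(H) / (np.abs(H) ** 2 + nsr) * G  # Wiener filter estimate
    return np.real(np.fft.ifft2(F_hat))

rng = np.random.default_rng(0)
image = rng.random((128, 128))
psf = np.ones((7, 7)) / 49.0                         # stand-in atmospheric PSF
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))
restored = wiener_deconvolve(blurred, psf)
```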

  11. Anthraquinones isolated from the browned Chinese chestnut kernels (Castanea mollissima blume)

    NASA Astrophysics Data System (ADS)

    Zhang, Y. L.; Qi, J. H.; Qin, L.; Wang, F.; Pang, M. X.

    2016-08-01

    Anthraquinones (AQS) represent a group of secondary metabolic products in plants. AQS occur naturally in plants and microorganisms. In a previous study, we found that AQS were produced by an enzymatic browning reaction in Chinese chestnut kernels. To find out whether a non-enzymatic browning reaction in the kernels could also produce AQS, AQS were extracted from three groups of chestnut kernels: fresh kernels, non-enzymatically browned kernels, and browned kernels, and the contents of AQS were determined. High performance liquid chromatography (HPLC) and nuclear magnetic resonance (NMR) methods were used to identify two AQS compounds, rhein (1) and emodin (2). AQS were barely present in the fresh kernels, while both browned kernel groups contained a high amount of AQS. Thus, we confirmed that AQS could be produced during both the enzymatic and non-enzymatic browning process. Rhein and emodin were the main components of AQS in the browned kernels.

  12. MR-guided dynamic PET reconstruction with the kernel method and spectral temporal basis functions

    NASA Astrophysics Data System (ADS)

    Novosad, Philip; Reader, Andrew J.

    2016-06-01

    Recent advances in dynamic positron emission tomography (PET) reconstruction have demonstrated that it is possible to achieve markedly improved end-point kinetic parameter maps by incorporating a temporal model of the radiotracer directly into the reconstruction algorithm. In this work we have developed a highly constrained, fully dynamic PET reconstruction algorithm incorporating both spectral analysis temporal basis functions and spatial basis functions derived from the kernel method applied to a co-registered T1-weighted magnetic resonance (MR) image. The dynamic PET image is modelled as a linear combination of spatial and temporal basis functions, and a maximum likelihood estimate for the coefficients can be found using the expectation-maximization (EM) algorithm. Following reconstruction, kinetic fitting using any temporal model of interest can be applied. Based on a BrainWeb T1-weighted MR phantom, we performed a realistic dynamic [18F]FDG simulation study with two noise levels, and investigated the quantitative performance of the proposed reconstruction algorithm, comparing it with reconstructions incorporating either spectral analysis temporal basis functions alone or kernel spatial basis functions alone, as well as with conventional frame-independent reconstruction. Compared to the other reconstruction algorithms, the proposed algorithm achieved superior performance, offering a decrease in spatially averaged pixel-level root-mean-square-error on post-reconstruction kinetic parametric maps in the grey/white matter, as well as in the tumours when they were present on the co-registered MR image. When the tumours were not visible in the MR image, reconstruction with the proposed algorithm performed similarly to reconstruction with spectral temporal basis functions and was superior to both conventional frame-independent reconstruction and frame-independent reconstruction with kernel spatial basis functions. Furthermore, we demonstrate that a joint spectral/kernel model can also be used for effective post-reconstruction denoising, through the use of an EM-like image-space algorithm. Finally, we applied the proposed algorithm to reconstruction of real high-resolution dynamic [11C]SCH23390 data, showing promising results.
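
    As a rough, assumption-laden sketch of the spatial part only: in the kernel method referred to above, each PET voxel gets a feature vector taken from the co-registered MR image and the emission image is represented as x = Kα, so the kernel matrix acts as an MR-guided spatial basis. The feature choice, neighbourhood size and σ below are invented for illustration; the spectral temporal basis functions and the EM update are not shown.

```python
# Hedged sketch: an MR-derived kernel matrix as a spatial basis, x = K @ alpha.
import numpy as np

rng = np.random.default_rng(0)
mr_features = rng.random((500, 9))        # e.g. 3x3 MR patches per voxel (assumed)

def knn_kernel(F, k=20, sigma=1.0):
    d2 = ((F[:, None, :] - F[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma**2))
    # Keep only each voxel's k nearest MR neighbours (sparse kernel), then row-normalize.
    far = np.argsort(d2, axis=1)[:, k:]
    np.put_along_axis(K, far, 0.0, axis=1)
    return K / K.sum(axis=1, keepdims=True)

K = knn_kernel(mr_features)
alpha = rng.random(500)                   # kernel coefficients (to be estimated by EM)
pet_image = K @ alpha
```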

  13. MR-guided dynamic PET reconstruction with the kernel method and spectral temporal basis functions.

    PubMed

    Novosad, Philip; Reader, Andrew J

    2016-06-21

    Recent advances in dynamic positron emission tomography (PET) reconstruction have demonstrated that it is possible to achieve markedly improved end-point kinetic parameter maps by incorporating a temporal model of the radiotracer directly into the reconstruction algorithm. In this work we have developed a highly constrained, fully dynamic PET reconstruction algorithm incorporating both spectral analysis temporal basis functions and spatial basis functions derived from the kernel method applied to a co-registered T1-weighted magnetic resonance (MR) image. The dynamic PET image is modelled as a linear combination of spatial and temporal basis functions, and a maximum likelihood estimate for the coefficients can be found using the expectation-maximization (EM) algorithm. Following reconstruction, kinetic fitting using any temporal model of interest can be applied. Based on a BrainWeb T1-weighted MR phantom, we performed a realistic dynamic [(18)F]FDG simulation study with two noise levels, and investigated the quantitative performance of the proposed reconstruction algorithm, comparing it with reconstructions incorporating either spectral analysis temporal basis functions alone or kernel spatial basis functions alone, as well as with conventional frame-independent reconstruction. Compared to the other reconstruction algorithms, the proposed algorithm achieved superior performance, offering a decrease in spatially averaged pixel-level root-mean-square-error on post-reconstruction kinetic parametric maps in the grey/white matter, as well as in the tumours when they were present on the co-registered MR image. When the tumours were not visible in the MR image, reconstruction with the proposed algorithm performed similarly to reconstruction with spectral temporal basis functions and was superior to both conventional frame-independent reconstruction and frame-independent reconstruction with kernel spatial basis functions. Furthermore, we demonstrate that a joint spectral/kernel model can also be used for effective post-reconstruction denoising, through the use of an EM-like image-space algorithm. Finally, we applied the proposed algorithm to reconstruction of real high-resolution dynamic [(11)C]SCH23390 data, showing promising results.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Yan-Ming; Su, Ying; Li, Jia

    Highlights: ► NHE protects against intracellular hydrogen overload. ► NHE protects β-cells against strong acidification. ► NHE inhibitors improve myocardial ischemia and reperfusion. -- Abstract: Micro- and macrovascular complications are the main cause of morbidity and mortality in diabetes mellitus. The Na⁺/H⁺ exchanger (NHE) is a family of proteins which exchange Na⁺ for H⁺ according to their concentration gradients in an electroneutral manner. The exchanger also plays a key role in several other cellular functions including proliferation, differentiation, apoptosis, migration, and cytoskeletal organization. Since not much is known about the relationship between NHE and diabetes mellitus, this review outlines the contribution of NHE to chronic complications of diabetes mellitus, such as diabetic nephropathy and diabetic cardiomyopathy.

  15. Investigation of plastic deformation heterogeneities in duplex steel by EBSD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wronski, S., E-mail: wronski@ftj.agh.edu.pl; Tarasiuk, J., E-mail: tarasiuk@ftj.agh.edu.pl; Bacroix, B., E-mail: brigitte.bacroix@univ-paris13.fr

    2012-11-15

    An EBSD analysis of a duplex steel (austeno-ferritic) deformed in tension up to fracture is presented. The main purpose of the paper is to describe, qualitatively and quantitatively, the differences in the behavior of the two phases during plastic deformation. In order to do so, several topological maps are measured on the deformed state using the electron backscatter diffraction technique. Distributions of grain size, misorientation, image quality factor and texture are then analyzed in detail. - Highlights: ► Heterogeneities in duplex steel are studied. ► The behavior of the two phases during plastic deformation is studied. ► The IQ factor distribution and misorientation characteristics are examined using EBSD.

  16. Broken rice kernels and the kinetics of rice hydration and texture during cooking.

    PubMed

    Saleh, Mohammed; Meullenet, Jean-Francois

    2013-05-01

    During rice milling and processing, broken kernels are inevitably present, although to date it has been unclear how the presence of broken kernels affects rice hydration and cooked rice texture. Therefore, this work intended to study the effect of broken kernels in a rice sample on rice hydration and texture during cooking. Two medium-grain and two long-grain rice cultivars were harvested, dried and milled, and the broken kernels were separated from unbroken kernels. Broken rice kernels were subsequently combined with unbroken rice kernels forming treatments of 0, 40, 150, 350 or 1000 g kg⁻¹ broken kernel ratios. Rice samples were then cooked and the moisture content of the cooked rice, the moisture uptake rate, and rice hardness and stickiness were measured. As the amount of broken rice kernels increased, rice sample texture became increasingly softer (P < 0.05) but the unbroken kernels became significantly harder. Moisture content and moisture uptake rate were positively correlated, and cooked rice hardness was negatively correlated, with the percentage of broken kernels in rice samples. Differences in the proportions of broken rice in a milled rice sample play a major role in determining the texture properties of cooked rice. Variations in the moisture migration kinetics between broken and unbroken kernels caused faster hydration of the cores of broken rice kernels, with greater starch leach-out during cooking affecting the texture of the cooked rice. The texture of cooked rice can be controlled, to some extent, by varying the proportion of broken kernels in milled rice. © 2012 Society of Chemical Industry.

  17. A quantum dot close to Stoner instability: The role of the Berry phase

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Saha, Arijit, E-mail: arijitsahahri@gmail.com; Gefen, Yuval; Burmistrov, Igor

    2012-10-15

    The physics of a quantum dot with electron-electron interactions is well captured by the so called 'Universal Hamiltonian' if the dimensionless conductance of the dot is much higher than unity. Within this scheme interactions are represented by three spatially independent terms which describe the charging energy, the spin-exchange and the interaction in the Cooper channel. In this paper we concentrate on the exchange interaction and generalize the functional bosonization formalism developed earlier for the charging energy. This turned out to be challenging as the effective bosonic action is formulated in terms of a vector field and is non-abelian due to the non-commutativity of the spin operators. Here we develop a geometric approach which is particularly useful in the mesoscopic Stoner regime, i.e., when the strong exchange interaction renders the system close to the Stoner instability. We show that it is sufficient to sum over the adiabatic paths of the bosonic vector field and, for these paths, the crucial role is played by the Berry phase. Using these results we were able to calculate the magnetic susceptibility of the dot. The latter, in close vicinity of the Stoner instability point, matches very well with the exact solution [I.S. Burmistrov, Y. Gefen, M.N. Kiselev, JETP Lett. 92 (2010) 179]. - Highlights: ► We consider a conducting QD whose dynamics is governed by exchange interaction. ► We study the model within the 'Universal Hamiltonian' framework. ► The ensuing bosonic action is non-abelian (hence non-trivial). ► We find that the low energy dynamics is governed by a fluctuating Berry phase term. ► We calculate the partition function and the zero frequency magnetic susceptibility.

  18. 34 CFR 628.32 - What funding priorities does the Secretary use in evaluating an application for an endowment...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...

  19. 34 CFR 628.32 - What funding priorities does the Secretary use in evaluating an application for an endowment...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...

  20. 34 CFR 628.32 - What funding priorities does the Secretary use in evaluating an application for an endowment...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...

  1. 34 CFR 628.32 - What funding priorities does the Secretary use in evaluating an application for an endowment...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...

  2. 34 CFR 628.32 - What funding priorities does the Secretary use in evaluating an application for an endowment...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and Universities, or Strengthening Historically Black Graduate Institutions Program. (Total: 20 points) The Secretary... Institutions, Special Needs, Hispanic-Serving Institutions, Strengthening Historically Black Colleges and...

  3. The maximum vector-angular margin classifier and its fast training on large datasets using a core vector machine.

    PubMed

    Hu, Wenjun; Chung, Fu-Lai; Wang, Shitong

    2012-03-01

    Although pattern classification has been extensively studied in the past decades, how to effectively solve the corresponding training on large datasets is a problem that still requires particular attention. Many kernelized classification methods, such as SVM and SVDD, can be formulated as the corresponding quadratic programming (QP) problems, but computing the associated kernel matrices requires O(n²) (or even up to O(n³)) computational complexity, where n is the size of the training patterns, which heavily limits the applicability of these methods for large datasets. In this paper, a new classification method called the maximum vector-angular margin classifier (MAMC) is first proposed based on the vector-angular margin to find an optimal vector c in the pattern feature space, and all the testing patterns can be classified in terms of the maximum vector-angular margin ρ between the vector c and all the training data points. Accordingly, it is proved that the kernelized MAMC can be equivalently formulated as the kernelized Minimum Enclosing Ball (MEB), which leads to a distinctive merit of MAMC, i.e., it has the flexibility of controlling the sum of support vectors like v-SVC and may be extended to a maximum vector-angular margin core vector machine (MAMCVM) by connecting the core vector machine (CVM) method with MAMC such that the corresponding fast training on large datasets can be effectively achieved. Experimental results on artificial and real datasets are provided to validate the power of the proposed methods. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. Using kernel density estimates to investigate lymphatic filariasis in northeast Brazil

    PubMed Central

    Medeiros, Zulma; Bonfim, Cristine; Brandão, Eduardo; Netto, Maria José Evangelista; Vasconcellos, Lucia; Ribeiro, Liany; Portugal, José Luiz

    2012-01-01

    After more than 10 years of the Global Program to Eliminate Lymphatic Filariasis (GPELF) in Brazil, advances have been seen, but the endemic disease persists as a public health problem. The aim of this study was to describe the spatial distribution of lymphatic filariasis in the municipality of Jaboatão dos Guararapes, Pernambuco, Brazil. An epidemiological survey was conducted in the municipality, and positive filariasis cases identified in this survey were georeferenced in point form, using the GPS. A kernel intensity estimator was applied to identify clusters with greater intensity of cases. We examined 23 673 individuals and 323 individuals with microfilaremia were identified, representing a mean prevalence rate of 1.4%. Around 88% of the districts surveyed presented cases of filarial infection, with prevalences of 0–5.6%. The male population was more affected by the infection, with 63.8% of the cases (P<0.005). Positive cases were found in all age groups examined. The kernel intensity estimator identified the areas of greatest intensity and least intensity of filarial infection cases. The case distribution was heterogeneous across the municipality. The kernel estimator identified spatial clusters of cases, thus indicating locations with greater intensity of transmission. The main advantage of this type of analysis lies in its ability to rapidly and easily show areas with the highest concentration of cases, thereby contributing towards planning, monitoring, and surveillance of filariasis elimination actions. Incorporation of geoprocessing and spatial analysis techniques constitutes an important tool for use within the GPELF. PMID:22943547
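
    To illustrate the kind of kernel intensity surface used in this study, the sketch below estimates a Gaussian kernel density over georeferenced case points and reports the highest-intensity grid cell; the coordinates, bandwidth and grid are hypothetical placeholders, and the authors' actual GIS tooling is not reproduced here:

      import numpy as np
      from scipy.stats import gaussian_kde

      # Hypothetical georeferenced case locations (longitude, latitude).
      rng = np.random.default_rng(42)
      cases = np.vstack([
          rng.normal(loc=(-34.95, -8.12), scale=0.01, size=(200, 2)),  # dense cluster
          rng.normal(loc=(-34.90, -8.17), scale=0.03, size=(100, 2)),  # sparser cluster
      ])

      # Kernel density (intensity up to a constant) evaluated on a regular grid.
      kde = gaussian_kde(cases.T)
      lon = np.linspace(cases[:, 0].min(), cases[:, 0].max(), 100)
      lat = np.linspace(cases[:, 1].min(), cases[:, 1].max(), 100)
      grid_lon, grid_lat = np.meshgrid(lon, lat)
      density = kde(np.vstack([grid_lon.ravel(), grid_lat.ravel()])).reshape(grid_lat.shape)

      # The grid cell with the highest density marks the strongest case cluster.
      i, j = np.unravel_index(np.argmax(density), density.shape)
      print(f"highest-intensity cell near lon={grid_lon[i, j]:.3f}, lat={grid_lat[i, j]:.3f}")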

  5. Studies of fatty acid composition, physicochemical and thermal properties, and crystallization behavior of mango kernel fats from various Thai varieties.

    PubMed

    Sonwai, Sopark; Ponprachanuvut, Punnee

    2014-01-01

    Mango kernel fat (MKF) has received attention in recent years due to the resemblance between its characteristics and those of cocoa butter (CB). In this work, fatty acid (FA) composition, physicochemical and thermal properties and crystallization behavior of MKFs obtained from four varieties of Thai mangoes: Keaw-Morakot (KM), Keaw-Sawoey (KS), Nam-Dokmai (ND) and Aok-Rong (AR), were characterized. The fat content of the mango kernels was 6.40, 5.78, 5.73 and 7.74% (dry basis) for KM, KS, ND and AR, respectively. The analysis of FA composition revealed that all four cultivars had oleic and stearic acids as the main FA components, with ND and AR exhibiting the highest and lowest stearic acid content, respectively. ND had the highest slip melting point and solid fat content (SFC), followed by KS, KM and AR. All fat samples exhibited high SFC at 20°C and below. They melted slowly as the temperature increased and became complete liquids as the temperature approached 35°C. During static isothermal crystallization at 20°C, ND displayed the highest Avrami rate constant k, followed by KS, KM and AR, indicating that the crystallization was fastest for ND and slowest for AR. The Avrami exponent n of all samples ranged from 0.89 to 1.73. The X-ray diffraction analysis showed that all MKFs crystallized into a mixture of pseudo-β', β', sub-β and β structures, with β' being the predominant polymorph. Finally, the crystals of the kernel fats from all mango varieties exhibited spherulitic morphology.
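
    The Avrami parameters quoted above come from fitting isothermal crystallization data to the Avrami equation X(t) = 1 - exp(-k·tⁿ). A minimal fitting sketch is shown below, with synthetic data standing in for the SFC-derived crystallinity curves reported in the paper:

      import numpy as np
      from scipy.optimize import curve_fit

      def avrami(t, k, n):
          """Relative crystallinity X(t) = 1 - exp(-k * t**n)."""
          return 1.0 - np.exp(-k * t ** n)

      # Synthetic isothermal crystallization data (time in minutes, relative crystallinity).
      t = np.linspace(0.5, 60.0, 30)
      rng = np.random.default_rng(1)
      x_obs = avrami(t, k=0.08, n=1.2) + rng.normal(scale=0.01, size=t.size)

      (k_fit, n_fit), _ = curve_fit(avrami, t, np.clip(x_obs, 0.0, 1.0), p0=(0.05, 1.0))
      print(f"Avrami rate constant k = {k_fit:.3f}, exponent n = {n_fit:.2f}")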

  6. Reply to Comments to X. Li and Y. M. Wang (2011) Comparisons of geoid models over Alaska computed with different Stokes' kernel modifications, JGS 1(2): 136-142 by L. E. Sjöberg

    NASA Astrophysics Data System (ADS)

    Wang, Y.

    2012-01-01

    The authors thank professor Sjöberg for his interest in our paper. The main goal of the paper is to test kernel modification methods used in geoid computations. Our tests found that Vanicek/Kleusberg's and Featherstone's methods fit the GPS/leveling data the best in the relative sense at various cap sizes. At the same time, we also pointed out that their methods are unstable and the mean values change from decimeters to meters by just changing the cap size. By contrast, the modification of the Wong and Gore type (including the spectral combination and the method of Heck and Grüninger) is stable and insensitive to the truncation degree and cap size. This feature is especially useful when we know the accuracy of the gravity field at different frequency bands. For instance, it is advisable to truncate Stokes' kernel at a degree to which the satellite model is believed to be more accurate than the surface data. The method of the Wong and Gore type does this job quite well. In contrast, the low degrees of Stokes' kernel are modified by Molodensky's coefficients t_n in Vanicek/Kleusberg's and Featherstone's methods (cf. Eq. (6) in Li and Wang (2011)). This implies that the low-degree gravity field of the reference model will be altered by less accurate surface data in the final geoid. This is also the cause of the larger variation in the mean values of the geoid.

  7. An Introduction to the Historical Development of Black English: Some Implications for American Education.

    ERIC Educational Resources Information Center

    Taylor, Orlando L.

    In discussing the rich linguistic history of Afro-Americans, the author points out that black people had a linguistic system when they came to the New World and frequently had a knowledge of a form of English which had been influenced by Black Portuguese and West African languages. Despite many assertions to the contrary, Black English, "the…

  8. A distance-driven deconvolution method for CT image-resolution improvement

    NASA Astrophysics Data System (ADS)

    Han, Seokmin; Choi, Kihwan; Yoo, Sang Wook; Yi, Jonghyon

    2016-12-01

    The purpose of this research is to achieve high spatial resolution in CT (computed tomography) images without hardware modification. The main idea is to adopt a geometric optics model, which provides an approximate blurring PSF (point spread function) kernel that varies with the distance from the X-ray tube to each point. The FOV (field of view) is divided into several band regions based on the distance from the X-ray source, and each region is deconvolved with a different deconvolution kernel. As the number of subbands increases, the overshoot of the MTF (modulation transfer function) curve increases at first. After that, the overshoot begins to decrease while still showing a larger MTF than the normal FBP (filtered backprojection). The case of five subbands shows a balanced performance between MTF boost and overshoot minimization. As the number of subbands increases, the noise (STD) also tends to decrease. The results show that spatial resolution in CT images can be improved without using high-resolution detectors or focal spot wobbling. The proposed algorithm shows promising results in improving spatial resolution while avoiding excessive noise boost.
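
    The band-wise strategy described above can be sketched as follows: the image is split into bands by distance from the source and each band is deconvolved with its own PSF. The band edges, Gaussian PSF widths and the simple Wiener-style deconvolution below are illustrative assumptions, not the authors' exact kernels:

      import numpy as np

      def gaussian_psf(shape, sigma):
          """Centered 2-D Gaussian PSF, normalized to unit sum."""
          y, x = np.indices(shape)
          cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
          psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))
          return psf / psf.sum()

      def wiener_deconvolve(image, psf, balance=1e-2):
          """Simple frequency-domain Wiener deconvolution."""
          H = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
          G = np.fft.fft2(image)
          F = np.conj(H) / (np.abs(H) ** 2 + balance) * G
          return np.real(np.fft.ifft2(F))

      def bandwise_deconvolve(image, source_dist, band_edges, sigmas):
          """Deconvolve each distance band with its own PSF and stitch the result."""
          out = np.zeros_like(image)
          for (lo, hi), sigma in zip(zip(band_edges[:-1], band_edges[1:]), sigmas):
              mask = (source_dist >= lo) & (source_dist < hi)
              out[mask] = wiener_deconvolve(image, gaussian_psf(image.shape, sigma))[mask]
          return out

      # Toy example: 128x128 image, distance measured from the top edge (the "source"),
      # five subbands as in the configuration discussed in the abstract.
      img = np.random.default_rng(0).random((128, 128))
      dist = np.broadcast_to(np.arange(128)[:, None], img.shape).astype(float)
      restored = bandwise_deconvolve(img, dist,
                                     band_edges=[0, 26, 52, 78, 104, 129],
                                     sigmas=[1.0, 1.2, 1.4, 1.6, 1.8])
      print(restored.shape)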

  9. Kernel-based discriminant feature extraction using a representative dataset

    NASA Astrophysics Data System (ADS)

    Li, Honglin; Sancho Gomez, Jose-Luis; Ahalt, Stanley C.

    2002-07-01

    Discriminant Feature Extraction (DFE) is widely recognized as an important pre-processing step in classification applications. Most DFE algorithms are linear and thus can only explore the linear discriminant information among the different classes. Recently, there have been several promising attempts to develop nonlinear DFE algorithms, among which is Kernel-based Feature Extraction (KFE). The efficacy of KFE has been experimentally verified on both synthetic data and real problems. However, KFE has some known limitations. First, KFE does not work well for strongly overlapped data. Second, KFE employs all of the training set samples during the feature extraction phase, which can result in significant computation when applied to very large datasets. Finally, KFE can result in overfitting. In this paper, we propose a substantial improvement to KFE that overcomes the above limitations by using a representative dataset, which consists of critical points that are generated from data-editing techniques and centroid points that are determined by using the Frequency Sensitive Competitive Learning (FSCL) algorithm. Experiments show that this new KFE algorithm performs well on significantly overlapped datasets, and it also reduces computational complexity. Further, by controlling the number of centroids, the overfitting problem can be effectively alleviated.
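
    A rough sketch of building a "representative dataset" of the kind described above: class centroids stand in for the FSCL-derived centroids (plain k-means is used here as a simplifying assumption), points of each class closest to the other class approximate the "critical" boundary points, and the kernel feature extractor (kernel PCA as a generic stand-in for KFE) is fit on this reduced set only:

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.decomposition import KernelPCA
      from sklearn.metrics import pairwise_distances

      def representative_set(X, y, n_centroids=10, n_critical=20):
          """Centroids per class (k-means stand-in for FSCL) plus near-boundary points."""
          reps = []
          for label in np.unique(y):
              Xc = X[y == label]
              reps.append(KMeans(n_clusters=n_centroids, n_init=10, random_state=0)
                          .fit(Xc).cluster_centers_)
              # "Critical" points: samples of this class closest to the other classes.
              d_other = pairwise_distances(Xc, X[y != label]).min(axis=1)
              reps.append(Xc[np.argsort(d_other)[:n_critical]])
          return np.vstack(reps)

      rng = np.random.default_rng(0)
      X = np.vstack([rng.normal(0, 1, (500, 5)), rng.normal(1.5, 1, (500, 5))])
      y = np.repeat([0, 1], 500)

      reps = representative_set(X, y)
      kfe = KernelPCA(n_components=3, kernel="rbf", gamma=0.5).fit(reps)
      features = kfe.transform(X)          # features for the full dataset
      print(reps.shape, features.shape)    # the kernel fit uses only the small representative set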

  10. Nonlinear Deep Kernel Learning for Image Annotation.

    PubMed

    Jiu, Mingyuan; Sahbi, Hichem

    2017-02-08

    Multiple kernel learning (MKL) is a widely used technique for kernel design. Its principle consists in learning, for a given support vector classifier, the most suitable convex (or sparse) linear combination of standard elementary kernels. However, these combinations are shallow and often powerless to capture the actual similarity between highly semantic data, especially for challenging classification tasks such as image annotation. In this paper, we redefine multiple kernels using deep multi-layer networks. In this new contribution, a deep multiple kernel is recursively defined as a multi-layered combination of nonlinear activation functions, each involving a combination of several elementary or intermediate kernels and resulting in a positive semi-definite deep kernel. We propose four different frameworks in order to learn the weights of these networks: supervised, unsupervised, kernel-based semi-supervised and Laplacian-based semi-supervised. When plugged into support vector machines (SVMs), the resulting deep kernel networks show clear gains compared to several shallow kernels for the task of image annotation. Extensive experiments and analysis on the challenging ImageCLEF photo annotation benchmark, the COREL5k database and the Banana dataset validate the effectiveness of the proposed method.
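
    A minimal sketch of the layered idea described above, assuming two elementary kernels (RBF and polynomial), fixed nonnegative mixing weights and an elementwise exponential as the nonlinear activation (which preserves positive semi-definiteness of the Gram matrix); the networks and learned weights in the paper are considerably richer than this:

      import numpy as np
      from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
      from sklearn.svm import SVC

      def deep_kernel(X, Y, w_rbf=0.6, w_poly=0.4, beta=1.0):
          """Two-layer kernel: layer 1 mixes RBF and polynomial kernels,
          layer 2 applies an elementwise exponential activation (PSD-preserving)."""
          K1 = w_rbf * rbf_kernel(X, Y, gamma=0.5) + w_poly * polynomial_kernel(X, Y, degree=2)
          return np.exp(beta * K1)

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 10))
      y = (X[:, 0] * X[:, 1] > 0).astype(int)     # toy nonlinear labels

      clf = SVC(kernel="precomputed").fit(deep_kernel(X, X), y)
      print("train accuracy:", clf.score(deep_kernel(X, X), y))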

  11. Multineuron spike train analysis with R-convolution linear combination kernel.

    PubMed

    Tezuka, Taro

    2018-06-01

    A spike train kernel provides an effective way of decoding information represented by a spike train. Some spike train kernels have been extended to multineuron spike trains, which are simultaneously recorded spike trains obtained from multiple neurons. However, most of these multineuron extensions were carried out in a kernel-specific manner. In this paper, a general framework is proposed for extending any single-neuron spike train kernel to multineuron spike trains, based on the R-convolution kernel. Special subclasses of the proposed R-convolution linear combination kernel are explored. These subclasses have a smaller number of parameters and make optimization tractable when the size of data is limited. The proposed kernel was evaluated using Gaussian process regression for multineuron spike trains recorded from an animal brain. It was compared with the sum kernel and the population Spikernel, which are existing ways of decoding multineuron spike trains using kernels. The results showed that the proposed approach performs better than these kernels and also other commonly used neural decoding methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
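
    The linear-combination construction described above can be sketched concretely: a single-neuron spike-train kernel (here simply a Gaussian kernel on binned spike counts, an assumption made only for illustration) is combined across simultaneously recorded neurons with nonnegative weights to give a multineuron kernel:

      import numpy as np

      def bin_spikes(spike_times, t_max=1.0, n_bins=20):
          """Bin a single neuron's spike times into a count vector."""
          counts, _ = np.histogram(spike_times, bins=n_bins, range=(0.0, t_max))
          return counts.astype(float)

      def single_neuron_kernel(s1, s2, sigma=1.0):
          """Illustrative single-neuron spike-train kernel on binned counts."""
          d = bin_spikes(s1) - bin_spikes(s2)
          return np.exp(-np.dot(d, d) / (2.0 * sigma ** 2))

      def multineuron_kernel(trial_a, trial_b, weights):
          """Linear combination over neurons of a single-neuron spike-train kernel."""
          return sum(w * single_neuron_kernel(a, b)
                     for w, a, b in zip(weights, trial_a, trial_b))

      rng = np.random.default_rng(0)
      # Two trials, each a list of spike-time arrays for 3 simultaneously recorded neurons.
      trial_a = [np.sort(rng.uniform(0, 1, rng.integers(5, 15))) for _ in range(3)]
      trial_b = [np.sort(rng.uniform(0, 1, rng.integers(5, 15))) for _ in range(3)]
      print(multineuron_kernel(trial_a, trial_b, weights=[0.5, 0.3, 0.2]))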

  12. Study on Energy Productivity Ratio (EPR) at palm kernel oil processing factory: case study on PT-X at Sumatera Utara Plantation

    NASA Astrophysics Data System (ADS)

    Haryanto, B.; Bukit, R. Br; Situmeang, E. M.; Christina, E. P.; Pandiangan, F.

    2018-02-01

    The purpose of this study was to determine the performance, productivity and feasibility of operating a palm kernel processing plant on the basis of the Energy Productivity Ratio (EPR). EPR is expressed as the ratio of output energy, including by-products, to input energy. The palm kernel plant processes palm kernels into palm kernel oil. The procedure started with collecting the data needed on the input side, such as palm kernel prices, energy demand and depreciation of the factory. The energy output and its by-products comprise the whole production value, such as the palm kernel oil price and the price of the remaining products, such as shells and pulp. The energy equivalent of palm kernel oil is calculated to analyze the value of the Energy Productivity Ratio (EPR) based on the processing capacity per year. The investigation was carried out at the kernel oil processing plant PT-X at the Sumatera Utara plantation. The value of EPR was 1.54 (EPR > 1), which indicates that processing palm kernels into palm kernel oil is feasible to operate in terms of energy productivity.
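
    As a back-of-the-envelope illustration of the ratio defined above, the sketch below computes EPR as total output value (kernel oil plus shell and pulp by-products) divided by total input (kernel purchase, processing energy, depreciation). All figures are hypothetical placeholders, not the plant data from the study:

      # Hypothetical annual figures for a palm kernel oil plant (arbitrary currency units).
      inputs = {
          "palm_kernel_purchase": 8_000_000,
          "processing_energy":    1_500_000,
          "depreciation":           900_000,
      }
      outputs = {
          "palm_kernel_oil": 13_000_000,
          "shells":           1_200_000,
          "pulp":               800_000,
      }

      epr = sum(outputs.values()) / sum(inputs.values())
      print(f"EPR = {epr:.2f} ->", "feasible (EPR > 1)" if epr > 1 else "not feasible")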

  13. Photon orbits and thermodynamic phase transition of d -dimensional charged AdS black holes

    NASA Astrophysics Data System (ADS)

    Wei, Shao-Wen; Liu, Yu-Xiao

    2018-05-01

    We study the relationship between the null geodesics and the thermodynamic phase transition for the charged AdS black hole. In the reduced parameter space, we find that there exist nonmonotonic behaviors of the photon sphere radius and the minimum impact parameter for pressures below the critical value. The study also shows that the changes of the photon sphere radius and the minimum impact parameter can serve as order parameters for the small-large black hole phase transition. In particular, these changes have a universal exponent of 1/2 near the critical point for any dimension d of spacetime. These results imply that there may exist universal critical behavior of gravity near the thermodynamic critical point of the black hole system.
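
    In LaTeX notation, the 1/2 exponent quoted above amounts to a schematic mean-field-like scaling of the photon-sphere quantities near the critical point (here Δr_ps is the change of photon sphere radius across the transition, Δu_min the change of minimum impact parameter, and T_c the critical temperature); this is a restatement of the abstract's claim, not a derivation:

      \Delta r_{\mathrm{ps}} \sim \left(1 - T/T_c\right)^{1/2}, \qquad
      \Delta u_{\min} \sim \left(1 - T/T_c\right)^{1/2}, \qquad T \to T_c^{-},

    with the same exponent reported for any spacetime dimension d.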

  14. Dynamic fisheye grids for binary black hole simulations

    NASA Astrophysics Data System (ADS)

    Zilhão, Miguel; Noble, Scott C.

    2014-03-01

    We present a new warped gridding scheme adapted to simulating gas dynamics in binary black hole spacetimes. The grid concentrates grid points in the vicinity of each black hole to resolve the smaller scale structures there, and rarefies grid points away from each black hole to keep the overall problem size at a practical level. In this respect, our system can be thought of as a ‘double’ version of the fisheye coordinate system, used before in numerical relativity codes for evolving binary black holes. The gridding scheme is constructed as a mapping from a uniform coordinate system—in which the equations of motion are solved—to the distorted system representing the spatial locations of our grid points. Since we are motivated to eventually use this system for circumbinary disc calculations, we demonstrate how the distorted system can be constructed to asymptote to the typical spherical polar coordinate system, amenable to efficiently simulating orbiting gas flows about central objects with little numerical diffusion. We discuss its implementation in the Harm3d code, tailored to evolve the magnetohydrodynamics equations in curved spacetimes. We evaluate the performance of the system’s implementation in Harm3d with a series of tests, such as the advected magnetic field loop test, magnetized Bondi accretion, and evolutions of hydrodynamic discs about a single black hole and about a binary black hole. As we have done with Harm3d, this gridding scheme can be implemented in other unigrid codes as a (possibly) simpler alternative to adaptive mesh refinement.

  15. The black hole interior and the type II Weyl fermions

    NASA Astrophysics Data System (ADS)

    Zubkov, M. A.

    2018-03-01

    It was recently proposed that a black hole may undergo a transition to a state in which a Fermi surface is formed inside the horizon, revealing an analogy with the recently discovered type II Weyl semimetals. In this scenario, the low energy effective theory outside of the horizon is the Standard Model, which describes excitations that reside near a certain point P(0) in momentum space of the hypothetical unified theory. Inside the horizon the low energy physics is due to the excitations that reside at points in momentum space close to the Fermi surface. We argue that those points may lie far from P(0) and, therefore, inside the black hole quantum states that are not described by the Standard Model are involved in the low energy dynamics. We analyze the consequences of this observation for the physics of black holes and present a model based on the direct analogy with the type II Weyl semimetals, which illustrates this pattern.

  16. The 21.5-kDa isoform of myelin basic protein has a non-traditional PY-nuclear-localization signal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Graham S.T.; Seymour, Lauren V.; Boggs, Joan M.

    2012-06-15

    Highlights: ► Full-length 21.5-kDa MBP isoform is translocated to the nucleus. ► We hypothesized that the exon-II-encoded sequence contained the NLS. ► We mutated this sequence in RFP-tagged constructs and transfected N19-cells. ► Abolition of two key positively-charged residues resulted in loss of nuclear-trafficking. ► The 21.5-kDa isoform of classic MBP contains a non-traditional PY-NLS. -- Abstract: The predominant 18.5-kDa classic myelin basic protein (MBP) is mainly responsible for compaction of the myelin sheath in the central nervous system, but is multifunctional, having numerous interactions with Ca²⁺-calmodulin, actin, tubulin, and SH3-domains, and can tether these proteins to a lipid membrane in vitro. The full-length 21.5-kDa MBP isoform has an additional 26 residues encoded by exon-II of the classic gene, which causes it to be trafficked to the nucleus of oligodendrocytes (OLGs). We have performed site-directed mutagenesis of selected residues within this segment in red fluorescent protein (RFP)-tagged constructs, which were then transfected into the immortalized N19-OLG cell line to view protein localization using epifluorescence microscopy. We found that 21.5-kDa MBP contains two non-traditional PY-nuclear-localization signals, and that arginine and lysine residues within these motifs were involved in subcellular trafficking of this protein to the nucleus, where it may have functional roles during myelinogenesis.

  17. Daintain/AIF-1 (Allograft Inflammatory Factor-1) accelerates type 1 diabetes in NOD mice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Yan-Ying, E-mail: biozyy@163.com; Huang, Xin-Yuan; Chen, Zheng-Wang

    Highlights: ► Daintain/AIF-1 is over-expressed in the blood of NOD mice suffering from insulitis. ► Daintain/AIF-1 stimulates white blood cell proliferation in NOD mice. ► Daintain/AIF-1 increases blood glucose levels and triggers type 1 diabetes. ► Daintain/AIF-1 accelerates insulitis, while its antibody prevents insulitis. ► Daintain/AIF-1 enhances the levels of nitric oxide in the pancreases of NOD mice. -- Abstract: A large body of experimental evidence suggests that cytokines trigger pancreatic β-cell death in type 1 diabetes mellitus. Daintain/AIF-1 (Allograft Inflammatory Factor-1), a specific marker for activated macrophages, is accumulated in the pancreatic islets of pre-diabetic BB rats. In the present study, we demonstrate that daintain/AIF-1 is released into blood and that the levels of daintain/AIF-1 in the blood of type 1 diabetes-prone non-obese diabetic (NOD) mice suffering from insulitis are significantly higher than those in healthy NOD mice. When injected intravenously into NOD mice, daintain/AIF-1 stimulates white blood cell proliferation, increases the concentrations of blood glucose, impairs insulin expression, up-regulates nitric oxide (NO) production in pancreases and accelerates diabetes in NOD mice, while the antibody against daintain/AIF-1 delays or prevents insulitis in NOD mice. These results imply that daintain/AIF-1 triggers type 1 diabetes, probably via activation of immune cells and induction of NO production in the pancreas of NOD mice.

  18. UK and Italian EIA systems: A comparative study on management practice and performance in the construction industry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bassi, Andrea, E-mail: ab395@bath.co.uk; Howard, Robert, E-mail: robhoward@constcom.demon.co.uk; Geneletti, Davide, E-mail: davide.geneletti@ing.unitn.it

    This study evaluates and contrasts the management practice and the performance that characterise Environmental Impact Assessments (EIA) in Italy and in the UK. The methodology relies on the investigation of six carefully selected case studies, critically reviewed by referring to EIA and project design information, as well as collecting the opinion of key project participants. The study focuses on the construction industry and on specific key sectors like infrastructure for transport and renewable energy and commercial and tourism development. A main term of reference for the analyses has been established by critically reviewing international literature so as to outline common good practice, requirements for the enhancement of sustainability principles and typically incurred drawbacks. The proposed approach enhances transfer of knowledge and of experiences between the analyzed contexts and allows the provision of guidelines for practitioners. Distinctive differences between the UK and the Italian EIA systems have been detected for pivotal phases and elements of EIA, like screening, scoping, analysis of alternatives and of potential impacts, definition of mitigation strategies, review, decision making, public participation and follow up. - Highlights: ► The Italian and the UK Environmental Impact Assessment systems are compared. ► The research is centred on the construction industry. ► Issues and shortcomings are analysed by investigating six case studies. ► Integration of EIA with sustainability principles is appraised. ► General guidelines are provided to assist practitioners in the two national contexts.

  19. Nitrogen management in landfill leachate: Application of SHARON, ANAMMOX and combined SHARON-ANAMMOX process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sri Shalini, S., E-mail: srishalini10@gmail.com; Joseph, Kurian, E-mail: kuttiani@gmail.com

    2012-12-15

    Highlights: ► Significant research on ammonia removal from leachate by the SHARON and ANAMMOX processes. ► Operational parameters, microbiology, biochemistry and application of the processes. ► The SHARON-ANAMMOX process for leachate is a new research area and this paper gives wide facts. ► Cost-effective process, an alternative to existing technologies for leachate treatment. ► Addresses the issues and operational conditions for application in leachate treatment. - Abstract: In today's context of waste management, landfilling of Municipal Solid Waste (MSW) is considered to be one of the standard practices worldwide. Leachate generated from municipal landfills has become a great threat to the surroundings as it contains high concentrations of organics, ammonia and other toxic pollutants. Emphasis has to be placed on the removal of ammonia nitrogen in particular, derived from the nitrogen content of the MSW; it is a long-term pollution problem in landfills which determines when the landfill can be considered stable. Several biological processes are available for the removal of ammonia, but novel processes such as the Single Reactor System for High Activity Ammonia Removal over Nitrite (SHARON) and the Anaerobic Ammonium Oxidation (ANAMMOX) process have great potential and several advantages over conventional processes. The combined SHARON-ANAMMOX process for municipal landfill leachate treatment is a new, innovative and significant approach that requires more research to identify and solve critical issues. This review addresses the operational parameters, microbiology, biochemistry and application of both processes to remove ammonia from leachate.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morita, Takahiro; Satoh, Ryosuke; Japan Society for the Promotion of Science, 1-8 Chiyoda-ku, Tokyo 102-8472

    Highlights: ► Stress granules (SGs) as a mechanism of doxorubicin tolerance. ► We characterize the role of stress granules in doxorubicin tolerance. ► Deletion of components of SGs enhances doxorubicin sensitivity in fission yeast. ► Doxorubicin promotes SG formation when combined with heat shock. ► Doxorubicin regulates stress granule assembly independent of eIF2α phosphorylation. -- Abstract: Doxorubicin is an anthracycline antibiotic widely used for chemotherapy. Although doxorubicin is effective in the treatment of several cancers, including solid tumors and leukemias, the basis of its mechanism of action is not completely understood. Here, we describe the effects of doxorubicin and its relationship with stress granule formation in the fission yeast, Schizosaccharomyces pombe. We show that disruption of genes encoding the components of stress granules, including vgl1⁺, which encodes a multi-KH type RNA-binding protein, and pab1⁺, which encodes a poly(A)-binding protein, resulted in greater sensitivity to doxorubicin than seen in wild-type cells. Disruption of the vgl1⁺ and pab1⁺ genes did not confer sensitivity to other anti-cancer drugs such as cisplatin, 5-fluorouracil, and paclitaxel. We also showed that doxorubicin treatment promoted stress granule formation when combined with heat shock. Notably, doxorubicin treatment did not induce hyperphosphorylation of eIF2α, suggesting that doxorubicin is involved in stress granule assembly independent of eIF2α phosphorylation. Our results demonstrate the usefulness of fission yeast for elucidating the molecular targets of doxorubicin toxicity and suggest a novel drug-resistance mechanism involving stress granule assembly.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pariona, Moises Meza, E-mail: mmpariona@uepg.br; Teleginski, Viviane; Santos, Kelly dos

    Laser beam welding has recently been incorporated into the fabrication process of aircraft and automobile structures. Surface roughness is an important parameter of product quality that strongly affects the performance of mechanical parts, as well as production costs. This parameter influences mechanical properties such as fatigue behavior, corrosion resistance and creep life, and other functional characteristics such as friction, wear, light reflection, heat transmission, lubrication and electrical conductivity. The effects of laser surface remelting (LSR) on the morphology of Al-Fe aerospace alloys were examined before and after surface treatments, using optical microscopy (OM), scanning electron microscopy (SEM), low-angle X-ray diffraction (LA-XRD), atomic force microscopy (AFM), microhardness measurements (Vickers hardness), and cyclic voltammetry. This analysis was performed on both laser-treated and untreated sanded surfaces, revealing significant differences. The LA-XRD analysis revealed the presence of alumina, simple metals and metastable intermetallic phases, which considerably improved the microhardness of laser-remelted surfaces. The morphology produced by laser surface remelting enhanced the microstructure of the Al-Fe alloys by reducing their roughness and increasing their hardness. The treated surfaces showed passivity and stability characteristics in the electrolytic medium employed in this study. - Highlights: ► Laser-treated and untreated samples showed significant differences. ► LA-XRD revealed the presence of alumina in Al-1.5 wt.% Fe. ► Laser treatment reduced the roughness and increased the hardness. ► The laser-treated surfaces showed passive behavior in the electrolytic medium. ► Laser surface remelting is a promising technique for technological applications.

  2. SiRNAs conjugated with aromatic compounds induce RISC-mediated antisense strand selection and strong gene-silencing activity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kubo, Takanori, E-mail: kubo-t@yasuda-u.ac.jp; Yanagihara, Kazuyoshi; Division of Genetics, National Cancer Center Research Institute, 5-1-1 Tsukiji, Chuo-ku, Tokyo 104-0045

    2012-10-05

    Highlights: ► SiRNAs conjugated with aromatic compounds (Ar-siRNAs) at the 5′-sense strand were synthesized. ► Ar-siRNAs increased resistance against nuclease degradation. ► Ar-siRNAs were thermodynamically stable compared with the unmodified siRNA. ► High levels of cellular uptake and cytoplasmic localization were found. ► Strong gene-silencing efficacy was exhibited in the Ar-siRNAs. -- Abstract: Short interference RNA (siRNA) is a powerful tool for suppressing gene expression in mammalian cells. In this study, we focused on the development of siRNAs conjugated with aromatic compounds in order to improve the potency of RNAi and thus to overcome several problems with siRNAs, such as cellular delivery and nuclease stability. The siRNAs conjugated with phenyl, hydroxyphenyl, naphthyl, and pyrenyl derivatives showed strong resistance to nuclease degradation, and were thermodynamically stable compared with unmodified siRNA. A high level of membrane permeability in HeLa cells was also observed. Moreover, these siRNAs exhibited enhanced RNAi efficacy, which exceeded that of locked nucleic acid (LNA)-modified siRNAs, against exogenous Renilla luciferase in HeLa cells. In particular, abundant cytoplasmic localization and strong gene-silencing efficacy were found in the siRNAs conjugated with phenyl and hydroxyphenyl derivatives. The novel siRNAs conjugated with aromatic compounds are promising candidates for a new generation of modified siRNAs that can solve many of the problems associated with RNAi technology.

  3. Metformin induces differentiation in acute promyelocytic leukemia by activating the MEK/ERK signaling pathway

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huai, Lei; Wang, Cuicui; Zhang, Cuiping

    2012-06-08

    Highlights: ► Metformin induces differentiation in NB4 and primary APL cells. ► Metformin induces activation of the MEK/ERK signaling pathway in APL cells. ► Metformin synergizes with ATRA to trigger maturation of NB4 and primary APL cells. ► Metformin induces the relocalization and degradation of the PML-RARα fusion protein. ► The study may be applicable for new differentiation therapy in cancer treatment. -- Abstract: Recent studies have shown that metformin, a widely used antidiabetic agent, may reduce the risk of cancer development. In this study, we investigated the antitumoral effect of metformin on both acute myeloid leukemia (AML) and acute promyelocytic leukemia (APL) cells. Metformin induced apoptosis with partial differentiation in an APL cell line, NB4, but only displayed a proapoptotic effect on several non-M3 AML cell lines. Further analysis revealed that a strong synergistic effect existed between metformin and all-trans retinoic acid (ATRA) during APL cell maturation and that metformin induced the hyperphosphorylation of extracellular signal-regulated kinase (ERK) in APL cells. U0126, a specific MEK/ERK activation inhibitor, abrogated metformin-induced differentiation. Finally, we found that metformin induced the degradation of the oncoproteins PML-RARα and c-Myc and activated caspase-3. In conclusion, these results suggest that metformin treatment may contribute to the enhancement of ATRA-induced differentiation in APL, which may deepen the understanding of APL maturation and thus provide insight for new therapy strategies.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biasotto, G.; Simoes, A.Z., E-mail: alezipo@yahoo.com; Foschini, C.R.

    Highlights: ► BiFeO₃ (BFO) nanoparticles were grown by the hydrothermal microwave method (HTMW). ► The soaking time is effective in improving phase formation. ► Rietveld refinement reveals an orthorhombic structure. ► The observed magnetism of the BFO crystallites is a consequence of particle size. ► The HTMW is a genuine technique for low temperatures and short synthesis times. -- Abstract: The hydrothermal microwave method (HTMW) was used to synthesize crystalline bismuth ferrite (BiFeO₃) nanoparticles (BFO) at a temperature of 180 °C with times ranging from 5 min to 1 h. BFO nanoparticles were characterized by means of X-ray analyses, FT-IR, Raman spectroscopy, TG-DTA and FE-SEM. X-ray diffraction results indicated that a longer soaking time helped to prevent the formation of impurity phases and to grow BFO crystallites into almost single-phase perovskites. Typical FT-IR spectra for BFO nanoparticles presented well defined bands, indicating a substantial short-range order in the system. TG-DTA analyses confirmed the presence of lattice OH⁻ groups, commonly found in materials obtained by the HTMW process. Compared with the conventional solid-state reaction process, submicron BFO crystallites with better homogeneity could be produced at a temperature as low as 180 °C. These results show that the HTMW synthesis route is rapid and cost effective, and could be used as an alternative to obtain BFO nanoparticles at 180 °C for 1 h.

  5. Twin nucleation and migration in FeCr single crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patriarca, L.; Abuzaid, Wael; Sehitoglu, Huseyin, E-mail: huseyin@illinois.edu

    2013-01-15

    Tension and compression experiments were conducted on body-centered cubic Fe-47.8 at.% Cr single crystals. The critical resolved shear stress (CRSS) magnitudes for slip nucleation, twin nucleation and twin migration were established. We show that the nucleation of slip occurs at a CRSS of about 88 MPa, while twinning nucleates at a CRSS of about 191 MPa with an associated load drop. Following twin nucleation, twin migration proceeds at a CRSS that is lower than the initiation stress (≈ 114-153 MPa). The experimental results of the nucleation stresses indicate that the Schmid law holds to a first approximation for the slip and twin nucleation cases, but to a lesser extent for twin migration, particularly when considerable slip strains preceded twinning. The CRSSs were determined experimentally using digital image correlation (DIC) in conjunction with electron back scattering diffraction (EBSD). The DIC measurements enabled pinpointing the precise stress on the stress-strain curves where twins or slip were activated. The crystal orientations were obtained using EBSD and used to determine the activated twin and slip systems through trace analysis. - Highlights: ► Digital image correlation captures slip/twin initiation for bcc FeCr. ► Crystal orientations from EBSD allow slip/twin system indexing. ► Nucleation of slip always precedes twinning. ► Twin growth is sustained with a lower stress than required for nucleation. ► Twin-slip interactions provide high hardening at the onset of plasticity.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Yilin; Yang, Yang; Cai, Yanyan

    Highlights: ► We demonstrated that HBV represses MIA2 gene expression both in vitro and in vivo. ► The X protein of HBV plays a major role in such regulation. ► Knock-down of MIA2 in HepG2 cells activates cell growth and proliferation. ► HBx activates cell proliferation; over-expression of MIA2 impaired such regulation. ► HBx activates hepatoma cell proliferation through repressing MIA2 expression. -- Abstract: Hepatocellular carcinoma (HCC) is the fourth leading cause of cancer deaths globally. Chronic hepatitis B virus (HBV) infection accounts for over 75% of all HCC cases; however, the molecular pathogenesis of HCC is not well understood. In this study, we found that the expression of the newly identified gene melanoma inhibitory activity 2 (MIA2) was reduced by HBV infection in vitro and in vivo, and that HBV X protein (HBx) plays a major role in this regulation. Recent studies have revealed that MIA2 is a potential tumor suppressor, and that, in most HCCs, MIA2 expression is down-regulated or lost. We found that the knock-down of MIA2 in HepG2 cells activated cell growth and proliferation, suggesting that MIA2 inhibits HCC cell growth and proliferation. In addition, the over-expression of HBx alone induced cell proliferation, whereas MIA2 over-expression impaired the HBx-mediated induction of proliferation. Taken together, our results suggest that HBx activates hepatoma cell growth and proliferation through repression of the potential tumor suppressor MIA2.

  7. Recombinant expression and solution structure of antimicrobial peptide aurelin from jellyfish Aurelia aurita

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shenkarev, Zakhar O.; Panteleev, Pavel V.; Balandin, Sergey V.

    Highlights: ► Aurelin was overexpressed in Escherichia coli, and its spatial structure was studied by NMR. ► Aurelin's compact structure encloses helical regions cross-linked by three disulfide bonds. ► Aurelin shows structural homology to the BgK and ShK toxins of sea anemones. ► Aurelin binds to anionic lipid vesicles, but does not interact with zwitterionic ones. ► Aurelin binds to the DPC micelle surface with moderate affinity via two helical regions. -- Abstract: Aurelin is a 40-residue cationic antimicrobial peptide isolated from the mesoglea of a scyphoid jellyfish Aurelia aurita. Aurelin and its ¹⁵N-labeled analogue were overexpressed in Escherichia coli and purified. Antimicrobial activity of the recombinant peptide was examined, and its spatial structure was studied by NMR spectroscopy. Aurelin represents a compact globule, enclosing one 3₁₀-helix and two α-helical regions cross-linked by three disulfide bonds. The peptide binds to anionic lipid (POPC/DOPG, 3:1) vesicles even at physiological salt concentration, does not interact with zwitterionic (POPC) vesicles, and interacts with the DPC micelle surface with moderate affinity via two α-helical regions. Although aurelin shows structural homology to the BgK and ShK toxins of sea anemones, its surface does not possess the 'functional dyad' required for high-affinity interaction with K⁺ channels. The data obtained permit us to correlate the modest antibacterial properties and membrane activity of aurelin.

  8. Unitary cocycle representations of the Galilean line group: Quantum mechanical principle of equivalence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MacGregor, B.R.; McCoy, A.E.; Wickramasekara, S., E-mail: wickrama@grinnell.edu

    2012-09-15

    We present a formalism of Galilean quantum mechanics in non-inertial reference frames and discuss its implications for the equivalence principle. This extension of quantum mechanics rests on the Galilean line group, the semidirect product of the real line and the group of analytic functions from the real line to the Euclidean group in three dimensions. This group provides transformations between all inertial and non-inertial reference frames and contains the Galilei group as a subgroup. We construct a certain class of unitary representations of the Galilean line group and show that these representations determine the structure of quantum mechanics in non-inertial reference frames. Our representations of the Galilean line group contain the usual unitary projective representations of the Galilei group, but have a more intricate cocycle structure. The transformation formula for the Hamiltonian under the Galilean line group shows that in a non-inertial reference frame it acquires a fictitious potential energy term that is proportional to the inertial mass, suggesting the equivalence of inertial mass and gravitational mass in quantum mechanics. - Highlights: ► A formulation of Galilean quantum mechanics in non-inertial reference frames is given. ► The key concept is the Galilean line group, an infinite dimensional group. ► Unitary, cocycle representations of the Galilean line group are constructed. ► A non-central extension of the group underlies these representations. ► Quantum equivalence principle and gravity emerge from these representations.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scolozzi, Rocco, E-mail: rocco.scolozzi@fmach.it; Geneletti, Davide, E-mail: geneletti@ing.unitn.it

    Habitat loss and fragmentation are often concurrent to land conversion and urbanization. Simple application of GIS-based landscape pattern indicators may not be sufficient to support meaningful biodiversity impact assessment. A review of the literature reveals that habitat definition and habitat fragmentation are frequently inadequately considered in environmental assessment, notwithstanding the increasing number of tools and approaches reported in the landscape ecology literature. This paper presents an approach for assessing impacts on habitats on a local scale, where availability of species data is often limited, developed for an alpine valley in northern Italy. The perspective of the methodology is multiple scale and species-oriented, and it provides both qualitative and quantitative definitions of impact significance. A qualitative decision model is used to assess ecological values in order to support land-use decisions at the local level. Building on recent studies in the same region, the methodology integrates various approaches, such as landscape graphs, object-oriented rule-based habitat assessment and expert knowledge. The results provide insights into future habitat loss and fragmentation caused by land-use changes, and aim at supporting decision-making in planning and suggesting possible ecological compensation. - Highlights: ► Many environmental assessments inadequately consider habitat loss and fragmentation. ► A species perspective for defining habitat quality and connectivity is claimed. ► Species-based tools are difficult to apply with limited availability of data. ► We propose a species-oriented and multiple scale-based qualitative approach. ► Advantages include being species-oriented and providing value-based information.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Era, Saho; Radiation Genetics, Graduate School of Medicine, Kyoto University, Yoshida-Konoe, Sakyo-ku, Kyoto 606-8501; Abe, Takuya

    Highlights: ► SENP1 knockout chicken DT40 cells are hypersensitive to spindle poisons. ► Spindle poison treatment of SENP1−/− cells leads to increased mitotic slippage. ► Mitotic slippage in SENP1−/− cells associates with apoptosis and endoreplication. ► SENP1 counteracts sister chromatid separation during mitotic arrest. ► Plk1-mediated cohesion down-regulation is involved in colcemid cytotoxicity. -- Abstract: SUMO conjugation is a reversible posttranslational modification that regulates protein function. SENP1 is one of the six SUMO-specific proteases present in vertebrate cells and its altered expression is observed in several carcinomas. To characterize the role of SENP1 in genome integrity, we generated Senp1 knockout chicken DT40 cells. SENP1−/− cells show normal proliferation, but are sensitive to spindle poisons. This hypersensitivity correlates with increased sister chromatid separation, mitotic slippage, and apoptosis. To test whether the cohesion defect had a causal relationship with the observed mitotic events, we restored the cohesive status of sister chromatids by introducing the TOP2α+/− mutation, which leads to increased catenation, or by inhibiting the Plk1 and Aurora B kinases that promote cohesin release from chromosomes during prolonged mitotic arrest. Although TOP2α is SUMOylated during mitosis, the TOP2α+/− mutation had no obvious effect. By contrast, inhibition of Plk1 or Aurora B rescued the hypersensitivity of SENP1−/− cells to colcemid. In conclusion, we identify SENP1 as a novel factor required for mitotic arrest and cohesion maintenance during prolonged mitotic arrest induced by spindle poisons.

  11. Interaction of Berberine derivative with protein POT1 affect telomere function in cancer cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, Nannan; Chen, Siqi; Ma, Yan

    Highlights: ► The protein POT1 plays an important role in telomere protection. ► Functional POT1 was overexpressed in Escherichia coli for the first time, and purified. ► Compound Sysu-00692 was found to be the first POT1-binding ligand. ► Sysu-00692 could interfere with the binding activity of POT1 in vivo. ► Sysu-00692 had inhibition on telomerase and cell proliferation. -- Abstract: The protein POT1 plays an important role in telomere protection, which is related with telomere elongation and cell immortality. The protein has been recognized as a promising drug target for cancer treatment. In the present study, we cloned, overexpressed in Escherichia coli for the first time, and purified recombinant human POT1. The protein was proved to be active through filter binding assay, FRET and CD experiments. In the initial screening for protein binding ligands using SPR, compound Sysu-00692 was found to bind well with the POT1, which was confirmed with EMSA. Its in vivo activity study showed that compound Sysu-00692 could interfere with the binding between human POT1 and the telomeric DNA through chromatin immunoprecipitation. Besides, the compound showed mild inhibition on telomerase and cell proliferation. As we know, compound Sysu-00692 is the first reported POT1-binding ligand, which could serve as a lead compound for further improvement. This work offered a potentially new approach for drug design for the treatment of cancers.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagland, Hanne R.; Nilsson, Linn I.H.; Burri, Lena

    Highlights: ► We investigated mechanisms of mitochondrial regulation in rat hepatocytes. ► Tetradecylthioacetic acid (TTA) was employed to activate mitochondrial oxidation. ► Mitochondrial biogenesis and respiration were induced. ► It was confirmed that PPAR target genes were induced. ► The mechanism involved activation of mTOR. -- Abstract: The hypolipidemic effect of peroxisome proliferator-activated receptor (PPAR) activators has been explained by increasing mitochondrial fatty acid oxidation, as observed in livers of rats treated with the pan-PPAR activator tetradecylthioacetic acid (TTA). PPAR activation does, however, not fully explain the metabolic adaptations observed in hepatocytes after treatment with TTA. We therefore characterized the mitochondrial effects, and linked this to signalling by the metabolic sensor, the mammalian target of rapamycin (mTOR). In hepatocytes isolated from TTA-treated rats, the changes in cellular content and morphology were consistent with hypertrophy. This was associated with induction of multiple mitochondrial biomarkers, including mitochondrial DNA, citrate synthase and mRNAs of mitochondrial proteins. Transcription analysis further confirmed activation of PPARα-associated genes, in addition to genes related to mitochondrial biogenesis and function. Analysis of mitochondrial respiration revealed that the capacity of both electron transport and oxidative phosphorylation was increased. These effects coincided with activation of the stress-related factor ERK1/2 and of mTOR. The protein level and phosphorylation of the downstream mTOR actors eIF4G and 4E-BP1 were induced. In summary, TTA increases mitochondrial respiration by inducing hypertrophy and mitochondrial biogenesis in rat hepatocytes, via adaptive regulation of PPARs as well as mTOR.

  13. Low-temperature sintering behavior of nanocrystalline indium tin oxide prepared from polymer-containing sols

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koroesi, Laszlo, E-mail: l.korosi@chem.u-szeged.hu; Papp, Szilvia; Oszko, Albert

    2012-04-15

    Highlights: ► The synthesis of ITO powders and thin films from PVP-containing sols is presented. ► The nano- and microstructures of ITO are more compact when PVP is used. ► PVP acts both as a steric stabilizer of the sol and as a pre-sintering agent. ► The PVP-induced enhanced sintering results in ITO with lower electrical resistance. ► The surface composition of the ITO films is independent of the initial PVP content. -- Abstract: Indium tin hydroxide (ITH) xerogel powders and thin films with different polyvinylpyrrolidone (PVP) contents (0-22%, w/w) were prepared by a classical sol-gel method. To obtain nanocrystalline indium tin oxide (ITO), the ITH xerogels were calcined at 550 °C. The effect of the initial polymer content on the structure of the ITO powders was studied by means of N₂-sorption measurements, small-angle X-ray scattering (SAXS), and transmission and scanning electron microscopy. The N₂-sorption measurements revealed that the ITO powders obtained contained micropores and that both their porosity and specific surface area decreased with increasing PVP content of the ITH xerogels. The SAXS measurements confirmed the enhanced sintering of the particles in the presence of PVP. The calculated mass fractal dimensions of the ITO powders increased significantly, indicating a significant compaction in structure. The pre-sintered structure could be achieved at relatively low temperature, which induced a significant decrease (three orders of magnitude) in the electrical resistance of the ITO films.

  14. 7 CFR 981.9 - Kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Kernel weight. 981.9 Section 981.9 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Definitions § 981.9 Kernel weight. Kernel weight means the weight of kernels, including...

  15. An SVM model with hybrid kernels for hydrological time series

    NASA Astrophysics Data System (ADS)

    Wang, C.; Wang, H.; Zhao, X.; Xie, Q.

    2017-12-01

    Support Vector Machine (SVM) models have been widely applied to the forecast of climate/weather and its impact on other environmental variables such as the hydrologic response to climate/weather. When using SVM, the choice of the kernel function plays the key role. Conventional SVM models mostly use a single type of kernel function, e.g., the radial basis kernel function. Given that several featured kernel functions are available, each having its own advantages and drawbacks, a combination of these kernel functions may give more flexibility and robustness to the SVM approach, making it suitable for a wide range of application scenarios. This paper presents such a linear combination of the radial basis kernel and the polynomial kernel for the forecast of monthly flowrate at two gaging stations using the SVM approach. The results indicate a significant improvement in the accuracy of the predicted series compared to the approach with either individual kernel function, thus demonstrating the feasibility and advantages of such a hybrid kernel approach for SVM applications.
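
    A minimal sketch of such a hybrid kernel for streamflow regression, assuming scikit-learn's SVR with a precomputed Gram matrix; the mixing weight, RBF width and polynomial degree below are illustrative and would normally be tuned, and the predictors are hypothetical lagged-flow features rather than the paper's data:

      import numpy as np
      from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
      from sklearn.svm import SVR

      def hybrid_kernel(X, Y, w=0.7, gamma=0.1, degree=2):
          """Linear combination of an RBF kernel and a polynomial kernel."""
          return w * rbf_kernel(X, Y, gamma=gamma) + (1 - w) * polynomial_kernel(X, Y, degree=degree)

      # Hypothetical lagged-flow predictors and next-month flowrate targets.
      rng = np.random.default_rng(0)
      X_train, X_test = rng.random((120, 4)), rng.random((24, 4))
      y_train = 10 + 5 * X_train[:, 0] + rng.normal(scale=0.5, size=120)

      model = SVR(kernel="precomputed", C=10.0).fit(hybrid_kernel(X_train, X_train), y_train)
      y_pred = model.predict(hybrid_kernel(X_test, X_train))   # Gram between test and train
      print(y_pred[:5])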

  16. Thermodynamics of novel charged dilatonic BTZ black holes

    NASA Astrophysics Data System (ADS)

    Dehghani, M.

    2017-10-01

    In this paper, the three-dimensional Einstein-Maxwell theory in the presence of a dilatonic scalar field has been studied. It has been shown that the dilatonic potential must be considered as the linear combination of two Liouville-type potentials. Two new classes of charged dilatonic BTZ black holes, as the exact solutions to the coupled scalar, vector and tensor field equations, have been obtained and their properties have been studied. The conserved charge and mass of the new black holes have been calculated, making use of Gauss's law and the Abbott-Deser proposal, respectively. Through comparison of the thermodynamic extensive quantities (i.e. temperature and entropy) obtained from both the geometrical and the thermodynamical methods, the validity of the first law of black hole thermodynamics has been confirmed for both of the new black holes obtained here. A black hole thermal stability or phase transition analysis has been performed, making use of the canonical ensemble method. Regarding the black hole heat capacity, it has been found that for either of the new black hole solutions there are specific ranges of horizon radius for which the black holes are locally stable. The points of type one and type two phase transitions have been determined. The black holes with horizon radius equal to the transition points are unstable; they undergo type one or type two phase transitions to be stabilized.

  17. Approximate kernel competitive learning.

    PubMed

    Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang

    2015-03-01

    Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be computed and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL) method, which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling works for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) method based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision against related approximate clustering approaches. Copyright © 2014 Elsevier Ltd. All rights reserved.
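
    A rough sketch of the sampling idea behind AKCL as described above: the kernel is only ever evaluated against a small sampled "landmark" subset, and winner-take-all prototype updates are carried out in that reduced kernel-feature space. This is a simplified stand-in under stated assumptions (RBF kernel, random landmark sampling, fixed learning rate), not the authors' algorithm:

      import numpy as np
      from sklearn.metrics.pairwise import rbf_kernel

      def akcl_sketch(X, n_clusters=3, n_landmarks=50, lr=0.1, epochs=5, seed=0):
          """Competitive learning on kernel features against a sampled landmark subset."""
          rng = np.random.default_rng(seed)
          landmarks = X[rng.choice(len(X), size=n_landmarks, replace=False)]
          Phi = rbf_kernel(X, landmarks, gamma=0.5)          # n x m kernel features
          prototypes = Phi[rng.choice(len(X), size=n_clusters, replace=False)].copy()
          for _ in range(epochs):
              for i in rng.permutation(len(X)):
                  winner = np.argmin(np.linalg.norm(prototypes - Phi[i], axis=1))
                  prototypes[winner] += lr * (Phi[i] - prototypes[winner])   # winner-take-all update
          labels = np.argmin(
              np.linalg.norm(Phi[:, None, :] - prototypes[None, :, :], axis=2), axis=1)
          return labels

      X = np.vstack([np.random.default_rng(k).normal(loc=3 * k, size=(100, 2)) for k in range(3)])
      print(np.bincount(akcl_sketch(X)))   # cluster sizes found by the sketch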

  18. Black layers on historical architecture.

    PubMed

    Toniolo, Lucia; Zerbi, Carlotta M; Bugini, Roberto

    2009-03-01

    The external surface of any building in a polluted urban environment is unavoidably destined to be covered with layers that assume a grey to black colour and are generally called 'black crusts'. According to standard protocols and glossaries, these are deteriorated surface layers of stone material; they can have variable thickness, are hard and fragile, and can detach spontaneously from the substrate, which, in general, is quite decayed. Plain visual examination may lead one to consider all 'black crusts' alike, whilst only a careful diagnostic investigation can distinguish 'black crusts' and the consequences of their formation on stone substrates. In this paper, various black layers on marble are studied and compared, and their morphological and compositional characteristics are discussed in relation to the mechanisms of formation. Differences between old (hundred-year) and recent (30-year) crusts are investigated and pointed out. Samples of black crusts collected from the Milan Cathedral façade (Candoglia Marble) have been studied and compared through the careful and synergistic use of traditional techniques: optical (transmitted and reflected VIS light) and electron microscopy, X-ray spectrometry and micro-Fourier transform infrared spectroscopy. Visual examination of loose fragments does not reveal outstanding differences amongst the various samples; the black layers have the same main mineral components, gypsum and airborne particles, with different spatial distributions. The microscopic studies pointed out differences in porosity, in the gypsum crystallisation habit, in the amount of embedded particles, and in the level and progress of marble decay. The observations lead to the definition of three main types of black crusts: black crust deriving from marble sulphation, compact deposit, and encrustation due to the deposition of exogenic materials. Black crusts show evidence of sulphation in progress, without a clear separation between crust and marble; the lack of separation is particularly evident in 'recent' crusts, where the sulphation process is more active. Black compact deposits show a higher porosity than black crusts because the gypsum does not come from chemical corrosion of the substrate but from outside; accordingly, in this case the substrate is sound. Encrustations show a highly regular crystal organisation of gypsum (close-packed tabular crystals) that cannot be traced back to casual atmospheric deposition or to corrosion of the substrate, but rather to the crystallisation of a solution coming from an external source. In this case too, the marble is sound, and evidence of the effect of a past protective treatment is pointed out. In spite of the apparent similarity of the examined samples, the analytical results have evidenced three main types of black crusts: black crust with a decayed substrate, compact deposit, and black encrustation showing a sound substrate underneath. Experimental evidence of calcite grain sulphation in progress, taking place according to a recently proposed model, has been observed. The sulphation process is prevented where particular conservation treatments had been applied in the past. New experimental studies could focus on the specific conditions (measurements of micro-climatic and thermodynamic parameters) and mechanisms of black crust formation in situ. The kinetics of the sulphation of marble, the formation of black layers on different carbonate stone materials, and acid attack in the presence of surface protective layers deserve further investigation.

  19. Multiple kernels learning-based biological entity relationship extraction method.

    PubMed

    Dongliang, Xu; Jingchang, Pan; Bailing, Wang

    2017-09-20

    Automatically extracting protein entity interaction information from biomedical literature can help to build protein relation networks and design new drugs. MEDLINE, the most authoritative textual database in the field of biomedicine, contains more than 20 million literature abstracts, and the collection grows exponentially over time. This rapid expansion of the biomedical literature is difficult to absorb or analyze manually, so efficient, automated search engines based on text mining techniques are needed to explore it. The P, R, and F values of the tag graph method on the AIMed corpus are 50.82, 69.76, and 58.61%, respectively, and the P, R, and F values of the tag graph kernel method on the other four evaluation corpora are 2-5% higher than those of the all-paths graph kernel. The P, R, and F values of the two methods fusing the feature kernel with the tag graph kernel are 53.43, 71.62, and 61.30% and 55.47, 70.29, and 60.37%, respectively, indicating that the performance of both kernel fusion methods is better than that of a single kernel. In comparison with the all-paths graph kernel method, the tag graph kernel method is superior in terms of overall performance. Experiments show that the performance of the multi-kernel method is better than that of the three separate single-kernel methods and the dual, mutually fused kernel methods used here on five corpus sets.

  20. 49 CFR 172.327 - Petroleum sour crude oil in bulk packaging.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... least 100 mm (3.9 inches). The width of the border forming the square-on-point marking must be at least... hydrogen sulfide vapors may occur. (b) The border of the square-on-point must be black or red on a white or other suitable contrasting background. The symbol must be black and located in the center of the square...

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krumeich, F., E-mail: krumeich@inorg.chem.ethz.ch; Mueller, E.; Wepf, R.A.

    While HRTEM is the well-established method to characterize the structure of dodecagonal tantalum (vanadium) telluride quasicrystals and their periodic approximants, phase-contrast imaging performed on an aberration-corrected scanning transmission electron microscope (STEM) represents a favorable alternative. The (Ta,V)₁₅₁Te₇₄ clusters, the basic structural unit in all these phases, can be visualized with high resolution. A dependence of the image contrast on defocus and specimen thickness has been observed. In thin areas, the projected crystal potential is basically imaged with either dark or bright contrast at two defocus values close to Scherzer defocus, as confirmed by image simulations utilizing the principle of reciprocity. Models for square-triangle tilings describing the arrangement of the basic clusters can be derived from such images. - Graphical abstract: PC-STEM image of a (Ta,V)₁₅₁Te₇₄ cluster. Highlights: ▶ Cₛ-corrected STEM is applied for the characterization of dodecagonal quasicrystals. ▶ The projected potential of the structure is mirrored in the images. ▶ Phase-contrast STEM imaging depends on defocus and thickness. ▶ For simulations of phase-contrast STEM images, the reciprocity theorem is applicable.

  2. CD36 is required for myoblast fusion during myogenic differentiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Seung-Yoon; Yun, Youngeun; Kim, In-San, E-mail: iskim@knu.ac.kr

    2012-11-02

    Highlights: ▶ CD36 expression was induced during myogenic differentiation. ▶ CD36 expression was localized in multinucleated myotubes. ▶ The expression of myogenic markers is attenuated in CD36 knockdown C2C12 cells. ▶ Knockdown of CD36 significantly inhibited myotube formation during differentiation. -- Abstract: Recently, CD36 has been found to be involved in the cytokine-induced fusion of macrophages. Myoblast fusion to form multinucleated myotubes is required for myogenesis and muscle regeneration. Because a search of a gene expression database revealed attenuation of CD36 expression in the muscles of muscular dystrophy patients, the possibility that CD36 could be required for myoblast fusion was investigated. CD36 expression was markedly up-regulated during myoblast differentiation and localized in multinucleated myotubes. Knockdown of endogenous CD36 significantly decreased the expression of myogenic markers as well as myotube formation. These results support the notion that CD36 plays an important role in cell fusion during myogenic differentiation. Our finding will aid the elucidation of the common mechanism governing cell-to-cell fusion in various fusion models.

  3. Closing the Black-White Gap in Birth Outcomes: A Life-course Approach

    PubMed Central

    Lu, Michael C.; Kotelchuck, Milton; Hogan, Vijaya; Jones, Loretta; Wright, Kynna; Halfon, Neal

    2015-01-01

    In the United States, Black infants have significantly worse birth outcomes than White infants. Over the past decades, public health efforts to address these disparities have focused primarily on increasing access to prenatal care; however, this has not closed the gap in birth outcomes. We propose a 12-point plan to reduce Black-White disparities in birth outcomes using a life-course approach. The first four points (increase access to interconception care, preconception care, quality prenatal care, and healthcare throughout the life course) address the needs of African American women for quality healthcare across the lifespan. The next four points (strengthen father involvement, systems integration, reproductive social capital, and community building) go beyond individual-level interventions to enhance the family and community systems that may influence the health of pregnant women, families, and communities. The last four points (close the education gap, reduce poverty, support working mothers, and undo racism) move beyond the biomedical model to address the social and economic inequities that underlie much of health disparities. Closing the Black-White gap in birth outcomes requires a life-course approach which addresses both early-life disadvantages and cumulative allostatic load over the life course. PMID:20629248

  4. A fraction of neurofibromin interacts with PML bodies in the nucleus of the CCF astrocytoma cell line

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Godin, Fabienne; Villette, Sandrine; Vallee, Beatrice

    Highlights: ▶ We validate the use of specific anti-Nf1 antibodies for immunofluorescence studies. ▶ We detect Nf1 in the cytoplasm and nucleus of CCF cells. ▶ We demonstrate that Nf1 partially colocalizes with PML nuclear bodies. ▶ We demonstrate that there is a direct interaction between a fraction of Nf1 and the PML bodies. -- Abstract: Neurofibromatosis type 1 is a common genetic disease that causes nervous system tumors and cognitive deficits. It is due to mutations within the NF1 gene, which encodes the Nf1 protein. Nf1 has been shown to be involved in the regulation of Ras, cAMP and actin cytoskeleton dynamics. In this study, using immunofluorescence experiments, we have shown a partial nuclear localization of Nf1 in the astrocytoma cell line CCF, and we have demonstrated that Nf1 partially colocalizes with PML (promyelocytic leukemia) nuclear bodies. A direct interaction between Nf1 and the multiprotein complex has further been demonstrated using an 'in situ' proximity ligation assay (PLA).

  5. C-Cr segregation at grain boundary before the carbide nucleation in Alloy 690

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Hui, E-mail: huili@shu.edu.cn; Laboratory for Microstructures, Shanghai University, Shanghai, 200444; Xia Shuang

    2012-04-15

    The grain boundary segregation in Alloy 690 was investigated by atom probe tomography. B, C and Si segregated at the grain boundary. The high-concentration regions of each segregating element form a set of straight arrays that are parallel to each other in the grain boundary plane. The concentration fluctuation has a periodicity of about 7 nm in the grain boundary plane. Before Cr₂₃C₆ nucleation at grain boundaries, C-Cr co-segregation was detected on one side of the grain boundaries rather than in the exact grain boundary core regions. The reasons why grain boundary carbides have a coherent orientation relationship with only one side of the neighbouring grains, when the grain boundary lies on a high-index crystal plane, are discussed. - Highlights: ▶ Grain boundary segregation in Alloy 690 was investigated by atom probe tomography. ▶ B, C and Si segregate at the grain boundary. ▶ The concentration of segregated atoms fluctuates periodically in the grain boundary plane. ▶ C and Cr co-segregate on one side of the grain boundary before carbide nucleation.

  6. Carbon sequestration, optimum forest rotation and their environmental impact

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kula, Erhun, E-mail: erhun.kula@bahcesehir.edu.tr; Gunalay, Yavuz, E-mail: yavuz.gunalay@bahcesehir.edu.tr

    2012-11-15

    Due to their large biomass, forests assume an important role in the global carbon cycle by moderating the greenhouse effect of atmospheric pollution. The Kyoto Protocol recognises this contribution by allocating carbon credits to countries which are able to create new forest areas. Sequestrated carbon provides an environmental benefit and thus must be taken into account in cost-benefit analyses of afforestation projects. Furthermore, like timber output, carbon credits are now tradable assets in the carbon exchange. Using British data, this paper looks at the issue of identifying the optimum felling age by considering carbon sequestration benefits simultaneously with timber yields. The results of this analysis show that the inclusion of carbon benefits prolongs the optimum cutting age by requiring trees to stand longer in order to soak up more CO₂. Consequently, this finding must be considered in any carbon accounting calculations. - Highlights: ▶ Carbon sequestration in forestry is an environmental benefit. ▶ It moderates the problem of global warming. ▶ It prolongs the gestation period in harvesting. ▶ This paper uses British data from less favoured districts for growing the Sitka spruce species.

  7. Closing the Black-White gap in birth outcomes: a life-course approach.

    PubMed

    Lu, Michael C; Kotelchuck, Milton; Hogan, Vijaya; Jones, Loretta; Wright, Kynna; Halfon, Neal

    2010-01-01

    In the United States, Black infants have significantly worse birth outcomes than White infants. Over the past decades, public health efforts to address these disparities have focused primarily on increasing access to prenatal care; however, this has not closed the gap in birth outcomes. We propose a 12-point plan to reduce Black-White disparities in birth outcomes using a life-course approach. The first four points (increase access to interconception care, preconception care, quality prenatal care, and healthcare throughout the life course) address the needs of African American women for quality healthcare across the lifespan. The next four points (strengthen father involvement, systems integration, reproductive social capital, and community building) go beyond individual-level interventions to enhance the family and community systems that may influence the health of pregnant women, families, and communities. The last four points (close the education gap, reduce poverty, support working mothers, and undo racism) move beyond the biomedical model to address the social and economic inequities that underlie much of health disparities. Closing the Black-White gap in birth outcomes requires a life-course approach which addresses both early-life disadvantages and cumulative allostatic load over the life course.

  8. Functionalization of multi-walled carbon nanotubes by epoxide ring-opening polymerization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jin Fanlong; Rhee, Kyong Yop; Park, Soo-Jin, E-mail: sjpark@inha.ac.kr

    2011-12-15

    In this study, covalent functionalization of carbon nanotubes (CNTs) was accomplished by surface-initiated epoxide ring-opening polymerization. FT-IR spectra showed that polyether and epoxide groups were covalently attached to the sidewalls of the CNTs. TGA results indicated that the polyether was successfully grown from the CNT surface, with the final products having a polymer weight percentage of ca. 14-74 wt%. The O/C ratio of the CNTs increased significantly from 5.1% to 29.8% after surface functionalization. SEM and TEM images of the functionalized CNTs showed that the tubes were enwrapped by polymer chains several nanometers thick, forming core-shell structures with the CNTs at the center. - Graphical abstract: Functionalized CNTs were enwrapped by polymer chains several nanometers thick, forming core-shell structures with the CNTs at the center. Highlights: ▶ CNTs were functionalized by epoxide ring-opening polymerization. ▶ Polyether and epoxide groups covalently attached to the sidewalls of CNTs. ▶ Functionalized CNTs have a polymer weight percentage of ca. 14-74 wt%. ▶ Functionalized CNTs were enwrapped by polymer chains several nanometers thick.

  9. MicroRNA-27a promotes myoblast proliferation by targeting myostatin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhiqing; Chen, Xiaoling; Yu, Bing

    2012-06-29

    Highlights: ▶ We identified a myogenic role for miR-27a and a new target, myostatin. ▶ miR-27a was confirmed to target the myostatin 3′UTR. ▶ miR-27a is upregulated and myostatin is downregulated during myoblast proliferation. ▶ miR-27a promotes myoblast proliferation by reducing the expression of myostatin. -- Abstract: MicroRNAs (miRNAs) are a class of endogenous non-coding RNAs that play critical roles in skeletal muscle development as well as in regulation of muscle cell proliferation and differentiation. However, the role of miRNAs in myoblast proliferation remains poorly understood. Here we found that the expression of miR-27a was increased during proliferation of C2C12 myoblasts. Moreover, overexpression of miR-27a in C2C12 cells promoted myoblast proliferation by reducing the expression of myostatin, a critical inhibitor of skeletal myogenesis. In addition, miR-27a was confirmed to target the myostatin 3′UTR by a luciferase reporter analysis. Together, these results suggest that miR-27a promotes myoblast proliferation through targeting myostatin.

  10. N-fold Darboux transformation and double-Wronskian-typed solitonic structures for a variable-coefficient modified Korteweg-de Vries equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Lei, E-mail: wanglei2239@126.com; Gao, Yi-Tian; State Key Laboratory of Software Development Environment, Beijing University of Aeronautics and Astronautics, Beijing 100191

    2012-08-15

    Under investigation in this paper is a variable-coefficient modified Korteweg-de Vries (vc-mKdV) model describing certain situations in fluid mechanics, ocean dynamics and plasma physics. An N-fold Darboux transformation (DT) of a variable-coefficient Ablowitz-Kaup-Newell-Segur spectral problem is constructed via a gauge transformation. Multi-solitonic solutions in terms of the double Wronskian for the vc-mKdV model are derived by the reduction of the N-fold DT. Three types of solitonic interactions are discussed through figures: (1) overtaking collision; (2) head-on collision; (3) parallel solitons. The nonlinear, dispersive and dissipative terms affect the velocities of the solitonic waves, while the amplitudes of the waves depend on the perturbation term. - Highlights: ▶ The N-fold DT is applied for the first time to a vc-AKNS spectral problem. ▶ Seeking a double Wronskian solution is changed into solving two systems. ▶ Effects of the variable coefficients on the multi-solitonic waves are discussed in detail. ▶ This work solves the problem from Yi Zhang [Ann. Phys. 323 (2008) 3059].

  11. Downregulation of tumor suppressor QKI in gastric cancer and its implication in cancer prognosis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bian, Yongqian; Wang, Li; Lu, Huanyu

    2012-05-25

    Highlights: ▶ QKI expression is decreased in gastric cancer samples. ▶ Promoter hypermethylation contributes to the downregulation of QKI. ▶ QKI inhibits the growth of gastric cancer cells. ▶ Decreased QKI expression predicts poor survival. -- Abstract: Gastric cancer (GC) is the fourth most common cancer and the second leading cause of cancer-related death worldwide. The RNA-binding protein Quaking (QKI) is a newly identified tumor suppressor in multiple cancers, while its role in GC is largely unknown. Our study aimed to clarify the relationship between QKI expression and the clinicopathologic characteristics and prognosis of GC. In the specimens of 222 GC patients, QKI expression was found to be significantly decreased in most of the GC tissues, which was largely due to promoter hypermethylation. QKI overexpression reduced the proliferative ability of a GC cell line in an in vitro study. In addition, reduced QKI expression correlated well with poor differentiation status, depth of invasion, gastric lymph node metastasis, distant metastasis, advanced TNM stage, and poor survival. Multivariate analysis showed QKI expression was an independent prognostic factor for patient survival.

  12. A viewpoint on the approval context of strategic environmental assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kontic, Branko; Kontic, Davor, E-mail: davor.kontic@ijs.si

    A reflection on the last report from the Commission to the Council, the European Parliament, the European Economic and Social Committee and the Committee of the Regions on the application and effectiveness of the Directive on Strategic Environmental Assessment (SEA) is provided. It covers the inadequacies of the approval/permitting context of SEA, which appears to have been increasingly applied by a significant number of Member States in recent years. A viewpoint is provided on the main deficiencies of such praxis. As a practical defence of the planning context of SEA, the authors propose that the EC should consider a clear recommendation to Member States to cease performing SEA in the approval/permitting context until proper amendments to the SEA Directive are made and implemented. - Highlights: ▶ The administrative and permitting context of SEA has ousted the primary environmental impact assessment goal. ▶ The approval context moves from environmental protection to the area of political power and economy. ▶ SEA and EIA are misused. ▶ Environmental evaluations should be used for improving projects/plans/programmes and not for permitting them.

  13. 7 CFR 51.2295 - Half kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Half kernel. 51.2295 Section 51.2295 Agriculture... Standards for Shelled English Walnuts (Juglans Regia) Definitions § 51.2295 Half kernel. Half kernel means the separated half of a kernel with not more than one-eighth broken off. ...

  14. 7 CFR 810.206 - Grades and grade requirements for barley.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... weight per bushel (pounds) Sound barley (percent) Maximum Limits of— Damaged kernels 1 (percent) Heat damaged kernels (percent) Foreign material (percent) Broken kernels (percent) Thin barley (percent) U.S... or otherwise of distinctly low quality. 1 Includes heat-damaged kernels. Injured-by-frost kernels and...

  15. Photosynthesis in black and red spruce and their hybrid derivatives: ecological isolation and hybrid adaptive inferiority

    Treesearch

    S.A.M Manley; F. Thomas Ledig

    1979-01-01

    Photosynthetic responses of black and red spruce were used to define parameters of their fundamental niches. Grown at warm temperature, black spruce had highest rates of CO2 uptake at high light intensities, fitting it for a pioneering role, while red spruce had the lowest light compensation point, fitting it for a late successional role. Black...

  16. Race Still Matters: Considerations for Mentoring Black Women in Academe

    ERIC Educational Resources Information Center

    Holmes, Sharon L.; Land, Lynette Danley; Hinton-Hudson, Veronica D.

    2007-01-01

    We investigated the experiences of Black women faculty employed by predominantly White institutions. Using extant literature interwoven with narrative data, we provided an analysis of how some Black women experience mentoring and/or the mentor-mentee relationship. Emergent themes suggested two significant career trajectory points for the faculty…

  17. Reaching Black Men. Commentary

    ERIC Educational Resources Information Center

    Gassman, Marybeth

    2010-01-01

    Journalist Elizabeth Redden brings to the surface several salient issues in her article entitled, "Reaching Black Men." First, she illuminates the fact that access is not enough when it comes to educating African American men. Second, she points to the importance of having campus-wide initiatives to support the success of Black men. And…

  18. 40 CFR 458.45 - Standards of performance for new sources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... paragraph, which may be discharged from the carbon black lamp process by a new source subject to the provisions of this subpart: There shall be no discharge of process waste water pollutants to navigable waters. ...) EFFLUENT GUIDELINES AND STANDARDS CARBON BLACK MANUFACTURING POINT SOURCE CATEGORY Carbon Black Lamp...

  19. 7 CFR 51.1449 - Damage.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ...) Kernel which is “dark amber” or darker color; (e) Kernel having more than one dark kernel spot, or one dark kernel spot more than one-eighth inch in greatest dimension; (f) Shriveling when the surface of the kernel is very conspicuously wrinkled; (g) Internal flesh discoloration of a medium shade of gray...

  20. 7 CFR 51.1449 - Damage.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) Kernel which is “dark amber” or darker color; (e) Kernel having more than one dark kernel spot, or one dark kernel spot more than one-eighth inch in greatest dimension; (f) Shriveling when the surface of the kernel is very conspicuously wrinkled; (g) Internal flesh discoloration of a medium shade of gray...

  1. 7 CFR 51.2125 - Split or broken kernels.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Split or broken kernels. 51.2125 Section 51.2125 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... kernels. Split or broken kernels means seven-eighths or less of complete whole kernels but which will not...

  2. 7 CFR 51.2296 - Three-fourths half kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Three-fourths half kernel. 51.2296 Section 51.2296 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards...-fourths half kernel. Three-fourths half kernel means a portion of a half of a kernel which has more than...

  3. The Classification of Diabetes Mellitus Using Kernel k-means

    NASA Astrophysics Data System (ADS)

    Alamsyah, M.; Nafisah, Z.; Prayitno, E.; Afida, A. M.; Imah, E. M.

    2018-01-01

    Diabetes mellitus is a metabolic disorder characterized by chronically elevated blood glucose. Automatic detection of diabetes mellitus is still challenging. This study detected diabetes mellitus using the kernel k-means algorithm. Kernel k-means is an algorithm developed from the k-means algorithm: it uses kernel learning, which enables it to handle data that are not linearly separable, in contrast to ordinary k-means. The performance of kernel k-means in detecting diabetes mellitus was also compared with the SOM algorithm. The experimental results show that kernel k-means performs well and considerably better than SOM.
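
    As a minimal sketch of the clustering step (not the paper's exact setup), kernel k-means can be run on a precomputed RBF Gram matrix; the synthetic two-cluster data below stand in for the diabetes features, and the gamma value is an assumption.

        # Minimal kernel k-means sketch on a precomputed Gram matrix (illustrative only).
        import numpy as np
        from sklearn.metrics.pairwise import rbf_kernel

        def kernel_kmeans(K, n_clusters, n_iter=50, seed=0):
            """Kernel k-means: assign each point to the nearest cluster mean in feature space."""
            n = K.shape[0]
            rng = np.random.default_rng(seed)
            labels = rng.integers(n_clusters, size=n)
            for _ in range(n_iter):
                dist = np.zeros((n, n_clusters))
                for c in range(n_clusters):
                    mask = labels == c
                    nc = max(mask.sum(), 1)
                    # ||phi(x_i) - mu_c||^2 up to the constant K_ii term, which does not affect argmin
                    dist[:, c] = -2.0 * K[:, mask].sum(axis=1) / nc + K[np.ix_(mask, mask)].sum() / nc**2
                new_labels = dist.argmin(axis=1)
                if np.array_equal(new_labels, labels):
                    break
                labels = new_labels
            return labels

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (50, 8)), rng.normal(3, 1, (50, 8))])  # stand-in for patient features
        K = rbf_kernel(X, gamma=0.1)
        print("cluster sizes:", np.bincount(kernel_kmeans(K, n_clusters=2)))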

  4. UNICOS Kernel Internals Application Development

    NASA Technical Reports Server (NTRS)

    Caredo, Nicholas; Craw, James M. (Technical Monitor)

    1995-01-01

    Having an understanding of UNICOS Kernel Internals is valuable information. However, having the knowledge is only half the value. The second half comes with knowing how to use this information and apply it to the development of tools. The kernel contains vast amounts of useful information that can be utilized. This paper discusses the intricacies of developing utilities that utilize kernel information. In addition, algorithms, logic, and code will be discussed for accessing kernel information. Code segments will be provided that demonstrate how to locate and read kernel structures. Types of applications that can utilize kernel information will also be discussed.

  5. Detection of maize kernels breakage rate based on K-means clustering

    NASA Astrophysics Data System (ADS)

    Yang, Liang; Wang, Zhuo; Gao, Lei; Bai, Xiaoping

    2017-04-01

    In order to optimize the recognition accuracy of maize kernel breakage detection and improve its efficiency, this paper uses computer vision technology to detect maize kernel breakage based on the K-means clustering algorithm. First, the collected RGB images are converted into Lab images, and the clarity of the original images is evaluated with the Sobel 8-gradient energy function. Finally, maize kernel breakage is detected using different pixel acquisition devices and different shooting angles. In this paper, broken maize kernels are identified by the color difference between intact kernels and broken kernels. The image clarity evaluation and the different shooting angles verify that the clarity and shooting angle of the images have a direct influence on feature extraction. The results show that the K-means clustering algorithm can distinguish broken maize kernels effectively.
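
    A minimal sketch of the colour-based segmentation step described above (RGB converted to Lab, then k-means on pixel colours). The file name, cluster count, and the use of scikit-image/scikit-learn are assumptions for illustration, not the paper's implementation.

        # Sketch: convert an RGB image to Lab and cluster pixel colours with k-means.
        import numpy as np
        from skimage import io, color
        from sklearn.cluster import KMeans

        img = io.imread("maize_kernels.png")[:, :, :3]        # hypothetical input image
        lab = color.rgb2lab(img)                               # convert RGB to Lab colour space
        pixels = lab.reshape(-1, 3)

        # Broken endosperm, intact pericarp and background typically fall into distinct colour clusters.
        km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(pixels)
        segmented = km.labels_.reshape(img.shape[:2])
        print("pixels per cluster:", np.bincount(km.labels_))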

  6. Modeling adaptive kernels from probabilistic phylogenetic trees.

    PubMed

    Nicotra, Luca; Micheli, Alessio

    2009-01-01

    Modeling phylogenetic interactions is an open issue in many computational biology problems. In the context of gene function prediction, we introduce a class of kernels for structured data leveraging a hierarchical probabilistic modeling of phylogeny among species. We derive three kernels belonging to this setting: a sufficient statistics kernel, a Fisher kernel, and a probability product kernel. The new kernels are used in the context of support vector machine learning. The kernels' adaptivity is obtained through the estimation of the parameters of a tree-structured model of evolution, using as observed data phylogenetic profiles encoding the presence or absence of specific genes in a set of fully sequenced genomes. We report results obtained in the prediction of the functional class of the proteins of the budding yeast Saccharomyces cerevisiae, which compare favorably to a standard vector-based kernel and to a non-adaptive tree kernel function. A further comparative analysis is performed in order to assess the impact of the different components of the proposed approach. We show that the key features of the proposed kernels are their adaptivity to the input domain and their ability to deal with structured data interpreted through a graphical model representation.

  7. Aflatoxin and nutrient contents of peanut collected from local market and their processed foods

    NASA Astrophysics Data System (ADS)

    Ginting, E.; Rahmianna, A. A.; Yusnawan, E.

    2018-01-01

    Peanut is susceptible to aflatoxin contamination, and the source of the peanuts as well as the processing methods considerably affect the aflatoxin content of the products. Therefore, a study of the aflatoxin and nutrient contents of peanuts collected from a local market and of their processed foods was performed. Good peanut kernels were processed into fried peanut, pressed-fried peanut, peanut sauce, peanut press cake, fermented peanut press cake (tempe) and fried tempe, while blended kernels (good and poor kernels) were processed into peanut sauce and tempe, and poor kernels were processed only into tempe. The results showed that good and blended kernels, which had high proportions of sound/intact kernels (82.46% and 62.09%), contained 9.8-9.9 ppb of aflatoxin B1, while a slightly higher level was seen in poor kernels (12.1 ppb). However, the moisture, ash, protein, and fat contents of the kernels were similar, as were those of the products. Peanut tempe and fried tempe showed the highest increase in protein content, while decreased fat contents were seen in all products. The increase in aflatoxin B1 of peanut tempe prepared from poor kernels > blended kernels > good kernels. However, it decreased by an average of 61.2% after deep-frying. Excluding peanut tempe and fried tempe, aflatoxin B1 levels in all products derived from good kernels were below the permitted level (15 ppb). This suggests that sorting peanut kernels as ingredients, followed by heat processing, would decrease the aflatoxin content in the products.

  8. Partial Deconvolution with Inaccurate Blur Kernel.

    PubMed

    Ren, Dongwei; Zuo, Wangmeng; Zhang, David; Xu, Jun; Zhang, Lei

    2017-10-17

    Most non-blind deconvolution methods are developed under the error-free kernel assumption, and are not robust to inaccurate blur kernel. Unfortunately, despite the great progress in blind deconvolution, estimation error remains inevitable during blur kernel estimation. Consequently, severe artifacts such as ringing effects and distortions are likely to be introduced in the non-blind deconvolution stage. In this paper, we tackle this issue by suggesting: (i) a partial map in the Fourier domain for modeling kernel estimation error, and (ii) a partial deconvolution model for robust deblurring with inaccurate blur kernel. The partial map is constructed by detecting the reliable Fourier entries of estimated blur kernel. And partial deconvolution is applied to wavelet-based and learning-based models to suppress the adverse effect of kernel estimation error. Furthermore, an E-M algorithm is developed for estimating the partial map and recovering the latent sharp image alternatively. Experimental results show that our partial deconvolution model is effective in relieving artifacts caused by inaccurate blur kernel, and can achieve favorable deblurring quality on synthetic and real blurry images.

  9. Chandra Observations of the M31

    NASA Technical Reports Server (NTRS)

    Garcia, Michael; Lavoie, Anthony R. (Technical Monitor)

    2000-01-01

    We report on Chandra observations of the nearest spiral galaxy, M31. The nuclear source seen with previous X-ray observatories is resolved into five point sources. One of these sources is within 1 arcsec of the M31 central super-massive black hole. As compared to the other point sources in M31, this nuclear source has an unusually soft spectrum. Based on the spatial coincidence and the unusual spectrum, we identify this source with the central black hole. A bright transient is detected 26 arcsec to the west of the nucleus, which may be associated with a stellar-mass black hole. We will report on a comparison of the X-ray spectrum of the diffuse emission and point sources seen in the central few arcmin.

  10. Technical Note: Dose gradients and prescription isodose in orthovoltage stereotactic radiosurgery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fagerstrom, Jessica M., E-mail: fagerstrom@wisc.edu; Bender, Edward T.; Culberson, Wesley S.

    Purpose: The purpose of this work is to examine the trade-off between prescription isodose and dose gradients in orthovoltage stereotactic radiosurgery. Methods: Point energy deposition kernels (EDKs) describing photon and electron transport were calculated using Monte Carlo methods. EDKs were generated from 10 to 250 keV, in 10 keV increments. The EDKs were converted to pencil beam kernels and used to calculate dose profiles through isocenter from a 4π isotropic delivery from all angles of circularly collimated beams. Monoenergetic beams and an orthovoltage polyenergetic spectrum were analyzed. The dose gradient index (DGI) is the ratio of the 50% prescription isodose volume to the 100% prescription isodose volume and represents a metric by which dose gradients in stereotactic radiosurgery (SRS) may be evaluated. Results: Using the 4π dose profiles calculated using pencil beam kernels, the relationship between DGI and prescription isodose was examined for circular cones ranging from 4 to 18 mm in diameter and monoenergetic photon beams with energies ranging from 20 to 250 keV. Values were found to exist for prescription isodose that optimize DGI. Conclusions: The relationship between DGI and prescription isodose was found to be dependent on both field size and energy. Examining this trade-off is an important consideration for designing optimal SRS systems.
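
    The DGI definition above translates directly into a voxel count on a dose grid. The sketch below is illustrative only: the spherical dose fall-off, voxel grid, and prescription level are synthetic assumptions, not data from the study.

        # Dose gradient index (DGI) = V(50% of prescription) / V(100% of prescription) on a toy 3-D dose grid.
        import numpy as np

        x, y, z = np.meshgrid(*[np.linspace(-20, 20, 81)] * 3, indexing="ij")  # 0.5 mm voxels (mm units)
        r = np.sqrt(x**2 + y**2 + z**2)
        dose = np.exp(-(r / 6.0) ** 2)           # toy spherical dose fall-off

        rx = 0.8 * dose.max()                    # assumed prescription isodose level
        v100 = np.count_nonzero(dose >= rx)
        v50 = np.count_nonzero(dose >= 0.5 * rx)
        print("DGI =", v50 / v100)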

  11. Appraisal of ALM predictions of turbulent wake features

    NASA Astrophysics Data System (ADS)

    Rocchio, Benedetto; Cilurzo, Lorenzo; Ciri, Umberto; Salvetti, Maria Vittoria; Leonardi, Stefano

    2017-11-01

    Wind turbine blades create a turbulent wake that may persist far downstream, with significant implications on wind farm design and on its power production. The numerical representation of the real blade geometry would lead to simulations beyond the present computational resources. We focus our attention on the Actuator Line Model (ALM), in which the blade is replaced by a rotating line divided into finite segments with representative aerodynamic coefficients. The total aerodynamic force is projected along the computational axis and, to avoid numerical instabilities, it is distributed among the nearest grid points by using a Gaussian regularization kernel. The standard deviation of this kernel is a fundamental parameter that strongly affects the characteristics of the wake. We compare here the wake features obtained in direct numerical simulations of the flow around 2D bodies (a flat plate and an airfoil) modeled using the Immersed Boundary Method with the results of simulations in which the body is modeled by ALM. In particular, we investigate whether the ALM is able to reproduce the mean velocity field and the turbulent kinetic energy in the wake for the considered bodies at low and high angles of attack and how this depends on the choice of the ALM kernel. S. Leonardi was supported by the National Science Foundation, Grant No. 1243482 (the WINDINSPIRE project).
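
    The Gaussian regularization kernel mentioned above can be sketched in a few lines: each actuator point's aerodynamic force is spread onto nearby grid nodes with a 3-D Gaussian of width eps. This is an illustrative sketch only; the grid, eps value, and force vector are assumptions, not the simulation setup used in the study.

        # Sketch of the ALM body-force projection with a Gaussian regularization kernel.
        import numpy as np

        def project_force(grid_pts, actuator_pt, force_vec, eps=0.1):
            """Distribute a point force onto grid points as a force density (per unit volume)."""
            r2 = np.sum((grid_pts - actuator_pt) ** 2, axis=1)
            eta = np.exp(-r2 / eps**2) / (eps**3 * np.pi**1.5)   # commonly used 3-D Gaussian kernel
            return eta[:, None] * force_vec

        # Toy uniform grid and a single blade-segment force.
        g = np.linspace(0.0, 1.0, 41)
        dx = g[1] - g[0]
        grid = np.array(np.meshgrid(g, g, g, indexing="ij")).reshape(3, -1).T
        f = project_force(grid, np.array([0.5, 0.5, 0.5]), np.array([0.0, 0.0, -1.0]))
        print("integrated force:", f.sum(axis=0) * dx**3)   # should approximately recover (0, 0, -1)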

  12. DANCING IN THE DARK: NEW BROWN DWARF BINARIES FROM KERNEL PHASE INTERFEROMETRY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, Benjamin; Tuthill, Peter; Martinache, Frantz, E-mail: bjsp@physics.usyd.edu.au, E-mail: p.tuthill@physics.usyd.edu.au, E-mail: frantz@naoj.org

    2013-04-20

    This paper revisits a sample of ultracool dwarfs in the solar neighborhood previously observed with the Hubble Space Telescope's NICMOS NIC1 instrument. We have applied a novel high angular resolution data analysis technique based on the extraction and fitting of kernel phases to archival data. This was found to deliver a dramatic improvement over earlier analysis methods, permitting a search for companions down to projected separations of ~1 AU on NIC1 snapshot images. We reveal five new close binary candidates and present revised astrometry on previously known binaries, all of which were recovered with the technique. The new candidate binaries have sufficiently close separation to determine dynamical masses in a short-term observing campaign. We also present four marginal detections of objects which may be very close binaries or high-contrast companions. Including only confident detections within 19 pc, we report a binary fraction of at least ε_b = 17.2 (+5.7, −3.7)%. The results reported here provide new insights into the population of nearby ultracool binaries, while also offering an incisive case study of the benefits conferred by the kernel phase approach in the recovery of companions within a few resolution elements of the point-spread function core.

  13. Racial differences in vascular risk factors and outcomes of patients with intracranial atherosclerotic arterial stenosis.

    PubMed

    Waddy, Salina P; Cotsonis, George; Lynn, Michael J; Frankel, Michael R; Chaturvedi, Seemant; Williams, Janice E; Chimowitz, Marc

    2009-03-01

    Atherosclerotic intracranial stenosis is an important cause of stroke in blacks, yet there are limited data on vascular risk factors and outcome. We analyzed the vascular risk factors and outcomes of blacks and whites in the Warfarin versus Aspirin for Symptomatic Intracranial Disease (WASID) trial. Baseline characteristics and outcomes (ischemic stroke, brain hemorrhage, or vascular death combined and ischemic stroke alone) were compared between blacks (n=174) and whites (n=331) using univariate and multivariate analyses. Blacks were significantly (P<0.05) more likely than whites to be/have: female, hypertension history, diabetes history, higher LDL, higher total cholesterol, lower triglycerides, unmarried, unemployed, nonprivate insurance, no insurance, stroke as qualifying event, <70% stenosis, symptomatic anterior circulation vessel, no antithrombotic medication before qualifying event, and no family history of myocardial infarction. Blacks more frequently reached an end point of ischemic stroke, brain hemorrhage or vascular death (28% versus 20%; hazard ratio of 1.49, 95% CI 1.03 to 2.17, P=0.03), had a higher 2-year event rate (0.28 versus 0.19), and reached the end point of ischemic stroke alone (25% versus 16% at 2 years; hazard ratio of 1.62, P=0.017). In multivariate analysis, race was associated with ischemic stroke (P=0.0488) but not with the end point ischemic stroke, brain hemorrhage or vascular death (P=0.188). Blacks with intracranial stenosis are at higher risk of stroke recurrence than whites. This risk warrants additional study of factors contributing to stroke in blacks and highlights the need for aggressive risk factor management in blacks to prevent recurrence.

  14. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  15. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  16. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  17. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  18. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  19. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Half-kernel. 51.1441 Section 51.1441 Agriculture... Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of the separated halves of an entire pecan kernel with not more than one-eighth of its original volume missing...

  20. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Kernel color classification. 51.1403 Section 51.1403... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Kernel Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the color...

  1. Wavelet SVM in Reproducing Kernel Hilbert Space for hyperspectral remote sensing image classification

    NASA Astrophysics Data System (ADS)

    Du, Peijun; Tan, Kun; Xing, Xiaoshi

    2010-12-01

    Combining the Support Vector Machine (SVM) with wavelet analysis, we constructed a wavelet SVM (WSVM) classifier based on wavelet kernel functions in a Reproducing Kernel Hilbert Space (RKHS). In conventional kernel theory, SVM faces the bottleneck of kernel parameter selection, which results in time-consuming computation and low classification accuracy. The wavelet kernel in RKHS is a kind of multidimensional wavelet function that can approximate arbitrary nonlinear functions. Implications for semiparametric estimation are proposed in this paper. An Airborne Operational Modular Imaging Spectrometer II (OMIS II) hyperspectral remote sensing image with 64 bands and Reflective Optics System Imaging Spectrometer (ROSIS) data with 115 bands were used to evaluate the performance and accuracy of the proposed WSVM classifier. The experimental results indicate that the WSVM classifier obtains the highest accuracy when using the Coiflet kernel function in the wavelet transform. In contrast with some traditional classifiers, including Spectral Angle Mapping (SAM) and Minimum Distance Classification (MDC), and an SVM classifier using the Radial Basis Function kernel, the proposed wavelet SVM classifier using the wavelet kernel function in a Reproducing Kernel Hilbert Space noticeably improves classification accuracy.
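
    As a sketch of the idea, a translation-invariant wavelet kernel can be passed to an SVM as a callable kernel. The Morlet-type wavelet kernel below is a common choice in the literature, not necessarily the Coiflet kernel used in the paper; the dilation parameter and the toy spectral data are assumptions.

        # Sketch: wavelet kernel SVM with a Morlet-type translation-invariant wavelet kernel.
        import numpy as np
        from sklearn.svm import SVC

        def wavelet_kernel(X, Y, a=1.0):
            """K(x, y) = prod_i cos(1.75*(x_i - y_i)/a) * exp(-((x_i - y_i)/a)^2 / 2)."""
            diff = (X[:, None, :] - Y[None, :, :]) / a
            return np.prod(np.cos(1.75 * diff) * np.exp(-0.5 * diff**2), axis=2)

        rng = np.random.default_rng(0)
        X = rng.standard_normal((200, 16))                 # stand-in for per-pixel spectra
        y = (X[:, :8].mean(axis=1) > 0).astype(int)        # toy two-class labels

        clf = SVC(kernel=wavelet_kernel, C=10.0).fit(X[:150], y[:150])
        print("test accuracy:", clf.score(X[150:], y[150:]))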

  2. A trace ratio maximization approach to multiple kernel-based dimensionality reduction.

    PubMed

    Jiang, Wenhao; Chung, Fu-lai

    2014-01-01

    Most dimensionality reduction techniques are based on one metric or one kernel, hence it is necessary to select an appropriate kernel for kernel-based dimensionality reduction. Multiple kernel learning for dimensionality reduction (MKL-DR) has been recently proposed to learn a kernel from a set of base kernels which are seen as different descriptions of data. As MKL-DR does not involve regularization, it might be ill-posed under some conditions and consequently its applications are hindered. This paper proposes a multiple kernel learning framework for dimensionality reduction based on regularized trace ratio, termed as MKL-TR. Our method aims at learning a transformation into a space of lower dimension and a corresponding kernel from the given base kernels among which some may not be suitable for the given data. The solutions for the proposed framework can be found based on trace ratio maximization. The experimental results demonstrate its effectiveness in benchmark datasets, which include text, image and sound datasets, for supervised, unsupervised as well as semi-supervised settings. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Distributed smoothed tree kernel for protein-protein interaction extraction from the biomedical literature

    PubMed Central

    Murugesan, Gurusamy; Abdulkadhar, Sabenabanu; Natarajan, Jeyakumar

    2017-01-01

    Automatic extraction of protein-protein interaction (PPI) pairs from biomedical literature is a widely examined task in biological information extraction. Currently, many kernel-based approaches, such as the linear kernel, tree kernel, graph kernel and combinations of multiple kernels, have achieved promising results on the PPI task. However, most of these kernel methods fail to capture the semantic relation information between two entities. In this paper, we present a special type of tree kernel for PPI extraction which exploits both syntactic (structural) and semantic vector information, known as the Distributed Smoothed Tree Kernel (DSTK). DSTK comprises distributed trees carrying syntactic information along with distributional semantic vectors representing the semantic information of the sentences or phrases. To generate a robust machine learning model, a feature-based kernel and DSTK were combined using an ensemble support vector machine (SVM). Five different corpora (AIMed, BioInfer, HPRD50, IEPA, and LLL) were used for evaluating the performance of our system. Experimental results show that our system achieves a better f-score on the five corpora than other state-of-the-art systems. PMID:29099838

  4. Hadamard Kernel SVM with applications for breast cancer outcome predictions.

    PubMed

    Jiang, Hao; Ching, Wai-Ki; Cheung, Wai-Shun; Hou, Wenpin; Yin, Hong

    2017-12-21

    Breast cancer is one of the leading causes of death for women. It is of great necessity to develop effective methods for breast cancer detection and diagnosis. Recent studies have focused on gene-based signatures for outcome predictions. Kernel SVM, for its discriminative power in dealing with small-sample pattern recognition problems, has attracted a lot of attention. But how to select or construct an appropriate kernel for a specified problem still needs further investigation. Here we propose a novel kernel (the Hadamard kernel) in conjunction with Support Vector Machines (SVMs) to address the problem of breast cancer outcome prediction using gene expression data. The Hadamard kernel outperforms the classical kernels and the correlation kernel in terms of Area Under the ROC Curve (AUC) values on a number of real-world data sets adopted to test the performance of the different methods. Hadamard kernel SVM is effective for breast cancer predictions, in terms of both prognosis and diagnosis. It may benefit patients by guiding therapeutic options. Apart from that, it would be a valuable addition to the current SVM kernel families. We hope it will contribute to the wider biology and related communities.
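
    The evaluation pipeline described above (a custom kernel plugged into an SVM and scored by AUC) can be sketched with a precomputed Gram matrix. The RBF matrix below is only a stand-in, since the Hadamard kernel itself is defined in the paper; the synthetic expression data and parameters are assumptions.

        # Sketch: SVM with a precomputed custom Gram matrix, evaluated by AUC.
        import numpy as np
        from sklearn.svm import SVC
        from sklearn.metrics import roc_auc_score
        from sklearn.metrics.pairwise import rbf_kernel

        rng = np.random.default_rng(0)
        X = rng.standard_normal((120, 50))                  # stand-in for gene-expression profiles
        y = (X[:, 0] + 0.5 * rng.standard_normal(120) > 0).astype(int)
        train, test = np.arange(90), np.arange(90, 120)

        K = rbf_kernel(X, gamma=0.02)                       # replace with any custom kernel matrix
        clf = SVC(kernel="precomputed").fit(K[np.ix_(train, train)], y[train])
        scores = clf.decision_function(K[np.ix_(test, train)])
        print("AUC:", roc_auc_score(y[test], scores))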

  5. Distributed smoothed tree kernel for protein-protein interaction extraction from the biomedical literature.

    PubMed

    Murugesan, Gurusamy; Abdulkadhar, Sabenabanu; Natarajan, Jeyakumar

    2017-01-01

    Automatic extraction of protein-protein interaction (PPI) pairs from biomedical literature is a widely examined task in biological information extraction. Currently, many kernel-based approaches, such as the linear kernel, tree kernel, graph kernel and combinations of multiple kernels, have achieved promising results on the PPI task. However, most of these kernel methods fail to capture the semantic relation information between two entities. In this paper, we present a special type of tree kernel for PPI extraction which exploits both syntactic (structural) and semantic vector information, known as the Distributed Smoothed Tree Kernel (DSTK). DSTK comprises distributed trees carrying syntactic information along with distributional semantic vectors representing the semantic information of the sentences or phrases. To generate a robust machine learning model, a feature-based kernel and DSTK were combined using an ensemble support vector machine (SVM). Five different corpora (AIMed, BioInfer, HPRD50, IEPA, and LLL) were used for evaluating the performance of our system. Experimental results show that our system achieves a better f-score on the five corpora than other state-of-the-art systems.

  6. Weighted Bergman Kernels and Quantization

    NASA Astrophysics Data System (ADS)

    Engliš, Miroslav

    Let Ω be a bounded pseudoconvex domain in C^N, φ, ψ two positive functions on Ω such that -log ψ, -log φ are plurisubharmonic, and z∈Ω a point at which -log φ is smooth and strictly plurisubharmonic. We show that as k → ∞, the Bergman kernels with respect to the weights φ^k ψ have an asymptotic expansion for x, y near z, where φ(x,y) is an almost-analytic extension of φ(x) = φ(x,x) and similarly for ψ. Further, . If in addition Ω is of finite type, φ, ψ behave reasonably at the boundary, and -log φ, -log ψ are strictly plurisubharmonic on Ω, we also obtain an analogous asymptotic expansion for the Berezin transform and give applications to the Berezin quantization. Finally, for Ω smoothly bounded and strictly pseudoconvex and φ a smooth strictly plurisubharmonic defining function for Ω, we also obtain results on the Berezin-Toeplitz quantization.

  7. Elliptic polylogarithms and iterated integrals on elliptic curves. II. An application to the sunrise integral

    NASA Astrophysics Data System (ADS)

    Broedel, Johannes; Duhr, Claude; Dulat, Falko; Tancredi, Lorenzo

    2018-06-01

    We introduce a class of iterated integrals that generalize multiple polylogarithms to elliptic curves. These elliptic multiple polylogarithms are closely related to similar functions defined in pure mathematics and string theory. We then focus on the equal-mass and non-equal-mass sunrise integrals, and we develop a formalism that enables us to compute these Feynman integrals in terms of our iterated integrals on elliptic curves. The key idea is to use integration-by-parts identities to identify a set of integral kernels, whose precise form is determined by the branch points of the integral in question. These kernels allow us to express all iterated integrals on an elliptic curve in terms of them. The flexibility of our approach leads us to expect that it will be applicable to a large variety of integrals in high-energy physics.

  8. Scramjet Nozzles

    DTIC Science & Technology

    2010-09-01

    and y, the axial and radial coordinates respectively. Point c lies somewhere within the mesh generated by the initial expansion (the kernel). All that...and the surface will be subjected to high heat loads restricting the choice of suitable materials. Material choice has direct implications for...Some legacy trajectory codes might not be able to deal with anything other than axial forces from engines, reflecting the class of problem they were

  9. Ischemia episode detection in ECG using kernel density estimation, support vector machine and feature selection

    PubMed Central

    2012-01-01

    Background Myocardial ischemia can develop into more serious disease. Detecting ischemic episodes in the electrocardiogram (ECG) early, accurately, and automatically can keep them from progressing to a catastrophic event. To this end, we propose a new method that employs wavelets and simple feature selection. Methods For training and testing, the European ST-T database is used, which comprises 367 ischemic ST episodes in 90 records. We first remove baseline wander and detect the time positions of QRS complexes with a method based on the discrete wavelet transform. Next, for each heart beat, we extract three features that can be used to differentiate ST episodes from normal beats: 1) the area between the QRS offset and T-peak points, 2) the normalized and signed sum from the QRS offset to the effective zero-voltage point, and 3) the slope from the QRS onset to the offset point. We average the feature values over five successive beats to reduce the effect of outliers. Finally, we apply classifiers to these features. Results We evaluated the algorithm with kernel density estimation (KDE) and support vector machine (SVM) classifiers. Sensitivity and specificity for KDE were 0.939 and 0.912, respectively; the KDE classifier detects 349 of the 367 ischemic ST episodes. Sensitivity and specificity for SVM were 0.941 and 0.923, respectively; the SVM classifier detects 355 ischemic ST episodes. Conclusions We proposed a new method for detecting ischemia in ECG. It combines signal processing techniques, removing baseline wander and detecting the time positions of QRS complexes with the discrete wavelet transform, with explicit feature extraction from the morphology of the ECG waveforms. The selected features were shown to be sufficient to discriminate ischemic ST episodes from normal ones. We also showed how the proposed KDE classifier can automatically select its kernel bandwidths, so the algorithm does not require any parameter values to be supplied in advance; in the case of the SVM classifier, a single parameter has to be selected. PMID:22703641
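
    As a rough illustration of the classification stage (not the authors' implementation), the sketch below fits one kernel density estimate per class with a cross-validated bandwidth and classifies by comparing class-conditional log-densities, alongside an RBF-kernel SVM. The three synthetic features are stand-ins for the area, signed-sum, and slope features described above.

    ```python
    import numpy as np
    from sklearn.neighbors import KernelDensity
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)

    # Synthetic stand-ins for the three beat-level features; real values would
    # come from the wavelet-based QRS/ST processing described above.
    normal = rng.normal(loc=[1.0, 0.0, 0.5], scale=0.3, size=(300, 3))
    ischemic = rng.normal(loc=[1.6, -0.4, 0.9], scale=0.3, size=(300, 3))
    X = np.vstack([normal, ischemic])
    y = np.array([0] * 300 + [1] * 300)

    def fit_kde(samples):
        # Bandwidth selected automatically by cross-validated grid search.
        grid = GridSearchCV(KernelDensity(), {"bandwidth": np.logspace(-1, 0, 10)}, cv=5)
        grid.fit(samples)
        return grid.best_estimator_

    kde_normal, kde_isch = fit_kde(X[y == 0]), fit_kde(X[y == 1])

    def kde_predict(samples):
        # Compare class-conditional log-densities (equal priors assumed).
        return (kde_isch.score_samples(samples) > kde_normal.score_samples(samples)).astype(int)

    print("KDE accuracy:", (kde_predict(X) == y).mean())
    print("SVM accuracy:", SVC(kernel="rbf", C=1.0).fit(X, y).score(X, y))
    ```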

  10. LZW-Kernel: fast kernel utilizing variable length code blocks from LZW compressors for protein sequence classification.

    PubMed

    Filatov, Gleb; Bauwens, Bruno; Kertész-Farkas, Attila

    2018-05-07

    Bioinformatics studies often rely on similarity measures between sequence pairs, which can become a bottleneck in large-scale sequence analysis. Here, we present a new convolutional kernel function for protein sequences called the LZW-Kernel. It is based on code words identified by the Lempel-Ziv-Welch (LZW) universal text compressor. The LZW-Kernel is an alignment-free method; it is symmetric, positive, always yields 1.0 for self-similarity, and can be used directly with Support Vector Machines (SVMs) in classification problems, unlike the normalized compression distance (NCD), which often violates the distance metric properties in practice and requires further techniques before it can be used with SVMs. The LZW-Kernel is a one-pass algorithm, which makes it particularly well suited to big data applications. Our experimental studies on remote protein homology detection and protein classification tasks reveal that the LZW-Kernel closely approaches the performance of the Local Alignment Kernel (LAK) and of the SVM-pairwise method combined with Smith-Waterman (SW) scoring at a fraction of the time. Moreover, the LZW-Kernel outperforms the SVM-pairwise method when the latter is combined with BLAST scores, which indicates that LZW code words may be a better basis for similarity measures than the local alignment approximations found with BLAST. In addition, the LZW-Kernel outperforms n-gram based mismatch kernels, the hidden Markov model based SAM and Fisher kernels, and the protein family based PSI-BLAST, among others. Further advantages include the LZW-Kernel's reliance on a simple idea, its ease of implementation, and its high speed: three times faster than BLAST and several orders of magnitude faster than SW or LAK in our tests. LZW-Kernel is implemented as standalone C code and is a free open-source program distributed under the GPLv3 license; it can be downloaded from https://github.com/kfattila/LZW-Kernel. akerteszfarkas@hse.ru. Supplementary data are available at Bioinformatics Online.
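
    A much-simplified sketch of the underlying idea, assuming only the description above: build the LZW phrase dictionary for each sequence in one pass and score a pair of sequences by the normalized overlap of their phrases. The overlap-based similarity and the toy sequences are illustrative choices, not the exact LZW-Kernel released by the authors.

    ```python
    def lzw_code_words(seq):
        """Return the set of LZW phrases emitted while compressing seq in one pass."""
        dictionary = set(seq)          # start from the single characters present
        phrase, phrases = "", set()
        for ch in seq:
            if phrase + ch in dictionary:
                phrase += ch
            else:
                dictionary.add(phrase + ch)
                phrases.add(phrase)
                phrase = ch
        if phrase:
            phrases.add(phrase)
        return phrases

    def lzw_similarity(a, b):
        """Toy normalized phrase overlap; equals 1.0 for self-similarity."""
        pa, pb = lzw_code_words(a), lzw_code_words(b)
        return len(pa & pb) / (len(pa) * len(pb)) ** 0.5

    print(lzw_similarity("MKVLAAGIVLK", "MKVLAAGIVLK"))   # 1.0
    print(lzw_similarity("MKVLAAGIVLK", "GSSGSSGSSGSS"))  # small
    ```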

  11. A framework for optimal kernel-based manifold embedding of medical image data.

    PubMed

    Zimmer, Veronika A; Lekadir, Karim; Hoogendoorn, Corné; Frangi, Alejandro F; Piella, Gemma

    2015-04-01

    Kernel-based dimensionality reduction is a widely used technique in medical image analysis. To fully unravel the underlying nonlinear manifold, the selection of an adequate kernel function and of its free parameters is critical. In practice, however, the kernel function is generally chosen as Gaussian or polynomial, and such standard kernels might not always be optimal for a given image dataset or application. In this paper, we present a study of the effect of the kernel function in nonlinear manifold embedding of medical image data. To this end, we first carry out a literature review of existing advanced kernels developed in the statistics, machine learning, and signal processing communities. In addition, we implement kernel-based formulations of well-known nonlinear dimensionality reduction techniques such as Isomap and Locally Linear Embedding, thus obtaining a unified framework for manifold embedding using kernels. Subsequently, we present a method to automatically choose a kernel function and its associated parameters from a pool of kernel candidates, with the aim of generating the best manifold embeddings. Furthermore, we show how the calculated selection measures can be extended to take into account the spatial relationships in images, or used to combine several kernels to further improve the embedding results. Experiments are then carried out on various synthetic and phantom datasets for numerical assessment of the methods. Finally, the workflow is applied to real data, including brain manifolds and multispectral images, to demonstrate the importance of kernel selection in the analysis of high-dimensional medical images. Copyright © 2014 Elsevier Ltd. All rights reserved.
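
    A minimal sketch of data-driven kernel selection in this spirit, using scikit-learn's KernelPCA: each candidate kernel produces a 2-D embedding, and a simple neighborhood-preservation score (trustworthiness) picks the winner. The candidate pool, the swiss-roll data, and the use of trustworthiness as the selection measure are assumptions made for illustration; the paper defines its own selection measures and spatial extensions.

    ```python
    import numpy as np
    from sklearn.datasets import make_swiss_roll
    from sklearn.decomposition import KernelPCA
    from sklearn.manifold import trustworthiness

    # Synthetic manifold data standing in for image-derived feature vectors.
    X, _ = make_swiss_roll(n_samples=800, random_state=0)

    candidates = {
        "rbf (gamma=0.01)": KernelPCA(n_components=2, kernel="rbf", gamma=0.01),
        "rbf (gamma=0.1)": KernelPCA(n_components=2, kernel="rbf", gamma=0.1),
        "poly (degree=3)": KernelPCA(n_components=2, kernel="poly", degree=3),
        "cosine": KernelPCA(n_components=2, kernel="cosine"),
    }

    # Score each candidate embedding by how well local neighborhoods are preserved.
    scores = {name: trustworthiness(X, kpca.fit_transform(X), n_neighbors=10)
              for name, kpca in candidates.items()}
    best = max(scores, key=scores.get)
    print(scores)
    print("selected kernel:", best)
    ```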

  12. Evaluating the Gradient of the Thin Wire Kernel

    NASA Technical Reports Server (NTRS)

    Wilton, Donald R.; Champagne, Nathan J.

    2008-01-01

    Recently, a formulation for evaluating the thin wire kernel was developed that employed a change of variable to smooth the kernel integrand, canceling the singularity in the integrand. Hence, the typical expansion of the wire kernel in a series for use in the potential integrals is avoided. The new expression for the kernel is exact and may be used directly to determine the gradient of the wire kernel, which consists of components that are parallel and radial to the wire axis.
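
    The wire-kernel expression itself is not given in the record, so the toy integral below only demonstrates the general trick the abstract mentions: a change of variable (x = u^2) that cancels an integrable 1/sqrt(x) singularity, so that ordinary Gauss-Legendre quadrature converges rapidly where it converged slowly before.

    ```python
    import numpy as np

    def gauss_legendre(f, n):
        # n-point Gauss-Legendre quadrature on [0, 1].
        x, w = np.polynomial.legendre.leggauss(n)
        x = 0.5 * (x + 1.0)              # map nodes from [-1, 1] to [0, 1]
        return 0.5 * np.sum(w * f(x))    # 0.5 is the Jacobian of the mapping

    # Toy integral I = \int_0^1 cos(x)/sqrt(x) dx, NOT the thin-wire kernel.
    singular = lambda x: np.cos(x) / np.sqrt(x)
    smoothed = lambda u: 2.0 * np.cos(u**2)   # after x = u**2 the integrand is smooth

    for n in (4, 8, 16):
        print(n, gauss_legendre(singular, n), gauss_legendre(smoothed, n))
    ```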

  13. Mass-induced instability of SAdS black hole in Einstein-Ricci cubic gravity

    NASA Astrophysics Data System (ADS)

    Myung, Yun Soo

    2018-05-01

    We perform a stability analysis of the Schwarzschild-AdS (SAdS) black hole in Einstein-Ricci cubic gravity. It shows that the Ricci tensor perturbations exhibit unstable modes for small black holes. We call this the mass-induced instability of the SAdS black hole because the instability of small black holes arises from the massiveness in the linearized Einstein-Ricci cubic gravity, and not from a feature of the higher-order derivative theory that gives ghost states. We also point out that the correlated stability conjecture holds for the SAdS black hole by computing the Wald entropy of the SAdS black hole in Einstein-Ricci cubic gravity.

  14. Kernel Machine SNP-set Testing under Multiple Candidate Kernels

    PubMed Central

    Wu, Michael C.; Maity, Arnab; Lee, Seunggeun; Simmons, Elizabeth M.; Harmon, Quaker E.; Lin, Xinyi; Engel, Stephanie M.; Molldrem, Jeffrey J.; Armistead, Paul M.

    2013-01-01

    Joint testing for the cumulative effect of multiple single nucleotide polymorphisms grouped on the basis of prior biological knowledge has become a popular and powerful strategy for the analysis of large scale genetic association studies. The kernel machine (KM) testing framework is a useful approach that has been proposed for testing associations between multiple genetic variants and many different types of complex traits by comparing pairwise similarity in phenotype between subjects to pairwise similarity in genotype, with similarity in genotype defined via a kernel function. An advantage of the KM framework is its flexibility: choosing different kernel functions allows for different assumptions concerning the underlying model and can allow for improved power. In practice, it is difficult to know which kernel to use a priori since this depends on the unknown underlying trait architecture and selecting the kernel which gives the lowest p-value can lead to inflated type I error. Therefore, we propose practical strategies for KM testing when multiple candidate kernels are present based on constructing composite kernels and based on efficient perturbation procedures. We demonstrate through simulations and real data applications that the procedures protect the type I error rate and can lead to substantially improved power over poor choices of kernels and only modest differences in power versus using the best candidate kernel. PMID:23471868
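
    A small sketch of the composite-kernel idea, assuming a toy 0/1/2-coded genotype matrix: build several candidate similarity matrices (here a linear kernel and an identity-by-state kernel) and form an unweighted average as one possible composite. The kernels and the averaging rule are illustrative; the paper's contribution is the testing procedures that keep the type I error controlled when kernels are chosen adaptively.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Toy genotype matrix: 100 subjects x 20 SNPs coded as 0/1/2 minor-allele counts.
    G = rng.integers(0, 3, size=(100, 20)).astype(float)

    def linear_kernel(G):
        return G @ G.T

    def ibs_kernel(G):
        # Identity-by-state similarity: average allele sharing across SNPs.
        n, p = G.shape
        K = np.zeros((n, n))
        for j in range(p):
            K += 2.0 - np.abs(G[:, [j]] - G[:, j])
        return K / (2.0 * p)

    candidates = [linear_kernel(G), ibs_kernel(G)]
    # One simple composite kernel: the unweighted average of the candidates.
    K_composite = sum(candidates) / len(candidates)
    print(K_composite.shape, K_composite[:2, :2])
    ```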

  15. Combined multi-kernel head computed tomography images optimized for depicting both brain parenchyma and bone.

    PubMed

    Takagi, Satoshi; Nagase, Hiroyuki; Hayashi, Tatsuya; Kita, Tamotsu; Hayashi, Katsumi; Sanada, Shigeru; Koike, Masayuki

    2014-01-01

    The hybrid convolution kernel technique for computed tomography (CT) is known to enable the depiction of an image set under different window settings. Our purpose was to decrease the number of artifacts produced by the hybrid convolution kernel technique for head CT and to determine whether our improved combined multi-kernel head CT images allow diagnosis as a substitute for both brain (low-pass kernel-reconstructed) and bone (high-pass kernel-reconstructed) images. Forty-four patients with nondisplaced skull fractures were included. Our improved multi-kernel images were generated so that pixels exceeding 100 Hounsfield units in both the brain and the bone images take the CT values of the bone images, while all other pixels take the CT values of the brain images. Three radiologists compared the improved multi-kernel images with the bone images. The improved multi-kernel images and the brain images appeared identical under the brain window settings. All three radiologists agreed that the improved multi-kernel images under the bone window settings were sufficient for diagnosing skull fractures in all patients. This improved multi-kernel technique uses a simple algorithm and is practical for clinical use; simplified head CT examinations and a reduction in the number of stored images can therefore be expected.
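
    The compositing rule described in the abstract translates directly into array operations; the sketch below applies it to synthetic stand-ins for the brain- and bone-kernel reconstructions (real inputs would be the two reconstructed HU images of the same slice).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic stand-ins for the two reconstructions of one slice, in Hounsfield units.
    brain_img = rng.normal(loc=40.0, scale=20.0, size=(256, 256))   # low-pass (brain) kernel
    bone_img = brain_img + rng.normal(loc=0.0, scale=5.0, size=(256, 256))  # high-pass (bone) kernel
    brain_img[100:140, 100:140] += 900.0   # pretend this region is skull
    bone_img[100:140, 100:140] += 900.0

    # Rule from the abstract: where BOTH reconstructions exceed 100 HU, keep the
    # bone-kernel value; elsewhere keep the brain-kernel value.
    mask = (brain_img > 100.0) & (bone_img > 100.0)
    combined = np.where(mask, bone_img, brain_img)
    print("bone-kernel pixels used:", int(mask.sum()))
    ```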

  16. Critical phenomena and chemical potential of a charged AdS black hole

    NASA Astrophysics Data System (ADS)

    Wei, Shao-Wen; Liang, Bin; Liu, Yu-Xiao

    2017-12-01

    Inspired by the interpretation of the cosmological constant from the boundary gauge theory, we here treat it as the number of colors N and treat its conjugate quantity as the associated chemical potential μ on the black hole side. The thermodynamics and the chemical potential of a five-dimensional charged AdS black hole are then studied. It is found that there exists a small-large black hole phase transition of the van der Waals type. The critical phenomena are investigated in the N²-μ chart. The result implies that the phase transition can occur for a large number of colors N, while it is forbidden for a small number; to some extent this suggests that the interaction of the system increases with the number of colors. In particular, in the reduced parameter space, all the thermodynamic quantities can be rescaled with the black hole charge such that these reduced quantities are charge-independent. We then obtain the coexistence curve and the phase diagram, and the latent heat is also calculated numerically. Moreover, the heat capacity and the thermodynamic scalar are studied. The result indicates that the information about the first-order black hole phase transition is encoded in the heat capacity and the scalar, although the phase transition point cannot be calculated directly from them. Nevertheless, the critical point linked to a second-order phase transition can be determined from either the heat capacity or the scalar. In addition, we calculate the critical exponents of the heat capacity and the scalar for the saturated small and large black holes near the critical point.

  17. Nineteenth Century Black Methodist Missionary Bishops in Liberia.

    ERIC Educational Resources Information Center

    Jacobs, Sylvia M.

    1981-01-01

    Traces 19th-century efforts of the American Methodist Episcopal Church to establish missions and employ Black missionary bishops in Liberia. Points out that the abolition of slavery in the United States contributed to a shift in the Methodist Church's position on recruiting Blacks in the mission movement in Africa. (Author/MJL)

  18. Inside the Black Box of Classroom Practice: Change without Reform in American Education

    ERIC Educational Resources Information Center

    Cuban, Larry

    2013-01-01

    A book that explores the problematic connection between education policy and practice while pointing in the direction of a more fruitful relationship, "Inside the Black Box of Classroom Practice" is a provocative culminating statement from one of America's most insightful education scholars and leaders. "Inside the Black Box of…

  19. 16 CFR 1211.15 - Field-installed labels.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., and (3) A message panel, with adjacent panels delineated from each other by a horizontal black line. The entire label shall be surrounded by a black border and shall measure at least 5 inches (127 mm... consisting of an orange exclamation mark on a black solid equilateral triangle background with the point of...

  20. 16 CFR § 1211.15 - Field-installed labels.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., and (3) A message panel, with adjacent panels delineated from each other by a horizontal black line. The entire label shall be surrounded by a black border and shall measure at least 5 inches (127 mm... consisting of an orange exclamation mark on a black solid equilateral triangle background with the point of...
