Sample records for kernel emissions smoke

  1. (EDMUNDS, WA) WILDLAND FIRE EMISSIONS MODELING: INTEGRATING BLUESKY AND SMOKE

    EPA Science Inventory

    This presentation is a status update of the BlueSky emissions modeling system. BlueSky-EM has been coupled with the Sparse Matrix Operator Kernel Emissions (SMOKE) system, and is now available as a tool for estimating emissions from wildland fires.

  2. IMPLEMENTATION OF THE SMOKE EMISSION DATA PROCESSOR AND SMOKE TOOL INPUT DATA PROCESSOR IN MODELS-3

    EPA Science Inventory

    The U.S. Environmental Protection Agency has implemented Version 1.3 of the SMOKE (Sparse Matrix Operator Kernel Emissions) processor for preparation of area, mobile, point, and biogenic source emission data within Version 4.1 of the Models-3 air quality modeling framework. The SMOK...

  3. Transcriptome analysis of germinating maize kernels exposed to smoke-water and the active compound KAR1.

    PubMed

    Soós, Vilmos; Sebestyén, Endre; Juhász, Angéla; Light, Marnie E; Kohout, Ladislav; Szalai, Gabriella; Tandori, Júlia; Van Staden, Johannes; Balázs, Ervin

    2010-11-02

    Smoke released from burning vegetation functions as an important environmental signal promoting the germination of many plant species following a fire. It not only promotes the germination of species from fire-prone habitats, but several species from non-fire-prone areas also respond, including some crops. The germination stimulatory activity can largely be attributed to the presence of a highly active butenolide compound, 3-methyl-2H-furo[2,3-c]pyran-2-one (referred to as karrikin 1 or KAR1), that has previously been isolated from plant-derived smoke. Several hypotheses have arisen regarding the molecular background of smoke and KAR1 action. In this paper we demonstrate that although smoke-water and KAR1 treatment of maize kernels result in a similar physiological response, the gene expression and the protein ubiquitination patterns are quite different. Treatment with smoke-water enhanced the ubiquitination of proteins and activated protein-degradation-related genes. This effect was completely absent from KAR1-treated kernels, in which a specific aquaporin gene was distinctly upregulated. Our findings indicate that the array of bioactive compounds present in smoke-water forms an environmental signal whose components may act together in germination stimulation. It is highly possible that the smoke/KAR1 'signal' is perceived by a receptor that is shared with the signal transduction system implicated in perceiving environmental cues (especially stresses and light), or some kind of specialized receptor exists in fire-prone plant species which diverged from a more general one present in a common ancestor and is also found in non-fire-prone plants, allowing for a somewhat weaker but still significant response. Besides their obvious use in agricultural practices, smoke and KAR1 can be used in studies to gain further insight into the transcriptional changes during germination.

  4. Transcriptome analysis of germinating maize kernels exposed to smoke-water and the active compound KAR1

    PubMed Central

    2010-01-01

    Background Smoke released from burning vegetation functions as an important environmental signal promoting the germination of many plant species following a fire. It not only promotes the germination of species from fire-prone habitats, but several species from non-fire-prone areas also respond, including some crops. The germination stimulatory activity can largely be attributed to the presence of a highly active butenolide compound, 3-methyl-2H-furo[2,3-c]pyran-2-one (referred to as karrikin 1 or KAR1), that has previously been isolated from plant-derived smoke. Several hypotheses have arisen regarding the molecular background of smoke and KAR1 action. Results In this paper we demonstrate that although smoke-water and KAR1 treatment of maize kernels result in a similar physiological response, the gene expression and the protein ubiquitination patterns are quite different. Treatment with smoke-water enhanced the ubiquitination of proteins and activated protein-degradation-related genes. This effect was completely absent from KAR1-treated kernels, in which a specific aquaporin gene was distinctly upregulated. Conclusions Our findings indicate that the array of bioactive compounds present in smoke-water forms an environmental signal whose components may act together in germination stimulation. It is highly possible that the smoke/KAR1 'signal' is perceived by a receptor that is shared with the signal transduction system implicated in perceiving environmental cues (especially stresses and light), or some kind of specialized receptor exists in fire-prone plant species which diverged from a more general one present in a common ancestor and is also found in non-fire-prone plants, allowing for a somewhat weaker but still significant response. Besides their obvious use in agricultural practices, smoke and KAR1 can be used in studies to gain further insight into the transcriptional changes during germination. PMID:21044315

  5. 40 CFR 89.113 - Smoke emission standard.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Smoke emission standard. 89.113 Section 89.113 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... and Certification Provisions § 89.113 Smoke emission standard. (a) Exhaust opacity from compression...

  6. 40 CFR 89.113 - Smoke emission standard.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Smoke emission standard. 89.113 Section 89.113 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... and Certification Provisions § 89.113 Smoke emission standard. (a) Exhaust opacity from compression...

  7. 40 CFR 89.113 - Smoke emission standard.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Smoke emission standard. 89.113 Section 89.113 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... and Certification Provisions § 89.113 Smoke emission standard. (a) Exhaust opacity from compression...

  8. 40 CFR 89.113 - Smoke emission standard.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Smoke emission standard. 89.113 Section 89.113 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... and Certification Provisions § 89.113 Smoke emission standard. (a) Exhaust opacity from compression...

  9. 40 CFR 89.113 - Smoke emission standard.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Smoke emission standard. 89.113 Section 89.113 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED... Certification Provisions § 89.113 Smoke emission standard. (a) Exhaust opacity from compression-ignition nonroad...

  10. Cigar burning under different smoking intensities and effects on emissions.

    PubMed

    Dethloff, Ole; Mueller, Christian; Cahours, Xavier; Colard, Stéphane

    2017-12-01

    The effect of smoking intensity on cigar smoke emissions was assessed under a range of puff frequencies and puff volumes. In order to potentially reduce emissions variability and to identify patterns as accurately as possible, cigar weights and diameters were measured, and outliers were excluded prior to smoking. Portions corresponding to 25%, 50%, 75% and 100% of the cigar, measured down to the butt length, were smoked under several smoking conditions, to assess nicotine, CO and water yields. The remaining cigar butts were analysed for total alkaloids, nicotine, and moisture. Results showed that accumulation effects during the burning process have a significant impact on smoke emission levels. Condensation and evaporation occur and lead to smoke emissions that depend on smoking intensity. Differences were observed between CO, a gas-phase compound, and nicotine, a particulate-phase compound. For a given intensity, while CO emission increased linearly as the cigar burned, nicotine and water emissions exhibited an exponential increase. Our investigations showed that complex phenomena occur during the course of cigar smoking, which makes emission data difficult to interpret, potentially misleading to the consumer, and inappropriate for exposure assessment. The results indicate that tobacco content and physical parameters may well be a more robust basis for product characterisation and comparison than smoke emissions. Copyright © 2017 Elsevier Inc. All rights reserved.
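
    The abstract contrasts a roughly linear CO build-up with an exponential rise in nicotine and water yields as the cigar is consumed. A minimal sketch of how such trends could be screened, assuming hypothetical yield values at the 25/50/75/100% smoked fractions (the numbers are illustrative placeholders, not the study's data):

      # Sketch: fit linear vs. exponential accumulation of smoke constituents
      # against the fraction of the cigar smoked. Yield values are hypothetical,
      # chosen only to illustrate the kind of fitting described in the abstract.
      import numpy as np
      from scipy.optimize import curve_fit

      fraction_smoked = np.array([0.25, 0.50, 0.75, 1.00])
      co_yield_mg = np.array([18.0, 37.0, 55.0, 74.0])        # ~linear growth (hypothetical)
      nicotine_yield_mg = np.array([0.4, 1.0, 2.6, 6.5])      # ~exponential growth (hypothetical)

      def linear(x, a, b):
          return a * x + b

      def exponential(x, a, k):
          return a * np.exp(k * x)

      co_params, _ = curve_fit(linear, fraction_smoked, co_yield_mg)
      nic_params, _ = curve_fit(exponential, fraction_smoked, nicotine_yield_mg, p0=(0.2, 3.0))
      print("CO slope (mg per fraction of cigar smoked):", round(co_params[0], 1))
      print("nicotine exponential rate constant:", round(nic_params[1], 2))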

  11. 14 CFR 34.89 - Compliance with smoke emission standards.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.89 Compliance with smoke emission...

  12. 14 CFR 34.89 - Compliance with smoke emission standards.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.89 Compliance with smoke emission... in Appendix 6 to ICAO Annex 16, Environmental Protection, Volume II, Aircraft Engine Emissions...

  13. 14 CFR 34.89 - Compliance with smoke emission standards.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.89 Compliance with smoke emission... in Appendix 6 to ICAO Annex 16, Environmental Protection, Volume II, Aircraft Engine Emissions...

  14. Wild Fire Emissions for the NOAA Operational HYSPLIT Smoke Model

    NASA Astrophysics Data System (ADS)

    Huang, H. C.; ONeill, S. M.; Ruminski, M.; Shafran, P.; McQueen, J.; DiMego, G.; Kondragunta, S.; Gorline, J.; Huang, J. P.; Stunder, B.; Stein, A. F.; Stajner, I.; Upadhayay, S.; Larkin, N. K.

    2015-12-01

    Particulate Matter (PM) generated from forest fires often leads to degraded visibility and unhealthy air quality in nearby and downstream areas. To provide near-real time PM information to the state and local agencies, the NOAA/National Weather Service (NWS) operational HYSPLIT (Hybrid Single Particle Lagrangian Integrated Trajectory Model) smoke modeling system (NWS/HYSPLIT smoke) provides the forecast of smoke concentration resulting from fire emissions driven by the NWS North American Model 12 km weather predictions. The NWS/HYSPLIT smoke incorporates the U.S. Forest Service BlueSky Smoke Modeling Framework (BlueSky) to provide fire smoke emissions along with the input fire locations from the NOAA National Environmental Satellite, Data, and Information Service (NESDIS)'s Hazard Mapping System fire and smoke detection system. Experienced analysts inspect satellite imagery from multiple sensors onboard geostationary and polar-orbiting satellites to identify the location, size and duration of smoke emissions for the model. NWS/HYSPLIT smoke is being updated to use a newer version of USFS BlueSky. The updated BlueSky incorporates the Fuel Characteristic Classification System version 2 (FCCS2) over the continental U.S. and Alaska. FCCS2 includes a more detailed description of fuel loadings with additional plant type categories. The updated BlueSky also utilizes an improved fuel consumption model and fire emission production system. For the period of August 2014 to June 2015, NWS/HYSPLIT smoke simulations show that fire smoke emissions with the updated BlueSky are stronger than those from the current operational BlueSky in the Northwest U.S. For the same comparisons, weaker fire smoke emissions from the updated BlueSky were observed over the middle and eastern part of the U.S. A statistical evaluation of NWS/HYSPLIT smoke predicted total column concentration compared to NOAA NESDIS GOES EAST Aerosol Smoke Product retrievals is underway. Preliminary results show that using the newer version

  15. Smoke and Emissions Model Intercomparison Project (SEMIP)

    NASA Astrophysics Data System (ADS)

    Larkin, N. K.; Raffuse, S.; Strand, T.; Solomon, R.; Sullivan, D.; Wheeler, N.

    2008-12-01

    Fire emissions and smoke impacts from wildland fire are a growing concern due to increasing fire season severity, dwindling tolerance of smoke by the public, tightening air quality regulations, and their role in climate change issues. Unfortunately, while a number of models and modeling system solutions are available to address these issues, the lack of quantitative information on the limitations of and differences between smoke and emissions models impedes the use of these tools for real-world applications (JFSP, 2007). We describe a new project to directly address this issue, the open-access Smoke Emissions Model Intercomparison Project (SEMIP), and invite the community to participate. Preliminary work utilizing the modular BlueSky framework to directly compare fire location and size information, fuel loading amounts, fuel consumption rates, and fire emissions from a number of current models has found model-to-model variability as high as two orders of magnitude for an individual fire. Fire emissions inventories also show significant variability on both regional and national scales that are dependent on the fire location information used (ground report vs. satellite), the fuel loading maps assumed, and the fire consumption models employed. SEMIP expands on this work and creates an open-access database of model results and observations with the goal of furthering model development and model prediction usability for real-world decision support.
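
    The abstract reports model-to-model variability as high as two orders of magnitude for an individual fire. A minimal sketch, using hypothetical per-fire emission estimates from generic model names, of the kind of spread metric an intercomparison like SEMIP might tabulate:

      # Sketch: quantify model-to-model spread in per-fire PM2.5 emission estimates.
      # Model names and values are hypothetical placeholders, not SEMIP results.
      import numpy as np

      estimates_tonnes = {          # one fire, four hypothetical emission models
          "model_A": 12.0,
          "model_B": 85.0,
          "model_C": 410.0,
          "model_D": 950.0,
      }
      values = np.array(list(estimates_tonnes.values()))
      spread_orders_of_magnitude = np.log10(values.max() / values.min())
      print(f"max/min ratio: {values.max() / values.min():.0f}x "
            f"({spread_orders_of_magnitude:.1f} orders of magnitude)")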

  16. 40 CFR 87.89 - Compliance with smoke emission standards.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Compliance with smoke emission standards. 87.89 Section 87.89 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) Definitions. Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines...

  17. 40 CFR 87.89 - Compliance with smoke emission standards.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 87.89 Compliance with smoke emission standards... engine of the model being tested. An acceptable alternative to testing every engine is described in...

  18. 40 CFR 87.89 - Compliance with smoke emission standards.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 87.89 Compliance with smoke emission standards... engine of the model being tested. An acceptable alternative to testing every engine is described in...

  19. Enhancements in Deriving Smoke Emission Coefficients from Fire Radiative Power Measurements

    NASA Technical Reports Server (NTRS)

    Ellison, Luke; Ichoku, Charles

    2011-01-01

    Smoke emissions have long been quantified after-the-fact by simple multiplication of burned area, biomass density, fraction of above-ground biomass, and burn efficiency. A new algorithm has been suggested, as described in Ichoku & Kaufman (2005), for use in calculating smoke emissions directly from fire radiative power (FRP) measurements such that the latency and uncertainty associated with the previously listed variables are avoided. Application of this new, simpler and more direct algorithm is automatic, based only on a fire's FRP measurement and a predetermined coefficient of smoke emission for a given location. Attaining accurate coefficients of smoke emission is therefore critical to the success of this algorithm. In the aforementioned paper, an initial effort was made to derive coefficients of smoke emission for different large regions of interest using calculations of smoke emission rates from MODIS FRP and aerosol optical depth (AOD) measurements. Further work had resulted in a first draft of a 1° × 1° resolution map of these coefficients. This poster will present the work done to refine this algorithm toward the first production of global smoke emission coefficients. Main updates in the algorithm include: 1) inclusion of wind vectors to help refine several parameters, 2) defining new methods for calculating the fire-emitted AOD fractions, and 3) calculating smoke emission rates on a per-pixel basis and aggregating to grid cells instead of doing so later on in the process. In addition to a presentation of the methodology used to derive this product, maps displaying preliminary results as well as an outline of the future application of such a product into specific research opportunities will be shown.
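
    The two estimation approaches contrasted in the abstract reduce to simple relations. The sketch below restates them as small functions; the variable names are generic and the coefficient value in the example is an illustrative placeholder, not a published number:

      # Sketch: bottom-up vs. FRP-based (top-down) smoke emission estimation.
      def emissions_bottom_up(burned_area_km2, biomass_density_kg_km2,
                              aboveground_fraction, burn_efficiency,
                              emission_factor_g_kg):
          """Classic product of burned area, fuel load, available fraction,
          combustion completeness, and a per-species emission factor."""
          biomass_burned_kg = (burned_area_km2 * biomass_density_kg_km2 *
                               aboveground_fraction * burn_efficiency)
          return biomass_burned_kg * emission_factor_g_kg / 1000.0   # kg of species

      def smoke_emission_rate_top_down(frp_mw, ce_kg_per_mj):
          """Ichoku & Kaufman (2005)-style rate: a smoke emission coefficient
          times fire radiative power (MW = MJ/s), giving kg of smoke per second."""
          return ce_kg_per_mj * frp_mw

      # Illustrative numbers only: a 500 MW fire pixel and Ce = 0.02 kg/MJ
      print(smoke_emission_rate_top_down(500.0, 0.02), "kg/s of smoke (illustrative)")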

  20. Smoke emissions from prescribed burning of southern California chaparral.

    Treesearch

    Colin C. Hardy; Jon C. Regelbrugge; David R. Teesdale

    1996-01-01

    This report characterizes smoke emissions from small-scale prescribed burns in southern California chaparral. In situ measurements of smoke emissions were made from 12 fires. Three replicate tests were performed in each of four distinct fuel and fire treatments common to vegetation management operations: a young and vigorous chamise-dominated stand; an old and decadent...

  21. Comparison of True and Smoothed Puff Profile Replication on Smoking Behavior and Mainstream Smoke Emissions

    PubMed Central

    2015-01-01

    To estimate exposures to smokers from cigarettes, smoking topography is typically measured and programmed into a smoking machine to mimic human smoking, and the resulting smoke emissions are tested for relative levels of harmful constituents. However, using only the summary puff data—with a fixed puff frequency, volume, and duration—may underestimate or overestimate actual exposure to smoke toxins. In this laboratory study, we used a topography-driven smoking machine that faithfully reproduces a human smoking session and individual human topography data (n = 24) collected during previous clinical research to investigate if replicating the true puff profile (TP) versus the mathematically derived smoothed puff profile (SM) resulted in differences in particle size distributions and selected toxic/carcinogenic organic compounds from mainstream smoke emissions. Particle size distributions were measured using an electrical low pressure impactor, the masses of the size-fractionated fine and ultrafine particles were determined gravimetrically, and the collected particulate was analyzed for selected particle-bound, semivolatile compounds. Volatile compounds were measured in real time using a proton transfer reaction-mass spectrometer. By and large, TP levels for the fine and ultrafine particulate masses as well as particle-bound organic compounds were slightly lower than the SM concentrations. The volatile compounds, by contrast, showed no clear trend. Differences in emissions due to the use of the TP and SM profiles are generally not large enough to warrant abandoning the procedures used to generate the simpler smoothed profile in favor of the true profile. PMID:25536227
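
    The comparison hinges on collapsing a recorded puff-by-puff topography into a "smoothed" profile with one fixed puff volume, duration, and interval. A minimal sketch of that reduction, using hypothetical topography records (real data would come from the recording device used during a human smoking session):

      # Sketch: derive a smoothed (summary) puff profile from true topography data.
      # The per-puff records below are hypothetical placeholders.
      import statistics

      true_profile = [  # (puff volume in mL, puff duration in s, inter-puff interval in s)
          (55.0, 1.8, 22.0),
          (48.0, 1.5, 30.0),
          (62.0, 2.1, 18.0),
          (40.0, 1.2, 35.0),
      ]

      smoothed = {
          "puff_volume_ml": statistics.mean(v for v, _, _ in true_profile),
          "puff_duration_s": statistics.mean(d for _, d, _ in true_profile),
          "puff_interval_s": statistics.mean(i for _, _, i in true_profile),
          "puff_count": len(true_profile),
      }
      print(smoothed)  # the fixed parameters a smoking machine would repeat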

  22. Temporal Effects on Internal Fluorescence Emissions Associated with Aflatoxin Contamination from Corn Kernel Cross-Sections Inoculated with Toxigenic and Atoxigenic Aspergillus flavus.

    PubMed

    Hruska, Zuzana; Yao, Haibo; Kincaid, Russell; Brown, Robert L; Bhatnagar, Deepak; Cleveland, Thomas E

    2017-01-01

    Non-invasive, easy to use and cost-effective technology offers a valuable alternative for rapid detection of carcinogenic fungal metabolites, namely aflatoxins, in commodities. One relatively recent development in this area is the use of spectral technology. Fluorescence hyperspectral imaging, in particular, offers a potential rapid and non-invasive method for detecting the presence of aflatoxins in maize infected with the toxigenic fungus Aspergillus flavus. Earlier studies have shown that whole maize kernels contaminated with aflatoxins exhibit different spectral signatures from uncontaminated kernels based on the external fluorescence emission of the whole kernels. Here, the effect of time on the internal fluorescence spectral emissions from cross-sections of kernels infected with toxigenic and atoxigenic A. flavus was examined in order to elucidate the interaction between the fluorescence signals emitted by some aflatoxin-contaminated maize kernels and the fungal invasion resulting in the production of aflatoxins. First, the difference in internal fluorescence emissions between cross-sections of kernels incubated in toxigenic and atoxigenic inoculum was assessed. Kernels were inoculated with each strain for 5, 7, and 9 days before cross-sectioning and imaging. There were 270 kernels (540 halves) imaged, including controls. Second, in a different set of kernels (15 kernels/group; 135 total), the germ of each kernel was separated from the endosperm to determine the major areas of aflatoxin accumulation and progression over nine growth days. Kernels were inoculated with toxigenic and atoxigenic fungal strains for 5, 7, and 9 days before the endosperm and germ were separated, followed by fluorescence hyperspectral imaging and chemical aflatoxin determination. A marked difference in fluorescence intensity was shown between the toxigenic and atoxigenic strains on day nine post-inoculation, which may be a useful indicator of the location of aflatoxin contamination

  23. Temporal Effects on Internal Fluorescence Emissions Associated with Aflatoxin Contamination from Corn Kernel Cross-Sections Inoculated with Toxigenic and Atoxigenic Aspergillus flavus

    PubMed Central

    Hruska, Zuzana; Yao, Haibo; Kincaid, Russell; Brown, Robert L.; Bhatnagar, Deepak; Cleveland, Thomas E.

    2017-01-01

    Non-invasive, easy to use and cost-effective technology offers a valuable alternative for rapid detection of carcinogenic fungal metabolites, namely aflatoxins, in commodities. One relatively recent development in this area is the use of spectral technology. Fluorescence hyperspectral imaging, in particular, offers a potential rapid and non-invasive method for detecting the presence of aflatoxins in maize infected with the toxigenic fungus Aspergillus flavus. Earlier studies have shown that whole maize kernels contaminated with aflatoxins exhibit different spectral signatures from uncontaminated kernels based on the external fluorescence emission of the whole kernels. Here, the effect of time on the internal fluorescence spectral emissions from cross-sections of kernels infected with toxigenic and atoxigenic A. flavus was examined in order to elucidate the interaction between the fluorescence signals emitted by some aflatoxin-contaminated maize kernels and the fungal invasion resulting in the production of aflatoxins. First, the difference in internal fluorescence emissions between cross-sections of kernels incubated in toxigenic and atoxigenic inoculum was assessed. Kernels were inoculated with each strain for 5, 7, and 9 days before cross-sectioning and imaging. There were 270 kernels (540 halves) imaged, including controls. Second, in a different set of kernels (15 kernels/group; 135 total), the germ of each kernel was separated from the endosperm to determine the major areas of aflatoxin accumulation and progression over nine growth days. Kernels were inoculated with toxigenic and atoxigenic fungal strains for 5, 7, and 9 days before the endosperm and germ were separated, followed by fluorescence hyperspectral imaging and chemical aflatoxin determination. A marked difference in fluorescence intensity was shown between the toxigenic and atoxigenic strains on day nine post-inoculation, which may be a useful indicator of the location of aflatoxin contamination

  24. Composition and emissions of VOCs in main- and side-stream smoke of research cigarettes

    NASA Astrophysics Data System (ADS)

    Charles, Simone M.; Batterman, S. A.; Jia, Chunrong

    It is well known that mainstream (MS) and sidestream (SS) cigarette smoke contains a vast number of chemical substances. Previous studies have emphasized SS smoke rather than MS smoke to which smokers are exposed, and most have used chamber tests that have several disadvantages such as wall losses. Emissions from standard research cigarettes have been measured, but relatively few constituents have been reported, and only the 1R4F (low nicotine) cigarette type has been tested. This study provides a comprehensive characterization of total, MS and SS smoke emissions for the 1R5F (ultra low nicotine), 2R4F (low nicotine), and 1R3F (standard nicotine) research cigarettes, including emission factors for a number of toxic compounds (e.g., benzene) and tobacco smoke tracers (e.g., 2,5-dimethylfuran). Emissions of volatile organic compounds (VOCs) and particulate matter (PM) are quantified using a dynamic dilution emission measurement system that is shown to produce accurate, rapid and reproducible results for over 30 VOCs and PM. SS and MS emissions were accurately apportioned based on a mass balance of total emissions. As expected, SS emissions greatly exceeded MS emissions. The ultra low nicotine cigarette had lower emissions of most VOCs compared to low and standard nicotine cigarettes, which had similar emissions. Across the three types of cigarettes, emissions of benzene (296-535 μg cig⁻¹), toluene (541-1003 μg cig⁻¹), styrene (90-162 μg cig⁻¹), 2,5-dimethylfuran (71-244 μg cig⁻¹), naphthalene (15-18 μg cig⁻¹) and other VOCs were generally comparable to or somewhat higher than literature estimates using chamber tests.

  25. Correlation of black smoke and nitrogen oxides emissions through field testing of in-use diesel vehicles.

    PubMed

    Lin, Cherng-Yuan; Chen, Lih-Wei; Wang, Li-Ting

    2006-05-01

    Diesel vehicles are one of the major forms of transportation, especially in metropolitan regions. However, air pollution released from diesel vehicles causes serious damage to both human health and the environment, and as a result is of great public concern. Nitrogen oxides and black smoke are two significant emissions from diesel engines. Understanding the correlation between these two emissions is an important step toward developing the technology for an appropriate strategy to control or eliminate them. This study field-tested 185 diesel vehicles at an engine dynamometer station for their black smoke reflectivity and nitrogen oxides concentration to explore the correlation between these two pollutants. The test results revealed that most of the tested diesel vehicles emitted black smoke with low reflectivity and produced low nitrogen oxides concentration. The age of the tested vehicles has a significant influence on the NOx emission. The older the tested vehicles, the higher the NOx concentrations emitted; however, there was no obvious correlation between the age of the tested diesel vehicles and the black smoke reflectivity. In addition, if the make and engine displacement volume of the tested diesel vehicles are not taken into consideration, then the correlation between the black smoke reflectivity and nitrogen oxides emission weakens. However, when the tested vehicles were classified into various groups based on their makes and engine displacement volumes, then the make of a tested vehicle became a dominant factor for both the quantity and the trend of the black smoke reflectivity, as well as the NOx emission. Higher emission indices of black smoke reflectivity and nitrogen oxides were observed if the diesel vehicles were operated at low engine speed and full engine load conditions. Moreover, the larger the displacement volume of the engine of the tested vehicle, the lower the emission indices of both black smoke reflectivity and nitrogen oxides emitted. The

  26. 40 CFR 86.884-7 - Dynamometer operation cycle for smoke emission tests.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Dynamometer operation cycle for smoke... Dynamometer operation cycle for smoke emission tests. (a) The following sequence of operations shall be... the preconditioning prior to the smoke cycle. (ii) With the throttle remaining in the fully open...

  27. Correlation and classification of single kernel fluorescence hyperspectral data with aflatoxin concentration in corn kernels inoculated with Aspergillus flavus spores.

    PubMed

    Yao, H; Hruska, Z; Kincaid, R; Brown, R; Cleveland, T; Bhatnagar, D

    2010-05-01

    The objective of this study was to examine the relationship between fluorescence emissions of corn kernels inoculated with Aspergillus flavus and aflatoxin contamination levels within the kernels. Aflatoxin contamination in corn has been a long-standing problem plaguing the grain industry with potentially devastating consequences to corn growers. In this study, aflatoxin-contaminated corn kernels were produced through artificial inoculation of corn ears in the field with toxigenic A. flavus spores. The kernel fluorescence emission data were taken with a fluorescence hyperspectral imaging system when corn kernels were excited with ultraviolet light. Raw fluorescence image data were preprocessed and regions of interest in each image were created for all kernels. The regions of interest were used to extract spectral signatures and statistical information. The aflatoxin contamination level of single corn kernels was then chemically measured using affinity column chromatography. A fluorescence peak shift phenomenon was noted among different groups of kernels with different aflatoxin contamination levels. The fluorescence peak shift was found to move more toward the longer wavelength in the blue region for the highly contaminated kernels and toward the shorter wavelengths for the clean kernels. Highly contaminated kernels were also found to have a lower fluorescence peak magnitude compared with the less contaminated kernels. It was also noted that a general negative correlation exists between measured aflatoxin and the fluorescence image bands in the blue and green regions. The coefficient of determination, r(2), was 0.72 for the multiple linear regression model. The multivariate analysis of variance found that the fluorescence means of four aflatoxin groups, <1, 1-20, 20-100, and ≥100 ng g(-1) (parts per billion), were significantly different from each other at the 0.01 level of alpha. Classification accuracy under a two-class schema ranged from 0.84 to
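
    The abstract describes a multiple linear regression of fluorescence band intensities against chemically measured aflatoxin, followed by a two-class decision. A minimal numpy-only sketch of that workflow, with synthetic band data standing in for the hyperspectral measurements (everything numeric here is invented for illustration):

      # Sketch: regress aflatoxin concentration on fluorescence band intensities
      # and apply a simple two-class threshold. Data are synthetic placeholders.
      import numpy as np

      rng = np.random.default_rng(0)
      n_kernels, n_bands = 60, 5
      bands = rng.uniform(0.2, 1.0, size=(n_kernels, n_bands))     # mean band intensities
      aflatoxin_ppb = 400.0 * (1.0 - bands[:, 0]) + rng.normal(0, 20, n_kernels)

      X = np.column_stack([np.ones(n_kernels), bands])              # add intercept column
      coef, *_ = np.linalg.lstsq(X, aflatoxin_ppb, rcond=None)
      predicted = X @ coef

      ss_res = np.sum((aflatoxin_ppb - predicted) ** 2)
      ss_tot = np.sum((aflatoxin_ppb - aflatoxin_ppb.mean()) ** 2)
      print("r^2:", round(1.0 - ss_res / ss_tot, 2))

      # Two-class schema, e.g. "contaminated" above a 20 ppb cutoff
      predicted_class = predicted >= 20.0
      true_class = aflatoxin_ppb >= 20.0
      print("classification accuracy:", np.mean(predicted_class == true_class))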

  28. Determining size-specific emission factors for environmental tobacco smoke particles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klepeis, Neil E.; Apte, Michael G.; Gundel, Lara A.

    Because size is a major controlling factor for indoor airborne particle behavior, human particle exposure assessments will benefit from improved knowledge of size-specific particle emissions. We report a method of inferring size-specific mass emission factors for indoor sources that makes use of an indoor aerosol dynamics model, measured particle concentration time series data, and an optimization routine. This approach provides--in addition to estimates of the emissions size distribution and integrated emission factors--estimates of deposition rate, an enhanced understanding of particle dynamics, and information about model performance. We applied the method to size-specific environmental tobacco smoke (ETS) particle concentrations measured every minute with an 8-channel optical particle counter (PMS-LASAIR; 0.1-2+ micrometer diameters) and every 10 or 30 min with a 34-channel differential mobility particle sizer (TSI-DMPS; 0.01-1+ micrometer diameters) after a single cigarette or cigar was machine-smoked inside a low air-exchange-rate 20 m³ chamber. The aerosol dynamics model provided good fits to observed concentrations when using optimized values of mass emission rate and deposition rate for each particle size range as input. Small discrepancies observed in the first 1-2 hours after smoking are likely due to the effect of particle evaporation, a process neglected by the model. Size-specific ETS particle emission factors were fit with log-normal distributions, yielding an average mass median diameter of 0.2 micrometers and an average geometric standard deviation of 2.3 with no systematic differences between cigars and cigarettes. The equivalent total particle emission rate, obtained integrating each size distribution, was 0.2-0.7 mg/min for cigars and 0.7-0.9 mg/min for cigarettes.
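
    The method couples a single-zone aerosol mass balance with an optimization that adjusts the per-size-bin emission and deposition rates until modeled concentrations match the measurements. A minimal sketch for one size bin under those assumptions; the chamber parameters and "measured" series are hypothetical (the emission rate used to generate them is merely in the same ballpark as the cigarette rates reported in the abstract), and the model is a simple forward-Euler integration rather than the authors' aerosol dynamics code:

      # Sketch: infer an emission rate and deposition rate for one particle size bin
      # from chamber concentration data via a single-zone mass balance:
      #   dC/dt = E/V - (a + k) * C   while the cigarette burns, then -(a + k) * C,
      # where E is the emission rate, V the chamber volume, a the air-exchange rate,
      # and k the deposition rate. All numbers below are illustrative placeholders.
      import numpy as np
      from scipy.optimize import least_squares

      V = 20.0                          # chamber volume, m^3
      a = 0.1                           # air-exchange rate, 1/h (low, as in the study)
      t = np.linspace(0.0, 4.0, 241)    # hours, 1-min steps
      dt = t[1] - t[0]
      smoke_duration_h = 0.2            # time the cigarette is burning

      def model_conc(params):
          E, k = params                 # emission rate (mg/h), deposition rate (1/h)
          loss = a + k
          C, c = np.empty_like(t), 0.0
          for i, ti in enumerate(t):
              source = E / V if ti <= smoke_duration_h else 0.0
              c += (source - loss * c) * dt          # forward-Euler step
              C[i] = c
          return C

      # Hypothetical "measured" concentrations (mg/m^3) for this size bin
      measured = model_conc((48.0, 0.3)) + np.random.default_rng(1).normal(0.0, 0.01, t.size)

      fit = least_squares(lambda p: model_conc(p) - measured, x0=(10.0, 0.1),
                          bounds=([0.0, 0.0], [np.inf, np.inf]))
      E_fit, k_fit = fit.x
      print(f"fitted emission rate ~{E_fit:.0f} mg/h, deposition rate ~{k_fit:.2f} 1/h")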

  29. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) Definitions. Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 87.82 Sampling and analytical procedures for measuring smoke exhaust...

  30. Comparison of carcinogen, carbon monoxide, and ultrafine particle emissions from narghile waterpipe and cigarette smoking: Sidestream smoke measurements and assessment of second-hand smoke emission factors

    PubMed Central

    Daher, Nancy; Saleh, Rawad; Jaroudi, Ezzat; Sheheitli, Hiba; Badr, Thérèse; Sepetdjian, Elizabeth; Al Rashidi, Mariam; Saliba, Najat; Shihadeh, Alan

    2009-01-01

    The lack of scientific evidence on the constituents, properties, and health effects of second-hand waterpipe smoke has fueled controversy over whether public smoking bans should include the waterpipe. The purpose of this study was to investigate and compare emissions of ultrafine particles (UFP, <100 nm), carcinogenic polyaromatic hydrocarbons (PAH), volatile aldehydes, and carbon monoxide (CO) for cigarettes and narghile (shisha, hookah) waterpipes. These smoke constituents are associated with a variety of cancers, and heart and pulmonary diseases, and span the volatility range found in tobacco smoke. Sidestream cigarette and waterpipe smoke was captured and aged in a 1 m3 Teflon-coated chamber operating at 1.5 air changes per hour (ACH). The chamber was characterized for particle mass and number surface deposition rates. UFP and CO concentrations were measured online using a fast particle spectrometer (TSI 3090 Engine Exhaust Particle Sizer), and an indoor air quality monitor. Particulate PAH and gaseous volatile aldehydes were captured on glass fiber filters and DNPH-coated SPE cartridges, respectively, and analyzed off-line using GC–MS and HPLC–MS. PAH compounds quantified were the 5- and 6-ring compounds of the EPA priority list. Measured aldehydes consisted of formaldehyde, acetaldehyde, acrolein, methacrolein, and propionaldehyde. We found that a single waterpipe use session emits in the sidestream smoke approximately four times the carcinogenic PAH, four times the volatile aldehydes, and 30 times the CO of a single cigarette. Accounting for exhaled mainstream smoke, and given a habitual smoker smoking rate of 2 cigarettes per hour, during a typical one-hour waterpipe use session a waterpipe smoker likely generates ambient carcinogens and toxicants equivalent to 2–10 cigarette smokers, depending on the compound in question. There is therefore good reason to include waterpipe tobacco smoking in public smoking bans. PMID:20161525
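
    The equivalence argument at the end of the abstract compares one waterpipe session against the second-hand output of cigarette smokers over the same hour. A hedged sketch of that arithmetic as a generic function: the 2 cigarettes per hour figure comes from the abstract, the exhaled-mainstream term is how the abstract "accounts for exhaled mainstream smoke", and no emission values from the study are assumed.

      # Sketch of the equivalence reasoning: how many cigarette smokers would emit
      # the same second-hand amount of a compound as one waterpipe session.
      # Inputs are generic; callers supply per-session and per-cigarette emissions.
      def equivalent_cigarette_smokers(waterpipe_session_emission,
                                       cigarette_sidestream_emission,
                                       cigarette_exhaled_mainstream_emission,
                                       cigarettes_per_hour=2.0,
                                       session_hours=1.0):
          # Second-hand contribution of one cigarette smoker over the session:
          # sidestream plus exhaled mainstream smoke for each cigarette smoked.
          per_smoker = (cigarettes_per_hour * session_hours *
                        (cigarette_sidestream_emission +
                         cigarette_exhaled_mainstream_emission))
          return waterpipe_session_emission / per_smoker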

  31. Comparison of carcinogen, carbon monoxide, and ultrafine particle emissions from narghile waterpipe and cigarette smoking: Sidestream smoke measurements and assessment of second-hand smoke emission factors

    NASA Astrophysics Data System (ADS)

    Daher, Nancy; Saleh, Rawad; Jaroudi, Ezzat; Sheheitli, Hiba; Badr, Thérèse; Sepetdjian, Elizabeth; Al Rashidi, Mariam; Saliba, Najat; Shihadeh, Alan

    2010-01-01

    The lack of scientific evidence on the constituents, properties, and health effects of second-hand waterpipe smoke has fueled controversy over whether public smoking bans should include the waterpipe. The purpose of this study was to investigate and compare emissions of ultrafine particles (UFP, <100 nm), carcinogenic polyaromatic hydrocarbons (PAH), volatile aldehydes, and carbon monoxide (CO) for cigarettes and narghile (shisha, hookah) waterpipes. These smoke constituents are associated with a variety of cancers, and heart and pulmonary diseases, and span the volatility range found in tobacco smoke. Sidestream cigarette and waterpipe smoke was captured and aged in a 1 m 3 Teflon-coated chamber operating at 1.5 air changes per hour (ACH). The chamber was characterized for particle mass and number surface deposition rates. UFP and CO concentrations were measured online using a fast particle spectrometer (TSI 3090 Engine Exhaust Particle Sizer), and an indoor air quality monitor. Particulate PAH and gaseous volatile aldehydes were captured on glass fiber filters and DNPH-coated SPE cartridges, respectively, and analyzed off-line using GC-MS and HPLC-MS. PAH compounds quantified were the 5- and 6-ring compounds of the EPA priority list. Measured aldehydes consisted of formaldehyde, acetaldehyde, acrolein, methacrolein, and propionaldehyde. We found that a single waterpipe use session emits in the sidestream smoke approximately four times the carcinogenic PAH, four times the volatile aldehydes, and 30 times the CO of a single cigarette. Accounting for exhaled mainstream smoke, and given a habitual smoker smoking rate of 2 cigarettes per hour, during a typical one-hour waterpipe use session a waterpipe smoker likely generates ambient carcinogens and toxicants equivalent to 2-10 cigarette smokers, depending on the compound in question. There is therefore good reason to include waterpipe tobacco smoking in public smoking bans.

  32. Potential Fuel Loadings, Fire Ignitions, and Smoke Emissions from Nuclear Bursts in Megacities

    NASA Astrophysics Data System (ADS)

    Turco, R. P.; Toon, O. B.; Robock, A.; Bardeen, C.; Oman, L.; Stenchikov, G. L.

    2006-12-01

    We consider the effects of "small" nuclear detonations in modern "megacities," focusing on the possible extent of fire ignitions, and the properties of corresponding smoke emissions. Explosive devices in the multi-kiloton yield range are being produced by a growing number of nuclear states (Toon et al., 2006), and such weapons may eventually fall into the hands of terrorists. The numbers of nuclear weapons that might be used in a regional conflict, and their potential impacts on population and infrastructure, are discussed elsewhere. Here, we estimate the smoke emissions that could lead to widespread environmental effects, including large-scale climate anomalies. We find that low-yield weapons, which emerging nuclear states have been stockpiling, and which are likely to be targeted against cities in a regional war, can generate up to 100 times as much smoke per kiloton of yield as the high-yield weapons once associated with a superpower nuclear exchange. The fuel loadings in modern cities are estimated using a variety of data, including extrapolations from earlier detailed studies. The probability of ignition and combustion of fuels, smoke emission factors and radiative properties, and prompt scavenging and dispersion of the smoke are summarized. We conclude that a small regional nuclear war might generate up to 5 teragrams of highly absorbing particles in urban firestorms, and that this smoke could initially be injected into the middle and upper troposphere. These results are used to develop smoke emission scenarios for a climate impact analysis reported by Oman et al. (2006). Uncertainties in the present smoke estimates are outlined. Oman, L., A. Robock, G. L. Stenchikov, O. B. Toon, C. Bardeen and R. P. Turco, "Climatic consequences of regional nuclear conflicts," AGU, Fall 2006. Toon, O. B., R. P. Turco, A. Robock, C. Bardeen, L. Oman and G. L. Stenchikov, "Consequences of regional scale nuclear conflicts and acts of individual nuclear terrorism," AGU, Fall

  33. 14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82...

  34. 14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82..., Environmental Protection, Volume II, Aircraft Engine Emissions, Second Edition, July 1993, effective July 26...

  35. 14 CFR 34.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., DEPARTMENT OF TRANSPORTATION AIRCRAFT FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Test Procedures for Engine Smoke Emissions (Aircraft Gas Turbine Engines) § 34.82..., Environmental Protection, Volume II, Aircraft Engine Emissions, Second Edition, July 1993, effective July 26...

  36. South American smoke coverage and flux estimations from the Fire Locating and Modeling of Burning Emissions (FLAMBE') system.

    NASA Astrophysics Data System (ADS)

    Reid, J. S.; Westphal, D. L.; Christopher, S. A.; Prins, E. M.; Gasso, S.; Reid, E.; Theisen, M.; Schmidt, C. C.; Hunter, J.; Eck, T.

    2002-05-01

    The Fire Locating and Modeling of Burning Emissions (FLAMBE') project is a joint Navy, NOAA, NASA and university project to integrate satellite products with numerical aerosol models to produce a real time fire and emissions inventory. At the center of the program is the Wildfire Automated Biomass Burning Algorithm (WF ABBA) which provides real-time fire products and the NRL Aerosol Analysis and Prediction System (NAAPS) to model smoke transport. In this presentation we give a brief overview of the system and methods, but emphasize new estimations of smoke coverage and emission fluxes from the South American continent. Temporal and spatial smoke patterns compare reasonably well with AERONET and MODIS aerosol optical depth products for the 2000 and 2001 fire seasons. Fluxes are computed by relating NAAPS output fields and MODIS optical depth maps with modeled wind fields. Smoke emissions and transport fluxes out of the continent can then be estimated by perturbing the modeled emissions to gain agreement with the satellite and wind products. Regional smoke emissions are also presented for grass and forest burning.
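
    The flux computation described (relating column smoke loading from model and MODIS AOD fields to modeled winds) amounts to multiplying a column mass loading by the wind component normal to a boundary and integrating along it. A minimal sketch with made-up grid values; the mass extinction efficiency used to convert AOD to mass loading is a typical literature-scale smoke value assumed for illustration, not one taken from this work:

      # Sketch: estimate smoke mass flux across a boundary segment from AOD and wind.
      # Column mass loading M (g/m^2) ~ AOD / MEE, with MEE the smoke mass extinction
      # efficiency (m^2/g); flux ~ sum(M * v_perp * segment_length). Values are illustrative.
      import numpy as np

      aod = np.array([0.8, 1.2, 0.6, 0.3])          # along-boundary AOD samples
      v_perp = np.array([4.0, 5.5, 3.0, 2.0])       # wind normal to the boundary, m/s
      segment_length_m = 50_000.0                    # length represented by each sample
      mee_m2_per_g = 4.5                             # assumed smoke mass extinction efficiency

      mass_loading_g_m2 = aod / mee_m2_per_g
      flux_g_per_s = np.sum(mass_loading_g_m2 * v_perp * segment_length_m)
      print(f"outflow smoke flux ~{flux_g_per_s * 86400 / 1e6:,.0f} tonnes per day (illustrative)")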

  37. Building the Fire Energetics and Emissions Research (FEER) Smoke Emissions Inventory Version 1.0

    NASA Technical Reports Server (NTRS)

    Ellison, Luke; Ichoku, Charles; Zhang, Feng; Wang, Jun

    2014-01-01

    The Fire Energetics and Emissions Research (FEER) group's new coefficient of emission global gridded product at 1° × 1° resolution that directly relates fire radiative energy (FRE) to smoke aerosol release, FEERv1.0 Ce, made its public debut in August 2013. Since then, steps have been taken to generate corresponding maps and totals of total particulate matter (PM) emissions using different sources of FRE, and subsequently to simulate the resulting PM(sub 2.5) in the WRF-Chem 3.5 model using emission rates from FEERv1.0 as well as other standard biomass burning emission inventories. A flowchart of the FEER algorithm to calculate Ce is outlined here along with a display of the resulting emissions of total PM globally and also regionally. The modeling results from the WRF-Chem3.5 simulations are also shown.
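
    The step from the Ce grid to emission maps is a per-cell multiplication of the coefficient by time-integrated fire radiative power (i.e., fire radiative energy). A minimal sketch of that aggregation on a toy grid; all array values are placeholders rather than FEER data:

      # Sketch: combine a gridded emission coefficient (Ce, kg of TPM per MJ) with
      # time-integrated fire radiative power (FRE, MJ) to map total particulate
      # matter emissions per grid cell. The arrays below are toy placeholders.
      import numpy as np

      ce_kg_per_mj = np.array([[0.018, 0.021],
                               [0.016, 0.025]])           # 2x2 toy grid of coefficients
      fre_mj = np.array([[5.0e6, 0.0],
                         [1.2e7, 3.0e6]])                 # integrated FRP per cell

      tpm_emissions_kg = ce_kg_per_mj * fre_mj            # element-wise product
      print("TPM per cell (tonnes):")
      print(tpm_emissions_kg / 1000.0)
      print("regional total (tonnes):", tpm_emissions_kg.sum() / 1000.0)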

  38. Toxic volatile organic compounds in environmental tobacco smoke: Emission factors for modeling exposures of California populations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daisey, J.M.; Mahanama, K.R.R.; Hodgson, A.T.

    The primary objective of this study was to measure emission factors for selected toxic air contaminants in environmental tobacco smoke (ETS) using a room-sized environmental chamber. The emissions of 23 volatile organic compounds (VOCs), including 1,3-butadiene, three aldehydes and two vapor-phase N-nitrosamines were determined for six commercial brands of cigarettes and reference cigarette 1R4F. The commercial brands were selected to represent 62.5% of the cigarettes smoked in California. For each brand, three cigarettes were machine smoked in the chamber. The experiments were conducted over four hours to investigate the effects of aging. Emission factors of the target compounds were also determined for sidestream smoke (SS). For almost all target compounds, the ETS emission factors were significantly higher than the corresponding SS values probably due to less favorable combustion conditions and wall losses in the SS apparatus. Where valid comparisons could be made, the ETS emission factors were generally in good agreement with the literature. Therefore, the ETS emission factors, rather than the SS values, are recommended for use in models to estimate population exposures from this source. The variabilities in the emission factors (μg/cigarette) of the selected toxic air contaminants among brands, expressed as coefficients of variation, were 16 to 29%. Therefore, emissions among brands were generally similar. Differences among brands were related to the smoked lengths of the cigarettes and the masses of consumed tobacco. Mentholation and whether a cigarette was classified as light or regular did not significantly affect emissions. Aging was determined not to be a significant factor for the target compounds. There were, however, deposition losses of the less volatile compounds to chamber surfaces.

  39. Toxic Volatile Organic Compounds in Environmental Tobacco Smoke: Emission Factors for Modeling Exposures of California Populations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daisey, J.M.; Mahanama, K.R.R.; Hodgson, A.T.

    The primary objective of this study was to measure emission factors for selected toxic air contaminants in environmental tobacco smoke (ETS) using a room-sized environmental chamber. The emissions of 23 volatile organic compounds (VOCs), including 1,3-butadiene, three aldehydes and two vapor-phase N-nitrosamines were determined for six commercial brands of cigarettes and reference cigarette 1R4F. The commercial brands were selected to represent 62.5% of the cigarettes smoked in California. For each brand, three cigarettes were machine smoked in the chamber. The experiments were conducted over four hours to investigate the effects of aging. Emission factors of the target compounds were also determined for sidestream smoke (SS). For almost all target compounds, the ETS emission factors were significantly higher than the corresponding SS values probably due to less favorable combustion conditions and wall losses in the SS apparatus. Where valid comparisons could be made, the ETS emission factors were generally in good agreement with the literature. Therefore, the ETS emission factors, rather than the SS values, are recommended for use in models to estimate population exposures from this source. The variabilities in the emission factors (μg/cigarette) of the selected toxic air contaminants among brands, expressed as coefficients of variation, were 16 to 29%. Therefore, emissions among brands were generally similar. Differences among brands were related to the smoked lengths of the cigarettes and the masses of consumed tobacco. Mentholation and whether a cigarette was classified as light or regular did not significantly affect emissions. Aging was determined not to be a significant factor for the target compounds. There were, however, deposition losses of the less volatile compounds to chamber surfaces.

  40. Gas-phase organics in environmental tobacco smoke. 1. Effects of smoking rate, ventilation, and furnishing level on emission factors.

    PubMed

    Singer, Brett C; Hodgson, Alfred T; Guevarra, Karla S; Hawley, Elisabeth L; Nazaroff, William W

    2002-03-01

    We measured the emissions of 26 gas-phase organic compounds in environmental tobacco smoke (ETS) using a model room that simulates realistic conditions in residences and offices. Exposure-relevant emission factors (EREFs), which include the effects of sorption and re-emission over a 24-h period, were calculated by mass balance from measured compound concentrations and chamber ventilation rates in a 50-m3 room constructed and furnished with typical materials. Experiments were conducted at three smoking rates (5, 10, and 20 cigarettes day(-1)), three ventilation rates (0.3, 0.6, and 2 h(-1)), and three furnishing levels (wallboard with aluminum flooring, wallboard with carpet, and full furnishings). Smoking rate did not affect EREFs, suggesting that sorption was linearly related to gas-phase concentration. Furnishing level and ventilation rate in the model room had little effect on EREFs of several ETS compounds including 1,3-butadiene, acrolein, acrylonitrile, benzene, toluene, and styrene. However, sorptive losses at low ventilation with full furnishings reduced EREFs for the ETS tracers nicotine and 3-ethenylpyridine by as much as 90 and 65% as compared to high ventilation, wallboard/aluminum experiments. Likewise, sorptive losses were 40-70% for phenol, cresols, naphthalene, and methylnaphthalenes. Sorption persisted for many compounds; for example, almost all of the sorbed nicotine and most of the sorbed cresol remained sorbed 3 days after smoking. EREFs can be used in models and with ETS tracer-based methods to refine and improve estimates of exposures to ETS constituents.
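
    An exposure-relevant emission factor of this kind can be written as the compound mass removed by ventilation over the averaging period, divided by the number of cigarettes smoked. The sketch below evaluates that mass balance numerically; the concentration decay curve and cigarette count are hypothetical, and only the 50 m³ room volume and 0.6 h⁻¹ ventilation rate echo conditions mentioned in the abstract:

      # Sketch: exposure-relevant emission factor (EREF) from a chamber mass balance,
      #   EREF = integral( lambda * V * C(t) dt ) / n_cigarettes,
      # where lambda is the air-exchange rate, V the room volume, and C(t) the
      # measured concentration. Numbers are hypothetical, not values from the study.
      import numpy as np

      V = 50.0                          # room volume, m^3
      lam = 0.6                         # air-exchange rate, 1/h
      t = np.linspace(0.0, 24.0, 97)    # 24-h averaging period, 15-min steps
      dt = t[1] - t[0]
      C = 80.0 * np.exp(-0.4 * t)       # hypothetical compound decay, ug/m^3
      n_cigarettes = 10

      mass_removed_ug = np.sum(lam * V * C) * dt
      print(f"EREF ~{mass_removed_ug / n_cigarettes:.0f} ug per cigarette (illustrative)")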

  41. Use of MODIS-Derived Fire Radiative Energy to Estimate Smoke Aerosol Emissions over Different Ecosystems

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Kaufman, Yoram J.

    2003-01-01

    Biomass burning is the main source of smoke aerosols and certain trace gases in the atmosphere. However, estimates of the rates of biomass consumption and emission of aerosols and trace gases from fires have not attained adequate reliability thus far. Traditional methods for deriving emission rates employ the use of emission factors e_x (in g of species x per kg of biomass burned), which are difficult to measure from satellites. In this era of environmental monitoring from space, fire characterization was not a major consideration in the design of the early satellite-borne remote sensing instruments, such as AVHRR. Therefore, although they are able to provide fire location information, they were not adequately sensitive to variations in fire strength or size, because their thermal bands used for fire detection saturated at the lower end of the fire radiative temperature range. As such, hitherto, satellite-based emission estimates employ proxy techniques using satellite-derived fire pixel counts (which do not express the fire strength or rate of biomass consumption) or burned areas (which can only be obtained after the fire is over). The MODIS sensors, recently launched into orbit aboard the EOS Terra (1999) and Aqua (2002) satellites, have a much higher saturation level and can not only detect fire locations 4 times daily, but also measure the at-satellite fire radiative energy (which is a measure of the fire strength) based on their 4 micron channel temperature. Also, MODIS measures the optical thickness of smoke and other aerosols. Preliminary analysis shows appreciable correlation between the MODIS-derived rates of emission of fire radiative energy and smoke over different regions across the globe. These relationships hold great promise for deriving emission coefficients, which can be used for estimating smoke aerosol emissions from MODIS active fire products. This procedure has the potential to provide more accurate emission estimates in near real

  42. Time-resolved analysis of the emission of sidestream smoke (SSS) from cigarettes during smoking by photo ionisation/time-of-flight mass spectrometry (PI-TOFMS): towards a better description of environmental tobacco smoke.

    PubMed

    Streibel, T; Mitschke, S; Adam, T; Zimmermann, R

    2013-09-01

    In this study, the chemical composition of sidestream smoke (SSS) emissions of cigarettes is characterised using a laser-based single-photon ionisation time-of-flight mass spectrometer. SSS is generated from various cigarette types (2R4F research cigarette; Burley, Oriental and Virginia single-tobacco-type cigarettes) smoked on a single-port smoking machine and collected using a so-called fishtail chimney device. Using this setup, a puff-resolved quantification of several SSS components was performed. Investigations of the dynamics of SSS emissions show that concentration profiles of various substances can be categorised into several groups, either depending on the occurrence of a puff or uninfluenced by the changes in the burning zone during puffing. The SSS emissions occurring directly after a puff strongly resemble the composition of mainstream smoke (MSS). In the smouldering phase, clear differences between MSS and SSS are observed. The differing chemical profiles of SSS and MSS might also be of importance for environmental tobacco smoke, which is largely determined by SSS. Additionally, the chemical composition of the SSS is strongly affected by the tobacco type. Hence, the higher nitrogen content of Burley tobacco leads to the detection of increased amounts of nitrogen-containing substances in SSS.

  43. Wildfire Smoke Emissions webinar

    EPA Pesticide Factsheets

    This webinar presented by Wayne Cascio will highlight updates to the Wildfire Smoke Guide, as well as the Smoke Sense app, which is a mobile application that gets air quality information to people impacted by wildfire smoke, and helps those affected learn

  44. Implementation and evaluation of a comprehensive emission model for Europe

    NASA Astrophysics Data System (ADS)

    Bieser, Johannes; Aulinger, Armin; Matthias, Volker; Quante, Markus

    2010-05-01

    Crucial input data sets for Chemical Transport Models (CTM) are the meteorological fields and the emissions data. While there are several publicly available meteorological models, the situation for European emission models is still different. European emissions data either lack spatial and temporal resolution, only cover specific countries or are proprietary and not free to use. In this work the US EPA emission model SMOKE (Sparse Matrix Operator Kernel Emissions) has been successfully adapted and partially extended to create European emissions input for CTMs. The modified version of the SMOKE emission model (SMOKE/E) uses official and publicly available data sets and statistics to create emissions of CO, NOx, SO2, NH3, PM2.5, PM10, NMVOC. Currently it supports VOC splits for several photochemical mechanisms, namely CB4, CB5 and RADM2. PM2.5 is split into elemental carbon, organic carbon, sulfate, nitrate and other particles. Additionally, emissions of benzo[a]pyrene (BaP) have been modelled with SMOKE Europe. The temporal resolution of the emissions is one hour, the horizontal resolution is up to 1x1 km². SMOKE/E also implements plume-in-grid calculations for vertical distribution of point sources. The vertical resolution is freely configurable and is implemented in the form of pressure levels. The area covered by the emission model at this point is Europe and its surrounding countries, including north Africa and parts of Asia. Thus far SMOKE Europe has been used to create European emissions on a 54x54 km² grid covering the whole of Europe and an 18x18 km² nested grid over the North and Baltic Sea for the years 1990-2006. The currently implemented datasets allow for the calculation of emissions between 1970-2010. Besides this, future emission scenarios for the timespan 2010-2020 are being calculated using the EMEP projections. The created emissions have been statistically compared to the gridded EMEP emissions as well as to data from other emission models for the
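
    One of the processing steps the abstract mentions is splitting bulk NMVOC totals into mechanism species (e.g., for CB5) using speciation profiles. A minimal sketch of that operation; the profile fractions below are invented for illustration and are not SMOKE/E's actual profiles:

      # Sketch: apply a speciation profile to a bulk NMVOC emission total to obtain
      # mechanism-ready species, as an emission processor like SMOKE does per source
      # category. Profile fractions are illustrative placeholders, not real profiles.
      nmvoc_total_kg = 1250.0                 # annual NMVOC for one source category / cell

      cb5_profile = {                         # mass fractions, must sum to 1.0 (hypothetical)
          "PAR": 0.45,
          "TOL": 0.12,
          "XYL": 0.10,
          "OLE": 0.08,
          "FORM": 0.05,
          "ALD2": 0.05,
          "ETH": 0.07,
          "UNR": 0.08,
      }
      assert abs(sum(cb5_profile.values()) - 1.0) < 1e-9

      speciated_kg = {species: frac * nmvoc_total_kg for species, frac in cb5_profile.items()}
      for species, mass in speciated_kg.items():
          print(f"{species:5s} {mass:8.1f} kg")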

  5. Global Top-Down Smoke-Aerosol Emissions Estimation Using Satellite Fire Radiative Power Measurements

    NASA Technical Reports Server (NTRS)

    Ichoku, C.; Ellison, L.

    2014-01-01

    Fire emissions estimates have long been based on bottom-up approaches that are not only complex, but also fraught with compounding uncertainties. We present the development of a global gridded (1° × 1°) emission coefficients (Ce) product for smoke total particulate matter (TPM) based on a top-down approach using coincident measurements of fire radiative power (FRP) and aerosol optical thickness (AOT) from the Moderate-resolution Imaging Spectroradiometer (MODIS) sensors aboard the Terra and Aqua satellites. This new Fire Energetics and Emissions Research version 1.0 (FEER.v1) Ce product has now been released to the community and can be obtained from http://feer.gsfc.nasa.gov/, along with the corresponding 1-to-1 mapping of their quality assurance (QA) flags that will enable the Ce values to be filtered by quality for use in various applications. The regional averages of Ce values for different ecosystem types were found to be in the ranges of 16-21 g MJ-1 for savanna and grasslands, 15-32 g MJ-1 for tropical forest, 9-12 g MJ-1 for North American boreal forest, and 18-26 g MJ-1 for Russian boreal forest, croplands and natural vegetation. The FEER.v1 Ce product was multiplied by time-integrated FRP data to calculate regional smoke TPM emissions, which were compared with equivalent emissions products from three existing inventories. FEER.v1 showed higher and more reasonable smoke TPM estimates than two other emissions inventories that are based on bottom-up approaches and already reported in the literature to be too low, but portrayed an overall reasonable agreement with another top-down approach. This suggests that top-down approaches may hold better promise and need to be further developed to accelerate the reduction of uncertainty associated with fire emissions estimation in air-quality and climate research and applications. Results of the analysis of FEER.v1 data for 2004-2011 show that 65-85 Tg yr-1 of TPM is emitted globally from open biomass burning, with a
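    The core of this top-down approach is multiplying the gridded emission coefficient Ce (g MJ-1) by time-integrated FRP, i.e. fire radiative energy (FRE, in MJ). A minimal sketch of that step is shown below; the Ce values, FRP values and burn duration are hypothetical placeholders, not FEER.v1 data:

    ```python
    import numpy as np

    # Sketch of the FEER-style top-down relation: smoke TPM emission = Ce * FRE,
    # where Ce (g/MJ) is a gridded emission coefficient and FRE (MJ) is the
    # time-integrated fire radiative power. All numbers are hypothetical.

    ce_grid = np.array([[18.0, 21.0],
                        [9.0, 26.0]])           # g MJ^-1, one value per grid cell

    frp_mw = np.array([[120.0, 0.0],
                       [45.0, 300.0]])          # mean FRP per cell, MW
    hours_burning = 6.0
    fre_mj = frp_mw * hours_burning * 3600.0    # MW * s = MJ

    tpm_emissions_g = ce_grid * fre_mj          # grams of smoke TPM per cell
    print(tpm_emissions_g / 1e6, "tonnes per cell")
    ```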

  6. Smoking topography and biomarkers of exposure among Japanese smokers: associations with cigarette emissions obtained using machine smoking protocols.

    PubMed

    Matsumoto, Mariko; Inaba, Yohei; Yamaguchi, Ichiro; Endo, Osamu; Hammond, David; Uchiyama, Shigehisa; Suzuki, Gen

    2013-03-01

    Although the relative risk of lung cancer due to smoking is reported to be lower in Japan than in other countries, few studies have examined the characteristics of Japanese cigarettes or potential differences in smoking patterns among Japanese smokers. To examine tar, nicotine and carbon monoxide (TNCO) emissions from ten leading cigarettes in Japan, machine smoking tests were conducted using the International Organization for Standardization (ISO) protocol and the Health Canada Intense (HCI) protocol. Smoking topography and tobacco-related biomarkers were collected from 101 Japanese smokers to examine measures of exposure. The findings indicate considerable variability in the smoking behavior of Japanese smokers. On average, puffing behaviors observed among smokers were more similar to the parameters of the HCI protocol, and brands with greater ventilation that yielded lower machine values using the ISO protocol were smoked more intensely than brands with lower levels of ventilation. Smokers of "ultra-low/low" nicotine-yield cigarettes smoked 2.7-fold more intensively than smokers of "medium/high" nicotine-yield cigarettes to achieve the same level of salivary cotinine (p = 0.024). CO levels in expiratory breath samples were associated with puff volume and self-reported smoking intensity, but not with the nominal values of nicotine yield reported on cigarette packages. Japanese smokers engaged in "compensatory smoking" to achieve their desired nicotine intake, and levels of exposure were greater than those suggested by the nominal nicotine and tar yields reported on cigarette packages.

  7. SMOKE TOOL FOR MODELS-3 VERSION 4.1 STRUCTURE AND OPERATION DOCUMENTATION

    EPA Science Inventory

    The SMOKE Tool is a part of the Models-3 system, a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. The SMOKE Tool is an input processor for SMOKE, (Sparse Matrix Operator Kernel Emissio...

  8. Laboratory investigation of fire radiative energy and smoke aerosol emissions

    Treesearch

    Charles Ichoku; J. Vanderlei Martins; Yoram J. Kaufman; Martin J. Wooster; Patrick H. Freeborn; Wei Min Hao; Stephen Baker; Cecily A. Ryan; Bryce L. Nordgren

    2008-01-01

    Fuel biomass samples from southern Africa and the United States were burned in a laboratory combustion chamber while measuring the biomass consumption rate, the fire radiative energy (FRE) release rate (Rfre), and the smoke concentrations of carbon monoxide (CO), carbon dioxide (CO2), and particulate matter (PM). The PM mass emission rate (RPM) was quantified from...

  9. Sensitivity of Mesoscale Modeling of Smoke Direct Radiative Effect to the Emission Inventory: a Case Study in Northern Sub-Saharan African Region

    NASA Technical Reports Server (NTRS)

    Zhang, Feng; Wang, Jun; Ichoku, Charles; Hyer, Edward J.; Yang, Zhifeng; Ge, Cui; Su, Shenjian; Zhang, Xiaoyang; Kondragunta, Shobha; Kaiser, Johannes W.; hide

    2014-01-01

    An ensemble approach is used to examine the sensitivity of smoke loading and smoke direct radiative effect in the atmosphere to uncertainties in smoke emission estimates. Seven different fire emission inventories are applied independently to the WRF-Chem model (v3.5) with the same model configuration (excluding dust and other emission sources) over the northern sub-Saharan African (NSSA) biomass-burning region. Results for November and February 2010 are analyzed, respectively representing the start and end of the biomass burning season in the study region. For February 2010, estimates of total smoke emission vary by a factor of 12, but only differences by factors of 7 or less are found in the simulated regional (15°W-42°E, 13°S-17°N) and monthly averages of column PM2.5 loading, surface PM2.5 concentration, aerosol optical depth (AOD), smoke radiative forcing at the top-of-atmosphere and at the surface, and air temperature at 2 m and at 700 hPa. The smaller differences in these simulated variables may reflect atmospheric diffusion and deposition effects that dampen the large differences in smoke emissions, which are highly concentrated in areas much smaller than the regional domain of the study. Indeed, at the local scale, large differences (up to a factor of 33) persist in simulated smoke-related variables and radiative effects, including the semi-direct effect. Similar results are also found for November 2010, despite differences in meteorology and fire activity. Hence, biomass burning emission uncertainties have a large influence on the reliability of model simulations of atmospheric aerosol loading, transport, and radiative impacts, and this influence is largest at local and hourly-to-daily scales. Accurate quantification of smoke effects on regional climate and air quality requires further reduction of emission uncertainties, particularly for regions of high fire concentrations such as NSSA.
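    The headline sensitivity numbers above (emission totals differing by a factor of 12, simulated fields by a factor of 7 or less) are max/min ratios across the inventory ensemble. A trivial sketch of that diagnostic, with made-up regional monthly totals, is:

    ```python
    import numpy as np

    # Ensemble-spread diagnostic: ratio of the largest to the smallest regional
    # monthly emission total across fire inventories. Values (Tg/month) are
    # hypothetical placeholders, not the inventories used in the study.

    inventories = {
        "INV_A": 0.8, "INV_B": 1.5, "INV_C": 3.2, "INV_D": 9.6,
        "INV_E": 2.1, "INV_F": 5.0, "INV_G": 0.9,
    }
    values = np.array(list(inventories.values()))
    spread_factor = values.max() / values.min()
    print(f"emission totals differ by a factor of {spread_factor:.1f}")
    ```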

  10. Prescribed Grassland Burning Smoke Emission Measurements in the Northern Flint Hills Region

    NASA Astrophysics Data System (ADS)

    Wilkins, J. L.; Baker, K. R.; Landis, M.; Aurell, J.; Gullett, B.

    2017-12-01

    Historically, frequent wildfires were essential for the maintenance of native, fire-adapted prairie ecosystems. Today prescribed fires are used to control invasive woody species and potentially improve forage production in these same prairie ecosystems for the beef-cattle industry. The emission of primary particulate matter, secondary aerosol, ozone precursors, and air toxics from prescribed grassland burning operations has been implicated as a driver of downwind air quality problems across a multi-state area. A field study has been planned to quantify prescribed burn smoke emissions using both surface and aerial sampling platforms to better constrain emission rates for organic and inorganic pollutants. Multiple prescribed burns on tallgrass prairie fields in the northern Flint Hills ecoregion are planned for March 2017 at the Konza Prairie Biological Station in Kansas. An array of measurement systems will be deployed to quantify a suite of continuous and integrated air pollution parameters, combustion conditions, meteorological parameters, and plume dynamics to calculate more accurate and condition-specific emission factors that will be used to better predict primary and secondary pollutants both locally and regionally. These emission measurements will allow for evaluation and improvement of the U.S. Forest Service's BlueSky modeling framework, which includes the Fire Emission Production Simulator (FEPS) and the Fuel Characteristic Classification System (FCCS). Elucidating grassland prescribed burning emission factors based on fuel type, loading, and environmental conditions is expected to provide an improved understanding of the impact of this land management practice on air quality in the greater Flint Hills region. It is also expected that measurements will be made to help constrain and develop better routines for fire plume rise, vertical allocation, and smoke optical properties.

  11. Public Health Impact of Wildfire Emissions: Update on the Wildfire Smoke Guide, Public Health Information and Communications Research

    EPA Science Inventory

    EPA Tools and Resources Webinar: Public Health Impact of Wildfire Smoke Emissions. Specific strategies to reduce smoke exposure and the Smoke Sense App. As the start of the summer wildfire season approaches, public officials, communities and individuals need up-to-date wildfire smo...

  12. Kernel Machine SNP-set Testing under Multiple Candidate Kernels

    PubMed Central

    Wu, Michael C.; Maity, Arnab; Lee, Seunggeun; Simmons, Elizabeth M.; Harmon, Quaker E.; Lin, Xinyi; Engel, Stephanie M.; Molldrem, Jeffrey J.; Armistead, Paul M.

    2013-01-01

    Joint testing for the cumulative effect of multiple single nucleotide polymorphisms grouped on the basis of prior biological knowledge has become a popular and powerful strategy for the analysis of large scale genetic association studies. The kernel machine (KM) testing framework is a useful approach that has been proposed for testing associations between multiple genetic variants and many different types of complex traits by comparing pairwise similarity in phenotype between subjects to pairwise similarity in genotype, with similarity in genotype defined via a kernel function. An advantage of the KM framework is its flexibility: choosing different kernel functions allows for different assumptions concerning the underlying model and can allow for improved power. In practice, it is difficult to know which kernel to use a priori since this depends on the unknown underlying trait architecture and selecting the kernel which gives the lowest p-value can lead to inflated type I error. Therefore, we propose practical strategies for KM testing when multiple candidate kernels are present based on constructing composite kernels and based on efficient perturbation procedures. We demonstrate through simulations and real data applications that the procedures protect the type I error rate and can lead to substantially improved power over poor choices of kernels and only modest differences in power versus using the best candidate kernel. PMID:23471868
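    One of the composite-kernel strategies mentioned above amounts to combining candidate kernel matrices before testing. A minimal Python sketch of that construction is given below; the choice of candidate kernels (linear and Gaussian) and their weights are illustrative assumptions, not the paper's prescription:

    ```python
    import numpy as np

    # Sketch of building a composite kernel as a weighted sum of candidate
    # genotype kernels. Candidate kernels and weights are illustrative.

    def linear_kernel(G):
        return G @ G.T

    def gaussian_kernel(G, sigma=1.0):
        sq = np.sum(G**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * (G @ G.T)
        return np.exp(-d2 / (2.0 * sigma**2))

    def composite_kernel(G, weights=(0.5, 0.5)):
        """Weighted sum of candidate kernels; weights sum to 1."""
        K1, K2 = linear_kernel(G), gaussian_kernel(G)
        return weights[0] * K1 + weights[1] * K2

    rng = np.random.default_rng(0)
    genotypes = rng.integers(0, 3, size=(20, 50)).astype(float)  # 20 subjects x 50 SNPs
    K = composite_kernel(genotypes)
    print(K.shape)
    ```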

  13. Using JPSS VIIRS Fire Radiative Power Data to Forecast Biomass Burning Emissions and Smoke Transport by the High Resolution Rapid Refresh Model

    NASA Astrophysics Data System (ADS)

    Ahmadov, R.; Grell, G. A.; James, E.; Alexander, C.; Stewart, J.; Benjamin, S.; McKeen, S. A.; Csiszar, I. A.; Tsidulko, M.; Pierce, R. B.; Pereira, G.; Freitas, S. R.; Goldberg, M.

    2017-12-01

    We present a new real-time smoke modeling system, the High Resolution Rapid Refresh coupled with smoke (HRRR-Smoke), to simulate biomass burning (BB) emissions, plume rise and smoke transport in real time. The HRRR is the NOAA Earth System Research Laboratory's 3 km grid spacing version of the Weather Research and Forecasting (WRF) model used for weather forecasting. Here we make use of WRF-Chem (the WRF model coupled with chemistry) and simulate fine particulate matter (smoke) emissions emitted by BB. The HRRR-Smoke modeling system ingests fire radiative power (FRP) data from the Visible Infrared Imaging Radiometer Suite (VIIRS) sensor on the Suomi National Polar-orbiting Partnership (S-NPP) satellite to calculate BB emissions. The FRP product is based on processing 750 m resolution "M" bands. The algorithms for fire detection and FRP retrieval are consistent with those used to generate the MODIS fire detection data. For the purpose of ingesting VIIRS fire data into the HRRR-Smoke model, text files are generated to provide the location and detection confidence of fire pixels, as well as FRP. The VIIRS FRP data from the text files are processed and remapped over the HRRR-Smoke model domains. We process the FRP data to calculate BB emissions (smoldering part) and fire size for the model input. In addition, HRRR-Smoke uses the FRP data to simulate the injection height for the flaming emissions using meteorological fields concurrently simulated by the model. Currently, there are two 3 km resolution domains covering the contiguous US and Alaska which are used to simulate smoke in real time. In our presentation, we focus on the CONUS domain. HRRR-Smoke is initialized 4 times per day to forecast smoke concentrations for the next 36 hours. The VIIRS FRP data, as well as near-surface and vertically integrated smoke mass concentrations are visualized for every forecast hour. These plots are provided to the public via the HRRR-Smoke web-page: https
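    The remapping step described above takes point fire detections (latitude, longitude, FRP) and accumulates them onto the model grid. A hedged sketch of that operation on a simple regular latitude/longitude grid is shown below; the grid spacing, domain bounds and detection values are placeholders, and the operational preprocessing of course works on the HRRR projection rather than a plain lat/lon grid:

    ```python
    import numpy as np

    # Sketch of remapping point FRP detections onto a regular lat/lon grid by
    # summing FRP per cell. All values below are hypothetical placeholders.

    def grid_frp(lats, lons, frp_mw, lat_edges, lon_edges):
        """Sum FRP (MW) of fire pixels falling into each grid cell."""
        grid, _, _ = np.histogram2d(lats, lons, bins=[lat_edges, lon_edges],
                                    weights=frp_mw)
        return grid

    lat_edges = np.arange(25.0, 50.0 + 0.5, 0.5)
    lon_edges = np.arange(-125.0, -65.0 + 0.5, 0.5)

    # three hypothetical fire-pixel detections: location and FRP
    lats = np.array([34.2, 34.3, 45.1])
    lons = np.array([-118.4, -118.5, -110.0])
    frp = np.array([250.0, 80.0, 35.0])

    frp_grid = grid_frp(lats, lons, frp, lat_edges, lon_edges)
    print(frp_grid.sum(), "MW total gridded FRP")
    ```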

  14. Secondhand smoke emission levels in waterpipe cafes in Doha, Qatar.

    PubMed

    Al Mulla, Ahmad; Fanous, Nadia; Seidenberg, Andrew B; Rees, Vaughan W

    2015-10-01

    Exposure to the emissions of a tobacco waterpipe is associated with increased health risks among its users as well as those exposed to its secondhand smoke. Waterpipe use is an emerging concern to the tobacco control community, particularly among countries of the Eastern Mediterranean Region. In 2002, Qatar adopted legislation that prohibited cigarette smoking inside public venues, but exempted tobacco waterpipe smoking. To inform the development and enforcement of effective policy, the impact of cigarette and waterpipe use on indoor air quality was monitored in waterpipe cafes in Doha, Qatar. Particulate matter (PM2.5) levels were measured inside and outside of a sample of 40 waterpipe cafes and 16 smoke-free venues in Doha, Qatar between July and October 2012. In addition, the number of waterpipes being smoked and the number of cigarette smokers were counted within each venue. Non-paired and paired sample t tests were used to assess differences in mean PM2.5 measurements between venue type (waterpipe vs smoke-free) and environment (indoor vs outdoor). The mean PM2.5 level inside waterpipe venues (476 μg/m³) was significantly higher than the mean PM2.5 level inside smoke-free venues (17 μg/m³; p<0.001), and significantly higher than the mean PM2.5 level found immediately outside waterpipe venues (35 μg/m³; p<0.001). In smoke-free venues, the outside mean PM2.5 level (30 μg/m³) did not differ significantly from the mean PM2.5 level inside these venues (p=0.121). Elevated levels of particulate pollution were found in waterpipe cafes in Doha, Qatar, potentially endangering the health of employees and patrons. To protect the public from the dangers of secondhand tobacco smoke, and to change social norms around tobacco use, smoke-free policies that apply to all forms of combusted tobacco products, including the waterpipe, are needed.
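    The comparisons above rest on standard unpaired and paired t-tests of mean PM2.5. A short Python sketch of those two tests is given below, using made-up measurement vectors rather than the study's data:

    ```python
    import numpy as np
    from scipy import stats

    # Sketch of the statistical comparisons described above, with made-up PM2.5
    # values (ug/m3): an unpaired t-test between venue types and a paired t-test
    # between indoor and outdoor measurements at the same venues.

    rng = np.random.default_rng(1)
    pm_inside_waterpipe = rng.normal(476, 150, size=40)
    pm_inside_smokefree = rng.normal(17, 5, size=16)
    pm_outside_waterpipe = rng.normal(35, 10, size=40)

    # unpaired: waterpipe vs smoke-free venues
    t_unpaired, p_unpaired = stats.ttest_ind(pm_inside_waterpipe,
                                             pm_inside_smokefree, equal_var=False)
    # paired: inside vs immediately outside the same waterpipe venues
    t_paired, p_paired = stats.ttest_rel(pm_inside_waterpipe, pm_outside_waterpipe)

    print(f"unpaired p={p_unpaired:.2e}, paired p={p_paired:.2e}")
    ```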

  15. Gas-phase organics in environmental tobacco smoke: 2. Exposure-relevant emission factors and indirect exposures from habitual smoking

    NASA Astrophysics Data System (ADS)

    Singer, Brett C.; Hodgson, Alfred T.; Nazaroff, William W.

    Sorption of emitted gas-phase organic compounds onto material surfaces affects environmental tobacco smoke (ETS) composition and exposures indoors. We have introduced a new metric, the exposure-relevant emission factor (EREF), which accounts for sorptive uptake and reemission to give the mass of individual ETS constituents available for exposure over a day in which smoking occurs. This paper describes month-long experiments to investigate sorption effects on EREFs and potential ETS exposures under habitual smoking conditions. Cigarettes were smoked in a 50-m³ furnished room over a 3-h period 6-7 days per week, with continuous ventilation at 0.3, 0.6, or 2.1 h⁻¹. Organic gas concentrations were measured every few days over 4-h "smoking", 10-h "post-smoking" and 10-h "background" periods. Concentration patterns of volatile ETS components including 1,3-butadiene, benzene and acrolein were similar to those calculated for a theoretical non-sorbing tracer, indicating limited sorption. Concentrations of ETS tracers, e.g. 3-ethenylpyridine (3-EP) and nicotine, and lower-volatility toxic air contaminants including phenol, cresols, and naphthalene increased as experiments progressed, indicating mass accumulation on surfaces and higher desorption rates. Daily patterns stabilized after week 2, yielding a steady daily cycle of ETS concentrations associated with habitual smoking. EREFs for sorbing compounds were higher under steady-cycle versus single-day smoking conditions by ~50% for 3-EP, and by 2-3 times for nicotine, phenol, cresols, naphthalene, and methylnaphthalenes. Our results provide relevant information about potential indirect exposures from residual ETS (non-smoker enters room shortly after smoker finishes) and from reemission, and their importance relative to direct exposures (non-smoker present during smoking). Under the conditions examined, indirect exposures accounted for a larger fraction of total potential exposures for sorbing versus non-sorbing compounds

  16. Smoke plumes: Emissions and effects

    Treesearch

    Susan O' Neill; Shawn Urbanski; Scott Goodrick; Sim Larkin

    2017-01-01

    Smoke can manifest itself as a towering plume rising against the clear blue sky-or as a vast swath of thick haze, with fingers that settle into valleys overnight. It comes in many forms and colors, from fluffy and white to thick and black. Smoke plumes can rise high into the atmosphere and travel great distances across oceans and continents. Or smoke can remain close...

  17. A coupled high-resolution modeling system to simulate biomass burning emissions, plume rise and smoke transport in real time over the contiguous US

    NASA Astrophysics Data System (ADS)

    Ahmadov, R.; Grell, G. A.; James, E.; Freitas, S.; Pereira, G.; Csiszar, I. A.; Tsidulko, M.; Pierce, R. B.; McKeen, S. A.; Saide, P.; Alexander, C.; Benjamin, S.; Peckham, S.

    2016-12-01

    Wildfires can have a huge impact on air quality and visibility over large parts of the US. It is quite challenging to accurately predict wildfire air quality given significant uncertainties in modeling of biomass burning (BB) emissions, fire size, plume rise and smoke transport. We developed a new smoke modeling system (HRRR-Smoke) based on the coupled meteorology-chemistry model WRF-Chem. The HRRR-Smoke modeling system uses fire radiative power (FRP) data measured by the Visible Infrared Imaging Radiometer Suite (VIIRS) sensor on the Suomi National Polar-orbiting Partnership satellite. Using the FRP data enables predicting fire emissions, fire size and plume rise more accurately. Another advantage of the VIIRS data is fire detection and characterization at high spatial resolution during both day and nighttime. The HRRR-Smoke model is run in real time for summer 2016 at 3 km horizontal grid resolution over the CONUS domain by NOAA/ESRL Global Systems Division (GSD). The model simulates advection and mixing of fine particulate matter (PM2.5, i.e. smoke) from the calculated BB emissions. The BB emissions include both smoldering and flaming fractions. Fire plume rise is parameterized in an online mode during the model integration. In addition to smoke, anthropogenic emissions of PM2.5 are transported in an inline mode as a passive tracer by HRRR-Smoke. The HRRR-Smoke real-time runs use meteorological fields for initial and lateral boundary conditions from the experimental real-time HRRR(X) numerical weather prediction model also run at NOAA/ESRL/GSD. The model is initialized every 6 hours (00, 06, 12 and 18 UTC) daily using newly generated meteorological fields and FRP data obtained during the previous 24 hours. Then the model produces meteorological and smoke forecasts for the next 36 hours. The smoke fields are cycled from one forecast to the next. Predicted near-surface and vertically integrated smoke concentrations are visualized online on a web-site: http

  18. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  19. 40 CFR 87.82 - Sampling and analytical procedures for measuring smoke exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling and analytical procedures for measuring smoke exhaust emissions. 87.82 Section 87.82 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM AIRCRAFT AND AIRCRAFT ENGINES...

  20. Experimental analysis of performance and emission on DI diesel engine fueled with diesel-palm kernel methyl ester-triacetin blends: a Taguchi fuzzy-based optimization.

    PubMed

    Panda, Jibitesh Kumar; Sastry, Gadepalli Ravi Kiran; Rai, Ram Naresh

    2018-05-25

    The energy situation and concerns about global warming have ignited research interest in non-conventional and alternative fuel resources to decrease emissions and the continued dependency on fossil fuels, particularly for sectors such as power generation, transportation, and agriculture. In the present work, the research is focused on evaluating the performance, emission, and combustion characteristics of a biodiesel, palm kernel methyl ester (PKME), blended with the diesel additive triacetin. A timed manifold injection (TMI) system was used to examine the influence of induction duration for several blends on the emission and performance characteristics compared with normal diesel operation. This experimental study shows better performance and lower emissions compared with mineral diesel, indicating that high performance and low emissions are achievable with PKME-triacetin fuel operation. The analysis also describes the application of a fuzzy logic-based Taguchi method to optimize the emission and performance parameters.

  1. Comparative carcinogenic potencies of particulates from diesel engine exhausts, coke oven emissions, roofing tar aerosols and cigarette smoke.

    PubMed Central

    Albert, R E

    1983-01-01

    Mammalian cell mutagenesis, transformation and skin tumorigenesis assays show similar results in comparing the potencies of diesel, coke oven, roofing tar and cigarette smoke particulates. These assay results are reasonably consistent with the comparative carcinogenic potencies of coke oven and roofing tar emissions as determined by epidemiological studies. The bacterial mutagenesis assay tends to show disproportionately high potencies, particularly with diesel particulates. Results to date support an approach to assessing carcinogenic risks from diesel emissions that is based on epidemiological data on cancer induced by coke oven emissions, roofing tar particulates and cigarette smoke, with the comparative potencies of these materials determined by in vivo and in vitro bioassays. PMID:6186481

  2. Mesoscale modeling of Central American smoke transport to the United States: 1. ``Top-down'' assessment of emission strength and diurnal variation impacts

    NASA Astrophysics Data System (ADS)

    Wang, Jun; Christopher, Sundar A.; Nair, U. S.; Reid, Jeffrey S.; Prins, Elaine M.; Szykman, James; Hand, Jenny L.

    2006-03-01

    As is typical in the Northern Hemisphere spring, during 20 April to 21 May 2003, significant biomass burning smoke from Central America was transported to the southeastern United States (SEUS). A coupled aerosol, radiation, and meteorology model, built upon the heritage of the Regional Atmospheric Modeling System (RAMS) and extended with the newly developed Assimilation and Radiation Online Modeling of Aerosols (AROMA) algorithm, was used to simulate the smoke transport and quantify the smoke radiative impacts on surface energetics, the boundary layer, and other atmospheric processes. This paper, the first of a two-part series, describes the model and examines the ability of RAMS-AROMA to simulate the smoke transport. Because biomass-burning fire activities have distinct diurnal variations, the FLAMBE hourly smoke emission inventory that is derived from the geostationary satellite (GOES) fire products was assimilated into the model. In the "top-down" analysis, ground-based observations were used to evaluate the model performance, and the comparisons with model-simulated results were used to estimate emission uncertainties. Qualitatively, the smoke spatial distribution over a 30-day simulation, as well as the timing and location of the smoke fronts, are consistent with those identified from the PM2.5 observation network, local air quality reports, and the measurements of aerosol optical thickness (AOT) and aerosol vertical profiles from the Southern Great Plains (SGP) Atmospheric Radiation Measurement (ARM) site in Oklahoma. Quantitatively, the model-simulated daily mean near-surface dry smoke mass correlates well with PM2.5 mass at 34 locations in Texas and with the total carbon mass and nonsoil potassium mass (KNON) at three IMPROVE sites along the smoke pathway (with linear correlation coefficients R = 0.77, 0.74, and 0.69 at significance levels greater than 0.99, respectively). The top-down sensitivity analysis indicates that the total smoke particle emission

  3. Kernel Abortion in Maize 1

    PubMed Central

    Hanft, Jonathan M.; Jones, Robert J.

    1986-01-01

    Kernels cultured in vitro were induced to abort by high temperature (35°C) and by culturing six kernels/cob piece. Aborting kernels failed to enter a linear phase of dry mass accumulation and had a final mass that was less than 6% of nonaborting field-grown kernels. Kernels induced to abort by high temperature failed to synthesize starch in the endosperm and had elevated sucrose concentrations and low fructose and glucose concentrations in the pedicel during early growth compared to nonaborting kernels. Kernels induced to abort by high temperature also had much lower pedicel soluble acid invertase activities than did nonaborting kernels. These results suggest that high temperature during the lag phase of kernel growth may impair the process of sucrose unloading in the pedicel by indirectly inhibiting soluble acid invertase activity and prevent starch synthesis in the endosperm. Kernels induced to abort by culturing six kernels/cob piece had reduced pedicel fructose, glucose, and sucrose concentrations compared to kernels from field-grown ears. These aborting kernels also had a lower pedicel soluble acid invertase activity compared to nonaborting kernels from the same cob piece and from field-grown ears. The low invertase activity in pedicel tissue of the aborting kernels was probably caused by a lack of substrate (sucrose) for the invertase to cleave due to the intense competition for available assimilates. In contrast to kernels cultured at 35°C, aborting kernels from cob pieces containing all six kernels accumulated starch in a linear fashion. These results indicate that kernels cultured six/cob piece abort because of an inadequate supply of sugar and are similar to apical kernels from field-grown ears that often abort prior to the onset of linear growth. PMID:16664846

  4. Smoke aerosol chemistry and aging of Siberian biomass burning emissions in a large aerosol chamber

    NASA Astrophysics Data System (ADS)

    Kalogridis, A.-C.; Popovicheva, O. B.; Engling, G.; Diapouli, E.; Kawamura, K.; Tachibana, E.; Ono, K.; Kozlov, V. S.; Eleftheriadis, K.

    2018-07-01

    Vegetation open fires constitute a significant source of particulate pollutants on a global scale and play an important role in both atmospheric chemistry and climate change. To better understand the emission and aging characteristics of smoke aerosols, we performed small-scale fire experiments using the Large Aerosol Chamber (LAC, 1800 m³) with a focus on biomass burning from Siberian boreal coniferous forests. A series of burn experiments were conducted with typical Siberian biomass (pine and debris), simulating separately different combustion conditions, namely flaming, smoldering and mixed phase. Following smoke emission and dispersion in the combustion chamber, we investigated aging of aerosols under dark conditions. Here, we present experimental data on emission factors of total, elemental and organic carbon, as well as individual organic compounds, such as anhydrosugars, phenolic and dicarboxylic acids. We found that total carbon accounts for up to 80% of the fine mode (PM2.5) smoke aerosol. Higher PM2.5 emission factors were observed in the smoldering phase compared to the flaming phase, and for pine compared to debris during smoldering. For low-temperature combustion, organic carbon (OC) contributed more than 90% of total carbon, whereas elemental carbon (EC) dominated the aerosol composition in flaming burns with a 60-70% contribution to the total carbon mass. For all smoldering burns, levoglucosan (LG), a cellulose decomposition product, was the most abundant organic species (average LG/OC = 0.26 for pine smoldering), followed by its isomer mannosan or dehydroabietic acid (DA), an important constituent of conifer resin (DA/OC = 0.033). A levoglucosan-to-mannosan ratio of about 3 was observed, which is consistent with ratios reported for coniferous biomass and, more generally, softwood. The rates of aerosol removal for OC and individual organic compounds were investigated during aging in the chamber in terms of mass concentration loss rates over time under dark

  5. Approximate kernel competitive learning.

    PubMed

    Wu, Jian-Sheng; Zheng, Wei-Shi; Lai, Jian-Huang

    2015-03-01

    Kernel competitive learning has been successfully used to achieve robust clustering. However, kernel competitive learning (KCL) is not scalable for large-scale data processing, because (1) it has to calculate and store the full kernel matrix, which is too large to be computed and kept in memory, and (2) it cannot be computed in parallel. In this paper we develop a framework of approximate kernel competitive learning for processing large-scale datasets. The proposed framework consists of two parts. First, it derives an approximate kernel competitive learning (AKCL) method, which learns kernel competitive learning in a subspace via sampling. We provide solid theoretical analysis on why the proposed approximation modelling works for kernel competitive learning, and furthermore, we show that the computational complexity of AKCL is largely reduced. Second, we propose a pseudo-parallelled approximate kernel competitive learning (PAKCL) method based on a set-based kernel competitive learning strategy, which overcomes the obstacle of using parallel programming in kernel competitive learning and significantly accelerates the approximate kernel competitive learning for large-scale clustering. The empirical evaluation on publicly available datasets shows that the proposed AKCL and PAKCL can perform comparably to KCL, with a large reduction in computational cost. Also, the proposed methods achieve more effective clustering performance in terms of clustering precision against related approximate clustering approaches. Copyright © 2014 Elsevier Ltd. All rights reserved.
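    The AKCL idea of working in a sampled subspace is closely related to the familiar Nystrom approximation. The sketch below is not the authors' algorithm, only a generic illustration of the concept: approximate kernel features are built from a random sample of landmark points, and a simple winner-take-all (competitive) update is then run on those features. Kernel width, sample size and learning rate are arbitrary choices:

    ```python
    import numpy as np

    # Generic sketch (not the AKCL algorithm itself): Nystrom-style kernel feature
    # approximation from sampled landmarks, followed by a winner-take-all update.

    def rbf(X, Y, sigma=1.0):
        d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
        return np.exp(-d2 / (2 * sigma**2))

    def nystrom_features(X, m=50, sigma=1.0, seed=0):
        rng = np.random.default_rng(seed)
        idx = rng.choice(len(X), size=m, replace=False)   # sampled landmark points
        W = rbf(X[idx], X[idx], sigma)                    # m x m landmark kernel
        C = rbf(X, X[idx], sigma)                         # n x m cross kernel
        vals, vecs = np.linalg.eigh(W)
        vals = np.clip(vals, 1e-12, None)
        return C @ vecs / np.sqrt(vals)                   # approximate feature map

    X = np.random.default_rng(2).normal(size=(500, 5))
    Phi = nystrom_features(X, m=50)

    # simple competitive (winner-take-all) update on the approximate features
    k, lr = 3, 0.1
    centers = Phi[:k].copy()
    for x in Phi:
        j = np.argmin(np.linalg.norm(centers - x, axis=1))
        centers[j] += lr * (x - centers[j])
    print(centers.shape)
    ```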

  6. Emission of Gas and Al2O3 Smoke in Gas-Al Particle Deflagration: Experiments and Emission Modeling for Explosive Fireballs

    NASA Astrophysics Data System (ADS)

    Ranc-Darbord, Isabelle; Baudin, Gérard; Genetier, Marc; Ramel, David; Vasseur, Pierre; Legrand, Julien; Pina, Vincent

    2018-03-01

    Emission of gas and Al2O3 smoke within the deflagration of H2-O2-N2-CO2-Al particle mixtures has been studied in a closed combustion chamber at pressures of up to 18 bar and at gas temperatures of up to 3700 K. Measurements of radiance intensity were taken using a five-wavelength pyrometer (0.660 μm, 0.850 μm, 1.083 μm, 1.260 μm, 1.481 μm) and a grating spectrometer in the range 4.10 μm to 4.30 μm. In order to characterize the aluminum oxide smoke size and temperature, an inversion method has been developed based on the radiation transfer equation and using pyrometer measurements and thermochemical calculations of Al2O3 smoke volume fractions. Temperatures in the combustion gas have been determined using a method based on the assumed blackbody head of the 4.26 μm CO2 emission line and on its spectral shift with pressure and temperature. For validation purposes, this method has been applied to measurements obtained when calibrated alumina particles are injected into a combustion chamber prior to gaseous deflagrations. This mathematical inversion method was developed to investigate explosive fireballs.
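    A basic building block of such pyrometer inversions is the comparison of measured radiances with blackbody (Planck) radiance at the instrument wavelengths. The sketch below only evaluates the Planck function at the five pyrometer wavelengths for an assumed temperature; it is not the paper's inversion method:

    ```python
    import numpy as np

    # Planck spectral radiance at the five pyrometer wavelengths, as would be
    # compared with measured radiances in a temperature inversion. Physical
    # constants are standard; the temperature below is hypothetical.

    H = 6.626_070_15e-34   # Planck constant, J s
    C = 2.997_924_58e8     # speed of light, m/s
    KB = 1.380_649e-23     # Boltzmann constant, J/K

    def planck_radiance(wavelength_m, temperature_k):
        """Blackbody spectral radiance, W m^-3 sr^-1 (per metre of wavelength)."""
        a = 2.0 * H * C**2 / wavelength_m**5
        b = H * C / (wavelength_m * KB * temperature_k)
        return a / np.expm1(b)

    pyrometer_wavelengths_um = np.array([0.660, 0.850, 1.083, 1.260, 1.481])
    radiance = planck_radiance(pyrometer_wavelengths_um * 1e-6, temperature_k=3000.0)
    print(radiance)
    ```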

  7. Classification With Truncated Distance Kernel.

    PubMed

    Huang, Xiaolin; Suykens, Johan A K; Wang, Shuning; Hornegger, Joachim; Maier, Andreas

    2018-05-01

    This brief proposes a truncated distance (TL1) kernel, which results in a classifier that is nonlinear in the global region but is linear in each subregion. With this kernel, the subregion structure can be trained using all the training data and local linear classifiers can be established simultaneously. The TL1 kernel has good adaptiveness to nonlinearity and is suitable for problems which require different nonlinearities in different areas. Though the TL1 kernel is not positive semidefinite, some classical kernel learning methods are still applicable, which means that the TL1 kernel can be directly used in standard toolboxes by replacing the kernel evaluation. In numerical experiments, the TL1 kernel with a pregiven parameter achieves similar or better performance than the radial basis function kernel with the parameter tuned by cross-validation, implying that the TL1 kernel is a promising nonlinear kernel for classification tasks.
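    Using the TL1 kernel "by replacing the kernel evaluation" can be illustrated with a precomputed-kernel classifier. The sketch below assumes the usual form of a truncated L1-distance kernel, k(x, y) = max(rho - ||x - y||_1, 0), and plugs it into scikit-learn's SVC as a precomputed kernel; the synthetic data and the choice rho = 0.7 * (number of features) are illustrative assumptions:

    ```python
    import numpy as np
    from sklearn.svm import SVC

    # Sketch of a truncated L1-distance kernel, k(x, y) = max(rho - ||x - y||_1, 0),
    # used as a precomputed kernel in a standard SVM toolbox.

    def tl1_kernel(X, Y, rho):
        d1 = np.abs(X[:, None, :] - Y[None, :, :]).sum(axis=2)   # pairwise L1 distances
        return np.maximum(rho - d1, 0.0)

    rng = np.random.default_rng(3)
    X_train = rng.normal(size=(100, 10))
    y_train = (X_train[:, 0] + 0.3 * rng.normal(size=100) > 0).astype(int)
    X_test = rng.normal(size=(20, 10))

    rho = 0.7 * X_train.shape[1]     # illustrative choice of the truncation parameter
    clf = SVC(kernel="precomputed").fit(tl1_kernel(X_train, X_train, rho), y_train)
    pred = clf.predict(tl1_kernel(X_test, X_train, rho))
    print(pred)
    ```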

  8. Wexler's Great Smoke Pall: a chemistry-climate model analysis of a singularly large emissions pulse

    NASA Astrophysics Data System (ADS)

    Field, R. D.; Voulgarakis, A.

    2011-12-01

    We model the effects of the smoke plume from what was arguably the largest forest fire in recorded history. The Chinchaga fire burned continuously during the summer of 1950 in northwestern Canada during a very dry fire season. On September 22nd, the fire made a major advance, burning an area of approximately 1400 km². Ground and aircraft observations showed that from September 22 to 28, the smoke plume from the emissions pulse travelled over northern Canada, southward over the Great Lakes region and eastern US, across the Atlantic, and to Western Europe. Over the Great Lakes region, the plume remained thick enough to create twilight conditions in the mid-afternoon, and was estimated to have caused a 4 °C cooling at the surface. While many instances of long-range transport of wildfire emissions have been detected over the past decade, we know of no other wildfire which created such an acute effect on downward shortwave radiation at such a long distance. As a result, the fire was an important analogue event used in estimating the effects of a nuclear winter. Simulations with the nudged version of the GISS chemistry-climate model accurately capture the long-range transport pattern of the smoke emissions in the free troposphere. The timing and location of aircraft observations of the plume over the eastern US, North Atlantic and the United Kingdom were well-matched to modeled anomalies of CO and aerosol optical depth. Further work will examine the model's ability to create twilight conditions during the day, and to provide an estimate of the consequent cooling effects at the surface from this remarkable emissions pulse.

  9. Quantifying Uncertainty in Daily Temporal Variations of Atmospheric NH3 Emissions Following Application of Chemical Fertilizers

    NASA Astrophysics Data System (ADS)

    Balasubramanian, S.; Koloutsou-Vakakis, S.; Rood, M. J.

    2014-12-01

    Improving modeling predictions of atmospheric particulate matter and deposition of reactive nitrogen requires representative emission inventories of precursor species, such as ammonia (NH3). Anthropogenic NH3 is primarily emitted to the atmosphere from agricultural sources (80-90%), with dominant contributions (56%) from chemical fertilizer usage (CFU) in regions like the Midwest USA. Local crop management practices vary spatially and temporally, which influences regional air quality. To model the impact of CFU, NH3 emission inputs to chemical transport models are obtained from the National Emission Inventory (NEI). NH3 emissions from CFU are typically estimated by combining annual fertilizer sales data with emission factors. The Sparse Matrix Operator Kernel Emissions (SMOKE) model is used to disaggregate annual emissions to the hourly scale using temporal factors. These factors are estimated by apportioning emissions within each crop season in proportion to the nitrogen applied and time-averaging to the hourly scale. Such an approach does not reflect the influence of CFU practices for different crops or of local weather and soil conditions. This study provides an alternate approach for estimating temporal factors for NH3 emissions. The DeNitrification DeComposition (DNDC) model was used to estimate daily variations in NH3 emissions from CFU at 14 Central Illinois locations for 2002-2011. Weather, crop and soil data were provided as inputs. A method was developed to estimate site-level CFU by combining planting and harvesting dates, nitrogen management and fertilizer sales data. DNDC results indicated that annual NH3 emissions were within ±15% of SMOKE estimates. Daily modeled emissions across 10 years followed similar distributions but varied in magnitude within ±20%. Individual emission peaks on days after CFU were 2.5-8 times greater than existing estimates from SMOKE. By identifying the episodic nature of NH3 emissions from CFU, this study is expected to provide improvements
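    SMOKE-style temporal allocation applies monthly, day-of-week and diurnal profiles multiplicatively to an annual total. The sketch below illustrates that disaggregation in Python; the profile values and annual total are hypothetical placeholders, not actual SMOKE profiles or NEI numbers:

    ```python
    import numpy as np

    # Sketch of temporal allocation: an annual NH3 total is split to hours using
    # monthly, day-of-week and diurnal profiles. All values are hypothetical.

    monthly = np.array([2, 2, 6, 14, 16, 10, 8, 8, 10, 14, 6, 4], dtype=float)
    monthly /= monthly.sum()                  # fraction of annual total per month

    weekly = np.ones(7) / 7.0                 # uniform day-of-week profile
    diurnal = np.ones(24)
    diurnal[8:18] = 2.0                       # daytime-weighted hourly profile
    diurnal /= diurnal.sum()

    def hourly_emission(annual_total, month_idx, days_in_month, weekday_idx, hour_idx):
        """Emission in one hour, given an annual total and the temporal profiles."""
        day_fraction = weekly[weekday_idx] / (weekly.sum() * days_in_month / 7.0)
        return annual_total * monthly[month_idx] * day_fraction * diurnal[hour_idx]

    # total emitted on one April day from a 1000 kg annual total
    april_day_total = sum(hourly_emission(1000.0, 3, 30, 2, h) for h in range(24))
    print(april_day_total)
    ```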

  10. Assessment of biomass burning smoke influence on environmental conditions for multiyear tornado outbreaks by combining aerosol-aware microphysics and fire emission constraints

    NASA Astrophysics Data System (ADS)

    Saide, Pablo E.; Thompson, Gregory; Eidhammer, Trude; da Silva, Arlindo M.; Pierce, R. Bradley; Carmichael, Gregory R.

    2016-09-01

    We use the Weather Research and Forecasting (WRF) system to study the impacts of biomass burning smoke from Central America on several tornado outbreaks occurring in the U.S. during spring. The model is configured with an aerosol-aware microphysics parameterization capable of resolving aerosol-cloud-radiation interactions in a cost-efficient way for numerical weather prediction (NWP) applications. Primary aerosol emissions are included, and smoke emissions are constrained using an inverse modeling technique and satellite-based aerosol optical depth observations. Simulations turning on and off fire emissions reveal smoke presence in all tornado outbreaks being studied and show an increase in aerosol number concentrations due to smoke. However, the likelihood of occurrence and intensification of tornadoes is higher due to smoke only in cases where cloud droplet number concentration in low-level clouds increases considerably in a way that modifies the environmental conditions where the tornadoes are formed (shallower cloud bases and higher low-level wind shear). Smoke absorption and vertical extent also play a role, with smoke absorption at cloud-level tending to burn-off clouds and smoke absorption above clouds resulting in an increased capping inversion. Comparing these and WRF-Chem simulations configured with a more complex representation of aerosol size and composition and different optical properties, microphysics, and activation schemes, we find similarities in terms of the simulated aerosol optical depths and aerosol impacts on near-storm environments. This provides reliability on the aerosol-aware microphysics scheme as a less computationally expensive alternative to WRF-Chem for its use in applications such as NWP and cloud-resolving simulations.

  11. 49 CFR Appendix B to Part 238 - Test Methods and Performance Criteria for the Flammability and Smoke Emission Characteristics of...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., etc.) shall be designed against acting as passageways for fire and smoke and representative... structural flooring assembly to perform as a barrier against under-vehicle fires. The fire resistance period... Flammability and Smoke Emission Characteristics of Materials Used in Passenger Cars and Locomotive Cabs B...

  12. Development and Implementation of a Formal Framework for Bottom-up Uncertainty Analysis of Input Emissions: Case Study of Residential Wood Combustion

    NASA Astrophysics Data System (ADS)

    Zhao, S.; Mashayekhi, R.; Saeednooran, S.; Hakami, A.; Ménard, R.; Moran, M. D.; Zhang, J.

    2016-12-01

    We have developed a formal framework for documentation, quantification, and propagation of uncertainties in upstream emissions inventory data at various stages leading to the generation of model-ready gridded emissions through emissions processing software such as the EPA's SMOKE (Sparse Matrix Operator Kernel Emissions) system. To illustrate this framework we present a proof-of-concept case study of a bottom-up quantitative assessment of uncertainties in emissions from residential wood combustion (RWC) in the U.S. and Canada. Uncertainties associated with key inventory parameters are characterized based on existing information sources, including the American Housing Survey (AHS) from the U.S. Census Bureau, Timber Products Output (TPO) surveys from the U.S. Forest Service, TNS Canadian Facts surveys, and the AP-42 emission factor document from the U.S. EPA. The propagation of uncertainties is based on Monte Carlo simulation code external to SMOKE. Latin Hypercube Sampling (LHS) is implemented to generate a set of random realizations of each RWC inventory parameter, for which the uncertainties are assumed to be normally distributed. Random realizations are also obtained for each RWC temporal and chemical speciation profile and spatial surrogate field external to SMOKE using the LHS approach. SMOKE outputs for primary emissions (e.g., CO, VOC) using both RWC emission inventory realizations and perturbed temporal and chemical profiles and spatial surrogates show relative uncertainties of about 30-50% across the U.S. and about 70-100% across Canada. Positive skewness values (up to 2.7) and variable kurtosis values (up to 4.8) were also found. Spatial allocation contributes significantly to the overall uncertainty, particularly in Canada. By applying this framework we are able to produce random realizations of model-ready gridded emissions that along with available meteorological ensembles can be used to propagate uncertainties through chemical transport models. The
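    The Latin Hypercube step described above can be reproduced in a few lines with scipy's quasi-Monte Carlo module: uniform LHS samples are mapped through the inverse normal CDF of each parameter's assumed distribution. The parameter names, means and standard deviations below are hypothetical placeholders, not the study's values:

    ```python
    import numpy as np
    from scipy.stats import norm, qmc

    # Sketch of Latin Hypercube Sampling of normally distributed inventory
    # parameters, external to any emissions processor. Values are hypothetical.

    params = {                                 # parameter: (mean, standard deviation)
        "wood_burned_ton": (1.0e6, 2.0e5),
        "emission_factor_g_per_kg": (8.0, 2.5),
    }

    n_realizations = 1000
    sampler = qmc.LatinHypercube(d=len(params), seed=42)
    u = sampler.random(n=n_realizations)       # uniform LHS samples in [0, 1)

    realizations = np.column_stack([
        norm.ppf(u[:, i], loc=mu, scale=sd)    # map to each normal marginal
        for i, (mu, sd) in enumerate(params.values())
    ])
    print(realizations.mean(axis=0), realizations.std(axis=0))
    ```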

  13. Effect of low-density polyethylene on smoke emissions from burning of simulated debris piles.

    PubMed

    Hosseini, Seyedehsan; Shrivastava, Manish; Qi, Li; Weise, David R; Cocker, David R; Miller, John W; Jung, Heejung S

    2014-06-01

    Low-density polyethylene (LDPE) plastic is used to keep piled debris from silvicultural activities (activities associated with the development and care of forests) dry, to enable efficient disposal by burning. The effects of including LDPE in this manner on smoke emissions are not well known. In a combustion laboratory experiment, 2-kg mixtures of LDPE and manzanita (Arctostaphylos sp.) wood containing 0, 0.25, and 2.5% LDPE by mass were burned. Gaseous and particulate emissions were sampled in real time during the entire flaming phase, the mixed combustion phase (when the flaming and smoldering phases are present at the same time), and during a portion of the smoldering phase. Analysis of variance was used to test the significance of modified combustion efficiency (MCE), the ratio of fire-integrated excess CO2 to CO2 plus CO concentrations, and of LDPE content on measured individual compounds. MCE ranged between 0.983 and 0.993, indicating that combustion was primarily flaming; MCE was seldom significant as a covariate. Of the 195 compounds identified in the smoke emissions, only the emission factor (EF) of 3M-octane showed an increase with increasing LDPE content. Inclusion of LDPE had an effect on the EFs of pyrene and fluoranthene, but no statistical evidence of a linear trend was found. Particulate emission factors showed a marginally significant linear relationship with MCE (0.05 < P-value < 0.10). Based on the results of the current and previous studies and literature reviews, the inclusion of small mass proportions of LDPE in piled silvicultural debris does not appear to change the emissions produced when low-moisture-content wood is burned. In general, combustion of wet piles results in lower MCEs and consequently higher levels of emissions. Current air quality regulations permit the use of burning to dispose of silvicultural piles; however, inclusion of low-density polyethylene (LDPE) plastic in silvicultural piles can result in a designation of the pile as waste. Waste
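    The MCE definition quoted above reduces to a ratio of fire-integrated excess concentrations. A minimal Python sketch, using invented CO2 and CO time series and background levels, is:

    ```python
    import numpy as np

    # Modified combustion efficiency (MCE) as defined above:
    # MCE = integrated excess CO2 / (integrated excess CO2 + integrated excess CO).
    # Concentration time series and backgrounds below are hypothetical.

    co2_ppm = np.array([420, 900, 1500, 1100, 600])   # in-plume CO2
    co_ppm = np.array([0.2, 4.0, 9.0, 6.0, 2.0])      # in-plume CO
    co2_bg, co_bg = 415.0, 0.15                        # background levels

    excess_co2 = np.sum(co2_ppm - co2_bg)              # fire-integrated excess (equal time steps)
    excess_co = np.sum(co_ppm - co_bg)

    mce = excess_co2 / (excess_co2 + excess_co)
    print(f"MCE = {mce:.3f}")                          # values near 1 indicate flaming
    ```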

  14. Optimized Kernel Entropy Components.

    PubMed

    Izquierdo-Verdiguier, Emma; Laparra, Valero; Jenssen, Robert; Gomez-Chova, Luis; Camps-Valls, Gustau

    2017-06-01

    This brief addresses two main issues of the standard kernel entropy component analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to a sorting of the importance of kernel eigenvectors by entropy instead of variance, as in the kernel principal components analysis. In this brief, we propose an extension of the KECA method, named optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by means of compacting the information in very few features (often in just one or two). The proposed method produces features which have higher expressive power. In particular, it is based on the independent component analysis framework, and introduces an extra rotation to the eigen decomposition, which is optimized via gradient-ascent search. This maximum entropy preservation suggests that OKECA features are more efficient than KECA features for density estimation. In addition, a critical issue in both the methods is the selection of the kernel parameter, since it critically affects the resulting performance. Here, we analyze the most common kernel length-scale selection criteria. The results of both the methods are illustrated in different synthetic and real problems. Results show that OKECA returns projections with more expressive power than KECA, the most successful rule for estimating the kernel parameter is based on maximum likelihood, and OKECA is more robust to the selection of the length-scale parameter in kernel density estimation.
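    The entropy-based ranking that KECA performs (and that OKECA refines) can be summarised compactly: eigenpairs of the kernel matrix are ranked by their contribution lambda_i * (1^T e_i)^2 to a Renyi entropy estimate rather than by eigenvalue alone. The sketch below illustrates plain KECA-style feature extraction, not the OKECA optimization; the Gaussian kernel width and data are arbitrary:

    ```python
    import numpy as np

    # KECA-style feature extraction: rank kernel eigencomponents by their
    # entropy contribution lambda_i * (1^T e_i)^2, then project onto the top ones.

    def gaussian_kernel(X, sigma=1.0):
        d2 = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
        return np.exp(-d2 / (2 * sigma**2))

    def keca_features(X, n_components=2, sigma=1.0):
        K = gaussian_kernel(X, sigma)
        vals, vecs = np.linalg.eigh(K)                     # ascending eigenvalues
        entropy_contrib = vals * (vecs.sum(axis=0) ** 2)   # lambda_i * (1^T e_i)^2
        top = np.argsort(entropy_contrib)[::-1][:n_components]
        return vecs[:, top] * np.sqrt(np.clip(vals[top], 0, None))

    X = np.random.default_rng(4).normal(size=(200, 3))
    Phi = keca_features(X, n_components=2)
    print(Phi.shape)
    ```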

  15. An Overview of the New FEER Smoke Emissions Product and Its Applications over Northern Sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Ellison, L. T.; Ichoku, C. M.

    2012-12-01

    A new smoke emissions inventory is being derived by NASA's Fire Energetics and Emissions Research (FEER, http://feer.gsfc.nasa.gov/) group in conjunction with the NASA-funded interdisciplinary research project on the interactions and feedbacks between biomass burning and water cycle dynamics across the Northern Sub-Saharan African (NSSA) region. The vast amount of anthropogenic biomass burning conducted in NSSA during the dry months contributes significant amounts of gaseous and particulate emissions to the local climate system. The emissions product presented here is a result of the efforts made to utilize quantitative satellite measures of important fire and smoke variables to generate an accurate emissions product that can be used to quantify the relationship between biomass burning and regional climate impacts. This new product is based on a unique top-down approach whereby radiant energy and emission rates are related from independent yet coincident remotely sensed retrievals of fire radiative power (FRP) and aerosol optical depth (AOD) from the two active Moderate Resolution Imaging Spectroradiometer (MODIS) instruments. The algorithm produces a 1×1° global grid of coefficients of emission, Ce, that directly relate FRP to emission rates, or equivalently, fire radiative energy (FRE, the temporally integrated FRP curve) to emissions. Thus, emissions can be easily and quickly obtained in a given region by multiplying the Ce grid with FRP measurements acquired within that region. The Ce product offers the user flexibility in using any desired FRP data source, and the lag time in generating emissions is only constrained by that of obtaining FRP. The accuracy of this emissions product and its comparisons to other established emissions databases are presented here, as is a discussion of the contribution that this product will make toward accounting for climate variabilities in the NSSA region.

  16. Intercomparison of Fire Size, Fuel Loading, Fuel Consumption, and Smoke Emissions Estimates on the 2006 Tripod Fire, Washington, USA

    Treesearch

    Stacy A. Drury; Narasimhan Larkin; Tara T. Strand; ShihMing Huang; Scott J. Strenfel; Theresa E. O' Brien; Sean M. Raffuse

    2014-01-01

    Land managers rely on prescribed burning and naturally ignited wildfires for ecosystem management, and must balance trade-offs of air quality, carbon storage, and ecosystem health. A current challenge for land managers when using fire for ecosystem management is managing smoke production. Smoke emissions are a potential human health hazard due to the production of fine...

  17. Assessment of biomass burning smoke influence on environmental conditions for multi-year tornado outbreaks by combining aerosol-aware microphysics and fire emission constraints.

    PubMed

    Saide, Pablo E; Thompson, Gregory; Eidhammer, Trude; da Silva, Arlindo M; Pierce, R Bradley; Carmichael, Gregory R

    2016-09-16

    We use the WRF system to study the impacts of biomass burning smoke from Central America on several tornado outbreaks occurring in the US during spring. The model is configured with an aerosol-aware microphysics parameterization capable of resolving aerosol-cloud-radiation interactions in a cost-efficient way for numerical weather prediction (NWP) applications. Primary aerosol emissions are included and smoke emissions are constrained using an inverse modeling technique and satellite-based AOD observations. Simulations turning on and off fire emissions reveal smoke presence in all tornado outbreaks being studied and show an increase in aerosol number concentrations due to smoke. However, the likelihood of occurrence and intensification of tornadoes is higher due to smoke only in cases where cloud droplet number concentration in low level clouds increases considerably in a way that modifies the environmental conditions where the tornadoes are formed (shallower cloud bases and higher low-level wind shear). Smoke absorption and vertical extent also play a role, with smoke absorption at cloud-level tending to burn-off clouds and smoke absorption above clouds resulting in an increased capping inversion. Comparing these and WRF-Chem simulations configured with a more complex representation of aerosol size and composition and different optical properties, microphysics and activation schemes, we find similarities in terms of the simulated aerosol optical depths and aerosol impacts on near-storm environments. This provides reliability on the aerosol-aware microphysics scheme as a less computationally expensive alternative to WRF-Chem for its use in applications such as NWP and cloud-resolving simulations.

  18. Assessment of Biomass Burning Smoke Influence on Environmental Conditions for Multi-Year Tornado Outbreaks by Combining Aerosol-Aware Microphysics and Fire Emission Constraints

    NASA Technical Reports Server (NTRS)

    Saide, Pablo E.; Thompson, Gregory; Eidhammer, Trude; Da Silva, Arlindo M.; Pierce, R. Bradley; Carmichael, Gregory R.

    2016-01-01

    We use the WRF system to study the impacts of biomass burning smoke from Central America on several tornado outbreaks occurring in the US during spring. The model is configured with an aerosol-aware microphysics parameterization capable of resolving aerosol-cloud-radiation interactions in a cost-efficient way for numerical weather prediction (NWP) applications. Primary aerosol emissions are included and smoke emissions are constrained using an inverse modeling technique and satellite-based AOD observations. Simulations turning on and off fire emissions reveal smoke presence in all tornado outbreaks being studied and show an increase in aerosol number concentrations due to smoke. However, the likelihood of occurrence and intensification of tornadoes is higher due to smoke only in cases where cloud droplet number concentration in low level clouds increases considerably in a way that modifies the environmental conditions where the tornadoes are formed (shallower cloud bases and higher low-level wind shear). Smoke absorption and vertical extent also play a role, with smoke absorption at cloud-level tending to burn-off clouds and smoke absorption above clouds resulting in an increased capping inversion. Comparing these and WRF-Chem simulations configured with a more complex representation of aerosol size and composition and different optical properties, microphysics and activation schemes, we find similarities in terms of the simulated aerosol optical depths and aerosol impacts on near-storm environments. This provides reliability on the aerosol-aware microphysics scheme as a less computationally expensive alternative to WRFChem for its use in applications such as NWP and cloud-resolving simulations.

  19. Assessment of biomass burning smoke influence on environmental conditions for multi-year tornado outbreaks by combining aerosol-aware microphysics and fire emission constraints

    PubMed Central

    Saide, Pablo E.; Thompson, Gregory; Eidhammer, Trude; da Silva, Arlindo M.; Pierce, R. Bradley; Carmichael, Gregory R.

    2018-01-01

    We use the WRF system to study the impacts of biomass burning smoke from Central America on several tornado outbreaks occurring in the US during spring. The model is configured with an aerosol-aware microphysics parameterization capable of resolving aerosol-cloud-radiation interactions in a cost-efficient way for numerical weather prediction (NWP) applications. Primary aerosol emissions are included and smoke emissions are constrained using an inverse modeling technique and satellite-based AOD observations. Simulations turning on and off fire emissions reveal smoke presence in all tornado outbreaks being studied and show an increase in aerosol number concentrations due to smoke. However, the likelihood of occurrence and intensification of tornadoes is higher due to smoke only in cases where cloud droplet number concentration in low level clouds increases considerably in a way that modifies the environmental conditions where the tornadoes are formed (shallower cloud bases and higher low-level wind shear). Smoke absorption and vertical extent also play a role, with smoke absorption at cloud-level tending to burn-off clouds and smoke absorption above clouds resulting in an increased capping inversion. Comparing these and WRF-Chem simulations configured with a more complex representation of aerosol size and composition and different optical properties, microphysics and activation schemes, we find similarities in terms of the simulated aerosol optical depths and aerosol impacts on near-storm environments. This provides reliability on the aerosol-aware microphysics scheme as a less computationally expensive alternative to WRF-Chem for its use in applications such as NWP and cloud-resolving simulations. PMID:29619287

  20. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... purposes of determining inedible kernels, pieces, or particles of almond kernels. [59 FR 39419, Aug. 3...

  1. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... purposes of determining inedible kernels, pieces, or particles of almond kernels. [59 FR 39419, Aug. 3...

  2. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... purposes of determining inedible kernels, pieces, or particles of almond kernels. [59 FR 39419, Aug. 3...

  3. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as... purposes of determining inedible kernels, pieces, or particles of almond kernels. [59 FR 39419, Aug. 3...

  4. UNICOS Kernel Internals Application Development

    NASA Technical Reports Server (NTRS)

    Caredo, Nicholas; Craw, James M. (Technical Monitor)

    1995-01-01

    An understanding of UNICOS kernel internals is valuable, but the knowledge is only half the value; the other half comes from knowing how to apply it to the development of tools. The kernel contains vast amounts of useful information. This paper discusses the intricacies of developing utilities that draw on kernel information, including the algorithms, logic, and code needed to access it. Code segments demonstrate how to locate and read kernel structures, and the types of applications that can make use of kernel information are also discussed.

  5. Modeling natural emissions in the Community Multiscale Air Quality (CMAQ) Model-I: building an emissions data base

    NASA Astrophysics Data System (ADS)

    Smith, S. N.; Mueller, S. F.

    2010-05-01

    A natural emissions inventory for the continental United States and surrounding territories is needed in order to use the US Environmental Protection Agency Community Multiscale Air Quality (CMAQ) Model for simulating natural air quality. The CMAQ air modeling system (including the Sparse Matrix Operator Kernel Emissions (SMOKE) emissions processing system) currently estimates non-methane volatile organic compound (NMVOC) emissions from biogenic sources, nitrogen oxide (NOx) emissions from soils, ammonia from animals, several types of particulate and reactive gas emissions from fires, as well as sea salt emissions. However, there are several emission categories that are not commonly treated by the standard CMAQ Model system. Most notable among these are nitrogen oxide emissions from lightning, reduced sulfur emissions from oceans, geothermal features and other continental sources, windblown dust particulate, and reactive chlorine gas emissions linked with sea salt chloride. A review of past emissions modeling work and existing global emissions data bases provides information and data necessary for preparing a more complete natural emissions data base for CMAQ applications. A model-ready natural emissions data base is developed to complement the anthropogenic emissions inventory used by the VISTAS Regional Planning Organization in its work analyzing regional haze based on the year 2002. This new data base covers a modeling domain that includes the continental United States plus large portions of Canada, Mexico and surrounding oceans. Comparing July 2002 source data reveals that natural emissions account for 16% of total gaseous sulfur (sulfur dioxide, dimethylsulfide and hydrogen sulfide), 44% of total NOx, 80% of reactive carbonaceous gases (NMVOCs and carbon monoxide), 28% of ammonia, 96% of total chlorine (hydrochloric acid, nitryl chloride and sea salt chloride), and 84% of fine particles (i.e., those smaller than 2.5 μm in size) released into the atmosphere

  6. Protein Subcellular Localization with Gaussian Kernel Discriminant Analysis and Its Kernel Parameter Selection.

    PubMed

    Wang, Shunfang; Nie, Bing; Yue, Kun; Fei, Yu; Li, Wenjia; Xu, Dongshu

    2017-12-15

    Kernel discriminant analysis (KDA) is a dimension-reduction and classification algorithm based on the nonlinear kernel trick, which can be applied in a novel way to high-dimensional and complex biological data before classification tasks such as protein subcellular localization. Kernel parameters have a great impact on the performance of the KDA model. Specifically, for KDA with the popular Gaussian kernel, selecting the scale parameter remains a challenging problem. This paper therefore introduces the KDA method and proposes a new approach to Gaussian kernel parameter selection, based on the observation that, for suitable kernel parameters, the differences between the reconstruction errors of edge normal samples and those of interior normal samples should be maximized. Experiments with various standard protein subcellular localization data sets show that the overall accuracy of protein classification with KDA is much higher than without it. The kernel parameter also strongly affects efficiency, and the proposed method produces an optimum parameter, allowing the new algorithm to perform as effectively as traditional ones while reducing computational time and thus improving efficiency.
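
    A minimal way to see why the Gaussian scale parameter matters is to sweep it and watch cross-validated accuracy change. The sketch below does exactly that, with a support vector machine standing in for KDA (scikit-learn has no KDA implementation) and the iris data standing in for protein localization features; it does not implement the paper's reconstruction-error criterion for edge versus interior samples.

        from sklearn.datasets import load_iris
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        X, y = load_iris(return_X_y=True)              # stand-in for localization features
        for sigma in [0.1, 0.5, 1.0, 5.0, 20.0]:
            gamma = 1.0 / (2.0 * sigma ** 2)           # k(x, z) = exp(-||x - z||^2 / (2 sigma^2))
            acc = cross_val_score(SVC(kernel="rbf", gamma=gamma), X, y, cv=5).mean()
            print(f"sigma = {sigma:5.1f}   CV accuracy = {acc:.3f}")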

  7. 7 CFR 981.7 - Edible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Edible kernel. 981.7 Section 981.7 Agriculture... Regulating Handling Definitions § 981.7 Edible kernel. Edible kernel means a kernel, piece, or particle of almond kernel that is not inedible. [41 FR 26852, June 30, 1976] ...

  8. Modeling natural emissions in the Community Multiscale Air Quality (CMAQ) model - Part 1: Building an emissions data base

    NASA Astrophysics Data System (ADS)

    Smith, S. N.; Mueller, S. F.

    2010-01-01

    A natural emissions inventory for the continental United States and surrounding territories is needed in order to use the US Environmental Protection Agency Community Multiscale Air Quality (CMAQ) Model for simulating natural air quality. The CMAQ air modeling system (including the Sparse Matrix Operator Kernel Emissions (SMOKE) emissions processing system) currently estimates volatile organic compound (VOC) emissions from biogenic sources, nitrogen oxide (NOx) emissions from soils, ammonia from animals, several types of particulate and reactive gas emissions from fires, as well as windblown dust and sea salt emissions. However, there are several emission categories that are not commonly treated by the standard CMAQ Model system. Most notable among these are nitrogen oxide emissions from lightning, reduced sulfur emissions from oceans, geothermal features and other continental sources, and reactive chlorine gas emissions linked with sea salt chloride. A review of past emissions modeling work and existing global emissions data bases provides information and data necessary for preparing a more complete natural emissions data base for CMAQ applications. A model-ready natural emissions data base is developed to complement the anthropogenic emissions inventory used by the VISTAS Regional Planning Organization in its work analyzing regional haze based on the year 2002. This new data base covers a modeling domain that includes the continental United States plus large portions of Canada, Mexico and surrounding oceans. Comparing July 2002 source data reveals that natural emissions account for 16% of total gaseous sulfur (sulfur dioxide, dimethylsulfide and hydrogen sulfide), 44% of total NOx, 80% of reactive carbonaceous gases (VOCs and carbon monoxide), 28% of ammonia, 96% of total chlorine (hydrochloric acid, nitryl chloride and sea salt chloride), and 84% of fine particles (i.e., those smaller than 2.5 μm in size) released into the atmosphere. The seasonality and

  9. Unconventional protein sources: apricot seed kernels.

    PubMed

    Gabrial, G N; El-Nahry, F I; Awadalla, M Z; Girgis, S M

    1981-09-01

    Hamawy apricot seed kernels (sweet), Amar apricot seed kernels (bitter) and treated Amar apricot kernels (bitterness removed) were evaluated biochemically. All kernels were found to be high in fat (42.2-50.91%), protein (23.74-25.70%) and fiber (15.08-18.02%). Phosphorus, calcium, and iron were determined in all experimental samples. The three different apricot seed kernels were used for an extensive study including the qualitative determination of the amino acid constituents by acid hydrolysis, quantitative determination of some amino acids, and biological evaluation of the kernel proteins in order to use them as new protein sources. Weanling albino rats failed to grow on diets containing the Amar apricot seed kernels due to low food consumption because of their bitterness, although there was no loss in weight in that case. The Protein Efficiency Ratio data and blood analysis results showed the Hamawy apricot seed kernels to be higher in biological value than the treated apricot seed kernels. The Net Protein Ratio data, which account for both weight maintenance and growth, showed the treated apricot seed kernels to be higher in biological value than both Hamawy and Amar kernels. The Net Protein Ratios for the last two kernels were nearly equal.

  10. Quantitative Evaluation of MODIS Fire Radiative Power Measurement for Global Smoke Emissions Assessment

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Ellison, Luke

    2011-01-01

    Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has steadily gained increasing recognition as an important parameter for facilitating the development of various scientific studies and applications relating to the quantitative characterization of biomass burning and their emissions. To establish the scientific integrity of the FRP as a stable quantity that can be measured consistently across a variety of sensors and platforms, with the potential of being utilized to develop a unified long-term climate data record of fire activity and impacts, it needs to be thoroughly evaluated, calibrated, and validated. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to evaluate the uncertainties associated with them, such as those due to the effects of satellite variable observation geometry and other factors, in order to establish their error budget for use in diverse scientific research and applications. In this presentation, we will show recent results of the MODIS FRP uncertainty analysis and error mitigation solutions, and demonstrate
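
    The FRP-to-emissions relationship mentioned above is commonly applied by integrating FRP over time to obtain fire radiative energy (FRE), multiplying by a biomass-consumption coefficient (values around 0.37 kg of dry fuel per MJ are widely cited), and then applying a per-species emission factor. The numbers and emission factor below are illustrative placeholders, not values from this presentation.

        import numpy as np

        frp_mw  = np.array([120.0, 310.0, 450.0, 280.0, 90.0])   # FRP samples, MW (hypothetical)
        t_sec   = np.array([0.0, 1.5, 3.0, 4.5, 6.0]) * 3600.0   # observation times, s

        # FRE = time integral of FRP (trapezoidal rule); MW * s = MJ
        fre_mj  = np.sum(0.5 * (frp_mw[1:] + frp_mw[:-1]) * np.diff(t_sec))
        fuel_kg = 0.368 * fre_mj          # assumed consumption coefficient, kg dry fuel per MJ
        ef_pm25 = 0.012                   # assumed emission factor, kg PM2.5 per kg fuel
        print(f"FRE = {fre_mj:.3e} MJ, fuel = {fuel_kg:.3e} kg, PM2.5 = {fuel_kg * ef_pm25:.3e} kg")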

  11. 7 CFR 981.8 - Inedible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Inedible kernel. 981.8 Section 981.8 Agriculture... Regulating Handling Definitions § 981.8 Inedible kernel. Inedible kernel means a kernel, piece, or particle of almond kernel with any defect scored as serious damage, or damage due to mold, gum, shrivel, or...

  12. 7 CFR 51.1415 - Inedible kernels.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Inedible kernels. 51.1415 Section 51.1415 Agriculture... Standards for Grades of Pecans in the Shell 1 Definitions § 51.1415 Inedible kernels. Inedible kernels means that the kernel or pieces of kernels are rancid, moldy, decayed, injured by insects or otherwise...

  13. First look at smoke emissions from prescribed burns in long-unburned longleaf pine forests

    Treesearch

    Sheryl K. Akagi; Robert J. Yokelson; Ian R. Burling; David R. Weise; James Reardon; Shawn Urbanski; Timothy J. Johnson

    2014-01-01

    While fire has long played a role in the longleaf pine ecosystem, there are still some stands in the southeastern United States where fire has not been reintroduced and fuels have accumulated for 50 years or more. As part of a larger study examining fuel loading and smoke emissions on Department of Defense installations in the southeastern U.S., fuels and trace...

  14. Effect of low-density polyethylene on smoke emissions from burning of simulated debris piles

    Treesearch

    Seyedehsan Hosseini; Qi Li; Manish Shrivastava; David R. Weise; David R. Cocker; J. Wayne Miller; Heejung S Jung

    2014-01-01

    Low-density polyethylene (LDPE) plastic is used to keep piled debris from silvicultural activities—activities associated with development and care of forests—dry to enable efficient disposal by burning. The effects of inclusion of LDPE in this manner on smoke emissions are not well known. In a combustion laboratory experiment, 2-kg mixtures of LDPE and manzanita (

  15. 7 CFR 981.408 - Inedible kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Inedible kernel. 981.408 Section 981.408 Agriculture... Administrative Rules and Regulations § 981.408 Inedible kernel. Pursuant to § 981.8, the definition of inedible kernel is modified to mean a kernel, piece, or particle of almond kernel with any defect scored as...

  16. Emission Studies in CI Engine using LPG and Palm Kernel Methyl Ester as Fuels and Di-ethyl Ether as an Additive

    NASA Astrophysics Data System (ADS)

    Dora, Nagaraju; Jothi, T. J. Sarvoththama

    2018-05-01

    The present study investigates the effectiveness of using di-ethyl ether (DEE) as the fuel additive in engine performance and emissions. Experiments are carried out in a single cylinder four stroke diesel engine at constant speed. Two different fuels namely liquefied petroleum gas (LPG) and palm kernel methyl ester (PKME) are used as primary fuels with DEE as the fuel additive. LPG flow rates of 0.6 and 0.8 kg/h are considered, and flow rate of DEE is varied to maintain the constant engine speed. In case of PKME fuel, it is blended with diesel in the latter to the former ratio of 80:20, and DEE is varied in the volumetric proportion of 1 and 2%. Results indicate that for the engine operating in LPG-DEE mode at 0.6 kg/h of LPG, the brake thermal efficiency is lowered by 26%; however, NOx is subsequently reduced by around 30% compared to the engine running with only diesel fuel at 70% load. Similarly, results of PKME blended fuel showed a drastic reduction in the NOx and CO emissions. In these two modes of operation, DEE is observed to be significant fuel additive regarding emissions reduction.

  17. LZW-Kernel: fast kernel utilizing variable length code blocks from LZW compressors for protein sequence classification.

    PubMed

    Filatov, Gleb; Bauwens, Bruno; Kertész-Farkas, Attila

    2018-05-07

    Bioinformatics studies often rely on similarity measures between sequence pairs, which often pose a bottleneck in large-scale sequence analysis. Here, we present a new convolutional kernel function for protein sequences called the LZW-Kernel. It is based on code words identified with the Lempel-Ziv-Welch (LZW) universal text compressor. The LZW-Kernel is an alignment-free method; it is always symmetric, positive, gives 1.0 for self-similarity, and can be used directly with Support Vector Machines (SVMs) in classification problems, in contrast to the normalized compression distance (NCD), which often violates the distance metric properties in practice and requires further techniques to be used with SVMs. The LZW-Kernel is a one-pass algorithm, which makes it particularly suitable for big data applications. Our experimental studies on remote protein homology detection and protein classification tasks reveal that the LZW-Kernel closely approaches the performance of the Local Alignment Kernel (LAK) and the SVM-pairwise method combined with Smith-Waterman (SW) scoring at a fraction of the time. Moreover, the LZW-Kernel outperforms the SVM-pairwise method when combined with BLAST scores, which indicates that the LZW code words might be a better basis for similarity measures than local alignment approximations found with BLAST. In addition, the LZW-Kernel outperforms n-gram based mismatch kernels, hidden Markov model based SAM and Fisher kernel, and protein family based PSI-BLAST, among others. Further advantages include the LZW-Kernel's reliance on a simple idea, its ease of implementation, and its high speed, three times faster than BLAST and several orders of magnitude faster than SW or LAK in our tests. LZW-Kernel is implemented as standalone C code and is a free open-source program distributed under the GPLv3 license; it can be downloaded from https://github.com/kfattila/LZW-Kernel. akerteszfarkas@hse.ru. Supplementary data are available at Bioinformatics Online.
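
    A toy sketch of the underlying idea follows: parse each sequence with the LZW dictionary-building pass, collect the code words (phrases) it creates, and score similarity from shared code words. This is only an illustration of the concept; it is not the LZW-Kernel formula defined in the paper, and the sequences are made up.

        def lzw_code_words(seq):
            """Return the set of phrases added to the LZW dictionary while parsing seq."""
            dictionary = set(seq)          # start from the single symbols present in seq
            phrases = set()
            w = ""
            for c in seq:
                wc = w + c
                if wc in dictionary:
                    w = wc
                else:
                    dictionary.add(wc)
                    phrases.add(wc)
                    w = c
            return phrases

        def lzw_similarity(s1, s2):
            """Jaccard similarity over LZW code words (1.0 for identical sequences)."""
            a, b = lzw_code_words(s1), lzw_code_words(s2)
            return len(a & b) / max(len(a | b), 1)

        print(lzw_similarity("MKVLAAGLLALA", "MKVLASGLLALA"))   # similar sequences -> higher score
        print(lzw_similarity("MKVLAAGLLALA", "GGGSSSGGGSSS"))   # unrelated sequences -> lower score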

  18. Implementing a combined polar-geostationary algorithm for smoke emissions estimation in near real time

    NASA Astrophysics Data System (ADS)

    Hyer, E. J.; Schmidt, C. C.; Hoffman, J.; Giglio, L.; Peterson, D. A.

    2013-12-01

    Polar and geostationary satellites are used operationally for fire detection and smoke source estimation by many near-real-time operational users, including operational forecast centers around the globe. The input satellite radiance data are processed by data providers to produce Level-2 and Level-3 fire detection products, but processing these data into spatially and temporally consistent estimates of fire activity requires a substantial amount of additional processing. The most significant processing steps are correction for variable coverage of the satellite observations, and correction for conditions that affect the detection efficiency of the satellite sensors. We describe a system developed by the Naval Research Laboratory (NRL) that uses the full raster information from the entire constellation to diagnose detection opportunities, calculate corrections for factors such as angular dependence of detection efficiency, and generate global estimates of fire activity at spatial and temporal scales suitable for atmospheric modeling. By incorporating these improved fire observations, smoke emissions products, such as NRL's FLAMBE, are able to produce improved estimates of global emissions. This talk provides an overview of the system, demonstrates the achievable improvement over older methods, and describes challenges for near-real-time implementation.
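
    The kind of correction described, adjusting raw fire counts for partial coverage and angle-dependent detection efficiency, can be sketched as below. The efficiency curve, its parameters and the sample numbers are placeholders and do not represent NRL's actual algorithm.

        import numpy as np

        def detection_efficiency(view_zenith_deg, e0=0.9, k=1.5):
            # assumed efficiency that drops as pixels grow toward the scan edge
            return e0 * np.cos(np.radians(view_zenith_deg)) ** k

        raw_detections    = np.array([12, 3, 7])        # fire pixels counted per grid cell
        observed_fraction = np.array([1.0, 0.6, 0.8])   # fraction of each cell actually imaged
        view_zenith       = np.array([10.0, 55.0, 40.0])

        corrected = raw_detections / (observed_fraction * detection_efficiency(view_zenith))
        print(corrected)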

  19. First Look at Smoke Emissions from Prescribed Burns in Long-unburned Longleaf Pine Forests

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akagi, Sheryl; Yokelson, Robert J.; Burling, Ian R.

    While fire has long played a role in the longleaf pine ecosystem, there are still some stands in the southeastern United States where fire has not been reintroduced and fuels have accumulated for 50 years or more. As part of a larger study examining fuel loading and smoke emissions on Department of Defense installations in the southeastern U.S., fuels and trace emissions were measured during three prescribed burns at Ft. Jackson Army Base near Columbia, South Carolina in November 2011. These pine-forest understory fires provided valuable emissions data for fires that burned in stands that had little or no exposure to fire for decades. Smoke emissions were measured on the ground and from an aircraft by scientists from a large team of atmospheric researchers (Akagi et al., 2013). To characterize initial emissions in the lofted plume and in point sources of residual smoldering combustion, trace-gas species were measured using an airborne FTIR and a ground-based FTIR, respectively. Whole-air sampling canisters were also collected from both ground- and airborne-based platforms. A total of 97 trace gases were quantified in this work, largely via infrared spectroscopy. Selected emissions data were compared with similar data collected from prescribed burns sampled in coastal North Carolina in 2010 in younger fuel beds of loblolly/longleaf stands near Camp Lejeune (Burling et al., 2011). The emission factors measured in this work differ by ~13-195% from the EFs measured for the managed stands at Camp Lejeune for organic and N-containing species, suggesting that fire emissions in similar ecosystems can exhibit large variability. Part of the differences, however, may be ascribed to burn conditions as well since the NC burns were during the wet season whereas the SC stands were burned after an extended drought. We also report the first detailed FTIR emissions data for a suite of monoterpenes. Figure 1 displays the emission factors (g/kg fuel) for several monoterpenes and

  20. Partial Deconvolution with Inaccurate Blur Kernel.

    PubMed

    Ren, Dongwei; Zuo, Wangmeng; Zhang, David; Xu, Jun; Zhang, Lei

    2017-10-17

    Most non-blind deconvolution methods are developed under the error-free kernel assumption, and are not robust to inaccurate blur kernel. Unfortunately, despite the great progress in blind deconvolution, estimation error remains inevitable during blur kernel estimation. Consequently, severe artifacts such as ringing effects and distortions are likely to be introduced in the non-blind deconvolution stage. In this paper, we tackle this issue by suggesting: (i) a partial map in the Fourier domain for modeling kernel estimation error, and (ii) a partial deconvolution model for robust deblurring with inaccurate blur kernel. The partial map is constructed by detecting the reliable Fourier entries of estimated blur kernel. And partial deconvolution is applied to wavelet-based and learning-based models to suppress the adverse effect of kernel estimation error. Furthermore, an E-M algorithm is developed for estimating the partial map and recovering the latent sharp image alternatively. Experimental results show that our partial deconvolution model is effective in relieving artifacts caused by inaccurate blur kernel, and can achieve favorable deblurring quality on synthetic and real blurry images.
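
    A toy frequency-domain illustration of the "partial" idea follows: invert the blur only on Fourier entries where the (possibly inaccurate) estimated kernel is judged reliable, and keep the blurry data elsewhere. The reliability threshold is ad hoc, and the paper's partial-map estimation and E-M algorithm are not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))                        # stand-in "sharp" image

        def gaussian_psf(shape, sigma):
            y, x = np.indices(shape)
            cy, cx = (shape[0] - 1) / 2, (shape[1] - 1) / 2
            k = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2 * sigma ** 2))
            return np.fft.ifftshift(k / k.sum())          # centre the PSF at the origin

        H_true = np.fft.fft2(gaussian_psf(img.shape, 2.0))
        H_est  = np.fft.fft2(gaussian_psf(img.shape, 2.3))   # inaccurate kernel estimate

        blurry = np.real(np.fft.ifft2(np.fft.fft2(img) * H_true))

        B = np.fft.fft2(blurry)
        reliable = np.abs(H_est) > 0.05                   # crude "partial map" of trusted entries
        F = np.where(reliable, B / np.where(reliable, H_est, 1.0), B)
        restored = np.real(np.fft.ifft2(F))
        print("RMSE blurry:", np.sqrt(np.mean((blurry - img) ** 2)),
              "RMSE restored:", np.sqrt(np.mean((restored - img) ** 2)))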

  1. 7 CFR 981.9 - Kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Kernel weight. 981.9 Section 981.9 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Marketing Agreements... Regulating Handling Definitions § 981.9 Kernel weight. Kernel weight means the weight of kernels, including...

  2. On- and off-axis spectral emission features from laser-produced gas breakdown plasmas

    NASA Astrophysics Data System (ADS)

    Harilal, S. S.; Skrodzki, P. J.; Miloshevsky, A.; Brumfield, B. E.; Phillips, M. C.; Miloshevsky, G.

    2017-06-01

    Laser-heated gas breakdown plasmas or sparks emit profoundly in the ultraviolet and visible region of the electromagnetic spectrum with contributions from ionic, atomic, and molecular species. Laser created kernels expand into a cold ambient with high velocities during their early lifetime followed by confinement of the plasma kernel and eventually collapse. However, the plasma kernels produced during laser breakdown of gases are also capable of exciting and ionizing the surrounding ambient medium. Two mechanisms can be responsible for excitation and ionization of the surrounding ambient: photoexcitation and ionization by intense ultraviolet emission from the sparks produced during the early times of their creation and/or heating by strong shocks generated by the kernel during its expansion into the ambient. In this study, an investigation is made on the spectral features of on- and off-axis emission of laser-induced plasma breakdown kernels generated in atmospheric pressure conditions with an aim to elucidate the mechanisms leading to ambient excitation and emission. Pulses from an Nd:YAG laser emitting at 1064 nm with a pulse duration of 6 ns are used to generate plasma kernels. Laser sparks were generated in air, argon, and helium gases to provide different physical properties of expansion dynamics and plasma chemistry considering the differences in laser absorption properties, mass density, and speciation. Point shadowgraphy and time-resolved imaging were used to evaluate the shock wave and spark self-emission morphology at early and late times, while space and time resolved spectroscopy is used for evaluating the emission features and for inferring plasma physical conditions at on- and off-axis positions. The structure and dynamics of the plasma kernel obtained using imaging techniques are also compared to numerical simulations using the computational fluid dynamics code. The emission from the kernel showed that spectral features from ions, atoms, and

  3. On- and off-axis spectral emission features from laser-produced gas breakdown plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harilal, S. S.; Skrodzki, P. J.; Miloshevsky, A.

    Laser-heated gas breakdown plasmas or sparks emit profoundly in the ultraviolet and visible region of the electromagnetic spectrum with contributions from ionic, atomic, and molecular species. Laser created kernels expand into a cold ambient with high velocities during their early lifetime followed by confinement of the plasma kernel and eventually collapse. However, the plasma kernels produced during laser breakdown of gases are also capable of exciting and ionizing the surrounding ambient medium. Two mechanisms can be responsible for excitation and ionization of the surrounding ambient: photoexcitation and ionization by intense ultraviolet emission from the sparks produced during the early times of their creation and/or heating by strong shocks generated by the kernel during its expansion into the ambient. In this study, an investigation is made on the spectral features of on- and off-axis emission of laser-induced plasma breakdown kernels generated in atmospheric pressure conditions with an aim to elucidate the mechanisms leading to ambient excitation and emission. Pulses from an Nd:YAG laser emitting at 1064 nm with a pulse duration of 6 ns are used to generate plasma kernels. Laser sparks were generated in air, argon, and helium gases to provide different physical properties of expansion dynamics and plasma chemistry considering the differences in laser absorption properties, mass density, and speciation. Point shadowgraphy and time-resolved imaging were used to evaluate the shock wave and spark self-emission morphology at early and late times, while space and time resolved spectroscopy is used for evaluating the emission features and for inferring plasma fundamentals at on- and off-axis positions. The structure and dynamics of the plasma kernel obtained using imaging techniques are also compared to numerical simulations using a computational fluid dynamics code. The emission from the kernel showed that spectral features from ions, atoms and molecules are

  4. On- and off-axis spectral emission features from laser-produced gas breakdown plasmas

    DOE PAGES

    Harilal, S. S.; Skrodzki, P. J.; Miloshevsky, A.; ...

    2017-06-01

    Laser-heated gas breakdown plasmas or sparks emit profoundly in the ultraviolet and visible region of the electromagnetic spectrum with contributions from ionic, atomic, and molecular species. Laser created kernels expand into a cold ambient with high velocities during their early lifetime followed by confinement of the plasma kernel and eventually collapse. However, the plasma kernels produced during laser breakdown of gases are also capable of exciting and ionizing the surrounding ambient medium. Two mechanisms can be responsible for excitation and ionization of the surrounding ambient: photoexcitation and ionization by intense ultraviolet emission from the sparks produced during the early times of their creation and/or heating by strong shocks generated by the kernel during its expansion into the ambient. In this study, an investigation is made on the spectral features of on- and off-axis emission of laser-induced plasma breakdown kernels generated in atmospheric pressure conditions with an aim to elucidate the mechanisms leading to ambient excitation and emission. Pulses from an Nd:YAG laser emitting at 1064 nm with a pulse duration of 6 ns are used to generate plasma kernels. Laser sparks were generated in air, argon, and helium gases to provide different physical properties of expansion dynamics and plasma chemistry considering the differences in laser absorption properties, mass density, and speciation. Point shadowgraphy and time-resolved imaging were used to evaluate the shock wave and spark self-emission morphology at early and late times, while space and time resolved spectroscopy is used for evaluating the emission features and for inferring plasma fundamentals at on- and off-axis positions. The structure and dynamics of the plasma kernel obtained using imaging techniques are also compared to numerical simulations using a computational fluid dynamics code. The emission from the kernel showed that spectral features from ions, atoms and molecules are

  5. 7 CFR 51.2295 - Half kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Half kernel. 51.2295 Section 51.2295 Agriculture... Standards for Shelled English Walnuts (Juglans Regia) Definitions § 51.2295 Half kernel. Half kernel means the separated half of a kernel with not more than one-eighth broken off. ...

  6. Oecophylla longinoda (Hymenoptera: Formicidae) Lead to Increased Cashew Kernel Size and Kernel Quality.

    PubMed

    Anato, F M; Sinzogan, A A C; Offenberg, J; Adandonon, A; Wargui, R B; Deguenon, J M; Ayelo, P M; Vayssières, J-F; Kossou, D K

    2017-06-01

    Weaver ants, Oecophylla spp., are known to positively affect cashew, Anacardium occidentale L., raw nut yield, but their effects on the kernels have not been reported. We compared nut size and the proportion of marketable kernels between raw nuts collected from trees with and without ants. Raw nuts collected from trees with weaver ants were 2.9% larger than nuts from control trees (i.e., without weaver ants), leading to 14% higher proportion of marketable kernels. On trees with ants, the kernel: raw nut ratio from nuts damaged by formic acid was 4.8% lower compared with nondamaged nuts from the same trees. Weaver ants provided three benefits to cashew production by increasing yields, yielding larger nuts, and by producing greater proportions of marketable kernel mass. © The Authors 2017. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  7. Kernel abortion in maize : I. Carbohydrate concentration patterns and Acid invertase activity of maize kernels induced to abort in vitro.

    PubMed

    Hanft, J M; Jones, R J

    1986-06-01

    Kernels cultured in vitro were induced to abort by high temperature (35 degrees C) and by culturing six kernels/cob piece. Aborting kernels failed to enter a linear phase of dry mass accumulation and had a final mass that was less than 6% of nonaborting field-grown kernels. Kernels induced to abort by high temperature failed to synthesize starch in the endosperm and had elevated sucrose concentrations and low fructose and glucose concentrations in the pedicel during early growth compared to nonaborting kernels. Kernels induced to abort by high temperature also had much lower pedicel soluble acid invertase activities than did nonaborting kernels. These results suggest that high temperature during the lag phase of kernel growth may impair the process of sucrose unloading in the pedicel by indirectly inhibiting soluble acid invertase activity and prevent starch synthesis in the endosperm. Kernels induced to abort by culturing six kernels/cob piece had reduced pedicel fructose, glucose, and sucrose concentrations compared to kernels from field-grown ears. These aborting kernels also had a lower pedicel soluble acid invertase activity compared to nonaborting kernels from the same cob piece and from field-grown ears. The low invertase activity in pedicel tissue of the aborting kernels was probably caused by a lack of substrate (sucrose) for the invertase to cleave due to the intense competition for available assimilates. In contrast to kernels cultured at 35 degrees C, aborting kernels from cob pieces containing all six kernels accumulated starch in a linear fashion. These results indicate that kernels cultured six/cob piece abort because of an inadequate supply of sugar and are similar to apical kernels from field-grown ears that often abort prior to the onset of linear growth.

  8. An Approximate Approach to Automatic Kernel Selection.

    PubMed

    Ding, Lizhong; Liao, Shizhong

    2016-02-02

    Kernel selection is a fundamental problem of kernel-based learning algorithms. In this paper, we propose an approximate approach to automatic kernel selection for regression from the perspective of kernel matrix approximation. We first introduce multilevel circulant matrices into automatic kernel selection, and develop two approximate kernel selection algorithms by exploiting the computational virtues of multilevel circulant matrices. The complexity of the proposed algorithms is quasi-linear in the number of data points. Then, we prove an approximation error bound to measure the effect of the approximation in kernel matrices by multilevel circulant matrices on the hypothesis and further show that the approximate hypothesis produced with multilevel circulant matrices converges to the accurate hypothesis produced with kernel matrices. Experimental evaluations on benchmark datasets demonstrate the effectiveness of approximate kernel selection.
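
    The computational appeal of circulant structure can be shown with a one-level example: a circulant matrix is diagonalized by the DFT, so a regularized kernel solve costs O(n log n) rather than O(n^3). The sketch below wraps the first row of a 1-D RBF kernel matrix into a circulant approximation and compares the two solves; the paper's multilevel construction and its selection criterion are not reproduced, and all values are illustrative.

        import numpy as np

        n, h, sigma, lam = 256, 0.05, 0.4, 1e-2
        x = np.arange(n) * h
        K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma ** 2))   # Toeplitz RBF kernel matrix

        # Circulant approximation: wrap the first row periodically.
        d = np.minimum(np.arange(n), n - np.arange(n)) * h
        c = np.exp(-d ** 2 / (2 * sigma ** 2))

        y = np.sin(x)                                                    # toy regression targets
        alpha_exact = np.linalg.solve(K + lam * np.eye(n), y)            # O(n^3) dense solve
        alpha_circ  = np.real(np.fft.ifft(np.fft.fft(y) / (np.fft.fft(c) + lam)))  # O(n log n)

        rel = np.linalg.norm(alpha_exact - alpha_circ) / np.linalg.norm(alpha_exact)
        print("relative difference between exact and circulant solves:", rel)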

  9. Viscozyme L pretreatment on palm kernels improved the aroma of palm kernel oil after kernel roasting.

    PubMed

    Zhang, Wencan; Leong, Siew Mun; Zhao, Feifei; Zhao, Fangju; Yang, Tiankui; Liu, Shaoquan

    2018-05-01

    With an interest to enhance the aroma of palm kernel oil (PKO), Viscozyme L, an enzyme complex containing a wide range of carbohydrases, was applied to alter the carbohydrates in palm kernels (PK) to modulate the formation of volatiles upon kernel roasting. After Viscozyme treatment, the content of simple sugars and free amino acids in PK increased by 4.4-fold and 4.5-fold, respectively. After kernel roasting and oil extraction, significantly more 2,5-dimethylfuran, 2-[(methylthio)methyl]-furan, 1-(2-furanyl)-ethanone, 1-(2-furyl)-2-propanone, 5-methyl-2-furancarboxaldehyde and 2-acetyl-5-methylfuran but less 2-furanmethanol and 2-furanmethanol acetate were found in treated PKO; the correlation between their formation and simple sugar profile was estimated by using partial least square regression (PLS1). Obvious differences in pyrroles and Strecker aldehydes were also found between the control and treated PKOs. Principal component analysis (PCA) clearly discriminated the treated PKOs from that of control PKOs on the basis of all volatile compounds. Such changes in volatiles translated into distinct sensory attributes, whereby treated PKO was more caramelic and burnt after aqueous extraction and more nutty, roasty, caramelic and smoky after solvent extraction. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. 7 CFR 51.1441 - Half-kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Half-kernel. 51.1441 Section 51.1441 Agriculture... Standards for Grades of Shelled Pecans Definitions § 51.1441 Half-kernel. Half-kernel means one of the separated halves of an entire pecan kernel with not more than one-eighth of its original volume missing...

  11. An introduction to kernel-based learning algorithms.

    PubMed

    Müller, K R; Mika, S; Rätsch, G; Tsuda, K; Schölkopf, B

    2001-01-01

    This paper provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis, as examples for successful kernel-based learning methods. We first give a short background about Vapnik-Chervonenkis theory and kernel feature spaces and then proceed to kernel based learning in supervised and unsupervised scenarios including practical and algorithmic considerations. We illustrate the usefulness of kernel algorithms by discussing applications such as optical character recognition and DNA analysis.
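
    Two of the methods surveyed above are available off the shelf in scikit-learn; the sketch below runs an RBF-kernel support vector machine and kernel PCA on the small digits set as a stand-in for the optical character recognition application mentioned in the abstract (kernel Fisher discriminant analysis is omitted for brevity, since scikit-learn does not ship it).

        from sklearn.datasets import load_digits
        from sklearn.decomposition import KernelPCA
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        X, y = load_digits(return_X_y=True)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)      # support vector machine
        print("SVM test accuracy:", svm.score(X_te, y_te))

        kpca = KernelPCA(n_components=2, kernel="rbf", gamma=1e-3)  # kernel principal components
        Z = kpca.fit_transform(X_tr)
        print("embedded shape:", Z.shape)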

  12. A new discriminative kernel from probabilistic models.

    PubMed

    Tsuda, Koji; Kawanabe, Motoaki; Rätsch, Gunnar; Sonnenburg, Sören; Müller, Klaus-Robert

    2002-10-01

    Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel, derived from tangent vectors of posterior log-odds. Furthermore, we develop a theoretical framework on feature extractors from probabilistic models and use it for analyzing the TOP kernel. In experiments, our new discriminative TOP kernel compares favorably to the Fisher kernel.
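
    For orientation, a toy Fisher kernel for a univariate Gaussian model is sketched below: the Fisher score is the gradient of the log-likelihood with respect to (mu, sigma), and the kernel is an inner product of score vectors (the Fisher information metric is dropped for simplicity). The TOP kernel proposed in the paper, built from tangent vectors of posterior log-odds, is not reproduced, and the fitted parameters are assumed.

        import numpy as np

        mu, sigma = 0.0, 1.0                      # generative model assumed fitted beforehand

        def fisher_score(x):
            d_mu    = (x - mu) / sigma ** 2                      # d/d(mu) log p(x | mu, sigma)
            d_sigma = ((x - mu) ** 2 - sigma ** 2) / sigma ** 3  # d/d(sigma) log p(x | mu, sigma)
            return np.array([d_mu, d_sigma])

        def fisher_kernel(x, z):
            return float(fisher_score(x) @ fisher_score(z))

        print(fisher_kernel(0.5, 0.7), fisher_kernel(0.5, -0.7))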

  13. Absorbed dose kernel and self-shielding calculations for a novel radiopaque glass microsphere for transarterial radioembolization.

    PubMed

    Church, Cody; Mawko, George; Archambault, John Paul; Lewandowski, Robert; Liu, David; Kehoe, Sharon; Boyd, Daniel; Abraham, Robert; Syme, Alasdair

    2018-02-01

    Radiopaque microspheres may provide intraprocedural and postprocedural feedback during transarterial radioembolization (TARE). Furthermore, the potential to use higher resolution x-ray imaging techniques as opposed to nuclear medicine imaging suggests that significant improvements in the accuracy and precision of radiation dosimetry calculations could be realized for this type of therapy. This study investigates the absorbed dose kernel for novel radiopaque microspheres including contributions of both short and long-lived contaminant radionuclides while concurrently quantifying the self-shielding of the glass network. Monte Carlo simulations using EGSnrc were performed to determine the dose kernels for all monoenergetic electron emissions and all beta spectra for radionuclides reported in a neutron activation study of the microspheres. Simulations were benchmarked against an accepted 90Y dose point kernel. Self-shielding was quantified for the microspheres by simulating an isotropically emitting, uniformly distributed source, in glass and in water. The ratio of the absorbed doses was scored as a function of distance from a microsphere. The absorbed dose kernel for the microspheres was calculated for (a) two bead formulations following (b) two different durations of neutron activation, at (c) various time points following activation. Self-shielding varies with time postremoval from the reactor. At early time points, it is less pronounced due to the higher energies of the emissions. It is on the order of 0.4-2.8% at a radial distance of 5.43 mm with increased size from 10 to 50 μm in diameter during the time that the microspheres would be administered to a patient. At long time points, self-shielding is more pronounced and can reach values in excess of 20% near the end of the range of the emissions. Absorbed dose kernels for 90Y, 90mY, 85mSr, 85Sr, 87mSr, 89Sr, 70Ga, 72Ga, and 31Si are presented and used to determine an overall kernel for the

  14. Kernel Abortion in Maize 1

    PubMed Central

    Hanft, Jonathan M.; Jones, Robert J.

    1986-01-01

    This study was designed to compare the uptake and distribution of 14C among fructose, glucose, sucrose, and starch in the cob, pedicel, and endosperm tissues of maize (Zea mays L.) kernels induced to abort by high temperature with those that develop normally. Kernels cultured in vitro at 30 and 35°C were transferred to [14C]sucrose media 10 days after pollination. Kernels cultured at 35°C aborted prior to the onset of linear dry matter accumulation. Significant uptake into the cob, pedicel, and endosperm of radioactivity associated with the soluble and starch fractions of the tissues was detected after 24 hours in culture on labeled media. After 8 days in culture on [14C]sucrose media, 48 and 40% of the radioactivity associated with the cob carbohydrates was found in the reducing sugars at 30 and 35°C, respectively. This indicates that some of the sucrose taken up by the cob tissue was cleaved to fructose and glucose in the cob. Of the total carbohydrates, a higher percentage of label was associated with sucrose and a lower percentage with fructose and glucose in pedicel tissue of kernels cultured at 35°C compared to kernels cultured at 30°C. These results indicate that sucrose was not cleaved to fructose and glucose as rapidly during the unloading process in the pedicel of kernels induced to abort by high temperature. Kernels cultured at 35°C had a much lower proportion of label associated with endosperm starch (29%) than did kernels cultured at 30°C (89%). Kernels cultured at 35°C had a correspondingly higher proportion of 14C in endosperm fructose, glucose, and sucrose. These results indicate that starch synthesis in the endosperm is strongly inhibited in kernels induced to abort by high temperature even though there is an adequate supply of sugar. PMID:16664847

  15. Local Observed-Score Kernel Equating

    ERIC Educational Resources Information Center

    Wiberg, Marie; van der Linden, Wim J.; von Davier, Alina A.

    2014-01-01

    Three local observed-score kernel equating methods that integrate methods from the local equating and kernel equating frameworks are proposed. The new methods were compared with their earlier counterparts with respect to such measures as bias--as defined by Lord's criterion of equity--and percent relative error. The local kernel item response…

  16. Credit scoring analysis using kernel discriminant

    NASA Astrophysics Data System (ADS)

    Widiharih, T.; Mukid, M. A.; Mustafid

    2018-05-01

    A credit scoring model is an important tool for reducing the risk of wrong decisions when granting credit facilities to applicants. This paper investigates the performance of the kernel discriminant model in assessing customer credit risk. Kernel discriminant analysis is a non-parametric method, which means that it does not require any assumptions about the probability distribution of the input. The main ingredient is a kernel that allows an efficient computation of the Fisher discriminant. We use several kernels, such as the normal, Epanechnikov, biweight, and triweight kernels. The models' accuracies were compared using data from a financial institution in Indonesia. The results show that kernel discriminant analysis can be an alternative method for determining who is eligible for a credit loan. For the data we use, the normal kernel is the appropriate choice for credit scoring with the kernel discriminant model. Sensitivity and specificity reach 0.5556 and 0.5488, respectively.
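
    One simple non-parametric discriminant of this flavour fits a kernel density estimate per class and assigns the class with the larger prior-weighted density; the sketch below does so on synthetic "credit score" data and reports sensitivity and specificity as in the abstract. The data, bandwidth and kernel choice are illustrative placeholders, not the study's (scikit-learn also offers an Epanechnikov kernel, but not biweight or triweight).

        import numpy as np
        from sklearn.neighbors import KernelDensity

        rng = np.random.default_rng(0)
        good = rng.normal(650.0, 60.0, size=(300, 1))   # synthetic scores of good payers
        bad  = rng.normal(520.0, 70.0, size=(120, 1))   # synthetic scores of defaulters
        X = np.vstack([good, bad])
        y = np.r_[np.ones(300), np.zeros(120)]

        kde_good = KernelDensity(kernel="gaussian", bandwidth=30.0).fit(good)
        kde_bad  = KernelDensity(kernel="gaussian", bandwidth=30.0).fit(bad)
        log_prior_good, log_prior_bad = np.log(300 / 420), np.log(120 / 420)

        pred = (kde_good.score_samples(X) + log_prior_good
                > kde_bad.score_samples(X) + log_prior_bad).astype(float)
        sensitivity = np.mean(pred[y == 1] == 1)   # eligible applicants correctly accepted
        specificity = np.mean(pred[y == 0] == 0)   # risky applicants correctly rejected
        print(f"sensitivity = {sensitivity:.4f}, specificity = {specificity:.4f}")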

  17. Analysis of the performance, emission and combustion characteristics of a turbocharged diesel engine fuelled with Jatropha curcas biodiesel-diesel blends using kernel-based extreme learning machine.

    PubMed

    Silitonga, Arridina Susan; Hassan, Masjuki Haji; Ong, Hwai Chyuan; Kusumo, Fitranto

    2017-11-01

    The purpose of this study is to investigate the performance, emission and combustion characteristics of a four-cylinder common-rail turbocharged diesel engine fuelled with Jatropha curcas biodiesel-diesel blends. A kernel-based extreme learning machine (KELM) model is developed in this study using MATLAB software in order to predict the performance, combustion and emission characteristics of the engine. To acquire the data for training and testing the KELM model, the engine speed was selected as the input parameter, whereas the performance, exhaust emissions and combustion characteristics were chosen as the output parameters of the KELM model. The performance, emissions and combustion characteristics predicted by the KELM model were validated by comparing the predicted data with the experimental data. The results show that the coefficient of determination of the parameters is within a range of 0.9805-0.9991 for both the KELM model and the experimental data. The mean absolute percentage error is within a range of 0.1259-2.3838. This study shows that KELM modelling is a useful technique in biodiesel production since it facilitates scientists and researchers to predict the performance, exhaust emissions and combustion characteristics of internal combustion engines with high accuracy.
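
    The KELM predictor itself has a compact closed form: with kernel matrix K over the training inputs, targets T and regularization constant C, the output weights are beta = (I/C + K)^(-1) T, and a new input x is predicted as k(x, X) beta. The sketch below applies this to a synthetic engine-speed/target pair; the data, kernel width and C are placeholders, not values from the study.

        import numpy as np

        def rbf(A, B, gamma=1e-6):
            d2 = (A[:, None, :] - B[None, :, :]) ** 2
            return np.exp(-gamma * d2.sum(axis=-1))

        rng = np.random.default_rng(1)
        speed  = rng.uniform(1000, 4000, size=(40, 1))                 # rpm (synthetic)
        target = 0.02 * speed[:, 0] + 5 * np.sin(speed[:, 0] / 400)    # toy performance curve

        C = 100.0
        K = rbf(speed, speed)
        beta = np.linalg.solve(np.eye(len(speed)) / C + K, target)     # output weights

        x_new = np.array([[2500.0]])
        print("prediction at 2500 rpm:", rbf(x_new, speed) @ beta)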

  18. Automatic detection of white-light flare kernels in SDO/HMI intensitygrams

    NASA Astrophysics Data System (ADS)

    Mravcová, Lucia; Švanda, Michal

    2017-11-01

    Solar flares with a broadband emission in the white-light range of the electromagnetic spectrum belong to most enigmatic phenomena on the Sun. The origin of the white-light emission is not entirely understood. We aim to systematically study the visible-light emission connected to solar flares in SDO/HMI observations. We developed a code for automatic detection of kernels of flares with HMI intensity brightenings and study properties of detected candidates. The code was tuned and tested and with a little effort, it could be applied to any suitable data set. By studying a few flare examples, we found indication that HMI intensity brightening might be an artefact of the simplified procedure used to compute HMI observables.

  19. Modeling adaptive kernels from probabilistic phylogenetic trees.

    PubMed

    Nicotra, Luca; Micheli, Alessio

    2009-01-01

    Modeling phylogenetic interactions is an open issue in many computational biology problems. In the context of gene function prediction we introduce a class of kernels for structured data leveraging on a hierarchical probabilistic modeling of phylogeny among species. We derive three kernels belonging to this setting: a sufficient statistics kernel, a Fisher kernel, and a probability product kernel. The new kernels are used in the context of support vector machine learning. The kernels adaptivity is obtained through the estimation of the parameters of a tree structured model of evolution using as observed data phylogenetic profiles encoding the presence or absence of specific genes in a set of fully sequenced genomes. We report results obtained in the prediction of the functional class of the proteins of the budding yeast Saccharomyces cerevisae which favorably compare to a standard vector based kernel and to a non-adaptive tree kernel function. A further comparative analysis is performed in order to assess the impact of the different components of the proposed approach. We show that the key features of the proposed kernels are the adaptivity to the input domain and the ability to deal with structured data interpreted through a graphical model representation.

  20. Nonlinear Deep Kernel Learning for Image Annotation.

    PubMed

    Jiu, Mingyuan; Sahbi, Hichem

    2017-02-08

    Multiple kernel learning (MKL) is a widely used technique for kernel design. Its principle consists in learning, for a given support vector classifier, the most suitable convex (or sparse) linear combination of standard elementary kernels. However, these combinations are shallow and often powerless to capture the actual similarity between highly semantic data, especially for challenging classification tasks such as image annotation. In this paper, we redefine multiple kernels using deep multi-layer networks. In this new contribution, a deep multiple kernel is recursively defined as a multi-layered combination of nonlinear activation functions, each of which involves a combination of several elementary or intermediate kernels, and results in a positive semi-definite deep kernel. We propose four different frameworks in order to learn the weights of these networks: supervised, unsupervised, kernel-based semi-supervised, and Laplacian-based semi-supervised. When plugged into support vector machines (SVMs), the resulting deep kernel networks show a clear gain, compared to several shallow kernels, for the task of image annotation. Extensive experiments and analysis on the challenging ImageCLEF photo annotation benchmark, the COREL5k database and the Banana dataset validate the effectiveness of the proposed method.
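
    A toy two-layer "deep kernel" can be built by combining elementary kernels with non-negative weights and applying an element-wise exponential, which keeps the matrix positive semi-definite (Hadamard powers and sums of PSD matrices are PSD). The sketch below fixes the weights by hand; the supervised, unsupervised and semi-supervised weight-learning schemes of the paper are not reproduced, and the data are random stand-ins for image descriptors.

        import numpy as np
        from sklearn.metrics.pairwise import polynomial_kernel, rbf_kernel

        rng = np.random.default_rng(0)
        X = rng.normal(size=(50, 8))                     # stand-in image descriptors

        K_rbf  = rbf_kernel(X, gamma=0.1)
        K_poly = polynomial_kernel(X, degree=2)

        layer1 = np.exp(0.6 * K_rbf + 0.4 * K_poly)      # first "deep" layer
        layer2 = np.exp(0.5 * layer1 / layer1.max())     # second layer (rescaled for stability)

        print("min eigenvalue:", np.linalg.eigvalsh(layer2).min())   # >= 0 up to round-off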

  1. Investigating fire emissions and smoke transport during the Summer of 2013 using an operational smoke modeling system and chemical transport model

    NASA Astrophysics Data System (ADS)

    ONeill, S. M.; Chung, S. H.; Wiedinmyer, C.; Larkin, N. K.; Martinez, M. E.; Solomon, R. C.; Rorig, M.

    2014-12-01

    Emissions from fires in the Western US are substantial and can impact air quality and regional climate. Many methods exist that estimate the particulate and gaseous emissions from fires, including those run operationally for use with chemical forecast models. The US Forest Service Smartfire2/BlueSky modeling framework uses satellite data and reported information about fire perimeters to estimate emissions of pollutants to the atmosphere. The emission estimates are used as inputs to dispersion models, such as HYSPLIT, and chemical transport models, such as CMAQ and WRF-Chem, to assess the chemical and physical impacts of fires on the atmosphere. Here we investigate the use of Smartfire2/BlueSky and WRF-Chem to simulate emissions from the summer 2013 fire season, with special focus on the Rim Fire in northern California. The 2013 Rim Fire ignited on August 17 and eventually burned more than 250,000 total acres before being contained on October 24. Large smoke plumes and pyro-convection events were observed. In this study, the Smartfire2/BlueSky operational emission estimates are compared to other estimation methods, such as the Fire INventory from NCAR (FINN) and other global databases, to quantify variations in emission estimation methods for this wildfire event. The impact of the emissions on downwind chemical composition is investigated with the coupled meteorology-chemistry WRF-Chem model. The inclusion of aerosol-cloud and aerosol-radiation interactions in the model framework enables the evaluation of the downwind impacts of the fire plume. The emissions and modeled chemistry can also be evaluated with data collected from the Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS) aircraft field campaign, which intersected the fire plume.

  2. Experimental Study of Effect of EGR Rates on NOx and Smoke Emission of LHR Diesel Engine Fueled with Blends of Diesel and Neem Biodiesel

    NASA Astrophysics Data System (ADS)

    Modi, Ashishkumar Jashvantlal; Gosai, Dipak Chimangiri; Solanki, Chandresh Maheshchandra

    2018-04-01

    Energy conservation and efficiency have long been the quest of engineers concerned with internal combustion engines. Theoretically, if the heat rejected could be reduced, the thermal efficiency would improve, at least up to the limit set by the second law of thermodynamics. In the current work, a ceramic-coated twin-cylinder water-cooled diesel engine using blends of diesel and Neem biodiesel as fuel was evaluated for its performance and exhaust emissions. The piston, the top surface of the cylinder head and the liners of the vertical, water-cooled, self-governed diesel engine were fully coated with partially stabilized zirconia as the ceramic material to approach an adiabatic condition. Previous studies have reported that combustion of Neem biodiesel emits higher NOx, while hydrocarbon and smoke emissions are lower than with conventional diesel fuel. Exhaust gas recirculation (EGR) is one of the techniques used to reduce NOx emissions from diesel engines because it decreases both the flame temperature and the oxygen concentration in the combustion chamber. The stationary diesel engine was run in the laboratory at a high load (85% of maximum load), fixed speed (2000 rpm) and EGR rates of 5-40% (in 5% increments). Measurements of fuel flow, exhaust temperature, exhaust emissions and exhaust smoke were carried out. The results indicate improved fuel economy and reduced pollution levels for the low heat rejection (LHR) engine. At 5% EGR with TB10, both NOx and smoke opacity were reduced by 26 and 15%, respectively. Furthermore, TB20 along with 10% EGR reduced NOx and smoke emissions by 34 and 30%, respectively, compared to diesel fuel without EGR.

  3. Graph Kernels for Molecular Similarity.

    PubMed

    Rupp, Matthias; Schneider, Gisbert

    2010-04-12

    Molecular similarity measures are important for many cheminformatics applications like ligand-based virtual screening and quantitative structure-property relationships. Graph kernels are formal similarity measures defined directly on graphs, such as the (annotated) molecular structure graph. Graph kernels are positive semi-definite functions, i.e., they correspond to inner products. This property makes them suitable for use with kernel-based machine learning algorithms such as support vector machines and Gaussian processes. We review the major types of kernels between graphs (based on random walks, subgraphs, and optimal assignments, respectively), and discuss their advantages, limitations, and successful applications in cheminformatics. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
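
    A minimal instance of the random-walk family is the geometric random-walk kernel k(G1, G2) = 1^T (I - lambda A_x)^(-1) 1, where A_x is the adjacency matrix of the direct product graph and lambda is small enough for the series to converge. The sketch below computes it for two tiny unlabeled graphs; node and edge labels, which matter for molecular graphs, are ignored for brevity.

        import numpy as np

        A1 = np.array([[0, 1, 0],                 # path graph on 3 nodes
                       [1, 0, 1],
                       [0, 1, 0]], dtype=float)
        A2 = np.array([[0, 1, 1],                 # triangle
                       [1, 0, 1],
                       [1, 1, 0]], dtype=float)

        def random_walk_kernel(Aa, Ab, lam=0.05):
            Ax = np.kron(Aa, Ab)                              # direct product adjacency
            n = Ax.shape[0]
            ones = np.ones(n)
            return ones @ np.linalg.solve(np.eye(n) - lam * Ax, ones)

        print(random_walk_kernel(A1, A2), random_walk_kernel(A1, A1))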

  4. EFFECT OF SMOKING ON TRASIENTLY EVOKED OTOACOUSTIC EMISSION.

    PubMed

    Gegenava, Kh; Japaridze, Sh; Sharashenidze, N; Jalabadze, G; Kevanishvili, Z

    2016-01-01

    Evoked otoacoustic emissions (EOAEs) are sounds produced by the cochlear outer hair cells in response to an external acoustic stimulus. Transiently evoked otoacoustic emissions (TEOAEs) are the most clinically utilized EOAEs. TEOAEs are detectable in 98% of people with normal hearing, regardless of age or sex, and the two ears of any individual produce similar TEOAE waveforms. The objective of the present study was to compare TEOAE magnitudes in cigarette smokers and nonsmokers. The TEOAE occurrence and characteristics in individuals of both samples with audiometrically proven hearing loss and in those without were also specifically examined. Thirty smokers and 30 nonsmokers within the age range of 30-59 years were enrolled in the present study after informed consent. OAEs were recorded for each subject with a Madsen Capella OAE/middle ear analyzer (GN Otometrics, Denmark). After OAE testing, each subject underwent routine pure-tone audiometry and tympanometry. The results were statistically analyzed using Student's t-test. According to our results, 76.6% of smokers and 3.33% of nonsmokers showed varying degrees of decrease in TEOAE amplitude. Audiometric measurements showed an altered audiogram in 6.7% of smokers and in 3.33% of nonsmokers. Based on these results, we suppose that smoking has a significant influence on hearing function, especially on the cochlear apparatus; at the same time, TEOAE, as a sensitive method, can be used for very early detection of hearing loss, even when there are neither subjective complaints nor changes on the audiogram.

  5. Mineral contents and proximate composition of Pistacia vera kernels.

    PubMed

    Harmankaya, Mustafa; Ozcan, Mehmet Musa; Al Juhaimi, Fahad

    2014-07-01

    The mineral contents of Pistacia vera kernels were determined by inductively coupled plasma-atomic emission spectroscopy (ICP-AES). The minimum and maximum values of the K, P, Ca, Mg, and S elements ranged from 6,333 to 8,064 mg/kg, 3,630 to 5,228 mg/kg, 1,614 to 3,226 mg/kg, 1,716 to 2,402 mg/kg, and 1,417 to 1,825 mg/kg, respectively. In addition, the mean values of the Fe, Zn, Cu, Mn, B, Mo, Cr and Ni elements were determined as 42.48, 20.52, 12.81, 7.48, 11.31, 0.106, 0.511 and 1.67 mg/kg, respectively. Ash levels of the kernels were found to be between 2.28 % (Urfa) and 2.79 % (Halebi). Crude oil and protein contents ranged from 48.8 % (Halebi) to 55.3 % (Siirt) and from 23.33 % (Uzun) to 27.16 % (Halebi), respectively.

  6. Comparing Alternative Kernels for the Kernel Method of Test Equating: Gaussian, Logistic, and Uniform Kernels. Research Report. ETS RR-08-12

    ERIC Educational Resources Information Center

    Lee, Yi-Hsuan; von Davier, Alina A.

    2008-01-01

    The kernel equating method (von Davier, Holland, & Thayer, 2004) is based on a flexible family of equipercentile-like equating functions that use a Gaussian kernel to continuize the discrete score distributions. While the classical equipercentile, or percentile-rank, equating method carries out the continuization step by linear interpolation,…

  7. Tobacco Retail Environments and Social Inequalities in Individual-Level Smoking and Cessation Among Scottish Adults.

    PubMed

    Pearce, Jamie; Rind, Esther; Shortt, Niamh; Tisch, Catherine; Mitchell, Richard

    2016-02-01

    Many neighborhood characteristics may constrain or enable smoking. This study investigated whether the neighborhood tobacco retail environment was associated with individual-level smoking and cessation in Scottish adults, and whether inequalities in smoking status were related to tobacco retailing. Tobacco outlet density measures were developed for neighborhoods across Scotland using the September 2012 Scottish Tobacco Retailers Register. The outlet data were cleaned and geocoded (n = 10,161) using a Geographic Information System. Kernel density estimation was used to calculate an outlet density measure for each postcode. The kernel density estimation measures were then appended to data on individuals included in the 2008-2011 Scottish Health Surveys (n = 28,751 adults aged ≥16), via their postcode. Two-level logistic regression models examined whether neighborhood density of tobacco retailing was associated with current smoking status and smoking cessation and whether there were differences in the relationship between household income and smoking status, by tobacco outlet density. After adjustment for individual- and area-level confounders, compared to residents of areas with the lowest outlet densities, those living in areas with the highest outlet densities had a 6% higher chance of being a current smoker, and a 5% lower chance of being an ex-smoker. There was little evidence to suggest that inequalities in either current smoking or cessation were narrower in areas with lower availability of tobacco retailing. The findings suggest that residents of environments with a greater availability of tobacco outlets are more likely to start and/or sustain smoking, and less likely to quit. © The Author 2015. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
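
    To make the outlet-density step concrete, the following sketch applies Gaussian kernel density estimation to hypothetical geocoded outlet coordinates and evaluates the density at hypothetical postcode centroids, in the spirit of the measure described above. The coordinates, bandwidth, and scipy-based implementation are assumptions; the study itself used the September 2012 Scottish Tobacco Retailers Register within a GIS.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical geocoded outlet coordinates (easting, northing) and postcode
# centroids, standing in for the cleaned register data used in the study.
rng = np.random.default_rng(0)
outlets = rng.uniform(0, 10_000, size=(200, 2))
postcode_centroids = rng.uniform(0, 10_000, size=(50, 2))

# Gaussian kernel density estimate over outlet locations, evaluated at each
# postcode centroid to give a relative outlet-density measure per postcode.
kde = gaussian_kde(outlets.T, bw_method=0.2)   # bandwidth choice is an assumption
density_per_postcode = kde(postcode_centroids.T)
print(density_per_postcode[:5])
```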

  8. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  9. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  10. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  11. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  12. 7 CFR 981.401 - Adjusted kernel weight.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... based on the analysis of a 1,000 gram sample taken from a lot of almonds weighing 10,000 pounds with less than 95 percent kernels, and a 1,000 gram sample taken from a lot of almonds weighing 10,000... percent kernels containing the following: Edible kernels, 530 grams; inedible kernels, 120 grams; foreign...

  13. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Kernel color classification. 51.1403 Section 51.1403... STANDARDS) United States Standards for Grades of Pecans in the Shell 1 Kernel Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be described in terms of the color...

  14. Absence of multiplicative interactions between occupational lung carcinogens and tobacco smoking: a systematic review involving asbestos, crystalline silica and diesel engine exhaust emissions.

    PubMed

    El Zoghbi, Mohamad; Salameh, Pascale; Stücker, Isabelle; Brochard, Patrick; Delva, Fleur; Lacourt, Aude

    2017-02-02

    Tobacco smoking is the main cause of lung cancer, but it is not the sole causal factor. Significant proportions of workers are smokers and exposed to occupational lung carcinogens. This study aims to systematically review the statistical interaction between occupational lung carcinogens and tobacco smoking, in particular asbestos, crystalline silica and diesel engine exhaust emissions. Articles were identified using Scopus, PubMed, and Web of Science, and were limited to those published in English or French, without limitation of time. The reference lists of selected studies were reviewed to identify other relevant papers. One reviewer selected the articles based on the inclusion and exclusion criteria. Two reviewers checked the eligibility of articles to be included in the systematic review. Data were extracted by one reviewer and revised by two other reviewers. Cohort and case-control studies were analyzed separately. The risk of bias was evaluated for each study based on the outcome. The results for the interaction between tobacco smoking and each carcinogen were evaluated and reported separately. Fifteen original studies were included for the asbestos-smoking interaction, seven for the silica-smoking interaction and two for the diesel-smoking interaction. The results suggested the absence of a multiplicative interaction between the three occupational lung carcinogens and smoking. There is not enough evidence in the literature to draw a conclusion regarding an additive interaction. We believe there is a limited risk of publication bias, as several studies reporting negative results were published. There are no multiplicative interactions between tobacco smoking and occupational lung carcinogens, in particular asbestos, crystalline silica and diesel engine exhaust emissions. Even so, specific programs should be developed and promoted to concomitantly reduce exposure to occupational lung carcinogens and tobacco smoking.

  15. Out-of-Sample Extensions for Non-Parametric Kernel Methods.

    PubMed

    Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang

    2017-02-01

    Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies were devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information of the data, which may potentially characterize the data similarity better. The kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension for the out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make the nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over the state-of-the-art parametric kernel methods.

  16. Temporalization of Electric Generation Emissions for Improved Representation of Peak Air Quality Episodes

    NASA Astrophysics Data System (ADS)

    Farkas, C. M.; Moeller, M.; Carlton, A. G.

    2013-12-01

    Photochemical transport models routinely under predict peak air quality events. This deficiency may be due, in part, to inadequate temporalization of emissions from the electric generating sector. The National Emissions Inventory (NEI) reports emissions from Electric Generating Units (EGUs) by either Continuous Emission Monitors (CEMs) that report hourly values or as an annual total. The Sparse Matrix Operator Kernel Emissions preprocessor (SMOKE), used to prepare emissions data for modeling with the CMAQ air quality model, allocates annual emission totals throughout the year using specific monthly, weekly, and hourly weights according to standard classification code (SCC) and location. This approach represents average diurnal and seasonal patterns of electricity generation but does not capture spikes in emissions due to episodic use as with peaking units or due to extreme weather events. In this project we use a combination of state air quality permits, CEM data, and EPA emission factors to more accurately temporalize emissions of NOx, SO2 and particulate matter (PM) during the extensive heat wave of July and August 2006. Two CMAQ simulations are conducted; the first with the base NEI emissions and the second with improved temporalization, more representative of actual emissions during the heat wave. Predictions from both simulations are evaluated with O3 and PM measurement data from EPA's National Air Monitoring Stations (NAMS) and State and Local Air Monitoring Stations (SLAMS) during the heat wave, for which ambient concentrations of criteria pollutants were often above NAAQS. During periods of increased photochemistry and high pollutant concentrations, it is critical that emissions are most accurately represented in air quality models.
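
    To illustrate the kind of temporal allocation SMOKE performs, the sketch below disaggregates an assumed annual EGU total into an hourly value using made-up monthly, day-of-week, and hour-of-day profile weights. The profiles, totals, and the simplified month-to-day bookkeeping are illustrative assumptions only, not SMOKE's actual cross-reference data or code.

```python
import numpy as np

# Hypothetical temporal profiles of the kind SMOKE applies by SCC:
# monthly, day-of-week, and hour-of-day weights that each sum to 1.
monthly = np.full(12, 1 / 12)                             # flat months for illustration
weekly = np.array([.16, .16, .16, .16, .16, .10, .10])    # weekday-heavy profile
hourly = np.exp(-((np.arange(24) - 15) ** 2) / 50.0)      # afternoon peak
hourly /= hourly.sum()

annual_nox_tons = 1200.0                                  # assumed annual EGU total

def hourly_emission(month_idx, dow_idx, hour_idx,
                    weeks_in_month=30 / 7):
    """Allocate an annual total to one hour using profile weights.

    Simplified bookkeeping: annual -> month -> one day of that weekday -> hour.
    """
    month_total = annual_nox_tons * monthly[month_idx]
    day_total = month_total * weekly[dow_idx] / weeks_in_month
    return day_total * hourly[hour_idx]

# Emissions assigned to 3 pm on a July Wednesday under these assumptions.
print(hourly_emission(month_idx=6, dow_idx=2, hour_idx=15))
```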

  17. Computed tomography coronary stent imaging with iterative reconstruction: a trade-off study between medium kernel and sharp kernel.

    PubMed

    Zhou, Qijing; Jiang, Biao; Dong, Fei; Huang, Peiyu; Liu, Hongtao; Zhang, Minming

    2014-01-01

    To evaluate the improvement offered by the iterative reconstruction in image space (IRIS) technique in computed tomographic (CT) coronary stent imaging with a sharp kernel, and to make a trade-off analysis. Fifty-six patients with 105 stents were examined by 128-slice dual-source CT coronary angiography (CTCA). Images were reconstructed using standard filtered back projection (FBP) and IRIS, with both the medium kernel and the sharp kernel applied. Image noise and the stent diameter were investigated. Image noise was measured both in the background vessel and in the in-stent lumen as objective image evaluation. An image noise score and a stent score were used for subjective image evaluation. The CTCA images reconstructed with IRIS showed significant noise reduction compared with the images reconstructed using the FBP technique, in both the background vessel and the in-stent lumen (the background noise decreased by approximately 25.4% ± 8.2% with the medium kernel). Images reconstructed with the sharp kernel showed better visualization of the stent struts and in-stent lumen than those with the medium kernel. Iterative reconstruction in image space can effectively reduce image noise and improve image quality. The sharp kernel images reconstructed with iterative reconstruction are considered the optimal images for observing coronary stents in this study.

  18. Anisotropic hydrodynamics with a scalar collisional kernel

    NASA Astrophysics Data System (ADS)

    Almaalol, Dekrayat; Strickland, Michael

    2018-04-01

    Prior studies of nonequilibrium dynamics using anisotropic hydrodynamics have used the relativistic Anderson-Witting scattering kernel or some variant thereof. In this paper, we make the first study of the impact of using a more realistic scattering kernel. For this purpose, we consider a conformal system undergoing transversally homogeneous and boost-invariant Bjorken expansion and take the collisional kernel to be given by the leading-order 2 ↔ 2 scattering kernel in scalar λφ⁴ theory. We consider both classical and quantum statistics to assess the impact of Bose enhancement on the dynamics. We also determine the anisotropic nonequilibrium attractor of a system subject to this collisional kernel. We find that, when the near-equilibrium relaxation times in the Anderson-Witting and scalar collisional kernels are matched, the scalar kernel results in a higher degree of momentum-space anisotropy during the system's evolution, given the same initial conditions. Additionally, we find that taking into account Bose enhancement further increases the dynamically generated momentum-space anisotropy.

  19. Ranking Support Vector Machine with Kernel Approximation

    PubMed Central

    Dou, Yong

    2017-01-01

    Learning to rank algorithms have become important in recent years due to their successful application in information retrieval, recommender systems, and computational biology, among other fields. Ranking support vector machine (RankSVM) is one of the state-of-the-art ranking models and has been widely used. Nonlinear RankSVM (RankSVM with nonlinear kernels) can give higher accuracy than linear RankSVM (RankSVM with a linear kernel) for complex nonlinear ranking problems. However, the learning methods for nonlinear RankSVM are still time-consuming because of the calculation of the kernel matrix. In this paper, we propose a fast ranking algorithm based on kernel approximation to avoid computing the kernel matrix. We explore two types of kernel approximation methods, namely, the Nyström method and random Fourier features. A primal truncated Newton method is used to optimize the pairwise L2-loss (squared hinge-loss) objective function of the ranking model after the nonlinear kernel approximation. Experimental results demonstrate that our proposed method achieves a much faster training speed than kernel RankSVM and comparable or better performance than state-of-the-art ranking algorithms. PMID:28293256

  20. Ranking Support Vector Machine with Kernel Approximation.

    PubMed

    Chen, Kai; Li, Rongchun; Dou, Yong; Liang, Zhengfa; Lv, Qi

    2017-01-01

    Learning to rank algorithms have become important in recent years due to their successful application in information retrieval, recommender systems, and computational biology, among other fields. Ranking support vector machine (RankSVM) is one of the state-of-the-art ranking models and has been widely used. Nonlinear RankSVM (RankSVM with nonlinear kernels) can give higher accuracy than linear RankSVM (RankSVM with a linear kernel) for complex nonlinear ranking problems. However, the learning methods for nonlinear RankSVM are still time-consuming because of the calculation of the kernel matrix. In this paper, we propose a fast ranking algorithm based on kernel approximation to avoid computing the kernel matrix. We explore two types of kernel approximation methods, namely, the Nyström method and random Fourier features. A primal truncated Newton method is used to optimize the pairwise L2-loss (squared hinge-loss) objective function of the ranking model after the nonlinear kernel approximation. Experimental results demonstrate that our proposed method achieves a much faster training speed than kernel RankSVM and comparable or better performance than state-of-the-art ranking algorithms.
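
    The following sketch illustrates the random Fourier feature idea mentioned above: inputs are mapped to a low-dimensional feature space whose inner products approximate an RBF kernel, so a linear model (such as a linear RankSVM) can stand in for the kernelized one without forming the n x n kernel matrix. The dimensions, gamma, and feature count are assumptions, and the snippet checks only the kernel approximation, not the ranking model itself.

```python
import numpy as np

def rbf_random_fourier_features(X, n_features=256, gamma=1.0, seed=0):
    """Map X (n x d) to z(X) so that z(x) @ z(y) ~= exp(-gamma * ||x - y||^2)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# A linear model trained on z(X) then approximates the RBF-kernel model.
X = np.random.default_rng(1).normal(size=(500, 10))
Z = rbf_random_fourier_features(X, n_features=512, gamma=0.5)
approx = Z[:5] @ Z[:5].T
exact = np.exp(-0.5 * np.sum((X[:5, None] - X[None, :5]) ** 2, axis=-1))
print(np.abs(approx - exact).max())   # small approximation error
```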

  1. Rare variant testing across methods and thresholds using the multi-kernel sequence kernel association test (MK-SKAT).

    PubMed

    Urrutia, Eugene; Lee, Seunggeun; Maity, Arnab; Zhao, Ni; Shen, Judong; Li, Yun; Wu, Michael C

    Analysis of rare genetic variants has focused on region-based analysis wherein a subset of the variants within a genomic region is tested for association with a complex trait. Two important practical challenges have emerged. First, it is difficult to choose which test to use. Second, it is unclear which group of variants within a region should be tested. Both depend on the unknown true state of nature. Therefore, we develop the Multi-Kernel SKAT (MK-SKAT) which tests across a range of rare variant tests and groupings. Specifically, we demonstrate that several popular rare variant tests are special cases of the sequence kernel association test which compares pair-wise similarity in trait value to similarity in the rare variant genotypes between subjects as measured through a kernel function. Choosing a particular test is equivalent to choosing a kernel. Similarly, choosing which group of variants to test also reduces to choosing a kernel. Thus, MK-SKAT uses perturbation to test across a range of kernels. Simulations and real data analyses show that our framework controls type I error while maintaining high power across settings: MK-SKAT loses power when compared to the kernel for a particular scenario but has much greater power than poor choices.
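
    To make the kernel view of SKAT concrete, the sketch below computes a simplified SKAT-style score statistic Q = r'Kr with a weighted linear kernel on made-up genotype and trait data. The intercept-only null model, flat weights, and the omission of Q's null distribution are simplifying assumptions; in the MK-SKAT framing, swapping the weight vector or the kernel corresponds to choosing a different test or variant grouping.

```python
import numpy as np

def skat_statistic(y, G, weights):
    """Simplified SKAT score statistic Q = r' K r with a weighted linear kernel.

    y: trait values (n,), G: rare-variant genotypes (n x p),
    weights: per-variant weights (p,). The null model here is intercept-only;
    the published method adjusts for covariates and derives Q's null distribution.
    """
    r = y - y.mean()                       # residuals under the null
    K = (G * weights) @ (G * weights).T    # K = G W W' G'
    return float(r @ K @ r)

rng = np.random.default_rng(0)
G = rng.binomial(2, 0.02, size=(300, 25))  # hypothetical rare-variant genotypes
y = rng.normal(size=300)                   # hypothetical continuous trait
w = np.ones(25)                            # flat weights; choosing w = choosing a kernel
print(skat_statistic(y, G, w))
```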

  2. Wigner functions defined with Laplace transform kernels.

    PubMed

    Oh, Se Baek; Petruccelli, Jonathan C; Tian, Lei; Barbastathis, George

    2011-10-24

    We propose a new Wigner-type phase-space function using Laplace transform kernels--Laplace kernel Wigner function. Whereas momentum variables are real in the traditional Wigner function, the Laplace kernel Wigner function may have complex momentum variables. Due to the property of the Laplace transform, a broader range of signals can be represented in complex phase-space. We show that the Laplace kernel Wigner function exhibits similar properties in the marginals as the traditional Wigner function. As an example, we use the Laplace kernel Wigner function to analyze evanescent waves supported by surface plasmon polariton. © 2011 Optical Society of America
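
    For orientation, the traditional Wigner function referred to above pairs the signal's local autocorrelation with a Fourier kernel. The block below states that standard definition (up to normalization convention) and an assumed form of the Laplace-kernel analogue in which the Fourier kernel e^{-ipy} is replaced by e^{-sy} with complex s; this follows the abstract's description rather than the authors' exact formula.

```latex
% Traditional Wigner function of a signal f (Fourier kernel):
W(x,p) = \frac{1}{2\pi}\int_{-\infty}^{\infty}
  f\!\left(x+\tfrac{y}{2}\right)\, f^{*}\!\left(x-\tfrac{y}{2}\right) e^{-ipy}\, dy

% Assumed form of a Laplace-kernel analogue (s complex), per the abstract's description:
W_{\mathcal{L}}(x,s) = \int_{0}^{\infty}
  f\!\left(x+\tfrac{y}{2}\right)\, f^{*}\!\left(x-\tfrac{y}{2}\right) e^{-sy}\, dy
```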

  3. Metabolic network prediction through pairwise rational kernels.

    PubMed

    Roche-Lima, Abiel; Domaratzki, Michael; Fristensky, Brian

    2014-09-26

    Metabolic networks are represented by the set of metabolic pathways. Metabolic pathways are a series of biochemical reactions, in which the product (output) from one reaction serves as the substrate (input) to another reaction. Many pathways remain incompletely characterized. One of the major challenges of computational biology is to obtain better models of metabolic pathways. Existing models are dependent on the annotation of the genes. This propagates error accumulation when the pathways are predicted by incorrectly annotated genes. Pairwise classification methods are supervised learning methods used to classify new pair of entities. Some of these classification methods, e.g., Pairwise Support Vector Machines (SVMs), use pairwise kernels. Pairwise kernels describe similarity measures between two pairs of entities. Using pairwise kernels to handle sequence data requires long processing times and large storage. Rational kernels are kernels based on weighted finite-state transducers that represent similarity measures between sequences or automata. They have been effectively used in problems that handle large amount of sequence information such as protein essentiality, natural language processing and machine translations. We create a new family of pairwise kernels using weighted finite-state transducers (called Pairwise Rational Kernel (PRK)) to predict metabolic pathways from a variety of biological data. PRKs take advantage of the simpler representations and faster algorithms of transducers. Because raw sequence data can be used, the predictor model avoids the errors introduced by incorrect gene annotations. We then developed several experiments with PRKs and Pairwise SVM to validate our methods using the metabolic network of Saccharomyces cerevisiae. As a result, when PRKs are used, our method executes faster in comparison with other pairwise kernels. Also, when we use PRKs combined with other simple kernels that include evolutionary information, the accuracy

  4. Ideal regularization for learning kernels from labels.

    PubMed

    Pan, Binbin; Lai, Jianhuang; Shen, Lixin

    2014-08-01

    In this paper, we propose a new form of regularization that is able to utilize the label information of a data set for learning kernels. The proposed regularization, referred to as ideal regularization, is a linear function of the kernel matrix to be learned. The ideal regularization allows us to develop efficient algorithms to exploit labels. Three applications of the ideal regularization are considered. Firstly, we use the ideal regularization to incorporate the labels into a standard kernel, making the resulting kernel more appropriate for learning tasks. Next, we employ the ideal regularization to learn a data-dependent kernel matrix from an initial kernel matrix (which contains prior similarity information, geometric structures, and labels of the data). Finally, we incorporate the ideal regularization to some state-of-the-art kernel learning problems. With this regularization, these learning problems can be formulated as simpler ones which permit more efficient solvers. Empirical results show that the ideal regularization exploits the labels effectively and efficiently. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. SEMI-SUPERVISED OBJECT RECOGNITION USING STRUCTURE KERNEL

    PubMed Central

    Wang, Botao; Xiong, Hongkai; Jiang, Xiaoqian; Ling, Fan

    2013-01-01

    Object recognition is a fundamental problem in computer vision. Part-based models offer a sparse, flexible representation of objects, but suffer from difficulties in training and often use standard kernels. In this paper, we propose a positive definite kernel called the “structure kernel”, which measures the similarity of two part-based represented objects. The structure kernel has three terms: 1) the global term, which measures the global visual similarity of two objects; 2) the part term, which measures the visual similarity of corresponding parts; 3) the spatial term, which measures the spatial similarity of the geometric configuration of parts. The contribution of this paper is to generalize the discriminant capability of local kernels to complex part-based object models. Experimental results show that the proposed kernel exhibits higher accuracy than state-of-the-art approaches using standard kernels. PMID:23666108

  6. 40 CFR 1039.105 - What smoke standards must my engines meet?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false What smoke standards must my engines... Emission Standards and Related Requirements § 1039.105 What smoke standards must my engines meet? (a) The smoke standards in this section apply to all engines subject to emission standards under this part...

  7. 40 CFR 1039.105 - What smoke standards must my engines meet?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false What smoke standards must my engines... Emission Standards and Related Requirements § 1039.105 What smoke standards must my engines meet? (a) The smoke standards in this section apply to all engines subject to emission standards under this part...

  8. 40 CFR 1039.105 - What smoke standards must my engines meet?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false What smoke standards must my engines... Emission Standards and Related Requirements § 1039.105 What smoke standards must my engines meet? (a) The smoke standards in this section apply to all engines subject to emission standards under this part...

  9. 40 CFR 1039.105 - What smoke standards must my engines meet?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false What smoke standards must my engines... Emission Standards and Related Requirements § 1039.105 What smoke standards must my engines meet? (a) The smoke standards in this section apply to all engines subject to emission standards under this part...

  10. 40 CFR 1039.105 - What smoke standards must my engines meet?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false What smoke standards must my engines... Emission Standards and Related Requirements § 1039.105 What smoke standards must my engines meet? (a) The smoke standards in this section apply to all engines subject to emission standards under this part...

  11. The pre-image problem in kernel methods.

    PubMed

    Kwok, James Tin-yau; Tsang, Ivor Wai-hung

    2004-11-01

    In this paper, we address the problem of finding the pre-image of a feature vector in the feature space induced by a kernel. This is of central importance in some kernel applications, such as on using kernel principal component analysis (PCA) for image denoising. Unlike the traditional method which relies on nonlinear optimization, our proposed method directly finds the location of the pre-image based on distance constraints in the feature space. It is noniterative, involves only linear algebra and does not suffer from numerical instability or local minimum problems. Evaluations on performing kernel PCA and kernel clustering on the USPS data set show much improved performance.
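
    The pre-image problem can be seen in miniature with kernel PCA denoising. The sketch below is not the distance-constraint algorithm proposed in the paper; it uses scikit-learn's built-in approximate inverse transform (learned by ridge regression) purely to illustrate mapping denoised feature-space points back to input space. The circle data, kernel parameters, and regularization value are assumptions.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Noisy 2-D data on a circle; denoise by projecting onto leading kernel
# principal components and mapping back to input space (the pre-image step).
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.1 * rng.normal(size=(300, 2))

kpca = KernelPCA(n_components=4, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True, alpha=0.1)
X_denoised = kpca.inverse_transform(kpca.fit_transform(X))

# Mean distance from the unit circle; ideally smaller after denoising.
print(np.abs(np.linalg.norm(X_denoised, axis=1) - 1).mean())
```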

  12. Exploiting graph kernels for high performance biomedical relation extraction.

    PubMed

    Panyam, Nagesh C; Verspoor, Karin; Cohn, Trevor; Ramamohanarao, Kotagiri

    2018-01-30

    Relation extraction from biomedical publications is an important task in the area of semantic mining of text. Kernel methods for supervised relation extraction are often preferred over manual feature engineering methods, when classifying highly ordered structures such as trees and graphs obtained from syntactic parsing of a sentence. Tree kernels such as the Subset Tree Kernel and Partial Tree Kernel have been shown to be effective for classifying constituency parse trees and basic dependency parse graphs of a sentence. Graph kernels such as the All Path Graph kernel (APG) and Approximate Subgraph Matching (ASM) kernel have been shown to be suitable for classifying general graphs with cycles, such as the enhanced dependency parse graph of a sentence. In this work, we present a high performance Chemical-Induced Disease (CID) relation extraction system. We present a comparative study of kernel methods for the CID task and also extend our study to the Protein-Protein Interaction (PPI) extraction task, an important biomedical relation extraction task. We discuss novel modifications to the ASM kernel to boost its performance and a method to apply graph kernels for extracting relations expressed in multiple sentences. Our system for CID relation extraction attains an F-score of 60%, without using external knowledge sources or task specific heuristic or rules. In comparison, the state of the art Chemical-Disease Relation Extraction system achieves an F-score of 56% using an ensemble of multiple machine learning methods, which is then boosted to 61% with a rule based system employing task specific post processing rules. For the CID task, graph kernels outperform tree kernels substantially, and the best performance is obtained with APG kernel that attains an F-score of 60%, followed by the ASM kernel at 57%. The performance difference between the ASM and APG kernels for CID sentence level relation extraction is not significant. In our evaluation of ASM for the PPI task, ASM

  13. Do Polyethylene Plastic Covers Affect Smoke Emissions from Debris Piles?

    NASA Astrophysics Data System (ADS)

    Weise, D. R.; Jung, H.; Cocker, D.; Hosseini, E.; Li, Q.; Shrivastava, M.; McCorison, M.

    2010-12-01

    Shrubs and small diameter trees exist in the understories of many western forests. They are important from an ecological perspective; however, this vegetation also presents a potential hazard as “ladder fuels” or as a heat source to damage the overstory during prescribed burns. Cutting and piling of this material to burn under safe conditions is a common silvicultural practice. To improve ignition success of the piled debris, polyethylene plastic is often used to cover a portion of the pile. While burning of piled forest debris is an acceptable practice in southern California from an air quality perspective, inclusion of plastic in the piles changes these debris piles to rubbish piles which should not be burned. With support from the four National Forests in southern California, we conducted a laboratory experiment to determine if the presence of polyethylene plastic in a pile of burning wood changed the smoke emissions. Debris piles in southern California include wood and foliage from common forest trees such as sugar and ponderosa pines, white fir, incense cedar, and California black oak and shrubs such as ceanothus and manzanita in addition to forest floor material and dirt. Manzanita wood was used to represent the debris pile in order to control the effects of fuel bed composition. The mass of polyethylene plastic incorporated into the pile was 0, 0.25 and 2.5% of the wood mass—a range representative of field conditions. Measured emissions included NOx, CO, CO2, SO2, polycyclic and light hydrocarbons, carbonyls, particulate matter (5 to 560 nm), elemental and organic carbon. The presence of polyethylene did not alter the emissions composition from this experiment.

  14. Adaptive kernel function using line transect sampling

    NASA Astrophysics Data System (ADS)

    Albadareen, Baker; Ismail, Noriszura

    2018-04-01

    The estimation of f(0) is crucial in the line transect method which is used for estimating population abundance in wildlife survey's. The classical kernel estimator of f(0) has a high negative bias. Our study proposes an adaptation in the kernel function which is shown to be more efficient than the usual kernel estimator. A simulation study is adopted to compare the performance of the proposed estimators with the classical kernel estimators.

  15. Kernel K-Means Sampling for Nyström Approximation.

    PubMed

    He, Li; Zhang, Hong

    2018-05-01

    A fundamental problem in Nyström-based kernel matrix approximation is the sampling method by which the training set is built. In this paper, we suggest to use kernel k-means sampling, which is shown in our works to minimize the upper bound of a matrix approximation error. We first propose a unified kernel matrix approximation framework, which is able to describe most existing Nyström approximations under many popular kernels, including the Gaussian kernel and the polynomial kernel. We then show that the matrix approximation error upper bound, in terms of the Frobenius norm, is equal to the k-means error of data points in kernel space plus a constant. Thus, the k-means centers of data in kernel space, or the kernel k-means centers, are the optimal representative points with respect to the Frobenius norm error upper bound. Experimental results, with both Gaussian kernel and polynomial kernel, on real-world data sets and image segmentation tasks show the superiority of the proposed method over the state-of-the-art methods.
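
    The sketch below shows the Nyström approximation itself, with landmarks chosen by ordinary k-means in input space as a simple stand-in for the kernel k-means sampling proposed above (the paper clusters in kernel feature space instead). The data, kernel width, and number of landmarks are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
gamma, m = 0.05, 50

# Landmarks from k-means centers (input-space proxy for the kernel k-means
# sampling described above; the paper clusters in feature space instead).
landmarks = KMeans(n_clusters=m, n_init=10, random_state=0).fit(X).cluster_centers_

C = rbf_kernel(X, landmarks, gamma=gamma)           # n x m
W = rbf_kernel(landmarks, landmarks, gamma=gamma)   # m x m
K_nystrom = C @ np.linalg.pinv(W) @ C.T             # rank-m approximation of K

K_exact = rbf_kernel(X, X, gamma=gamma)
err = np.linalg.norm(K_exact - K_nystrom, "fro") / np.linalg.norm(K_exact, "fro")
print(f"relative Frobenius error: {err:.3f}")
```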

  16. 40 CFR 87.31 - Standards for exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Gas Turbine Engines) § 87.31 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of Class T8, beginning February 1, 1974, shall not exceed: Smoke number of 30. (b) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of class TF and...

  17. 40 CFR 87.31 - Standards for exhaust emissions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Gas Turbine Engines) § 87.31 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of Class T8, beginning February 1, 1974, shall not exceed: Smoke number of 30. (b) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of class TF and...

  18. 40 CFR 87.31 - Standards for exhaust emissions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Gas Turbine Engines) § 87.31 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of Class T8, beginning February 1, 1974, shall not exceed: Smoke number of 30. (b) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of class TF and...

  19. 40 CFR 87.31 - Standards for exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Gas Turbine Engines) § 87.31 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of Class T8, beginning February 1, 1974, shall not exceed: Smoke number of 30. (b) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of class TF and...

  20. The carbon footprint of behavioural support services for smoking cessation.

    PubMed

    Smith, Anna Jo Bodurtha; Tennison, Imogen; Roberts, Ian; Cairns, John; Free, Caroline

    2013-09-01

    To estimate the carbon footprint of behavioural support services for smoking cessation: text message support, telephone counselling, group counselling and individual counselling. Carbon footprint analysis. Publicly available data on National Health Service Stop Smoking Services and per unit carbon emissions; published effectiveness data from the txt2stop trial and systematic reviews of smoking cessation services. Carbon dioxide equivalents (CO2e) per 1000 smokers, per lifetime quitter, and per quality-adjusted life year gained, and cost-effectiveness, including social cost of carbon, of smoking cessation services. Emissions per 1000 participants were 8143 kg CO2e for text message support, 8619 kg CO2e for telephone counselling, 16 114 kg CO2e for group counselling and 16 372 kg CO2e for individual counselling. Emissions per intervention lifetime quitter were 636 (95% CI 455 to 958) kg CO2e for text message support, 1051 (95% CI 560 to 2873) kg CO2e for telephone counselling, 1143 (95% CI 695 to 2270) kg CO2e for group counselling and 2823 (95% CI 1688 to 6549) kg CO2e for individual counselling. Text message, telephone and group counselling remained cost-effective when cost-effectiveness analysis was revised to include the environmental and economic cost of damage from carbon emissions. All smoking cessation services had low emissions compared to the health gains produced. Text message support had the lowest emissions of the services evaluated. Smoking cessation services have small carbon footprints and were cost-effective after accounting for the societal costs of greenhouse gas emissions.

  1. 7 CFR 51.2125 - Split or broken kernels.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Split or broken kernels. 51.2125 Section 51.2125 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... kernels. Split or broken kernels means seven-eighths or less of complete whole kernels but which will not...

  2. Robotic Intelligence Kernel: Driver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    The INL Robotic Intelligence Kernel-Driver is built on top of the RIK-A and implements a dynamic autonomy structure. The RIK-D is used to orchestrate hardware for sensing and action as well as software components for perception, communication, behavior and world modeling into a single cognitive behavior kernel that provides intrinsic intelligence for a wide variety of unmanned ground vehicle systems.

  3. Bell nozzle kernel analysis program

    NASA Technical Reports Server (NTRS)

    Elliot, J. J.; Stromstra, R. R.

    1969-01-01

    Bell Nozzle Kernel Analysis Program computes and analyzes the supersonic flowfield in the kernel, or initial expansion region, of a bell or conical nozzle. It analyzes both plane and axisymmetric geometries for specified gas properties, nozzle throat geometry and input line.

  4. 7 CFR 51.2296 - Three-fourths half kernel.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Three-fourths half kernel. 51.2296 Section 51.2296 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards...-fourths half kernel. Three-fourths half kernel means a portion of a half of a kernel which has more than...

  5. Application of kernel method in fluorescence molecular tomography

    NASA Astrophysics Data System (ADS)

    Zhao, Yue; Baikejiang, Reheman; Li, Changqing

    2017-02-01

    Reconstruction of fluorescence molecular tomography (FMT) is an ill-posed inverse problem. Anatomical guidance in the FMT reconstruction can improve FMT reconstruction efficiently. We have developed a kernel method to introduce the anatomical guidance into FMT robustly and easily. The kernel method is from machine learning for pattern analysis and is an efficient way to represent anatomical features. For the finite element method based FMT reconstruction, we calculate a kernel function for each finite element node from an anatomical image, such as a micro-CT image. Then the fluorophore concentration at each node is represented by a kernel coefficient vector and the corresponding kernel function. In the FMT forward model, we have a new system matrix by multiplying the sensitivity matrix with the kernel matrix. Thus, the kernel coefficient vector is the unknown to be reconstructed following a standard iterative reconstruction process. We convert the FMT reconstruction problem into the kernel coefficient reconstruction problem. The desired fluorophore concentration at each node can be calculated accordingly. Numerical simulation studies have demonstrated that the proposed kernel-based algorithm can improve the spatial resolution of the reconstructed FMT images. In the proposed kernel method, the anatomical guidance can be obtained directly from the anatomical image and is included in the forward modeling. One of the advantages is that we do not need to segment the anatomical image for the targets and background.
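
    A minimal sketch of the kernel-guided forward model described above: build a kernel matrix K from per-node anatomical features, form the new system matrix A K, solve for the kernel coefficient vector, and recover the fluorophore image as K times that vector. The random sensitivity matrix, Gaussian kernel form, regularized least-squares solver (used here in place of the iterative reconstruction in the paper), and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_meas, n_nodes = 120, 400

A = rng.normal(size=(n_meas, n_nodes))     # hypothetical FMT sensitivity matrix
anat = rng.normal(size=(n_nodes, 3))       # hypothetical anatomical features per node

# Kernel matrix from anatomical features (assumed Gaussian form); in practice it
# is often sparsified to each node's nearest anatomical neighbours.
d2 = ((anat[:, None, :] - anat[None, :, :]) ** 2).sum(-1)
K = np.exp(-d2 / (2 * 1.0 ** 2))

x_true = np.zeros(n_nodes); x_true[:20] = 1.0      # toy fluorophore distribution
y = A @ x_true + 0.01 * rng.normal(size=n_meas)

# Reconstruct kernel coefficients alpha from y = (A K) alpha, then x = K alpha.
AK = A @ K
alpha = np.linalg.lstsq(AK.T @ AK + 1e-3 * np.eye(n_nodes), AK.T @ y, rcond=None)[0]
x_rec = K @ alpha
print(np.corrcoef(x_true, x_rec)[0, 1])
```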

  6. Examining Potential Boundary Bias Effects in Kernel Smoothing on Equating: An Introduction for the Adaptive and Epanechnikov Kernels.

    PubMed

    Cid, Jaime A; von Davier, Alina A

    2015-05-01

    Test equating is a method of making the test scores from different test forms of the same assessment comparable. In the equating process, an important step involves continuizing the discrete score distributions. In traditional observed-score equating, this step is achieved using linear interpolation (or an unscaled uniform kernel). In the kernel equating (KE) process, this continuization process involves Gaussian kernel smoothing. It has been suggested that the choice of bandwidth in kernel smoothing controls the trade-off between variance and bias. In the literature on estimating density functions using kernels, it has also been suggested that the weight of the kernel depends on the sample size, and therefore, the resulting continuous distribution exhibits bias at the endpoints, where the samples are usually smaller. The purpose of this article is (a) to explore the potential effects of atypical scores (spikes) at the extreme ends (high and low) on the KE method in distributions with different degrees of asymmetry using the randomly equivalent groups equating design (Study I), and (b) to introduce the Epanechnikov and adaptive kernels as potential alternative approaches to reducing boundary bias in smoothing (Study II). The beta-binomial model is used to simulate observed scores reflecting a range of different skewed shapes.
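
    As a simplified illustration of the continuization step discussed above, the sketch below smooths a hypothetical discrete score distribution with a Gaussian kernel and reads off a continuous percentile. It omits the linear rescaling that operational kernel equating applies to preserve the mean and variance of the discrete distribution, and the score probabilities and bandwidth are made up.

```python
import numpy as np
from scipy.stats import norm

scores = np.arange(0, 21)                                  # possible test scores 0..20
probs = np.random.default_rng(0).dirichlet(np.ones(21))    # hypothetical score probabilities

def continuized_cdf(x, h=0.6):
    """Gaussian-kernel continuization of a discrete score distribution
    (simplified: the operational KE formula also rescales to preserve
    the mean and variance of the discrete distribution)."""
    return np.sum(probs * norm.cdf((x - scores) / h))

# Equipercentile-style lookup: the continuous score with cumulative probability 0.75.
grid = np.linspace(-1, 21, 2000)
cdf_vals = np.array([continuized_cdf(g) for g in grid])
print(grid[np.searchsorted(cdf_vals, 0.75)])
```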

  7. 7 CFR 868.254 - Broken kernels determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Broken kernels determination. 868.254 Section 868.254 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Governing Application of Standards § 868.254 Broken kernels determination. Broken kernels shall be...

  8. Evaluating the Gradient of the Thin Wire Kernel

    NASA Technical Reports Server (NTRS)

    Wilton, Donald R.; Champagne, Nathan J.

    2008-01-01

    Recently, a formulation for evaluating the thin wire kernel was developed that employed a change of variable to smooth the kernel integrand, canceling the singularity in the integrand. Hence, the typical expansion of the wire kernel in a series for use in the potential integrals is avoided. The new expression for the kernel is exact and may be used directly to determine the gradient of the wire kernel, which consists of components that are parallel and radial to the wire axis.

  9. KITTEN Lightweight Kernel 0.1 Beta

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pedretti, Kevin; Levenhagen, Michael; Kelly, Suzanne

    2007-12-12

    The Kitten Lightweight Kernel is a simplified OS (operating system) kernel that is intended to manage a compute node's hardware resources. It provides a set of mechanisms to user-level applications for utilizing hardware resources (e.g., allocating memory, creating processes, accessing the network). Kitten is much simpler than general-purpose OS kernels, such as Linux or Windows, but includes all of the essential functionality needed to support HPC (high-performance computing) MPI, PGAS and OpenMP applications. Kitten provides unique capabilities such as physically contiguous application memory, transparent large page support, and noise-free tick-less operation, which enable HPC applications to obtain greater efficiency and scalability than with general-purpose OS kernels.

  10. Implementing Kernel Methods Incrementally by Incremental Nonlinear Projection Trick.

    PubMed

    Kwak, Nojun

    2016-05-20

    Recently, the nonlinear projection trick (NPT) was introduced, enabling direct computation of the coordinates of samples in a reproducing kernel Hilbert space. With NPT, any machine learning algorithm can be extended to a kernel version without relying on the so-called kernel trick. However, NPT is inherently difficult to implement incrementally, because an ever-increasing kernel matrix must be handled as additional training samples are introduced. In this paper, an incremental version of NPT (INPT) is proposed based on the observation that the centering step in NPT is unnecessary. Because the proposed INPT does not change the coordinates of the old data, the coordinates obtained by INPT can be used directly in any incremental method to implement its kernel version. The effectiveness of INPT is shown by applying it to implement incremental versions of kernel methods such as kernel singular value decomposition, kernel principal component analysis, and kernel discriminant analysis, which are applied to problems of kernel matrix reconstruction, letter classification, and face image retrieval, respectively.
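
    A minimal sketch of the (batch, non-incremental) projection-trick idea that INPT builds on: eigendecompose the kernel matrix to get explicit coordinates for the training samples, then place a new sample from its kernel values against the training set. The data, kernel, and eigenvalue tolerance are assumptions, and the incremental bookkeeping that defines INPT is omitted.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)
X_train, x_new = rng.normal(size=(100, 5)), rng.normal(size=(1, 5))

K = rbf_kernel(X_train, X_train, gamma=0.2)
eigval, eigvec = np.linalg.eigh(K)
keep = eigval > 1e-10                      # drop numerically null directions
L, V = eigval[keep], eigvec[:, keep]

Y_train = (V * np.sqrt(L)).T               # columns: explicit kernel-space coordinates
k_new = rbf_kernel(X_train, x_new, gamma=0.2).ravel()
y_new = (V / np.sqrt(L)).T @ k_new         # out-of-sample coordinates

# Check: inner products of the coordinates reproduce kernel values (approximately).
print(np.abs(Y_train.T @ y_new - k_new).max())
```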

  11. 7 CFR 868.304 - Broken kernels determination.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 7 2011-01-01 2011-01-01 false Broken kernels determination. 868.304 Section 868.304 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Application of Standards § 868.304 Broken kernels determination. Broken kernels shall be determined by the use...

  12. 7 CFR 868.304 - Broken kernels determination.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 7 2010-01-01 2010-01-01 false Broken kernels determination. 868.304 Section 868.304 Agriculture Regulations of the Department of Agriculture (Continued) GRAIN INSPECTION, PACKERS AND STOCKYARD... Application of Standards § 868.304 Broken kernels determination. Broken kernels shall be determined by the use...

  13. Comparison of environmental tobacco smoke (ETS) concentrations generated by an electrically heated cigarette smoking system and a conventional cigarette.

    PubMed

    Tricker, Anthony R; Schorp, Matthias K; Urban, Hans-Jörg; Leyden, Donald; Hagedorn, Heinz-Werner; Engl, Johannes; Urban, Michael; Riedel, Kirsten; Gilch, Gerhard; Janket, Dinamis; Scherer, Gerhard

    2009-01-01

    Smoking conventional lit-end cigarettes results in exposure of nonsmokers to potentially harmful cigarette smoke constituents present in environmental tobacco smoke (ETS) generated by sidestream smoke emissions and exhaled mainstream smoke. ETS constituent concentrations generated by a conventional lit-end cigarette and a newly developed electrically heated cigarette smoking system (EHCSS) that produces only mainstream smoke and no sidestream smoke emissions were investigated in simulated "office" and "hospitality" environments with different levels of baseline indoor air quality. Smoking the EHCSS (International Organisation for Standardization yields: 5 mg tar, 0.3 mg nicotine, and 0.6 mg carbon monoxide) in simulated indoor environments resulted in significant reductions in ETS constituent concentrations compared to when smoking a representative lit-end cigarette (Marlboro: 6 mg tar, 0.5 mg nicotine, and 7 mg carbon monoxide). In direct comparisons, 24 of 29 measured smoke constituents (83%) showed mean reductions of greater than 90%, and 5 smoke constituents (17%) showed mean reductions between 80% and 90%. Gas-vapor phase ETS markers (nicotine and 3-ethenylpyridine) were reduced by an average of 97% (range 94-99%). Total respirable suspended particles, determined by online particle measurements and as gravimetric respirable suspended particles, were reduced by 90% (range 82-100%). The mean and standard deviation of the reduction of all constituents was 94 +/- 4%, indicating that smoking the new EHCSS in simulated "office" and "hospitality" indoor environments resulted in substantial reductions of ETS constituents in indoor air.

  14. Kernel learning at the first level of inference.

    PubMed

    Cawley, Gavin C; Talbot, Nicola L C

    2014-05-01

    Kernel learning methods, whether Bayesian or frequentist, typically involve multiple levels of inference, with the coefficients of the kernel expansion being determined at the first level and the kernel and regularisation parameters carefully tuned at the second level, a process known as model selection. Model selection for kernel machines is commonly performed via optimisation of a suitable model selection criterion, often based on cross-validation or theoretical performance bounds. However, if there are a large number of kernel parameters, as for instance in the case of automatic relevance determination (ARD), there is a substantial risk of over-fitting the model selection criterion, resulting in poor generalisation performance. In this paper we investigate the possibility of learning the kernel, for the Least-Squares Support Vector Machine (LS-SVM) classifier, at the first level of inference, i.e. parameter optimisation. The kernel parameters and the coefficients of the kernel expansion are jointly optimised at the first level of inference, minimising a training criterion with an additional regularisation term acting on the kernel parameters. The key advantage of this approach is that the values of only two regularisation parameters need be determined in model selection, substantially alleviating the problem of over-fitting the model selection criterion. The benefits of this approach are demonstrated using a suite of synthetic and real-world binary classification benchmark problems, where kernel learning at the first level of inference is shown to be statistically superior to the conventional approach, improves on our previous work (Cawley and Talbot, 2007) and is competitive with Multiple Kernel Learning approaches, but with reduced computational expense. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Multiple kernels learning-based biological entity relationship extraction method.

    PubMed

    Dongliang, Xu; Jingchang, Pan; Bailing, Wang

    2017-09-20

    Automatically extracting protein entity interaction information from the biomedical literature can help to build protein relation networks and to design new drugs. More than 20 million literature abstracts are included in MEDLINE, the most authoritative textual database in the field of biomedicine, and their number grows exponentially over time. This rapid expansion of the biomedical literature can be difficult to absorb or analyze manually, so efficient and automated search engines using text-mining techniques are necessary to explore it. The precision, recall, and F values of the tag graph method on the AIMed corpus are 50.82, 69.76, and 58.61%, respectively. The precision, recall, and F values of the tag graph kernel method on the other four evaluation corpora are 2-5% higher than those of the all-paths graph kernel. The precision, recall, and F values of the two methods fusing the feature kernel with the tag graph kernel are 53.43, 71.62, and 61.30% and 55.47, 70.29, and 60.37%, respectively, indicating that both kernel-fusion methods perform better than a single kernel. In comparison with the all-paths graph kernel method, the tag graph kernel method is superior in terms of overall performance. Experiments show that the performance of the multi-kernel method is better than that of the three separate single-kernel methods and of the dual-mutually-fused kernel method used herein on five corpus sets.
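
    The kernel-fusion step described above can be sketched as a weighted sum of base kernels fed to a precomputed-kernel SVM. Here random vectors stand in for the lexical-feature and tag-graph representations, and the mixing weight and kernel parameters are assumptions; a real system would compute the tag graph kernel on dependency graphs of sentences.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

rng = np.random.default_rng(0)
X_feat = rng.normal(size=(200, 30))    # hypothetical lexical-feature vectors
X_graph = rng.normal(size=(200, 30))   # stand-in for a tag-graph representation
y = rng.integers(0, 2, 200)            # interacting / not interacting labels

# Fused kernel: convex combination of the two base kernels (weight is assumed).
beta = 0.6
K = beta * linear_kernel(X_feat) + (1 - beta) * rbf_kernel(X_graph, gamma=0.05)

clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))                 # training accuracy on the toy data
```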

  16. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... generally conforms to the “light” or “light amber” classification, that color classification may be used to... 7 Agriculture 2 2013-01-01 2013-01-01 false Kernel color classification. 51.1403 Section 51.1403... Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be...

  17. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... generally conforms to the “light” or “light amber” classification, that color classification may be used to... 7 Agriculture 2 2014-01-01 2014-01-01 false Kernel color classification. 51.1403 Section 51.1403... Color Classification § 51.1403 Kernel color classification. (a) The skin color of pecan kernels may be...

  18. Evidence-based Kernels: Fundamental Units of Behavioral Influence

    PubMed Central

    Biglan, Anthony

    2008-01-01

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior–influence procedure shown through experimental analysis to affect a specific behavior and that is indivisible in the sense that removing any of its components would render it inert. Existing evidence shows that a variety of kernels can influence behavior in context, and some evidence suggests that frequent use or sufficient use of some kernels may produce longer lasting behavioral shifts. The analysis of kernels could contribute to an empirically based theory of behavioral influence, augment existing prevention or treatment efforts, facilitate the dissemination of effective prevention and treatment practices, clarify the active ingredients in existing interventions, and contribute to efficiently developing interventions that are more effective. Kernels involve one or more of the following mechanisms of behavior influence: reinforcement, altering antecedents, changing verbal relational responding, or changing physiological states directly. The paper describes 52 of these kernels, and details practical, theoretical, and research implications, including calling for a national database of kernels that influence human behavior. PMID:18712600

  19. Integrating the Gradient of the Thin Wire Kernel

    NASA Technical Reports Server (NTRS)

    Champagne, Nathan J.; Wilton, Donald R.

    2008-01-01

    A formulation for integrating the gradient of the thin wire kernel is presented. This approach employs a new expression for the gradient of the thin wire kernel derived from a recent technique for numerically evaluating the exact thin wire kernel. This approach should provide essentially arbitrary accuracy and may be used with higher-order elements and basis functions using the procedure described in [4].When the source and observation points are close, the potential integrals over wire segments involving the wire kernel are split into parts to handle the singular behavior of the integrand [1]. The singularity characteristics of the gradient of the wire kernel are different than those of the wire kernel, and the axial and radial components have different singularities. The characteristics of the gradient of the wire kernel are discussed in [2]. To evaluate the near electric and magnetic fields of a wire, the integration of the gradient of the wire kernel needs to be calculated over the source wire. Since the vector bases for current have constant direction on linear wire segments, these integrals reduce to integrals of the form

  20. THERMOS. 30-Group ENDF/B Scattered Kernels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCrosson, F.J.; Finch, D.R.

    1973-12-01

    These data are 30-group THERMOS thermal scattering kernels for P0 to P5 Legendre orders for every temperature of every material from s(alpha,beta) data stored in the ENDF/B library. These scattering kernels were generated using the FLANGE2 computer code. To test the kernels, the integral properties of each set of kernels were determined by a precision integration of the diffusion length equation and compared to experimental measurements of these properties. In general, the agreement was very good. Details of the methods used and results obtained are contained in the reference. The scattering kernels are organized into a two-volume magnetic tape library from which they may be retrieved easily for use in any 30-group THERMOS library.

  1. Cigarette Design Features: Effects on Emission Levels, User Perception, and Behavior.

    PubMed

    Talhout, Reinskje; Richter, Patricia A; Stepanov, Irina; Watson, Christina V; Watson, Clifford H

    2018-01-01

    This paper describes the effects of non-tobacco, physical cigarette design features on smoke emissions, product appeal, and smoking behaviors - 3 factors that determine smoker's exposure and related health risks. We reviewed available evidence for the impact of filter ventilation, new filter types, and cigarettes dimensions on toxic emissions, smoker's perceptions, and behavior. For evidence sources we used scientific literature and websites providing product characteristics and marketing information. Whereas filter ventilation results in lower machine-generated emissions, it also leads to perceptions of lighter taste and relative safety in smokers who can unwittingly employ more intense smoking behavior to obtain the desired amount of nicotine and sensory appeal. Filter additives that modify smoke emissions can also modify sensory cues, resulting in changes in smoking behavior. Flavor capsules increase the cigarette's appeal and novelty, and lead to misperceptions of reduced harm. Slim cigarettes have lower yields of some smoke emissions, but smoking behavior can be more intense than with standard cigarettes. Physical design features significantly impact machine-measured emission yields in cigarette smoke, product appeal, smoking behaviors, and exposures in smokers. The influence of current and emerging design features is important in understanding the effectiveness of regulatory actions to reduce smoking-related harm.

  2. Integrating chemical, toxicological and clinical research to assess the potential of reducing health risks associated with cigarette smoking through reducing toxicant emissions.

    PubMed

    McAdam, Kevin; Murphy, James; Eldridge, Alison; Meredith, Clive; Proctor, Christopher

    2018-06-01

    The concept of a risk continuum for tobacco and nicotine products has been proposed, which differentiates products according to their propensity to reduce toxicant exposure and risk. Cigarettes are deemed the most risky and medicinal nicotine the least. We assessed whether a Reduced-Toxicant Prototype (RTP) cigarette could sufficiently reduce exposure to toxicants versus conventional cigarettes to be considered a distinct category in the risk continuum. We present findings from both pre-clinical and clinical studies in order to examine the potential for reduced smoke toxicant emissions to lower health risks associated with cigarette smoking. We conclude that current toxicant-reducing technologies are unable to reduce toxicant emissions sufficiently to manifest beneficial disease-relevant changes in smokers. These findings point to a minimum toxicant exposure standard that future potentially reduced-risk products would need to meet to be considered for full biological assessment. The RTP met the WHO TobReg proposed limits on cigarette toxicant emissions; however, the absence of beneficial disease-relevant changes in smokers after six months of reduced-toxicant cigarette use does not provide evidence that these regulatory proposals will positively impact the risks of smoking-related diseases. Greater toxicant reductions, such as those that can be achieved in next-generation products (e.g., tobacco heating products and electronic cigarettes), are likely to be necessary to clearly reduce risks compared with conventional cigarettes. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  3. The Classification of Diabetes Mellitus Using Kernel k-means

    NASA Astrophysics Data System (ADS)

    Alamsyah, M.; Nafisah, Z.; Prayitno, E.; Afida, A. M.; Imah, E. M.

    2018-01-01

    Diabetes Mellitus is a metabolic disorder characterized by chronically elevated blood glucose. Automatic detection of diabetes mellitus is still challenging. This study detected diabetes mellitus by using the kernel k-means algorithm. Kernel k-means is an algorithm developed from the k-means algorithm: it uses kernel learning, which enables it to handle data that are not linearly separable, unlike common k-means. The performance of kernel k-means in detecting diabetes mellitus is also compared with the SOM algorithm. The experimental results show that kernel k-means has good performance and performs much better than SOM.
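
    As an illustration of the algorithm named in this record (not the authors' implementation), a minimal kernel k-means sketch that works directly from a precomputed Gram matrix; the RBF kernel and the toy data are assumptions for the example.

        import numpy as np

        def kernel_kmeans(K, n_clusters, n_iter=50, seed=0):
            # Cluster points given only their kernel (Gram) matrix K, using the
            # kernel trick for squared distances in feature space:
            # ||phi(x_i) - mu_c||^2 = K_ii - 2*mean_j K_ij + mean_jl K_jl  (j, l in c)
            rng = np.random.default_rng(seed)
            n = K.shape[0]
            labels = rng.integers(n_clusters, size=n)
            diag = np.diag(K)
            for _ in range(n_iter):
                dist = np.full((n, n_clusters), np.inf)
                for c in range(n_clusters):
                    members = labels == c
                    if not members.any():
                        continue
                    Kc = K[:, members]
                    dist[:, c] = diag - 2.0 * Kc.mean(axis=1) + K[np.ix_(members, members)].mean()
                new_labels = dist.argmin(axis=1)
                if np.array_equal(new_labels, labels):
                    break
                labels = new_labels
            return labels

        # Toy example with an RBF kernel on two well-separated blobs
        X = np.vstack([np.random.randn(30, 2), np.random.randn(30, 2) + 4.0])
        sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        K = np.exp(-sq_dists / 2.0)
        print(kernel_kmeans(K, n_clusters=2))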

  4. Brain tumor image segmentation using kernel dictionary learning.

    PubMed

    Jeon Lee; Seung-Jun Kim; Rong Chen; Herskovits, Edward H

    2015-08-01

    Automated brain tumor image segmentation with high accuracy and reproducibility holds great potential to enhance current clinical practice. Dictionary learning (DL) techniques have recently been applied successfully to various image processing tasks. In this work, kernel extensions of the DL approach are adopted. Both reconstructive and discriminative versions of the kernel DL technique are considered, which can efficiently incorporate multi-modal nonlinear feature mappings based on the kernel trick. Our novel discriminative kernel DL formulation allows joint learning of a task-driven kernel-based dictionary and a linear classifier using a K-SVD-type algorithm. The proposed approaches were tested using real brain magnetic resonance (MR) images of patients with high-grade glioma. The obtained preliminary performances are competitive with the state of the art. The discriminative kernel DL approach is seen to reduce computational burden without much sacrifice in performance.

  5. Development of a kernel function for clinical data.

    PubMed

    Daemen, Anneleen; De Moor, Bart

    2009-01-01

    For most diseases and examinations, clinical data such as age, gender and medical history guide clinical management, despite the rise of high-throughput technologies. To fully exploit such clinical information, appropriate modeling of relevant parameters is required. As the widely used linear kernel function has several disadvantages when applied to clinical data, we propose a new kernel function specifically developed for these data. This "clinical kernel function" more accurately represents similarities between patients. On three data sets, significantly better performances were obtained with a Least Squares Support Vector Machine based on the clinical kernel function than on the linear kernel function.

  6. Towards the Geometry of Reproducing Kernels

    NASA Astrophysics Data System (ADS)

    Galé, J. E.

    2010-11-01

    It is shown here how one is naturally led to consider a category whose objects are reproducing kernels of Hilbert spaces, and how in this way a differential geometry for such kernels may be established.

  7. Kernel-PCA data integration with enhanced interpretability

    PubMed Central

    2014-01-01

    Background Nowadays, combining the different sources of information to improve the biological knowledge available is a challenge in bioinformatics. One of the most powerful methods for integrating heterogeneous data types is the use of kernel-based methods. Kernel-based data integration approaches consist of two basic steps: first, the right kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to any dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher/lower values of the variables analyzed. Conclusions The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of biological knowledge. PMID:25032747
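
    A minimal sketch of the two-step recipe described in this record, under assumed data sources and kernel choices (the variable-projection step of the paper is not reproduced): each data set gets its own kernel, the kernels are summed, and kernel PCA is run on the combined matrix.

        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

        # Hypothetical sources measured on the same 100 samples, e.g. expression
        # values and clinical covariates (random stand-ins here).
        X_expr = np.random.randn(100, 500)
        X_clin = np.random.randn(100, 10)

        # Step 1: one kernel per data set.  Step 2: combine them (unweighted sum;
        # trace-normalizing each kernel first is a common refinement).
        K = rbf_kernel(X_expr, gamma=1.0 / X_expr.shape[1]) + linear_kernel(X_clin)

        # Kernel PCA on the combined, precomputed kernel for dimensionality reduction
        kpca = KernelPCA(n_components=2, kernel="precomputed")
        scores = kpca.fit_transform(K)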

  8. 'Herbal' but potentially hazardous: an analysis of the constituents and smoke emissions of tobacco-free waterpipe products and the air quality in the cafés where they are served.

    PubMed

    Hammal, Fadi; Chappell, Alyssa; Wild, T Cameron; Kindzierski, Warren; Shihadeh, Alan; Vanderhoek, Amanda; Huynh, Cong Khanh; Plateel, Gregory; Finegan, Barry A

    2015-05-01

    There are limited data on the composition and smoke emissions of 'herbal' shisha products and the air quality of establishments where they are smoked. Three studies of 'herbal' shisha were conducted: (1) samples of 'herbal' shisha products were chemically analysed; (2) 'herbal' and tobacco shisha were burned in a waterpipe smoking machine and mainstream and sidestream smoke analysed by standard methods and (3) the air quality of six waterpipe cafés was assessed by measurement of CO, particulate and nicotine vapour content. We found considerable variation in heavy metal content between the three products sampled, one being particularly high in lead, chromium, nickel and arsenic. A similar pattern emerged for polycyclic aromatic hydrocarbons. Smoke emission analyses indicated that toxic byproducts produced by the combustion of 'herbal' shisha were equivalent to or greater than those produced by tobacco shisha. The results of our air quality assessment demonstrated that mean PM2.5 levels and CO content were significantly higher in waterpipe establishments compared to a casino where cigarette smoking was permitted. Nicotine vapour was detected in one of the waterpipe cafés. The 'herbal' shisha products tested contained toxic trace metals and PAH levels equivalent to, or in excess of, those found in cigarettes. Their mainstream and sidestream smoke emissions contained carcinogens equivalent to, or in excess of, those of tobacco products. The content of the air in the waterpipe cafés tested was potentially hazardous. These data, in aggregate, suggest that smoking 'herbal' shisha may well be dangerous to health. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  9. Gaussian mass optimization for kernel PCA parameters

    NASA Astrophysics Data System (ADS)

    Liu, Yong; Wang, Zulin

    2011-10-01

    This paper proposes a novel kernel parameter optimization method based on Gaussian mass, which aims to overcome current brute-force parameter optimization methods in a heuristic way. Generally speaking, the choice of kernel parameter should be tightly related to the target objects, while the variance between samples, the most commonly used kernel parameter, does not capture many features of the target; this motivates the Gaussian mass. The Gaussian mass defined in this paper is invariant to rotation and translation and is capable of depicting edge, topology and shape information. Simulation results show that Gaussian mass provides a promising heuristic boost for kernel methods. On the MNIST handwriting database, the recognition rate improves by 1.6% compared with the common kernel method without Gaussian mass optimization. Several other promising directions in which Gaussian mass might help are also proposed at the end of the paper.

  10. Design of CT reconstruction kernel specifically for clinical lung imaging

    NASA Astrophysics Data System (ADS)

    Cody, Dianna D.; Hsieh, Jiang; Gladish, Gregory W.

    2005-04-01

    In this study we developed a new reconstruction kernel specifically for chest CT imaging. An experimental flat-panel CT scanner was used on large dogs to produce 'ground-truth' reference chest CT images. These dogs were also examined using a clinical 16-slice CT scanner. We concluded from the dog images acquired on the clinical scanner that the loss of subtle lung structures was due mostly to the presence of the background noise texture when using currently available reconstruction kernels. This qualitative evaluation of the dog CT images prompted the design of a new reconstruction kernel. This new kernel consisted of the combination of a low-pass and a high-pass kernel to produce a new reconstruction kernel, called the 'Hybrid' kernel. The performance of this Hybrid kernel fell between the two kernels on which it was based, as expected. This Hybrid kernel was also applied to a set of 50 patient data sets; the analysis of these clinical images is underway. We are hopeful that this Hybrid kernel will produce clinical images with an acceptable tradeoff of lung detail, reliable HU, and image noise.

  11. Quality changes in macadamia kernel between harvest and farm-gate.

    PubMed

    Walton, David A; Wallace, Helen M

    2011-02-01

    Macadamia integrifolia, Macadamia tetraphylla and their hybrids are cultivated for their edible kernels. After harvest, nuts-in-shell are partially dried on-farm and sorted to eliminate poor-quality kernels before consignment to a processor. During these operations, kernel quality may be lost. In this study, macadamia nuts-in-shell were sampled at five points of an on-farm postharvest handling chain from dehusking to the final storage silo to assess quality loss prior to consignment. Shoulder damage, weight of pieces and unsound kernel were assessed for raw kernels, and colour, mottled colour and surface damage for roasted kernels. Shoulder damage, weight of pieces and unsound kernel for raw kernels increased significantly between the dehusker and the final silo. Roasted kernels displayed a significant increase in dark colour, mottled colour and surface damage during on-farm handling. Significant loss of macadamia kernel quality occurred on a commercial farm during sorting and storage of nuts-in-shell before nuts were consigned to a processor. Nuts-in-shell should be dried as quickly as possible and on-farm handling minimised to maintain optimum kernel quality. 2010 Society of Chemical Industry.

  12. Unified heat kernel regression for diffusion, kernel smoothing and wavelets on manifolds and its application to mandible growth modeling in CT images.

    PubMed

    Chung, Moo K; Qiu, Anqi; Seo, Seongho; Vorperian, Houri K

    2015-05-01

    We present a novel kernel regression framework for smoothing scalar surface data using the Laplace-Beltrami eigenfunctions. Starting with the heat kernel constructed from the eigenfunctions, we formulate a new bivariate kernel regression framework as a weighted eigenfunction expansion with the heat kernel as the weights. The new kernel method is mathematically equivalent to isotropic heat diffusion, kernel smoothing and recently popular diffusion wavelets. The numerical implementation is validated on a unit sphere using spherical harmonics. As an illustration, the method is applied to characterize the localized growth pattern of mandible surfaces obtained in CT images between ages 0 and 20 by regressing the length of displacement vectors with respect to a surface template. Copyright © 2015 Elsevier B.V. All rights reserved.

  13. Quantum kernel applications in medicinal chemistry.

    PubMed

    Huang, Lulu; Massa, Lou

    2012-07-01

    Progress in the quantum mechanics of biological molecules is being driven by computational advances. The notion of quantum kernels can be introduced to simplify the formalism of quantum mechanics, making it especially suitable for parallel computation of very large biological molecules. The essential idea is to mathematically break large biological molecules into smaller kernels that are calculationally tractable, and then to represent the full molecule by a summation over the kernels. The accuracy of the kernel energy method (KEM) is shown by systematic application to a great variety of molecular types found in biology. These include peptides, proteins, DNA and RNA. Examples are given that explore the KEM across a variety of chemical models, and to the outer limits of energy accuracy and molecular size. KEM represents an advance in quantum biology applicable to problems in medicine and drug design.

  14. Generalization Performance of Regularized Ranking With Multiscale Kernels.

    PubMed

    Zhou, Yicong; Chen, Hong; Lan, Rushi; Pan, Zhibin

    2016-05-01

    The regularized kernel method for the ranking problem has attracted increasing attention in machine learning. Previous regularized ranking algorithms are usually based on reproducing kernel Hilbert spaces with a single kernel. In this paper, we go beyond this framework by investigating the generalization performance of regularized ranking with multiscale kernels. A novel ranking algorithm with multiscale kernels is proposed and its representer theorem is proved. We establish the upper bound of the generalization error in terms of the complexity of hypothesis spaces. It shows that the multiscale ranking algorithm can achieve satisfactory learning rates under mild conditions. Experiments demonstrate the effectiveness of the proposed method for drug discovery and recommendation tasks.

  15. Multineuron spike train analysis with R-convolution linear combination kernel.

    PubMed

    Tezuka, Taro

    2018-06-01

    A spike train kernel provides an effective way of decoding information represented by a spike train. Some spike train kernels have been extended to multineuron spike trains, which are simultaneously recorded spike trains obtained from multiple neurons. However, most of these multineuron extensions were carried out in a kernel-specific manner. In this paper, a general framework is proposed for extending any single-neuron spike train kernel to multineuron spike trains, based on the R-convolution kernel. Special subclasses of the proposed R-convolution linear combination kernel are explored. These subclasses have a smaller number of parameters and make optimization tractable when the size of data is limited. The proposed kernel was evaluated using Gaussian process regression for multineuron spike trains recorded from an animal brain. It was compared with the sum kernel and the population Spikernel, which are existing ways of decoding multineuron spike trains using kernels. The results showed that the proposed approach performs better than these kernels and also other commonly used neural decoding methods. Copyright © 2018 Elsevier Ltd. All rights reserved.
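
    As a hedged sketch of the linear-combination idea (not the paper's exact kernels or weights), the example below uses a simple positive-definite single-neuron spike train kernel, a sum of exponentials of pairwise spike-time differences, and combines it across neurons with non-negative weights; the spike times, tau and the weights are placeholders.

        import numpy as np

        def single_neuron_kernel(s, t, tau=0.05):
            # A simple positive-definite spike train kernel: sum of exponentials of
            # pairwise spike-time differences (spike trains given as arrays of times).
            if len(s) == 0 or len(t) == 0:
                return 0.0
            d = np.abs(np.subtract.outer(np.asarray(s, float), np.asarray(t, float)))
            return float(np.exp(-d / tau).sum())

        def multineuron_kernel(trains_x, trains_y, weights=None, tau=0.05):
            # Linear-combination extension to multineuron data: a weighted sum of
            # per-neuron kernels, again positive definite for non-negative weights.
            n = len(trains_x)
            w = np.ones(n) if weights is None else np.asarray(weights, float)
            return sum(w[i] * single_neuron_kernel(trains_x[i], trains_y[i], tau)
                       for i in range(n))

        # Two recordings from three neurons (spike times in seconds)
        x = [[0.01, 0.12, 0.30], [0.05], [0.20, 0.22]]
        y = [[0.02, 0.11], [0.06, 0.40], [0.19]]
        print(multineuron_kernel(x, y))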

  16. Putting Priors in Mixture Density Mercer Kernels

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2004-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn kernel functions directly from data, rather than using predefined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS). The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code. The results show that the Mixture Density Mercer Kernel described here outperforms tree-based classification in distinguishing high-redshift galaxies from low-redshift galaxies by approximately 16% on test data, bagged trees by approximately 7%, and bagged trees built on a much larger sample of data by approximately 2%.
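
    One simple kernel in the spirit of this record (a sketch, not the paper's ensemble formulation): fit a mixture density to the data and define the kernel as the inner product of component-posterior vectors, which is symmetric positive semidefinite by construction. The Gaussian mixture, its component count and the random data are assumptions.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def mixture_density_kernel(X, Y, gmm):
            # K(x, y) = inner product of the posterior membership vectors
            # P(component | x) and P(component | y); symmetric and PSD by construction.
            Px = gmm.predict_proba(X)      # shape (n_x, n_components)
            Py = gmm.predict_proba(Y)      # shape (n_y, n_components)
            return Px @ Py.T

        X = np.random.randn(200, 5)
        gmm = GaussianMixture(n_components=8, random_state=0).fit(X)
        K = mixture_density_kernel(X, X, gmm)   # data-adaptive kernel matrix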

  17. Increasing accuracy of dispersal kernels in grid-based population models

    USGS Publications Warehouse

    Slone, D.H.

    2011-01-01

    Dispersal kernels in grid-based population models specify the proportion, distance and direction of movements within the model landscape. Spatial errors in dispersal kernels can have large compounding effects on model accuracy. Circular Gaussian and Laplacian dispersal kernels at a range of spatial resolutions were investigated, and methods for minimizing errors caused by the discretizing process were explored. Kernels of progressively smaller sizes relative to the landscape grid size were calculated using cell-integration and cell-center methods. These kernels were convolved repeatedly, and the final distribution was compared with a reference analytical solution. For large Gaussian kernels (σ > 10 cells), the total kernel error was <10⁻¹¹ compared to analytical results. Using an invasion model that tracked the time a population took to reach a defined goal, the discrete model results were comparable to the analytical reference. With Gaussian kernels that had σ ≤ 0.12 using the cell-integration method, or σ ≤ 0.22 using the cell-center method, the kernel error was greater than 10%, which resulted in invasion times that were orders of magnitude different from theoretical results. A goal-seeking routine was developed to adjust the kernels to minimize overall error. With this, corrections for small kernels were found that decreased overall kernel error to <10⁻¹¹ and invasion time error to <5%.
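
    A minimal 1-D sketch of the two discretization methods compared in this record (the study itself uses 2-D circular kernels, which for a Gaussian can be built as the outer product of 1-D kernels); the grid spacing, sigma and truncation radius are placeholder values.

        import numpy as np
        from scipy.stats import norm

        def gaussian_kernel_1d(sigma, radius, cell=1.0, method="integrate"):
            # Discretize a 1-D Gaussian dispersal kernel on a grid of cells.
            # "center": evaluate the density at each cell centre (times cell width).
            # "integrate": integrate the density over each cell (difference of CDFs),
            # which is more accurate when sigma is small relative to the cell size.
            edges = np.arange(-radius - 0.5 * cell, radius + 0.5 * cell + cell, cell)
            centers = 0.5 * (edges[:-1] + edges[1:])
            if method == "center":
                k = norm.pdf(centers, scale=sigma) * cell
            else:
                k = norm.cdf(edges[1:], scale=sigma) - norm.cdf(edges[:-1], scale=sigma)
            return k / k.sum()   # renormalize so no probability mass is lost

        # For sigma comparable to the cell size the two methods diverge noticeably
        print(gaussian_kernel_1d(0.2, radius=3, method="center"))
        print(gaussian_kernel_1d(0.2, radius=3, method="integrate"))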

  18. An SVM model with hybrid kernels for hydrological time series

    NASA Astrophysics Data System (ADS)

    Wang, C.; Wang, H.; Zhao, X.; Xie, Q.

    2017-12-01

    Support Vector Machine (SVM) models have been widely applied to the forecast of climate/weather and its impact on other environmental variables such as hydrologic response to climate/weather. When using SVM, the choice of the kernel function plays the key role. Conventional SVM models mostly use one single type of kernel function, e.g., radial basis kernel function. Provided that there are several featured kernel functions available, each having its own advantages and drawbacks, a combination of these kernel functions may give more flexibility and robustness to SVM approach, making it suitable for a wide range of application scenarios. This paper presents such a linear combination of radial basis kernel and polynomial kernel for the forecast of monthly flowrate in two gaging stations using SVM approach. The results indicate significant improvement in the accuracy of predicted series compared to the approach with either individual kernel function, thus demonstrating the feasibility and advantages of such hybrid kernel approach for SVM applications.
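
    A hedged sketch of the hybrid-kernel idea for SVM regression: a convex combination of an RBF and a polynomial kernel fed to an SVM through a precomputed Gram matrix. The weight, kernel parameters and the random stand-in data are assumptions, not values from the study.

        import numpy as np
        from sklearn.svm import SVR
        from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

        def hybrid_kernel(X, Y, w=0.7, gamma=0.1, degree=2):
            # Convex combination of an RBF and a polynomial kernel; any combination
            # with non-negative weights is itself a valid (positive definite) kernel.
            return w * rbf_kernel(X, Y, gamma=gamma) + (1 - w) * polynomial_kernel(X, Y, degree=degree)

        # Hypothetical monthly predictors (e.g. lagged flow, rainfall) and flow targets
        X_train, y_train = np.random.rand(120, 4), np.random.rand(120)
        X_test = np.random.rand(12, 4)

        svr = SVR(kernel="precomputed", C=10.0, epsilon=0.01)
        svr.fit(hybrid_kernel(X_train, X_train), y_train)
        y_pred = svr.predict(hybrid_kernel(X_test, X_train))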

  19. Graph wavelet alignment kernels for drug virtual screening.

    PubMed

    Smalter, Aaron; Huan, Jun; Lushington, Gerald

    2009-06-01

    In this paper, we introduce a novel statistical modeling technique for target property prediction, with applications to virtual screening and drug design. In our method, we use graphs to model chemical structures and apply a wavelet analysis of graphs to summarize features capturing graph local topology. We design a novel graph kernel function to utilize the topology features to build predictive models for chemicals via a Support Vector Machine classifier. We call the new graph kernel a graph wavelet-alignment kernel. We have evaluated the efficacy of the wavelet-alignment kernel using a set of chemical structure-activity prediction benchmarks. Our results indicate that the use of the kernel function yields performance profiles comparable to, and sometimes exceeding, those of existing state-of-the-art chemical classification approaches. In addition, our results also show that the use of wavelet functions significantly decreases the computational cost of graph kernel computation, with a more than tenfold speedup.

  20. Small convolution kernels for high-fidelity image restoration

    NASA Technical Reports Server (NTRS)

    Reichenbach, Stephen E.; Park, Stephen K.

    1991-01-01

    An algorithm is developed for computing the mean-square-optimal values for small, image-restoration kernels. The algorithm is based on a comprehensive, end-to-end imaging system model that accounts for the important components of the imaging process: the statistics of the scene, the point-spread function of the image-gathering device, sampling effects, noise, and display reconstruction. Subject to constraints on the spatial support of the kernel, the algorithm generates the kernel values that restore the image with maximum fidelity, that is, the kernel minimizes the expected mean-square restoration error. The algorithm is consistent with the derivation of the spatially unconstrained Wiener filter, but leads to a small, spatially constrained kernel that, unlike the unconstrained filter, can be efficiently implemented by convolution. Simulation experiments demonstrate that for a wide range of imaging systems these small kernels can restore images with fidelity comparable to images restored with the unconstrained Wiener filter.

  1. Smoke Sense: Citizen Science Study on Health Risk and Health Risk Communication During Wildfire Smoke Episodes

    EPA Science Inventory

    Why do we need to communicate smoke impacts on health? Incidence and severity of large fires are increasing, and emissions from wildland fires produce air pollution that adversely impacts people's health. As emissions fr...

  2. Reduced multiple empirical kernel learning machine.

    PubMed

    Wang, Zhe; Lu, MingZhe; Gao, Daqi

    2015-02-01

    Multiple kernel learning (MKL) is demonstrated to be flexible and effective in depicting heterogeneous data sources since MKL can introduce multiple kernels rather than a single fixed kernel into applications. However, MKL incurs a high time and space complexity in contrast to single kernel learning, which is not desirable in real-world applications. Meanwhile, it is known that the kernel mapping ways of MKL generally take two forms, implicit kernel mapping and empirical kernel mapping (EKM), where the latter has received less attention. In this paper, we focus on MKL with the EKM, and propose a reduced multiple empirical kernel learning machine, named RMEKLM for short. To the best of our knowledge, this is the first work to reduce both the time and space complexity of MKL with EKM. Different from existing MKL, the proposed RMEKLM adopts the Gauss elimination technique to extract a set of feature vectors, and it is validated that doing so does not lose much information of the original feature space. RMEKLM then uses the extracted feature vectors to span a reduced orthonormal subspace of the feature space, which is visualized in terms of its geometric structure. It can be demonstrated that the spanned subspace is isomorphic to the original feature space, which means that the dot product of two vectors in the original feature space is equal to that of the two corresponding vectors in the generated orthonormal subspace. More importantly, the proposed RMEKLM brings a simpler computation and meanwhile needs less storage space, especially during testing. Finally, the experimental results show that RMEKLM offers an efficient and effective performance in terms of both complexity and classification. The contributions of this paper can be given as follows: (1) by mapping the input space into an orthonormal subspace, the geometry of the generated subspace is visualized; (2) this paper first reduces both the time and space complexity of the EKM-based MKL; (3

  3. 7 CFR 981.61 - Redetermination of kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Redetermination of kernel weight. 981.61 Section 981... GROWN IN CALIFORNIA Order Regulating Handling Volume Regulation § 981.61 Redetermination of kernel weight. The Board, on the basis of reports by handlers, shall redetermine the kernel weight of almonds...

  4. Enhanced gluten properties in soft kernel durum wheat

    USDA-ARS?s Scientific Manuscript database

    Soft kernel durum wheat is a relatively recent development (Morris et al. 2011 Crop Sci. 51:114). The soft kernel trait exerts profound effects on kernel texture, flour milling including break flour yield, milling energy, and starch damage, and dough water absorption (DWA). With the caveat of reduce...

  5. Characterizing sources of emissions from wildland fires

    Treesearch

    Roger D. Ottmar; Ana Isabel Miranda; David V. Sandberg

    2009-01-01

    Smoke emissions from wildland fire can be harmful to human health and welfare, impair visibility, and contribute to greenhouse gas emissions. The generation of emissions and heat release need to be characterized to estimate the potential impacts of wildland fire smoke. This requires explicit knowledge of the source, including size of the area burned, burn period,...

  6. Achillea millefolium L. extract mediated green synthesis of waste peach kernel shell supported silver nanoparticles: Application of the nanoparticles for catalytic reduction of a variety of dyes in water.

    PubMed

    Khodadadi, Bahar; Bordbar, Maryam; Nasrollahzadeh, Mahmoud

    2017-05-01

    In this paper, silver nanoparticles (Ag NPs) are synthesized using Achillea millefolium L. extract as reducing and stabilizing agents and peach kernel shell as an environmentally benign support. FT-IR spectroscopy, UV-Vis spectroscopy, X-ray Diffraction (XRD), Field Emission Scanning Electron Microscopy (FESEM), Energy Dispersive X-ray Spectroscopy (EDS), Thermogravimetric-Differential Thermal Analysis (TG-DTA) and Transmission Electron Microscopy (TEM) were used to characterize the peach kernel shell, Ag NPs, and Ag NPs/peach kernel shell. The catalytic activity of the Ag NPs/peach kernel shell was investigated for the reduction of 4-nitrophenol (4-NP), Methyl Orange (MO), and Methylene Blue (MB) at room temperature. Ag NPs/peach kernel shell was found to be a highly active catalyst. In addition, Ag NPs/peach kernel shell can be recovered and reused several times with no significant loss of its catalytic activity. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Accelerating the Original Profile Kernel.

    PubMed

    Hamp, Tobias; Goldberg, Tatyana; Rost, Burkhard

    2013-01-01

    One of the most accurate multi-class protein classification systems continues to be the profile-based SVM kernel introduced by the Leslie group. Unfortunately, its CPU requirements render it too slow for practical applications to large-scale classification tasks. Here, we introduce several software improvements that enable significant acceleration. Using various non-redundant data sets, we demonstrate that our new implementation reaches a maximal speed-up as high as 14-fold for calculating the same kernel matrix. Some predictions are over 200 times faster, making the kernel possibly the top contender in terms of its speed/performance trade-off. Additionally, we explain how to parallelize various computations and provide an integrative program that reduces creating a production-quality classifier to a single program call. The new implementation is available as a Debian package under a free academic license and does not depend on commercial software. For non-Debian based distributions, the source package ships with a traditional Makefile-based installer. Download and installation instructions can be found at https://rostlab.org/owiki/index.php/Fast_Profile_Kernel. Bugs and other issues may be reported at https://rostlab.org/bugzilla3/enter_bug.cgi?product=fastprofkernel.

  8. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 3 2014-04-01 2014-04-01 false Tamarind seed kernel powder. 176.350 Section 176... Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in producing, manufacturing, packing, processing, preparing, treating...

  9. 7 CFR 981.60 - Determination of kernel weight.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 8 2010-01-01 2010-01-01 false Determination of kernel weight. 981.60 Section 981.60... Regulating Handling Volume Regulation § 981.60 Determination of kernel weight. (a) Almonds for which settlement is made on kernel weight. All lots of almonds, whether shelled or unshelled, for which settlement...

  10. End-use quality of soft kernel durum wheat

    USDA-ARS?s Scientific Manuscript database

    Kernel texture is a major determinant of end-use quality of wheat. Durum wheat has very hard kernels. We developed soft kernel durum wheat via Ph1b-mediated homoeologous recombination. The Hardness locus was transferred from Chinese Spring to Svevo durum wheat via back-crossing. ‘Soft Svevo’ had SKC...

  11. Deep Restricted Kernel Machines Using Conjugate Feature Duality.

    PubMed

    Suykens, Johan A K

    2017-08-01

    The aim of this letter is to propose a theory of deep restricted kernel machines offering new foundations for deep learning with kernel machines. From the viewpoint of deep learning, it is partially related to restricted Boltzmann machines, which are characterized by visible and hidden units in a bipartite graph without hidden-to-hidden connections, and to deep learning extensions such as deep belief networks and deep Boltzmann machines. From the viewpoint of kernel machines, it includes least squares support vector machines for classification and regression, kernel principal component analysis (PCA), matrix singular value decomposition, and Parzen-type models. A key element is to first characterize these kernel machines in terms of so-called conjugate feature duality, yielding a representation with visible and hidden units. It is shown how this is related to the energy form in restricted Boltzmann machines, with continuous variables in a nonprobabilistic setting. In this new framework of so-called restricted kernel machine (RKM) representations, the dual variables correspond to hidden features. Deep RKMs are obtained by coupling the RKMs. The method is illustrated for a deep RKM consisting of three levels: a least squares support vector machine regression level and two kernel PCA levels. In its primal form, deep feedforward neural networks can also be trained within this framework.

  12. Improved modeling of clinical data with kernel methods.

    PubMed

    Daemen, Anneleen; Timmerman, Dirk; Van den Bosch, Thierry; Bottomley, Cecilia; Kirk, Emma; Van Holsbeke, Caroline; Valentin, Lil; Bourne, Tom; De Moor, Bart

    2012-02-01

    Despite the rise of high-throughput technologies, clinical data such as age, gender and medical history guide clinical management for most diseases and examinations. To improve clinical management, available patient information should be fully exploited. This requires appropriate modeling of relevant parameters. When kernel methods are used, traditional kernel functions such as the linear kernel are often applied to the set of clinical parameters. These kernel functions, however, have their disadvantages due to the specific characteristics of clinical data, which are a mix of variable types, each with its own range. We propose a new kernel function specifically adapted to the characteristics of clinical data. The clinical kernel function provides a better representation of patients' similarity by equalizing the influence of all variables and taking into account the range r of the variables. Moreover, it is robust with respect to changes in r. Incorporated in a least squares support vector machine, the new kernel function results in significantly improved diagnosis, prognosis and prediction of therapy response. This is illustrated on four clinical data sets within gynecology, with an average increase in test area under the ROC curve (AUC) of 0.023, 0.021, 0.122 and 0.019, respectively. Moreover, when combining clinical parameters and expression data in three case studies on breast cancer, results improved overall with use of the new kernel function and when considering both data types in a weighted fashion, with a larger weight assigned to the clinical parameters. The increase in AUC with respect to a standard kernel function and/or unweighted data combination was at most 0.127, 0.042 and 0.118 for the three case studies. For clinical data consisting of variables of different types, the proposed kernel function, which takes into account the type and range of each variable, has been shown to be a better alternative for linear and non-linear classification problems.
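
    A sketch of a clinical-style kernel consistent with the description above (the paper's exact definition may differ): each variable contributes a similarity normalized by its range (continuous/ordinal) or an exact-match indicator (nominal), and the contributions are averaged so that all variables carry equal weight. The example variables are hypothetical.

        import numpy as np

        def clinical_kernel(X, Y, ranges, is_nominal):
            # Average of per-variable similarities: continuous/ordinal variables use
            # (r - |x - y|) / r with r the observed range; nominal variables use an
            # exact-match (0/1) indicator.  Every variable contributes equally.
            n_var = X.shape[1]
            K = np.zeros((X.shape[0], Y.shape[0]))
            for j in range(n_var):
                diff = np.abs(X[:, [j]] - Y[:, j])
                if is_nominal[j]:
                    K += (diff == 0).astype(float)
                else:
                    K += (ranges[j] - diff) / ranges[j]
            return K / n_var

        # Hypothetical clinical variables: age (years) and a binary nominal status
        X = np.array([[54.0, 1.0], [37.0, 0.0], [63.0, 1.0]])
        ranges = np.array([X[:, 0].max() - X[:, 0].min(), 1.0])
        K = clinical_kernel(X, X, ranges, is_nominal=[False, True])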

  13. Triso coating development progress for uranium nitride kernels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jolly, Brian C.; Lindemer, Terrence; Terrani, Kurt A.

    2015-08-01

    In support of fully ceramic matrix (FCM) fuel development [1-2], coating development work is ongoing at the Oak Ridge National Laboratory (ORNL) to produce tri-structural isotropic (TRISO) coated fuel particles with UN kernels [3]. The nitride kernels are used to increase fissile density in these SiC-matrix fuel pellets, with details described elsewhere [4]. The advanced gas reactor (AGR) program at ORNL used fluidized bed chemical vapor deposition (FBCVD) techniques for TRISO coating of UCO (two-phase mixture of UO2 and UCx) kernels [5]. Similar techniques were employed for coating of the UN kernels; however, significant changes in processing conditions were required to maintain acceptable coating properties due to physical property and dimensional differences between the UCO and UN kernels (Table 1).

  14. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 3 2011-04-01 2011-04-01 false Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in producing...

  15. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 3 2012-04-01 2012-04-01 false Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in producing...

  16. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 3 2010-04-01 2009-04-01 true Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in producing...

  17. 21 CFR 176.350 - Tamarind seed kernel powder.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 3 2013-04-01 2013-04-01 false Tamarind seed kernel powder. 176.350 Section 176... Substances for Use Only as Components of Paper and Paperboard § 176.350 Tamarind seed kernel powder. Tamarind seed kernel powder may be safely used as a component of articles intended for use in producing...

  18. When smoke comes to town - effects of biomass burning smoke on air quality down under

    NASA Astrophysics Data System (ADS)

    Keywood, Melita; Cope, Martin; (C. P) Meyer, Mick; Iinuma, Yoshi; Emmerson, Kathryn

    2014-05-01

    Annually, biomass burning results in the emission of large quantities of trace gases and aerosol to the atmosphere. Biomass burning emissions have a significant effect on atmospheric chemistry due to the presence of reactive species. Biomass burning aerosols influence the radiative balance of the earth-atmosphere system directly, through the scattering and absorption of radiation, and indirectly, through their influence on cloud microphysical processes, and therefore constitute an important forcing in climate models. They also reduce visibility, influence atmospheric photochemistry and can be inhaled into the deepest parts of the lungs, so that they can have a significant effect on human health. Australia experiences bushfires on an annual basis. In most years fires are restricted to the tropical savannah forests of Northern Australia. However, in the summer of 2006/2007 (December 2006 - February 2007), South Eastern Australia was affected by the longest recorded fires in its history. During this time the State of Victoria was ravaged by 690 separate bushfires, including the major Great Divide Fire, which devastated 1,048,238 hectares over 69 days. On several occasions, thick smoke haze was transported to the Melbourne central business district and PM10 concentrations at several air quality monitoring stations peaked at over 200 µg m-3 (four times the National Environment Protection Measure PM10 24-hour standard). During this period, a comprehensive suite of air quality measurements was carried out at a location 25 km south of the Melbourne CBD, including detailed aerosol microphysical and chemical composition measurements. Here we examine the chemical and physical properties of the smoke plume as it impacted Melbourne's air shed and discuss its impact on air quality over the city. We estimate the aerosol emission rates of the source fires, the age of the plumes and investigate the transformation of the smoke as it progressed from its source to the Melbourne airshed. We

  19. Unified Heat Kernel Regression for Diffusion, Kernel Smoothing and Wavelets on Manifolds and Its Application to Mandible Growth Modeling in CT Images

    PubMed Central

    Chung, Moo K.; Qiu, Anqi; Seo, Seongho; Vorperian, Houri K.

    2014-01-01

    We present a novel kernel regression framework for smoothing scalar surface data using the Laplace-Beltrami eigenfunctions. Starting with the heat kernel constructed from the eigenfunctions, we formulate a new bivariate kernel regression framework as a weighted eigenfunction expansion with the heat kernel as the weights. The new kernel regression is mathematically equivalent to isotropic heat diffusion, kernel smoothing and recently popular diffusion wavelets. Unlike many previous partial differential equation based approaches involving diffusion, our approach represents the solution of diffusion analytically, reducing numerical inaccuracy and slow convergence. The numerical implementation is validated on a unit sphere using spherical harmonics. As an illustration, we have applied the method in characterizing the localized growth pattern of mandible surfaces obtained in CT images from subjects between ages 0 and 20 years by regressing the length of displacement vectors with respect to the template surface. PMID:25791435
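
    A minimal sketch of heat kernel smoothing as described in this record, with a graph Laplacian standing in for the Laplace-Beltrami operator of the surface mesh (an assumption for the example); the smoothed signal is the eigenfunction expansion of the data weighted by exp(-lambda*t).

        import numpy as np

        def heat_kernel_smooth(L, f, t, k=50):
            # Heat kernel smoothing of a signal f on mesh/graph vertices.  L is a
            # symmetric Laplacian approximating the Laplace-Beltrami operator; the
            # smoothed signal is sum_i exp(-lambda_i * t) <f, psi_i> psi_i using the
            # k lowest-frequency eigenpairs.
            vals, vecs = np.linalg.eigh(L)
            vals, vecs = vals[:k], vecs[:, :k]
            coeffs = vecs.T @ f                     # expansion coefficients <f, psi_i>
            return vecs @ (np.exp(-vals * t) * coeffs)

        # Toy example: ring-graph Laplacian and a noisy signal on its vertices
        n = 100
        A = np.eye(n, k=1) + np.eye(n, k=-1)
        A[0, -1] = A[-1, 0] = 1.0
        L = np.diag(A.sum(axis=1)) - A
        f = np.sin(np.linspace(0.0, 2.0 * np.pi, n)) + 0.3 * np.random.randn(n)
        smoothed = heat_kernel_smooth(L, f, t=5.0)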

  20. The Growing Public Health Impact of Wildfire Smoke Emissions Webinar

    EPA Pesticide Factsheets

    This is a brief discussion of wildfire smoke and its health effects along with tools available to provide public health guidance during wildfire events, including the Wildfire Smoke Guide for Public Health Officials

  1. A dynamic kernel modifier for linux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minnich, R. G.

    2002-09-03

    Dynamic Kernel Modifier, or DKM, is a kernel module for Linux that allows user-mode programs to modify the execution of functions in the kernel without recompiling or modifying the kernel source in any way. Functions may be traced, either function entry only or function entry and exit; nullified; or replaced with some other function. For the tracing case, function execution results in the activation of a watchpoint. When the watchpoint is activated, the address of the function is logged in a FIFO buffer that is readable by external applications. The watchpoints are time-stamped with the resolution of the processor high-resolution timers, which on most modern processors are accurate to a single processor tick. DKM is very similar to earlier systems such as the SunOS trace device or Linux TT. Unlike these two systems, and other similar systems, DKM requires no kernel modifications. DKM allows users to do initial probing of the kernel to look for performance problems, or even to resolve potential problems by turning functions off or replacing them. DKM watchpoints are not without cost: it takes about 200 nanoseconds to make a log entry on an 800 MHz Pentium-III. The overhead numbers are actually competitive with other hardware-based trace systems, although DKM has less accuracy than an In-Circuit Emulator such as the American Arium. Once the user has zeroed in on a problem, other mechanisms with a higher degree of accuracy can be used.

  2. Connecting smoke plumes to sources using Hazard Mapping System (HMS) smoke and fire location data over North America

    NASA Astrophysics Data System (ADS)

    Brey, Steven J.; Ruminski, Mark; Atwood, Samuel A.; Fischer, Emily V.

    2018-02-01

    Fires represent an air quality challenge because they are large, dynamic and transient sources of particulate matter and ozone precursors. Transported smoke can deteriorate air quality over large regions. Fire severity and frequency are likely to increase in the future, exacerbating an existing problem. Using the National Environmental Satellite, Data, and Information Service (NESDIS) Hazard Mapping System (HMS) smoke data for North America for the period 2007 to 2014, we examine a subset of fires that are confirmed to have produced sufficient smoke to warrant the initiation of a U.S. National Weather Service smoke forecast. We find that gridded HMS-analyzed fires are well correlated (r = 0.84) with emissions from the Global Fire Emissions Inventory Database 4s (GFED4s). We define a new metric, smoke hours, by linking observed smoke plumes to active fires using ensembles of forward trajectories. This work shows that the Southwest, Northwest, and Northwest Territories initiate the most air quality forecasts and produce more smoke than any other North American region by measure of the number of HYSPLIT points analyzed, the duration of those HYSPLIT points, and the total number of smoke hours produced. The average number of days with smoke plumes overhead is largest over the north-central United States. Only Alaska, the Northwest, the Southwest, and Southeast United States regions produce the majority of smoke plumes observed over their own borders. This work moves a new dataset from a daily operational setting to a research context, and it demonstrates how changes to the frequency or intensity of fires in the western United States could impact other regions.

  3. Biogenic Emission Inventory System (BEIS)

    EPA Pesticide Factsheets

    Biogenic Emission Inventory System (BEIS) estimates volatile organic compound (VOC) emissions from vegetation and nitric oxide (NO) emission from soils. Recent BEIS development has been restricted to the SMOKE system

  4. Hadamard Kernel SVM with applications for breast cancer outcome predictions.

    PubMed

    Jiang, Hao; Ching, Wai-Ki; Cheung, Wai-Shun; Hou, Wenpin; Yin, Hong

    2017-12-21

    Breast cancer is one of the leading causes of death for women. It is of great necessity to develop effective methods for breast cancer detection and diagnosis. Recent studies have focused on gene-based signatures for outcome predictions. Kernel SVM, for its discriminative power in dealing with small-sample pattern recognition problems, has attracted a lot of attention. But how to select or construct an appropriate kernel for a specified problem still needs further investigation. Here we propose a novel kernel (the Hadamard kernel) in conjunction with Support Vector Machines (SVMs) to address the problem of breast cancer outcome prediction using gene expression data. The Hadamard kernel outperforms the classical kernels and the correlation kernel in terms of Area Under the ROC Curve (AUC) values on a number of real-world data sets adopted to test the performance of the different methods. Hadamard kernel SVM is effective for breast cancer predictions, in terms of both prognosis and diagnosis. It may benefit patients by guiding therapeutic options. Apart from that, it would be a valuable addition to the current SVM kernel families. We hope it will contribute to the wider biology and related communities.

  5. Kernel Partial Least Squares for Nonlinear Regression and Discrimination

    NASA Technical Reports Server (NTRS)

    Rosipal, Roman; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper summarizes recent results on applying the method of partial least squares (PLS) in a reproducing kernel Hilbert space (RKHS). A previously proposed kernel PLS regression model was proven to be competitive with other regularized regression methods in RKHS. The family of nonlinear kernel-based PLS models is extended by considering the kernel PLS method for discrimination. Theoretical and experimental results on a two-class discrimination problem indicate usefulness of the method.
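
    A hedged sketch of NIPALS-style kernel PLS after Rosipal and Trejo (2001); the stopping rule, the (rough) kernel centering and the toy data are assumptions, and test-kernel centering is omitted for brevity.

        import numpy as np

        def kernel_pls(K, Y, n_components, n_iter=100, tol=1e-8):
            # NIPALS-style kernel PLS.  K: centered training Gram matrix (n x n);
            # Y: centered targets (n x m).  Returns scores T, Y-space weights U and
            # dual coefficients so that in-sample predictions are K @ coef.
            K0, Y0 = K.copy(), Y.copy()
            n = K.shape[0]
            T, U = [], []
            for _ in range(n_components):
                u = Y[:, [0]]
                for _ in range(n_iter):
                    t = K @ u
                    t /= np.linalg.norm(t)
                    u_new = Y @ (Y.T @ t)
                    u_new /= np.linalg.norm(u_new)
                    if np.linalg.norm(u_new - u) < tol:
                        u = u_new
                        break
                    u = u_new
                T.append(t)
                U.append(u)
                P = np.eye(n) - t @ t.T             # deflate with the new score
                K = P @ K @ P
                Y = Y - t @ (t.T @ Y)
            T, U = np.hstack(T), np.hstack(U)
            coef = U @ np.linalg.solve(T.T @ K0 @ U, T.T @ Y0)
            return T, U, coef

        # Toy usage with a linear kernel and (rough) double centering
        X = np.random.randn(80, 6)
        Y = (X[:, :1] ** 2) + 0.1 * np.random.randn(80, 1)
        K = X @ X.T
        Kc = K - K.mean(axis=0)
        Kc = Kc - Kc.mean(axis=1, keepdims=True)
        T, U, coef = kernel_pls(Kc, Y - Y.mean(axis=0), n_components=3)
        Y_fit = Kc @ coef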

  6. Aflatoxin contamination of developing corn kernels.

    PubMed

    Amer, M A

    2005-01-01

    Preharvest contamination of corn with aflatoxin is a serious problem. Some environmental and cultural factors responsible for infection and subsequent aflatoxin production were investigated in this study. Stage of growth and location of kernels on corn ears were found to be important factors in the process of kernel infection with A. flavus & A. parasiticus. The results showed a positive correlation between the stage of growth and kernel infection. Treatment of corn with aflatoxin reduced germination, protein and total nitrogen contents. Total and reducing soluble sugars increased in corn kernels in response to infection. Sucrose and protein content were reduced in the case of both pathogens. Shoot system length, seedling fresh weight and seedling dry weight were also affected. Both pathogens induced a reduction of starch content. Healthy corn seedlings treated with aflatoxin solution were badly affected. Their leaves became yellow and then turned brown with further incubation. Moreover, their total chlorophyll and protein contents showed a pronounced decrease. On the other hand, total phenolic compounds increased. Histopathological studies indicated that A. flavus & A. parasiticus could colonize corn silks and invade developing kernels. Germination of A. flavus spores occurred and hyphae spread rapidly across the silk, producing extensive growth and lateral branching. Conidiophores and conidia formed in and on the corn silk. Temperature and relative humidity greatly influenced the growth of A. flavus & A. parasiticus and aflatoxin production.

  7. 14 CFR 34.21 - Standards for exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... (New Aircraft Gas Turbine Engines) § 34.21 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each new aircraft gas turbine engine of class T8 manufactured on or after February 1, 1974...) Exhaust emission of smoke from each new aircraft gas turbine engine of class T3 manufactured on or after...

  8. 14 CFR 34.21 - Standards for exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... (New Aircraft Gas Turbine Engines) § 34.21 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each new aircraft gas turbine engine of class T8 manufactured on or after February 1, 1974...) Exhaust emission of smoke from each new aircraft gas turbine engine of class T3 manufactured on or after...

  9. Anthraquinones isolated from the browned Chinese chestnut kernels (Castanea mollissima blume)

    NASA Astrophysics Data System (ADS)

    Zhang, Y. L.; Qi, J. H.; Qin, L.; Wang, F.; Pang, M. X.

    2016-08-01

    Anthraquinones (AQS) represent a group of secondary metabolic products in plants. AQS often occur naturally in plants and microorganisms. In a previous study, we found that AQS were produced by an enzymatic browning reaction in Chinese chestnut kernels. To find out whether a non-enzymatic browning reaction in the kernels could also produce AQS, AQS were extracted from three groups of chestnut kernels: fresh kernels, non-enzymatically browned kernels, and browned kernels, and the contents of AQS were determined. High performance liquid chromatography (HPLC) and nuclear magnetic resonance (NMR) methods were used to identify two AQS compounds, rhein (1) and emodin (2). AQS were barely present in the fresh kernels, while both browned kernel groups contained a high amount of AQS. Thus, we confirmed that AQS could be produced during both enzymatic and non-enzymatic browning processes. Rhein and emodin were the main components of AQS in the browned kernels.

  10. Performance Characteristics of a Kernel-Space Packet Capture Module

    DTIC Science & Technology

    2010-03-01

    AFIT/GCO/ENG/10-03: Performance Characteristics of a Kernel-Space Packet Capture Module (thesis). The proof of concept for this research is the design, development, and comparative performance analysis of a kernel-level N2d capture module, which requires no changes to kernel code and can be used for both user-space and kernel-space capture applications in order to control the comparative performance analysis.

  11. Anatomically-Aided PET Reconstruction Using the Kernel Method

    PubMed Central

    Hutchcroft, Will; Wang, Guobao; Chen, Kevin T.; Catana, Ciprian; Qi, Jinyi

    2016-01-01

    This paper extends the kernel method that was proposed previously for dynamic PET reconstruction, to incorporate anatomical side information into the PET reconstruction model. In contrast to existing methods that incorporate anatomical information using a penalized likelihood framework, the proposed method incorporates this information in the simpler maximum likelihood (ML) formulation and is amenable to ordered subsets. The new method also does not require any segmentation of the anatomical image to obtain edge information. We compare the kernel method with the Bowsher method for anatomically-aided PET image reconstruction through a simulated data set. Computer simulations demonstrate that the kernel method offers advantages over the Bowsher method in region of interest (ROI) quantification. Additionally the kernel method is applied to a 3D patient data set. The kernel method results in reduced noise at a matched contrast level compared with the conventional ML expectation maximization (EM) algorithm. PMID:27541810

  12. Anatomically-aided PET reconstruction using the kernel method.

    PubMed

    Hutchcroft, Will; Wang, Guobao; Chen, Kevin T; Catana, Ciprian; Qi, Jinyi

    2016-09-21

    This paper extends the kernel method that was proposed previously for dynamic PET reconstruction, to incorporate anatomical side information into the PET reconstruction model. In contrast to existing methods that incorporate anatomical information using a penalized likelihood framework, the proposed method incorporates this information in the simpler maximum likelihood (ML) formulation and is amenable to ordered subsets. The new method also does not require any segmentation of the anatomical image to obtain edge information. We compare the kernel method with the Bowsher method for anatomically-aided PET image reconstruction through a simulated data set. Computer simulations demonstrate that the kernel method offers advantages over the Bowsher method in region of interest quantification. Additionally the kernel method is applied to a 3D patient data set. The kernel method results in reduced noise at a matched contrast level compared with the conventional ML expectation maximization algorithm.

  13. Anatomically-aided PET reconstruction using the kernel method

    NASA Astrophysics Data System (ADS)

    Hutchcroft, Will; Wang, Guobao; Chen, Kevin T.; Catana, Ciprian; Qi, Jinyi

    2016-09-01

    This paper extends the kernel method that was proposed previously for dynamic PET reconstruction, to incorporate anatomical side information into the PET reconstruction model. In contrast to existing methods that incorporate anatomical information using a penalized likelihood framework, the proposed method incorporates this information in the simpler maximum likelihood (ML) formulation and is amenable to ordered subsets. The new method also does not require any segmentation of the anatomical image to obtain edge information. We compare the kernel method with the Bowsher method for anatomically-aided PET image reconstruction through a simulated data set. Computer simulations demonstrate that the kernel method offers advantages over the Bowsher method in region of interest quantification. Additionally the kernel method is applied to a 3D patient data set. The kernel method results in reduced noise at a matched contrast level compared with the conventional ML expectation maximization algorithm.
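
    A toy sketch of the kernelized ML-EM idea described in these records: the image is parameterized as x = K·alpha, with K built from (here, stand-in) anatomical feature vectors, and the usual EM multiplicative update is applied to the coefficients. The 1-D system matrix, kernel width and data are placeholders, not the paper's setup.

        import numpy as np

        def kernel_mlem(A, y, K, n_iter=100):
            # Kernelized ML-EM: the image is parameterized as x = K @ alpha, where K
            # is built from anatomical-prior feature vectors of each voxel, and the
            # usual ML-EM multiplicative update is applied to alpha instead of x.
            alpha = np.ones(K.shape[1])
            sens = K.T @ (A.T @ np.ones(A.shape[0]))        # sensitivity term K'A'1
            for _ in range(n_iter):
                ybar = A @ (K @ alpha) + 1e-12               # expected counts
                alpha *= (K.T @ (A.T @ (y / ybar))) / sens   # EM update on coefficients
            return K @ alpha                                 # reconstructed image

        # Toy 1-D problem: random non-negative system matrix and a smooth kernel
        # built from a stand-in "anatomical" voxel feature.
        rng = np.random.default_rng(0)
        n_vox, n_det = 64, 200
        A = rng.random((n_det, n_vox))
        feat = np.linspace(0.0, 1.0, n_vox)[:, None]
        K = np.exp(-((feat - feat.T) ** 2) / 0.01)
        x_true = np.exp(-((np.linspace(0.0, 1.0, n_vox) - 0.5) ** 2) / 0.02)
        y = rng.poisson(A @ x_true).astype(float)
        x_hat = kernel_mlem(A, y, K)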

  14. Embedded real-time operating system micro kernel design

    NASA Astrophysics Data System (ADS)

    Cheng, Xiao-hui; Li, Ming-qiang; Wang, Xin-zheng

    2005-12-01

    Embedded systems usually require real-time behavior. Based on an 8051 microcontroller, an embedded real-time operating system micro kernel is proposed, consisting of six parts: critical section handling, task scheduling, interrupt handling, semaphore and message mailbox communication, clock management and memory management. CPU time and other resources are distributed among tasks rationally according to their importance and urgency. The design proposed here provides the position, definition, function and principle of the micro kernel. The kernel runs on the platform of an ATMEL AT89C51 microcontroller. Simulation results show that the designed micro kernel is stable and reliable and has quick response while operating in an application system.

  15. Kernel Temporal Differences for Neural Decoding

    PubMed Central

    Bae, Jihye; Sanchez Giraldo, Luis G.; Pohlmeyer, Eric A.; Francis, Joseph T.; Sanchez, Justin C.; Príncipe, José C.

    2015-01-01

    We study the feasibility and capability of the kernel temporal difference (KTD)(λ) algorithm for neural decoding. KTD(λ) is an online, kernel-based learning algorithm, which has been introduced to estimate value functions in reinforcement learning. This algorithm combines kernel-based representations with the temporal difference approach to learning. One of our key observations is that by using strictly positive definite kernels, the algorithm's convergence can be guaranteed for policy evaluation. The algorithm's nonlinear functional approximation capabilities are shown in both simulations of policy evaluation and neural decoding problems (policy improvement). KTD can handle high-dimensional neural states containing spatio-temporal information at a reasonable computational complexity, allowing real-time applications. When the algorithm seeks a proper mapping between a monkey's neural states and desired positions of a computer cursor or a robot arm, in both open-loop and closed-loop experiments, it can effectively learn the neural state to action mapping. Finally, a visualization of the coadaptation process between the decoder and the subject shows the algorithm's capabilities in reinforcement learning brain machine interfaces. PMID:25866504
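
    As a rough illustration of the approach, the following sketch implements a kernel TD(0) value update with a Gaussian (strictly positive definite) kernel; eligibility traces, sparsification and the decoding loop of KTD(λ) are omitted, and all names, parameters and the toy reward are illustrative assumptions rather than the authors' implementation.

      import numpy as np

      def gaussian_kernel(x, y, sigma=1.0):
          # Strictly positive definite kernel, the property tied to the convergence guarantee.
          return float(np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2)))

      class KernelTD:
          """Value estimator V(x) = sum_i alpha_i * k(c_i, x) updated by temporal differences."""

          def __init__(self, eta=0.1, gamma=0.9, sigma=1.0):
              self.eta, self.gamma, self.sigma = eta, gamma, sigma
              self.centers, self.alphas = [], []

          def value(self, x):
              return sum(a * gaussian_kernel(c, x, self.sigma)
                         for c, a in zip(self.centers, self.alphas))

          def update(self, x, reward, x_next):
              # TD error bootstraps on the value of the next neural state.
              delta = reward + self.gamma * self.value(x_next) - self.value(x)
              # The current state becomes a new kernel center weighted by eta * delta.
              self.centers.append(np.asarray(x, dtype=float))
              self.alphas.append(self.eta * delta)

      # Toy usage: random 4-D "neural states" with a reward tied to the first coordinate.
      rng = np.random.default_rng(0)
      td, x = KernelTD(), rng.normal(size=4)
      for _ in range(200):
          x_next = rng.normal(size=4)
          td.update(x, float(x_next[0] > 0), x_next)
          x = x_next
      print(round(td.value(x), 3))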

  16. On The Usage Of Fire Smoke Emissions In An Air Quality Forecasting System To Reduce Particulate Matter Forecasting Error

    NASA Astrophysics Data System (ADS)

    Huang, H. C.; Pan, L.; McQueen, J.; Lee, P.; ONeill, S. M.; Ruminski, M.; Shafran, P.; DiMego, G.; Huang, J.; Stajner, I.; Upadhayay, S.; Larkin, N. K.

    2016-12-01

    Wildfires contribute to air quality problems not only through primary emissions of particulate matter (PM) but also through emitted ozone precursor gases that can lead to elevated ozone concentrations. Wildfires are unpredictable and can be ignited by natural causes such as lightning or accidentally by negligent human behavior such as a discarded lit cigarette. Although wildfire impacts on air quality can be studied by collecting fire information after events, it is extremely difficult to predict the future occurrence and behavior of wildfires for real-time air quality forecasts. Because of the time constraints of operational air quality forecasting, assumptions about the next day's fire behavior often have to be made based on fire information observed in the past. The United States (U.S.) NOAA/NWS built the National Air Quality Forecast Capability (NAQFC) based on the U.S. EPA CMAQ to provide air quality forecast guidance (prediction) publicly. State and local forecasters use the forecast guidance to issue air quality alerts in their areas. The NAQFC fine particulate (PM2.5) prediction includes emissions from anthropogenic and biogenic sources, as well as natural sources such as dust storms and fires. The fire emission input to the NAQFC is derived from the NOAA NESDIS HMS fire and smoke detection product and the emission module of the US Forest Service BlueSky Smoke Modeling Framework. This study focuses on the error estimation of NAQFC PM2.5 predictions resulting from fire emissions. Comparisons between the NAQFC modeled PM2.5 and the EPA AirNow surface observations show that the present operational NAQFC fire emissions assumption can lead to large errors in PM2.5 prediction, as fire emissions are sometimes placed at the wrong location and time. This PM2.5 prediction error can propagate from the fire source in the Northwest U.S. to downstream areas as far as the Southeast U.S. From this study, a new procedure has been identified to minimize the aforementioned error. An additional 24 hours

  17. Online selective kernel-based temporal difference learning.

    PubMed

    Chen, Xingguo; Gao, Yang; Wang, Ruili

    2013-12-01

    In this paper, an online selective kernel-based temporal difference (OSKTD) learning algorithm is proposed to deal with large scale and/or continuous reinforcement learning problems. OSKTD includes two online procedures: online sparsification and parameter updating for the selective kernel-based value function. A new sparsification method (i.e., a kernel distance-based online sparsification method) is proposed based on selective ensemble learning, which is computationally less complex compared with other sparsification methods. With the proposed sparsification method, the sparsified dictionary of samples is constructed online by checking if a sample needs to be added to the sparsified dictionary. In addition, based on local validity, a selective kernel-based value function is proposed to select the best samples from the sample dictionary for the selective kernel-based value function approximator. The parameters of the selective kernel-based value function are iteratively updated by using the temporal difference (TD) learning algorithm combined with the gradient descent technique. The complexity of the online sparsification procedure in the OSKTD algorithm is O(n). In addition, two typical experiments (Maze and Mountain Car) are used to compare with both traditional and up-to-date O(n) algorithms (GTD, GTD2, and TDC using the kernel-based value function), and the results demonstrate the effectiveness of our proposed algorithm. In the Maze problem, OSKTD converges to an optimal policy and converges faster than both traditional and up-to-date algorithms. In the Mountain Car problem, OSKTD converges, requires less computation time compared with other sparsification methods, reaches a better local optimum than the traditional algorithms, and converges much faster than the up-to-date algorithms. In addition, OSKTD can reach a competitive ultimate optimum compared with the up-to-date algorithms.
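
    A minimal sketch of the kernel-distance-based online sparsification idea: a new sample enters the dictionary only when its feature-space distance to every stored sample exceeds a threshold, which keeps the per-sample check O(n). The kernel, threshold and names are illustrative assumptions, not the OSKTD implementation.

      import numpy as np

      def rbf(x, y, sigma=1.0):
          return float(np.exp(-np.linalg.norm(x - y) ** 2 / (2.0 * sigma ** 2)))

      def kernel_distance_sq(x, y, sigma=1.0):
          # Squared distance between feature-space images: k(x,x) - 2 k(x,y) + k(y,y).
          return rbf(x, x, sigma) - 2.0 * rbf(x, y, sigma) + rbf(y, y, sigma)

      def maybe_add(dictionary, x, mu=0.5, sigma=1.0):
          """Append x only if it is sufficiently far (in kernel distance) from all stored samples."""
          if all(kernel_distance_sq(x, d, sigma) > mu for d in dictionary):
              dictionary.append(np.asarray(x, dtype=float))

      rng = np.random.default_rng(1)
      dictionary = []
      for _ in range(500):
          maybe_add(dictionary, rng.normal(size=2))
      print(len(dictionary))   # the dictionary is much smaller than the 500 observed samples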

  18. Influence of wheat kernel physical properties on the pulverizing process.

    PubMed

    Dziki, Dariusz; Cacak-Pietrzak, Grażyna; Miś, Antoni; Jończyk, Krzysztof; Gawlik-Dziki, Urszula

    2014-10-01

    The physical properties of wheat kernels were determined and related to pulverizing performance by correlation analysis. Nineteen samples of wheat cultivars with a similar level of protein content (11.2-12.8 % w.b.) obtained from an organic farming system were used for analysis. The kernels (moisture content 10 % w.b.) were pulverized using a laboratory hammer mill equipped with a 1.0 mm round-hole screen. The specific grinding energy ranged from 120 kJ kg(-1) to 159 kJ kg(-1). On the basis of the data obtained, many significant correlations (p < 0.05) were found between wheat kernel physical properties and the pulverizing process; in particular, the wheat kernel hardness index (obtained with the Single Kernel Characterization System) and vitreousness correlated significantly and positively with the grinding energy indices and the mass fraction of coarse particles (> 0.5 mm). Among the kernel mechanical properties determined by the uniaxial compression test, only the rupture force was correlated with the impact grinding results. The results also showed positive and significant relationships between kernel ash content and grinding energy requirements. On the basis of the wheat physical properties, a multiple linear regression was proposed for predicting the average particle size of the pulverized kernels.

  19. Kernel-based Linux emulation for Plan 9.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minnich, Ronald G.

    2010-09-01

    CNKemu is a kernel-based system for the 9k variant of the Plan 9 kernel. It is designed to provide transparent binary support for programs compiled for IBM's Compute Node Kernel (CNK) on the Blue Gene series of supercomputers. This support allows users to build applications with the standard Blue Gene toolchain, including C++ and Fortran compilers. While the CNK is not Linux, IBM designed the CNK so that the user interface has much in common with the Linux 2.0 system call interface. The Plan 9 CNK emulator hence provides the foundation of kernel-based Linux system call support on Plan 9. In this paper we discuss CNKemu's implementation and some of its more interesting features, such as the ability to easily intermix Plan 9 and Linux system calls.

  20. Gradient-based adaptation of general gaussian kernels.

    PubMed

    Glasmachers, Tobias; Igel, Christian

    2005-10-01

    Gradient-based optimization of Gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant-trace subspace, the kernel size can be controlled. This is, for example, useful to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard margin support vector machines on toy data.
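
    A sketch of the parameterization described above, assuming the general Gaussian kernel k(x, y) = exp(-(x - y)^T A (x - y)) with A obtained from a symmetric matrix B through the matrix exponential, and the trace of B held fixed to control the kernel size; the gradient step itself is omitted and the names are illustrative.

      import numpy as np
      from scipy.linalg import expm

      def general_gaussian_kernel(x, y, B):
          """k(x, y) = exp(-(x - y)^T A (x - y)), with A = expm(B) and B symmetric."""
          A = expm(B)                     # the exponential map keeps A symmetric positive definite
          d = x - y
          return float(np.exp(-d @ A @ d))

      def project_to_constant_trace(B, trace=2.0):
          # Fixing trace(B) fixes det(A) = exp(trace(B)), i.e. the overall kernel "size".
          n = B.shape[0]
          return B + (trace - np.trace(B)) / n * np.eye(n)

      rng = np.random.default_rng(0)
      B = rng.normal(size=(2, 2))
      B = project_to_constant_trace(0.5 * (B + B.T))      # symmetrize, then fix the trace
      print(round(general_gaussian_kernel(np.array([1.0, 0.0]), np.array([0.0, 1.0]), B), 4))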

  1. Gabor-based kernel PCA with fractional power polynomial models for face recognition.

    PubMed

    Liu, Chengjun

    2004-05-01

    This paper presents a novel Gabor-based kernel Principal Component Analysis (PCA) method by integrating the Gabor wavelet representation of face images and the kernel PCA method for face recognition. Gabor wavelets first derive desirable facial features characterized by spatial frequency, spatial locality, and orientation selectivity to cope with the variations due to illumination and facial expression changes. The kernel PCA method is then extended to include fractional power polynomial models for enhanced face recognition performance. A fractional power polynomial, however, does not necessarily define a kernel function, as it might not define a positive semidefinite Gram matrix. Note that the sigmoid kernels, one of the three classes of widely used kernel functions (polynomial kernels, Gaussian kernels, and sigmoid kernels), do not actually define a positive semidefinite Gram matrix either. Nevertheless, the sigmoid kernels have been successfully used in practice, such as in building support vector machines. In order to derive real kernel PCA features, we apply only those kernel PCA eigenvectors that are associated with positive eigenvalues. The feasibility of the Gabor-based kernel PCA method with fractional power polynomial models has been successfully tested on both frontal and pose-angled face recognition, using two data sets from the FERET database and the CMU PIE database, respectively. The FERET data set contains 600 frontal face images of 200 subjects, while the PIE data set consists of 680 images across five poses (left and right profiles, left and right half profiles, and frontal view) with two different facial expressions (neutral and smiling) of 68 subjects. The effectiveness of the Gabor-based kernel PCA method with fractional power polynomial models is shown in terms of both absolute performance indices and comparative performance against the PCA method, the kernel PCA method with polynomial kernels, the kernel PCA method with fractional power
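
    A sketch of kernel PCA with a fractional power polynomial similarity, retaining only eigenvectors with positive eigenvalues because such a function need not yield a positive semidefinite Gram matrix; the sign-preserving power, the omission of the Gabor step, and all names are illustrative assumptions.

      import numpy as np

      def frac_poly_gram(X, d=0.8):
          # Sign-preserving fractional power of inner products; not guaranteed positive semidefinite.
          G = X @ X.T
          return np.sign(G) * np.abs(G) ** d

      def kernel_pca(X, d=0.8, n_components=2):
          K = frac_poly_gram(X, d)
          n = K.shape[0]
          J = np.eye(n) - np.ones((n, n)) / n
          Kc = J @ K @ J                                  # double-center the Gram matrix
          w, V = np.linalg.eigh(Kc)
          keep = w > 1e-10                                # only positive eigenvalues give real features
          w, V = w[keep][::-1], V[:, keep][:, ::-1]       # sort descending
          return V[:, :n_components] * np.sqrt(w[:n_components])

      rng = np.random.default_rng(0)
      X = rng.normal(size=(50, 10))                       # stand-in for Gabor feature vectors
      print(kernel_pca(X).shape)                          # (50, 2)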

  2. Use of Volatile Tracers to Determine the Contribution ofEnvironment Tobacco Smoke to Concentrations of Volatile Organic Compoundsin Smoking Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hodgson, A.T.; Daisey, J.M.; Alevantis, L.E.

    Three volatile nitrogen-containing compounds, 3-ethenylpyridine (3-EP), pyridine and pyrrole, were investigated as potential tracers for determining the contribution of environmental tobacco smoke (ETS) to concentrations of volatile organic compounds (VOCs) in indoor environments with smoking. The source emission rates of the three tracers and ten selected VOCs in ETS were first measured in a room-size environmental chamber for a market-weighted selection of six commercial cigarettes. The ratios of the emission rates of the tracers to the emission rates of the selected VOCs were calculated and compared among the six brands. The utility of the tracers was then evaluated in a field study conducted in five office buildings. Samples for VOCs were collected in designated smoking areas and adjoining non-smoking areas, air change rates were measured, and smoking rates were documented. Concentrations of the three tracers in the smoking areas were calculated using a mass-balance model and compared to their measured concentrations. Based on this comparison, 3-EP was selected as the most suitable tracer for the volatile components of ETS, although pyrrole is also potentially useful. Using 3-EP as the tracer, the contributions of ETS to the measured concentrations of the selected VOCs in the smoking areas were estimated by apportionment. ETS was estimated to contribute 57 to 84 percent (4.1 to 26 µg m(-3)) of the formaldehyde concentrations, 44 to 69 percent (0.9 to 5.8 µg m(-3)) of the 2-butanone concentrations, 37 to 58 percent (1.3 to 8.2 µg m(-3)) of the benzene concentrations, and 20 to 69 percent (0.5 to 3.0 µg m(-3)) of the styrene concentrations. The fractional contributions of ETS to the concentrations of acetone, toluene, ethylbenzene, xylene isomers and d-limonene were all less than 50 percent.

  3. Spectrofluorimetric determination of melatonin in kernels of four different Pistacia varieties after ultrasound-assisted solid-liquid extraction.

    PubMed

    Oladi, Elham; Mohamadi, Maryam; Shamspur, Tayebeh; Mostafavi, Ali

    2014-11-11

    Melatonin is normally consumed to regulate the body's biological cycle. However, it also has therapeutic properties, such as anti-tumor and anti-aging effects, and it protects the immune system. There are some reports on the presence of melatonin in edible kernels such as walnuts, but the extraction of melatonin from pistachio kernels is reported here for the first time. For this, the methanolic extract of pistachio kernels was subjected to gas chromatography/mass spectrometry analysis, which confirmed the presence of melatonin. A fluorescence-based method was applied for the determination of melatonin in different extracts. When excited at λ=275 nm, the fluorescence emission intensity of melatonin was measured at λ=366 nm. Ultrasound-assisted solid-liquid extraction was used for the extraction of melatonin from pistachio kernels prior to fluorimetric determination. To achieve the highest extraction recovery, the main parameters affecting the extraction efficiency, such as extracting solvent type and volume, temperature, sonication time and pH, were evaluated. Under the optimized conditions, a linear dependence of fluorescence intensity on melatonin concentration was observed in the range of 0.0040-0.160 μg mL(-1), with a detection limit of 0.0036 μg mL(-1). This method was applied successfully for measuring and comparing the melatonin content in the kernels of four different varieties of Pistacia including Ahmad Aghaei, Akbari, Kalle Qouchi and Fandoghi. In addition, the results obtained were compared with those obtained using GC/MS. A good agreement was observed, indicating the reliability of the proposed method. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Spectrofluorimetric determination of melatonin in kernels of four different Pistacia varieties after ultrasound-assisted solid-liquid extraction

    NASA Astrophysics Data System (ADS)

    Oladi, Elham; Mohamadi, Maryam; Shamspur, Tayebeh; Mostafavi, Ali

    2014-11-01

    Melatonin is normally consumed to regulate the body's biological cycle. However, it also has therapeutic properties, such as anti-tumor and anti-aging effects, and it protects the immune system. There are some reports on the presence of melatonin in edible kernels such as walnuts, but the extraction of melatonin from pistachio kernels is reported here for the first time. For this, the methanolic extract of pistachio kernels was subjected to gas chromatography/mass spectrometry analysis, which confirmed the presence of melatonin. A fluorescence-based method was applied for the determination of melatonin in different extracts. When excited at λ = 275 nm, the fluorescence emission intensity of melatonin was measured at λ = 366 nm. Ultrasound-assisted solid-liquid extraction was used for the extraction of melatonin from pistachio kernels prior to fluorimetric determination. To achieve the highest extraction recovery, the main parameters affecting the extraction efficiency, such as extracting solvent type and volume, temperature, sonication time and pH, were evaluated. Under the optimized conditions, a linear dependence of fluorescence intensity on melatonin concentration was observed in the range of 0.0040-0.160 μg mL-1, with a detection limit of 0.0036 μg mL-1. This method was applied successfully for measuring and comparing the melatonin content in the kernels of four different varieties of Pistacia including Ahmad Aghaei, Akbari, Kalle Qouchi and Fandoghi. In addition, the results obtained were compared with those obtained using GC/MS. A good agreement was observed, indicating the reliability of the proposed method.

  5. Genetic dissection of the maize kernel development process via conditional QTL mapping for three developing kernel-related traits in an immortalized F2 population.

    PubMed

    Zhang, Zhanhui; Wu, Xiangyuan; Shi, Chaonan; Wang, Rongna; Li, Shengfei; Wang, Zhaohui; Liu, Zonghua; Xue, Yadong; Tang, Guiliang; Tang, Jihua

    2016-02-01

    Kernel development is an important dynamic trait that determines the final grain yield in maize. To dissect the genetic basis of maize kernel development process, a conditional quantitative trait locus (QTL) analysis was conducted using an immortalized F2 (IF2) population comprising 243 single crosses at two locations over 2 years. Volume (KV) and density (KD) of dried developing kernels, together with kernel weight (KW) at different developmental stages, were used to describe dynamic changes during kernel development. Phenotypic analysis revealed that final KW and KD were determined at DAP22 and KV at DAP29. Unconditional QTL mapping for KW, KV and KD uncovered 97 QTLs at different kernel development stages, of which qKW6b, qKW7a, qKW7b, qKW10b, qKW10c, qKV10a, qKV10b and qKV7 were identified under multiple kernel developmental stages and environments. Among the 26 QTLs detected by conditional QTL mapping, conqKW7a, conqKV7a, conqKV10a, conqKD2, conqKD7 and conqKD8a were conserved between the two mapping methodologies. Furthermore, most of these QTLs were consistent with QTLs and genes for kernel development/grain filling reported in previous studies. These QTLs probably contain major genes associated with the kernel development process, and can be used to improve grain yield and quality through marker-assisted selection.

  6. Determination of tobacco smoking influence on volatile organic compounds constituent by indoor tobacco smoking simulation experiment

    NASA Astrophysics Data System (ADS)

    Xie, Juexin; Wang, Xingming; Sheng, Guoying; Bi, Xinhui; Fu, Jiamo

    A tobacco smoking simulation experiment was conducted in a test room under different conditions, such as cigarette brand, number of cigarettes smoked, and post-smoke decay in forced-ventilation or closed indoor environments. Thirty-seven chemical species were targeted and monitored, including volatile organic compounds (VOCs) and environmental tobacco smoke (ETS) markers. The results indicate that benzene, d-limonene, styrene, m-ethyltoluene and 1,2,4/1,3,5-trimethylbenzene correlate well with ETS markers, but toluene, xylene, and ethylbenzene are not clearly correlated with ETS markers because there are some potential indoor sources of these compounds. 2,5-Dimethylfuran is considered to be a better ETS marker due to its relative stability across different cigarette brands and its good relationship with other ETS markers. The VOC concentrations emitted by tobacco smoking were linearly associated with the number of cigarettes consumed, and different behaviors were observed in the closed indoor environment, in which ETS markers, d-limonene, styrene, trimethylbenzene, etc. decayed quickly, whereas benzene, toluene, xylene, ethylbenzene, etc. decayed slowly and even increased during the initial periods of the decay; hence ETS exposure in closed environments is believed to be more dangerous. The VOC concentrations and the relative percentage composition of ETS markers of different brand cigarette emissions vary widely, but the relative percentage composition of ETS markers for emissions of the same brand of cigarette is similar.

  7. Emission of trace gases and organic components in smoke particles from a wildfire in a mixed-evergreen forest in Portugal.

    PubMed

    Alves, Célia A; Vicente, Ana; Monteiro, Cristina; Gonçalves, Cátia; Evtyugina, Margarita; Pio, Casimiro

    2011-03-15

    In May 2009, both the gas and particulate fractions of smoke from a wildfire in Sever do Vouga, central Portugal, were sampled. Total hydrocarbons and carbon oxides (CO(2) and CO) were measured using automatic analysers with flame ionisation and non-dispersive infrared detectors, respectively. Fine (PM(2.5)) and coarse (PM(2.5-10)) particles from the smoke plume were analysed by a thermal-optical transmission technique to determine the elemental and organic carbon (EC and OC) content. Subsequently, the particle samples were solvent extracted and fractionated by vacuum flash chromatography into different classes of organic compounds. The detailed organic speciation was performed by gas chromatography-mass spectrometry. The CO, CO(2) and total hydrocarbon emission factors (g kg(-1) dry fuel) were 170 ± 83, 1485 ± 147, and 9.8 ± 0.90, respectively. It was observed that the particulate matter and OC emissions are significantly enhanced under smouldering fire conditions. The aerosol emissions were dominated by fine particles whose mass was mainly composed of organic constituents, such as degradation products from biopolymers (e.g. levoglucosan from cellulose, methoxyphenols from lignin). The compound classes also included homologous series (n-alkanes, n-alkenes, n-alkanoic acids and n-alkanols), monosaccharide derivatives from cellulose, steroid and terpenoid biomarkers, and polycyclic aromatic hydrocarbons (PAHs). The most abundant PAH was retene. Even carbon number homologs of monoglycerides were identified for the first time as biomarkers in biomass burning aerosols. Copyright © 2010 Elsevier B.V. All rights reserved.

  8. A trace ratio maximization approach to multiple kernel-based dimensionality reduction.

    PubMed

    Jiang, Wenhao; Chung, Fu-lai

    2014-01-01

    Most dimensionality reduction techniques are based on one metric or one kernel, hence it is necessary to select an appropriate kernel for kernel-based dimensionality reduction. Multiple kernel learning for dimensionality reduction (MKL-DR) has been recently proposed to learn a kernel from a set of base kernels which are seen as different descriptions of data. As MKL-DR does not involve regularization, it might be ill-posed under some conditions and consequently its applications are hindered. This paper proposes a multiple kernel learning framework for dimensionality reduction based on regularized trace ratio, termed as MKL-TR. Our method aims at learning a transformation into a space of lower dimension and a corresponding kernel from the given base kernels among which some may not be suitable for the given data. The solutions for the proposed framework can be found based on trace ratio maximization. The experimental results demonstrate its effectiveness in benchmark datasets, which include text, image and sound datasets, for supervised, unsupervised as well as semi-supervised settings. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. A Kernel-based Lagrangian method for imperfectly-mixed chemical reactions

    NASA Astrophysics Data System (ADS)

    Schmidt, Michael J.; Pankavich, Stephen; Benson, David A.

    2017-05-01

    Current Lagrangian (particle-tracking) algorithms used to simulate diffusion-reaction equations must employ a certain number of particles to properly emulate the system dynamics, particularly for imperfectly-mixed systems. The number of particles is tied to the statistics of the initial concentration fields of the system at hand. Systems with shorter-range correlation and/or smaller concentration variance require more particles, potentially limiting the computational feasibility of the method. For the well-known problem of bimolecular reaction, we show that using kernel-based, rather than Dirac delta, particles can significantly reduce the required number of particles. We derive the fixed width of a Gaussian kernel for a given reduced number of particles that analytically eliminates the error between kernel and Dirac solutions at any specified time. We also show how to solve for the fixed kernel size by minimizing the squared differences between solutions over any given time interval. Numerical results show that the width of the kernel should be kept below about 12% of the domain size, and that the analytic equations used to derive kernel width suffer significantly from the neglect of higher-order moments. The simulations with a kernel width given by least squares minimization perform better than those made to match at one specific time. A heuristic time-variable kernel size, based on the previous results, performs on par with the least squares fixed kernel size.
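
    The sketch below illustrates the basic ingredient of such a scheme: the co-location density of two one-dimensional Gaussian-kernel particles, which can be turned into a pairwise reaction probability for A + B -> C. The specific probability expression, the 1-D setting and all names are illustrative assumptions, not the authors' derivation.

      import numpy as np

      def colocation_density(separation, h):
          """Co-location density of two 1-D Gaussian kernel particles of bandwidth h:
          the convolution of two N(0, h**2) kernels is N(0, 2 * h**2)."""
          var = 2.0 * h ** 2
          return np.exp(-separation ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

      def reaction_probability(separation, h, kf, dt, mass):
          # Illustrative A-B pair reaction probability over a time step dt, clipped to [0, 1].
          return float(np.clip(kf * dt * mass * colocation_density(separation, h), 0.0, 1.0))

      print(round(reaction_probability(separation=0.05, h=0.1, kf=1.0, dt=0.1, mass=1.0), 4))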

  10. Testing in Microbiome-Profiling Studies with MiRKAT, the Microbiome Regression-Based Kernel Association Test

    PubMed Central

    Zhao, Ni; Chen, Jun; Carroll, Ian M.; Ringel-Kulka, Tamar; Epstein, Michael P.; Zhou, Hua; Zhou, Jin J.; Ringel, Yehuda; Li, Hongzhe; Wu, Michael C.

    2015-01-01

    High-throughput sequencing technology has enabled population-based studies of the role of the human microbiome in disease etiology and exposure response. Distance-based analysis is a popular strategy for evaluating the overall association between microbiome diversity and outcome, wherein the phylogenetic distance between individuals’ microbiome profiles is computed and tested for association via permutation. Despite their practical popularity, distance-based approaches suffer from important challenges, especially in selecting the best distance and extending the methods to alternative outcomes, such as survival outcomes. We propose the microbiome regression-based kernel association test (MiRKAT), which directly regresses the outcome on the microbiome profiles via the semi-parametric kernel machine regression framework. MiRKAT allows for easy covariate adjustment and extension to alternative outcomes while non-parametrically modeling the microbiome through a kernel that incorporates phylogenetic distance. It uses a variance-component score statistic to test for the association with analytical p value calculation. The model also allows simultaneous examination of multiple distances, alleviating the problem of choosing the best distance. Our simulations demonstrated that MiRKAT provides correctly controlled type I error and adequate power in detecting overall association. “Optimal” MiRKAT, which considers multiple candidate distances, is robust in that it suffers from little power loss in comparison to when the best distance is used and can achieve tremendous power gain in comparison to when a poor distance is chosen. Finally, we applied MiRKAT to real microbiome datasets to show that microbial communities are associated with smoking and with fecal protease levels after confounders are controlled for. PMID:25957468
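
    As a rough illustration of the kernel machine score test underlying MiRKAT, the sketch below computes Q = r'Kr on covariate-adjusted residuals and approximates its null distribution by permuting the residuals; MiRKAT itself uses an analytical p value and kernels built from phylogenetic distances, so the linear kernel and the permutation step here are simplifying assumptions.

      import numpy as np

      def residuals(y, X):
          X1 = np.column_stack([np.ones(len(y)), X])          # covariates plus intercept
          beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
          return y - X1 @ beta

      def kernel_score_test(y, X, K, n_perm=999, seed=0):
          """Variance-component score statistic Q = r' K r with a residual-permutation p value."""
          rng = np.random.default_rng(seed)
          r = residuals(y, X)
          q_obs = float(r @ K @ r)
          exceed = 0
          for _ in range(n_perm):
              rp = rng.permutation(r)
              exceed += float(rp @ K @ rp) >= q_obs
          return q_obs, (1 + exceed) / (n_perm + 1)

      rng = np.random.default_rng(1)
      n = 60
      profiles = rng.dirichlet(np.ones(20), size=n)            # stand-in microbiome profiles
      K = profiles @ profiles.T                                # simple linear kernel on profiles
      X = rng.normal(size=(n, 2))                              # covariates to adjust for
      y = rng.normal(size=n)                                   # continuous outcome
      print(kernel_score_test(y, X, K))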

  11. Detection of maize kernels breakage rate based on K-means clustering

    NASA Astrophysics Data System (ADS)

    Yang, Liang; Wang, Zhuo; Gao, Lei; Bai, Xiaoping

    2017-04-01

    In order to optimize the recognition accuracy of maize kernel breakage detection and to improve detection efficiency, this paper uses computer vision technology to detect maize kernel breakage based on the K-means clustering algorithm. First, the collected RGB images are converted into Lab images, and then the clarity of the original images is evaluated with the energy function of the Sobel 8 gradient. Finally, maize kernel breakage is detected using different pixel acquisition equipment and different shooting angles. In this paper, broken maize kernels are identified by the color difference between intact kernels and broken kernels. The original image clarity evaluation and the different shooting angles are used to verify that the clarity and shooting angles of the images have a direct influence on the feature extraction. The results show that the K-means clustering algorithm can distinguish broken maize kernels effectively.
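
    A sketch of the pixel-clustering step under the assumptions that scikit-image and scikit-learn are available: the RGB image is converted to Lab space, pixels are grouped by K-means, and a Sobel-based energy serves as the clarity score; the cluster count, the clarity measure and the synthetic test image are illustrative, not the paper's pipeline.

      import numpy as np
      from skimage import color, filters
      from sklearn.cluster import KMeans

      def segment_kernel_image(rgb_image, n_clusters=3, seed=0):
          """Cluster pixels of an RGB maize-kernel image in Lab colour space."""
          lab = color.rgb2lab(rgb_image)                   # RGB -> Lab conversion
          labels = KMeans(n_clusters=n_clusters, n_init=10,
                          random_state=seed).fit_predict(lab.reshape(-1, 3))
          return labels.reshape(rgb_image.shape[:2])       # per-pixel cluster map

      def sobel_clarity(gray_image):
          # Mean Sobel gradient energy as a simple image-clarity score.
          return float(np.mean(filters.sobel(gray_image) ** 2))

      rng = np.random.default_rng(0)
      img = rng.random((64, 64, 3))                        # random noise stands in for a photograph
      print(segment_kernel_image(img).shape, round(sobel_clarity(color.rgb2gray(img)), 6))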

  12. Image quality of mixed convolution kernel in thoracic computed tomography.

    PubMed

    Neubauer, Jakob; Spira, Eva Maria; Strube, Juliane; Langer, Mathias; Voss, Christian; Kotter, Elmar

    2016-11-01

    The mixed convolution kernel alters its properties spatially according to the depicted organ structure, especially for the lung. Therefore, we compared the image quality of the mixed convolution kernel to standard soft and hard kernel reconstructions for different organ structures in thoracic computed tomography (CT) images. Our Ethics Committee approved this prospective study. In total, 31 patients who underwent contrast-enhanced thoracic CT studies were included after informed consent. Axial reconstructions were performed with hard, soft, and mixed convolution kernel. Three independent and blinded observers rated the image quality according to the European Guidelines for Quality Criteria of Thoracic CT for 13 organ structures. The observers rated the depiction of the structures in all reconstructions on a 5-point Likert scale. Statistical analysis was performed with the Friedman Test and post hoc analysis with the Wilcoxon rank-sum test. Compared to the soft convolution kernel, the mixed convolution kernel was rated with a higher image quality for lung parenchyma, segmental bronchi, and the border between the pleura and the thoracic wall (P < 0.03). Compared to the hard convolution kernel, the mixed convolution kernel was rated with a higher image quality for aorta, anterior mediastinal structures, paratracheal soft tissue, hilar lymph nodes, esophagus, pleuromediastinal border, large and medium-sized pulmonary vessels and abdomen (P < 0.004) but a lower image quality for trachea, segmental bronchi, lung parenchyma, and skeleton (P < 0.001). The mixed convolution kernel cannot fully substitute the standard CT reconstructions. Hard and soft convolution kernel reconstructions still seem to be mandatory for thoracic CT.

  13. Coupling individual kernel-filling processes with source-sink interactions into GREENLAB-Maize.

    PubMed

    Ma, Yuntao; Chen, Youjia; Zhu, Jinyu; Meng, Lei; Guo, Yan; Li, Baoguo; Hoogenboom, Gerrit

    2018-02-13

    Failure to account for the variation of kernel growth in a cereal crop simulation model may cause serious deviations in the estimates of crop yield. The goal of this research was to revise the GREENLAB-Maize model to incorporate source- and sink-limited allocation approaches to simulate the dry matter accumulation of individual kernels of an ear (GREENLAB-Maize-Kernel). The model used potential individual kernel growth rates to characterize the individual potential sink demand. The remobilization of non-structural carbohydrates from reserve organs to kernels was also incorporated. Two years of field experiments were conducted to determine the model parameter values and to evaluate the model using two maize hybrids with different plant densities and pollination treatments. Detailed observations were made on the dimensions and dry weights of individual kernels and other above-ground plant organs throughout the seasons. Three basic traits characterizing an individual kernel were compared on simulated and measured individual kernels: (1) final kernel size; (2) kernel growth rate; and (3) duration of kernel filling. Simulations of individual kernel growth closely corresponded to experimental data. The model was able to reproduce the observed dry weight of plant organs well. Then, the source-sink dynamics and the remobilization of carbohydrates for kernel growth were quantified to show that remobilization processes accompanied source-sink dynamics during the kernel-filling process. We conclude that the model may be used to explore options for optimizing plant kernel yield by matching maize management to the environment, taking into account responses at the level of individual kernels. © The Author(s) 2018. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  14. Automatic detection of aflatoxin contaminated corn kernels using dual-band imagery

    NASA Astrophysics Data System (ADS)

    Ononye, Ambrose E.; Yao, Haibo; Hruska, Zuzana; Kincaid, Russell; Brown, Robert L.; Cleveland, Thomas E.

    2009-05-01

    Aflatoxin is a mycotoxin predominantly produced by Aspergillus flavus and Aspergillus parasiticus, fungi that grow naturally in corn, peanuts and a wide variety of other grain products. Corn, like other grains, is used as food for human and feed for animal consumption. It is known that aflatoxin is carcinogenic; therefore, ingestion of corn infected with the toxin can lead to very serious health problems, such as liver damage, if the level of contamination is high. The US Food and Drug Administration (FDA) has strict guidelines for permissible levels in grain products for both humans and animals. The conventional approach used to determine these contamination levels is a destructive and invasive method that requires corn kernels to be ground and then chemically analyzed. Unfortunately, each of the analytical methods can take several hours, depending on the quantity, to yield a result. The development of high spectral and spatial resolution imaging sensors has created an opportunity for hyperspectral image analysis to be employed for aflatoxin detection. However, this brings about a high dimensionality problem as a setback. In this paper, we propose a technique that automatically detects aflatoxin contaminated corn kernels by using dual-band imagery. The method exploits the fluorescence emission spectra from corn kernels captured under 365 nm ultraviolet light excitation. Our approach could lead to a non-destructive and non-invasive way of quantifying the levels of aflatoxin contamination. The preliminary results shown here demonstrate the potential of our technique for aflatoxin detection.

  15. Stochastic subset selection for learning with kernel machines.

    PubMed

    Rhinelander, Jason; Liu, Xiaoping P

    2012-06-01

    Kernel machines have gained much popularity in applications of machine learning. Support vector machines (SVMs) are a subset of kernel machines and generalize well for classification, regression, and anomaly detection tasks. The training procedure for traditional SVMs involves solving a quadratic programming (QP) problem. The QP problem scales superlinearly in computational effort with the number of training samples and is often used for the offline batch processing of data. Kernel machines operate by retaining a subset of observed data during training. The data vectors contained within this subset are referred to as support vectors (SVs). The work presented in this paper introduces a subset selection method for the use of kernel machines in online, changing environments. Our algorithm works by using a stochastic indexing technique when selecting a subset of SVs when computing the kernel expansion. The work described here is novel because it separates the selection of kernel basis functions from the training algorithm used. The subset selection algorithm presented here can be used in conjunction with any online training technique. It is important for online kernel machines to be computationally efficient due to the real-time requirements of online environments. Our algorithm is an important contribution because it scales linearly with the number of training samples and is compatible with current training techniques. Our algorithm outperforms standard techniques in terms of computational efficiency and provides increased recognition accuracy in our experiments. We provide results from experiments using both simulated and real-world data sets to verify our algorithm.
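
    A sketch of evaluating a kernel expansion over a stochastically selected subset of support vectors, with selection weighted toward large coefficients so that per-query cost stays bounded; the sampling rule and names are illustrative assumptions rather than the paper's algorithm.

      import numpy as np

      def rbf_row(x, Y, sigma=1.0):
          return np.exp(-np.sum((Y - x) ** 2, axis=1) / (2.0 * sigma ** 2))

      def kernel_expansion(x, support_vectors, alphas, subset_size=None, rng=None):
          """f(x) = sum_i alpha_i k(sv_i, x), optionally over a random subset of the SVs."""
          idx = np.arange(len(alphas))
          if subset_size is not None and subset_size < len(idx):
              rng = rng or np.random.default_rng()
              p = np.abs(alphas) / np.sum(np.abs(alphas))   # favour influential support vectors
              idx = rng.choice(idx, size=subset_size, replace=False, p=p)
          return float(alphas[idx] @ rbf_row(x, support_vectors[idx]))

      rng = np.random.default_rng(0)
      SV, a, x = rng.normal(size=(1000, 5)), rng.normal(size=1000), rng.normal(size=5)
      print(round(kernel_expansion(x, SV, a), 4),
            round(kernel_expansion(x, SV, a, subset_size=100, rng=rng), 4))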

  16. Kinetic study of nickel laterite reduction roasting by palm kernel shell charcoal

    NASA Astrophysics Data System (ADS)

    Sugiarto, E.; Putera, A. D. P.; Petrus, H. T. B. M.

    2017-05-01

    Demand to process nickel-bearing laterite ore increases as the continuous depletion of high-grade nickel-bearing sulfide ore takes place. Due to the common association of nickel with iron, processing nickel laterite ore into nickel pig iron (NPI) has been developed by some industries. However, to achieve satisfying nickel recoveries, the process requires massive consumption of high-grade metallurgical coke. Given concerns about the sustainability of coke supply and positive carbon emissions, the reduction of nickel laterite ore using a biomass-based reductant was studied. In this study, saprolitic nickel laterite ore was reduced by palm kernel shell charcoal at several temperatures (800-1000 °C). The biomass-to-laterite composition was also varied to study the reduction mechanism. X-ray diffraction and gravimetric analysis were applied to explain the phenomenon and identify a kinetic model of the reduction. The results of this study show that palm kernel shell charcoal gives a reduction result similar to the conventional method. Reduction, however, was carried out by carbon monoxide rather than solid carbon. Regarding kinetics, the Ginstling-Brounshtein kinetic model provides satisfying results for predicting the reduction phenomenon.
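
    For reference, the Ginstling-Brounshtein diffusion-control model takes the form g(α) = 1 - 2α/3 - (1 - α)^(2/3) = k·t; the sketch below fits the rate constant k by forcing a straight line through the origin. The conversion-versus-time values are made up for illustration, not data from the study.

      import numpy as np

      def ginstling_brounshtein(alpha):
          # g(alpha) for a three-dimensional, diffusion-controlled solid-state reaction.
          alpha = np.asarray(alpha, dtype=float)
          return 1.0 - 2.0 * alpha / 3.0 - (1.0 - alpha) ** (2.0 / 3.0)

      t = np.array([10.0, 20.0, 30.0, 45.0, 60.0])          # roasting time, minutes (illustrative)
      alpha = np.array([0.15, 0.28, 0.37, 0.48, 0.57])      # fraction reduced (illustrative)

      g = ginstling_brounshtein(alpha)
      k, *_ = np.linalg.lstsq(t.reshape(-1, 1), g, rcond=None)   # slope of g(alpha) vs t
      print(f"apparent rate constant k = {k[0]:.3e} per minute")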

  17. RTOS kernel in portable electrocardiograph

    NASA Astrophysics Data System (ADS)

    Centeno, C. A.; Voos, J. A.; Riva, G. G.; Zerbini, C.; Gonzalez, E. A.

    2011-12-01

    This paper presents the use of a Real Time Operating System (RTOS) on a portable electrocardiograph based on a microcontroller platform. All of the medical device's digital functions are performed by the microcontroller. The electrocardiograph CPU is based on the 18F4550 microcontroller, in which the uCOS-II RTOS can be embedded. The decision to use the kernel is based on its benefits, its license for educational use, and its intrinsic time control and peripheral management. The feasibility of its use on the electrocardiograph is evaluated based on the minimum memory requirements imposed by the kernel structure. The kernel's own tools were used for time estimation and evaluation of the resources used by each process. After this feasibility analysis, the cyclic code was migrated to a structure based on separate processes, or tasks, able to synchronize events, resulting in an electrocardiograph running on a single Central Processing Unit (CPU) under an RTOS.

  18. A Robustness Testing Campaign for IMA-SP Partitioning Kernels

    NASA Astrophysics Data System (ADS)

    Grixti, Stephen; Lopez Trecastro, Jorge; Sammut, Nicholas; Zammit-Mangion, David

    2015-09-01

    With time and space partitioned architectures becoming increasingly appealing to the European space sector, the dependability of partitioning kernel technology is a key factor to its applicability in European Space Agency projects. This paper explores the potential of the data type fault model, which injects faults through the Application Program Interface, in partitioning kernel robustness testing. This fault injection methodology has been tailored to investigate its relevance in uncovering vulnerabilities within partitioning kernels and potentially contributing towards fault removal campaigns within this domain. This is demonstrated through a robustness testing case study of the XtratuM partitioning kernel for SPARC LEON3 processors. The robustness campaign exposed a number of vulnerabilities in XtratuM, exhibiting the potential benefits of using such a methodology for the robustness assessment of partitioning kernels.

  19. Environmental tobacco smoke aerosol in non-smoking households of patients with chronic respiratory diseases

    NASA Astrophysics Data System (ADS)

    Chalbot, Marie-Cecile; Vei, Ino-Christina; Lianou, Maria; Kotronarou, Anastasia; Karakatsani, Anna; Katsouyanni, Klea; Hoek, Gerard; Kavouras, Ilias G.

    2012-12-01

    Fine particulate matter samples were collected at an urban ambient fixed site and outside and inside residences in the greater Athens area, Greece. n-Alkanes, iso/anteiso-alkanes and polycyclic aromatic hydrocarbons (PAHs) were identified by gas chromatography and mass spectrometry. The values of concentration diagnostic ratios indicated a mixture of vehicular emissions, fuel evaporation, oil residues and environmental tobacco smoke (ETS) in outdoor and indoor samples. Particulate iso/anteiso-alkanes, specific tracers of ETS, were detected in both non-smoking and smoking households. The indoor-to-outdoor ratios of particulate iso/anteiso-alkanes and of the unresolved complex mixture (a tracer of outdoor air pollution) in non-smoking households were comparable to the measured air exchange rate. This suggested that penetration of outdoor air was solely responsible for the detection of tobacco smoke particulate tracers in indoor non-smoking environments. Overall, residential outdoor concentrations accounted for a large fraction (from 25 up to 79%) of indoor aliphatic and polyaromatic hydrocarbons. Open windows/doors and the operation of an air conditioning unit also yielded higher indoor concentrations than those measured outdoors.

  20. Searching for efficient Markov chain Monte Carlo proposal kernels

    PubMed Central

    Yang, Ziheng; Rodríguez, Carlos E.

    2013-01-01

    Markov chain Monte Carlo (MCMC) or the Metropolis–Hastings algorithm is a simulation algorithm that has made modern Bayesian statistical inference possible. Nevertheless, the efficiency of different Metropolis–Hastings proposal kernels has rarely been studied except for the Gaussian proposal. Here we propose a unique class of Bactrian kernels, which avoid proposing values that are very close to the current value, and compare their efficiency with a number of proposals for simulating different target distributions, with efficiency measured by the asymptotic variance of a parameter estimate. The uniform kernel is found to be more efficient than the Gaussian kernel, whereas the Bactrian kernel is even better. When optimal scales are used for both, the Bactrian kernel is at least 50% more efficient than the Gaussian. Implementation in a Bayesian program for molecular clock dating confirms the general applicability of our results to generic MCMC algorithms. Our results refute a previous claim that all proposals had nearly identical performance and will prompt further research into efficient MCMC proposals. PMID:24218600
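
    A sketch of a Metropolis sampler using one common form of Bactrian proposal: an equal mixture of two Gaussians centred at x ± m·σ with component standard deviation σ·sqrt(1 - m²), so the overall proposal variance remains σ² while moves very close to the current value are avoided. Treat the exact parameterization and the value m = 0.95 as assumptions for illustration rather than the paper's definition.

      import numpy as np

      def bactrian_proposal(x, sigma=1.0, m=0.95, rng=None):
          """Draw from a symmetric two-humped (Bactrian) kernel centred on x."""
          rng = rng or np.random.default_rng()
          hump = rng.choice([-1.0, 1.0])                    # pick one of the two humps
          return x + hump * m * sigma + rng.normal(0.0, sigma * np.sqrt(1.0 - m ** 2))

      def metropolis(log_target, x0, n_iter=5000, sigma=1.0, m=0.95, seed=0):
          rng = np.random.default_rng(seed)
          x, chain = x0, []
          for _ in range(n_iter):
              y = bactrian_proposal(x, sigma, m, rng)
              # The kernel is symmetric in (x, y), so the Hastings ratio is just the target ratio.
              if np.log(rng.random()) < log_target(y) - log_target(x):
                  x = y
              chain.append(x)
          return np.array(chain)

      chain = metropolis(lambda v: -0.5 * v ** 2, x0=0.0)   # standard normal target
      print(round(chain.mean(), 2), round(chain.var(), 2))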

  1. Defect Analysis Of Quality Palm Kernel Meal Using Statistical Quality Control In Kernels Factory

    NASA Astrophysics Data System (ADS)

    Sembiring, M. T.; Marbun, N. J.

    2018-04-01

    Production quality has an important role in maintaining the totality of characteristics of a product or service and its capability to meet established needs. The quality criteria for Palm Kernel Meal (PKM) set by the kernel factory are as follows: oil content max 8.50%, water content max 12.00% and impurity content max 4.00%, whereas the average measured quality was 8.94% oil content, 5.51% water content and 8.45% impurity content. To identify the defective PKM product quality, an analysis using Statistical Quality Control (SQC) was used. The factory's PKM quality shows the oil content exceeding the predetermined maximum value by 0.44% and the impurity content by 4.50%. This excess oil and impurity content caused defective production of 854.6078 kg of PKM for oil content and 8643.193 kg of PKM for impurity content. Based on the results of the cause-and-effect diagram and SQC analysis, the factors that lead to poor PKM quality are the amperage and operating hours of the second-press oil expeller.

  2. Scuba: scalable kernel-based gene prioritization.

    PubMed

    Zampieri, Guido; Tran, Dinh Van; Donini, Michele; Navarin, Nicolò; Aiolli, Fabio; Sperduti, Alessandro; Valle, Giorgio

    2018-01-25

    The uncovering of genes linked to human diseases is a pressing challenge in molecular biology and precision medicine. This task is often hindered by the large number of candidate genes and by the heterogeneity of the available information. Computational methods for the prioritization of candidate genes can help to cope with these problems. In particular, kernel-based methods are a powerful resource for the integration of heterogeneous biological knowledge, however, their practical implementation is often precluded by their limited scalability. We propose Scuba, a scalable kernel-based method for gene prioritization. It implements a novel multiple kernel learning approach, based on a semi-supervised perspective and on the optimization of the margin distribution. Scuba is optimized to cope with strongly unbalanced settings where known disease genes are few and large scale predictions are required. Importantly, it is able to efficiently deal both with a large amount of candidate genes and with an arbitrary number of data sources. As a direct consequence of scalability, Scuba integrates also a new efficient strategy to select optimal kernel parameters for each data source. We performed cross-validation experiments and simulated a realistic usage setting, showing that Scuba outperforms a wide range of state-of-the-art methods. Scuba achieves state-of-the-art performance and has enhanced scalability compared to existing kernel-based approaches for genomic data. This method can be useful to prioritize candidate genes, particularly when their number is large or when input data is highly heterogeneous. The code is freely available at https://github.com/gzampieri/Scuba .

  3. Genomic Prediction of Genotype × Environment Interaction Kernel Regression Models.

    PubMed

    Cuevas, Jaime; Crossa, José; Soberanis, Víctor; Pérez-Elizalde, Sergio; Pérez-Rodríguez, Paulino; Campos, Gustavo de Los; Montesinos-López, O A; Burgueño, Juan

    2016-11-01

    In genomic selection (GS), genotype × environment interaction (G × E) can be modeled by a marker × environment interaction (M × E). The G × E may be modeled through a linear kernel or a nonlinear (Gaussian) kernel. In this study, we propose using two nonlinear Gaussian kernels: the reproducing kernel Hilbert space with kernel averaging (RKHS KA) and the Gaussian kernel with the bandwidth estimated through an empirical Bayesian method (RKHS EB). We performed single-environment analyses and extended them to account for G × E interaction (GBLUP-G × E, RKHS KA-G × E and RKHS EB-G × E) in wheat (Triticum aestivum L.) and maize (Zea mays L.) data sets. For single-environment analyses of the wheat and maize data sets, RKHS EB and RKHS KA had higher prediction accuracy than GBLUP for all environments. For the wheat data, the RKHS KA-G × E and RKHS EB-G × E models showed up to 60 to 68% superiority over the corresponding single-environment analyses for pairs of environments with positive correlations. For the wheat data set, the models with Gaussian kernels had accuracies up to 17% higher than that of GBLUP-G × E. For the maize data set, the prediction accuracy of RKHS EB-G × E and RKHS KA-G × E was, on average, 5 to 6% higher than that of GBLUP-G × E. The superiority of the Gaussian kernel models over the linear kernel is due to more flexible kernels that account for small, more complex marker main effects and marker-specific interaction effects. Copyright © 2016 Crop Science Society of America.
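
    A sketch of constructing a Gaussian kernel matrix from a lines-by-markers genotype matrix, with the bandwidth h applied to squared Euclidean distances scaled by their mean; the scaling convention, the 0/1/2 coding and all names are assumptions for illustration. The resulting K can be used in place of the genomic relationship matrix in a GBLUP-style mixed model.

      import numpy as np

      def gaussian_kernel_matrix(M, h=1.0):
          """K_ij = exp(-h * d_ij**2 / mean(d**2)) from a lines-by-markers matrix M."""
          sq = np.sum(M ** 2, axis=1)
          D2 = sq[:, None] + sq[None, :] - 2.0 * M @ M.T      # squared Euclidean distances
          D2 = np.maximum(D2, 0.0)                            # guard against round-off negatives
          return np.exp(-h * D2 / np.mean(D2))

      rng = np.random.default_rng(0)
      M = rng.integers(0, 3, size=(100, 500)).astype(float)   # 100 lines x 500 SNPs coded 0/1/2
      K = gaussian_kernel_matrix(M, h=0.5)
      print(K.shape, round(float(K[0, 1]), 3))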

  4. Gas Turbine Engine Nonvolatile Particulate Matter Mass Emissions: Correlation with Smoke Number for Conventional and Alternative Fuel Blends.

    PubMed

    Christie, Simon; Lobo, Prem; Lee, David; Raper, David

    2017-01-17

    This study evaluates the relationship between the emissions parameters of smoke number (SN) and mass concentration of nonvolatile particulate matter (nvPM) in the exhaust of a gas turbine engine for a conventional Jet A-1 and a number of alternative fuel blends. The data demonstrate the significant impact of fuel composition on the emissions and highlight the magnitude of the fuel-induced uncertainty for both SN within the Emissions Data Bank as well as nvPM mass within the new regulatory standard under development. Notwithstanding these substantial differences, the data show that correlation between SN and nvPM mass concentration still adheres to the first order approximation (FOA3), and this agreement is maintained over a wide range of fuel compositions. Hence, the data support the supposition that the FOA3 is applicable to engines burning both conventional and alternative fuel blends without adaptation or modification. The chemical composition of the fuel is shown to impact mass and number concentration as well as geometric mean diameter of the emitted nvPM; however, the data do not support assertions that the emissions of black carbon with small mean diameter will result in significant deviations from FOA3.

  5. Mesoscale modeling of smoke radiative feedback over the Sahel region

    NASA Astrophysics Data System (ADS)

    Yang, Z.; Wang, J.; Ichoku, C. M.; Ellison, L.; Zhang, F.; Yue, Y.

    2013-12-01

    This study employs satellite observations and a fully coupled meteorology-chemistry-aerosol model, the Weather Research and Forecasting model with Chemistry (WRF-Chem), to study the smoke radiative feedback on the surface energy budget, boundary layer processes, and atmospheric lapse rate in February 2008 over the Sahel region. The smoke emission inventories we use come from various sources, including but not limited to the Fire Locating and Modeling of Burning Emissions (FLAMBE) developed by NRL and the Fire Energetic and Emissions Research (FEER) developed by NASA GSFC. Model performance is evaluated using numerous satellite and ground-based datasets: MODIS true color images, ground-based Aerosol Optical Depth (AOD) measurements from AERONET, MODIS AOD retrievals, and Cloud-Aerosol Lidar data with Orthogonal Polarization (CALIOP) atmospheric backscattering and extinction products. Specification of a smoke injection height of 650 m in WRF-Chem yields aerosol vertical profiles that are most consistent with CALIOP observations of aerosol layer height. Statistically, 5% of the CALIPSO valid measurements of aerosols in February 2008 show aerosol layers either above the clouds or between the clouds, reinforcing the importance of the aerosol vertical distribution for quantifying aerosol impact on climate in the Sahel region. The results further show that the smoke radiative feedbacks are sensitive to assumptions about the black carbon to organic carbon ratio in the particle emission inventory. Also investigated is the smoke semi-direct effect as a function of cloud fraction.

  6. Sepsis mortality prediction with the Quotient Basis Kernel.

    PubMed

    Ribas Ripoll, Vicent J; Vellido, Alfredo; Romero, Enrique; Ruiz-Rodríguez, Juan Carlos

    2014-05-01

    This paper presents an algorithm to assess the risk of death in patients with sepsis. Sepsis is a common clinical syndrome in the intensive care unit (ICU) that can lead to severe sepsis, a severe state of septic shock or multi-organ failure. The proposed algorithm may be implemented as part of a clinical decision support system that can be used in combination with the scores deployed in the ICU to improve the accuracy, sensitivity and specificity of mortality prediction for patients with sepsis. In this paper, we used the Simplified Acute Physiology Score (SAPS) for ICU patients and the Sequential Organ Failure Assessment (SOFA) to build our kernels and algorithms. In the proposed method, we embed the available data in a suitable feature space and use algorithms based on linear algebra, geometry and statistics for inference. We present a simplified version of the Fisher kernel (practical Fisher kernel for multinomial distributions), as well as a novel kernel that we named the Quotient Basis Kernel (QBK). These kernels are used as the basis for mortality prediction using soft-margin support vector machines. The two new kernels presented are compared against other generative kernels based on the Jensen-Shannon metric (centred, exponential and inverse) and other widely used kernels (linear, polynomial and Gaussian). Clinical relevance is also evaluated by comparing these results with logistic regression and the standard clinical prediction method based on the initial SAPS score. As described in this paper, we tested the new methods via cross-validation with a cohort of 400 test patients. The results obtained using our methods compare favourably with those obtained using alternative kernels (80.18% accuracy for the QBK) and the standard clinical prediction method, which are based on the basal SAPS score or logistic regression (71.32% and 71.55%, respectively). The QBK presented a sensitivity and specificity of 79.34% and 83.24%, which outperformed the other kernels
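
    A sketch of how a bespoke kernel can be fed to a standard soft-margin SVM through a precomputed Gram matrix; an ordinary RBF kernel stands in for the Quotient Basis Kernel here, and the data, labels and split are synthetic placeholders.

      import numpy as np
      from sklearn.metrics.pairwise import rbf_kernel
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 12))                            # stand-in severity-score features
      y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

      X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]
      K_tr = rbf_kernel(X_tr, X_tr)        # swap in a custom Gram matrix (e.g. from the QBK) here
      K_te = rbf_kernel(X_te, X_tr)        # test rows evaluated against training columns

      clf = SVC(kernel="precomputed", C=1.0).fit(K_tr, y_tr)
      print("held-out accuracy:", round(clf.score(K_te, y_te), 3))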

  7. Kernel Methods for Mining Instance Data in Ontologies

    NASA Astrophysics Data System (ADS)

    Bloehdorn, Stephan; Sure, York

    The amount of ontologies and metadata available on the Web is constantly growing. The successful application of machine learning techniques for learning ontologies from textual data, i.e. mining for the Semantic Web, contributes to this trend. However, no principled approaches exist so far for mining from the Semantic Web. We investigate how machine learning algorithms can be made amenable for directly taking advantage of the rich knowledge expressed in ontologies and associated instance data. Kernel methods have been successfully employed in various learning tasks and provide a clean framework for interfacing between non-vectorial data and machine learning algorithms. In this spirit, we express the problem of mining instances in ontologies as the problem of defining valid corresponding kernels. We present a principled framework for designing such kernels by means of decomposing the kernel computation into specialized kernels for selected characteristics of an ontology, which can be flexibly assembled and tuned. Initial experiments on real-world Semantic Web data show promising results and demonstrate the usefulness of our approach.

  8. Biasing anisotropic scattering kernels for deep-penetration Monte Carlo calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, L.L.; Hendricks, J.S.

    1983-01-01

    The exponential transform is often used to improve the efficiency of deep-penetration Monte Carlo calculations. This technique is usually implemented by biasing the distance-to-collision kernel of the transport equation, but leaving the scattering kernel unchanged. Dwivedi obtained significant improvements in efficiency by biasing an isotropic scattering kernel as well as the distance-to-collision kernel. This idea is extended to anisotropic scattering, particularly the highly forward Klein-Nishina scattering of gamma rays.
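
    A sketch of the standard exponential transform applied to the distance-to-collision kernel alone, the baseline that the abstract extends to the scattering kernel: flight lengths are drawn from a stretched exponential and the particle weight carries the ratio of true to biased densities. The notation is generic and illustrative, not the cited implementation.

      import numpy as np

      def biased_flight(sigma_t, p, mu, rng):
          """Sample a flight distance with the exponential transform.

          True pdf:   sigma_t * exp(-sigma_t * s)
          Biased pdf: sigma_b * exp(-sigma_b * s), sigma_b = sigma_t * (1 - p * mu),
          where mu is the direction cosine toward the region of interest.
          Returns the distance and the weight factor (true density / biased density)."""
          sigma_b = sigma_t * (1.0 - p * mu)
          s = rng.exponential(1.0 / sigma_b)
          weight = (sigma_t * np.exp(-sigma_t * s)) / (sigma_b * np.exp(-sigma_b * s))
          return s, weight

      rng = np.random.default_rng(0)
      sigma_t, p = 1.0, 0.6
      flights = [biased_flight(sigma_t, p, mu=1.0, rng=rng) for _ in range(200000)]
      # The weighted mean path length reproduces the analog mean free path 1 / sigma_t.
      print(round(float(np.mean([s * w for s, w in flights])), 3))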

  9. Impact of Trans-Boundary Emissions on Modelled Air Pollution in Canada

    NASA Astrophysics Data System (ADS)

    Pavlovic, Radenko; Moran, Mike; Zhang, Junhua; Zheng, Qiong; Menard, Sylvain; Anselmo, David; Davignon, Didier

    2014-05-01

    The operational air quality model GEM-MACH is run twice daily at the Canadian Meteorological Centre in Montreal, Quebec to produce 48-hour forecasts of hourly O3, NO2, and PM2.5 fields over a North American domain. The hourly gridded anthropogenic emissions fields needed by GEM-MACH are currently based on the 2006 Canadian emissions inventory, a 2012 projected U.S. inventory, and the 1999 Mexican inventory. The Sparse Matrix Operator Kernel Emissions (SMOKE) processing package was used to process these three national emissions inventories to create the GEM-MACH emissions fields. While Canada is the second-largest country in the world by total area, its population and its emissions of criteria contaminants are both only about one-tenth of U.S. values and roughly 80% of the Canadian population lives within 150 km of the international border with the U.S. As a consequence, transboundary transport of air pollution has a major impact on air quality in Canada. To quantify the impact of non-Canadian emissions on forecasted pollutant levels in Canada, the following two tests were performed: (a) all U.S. and Mexican anthropogenic emissions were switched off; and (b) anthropogenic emissions from the southernmost tier of U.S. states and Mexico were switched off. These sensitivity tests were performed for the summer and winter periods of 2012 or 2011. The results obtained show that the impact of non-Canadian sources on forecasted pollution is generally larger in summer than in winter, especially in south-eastern parts of Canada. For the three pollutants considered in the Canadian national Air Quality Health Index, PM2.5 is impacted the most (up to 80%) and NO2 the least (<10%). Emissions from the southern U.S. and Mexico do impact Canadian air quality, but the sign may change depending on the season (i.e., increase vs. decrease), reflecting chemical processing en route.

  10. Direct Measurement of Wave Kernels in Time-Distance Helioseismology

    NASA Technical Reports Server (NTRS)

    Duvall, T. L., Jr.

    2006-01-01

    Solar f-mode waves are surface-gravity waves which propagate horizontally in a thin layer near the photosphere with a dispersion relation approximately that of deep water waves. At the power maximum near 3 mHz, the wavelength of 5 Mm is large enough for various wave scattering properties to be observable. Gizon and Birch (2002, ApJ, 571, 966) have calculated kernels, in the Born approximation, for the sensitivity of wave travel times to local changes in damping rate and source strength. In this work, using isolated small magnetic features as approximate point-source scatterers, such a kernel has been measured. The observed kernel contains features similar to those of a theoretical damping kernel but not to those of a source kernel. A full understanding of the effect of small magnetic features on the waves will require more detailed modeling.

  11. Simulating smoke transport from wildland fires with a regional-scale air quality model: sensitivity to spatiotemporal allocation of fire emissions.

    PubMed

    Garcia-Menendez, Fernando; Hu, Yongtao; Odman, Mehmet T

    2014-09-15

    Air quality forecasts generated with chemical transport models can provide valuable information about the potential impacts of fires on pollutant levels. However, significant uncertainties are associated with fire-related emission estimates as well as their distribution on gridded modeling domains. In this study, we explore the sensitivity of fine particulate matter concentrations predicted by a regional-scale air quality model to the spatial and temporal allocation of fire emissions. The assessment was completed by simulating a fire-related smoke episode in which air quality throughout the Atlanta metropolitan area was affected on February 28, 2007. Sensitivity analyses were carried out to evaluate the significance of emission distribution among the model's vertical layers, along the horizontal plane, and into hourly inputs. Predicted PM2.5 concentrations were highly sensitive to emission injection altitude relative to planetary boundary layer height. Simulations were also responsive to the horizontal allocation of fire emissions and their distribution into single or multiple grid cells. Additionally, modeled concentrations were highly sensitive to the temporal distribution of fire-related emissions. The analyses demonstrate that, in addition to adequate estimates of emitted mass, successfully modeling the impacts of fires on air quality depends on an accurate spatiotemporal allocation of emissions. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Evaluation of High Resolution Rapid Refresh-Smoke (HRRR-Smoke) model products for a case study using surface PM2.5 observations

    NASA Astrophysics Data System (ADS)

    Deanes, L. N.; Ahmadov, R.; McKeen, S. A.; Manross, K.; Grell, G. A.; James, E.

    2016-12-01

    Wildfires are increasing in number and size in the western United States as climate change contributes to warmer and drier conditions in this region. These fires lead to poor air quality and diminished visibility. The High Resolution Rapid Refresh-Smoke modeling system (HRRR-Smoke) is designed to simulate fire emissions and smoke transport with high resolution. The model is based on the Weather Research and Forecasting model coupled with chemistry (WRF-Chem) and uses fire detection data from the Visible Infrared Imaging Radiometer Suite (VIIRS) satellite instrument to simulate wildfire emissions and their plume rise. HRRR-Smoke is used in both real-time applications and case studies. In this study, we evaluate HRRR-Smoke for August 2015, during one of the worst wildfire seasons on record in the United States, by focusing on wildfires that occurred in the northwestern US. We compare HRRR-Smoke simulations with hourly fine particulate matter (PM2.5) observations from the Air Quality System (https://www.epa.gov/aqs) from multiple air quality monitoring sites in Washington state. The PM2.5 data include measurements from urban, suburban and remote sites in the state. We discuss the model performance in capturing large PM2.5 enhancements detected at surface sites due to wildfires. We present various statistical parameters to demonstrate HRRR-Smoke's performance in simulating surface PM2.5 levels.

  13. Dropping macadamia nuts-in-shell reduces kernel roasting quality.

    PubMed

    Walton, David A; Wallace, Helen M

    2010-10-01

    Macadamia nuts ('nuts-in-shell') are subjected to many impacts from dropping during postharvest handling, resulting in damage to the raw kernel. The effect of dropping on roasted kernel quality is unknown. Macadamia nuts-in-shell were dropped in various combinations of moisture content, number of drops and receiving surface in three experiments. After dropping, samples from each treatment and undropped controls were dry oven-roasted for 20 min at 130 °C, and kernels were assessed for colour, mottled colour and surface damage. Dropping nuts-in-shell onto a bed of nuts-in-shell at 3% moisture content or 20% moisture content increased the percentage of dark roasted kernels. Kernels from nuts dropped first at 20%, then 10% moisture content, onto a metal plate had increased mottled colour. Dropping nuts-in-shell at 3% moisture content onto nuts-in-shell significantly increased surface damage. Similarly, surface damage increased for kernels dropped onto a metal plate at 20%, then at 10% moisture content. Postharvest dropping of macadamia nuts-in-shell causes concealed cellular damage to kernels, the effects not evident until roasting. This damage provides the reagents needed for non-enzymatic browning reactions. Improvements in handling, such as reducing the number of drops and improving handling equipment, will reduce cellular damage and after-roast darkening. Copyright © 2010 Society of Chemical Industry.

  14. Compound analysis via graph kernels incorporating chirality.

    PubMed

    Brown, J B; Urata, Takashi; Tamura, Takeyuki; Arai, Midori A; Kawabata, Takeo; Akutsu, Tatsuya

    2010-12-01

    High accuracy is paramount when predicting biochemical characteristics using Quantitative Structural-Property Relationships (QSPRs). Although existing graph-theoretic kernel methods combined with machine learning techniques are efficient for QSPR model construction, they cannot distinguish topologically identical chiral compounds which often exhibit different biological characteristics. In this paper, we propose a new method that extends the recently developed tree pattern graph kernel to accommodate stereoisomers. We show that Support Vector Regression (SVR) with a chiral graph kernel is useful for target property prediction by demonstrating its application to a set of human vitamin D receptor ligands currently under consideration for their potential anti-cancer effects.

  15. TEMPORAL EVOLUTION AND SPATIAL DISTRIBUTION OF WHITE-LIGHT FLARE KERNELS IN A SOLAR FLARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kawate, T.; Ishii, T. T.; Nakatani, Y.

    2016-12-10

    On 2011 September 6, we observed an X2.1-class flare in continuum and H α with a frame rate of about 30 Hz. After processing images of the event by using a speckle-masking image reconstruction, we identified white-light (WL) flare ribbons on opposite sides of the magnetic neutral line. We derive the light curve decay times of the WL flare kernels at each resolution element by assuming that the kernels consist of one or two components that decay exponentially, starting from the peak time. As a result, 42% of the pixels have two decay-time components with average decay times of 15.6 and 587 s, whereas the average decay time is 254 s for WL kernels with only one decay-time component. The peak intensities of the shorter decay-time component exhibit good spatial correlation with the WL intensity, whereas the peak intensities of the long decay-time components tend to be larger in the early phase of the flare at the inner part of the flare ribbons, close to the magnetic neutral line. The average intensity of the longer decay-time components is 1.78 times higher than that of the shorter decay-time components. If the shorter decay time is determined by either the chromospheric cooling time or the nonthermal ionization timescale and the longer decay time is attributed to the coronal cooling time, this result suggests that WL sources from both regions appear in 42% of the WL kernels and that WL emission of the coronal origin is sometimes stronger than that of chromospheric origin.

  16. Kernel-aligned multi-view canonical correlation analysis for image recognition

    NASA Astrophysics Data System (ADS)

    Su, Shuzhi; Ge, Hongwei; Yuan, Yun-Hao

    2016-09-01

    Existing kernel-based correlation analysis methods mainly adopt a single kernel in each view. However, a single kernel alone is usually insufficient to characterize the nonlinear distribution information of a view. To solve the problem, we transform each original feature vector into a 2-dimensional feature matrix by means of kernel alignment, and then propose a novel kernel-aligned multi-view canonical correlation analysis (KAMCCA) method on the basis of the feature matrices. Our proposed method can simultaneously employ multiple kernels to better capture the nonlinear distribution information of each view, so that correlation features learned by KAMCCA have good discriminating power in real-world image recognition. Extensive experiments are designed on five real-world image datasets, including NIR face images, thermal face images, visible face images, handwritten digit images, and object images. Promising experimental results on the datasets have manifested the effectiveness of our proposed method.

  17. A kernel adaptive algorithm for quaternion-valued inputs.

    PubMed

    Paul, Thomas K; Ogunfunmi, Tokunbo

    2015-10-01

    The use of quaternion data can provide benefit in applications like robotics and image recognition, and particularly for performing transforms in 3-D space. Here, we describe a kernel adaptive algorithm for quaternions. A least mean square (LMS)-based method was used, resulting in the derivation of the quaternion kernel LMS (Quat-KLMS) algorithm. Deriving this algorithm required describing the idea of a quaternion reproducing kernel Hilbert space (RKHS), as well as kernel functions suitable with quaternions. A modified HR calculus for Hilbert spaces was used to find the gradient of cost functions defined on a quaternion RKHS. In addition, the use of widely linear (or augmented) filtering is proposed to improve performance. The benefit of the Quat-KLMS and widely linear forms in learning nonlinear transformations of quaternion data are illustrated with simulations.
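    For readers unfamiliar with kernel LMS, the real-valued baseline that Quat-KLMS generalizes can be sketched as below; the Gaussian kernel width and step size are arbitrary illustrative choices, and none of the quaternion algebra, HR calculus, or widely linear filtering from the paper is reproduced here.

      import numpy as np

      class KLMS:
          """Minimal real-valued kernel LMS (growing dictionary, no sparsification)."""

          def __init__(self, step=0.2, width=1.0):
              self.step, self.width = step, width
              self.centers, self.coeffs = [], []

          def _kernel(self, a, b):
              return np.exp(-np.sum((a - b) ** 2) / (2.0 * self.width ** 2))

          def predict(self, x):
              return sum(c * self._kernel(u, x)
                         for u, c in zip(self.centers, self.coeffs))

          def update(self, x, d):
              e = d - self.predict(x)                 # instantaneous prediction error
              self.centers.append(np.asarray(x, dtype=float))
              self.coeffs.append(self.step * e)       # new expansion coefficient
              return e

      # toy usage: learn y = sin(x) online from streaming samples
      f = KLMS()
      for x in np.linspace(0.0, 3.0, 50):
          f.update(np.array([x]), np.sin(x))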

  18. Improving the Bandwidth Selection in Kernel Equating

    ERIC Educational Resources Information Center

    Andersson, Björn; von Davier, Alina A.

    2014-01-01

    We investigate the current bandwidth selection methods in kernel equating and propose a method based on Silverman's rule of thumb for selecting the bandwidth parameters. In kernel equating, the bandwidth parameters have previously been obtained by minimizing a penalty function. This minimization process has been criticized by practitioners…
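    For reference, the univariate Silverman rule of thumb alluded to above has a simple closed form; the sketch below shows the textbook version for a Gaussian kernel, and the exact adaptation to the discrete score distributions used in kernel equating may differ.

      import numpy as np

      def silverman_bandwidth(x):
          """Rule-of-thumb bandwidth h = 0.9 * min(std, IQR/1.34) * n**(-1/5)."""
          x = np.asarray(x, dtype=float)
          n = x.size
          iqr = np.subtract(*np.percentile(x, [75, 25]))   # interquartile range
          spread = min(x.std(ddof=1), iqr / 1.34)
          return 0.9 * spread * n ** (-0.2)

      scores = np.random.default_rng(0).normal(50.0, 10.0, size=2000)   # toy test scores
      print(silverman_bandwidth(scores))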

  19. Nature and composition of fat bloom from palm kernel stearin and hydrogenated palm kernel stearin compound chocolates.

    PubMed

    Smith, Kevin W; Cain, Fred W; Talbot, Geoff

    2004-08-25

    Palm kernel stearin and hydrogenated palm kernel stearin can be used to prepare compound chocolate bars or coatings. The objective of this study was to characterize the chemical composition, polymorphism, and melting behavior of the bloom that develops on bars of compound chocolate prepared using these fats. Bars were stored for 1 year at 15, 20, or 25 degrees C. At 15 and 20 degrees C the bloom was enriched in cocoa butter triacylglycerols, with respect to the main fat phase, whereas at 25 degrees C the enrichment was with palm kernel triacylglycerols. The bloom consisted principally of solid fat and was sharper melting than was the fat in the chocolate. Polymorphic transitions from the initial beta' phase to the beta phase accompanied the formation of bloom at all temperatures.

  20. Green synthesis of Pd nanoparticles at Apricot kernel shell substrate using Salvia hydrangea extract: Catalytic activity for reduction of organic dyes.

    PubMed

    Khodadadi, Bahar; Bordbar, Maryam; Nasrollahzadeh, Mahmoud

    2017-03-15

    For the first time, the extract of the plant Salvia hydrangea was used for the green synthesis of Pd nanoparticles (NPs) supported on Apricot kernel shell as an environmentally benign support. The Pd NPs/Apricot kernel shell, as an effective catalyst, was prepared through reduction of Pd2+ ions, using Salvia hydrangea extract as the reducing and capping agent, and immobilization of the Pd NPs on the Apricot kernel shell surface in the absence of any stabilizer or surfactant. According to FT-IR analysis, the hydroxyl groups of phenolics in the Salvia hydrangea extract, as bioreductant agents, are directly responsible for the reduction of Pd2+ ions and the formation of Pd NPs. The as-prepared catalyst was characterized by Fourier transform infrared (FT-IR) and UV-Vis spectroscopy, field emission scanning electron microscopy (FESEM) equipped with energy dispersive X-ray spectroscopy (EDS), elemental mapping, X-ray diffraction analysis (XRD) and transmission electron microscopy (TEM). The synthesized catalyst was used in the reduction of 4-nitrophenol (4-NP), Methyl Orange (MO), Methylene Blue (MB), Rhodamine B (RhB), and Congo Red (CR) at room temperature. The Pd NPs/Apricot kernel shell showed excellent catalytic activity in the reduction of these organic dyes. In addition, it was found that the Pd NPs/Apricot kernel shell can be recovered and reused several times without significant loss of catalytic activity. Copyright © 2016 Elsevier Inc. All rights reserved.

  1. Online learning control using adaptive critic designs with sparse kernel machines.

    PubMed

    Xu, Xin; Hou, Zhongsheng; Lian, Chuanqiang; He, Haibo

    2013-05-01

    In the past decade, adaptive critic designs (ACDs), including heuristic dynamic programming (HDP), dual heuristic programming (DHP), and their action-dependent ones, have been widely studied to realize online learning control of dynamical systems. However, because neural networks with manually designed features are commonly used to deal with continuous state and action spaces, the generalization capability and learning efficiency of previous ACDs still need to be improved. In this paper, a novel framework of ACDs with sparse kernel machines is presented by integrating kernel methods into the critic of ACDs. To improve the generalization capability as well as the computational efficiency of kernel machines, a sparsification method based on the approximately linear dependence analysis is used. Using the sparse kernel machines, two kernel-based ACD algorithms, that is, kernel HDP (KHDP) and kernel DHP (KDHP), are proposed and their performance is analyzed both theoretically and empirically. Because of the representation learning and generalization capability of sparse kernel machines, KHDP and KDHP can obtain much better performance than previous HDP and DHP with manually designed neural networks. Simulation and experimental results of two nonlinear control problems, that is, a continuous-action inverted pendulum problem and a ball and plate control problem, demonstrate the effectiveness of the proposed kernel ACD methods.
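    The approximately linear dependence (ALD) test used above to keep the kernel dictionary sparse can be sketched as follows; the Gaussian kernel, threshold, and toy state samples are illustrative assumptions, and the actual critic updates of KHDP/KDHP are not shown.

      import numpy as np

      def gauss(a, b, width=1.0):
          return np.exp(-np.sum((a - b) ** 2) / (2.0 * width ** 2))

      def ald_admit(dictionary, x, threshold=1e-2):
          """Return True if x is approximately linearly independent (in feature
          space) of the current dictionary and should therefore be added."""
          if not dictionary:
              return True
          K = np.array([[gauss(a, b) for b in dictionary] for a in dictionary])
          k = np.array([gauss(d, x) for d in dictionary])
          coeffs = np.linalg.solve(K + 1e-9 * np.eye(len(dictionary)), k)
          delta = gauss(x, x) - k @ coeffs         # squared residual of the projection
          return delta > threshold

      dictionary = []
      for state in np.random.default_rng(1).uniform(-1.0, 1.0, size=(200, 2)):
          if ald_admit(dictionary, state):
              dictionary.append(state)
      print(len(dictionary), "dictionary elements retained out of 200 samples")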

  2. Kernel analysis of partial least squares (PLS) regression models.

    PubMed

    Shinzawa, Hideyuki; Ritthiruangdej, Pitiporn; Ozaki, Yukihiro

    2011-05-01

    An analytical technique based on kernel matrix representation is demonstrated to provide further chemically meaningful insight into partial least squares (PLS) regression models. The kernel matrix condenses essential information about scores derived from PLS or principal component analysis (PCA). Thus, it becomes possible to establish the proper interpretation of the scores. A PLS model for the total nitrogen (TN) content in multiple Thai fish sauces is built with a set of near-infrared (NIR) transmittance spectra of the fish sauce samples. The kernel analysis of the scores effectively reveals that the variation of the spectral feature induced by the change in protein content is substantially associated with the total water content and the protein hydration. Kernel analysis is also carried out on a set of time-dependent infrared (IR) spectra representing transient evaporation of ethanol from a binary mixture solution of ethanol and oleic acid. A PLS model to predict the elapsed time is built with the IR spectra and the kernel matrix is derived from the scores. The detailed analysis of the kernel matrix provides penetrating insight into the interaction between the ethanol and the oleic acid.
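    The kernel matrix referred to here is, in essence, the Gram (inner-product) matrix of the score vectors; a minimal sketch of forming such a matrix from a PLS or PCA score matrix is shown below, with random placeholders standing in for the actual spectral scores.

      import numpy as np

      rng = np.random.default_rng(0)
      T = rng.normal(size=(30, 3))   # placeholder score matrix: 30 samples x 3 latent variables

      K = T @ T.T                    # sample-by-sample kernel (Gram) matrix of the scores
      # Rows of K indicate which samples share similar score patterns, which is the
      # kind of sample-wise structure the kernel analysis above interprets chemically.
      print(K.shape, bool(np.allclose(K, K.T)))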

  3. A multi-label learning based kernel automatic recommendation method for support vector machine.

    PubMed

    Zhang, Xueying; Song, Qinbao

    2015-01-01

    Choosing an appropriate kernel is very important and critical when classifying a new problem with Support Vector Machine. So far, more attention has been paid to constructing new kernels and choosing suitable parameter values for a specific kernel function, but less on kernel selection. Furthermore, most current kernel selection methods focus on seeking the best kernel with the highest classification accuracy via cross-validation; they are time-consuming and ignore the differences among the number of support vectors and the CPU time of SVM with different kernels. Considering the tradeoff between classification success ratio and CPU time, there may be multiple kernel functions performing equally well on the same classification problem. Aiming to automatically select those appropriate kernel functions for a given data set, we propose a multi-label learning based kernel recommendation method built on the data characteristics. For each data set, the meta-knowledge data base is first created by extracting the feature vector of data characteristics and identifying the corresponding applicable kernel set. Then the kernel recommendation model is constructed on the generated meta-knowledge data base with the multi-label classification method. Finally, the appropriate kernel functions are recommended to a new data set by the recommendation model according to the characteristics of the new data set. Extensive experiments over 132 UCI benchmark data sets, with five different types of data set characteristics, eleven typical kernels (Linear, Polynomial, Radial Basis Function, Sigmoidal function, Laplace, Multiquadric, Rational Quadratic, Spherical, Spline, Wave and Circular), and five multi-label classification methods demonstrate that, compared with the existing kernel selection methods and the most widely used RBF kernel function, SVM with the kernel function recommended by our proposed method achieved the highest classification performance.

  4. A Multi-Label Learning Based Kernel Automatic Recommendation Method for Support Vector Machine

    PubMed Central

    Zhang, Xueying; Song, Qinbao

    2015-01-01

    Choosing an appropriate kernel is very important and critical when classifying a new problem with Support Vector Machine. So far, more attention has been paid to constructing new kernels and choosing suitable parameter values for a specific kernel function, but less on kernel selection. Furthermore, most current kernel selection methods focus on seeking the best kernel with the highest classification accuracy via cross-validation; they are time-consuming and ignore the differences among the number of support vectors and the CPU time of SVM with different kernels. Considering the tradeoff between classification success ratio and CPU time, there may be multiple kernel functions performing equally well on the same classification problem. Aiming to automatically select those appropriate kernel functions for a given data set, we propose a multi-label learning based kernel recommendation method built on the data characteristics. For each data set, the meta-knowledge data base is first created by extracting the feature vector of data characteristics and identifying the corresponding applicable kernel set. Then the kernel recommendation model is constructed on the generated meta-knowledge data base with the multi-label classification method. Finally, the appropriate kernel functions are recommended to a new data set by the recommendation model according to the characteristics of the new data set. Extensive experiments over 132 UCI benchmark data sets, with five different types of data set characteristics, eleven typical kernels (Linear, Polynomial, Radial Basis Function, Sigmoidal function, Laplace, Multiquadric, Rational Quadratic, Spherical, Spline, Wave and Circular), and five multi-label classification methods demonstrate that, compared with the existing kernel selection methods and the most widely used RBF kernel function, SVM with the kernel function recommended by our proposed method achieved the highest classification performance. PMID:25893896

  5. Corn kernel oil and corn fiber oil

    USDA-ARS?s Scientific Manuscript database

    Unlike most edible plant oils that are obtained directly from oil-rich seeds by either pressing or solvent extraction, corn seeds (kernels) have low levels of oil (4%) and commercial corn oil is obtained from the corn germ (embryo) which is an oil-rich portion of the kernel. Commercial corn oil cou...

  6. Convolution kernels for multi-wavelength imaging

    NASA Astrophysics Data System (ADS)

    Boucaud, A.; Bocchio, M.; Abergel, A.; Orieux, F.; Dole, H.; Hadj-Youcef, M. A.

    2016-12-01

    Astrophysical images obtained from different instruments and/or spectral bands often need to be processed together, either for fitting or comparison purposes. However, each image is affected by an instrumental response, also known as the point-spread function (PSF), that depends on the characteristics of the instrument as well as the wavelength and the observing strategy. Given the knowledge of the PSF in each band, a straightforward way of processing images is to homogenise them all to a target PSF using convolution kernels, so that they appear as if they had been acquired by the same instrument. We propose an algorithm that generates such PSF-matching kernels, based on Wiener filtering with a tunable regularisation parameter. This method ensures that all anisotropic features in the PSFs are taken into account. We compare our method to existing procedures using measured Herschel/PACS and SPIRE PSFs and simulated JWST/MIRI PSFs. Significant gains of up to two orders of magnitude are obtained with respect to the use of kernels computed assuming Gaussian or circularised PSFs. Software to compute these kernels is available at https://github.com/aboucaud/pypher
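    A stripped-down version of the Wiener-filter construction of a PSF-matching kernel is sketched below using synthetic Gaussian PSFs; the regularisation constant and PSFs are placeholders, and the pypher implementation linked above adds refinements (pixel-scale matching, windowing) that are not reproduced here.

      import numpy as np

      def gaussian_psf(shape, sigma):
          y, x = np.indices(shape)
          cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
          g = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2.0 * sigma ** 2))
          return g / g.sum()

      def matching_kernel(psf_source, psf_target, reg=1e-4):
          """Kernel k such that psf_source convolved with k approximates psf_target."""
          A = np.fft.fft2(np.fft.ifftshift(psf_source))
          B = np.fft.fft2(np.fft.ifftshift(psf_target))
          K = B * np.conj(A) / (np.abs(A) ** 2 + reg)   # Wiener-regularised ratio
          return np.real(np.fft.fftshift(np.fft.ifft2(K)))

      narrow = gaussian_psf((65, 65), sigma=2.0)   # sharper source PSF
      broad = gaussian_psf((65, 65), sigma=4.0)    # broader target PSF
      k = matching_kernel(narrow, broad)
      print(abs(k.sum() - 1.0) < 0.05)             # kernel should be close to normalised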

  7. Unsupervised multiple kernel learning for heterogeneous data integration.

    PubMed

    Mariette, Jérôme; Villa-Vialaneix, Nathalie

    2018-03-15

    Recent high-throughput sequencing advances have expanded the breadth of available omics datasets, and the integrated analysis of multiple datasets obtained on the same samples has made it possible to gain important insights in a wide range of applications. However, the integration of various sources of information remains a challenge for systems biology, since the produced datasets are often of heterogeneous types, and generic methods are needed to take their different specificities into account. We propose a multiple kernel framework that allows multiple datasets of various types to be integrated into a single exploratory analysis. Several solutions are provided to learn either a consensus meta-kernel or a meta-kernel that preserves the original topology of the datasets. We applied our framework to analyse two public multi-omics datasets. First, multiple metagenomic datasets collected during the TARA Oceans expedition were explored to demonstrate that our method is able to retrieve previous findings in a single kernel PCA as well as to provide a new image of the sample structures when a larger number of datasets are included in the analysis. To perform this analysis, a generic procedure is also proposed to improve the interpretability of the kernel PCA with respect to the original data. Second, the multi-omics breast cancer datasets provided by The Cancer Genome Atlas are analysed using kernel Self-Organizing Maps with both single-omics and multi-omics strategies. The comparison of these two approaches demonstrates the benefit of our integration method for improving the representation of the studied biological system. The proposed methods are available in the R package mixKernel, released on CRAN. It is fully compatible with the mixOmics package and a tutorial describing the approach can be found on the mixOmics web site http://mixomics.org/mixkernel/. jerome.mariette@inra.fr or nathalie.villa-vialaneix@inra.fr. Supplementary data are available at Bioinformatics online.
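    A bare-bones illustration of building a consensus meta-kernel from several per-dataset kernels is given below; the cosine normalisation and plain averaging are simplifying assumptions, and the mixKernel package cited above provides more principled consensus and topology-preserving variants.

      import numpy as np

      def cosine_normalise(K):
          """Rescale a kernel matrix so every sample has unit self-similarity."""
          d = np.sqrt(np.diag(K))
          return K / np.outer(d, d)

      def consensus_kernel(kernels):
          """Average of the individually normalised kernels (one per omics dataset)."""
          return sum(cosine_normalise(K) for K in kernels) / len(kernels)

      rng = np.random.default_rng(0)
      n_samples = 40
      kernels = []
      for n_features in (10, 50, 5):                 # placeholder per-omics feature blocks
          X = rng.normal(size=(n_samples, n_features))
          kernels.append(X @ X.T)                    # linear kernel for each dataset

      K_meta = consensus_kernel(kernels)
      print(K_meta.shape)   # the meta-kernel can then feed a kernel PCA or a kernel SOM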

  8. Protein fold recognition using geometric kernel data fusion.

    PubMed

    Zakeri, Pooya; Jeuris, Ben; Vandebril, Raf; Moreau, Yves

    2014-07-01

    Various approaches based on features extracted from protein sequences and often machine learning methods have been used in the prediction of protein folds. Finding an efficient technique for integrating these different protein features has received increasing attention. In particular, kernel methods are an interesting class of techniques for integrating heterogeneous data. Various methods have been proposed to fuse multiple kernels. Most techniques for multiple kernel learning focus on learning a convex linear combination of base kernels. In addition to the limitation of linear combinations, working with such approaches could cause a loss of potentially useful information. We design several techniques to combine kernel matrices by taking more involved, geometry inspired means of these matrices instead of convex linear combinations. We consider various sequence-based protein features including information extracted directly from position-specific scoring matrices and local sequence alignment. We evaluate our methods for classification on the SCOP PDB-40D benchmark dataset for protein fold recognition. The best overall accuracy on the protein fold recognition test set obtained by our methods is ∼ 86.7%. This is an improvement over the results of the best existing approach. Moreover, our computational model has been developed by incorporating the functional domain composition of proteins through a hybridization model. It is observed that by using our proposed hybridization model, the protein fold recognition accuracy is further improved to 89.30%. Furthermore, we investigate the performance of our approach on the protein remote homology detection problem by fusing multiple string kernels. The MATLAB code used for our proposed geometric kernel fusion frameworks are publicly available at http://people.cs.kuleuven.be/∼raf.vandebril/homepage/software/geomean.php?menu=5/. © The Author 2014. Published by Oxford University Press.
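    One concrete 'geometry-inspired mean' of kernel matrices is the log-Euclidean mean sketched below; this is only one of the matrix means explored in work of this kind, and the ridge term is an assumption added to keep the matrices safely positive definite before taking matrix logarithms.

      import numpy as np
      from scipy.linalg import expm, logm

      def log_euclidean_mean(kernels, ridge=1e-6):
          """Matrix exponential of the averaged matrix logarithms of SPD kernels."""
          logs = []
          for K in kernels:
              K = 0.5 * (K + K.T) + ridge * np.eye(K.shape[0])   # symmetrise + regularise
              logs.append(logm(K))
          return np.real(expm(sum(logs) / len(logs)))

      rng = np.random.default_rng(0)
      n = 20
      X1, X2 = rng.normal(size=(n, 30)), rng.normal(size=(n, 25))
      K1, K2 = X1 @ X1.T, X2 @ X2.T          # placeholder per-feature-set kernels
      K_fused = log_euclidean_mean([K1, K2])
      print(bool(np.allclose(K_fused, K_fused.T, atol=1e-8)))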

  9. Proteome analysis of the almond kernel (Prunus dulcis).

    PubMed

    Li, Shugang; Geng, Fang; Wang, Ping; Lu, Jiankang; Ma, Meihu

    2016-08-01

    Almond (Prunus dulcis) is a popular tree nut worldwide and offers many benefits to human health. However, the importance of almond kernel proteins for nutrition and human health requires further evaluation. The present study provides a systematic evaluation of the proteins in the almond kernel using proteomic analysis. The nutrient and amino acid content in almond kernels from Xinjiang is similar to that of American varieties; however, Xinjiang varieties have a higher protein content. Two-dimensional electrophoresis analysis demonstrated a wide distribution of molecular weights and isoelectric points of almond kernel proteins. A total of 434 proteins were identified by LC-MS/MS, and most were proteins that were experimentally confirmed for the first time. Gene ontology (GO) analysis of the 434 proteins indicated that the proteins are mainly involved in metabolic processes (67.5%), cellular processes (54.1%), and single-organism processes (43.4%); that their main molecular functions are catalytic activity (48.0%), binding (45.4%) and structural molecule activity (11.9%); and that they are primarily distributed in the cell (59.9%), organelle (44.9%), and membrane (22.8%). Almond kernel is a source of a wide variety of proteins. This study provides important information contributing to the screening and identification of almond proteins, the understanding of almond protein function, and the development of almond protein products. © 2015 Society of Chemical Industry.

  10. Control Transfer in Operating System Kernels

    DTIC Science & Technology

    1994-05-13

    microkernel system that runs less code in the kernel address space. To realize the performance benefit of allocating stacks in unmapped kseg0 memory, the...review how I modified the Mach 3.0 kernel to use continuations. Because of Mach’s message-passing microkernel structure, interprocess communication was...critical control transfer paths, deeply- nested call chains are undesirable in any case because of the function call overhead. 4.1.3 Microkernel Operating

  11. Experimental study of turbulent flame kernel propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mansour, Mohy; Peters, Norbert; Schrader, Lars-Uve

    2008-07-15

    Flame kernels in spark ignited combustion systems dominate the flame propagation and combustion stability and performance. They are likely controlled by the spark energy, flow field and mixing field. The aim of the present work is to experimentally investigate the structure and propagation of the flame kernel in turbulent premixed methane flow using advanced laser-based techniques. The spark is generated using a pulsed Nd:YAG laser with 20 mJ pulse energy in order to avoid the effect of the electrodes on the flame kernel structure and the variation of spark energy from shot to shot. Four flames have been investigated at equivalence ratios, φj, of 0.8 and 1.0 and jet velocities, Uj, of 6 and 12 m/s. A combined two-dimensional Rayleigh and LIPF-OH technique has been applied. The flame kernel structure has been collected at several time intervals from the laser ignition between 10 μs and 2 ms. The data show that the flame kernel structure starts with a spherical shape, changes gradually to peanut-like and then to mushroom-like, and is finally disturbed by the turbulence. The mushroom-like structure lasts longer in the stoichiometric flame and at the slower jet velocity. The growth rate of the average flame kernel radius is divided into two linear relations; the first one, during the first 100 μs, is almost three times faster than that at the later stage between 100 and 2000 μs. The flame propagation is slightly faster in leaner flames. The trends of the flame propagation, flame radius, flame cross-sectional area and mean flame temperature are related to the jet velocity and equivalence ratio. The relations obtained in the present work allow the prediction of any of these parameters at different conditions. (author)

  12. Bivariate discrete beta Kernel graduation of mortality data.

    PubMed

    Mazza, Angelo; Punzo, Antonio

    2015-07-01

    Various parametric/nonparametric techniques have been proposed in literature to graduate mortality data as a function of age. Nonparametric approaches, as for example kernel smoothing regression, are often preferred because they do not assume any particular mortality law. Among the existing kernel smoothing approaches, the recently proposed (univariate) discrete beta kernel smoother has been shown to provide some benefits. Bivariate graduation, over age and calendar years or durations, is common practice in demography and actuarial sciences. In this paper, we generalize the discrete beta kernel smoother to the bivariate case, and we introduce an adaptive bandwidth variant that may provide additional benefits when data on exposures to the risk of death are available; furthermore, we outline a cross-validation procedure for bandwidths selection. Using simulations studies, we compare the bivariate approach proposed here with its corresponding univariate formulation and with two popular nonparametric bivariate graduation techniques, based on Epanechnikov kernels and on P-splines. To make simulations realistic, a bivariate dataset, based on probabilities of dying recorded for the US males, is used. Simulations have confirmed the gain in performance of the new bivariate approach with respect to both the univariate and the bivariate competitors.

  13. A Linear Kernel for Co-Path/Cycle Packing

    NASA Astrophysics Data System (ADS)

    Chen, Zhi-Zhong; Fellows, Michael; Fu, Bin; Jiang, Haitao; Liu, Yang; Wang, Lusheng; Zhu, Binhai

    Bounded-Degree Vertex Deletion is a fundamental problem in graph theory that has new applications in computational biology. In this paper, we address a special case of Bounded-Degree Vertex Deletion, the Co-Path/Cycle Packing problem, which asks to delete as few vertices as possible such that the graph of the remaining (residual) vertices is composed of disjoint paths and simple cycles. The problem falls into the well-known class of 'node-deletion problems with hereditary properties', is hence NP-complete and unlikely to admit a polynomial time approximation algorithm with approximation factor smaller than 2. In the framework of parameterized complexity, we present a kernelization algorithm that produces a kernel with at most 37k vertices, improving on the super-linear kernel of Fellows et al.'s general theorem for Bounded-Degree Vertex Deletion. Using this kernel, and the method of bounded search trees, we devise an FPT algorithm that runs in time O*(3.24^k). On the negative side, we show that the problem is APX-hard and unlikely to have a kernel smaller than 2k by a reduction from Vertex Cover.

  14. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... classifications provided in this section. When the color of kernels in a lot generally conforms to the “light” or “light amber” classification, that color classification may be used to describe the lot in connection with the grade. (1) “Light” means that the outer surface of the kernel is mostly golden color or...

  15. 7 CFR 51.1403 - Kernel color classification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... classifications provided in this section. When the color of kernels in a lot generally conforms to the “light” or “light amber” classification, that color classification may be used to describe the lot in connection with the grade. (1) “Light” means that the outer surface of the kernel is mostly golden color or...

  16. 40 CFR 87.21 - Standards for exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) Definitions. Exhaust Emissions (New Aircraft Gas Turbine Engines) § 87.21 Standards for exhaust... each new aircraft gas turbine engine of class T8 manufactured on or after February 1, 1974, shall not exceed: Smoke number of 30. (b) Exhaust emissions of smoke from each new aircraft gas turbine engine of...

  17. Chemical and toxicological characterization of residential oil burner emissions: I. Yields and chemical characterization of extractables from combustion of No. 2 fuel oil at different Bacharach Smoke Numbers and firing cycles.

    PubMed Central

    Leary, J A; Biemann, K; Lafleur, A L; Kruzel, E L; Prado, G P; Longwell, J P; Peters, W A

    1987-01-01

    Particulates and complex organic mixtures were sampled from the exhaust of a flame retention head residential oil burner combusting No. 2 fuel oil at three firing conditions: continuous at Bacharach Smoke No. 1, and cyclic (5 min on, 10 min off) at Smoke Nos. 1 and 5. The complex mixtures were recovered by successive Soxhlet extraction of filtered particulates and XAD-2 sorbent resin with methylene chloride (DCM) and then methanol (MeOH). Bacterial mutagenicity [see Paper II (8)] was found in the DCM extractables. Samples of DCM extracts from the two cyclic firing conditions and of the raw fuel were separated by gravity column chromatography on alumina. The resulting fractions were further characterized by a range of instrumental methods. Average yields of both unextracted particulates and of DCM extractables, normalized to a basis of per unit weight of fuel fired, were lower for continuous firing than for cyclic firing. For cyclic firing, decreasing the smoke number lowered the particulates emissions but only slightly reduced the average yield of DCM extractables. These and similar observations, here reported for two other oil burners, show that adjusting the burner to a lower smoke number has little effect on, or may actually increase, emissions of organic extractables of potential public health interest. Modifications of the burner firing cycle aimed at approaching continuous operation offer promise for reducing the amount of complex organic emissions. Unburned fuel accounted for roughly half of the DCM extractables from cyclic firing of the flame retention head burner at high and low smoke number. Large (i.e., greater than 3 ring) polycyclic aromatic hydrocarbons (PAH) were not observed in the DCM extractables from cyclic firing. However, nitroaromatics, typified by alkylated nitronaphthalenes, alkyl-nitrobiphenyls, and alkyl-nitrophenanthrenes were found in a minor subfraction containing a significant portion of the total mutagenic activity of the cyclic low

  18. A framework for optimal kernel-based manifold embedding of medical image data.

    PubMed

    Zimmer, Veronika A; Lekadir, Karim; Hoogendoorn, Corné; Frangi, Alejandro F; Piella, Gemma

    2015-04-01

    Kernel-based dimensionality reduction is a widely used technique in medical image analysis. To fully unravel the underlying nonlinear manifold the selection of an adequate kernel function and of its free parameters is critical. In practice, however, the kernel function is generally chosen as Gaussian or polynomial and such standard kernels might not always be optimal for a given image dataset or application. In this paper, we present a study on the effect of the kernel functions in nonlinear manifold embedding of medical image data. To this end, we first carry out a literature review on existing advanced kernels developed in the statistics, machine learning, and signal processing communities. In addition, we implement kernel-based formulations of well-known nonlinear dimensional reduction techniques such as Isomap and Locally Linear Embedding, thus obtaining a unified framework for manifold embedding using kernels. Subsequently, we present a method to automatically choose a kernel function and its associated parameters from a pool of kernel candidates, with the aim to generate the most optimal manifold embeddings. Furthermore, we show how the calculated selection measures can be extended to take into account the spatial relationships in images, or used to combine several kernels to further improve the embedding results. Experiments are then carried out on various synthetic and phantom datasets for numerical assessment of the methods. Furthermore, the workflow is applied to real data that include brain manifolds and multispectral images to demonstrate the importance of the kernel selection in the analysis of high-dimensional medical images. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Relationship of source and sink in determining kernel composition of maize

    PubMed Central

    Seebauer, Juliann R.; Singletary, George W.; Krumpelman, Paulette M.; Ruffo, Matías L.; Below, Frederick E.

    2010-01-01

    The relative role of the maternal source and the filial sink in controlling the composition of maize (Zea mays L.) kernels is unclear and may be influenced by the genotype and the N supply. The objective of this study was to determine the influence of assimilate supply from the vegetative source and utilization of assimilates by the grain sink on the final composition of maize kernels. Intermated B73×Mo17 recombinant inbred lines (IBM RILs) which displayed contrasting concentrations of endosperm starch were grown in the field with deficient or sufficient N, and the source supply altered by ear truncation (45% reduction) at 15 d after pollination (DAP). The assimilate supply into the kernels was determined at 19 DAP using the agar trap technique, and the final kernel composition was measured. The influence of N supply and kernel ear position on final kernel composition was also determined for a commercial hybrid. Concentrations of kernel protein and starch could be altered by genotype or the N supply, but remained fairly constant along the length of the ear. Ear truncation also produced a range of variation in endosperm starch and protein concentrations. The C/N ratio of the assimilate supply at 19 DAP was directly related to the final kernel composition, with an inverse relationship between the concentrations of starch and protein in the mature endosperm. The accumulation of kernel starch and protein in maize is uniform along the ear, yet adaptable within genotypic limits, suggesting that kernel composition is source limited in maize. PMID:19917600

  20. Resummed memory kernels in generalized system-bath master equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mavros, Michael G.; Van Voorhis, Troy, E-mail: tvan@mit.edu

    2014-08-07

    Generalized master equations provide a concise formalism for studying reduced population dynamics. Usually, these master equations require a perturbative expansion of the memory kernels governing the dynamics; in order to prevent divergences, these expansions must be resummed. Resummation techniques of perturbation series are ubiquitous in physics, but they have not been readily studied for the time-dependent memory kernels used in generalized master equations. In this paper, we present a comparison of different resummation techniques for such memory kernels up to fourth order. We study specifically the spin-boson Hamiltonian as a model system bath Hamiltonian, treating the diabatic coupling between the two states as a perturbation. A novel derivation of the fourth-order memory kernel for the spin-boson problem is presented; then, the second- and fourth-order kernels are evaluated numerically for a variety of spin-boson parameter regimes. We find that resumming the kernels through fourth order using a Padé approximant results in divergent populations in the strong electronic coupling regime due to a singularity introduced by the nature of the resummation, and thus recommend a non-divergent exponential resummation (the “Landau-Zener resummation” of previous work). The inclusion of fourth-order effects in a Landau-Zener-resummed kernel is shown to improve both the dephasing rate and the obedience of detailed balance over simpler prescriptions like the non-interacting blip approximation, showing a relatively quick convergence on the exact answer. The results suggest that including higher-order contributions to the memory kernel of a generalized master equation and performing an appropriate resummation can provide a numerically-exact solution to system-bath dynamics for a general spectral density, opening the way to a new class of methods for treating system-bath dynamics.
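    To make the resummation idea concrete, a generic sketch is shown below for a kernel known through its second- and fourth-order contributions; the [1/1] Padé and exponential forms are the textbook constructions, and whether they coincide exactly with the expressions used in the paper is an assumption, as are the placeholder kernel shapes.

      import numpy as np

      def pade_resummed(K2, K4):
          """[1/1] Pade resummation of K ~ K2 + K4 + ...  ->  K2 / (1 - K4/K2)."""
          return K2 ** 2 / (K2 - K4)

      def exponential_resummed(K2, K4):
          """Exponential resummation of the same series  ->  K2 * exp(K4/K2)."""
          return K2 * np.exp(K4 / K2)

      # placeholder time-dependent kernel contributions on a time grid
      t = np.linspace(0.0, 5.0, 200)
      K2 = np.exp(-t) * np.cos(2.0 * t)                 # illustrative second-order kernel
      K4 = 0.2 * np.exp(-1.5 * t) * np.cos(2.0 * t)     # illustrative fourth-order kernel

      K_pade = pade_resummed(K2, K4)
      K_exp = exponential_resummed(K2, K4)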

  1. Improving prediction of heterodimeric protein complexes using combination with pairwise kernel.

    PubMed

    Ruan, Peiying; Hayashida, Morihiro; Akutsu, Tatsuya; Vert, Jean-Philippe

    2018-02-19

    Since many proteins become functional only after they interact with their partner proteins and form protein complexes, it is essential to identify the sets of proteins that form complexes. Therefore, several computational methods have been proposed to predict complexes from the topology and structure of experimental protein-protein interaction (PPI) networks. These methods work well to predict complexes involving at least three proteins, but generally fail at identifying complexes involving only two different proteins, called heterodimeric complexes or heterodimers. There is however an urgent need for efficient methods to predict heterodimers, since the majority of known protein complexes are precisely heterodimers. In this paper, we use three promising kernel functions: the Min kernel and two pairwise kernels, the Metric Learning Pairwise Kernel (MLPK) and the Tensor Product Pairwise Kernel (TPPK). We also consider normalized forms of the Min kernel. Then, we combine the Min kernel, or its normalized form, with one of the pairwise kernels by plugging one into the other. We applied kernels based on PPI, domain, phylogenetic profile, and subcellular localization properties to predicting heterodimers. Then, we evaluate our method by employing C-Support Vector Classification (C-SVC), carrying out 10-fold cross-validation, and calculating the average F-measures. The results suggest that the combination of the normalized Min kernel and MLPK leads to the best F-measure and improves the performance of our previous work, which had been the best existing method so far. We propose new methods to predict heterodimers, using a machine learning-based approach. We train a support vector machine (SVM) to discriminate interacting vs non-interacting protein pairs, based on information extracted from PPI, domain, phylogenetic profiles and subcellular localization. We evaluate in detail new kernel functions to encode these data, and report prediction performance that outperforms the state-of-the-art.
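    The two pairwise kernels named above have short closed forms built from a base kernel on single proteins; the sketch below uses the histogram-intersection form of the Min kernel on toy non-negative feature vectors, which is an illustrative assumption (the paper derives its base kernels from PPI, domain, phylogenetic profile, and localization data).

      import numpy as np

      def min_kernel(a, b):
          """Min (histogram-intersection) kernel on non-negative feature vectors."""
          return float(np.minimum(a, b).sum())

      def tppk(pair1, pair2, k=min_kernel):
          """Tensor Product Pairwise Kernel built on a base kernel k."""
          (a, b), (c, d) = pair1, pair2
          return k(a, c) * k(b, d) + k(a, d) * k(b, c)

      def mlpk(pair1, pair2, k=min_kernel):
          """Metric Learning Pairwise Kernel built on a base kernel k."""
          (a, b), (c, d) = pair1, pair2
          return (k(a, c) - k(a, d) - k(b, c) + k(b, d)) ** 2

      rng = np.random.default_rng(0)
      protein = lambda: rng.uniform(0.0, 1.0, size=6)   # toy non-negative protein features
      pair1, pair2 = (protein(), protein()), (protein(), protein())
      print(tppk(pair1, pair2), mlpk(pair1, pair2))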

  2. Biomass Burning Emissions from Fire Remote Sensing

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles

    2010-01-01

    Knowledge of the emission source strengths of different (particulate and gaseous) atmospheric constituents is one of the principal ingredients upon which the modeling and forecasting of their distribution and impacts depend. Biomass burning emissions are complex and difficult to quantify. However, satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP), which has a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. In this presentation, we will show how the satellite measurement of FRP is facilitating the quantitative characterization of biomass burning and smoke emission rates, and the implications of this unique capability for improving our understanding of smoke impacts on air quality, weather, and climate. We will also discuss some of the challenges and uncertainties associated with satellite measurement of FRP and how they are being addressed.

  3. Density Estimation with Mercer Kernels

    NASA Technical Reports Server (NTRS)

    Macready, William G.

    2003-01-01

    We present a new method for density estimation based on Mercer kernels. The density estimate can be understood as the density induced on a data manifold by a mixture of Gaussians fit in a feature space. As is usual, the feature space and data manifold are defined with any suitable positive-definite kernel function. We modify the standard EM algorithm for mixtures of Gaussians to infer the parameters of the density. One benefit of the approach is its conceptual simplicity and uniform applicability over many different types of data. Preliminary results are presented for a number of simple problems.

  4. Broken rice kernels and the kinetics of rice hydration and texture during cooking.

    PubMed

    Saleh, Mohammed; Meullenet, Jean-Francois

    2013-05-01

    During rice milling and processing, broken kernels are inevitably present, although to date it has been unclear as to how the presence of broken kernels affects rice hydration and cooked rice texture. Therefore, this work intended to study the effect of broken kernels in a rice sample on rice hydration and texture during cooking. Two medium-grain and two long-grain rice cultivars were harvested, dried and milled, and the broken kernels were separated from unbroken kernels. Broken rice kernels were subsequently combined with unbroken rice kernels forming treatments of 0, 40, 150, 350 or 1000 g kg(-1) broken kernels ratio. Rice samples were then cooked and the moisture content of the cooked rice, the moisture uptake rate, and rice hardness and stickiness were measured. As the amount of broken rice kernels increased, rice sample texture became increasingly softer (P < 0.05) but the unbroken kernels became significantly harder. Moisture content and moisture uptake rate were positively correlated, and cooked rice hardness was negatively correlated to the percentage of broken kernels in rice samples. Differences in the proportions of broken rice in a milled rice sample play a major role in determining the texture properties of cooked rice. Variations in the moisture migration kinetics between broken and unbroken kernels caused faster hydration of the cores of broken rice kernels, with greater starch leach-out during cooking affecting the texture of the cooked rice. The texture of cooked rice can be controlled, to some extent, by varying the proportion of broken kernels in milled rice. © 2012 Society of Chemical Industry.

  5. A new discrete dipole kernel for quantitative susceptibility mapping.

    PubMed

    Milovic, Carlos; Acosta-Cabronero, Julio; Pinto, José Miguel; Mattern, Hendrik; Andia, Marcelo; Uribe, Sergio; Tejos, Cristian

    2018-09-01

    Most approaches for quantitative susceptibility mapping (QSM) are based on a forward model approximation that employs a continuous Fourier transform operator to solve a differential equation system. Such formulation, however, is prone to high-frequency aliasing. The aim of this study was to reduce such errors using an alternative dipole kernel formulation based on the discrete Fourier transform and discrete operators. The impact of such an approach on forward model calculation and susceptibility inversion was evaluated in contrast to the continuous formulation both with synthetic phantoms and in vivo MRI data. The discrete kernel demonstrated systematically better fits to analytic field solutions, and showed less over-oscillations and aliasing artifacts while preserving low- and medium-frequency responses relative to those obtained with the continuous kernel. In the context of QSM estimation, the use of the proposed discrete kernel resulted in error reduction and increased sharpness. This proof-of-concept study demonstrated that discretizing the dipole kernel is advantageous for QSM. The impact on small or narrow structures such as the venous vasculature might by particularly relevant to high-resolution QSM applications with ultra-high field MRI - a topic for future investigations. The proposed dipole kernel has a straightforward implementation to existing QSM routines. Copyright © 2018 Elsevier Inc. All rights reserved.
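    To illustrate the contrast drawn above, the sketch below builds the standard continuous-Fourier dipole kernel and a discretised variant in which the squared wavenumbers are replaced by eigenvalues of a discrete second-difference (Laplacian) operator; this is a generic illustration of the idea and an assumption about the construction, not necessarily the exact operator definitions used in the paper.

      import numpy as np

      def continuous_dipole_kernel(shape):
          """D(k) = 1/3 - kz^2 / |k|^2 on the continuous-FT wavenumber grid."""
          kx, ky, kz = np.meshgrid(*[np.fft.fftfreq(n) for n in shape], indexing="ij")
          k2 = kx ** 2 + ky ** 2 + kz ** 2
          with np.errstate(invalid="ignore", divide="ignore"):
              D = 1.0 / 3.0 - kz ** 2 / k2
          D[0, 0, 0] = 0.0                      # define the undetermined DC term
          return D

      def discrete_dipole_kernel(shape):
          """Same form, but with each k_i^2 replaced by the DFT eigenvalue of the
          discrete second-difference operator, 2 - 2*cos(2*pi*n_i/N_i)."""
          lx, ly, lz = np.meshgrid(*[2.0 - 2.0 * np.cos(2.0 * np.pi * np.fft.fftfreq(n))
                                     for n in shape], indexing="ij")
          with np.errstate(invalid="ignore", divide="ignore"):
              D = 1.0 / 3.0 - lz / (lx + ly + lz)
          D[0, 0, 0] = 0.0
          return D

      Dc = continuous_dipole_kernel((32, 32, 32))
      Dd = discrete_dipole_kernel((32, 32, 32))
      print(np.max(np.abs(Dc - Dd)))   # the kernels agree at low and differ at high spatial frequencies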

  6. Genetic Analysis of Kernel Traits in Maize-Teosinte Introgression Populations.

    PubMed

    Liu, Zhengbin; Garcia, Arturo; McMullen, Michael D; Flint-Garcia, Sherry A

    2016-08-09

    Seed traits have been targeted by human selection during the domestication of crop species as a way to increase the caloric and nutritional content of food during the transition from hunter-gather to early farming societies. The primary seed trait under selection was likely seed size/weight as it is most directly related to overall grain yield. Additional seed traits involved in seed shape may have also contributed to larger grain. Maize (Zea mays ssp. mays) kernel weight has increased more than 10-fold in the 9000 years since domestication from its wild ancestor, teosinte (Z. mays ssp. parviglumis). In order to study how size and shape affect kernel weight, we analyzed kernel morphometric traits in a set of 10 maize-teosinte introgression populations using digital imaging software. We identified quantitative trait loci (QTL) for kernel area and length with moderate allelic effects that colocalize with kernel weight QTL. Several genomic regions with strong effects during maize domestication were detected, and a genetic framework for kernel traits was characterized by complex pleiotropic interactions. Our results both confirm prior reports of kernel domestication loci and identify previously uncharacterized QTL with a range of allelic effects, enabling future research into the genetic basis of these traits. Copyright © 2016 Liu et al.

  7. Genetic Analysis of Kernel Traits in Maize-Teosinte Introgression Populations

    PubMed Central

    Liu, Zhengbin; Garcia, Arturo; McMullen, Michael D.; Flint-Garcia, Sherry A.

    2016-01-01

    Seed traits have been targeted by human selection during the domestication of crop species as a way to increase the caloric and nutritional content of food during the transition from hunter-gather to early farming societies. The primary seed trait under selection was likely seed size/weight as it is most directly related to overall grain yield. Additional seed traits involved in seed shape may have also contributed to larger grain. Maize (Zea mays ssp. mays) kernel weight has increased more than 10-fold in the 9000 years since domestication from its wild ancestor, teosinte (Z. mays ssp. parviglumis). In order to study how size and shape affect kernel weight, we analyzed kernel morphometric traits in a set of 10 maize-teosinte introgression populations using digital imaging software. We identified quantitative trait loci (QTL) for kernel area and length with moderate allelic effects that colocalize with kernel weight QTL. Several genomic regions with strong effects during maize domestication were detected, and a genetic framework for kernel traits was characterized by complex pleiotropic interactions. Our results both confirm prior reports of kernel domestication loci and identify previously uncharacterized QTL with a range of allelic effects, enabling future research into the genetic basis of these traits. PMID:27317774

  8. Investigation of various energy deposition kernel refinements for the convolution/superposition method

    PubMed Central

    Huang, Jessie Y.; Eklund, David; Childress, Nathan L.; Howell, Rebecca M.; Mirkovic, Dragan; Followill, David S.; Kry, Stephen F.

    2013-01-01

    Purpose: Several simplifications used in clinical implementations of the convolution/superposition (C/S) method, specifically, density scaling of water kernels for heterogeneous media and use of a single polyenergetic kernel, lead to dose calculation inaccuracies. Although these weaknesses of the C/S method are known, it is not well known which of these simplifications has the largest effect on dose calculation accuracy in clinical situations. The purpose of this study was to generate and characterize high-resolution, polyenergetic, and material-specific energy deposition kernels (EDKs), as well as to investigate the dosimetric impact of implementing spatially variant polyenergetic and material-specific kernels in a collapsed cone C/S algorithm. Methods: High-resolution, monoenergetic water EDKs and various material-specific EDKs were simulated using the EGSnrc Monte Carlo code. Polyenergetic kernels, reflecting the primary spectrum of a clinical 6 MV photon beam at different locations in a water phantom, were calculated for different depths, field sizes, and off-axis distances. To investigate the dosimetric impact of implementing spatially variant polyenergetic kernels, depth dose curves in water were calculated using two different implementations of the collapsed cone C/S method. The first method uses a single polyenergetic kernel, while the second method fully takes into account spectral changes in the convolution calculation. To investigate the dosimetric impact of implementing material-specific kernels, depth dose curves were calculated for a simplified titanium implant geometry using both a traditional C/S implementation that performs density scaling of water kernels and a novel implementation using material-specific kernels. Results: For our high-resolution kernels, we found good agreement with the Mackie et al. kernels, with some differences near the interaction site for low photon energies (<500 keV). For our spatially variant polyenergetic kernels, we

  9. Effects of sample size on KERNEL home range estimates

    USGS Publications Warehouse

    Seaman, D.E.; Millspaugh, J.J.; Kernohan, Brian J.; Brundige, Gary C.; Raedeke, Kenneth J.; Gitzen, Robert A.

    1999-01-01

    Kernel methods for estimating home range are being used increasingly in wildlife research, but the effect of sample size on their accuracy is not known. We used computer simulations of 10-200 points/home range and compared accuracy of home range estimates produced by fixed and adaptive kernels with the reference (REF) and least-squares cross-validation (LSCV) methods for determining the amount of smoothing. Simulated home ranges varied from simple to complex shapes created by mixing bivariate normal distributions. We used the size of the 95% home range area and the relative mean squared error of the surface fit to assess the accuracy of the kernel home range estimates. For both measures, the bias and variance approached an asymptote at about 50 observations/home range. The fixed kernel with smoothing selected by LSCV provided the least-biased estimates of the 95% home range area. All kernel methods produced similar surface fit for most simulations, but the fixed kernel with LSCV had the lowest frequency and magnitude of very poor estimates. We reviewed 101 papers published in The Journal of Wildlife Management (JWM) between 1980 and 1997 that estimated animal home ranges. A minority of these papers used nonparametric utilization distribution (UD) estimators, and most did not adequately report sample sizes. We recommend that home range studies using kernel estimates use LSCV to determine the amount of smoothing, obtain a minimum of 30 observations per animal (but preferably ≥50), and report sample sizes in published results.
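    To make the smoothing-selection step concrete, the sketch below implements least-squares cross-validation for a fixed bivariate Gaussian kernel and then computes a 95% home-range area from the fitted density. The relocation data, bandwidth grid, and evaluation grid are hypothetical; the simulation design of the cited study is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical animal relocations drawn from a mixture of two bivariate normals.
pts = np.vstack([rng.normal([0, 0], 1.0, (40, 2)),
                 rng.normal([4, 3], 0.7, (20, 2))])
n = len(pts)
d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)   # pairwise squared distances

def gauss2(d2, h):
    # Bivariate Gaussian kernel with bandwidth h, evaluated from squared distances.
    return np.exp(-d2 / (2 * h * h)) / (2 * np.pi * h * h)

def lscv(h):
    # Least-squares cross-validation score for a fixed bivariate Gaussian kernel.
    int_f2 = gauss2(d2, h * np.sqrt(2)).sum() / n**2         # integral of fhat^2
    loo = (gauss2(d2, h).sum(0) - gauss2(0.0, h)) / (n - 1)   # leave-one-out fhat(x_i)
    return int_f2 - 2 * loo.mean()

hs = np.linspace(0.2, 2.0, 60)
h_best = hs[np.argmin([lscv(h) for h in hs])]

# 95% home-range area: evaluate fhat on a grid and keep the densest cells
# holding 95% of the probability mass.
gx, gy = np.meshgrid(np.linspace(-4, 8, 200), np.linspace(-4, 7, 200))
grid = np.column_stack([gx.ravel(), gy.ravel()])
gd2 = ((grid[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
f = gauss2(gd2, h_best).mean(1)
cell = (gx[0, 1] - gx[0, 0]) * (gy[1, 0] - gy[0, 0])
order = np.argsort(f)[::-1]
mass = np.cumsum(f[order]) * cell
area95 = cell * np.searchsorted(mass, 0.95)
print(f"LSCV bandwidth: {h_best:.2f}, 95% home-range area: {area95:.1f}")
```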

  10. Local coding based matching kernel method for image classification.

    PubMed

    Song, Yan; McLoughlin, Ian Vince; Dai, Li-Rong

    2014-01-01

    This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Word (BoV) techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK) method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines advantages of both BoV and kernel based metrics, and achieves a linear computational complexity. This enables efficient and scalable visual matching to be performed on large scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including 15-Scenes, Caltech101/256, PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.

  11. The flare kernel in the impulsive phase

    NASA Technical Reports Server (NTRS)

    Dejager, C.

    1986-01-01

    The impulsive phase of a flare is characterized by impulsive bursts of X-ray and microwave radiation, related to impulsive footpoint heating up to 50 or 60 MK, by upward gas velocities (150 to 400 km/sec) and by a gradual increase of the flare's thermal energy content. These phenomena, as well as non-thermal effects, are all related to the impulsive energy injection into the flare. The available observations are also quantitatively consistent with a model in which energy is injected into the flare by beams of energetic electrons, causing ablation of chromospheric gas, followed by convective rise of gas. Thus, a hole is burned into the chromosphere; at the end of the impulsive phase of an average flare the lower part of that hole is situated about 1800 km above the photosphere. H alpha and other optical and UV line emission is radiated by a thin layer (approx. 20 km) at the bottom of the flare kernel. The upward rising and outward streaming gas cools down by conduction in about 45 s. The non-thermal effects in the initial phase are due to curtailing of the energy distribution function by escape of energetic electrons. The single flux tube model of a flare does not fit with these observations; instead we propose the spaghetti-bundle model. Microwave and gamma-ray observations suggest the occurrence of dense flare knots of approx. 800 km diameter, and of high temperature. Future observations should concentrate on locating the microwave/gamma-ray sources, and on determining the kernel's fine structure and the related multi-loop structure of the flaring area.

  12. Hyperspectral Image Classification via Kernel Sparse Representation

    DTIC Science & Technology

    2013-01-01

    classification algorithms. Moreover, the spatial coherency across neighboring pixels is also incorporated through a kernelized joint sparsity model, where all of the pixels within a small neighborhood are jointly represented in the feature space by selecting a few common training... Index terms: hyperspectral imagery, joint sparsity model, kernel methods, sparse representation.

  13. Effects of Amygdaline from Apricot Kernel on Transplanted Tumors in Mice.

    PubMed

    Yamshanov, V A; Kovan'ko, E G; Pustovalov, Yu I

    2016-03-01

    The effects of amygdaline from apricot kernel added to fodder on the growth of transplanted LYO-1 and Ehrlich carcinoma were studied in mice. Apricot kernels inhibited the growth of both tumors. Apricot kernels, raw and after thermal processing, given 2 days before transplantation produced a pronounced antitumor effect. Heat-processed apricot kernels given 3 days after transplantation modified the tumor growth and prolonged animal lifespan. Thermal treatment did not considerably reduce the antitumor effect of apricot kernels. It was hypothesized that the antitumor effect of amygdaline on Ehrlich carcinoma and LYO-1 lymphosarcoma was associated with the presence of bacterial genome in the tumor.

  14. Fire, Fuel, and Smoke Program: 2014 Research Accomplishments

    Treesearch

    Faith Ann Heinsch; Robin J. Innes; Colin C. Hardy; Kristine M. Lee

    2015-01-01

    The Fire, Fuel, and Smoke Science Program (FFS) of the U.S. Forest Service, Rocky Mountain Research Station focuses on fundamental and applied research in wildland fire, from fire physics and fire ecology to fuels management and smoke emissions. Located at the Missoula Fire Sciences Laboratory in Montana, the scientists, engineers, technicians, and support staff in FFS...

  15. Using the Intel Math Kernel Library on Peregrine | High-Performance Computing | NREL

    Science.gov Websites

    Learn how to use the Intel Math Kernel Library (MKL) with Peregrine system software. Core math functions in MKL include BLAS, LAPACK, ScaLAPACK, sparse solvers, and fast Fourier transforms.

  16. Semi-supervised learning for ordinal Kernel Discriminant Analysis.

    PubMed

    Pérez-Ortiz, M; Gutiérrez, P A; Carbonero-Ruz, M; Hervás-Martínez, C

    2016-12-01

    Ordinal classification considers those classification problems where the labels of the variable to predict follow a given order. Naturally, labelled data is scarce or difficult to obtain in this type of problem because, in many cases, ordinal labels are given by a user or expert (e.g. in recommendation systems). Firstly, this paper develops a new strategy for ordinal classification where both labelled and unlabelled data are used in the model construction step (a scheme which is referred to as semi-supervised learning). More specifically, the ordinal version of kernel discriminant learning is extended for this setting considering the neighbourhood information of unlabelled data, which is proposed to be computed in the feature space induced by the kernel function. Secondly, a new method for semi-supervised kernel learning is devised in the context of ordinal classification, which is combined with our developed classification strategy to optimise the kernel parameters. The experiments conducted compare 6 different approaches for semi-supervised learning in the context of ordinal classification in a battery of 30 datasets, showing (1) the good synergy of the ordinal version of discriminant analysis and the use of unlabelled data and (2) the advantage of computing distances in the feature space induced by the kernel function. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Hazardous Compounds in Tobacco Smoke

    PubMed Central

    Talhout, Reinskje; Schulz, Thomas; Florek, Ewa; van Benthem, Jan; Wester, Piet; Opperhuizen, Antoon

    2011-01-01

    Tobacco smoke is a toxic and carcinogenic mixture of more than 5,000 chemicals. The present article provides a list of 98 hazardous smoke components, based on an extensive literature search for known smoke components and their human health inhalation risks. An electronic database of smoke components containing more than 2,200 entries was generated. Emission levels in mainstream smoke have been found for 542 of the components and a human inhalation risk value for 98 components. As components with potential carcinogenic, cardiovascular and respiratory effects have been included, the three major smoke-related causes of death are all covered by the list. Given that the currently used Hoffmann list of hazardous smoke components is based on data from the 1990s and only includes carcinogens, it is recommended that the current list of 98 hazardous components is used for regulatory purposes instead. To enable risk assessment of components not covered by this list, thresholds of toxicological concern (TTC) have been established from the inhalation risk values found: 0.0018 μg day−1 for all risks, and 1.2 μg day−1 for all risks excluding carcinogenicity, the latter being similar to previously reported inhalation TTCs. PMID:21556207

  18. High speed sorting of Fusarium-damaged wheat kernels

    USDA-ARS?s Scientific Manuscript database

    Recent studies have found that resistance to Fusarium fungal infection can be inherited in wheat from one generation to another. However, no cost-effective method is yet available to separate Fusarium-damaged wheat kernels from undamaged kernels so that wheat breeders can take advantage of...

  19. CW-SSIM kernel based random forest for image classification

    NASA Astrophysics Data System (ADS)

    Fan, Guangzhe; Wang, Zhou; Wang, Jiheng

    2010-07-01

    The complex wavelet structural similarity (CW-SSIM) index has been proposed as a powerful image similarity metric that is robust to translation, scaling and rotation of images, but how to employ it in image classification applications has not been deeply investigated. In this paper, we incorporate CW-SSIM as a kernel function into a random forest learning algorithm. This leads to a novel image classification approach that does not require a feature extraction or dimension reduction stage at the front end. We use hand-written digit recognition as an example to demonstrate our algorithm. We compare the performance of the proposed approach with random forest learning based on other kernels, including the widely adopted Gaussian and the inner product kernels. Empirical evidence shows that the proposed method is superior in its classification power. We also compared the proposed approach with the direct random forest method without a kernel and with the popular kernel-learning method, the support vector machine. Our test results based on both simulated and real-world data suggest that the proposed approach outperforms traditional methods without requiring a feature selection procedure.

  20. Insights from Classifying Visual Concepts with Multiple Kernel Learning

    PubMed Central

    Binder, Alexander; Nakajima, Shinichi; Kloft, Marius; Müller, Christina; Samek, Wojciech; Brefeld, Ulf; Müller, Klaus-Robert; Kawanabe, Motoaki

    2012-01-01

    Combining information from various image features has become a standard technique in concept recognition tasks. However, the optimal way of fusing the resulting kernel functions is usually unknown in practical applications. Multiple kernel learning (MKL) techniques allow one to determine an optimal linear combination of such similarity matrices. Classical approaches to MKL promote sparse mixtures. Unfortunately, 1-norm regularized MKL variants are often observed to be outperformed by an unweighted sum kernel. The main contributions of this paper are the following: we apply a recently developed non-sparse MKL variant to state-of-the-art concept recognition tasks from the application domain of computer vision. We provide insights on the benefits and limits of non-sparse MKL and compare it against its direct competitors, the sum-kernel SVM and sparse MKL. We report empirical results for the PASCAL VOC 2009 Classification and ImageCLEF2010 Photo Annotation challenge data sets. Data sets (kernel matrices) as well as further information are available at http://doc.ml.tu-berlin.de/image_mkl/(Accessed 2012 Jun 25). PMID:22936970
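    A minimal illustration of the sum-kernel baseline the paper compares against: several base kernels are combined with fixed, equal weights and fed to an SVM with a precomputed Gram matrix (MKL would instead learn the weights). The choice of base kernels, their parameters, and the stand-in dataset are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel, chi2_kernel
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split

# Three base kernels standing in for different image-feature similarities.
X, y = load_digits(return_X_y=True)
X = X / 16.0                                   # chi2 kernel requires non-negative inputs
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

def combined_gram(A, B, weights=(1/3, 1/3, 1/3)):
    # Fixed-weight linear combination of base kernels (the unweighted sum-kernel
    # baseline uses equal weights; MKL would learn them from data).
    kernels = (rbf_kernel(A, B, gamma=0.05), linear_kernel(A, B), chi2_kernel(A, B))
    return sum(w * K for w, K in zip(weights, kernels))

clf = SVC(kernel="precomputed", C=10.0).fit(combined_gram(Xtr, Xtr), ytr)
print("Test accuracy:", clf.score(combined_gram(Xte, Xtr), yte))
```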

  1. Nonparametric entropy estimation using kernel densities.

    PubMed

    Lake, Douglas E

    2009-01-01

    The entropy of experimental data from the biological and medical sciences provides additional information over summary statistics. Calculating entropy involves estimates of probability density functions, which can be effectively accomplished using kernel density methods. Kernel density estimation has been widely studied and a univariate implementation is readily available in MATLAB. The traditional definition of Shannon entropy is part of a larger family of statistics, called Renyi entropy, which are useful in applications that require a measure of the Gaussianity of data. Of particular note is the quadratic entropy which is related to the Friedman-Tukey (FT) index, a widely used measure in the statistical community. One application where quadratic entropy is very useful is the detection of abnormal cardiac rhythms, such as atrial fibrillation (AF). Asymptotic and exact small-sample results for optimal bandwidth and kernel selection to estimate the FT index are presented and lead to improved methods for entropy estimation.
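    For Gaussian kernels, the quadratic (Renyi-2) entropy of a kernel density estimate has a closed form, because the integral of the squared density reduces to a double sum of Gaussians evaluated at pairwise differences. The sketch below uses that identity with a rule-of-thumb bandwidth; the data series and bandwidth choice are illustrative, not the optimal small-sample selections derived in the paper.

```python
import numpy as np

def quadratic_entropy(x, h):
    """Plug-in estimate of the Renyi quadratic entropy H2 = -log(integral of f^2)
    for a univariate Gaussian kernel density estimate with bandwidth h.
    For Gaussian kernels the integral has the closed form
    (1/n^2) * sum_ij N(x_i - x_j; 0, 2*h^2)."""
    x = np.asarray(x, float)
    n = x.size
    diff2 = (x[:, None] - x[None, :]) ** 2
    info_potential = np.exp(-diff2 / (4 * h * h)).sum() / (n * n * 2 * h * np.sqrt(np.pi))
    return -np.log(info_potential)

# Hypothetical RR-interval series (seconds); values are illustrative only.
rng = np.random.default_rng(1)
rr = 0.8 + 0.05 * rng.standard_normal(300)
h = 1.06 * rr.std() * len(rr) ** (-1 / 5)   # Silverman's rule-of-thumb bandwidth
print(f"Quadratic (Renyi-2) entropy estimate: {quadratic_entropy(rr, h):.3f}")
```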

  2. Smoke, Clouds, and Radiation-Brazil (SCAR-B) Experiment

    NASA Technical Reports Server (NTRS)

    Kaufman, Y. J.; Hobbs, P. V.; Kirchoff, V. W. J. H.; Artaxo, P.; Remer, L. A.; Holben, B. N.; King, M. D.; Ward, D. E.; Prins, E. M.; Longo, K. M.; hide

    1998-01-01

    The Smoke, Clouds, and Radiation-Brazil (SCAR-B) field project took place in the Brazilian Amazon and cerrado regions in August-September 1995 as a collaboration between Brazilian and American scientists. SCAR-B, a comprehensive experiment to study biomass burning, emphasized measurements of surface biomass, fires, smoke aerosol and trace gases, clouds, and radiation; their climatic effects; and remote sensing from aircraft and satellites. It included aircraft and ground-based in situ measurements of smoke emission factors and the compositions, sizes, and optical properties of the smoke particles; studies of the formation of ozone; the transport and evolution of smoke; and smoke interactions with water vapor and clouds. This overview paper introduces SCAR-B and summarizes some of the main results obtained so far. (1) Fires: measurements of the size distribution of fires, using the 50 m resolution MODIS Airborne Simulator, show that most of the fires are small (e.g. 0.005 square km), but the satellite sensors (e.g., AVHRR and MODIS with 1 km resolution) can detect fires in Brazil which are responsible for 60-85% of the burned biomass; (2) Aerosol: smoke particles emitted from fires increase their radius by as much as 60% during their first three days in the atmosphere due to condensation and coagulation, reaching a mass median radius of 0.13-0.17 microns; (3) Radiative forcing: estimates of the globally averaged direct radiative forcing due to smoke worldwide, based on the properties of smoke measured in SCAR-B (-0.1 to -0.3 W m(exp -2)), are smaller than previously modeled due to a lower single-scattering albedo (0.8 to 0.9), smaller scattering efficiency (3 square meters g(exp -1) at 550 nm), and a low humidification factor; and (4) Effect on clouds: a good relationship was found between cloud condensation nuclei and smoke volume concentrations, thus an increase in the smoke emission is expected to affect cloud properties. In SCAR-B, new techniques were developed

  3. Speed correlation and emission of truck vehicles on dynamic conditions

    NASA Astrophysics Data System (ADS)

    Lutfie, M.; Samang, L.; Adisasmita, S. A.; Ramli, M. I.

    2018-04-01

    Concentrations of CO2, NOx, smoke, CO, and HC released from several trucks were measured, with emission and speed data recorded every 5 seconds using a mobile emission analyzer that draws gas from the exhaust of the sample vehicles. In the field, the emission test equipment was mounted on the right side of the truck and sampled the 5 gas compounds for 5-20 minutes, in order to characterize truck emissions under moving conditions while accounting for load factors. The sample vehicles were diesel-fueled trucks. In general, the study found that the faster the vehicle speed, the greater the amounts of CO2, NOx, smoke, CO, and HC released; emissions increase as vehicle speed increases. Thus, the relationship between CO2, NOx, smoke, CO, and HC concentrations and vehicle speed is linear.
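    The reported linear speed-emission relationship can be summarized with an ordinary least-squares fit, as in the sketch below; the speed and CO2 values used are made up solely to illustrate the calculation.

```python
import numpy as np

# Hypothetical 5-second samples of truck speed (km/h) and CO2 concentration (%);
# values are illustrative, not measurements from the study.
speed = np.array([12, 18, 25, 31, 38, 44, 52, 60], float)
co2 = np.array([2.1, 2.4, 2.9, 3.1, 3.6, 3.9, 4.5, 5.0])

# Fit the linear relationship: concentration = a * speed + b.
a, b = np.polyfit(speed, co2, 1)
pred = a * speed + b
r2 = 1 - ((co2 - pred) ** 2).sum() / ((co2 - co2.mean()) ** 2).sum()
print(f"CO2 ~ {a:.3f} * speed + {b:.2f}  (R^2 = {r2:.2f})")
```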

  4. New Fukui, dual and hyper-dual kernels as bond reactivity descriptors.

    PubMed

    Franco-Pérez, Marco; Polanco-Ramírez, Carlos-A; Ayers, Paul W; Gázquez, José L; Vela, Alberto

    2017-06-21

    We define three new linear response indices with promising applications for bond reactivity using the mathematical framework of τ-CRT (finite temperature chemical reactivity theory). The τ-Fukui kernel is defined as the ratio between the fluctuations of the average electron density at two different points in space and the fluctuations in the average electron number and is designed to integrate to the finite-temperature definition of the electronic Fukui function. When this kernel is condensed, it can be interpreted as a site-reactivity descriptor of the boundary region between two atoms. The τ-dual kernel corresponds to the first order response of the Fukui kernel and is designed to integrate to the finite temperature definition of the dual descriptor; it indicates the ambiphilic reactivity of a specific bond and enriches the traditional dual descriptor by allowing one to distinguish between the electron-accepting and electron-donating processes. Finally, the τ-hyper dual kernel is defined as the second-order derivative of the Fukui kernel and is proposed as a measure of the strength of ambiphilic bonding interactions. Although these quantities have not been proposed before, our results for the τ-Fukui kernel and for the τ-dual kernel can be derived in the zero-temperature formulation of chemical reactivity theory with, among other things, the widely used parabolic interpolation model.

  5. Quasi-Dual-Packed-Kerneled Au49 (2,4-DMBT)27 Nanoclusters and the Influence of Kernel Packing on the Electrochemical Gap.

    PubMed

    Liao, Lingwen; Zhuang, Shengli; Wang, Pu; Xu, Yanan; Yan, Nan; Dong, Hongwei; Wang, Chengming; Zhao, Yan; Xia, Nan; Li, Jin; Deng, Haiteng; Pei, Yong; Tian, Shi-Kai; Wu, Zhikun

    2017-10-02

    Although face-centered cubic (fcc), body-centered cubic (bcc), hexagonal close-packed (hcp), and other structured gold nanoclusters have been reported, it was unclear whether gold nanoclusters with mix-packed (fcc and non-fcc) kernels exist, and the correlation between kernel packing and the properties of gold nanoclusters is unknown. A Au49(2,4-DMBT)27 nanocluster with a shell electron count of 22 has now been synthesized and structurally resolved by single-crystal X-ray crystallography, which revealed that Au49(2,4-DMBT)27 contains a unique Au34 kernel consisting of one quasi-fcc-structured Au21 unit and one non-fcc-structured Au13 unit (where 2,4-DMBTH = 2,4-dimethylbenzenethiol). Further experiments revealed that the kernel packing greatly influences the electrochemical gap (EG) and the fcc structure has a larger EG than the investigated non-fcc structure. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Application of MODIS-Derived Active Fire Radiative Energy to Fire Disaster and Smoke Pollution Monitoring

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Kaufman, Yoram J.; Hao, Wei Min; Habib, Shahid

    2004-01-01

    The radiative energy emitted by large fires and the corresponding smoke aerosol loading are simultaneously measured from the MODIS sensor from both the Terra and Aqua satellites. Quantitative relationships between the rates of emission of fire radiative energy and smoke are being developed for different fire-prone regions of the globe. Preliminary results are presented. When fully developed, the system will enable the use of MODIS direct broadcast fire data for near real-time monitoring of fire strength and smoke emission as well as forecasting of fire progression and smoke dispersion, several hours to a few days in advance.

  7. Fast generation of sparse random kernel graphs

    DOE PAGES

    Hagberg, Aric; Lemons, Nathan; Du, Wen -Bo

    2015-09-10

    The development of kernel-based inhomogeneous random graphs has provided models that are flexible enough to capture many observed characteristics of real networks, and that are also mathematically tractable. We specify a class of inhomogeneous random graph models, called random kernel graphs, that produces sparse graphs with tunable graph properties, and we develop an efficient generation algorithm to sample random instances from this model. As real-world networks are usually large, it is essential that the run-time of generation algorithms scales better than quadratically in the number of vertices n. We show that for many practical kernels our algorithm runs in time at most O(n(log n)²). As an example, we show how to generate samples of power-law degree distribution graphs with tunable assortativity.

  8. Wavelet SVM in Reproducing Kernel Hilbert Space for hyperspectral remote sensing image classification

    NASA Astrophysics Data System (ADS)

    Du, Peijun; Tan, Kun; Xing, Xiaoshi

    2010-12-01

    Combining the Support Vector Machine (SVM) with wavelet analysis, we constructed a wavelet SVM (WSVM) classifier based on wavelet kernel functions in a Reproducing Kernel Hilbert Space (RKHS). In conventional kernel theory, SVM faces the bottleneck of kernel parameter selection, which leads to time-consuming training and low classification accuracy. The wavelet kernel in RKHS is a kind of multidimensional wavelet function that can approximate arbitrary nonlinear functions. Implications for semiparametric estimation are also discussed in this paper. An Airborne Operational Modular Imaging Spectrometer II (OMIS II) hyperspectral remote sensing image with 64 bands and Reflective Optics System Imaging Spectrometer (ROSIS) data with 115 bands were used to evaluate the performance and accuracy of the proposed WSVM classifier. The experimental results indicate that the WSVM classifier obtains the highest accuracy when using the Coiflet kernel function in the wavelet transform. In contrast with some traditional classifiers, including Spectral Angle Mapping (SAM) and Minimum Distance Classification (MDC), and with an SVM classifier using the Radial Basis Function kernel, the proposed wavelet SVM classifier using the wavelet kernel function in a Reproducing Kernel Hilbert Space clearly improves classification accuracy.
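    The sketch below shows one commonly used form of wavelet kernel (a product of cosine-modulated Gaussians over the feature dimensions) plugged into a standard SVM as a callable kernel. The dilation parameter, regularization constant, and stand-in data are illustrative assumptions; this is not the Coiflet-based kernel or the OMIS II/ROSIS data used in the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

A = 4.0  # wavelet dilation parameter (illustrative value)

def wavelet_kernel(X, Y, a=A):
    # Morlet-style wavelet kernel: k(x, y) = prod_d cos(1.75*(x_d - y_d)/a)
    #                                        * exp(-(x_d - y_d)^2 / (2*a^2))
    diff = X[:, None, :] - Y[None, :, :]
    return np.prod(np.cos(1.75 * diff / a) * np.exp(-diff**2 / (2 * a * a)), axis=2)

# Stand-in data for a pixel-wise spectral classification task (not OMIS/ROSIS data).
X, y = make_classification(n_samples=300, n_features=20, n_informative=10, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

clf = SVC(kernel=wavelet_kernel, C=10.0).fit(Xtr, ytr)
print("Test accuracy:", clf.score(Xte, yte))
```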

  9. Distributed smoothed tree kernel for protein-protein interaction extraction from the biomedical literature

    PubMed Central

    Murugesan, Gurusamy; Abdulkadhar, Sabenabanu; Natarajan, Jeyakumar

    2017-01-01

    Automatic extraction of protein-protein interaction (PPI) pairs from biomedical literature is a widely examined task in biological information extraction. Currently, many kernel-based approaches, such as the linear kernel, tree kernel, graph kernel, and combinations of multiple kernels, have achieved promising results in the PPI task. However, most of these kernel methods fail to capture the semantic relation information between two entities. In this paper, we present a special type of tree kernel for PPI extraction which exploits both syntactic (structural) information and semantic vector information, known as the Distributed Smoothed Tree kernel (DSTK). DSTK comprises distributed trees carrying syntactic information along with distributional semantic vectors representing the semantic information of the sentences or phrases. To generate a robust machine learning model, a composition of a feature-based kernel and DSTK was built using an ensemble support vector machine (SVM). Five different corpora (AIMed, BioInfer, HPRD50, IEPA, and LLL) were used for evaluating the performance of our system. Experimental results show that our system achieves a better F-score on all five corpora than other state-of-the-art systems. PMID:29099838

  10. Distributed smoothed tree kernel for protein-protein interaction extraction from the biomedical literature.

    PubMed

    Murugesan, Gurusamy; Abdulkadhar, Sabenabanu; Natarajan, Jeyakumar

    2017-01-01

    Automatic extraction of protein-protein interaction (PPI) pairs from biomedical literature is a widely examined task in biological information extraction. Currently, many kernel-based approaches, such as the linear kernel, tree kernel, graph kernel, and combinations of multiple kernels, have achieved promising results in the PPI task. However, most of these kernel methods fail to capture the semantic relation information between two entities. In this paper, we present a special type of tree kernel for PPI extraction which exploits both syntactic (structural) information and semantic vector information, known as the Distributed Smoothed Tree kernel (DSTK). DSTK comprises distributed trees carrying syntactic information along with distributional semantic vectors representing the semantic information of the sentences or phrases. To generate a robust machine learning model, a composition of a feature-based kernel and DSTK was built using an ensemble support vector machine (SVM). Five different corpora (AIMed, BioInfer, HPRD50, IEPA, and LLL) were used for evaluating the performance of our system. Experimental results show that our system achieves a better F-score on all five corpora than other state-of-the-art systems.

  11. Airborne characterization of smoke marker ratios from prescribed burning

    Treesearch

    A. P. Sullivan; A. A. May; T. Lee; G. R. McMeeking; S. M. Kreidenweis; S. K. Akagi; R. J. Yokelson; S. P. Urbanski; J. L. Collett

    2014-01-01

    A Particle-Into-Liquid Sampler - Total Organic Carbon (PILS-TOC) and fraction collector system was flown aboard a Twin Otter aircraft sampling prescribed burning emissions in South Carolina in November 2011 to obtain smoke marker measurements. The fraction collector provided 2 min time-integrated offline samples for carbohydrate (i.e., smoke markers levoglucosan,...

  12. Genome-wide Association Analysis of Kernel Weight in Hard Winter Wheat

    USDA-ARS?s Scientific Manuscript database

    Wheat kernel weight is an important and heritable component of wheat grain yield and a key predictor of flour extraction. Genome-wide association analysis was conducted to identify genomic regions associated with kernel weight and kernel weight environmental response in 8 trials of 299 hard winter ...

  13. Evidence-Based Kernels: Fundamental Units of Behavioral Influence

    ERIC Educational Resources Information Center

    Embry, Dennis D.; Biglan, Anthony

    2008-01-01

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior-influence procedure shown through experimental analysis to affect a specific behavior and that is indivisible in the sense that removing any of…

  14. Noise kernels of stochastic gravity in conformally-flat spacetimes

    NASA Astrophysics Data System (ADS)

    Cho, H. T.; Hu, B. L.

    2015-03-01

    The central object in the theory of semiclassical stochastic gravity is the noise kernel, which is the symmetric two point correlation function of the stress-energy tensor. Using the corresponding Wightman functions in Minkowski, Einstein and open Einstein spaces, we construct the noise kernels of a conformally coupled scalar field in these spacetimes. From them we show that the noise kernels in conformally-flat spacetimes, including the Friedmann-Robertson-Walker universes, can be obtained in closed analytic forms by using a combination of conformal and coordinate transformations.

  15. Travel-time sensitivity kernels in long-range propagation.

    PubMed

    Skarsoulis, E K; Cornuelle, B D; Dzieciuch, M A

    2009-11-01

    Wave-theoretic travel-time sensitivity kernels (TSKs) are calculated in two-dimensional (2D) and three-dimensional (3D) environments and their behavior with increasing propagation range is studied and compared to that of ray-theoretic TSKs and corresponding Fresnel-volumes. The differences between the 2D and 3D TSKs average out when horizontal or cross-range marginals are considered, which indicates that they are not important in the case of range-independent sound-speed perturbations or perturbations of large scale compared to the lateral TSK extent. With increasing range, the wave-theoretic TSKs expand in the horizontal cross-range direction, their cross-range extent being comparable to that of the corresponding free-space Fresnel zone, whereas they remain bounded in the vertical. Vertical travel-time sensitivity kernels (VTSKs)-one-dimensional kernels describing the effect of horizontally uniform sound-speed changes on travel-times-are calculated analytically using a perturbation approach, and also numerically, as horizontal marginals of the corresponding TSKs. Good agreement between analytical and numerical VTSKs, as well as between 2D and 3D VTSKs, is found. As an alternative method to obtain wave-theoretic sensitivity kernels, the parabolic approximation is used; the resulting TSKs and VTSKs are in good agreement with normal-mode results. With increasing range, the wave-theoretic VTSKs approach the corresponding ray-theoretic sensitivity kernels.

  16. Multi-instrument comparison and compilation of non-methane organic gas emissions from biomass burning and implications for smoke-derived secondary organic aerosol precursors

    NASA Astrophysics Data System (ADS)

    Hatch, Lindsay E.; Yokelson, Robert J.; Stockwell, Chelsea E.; Veres, Patrick R.; Simpson, Isobel J.; Blake, Donald R.; Orlando, John J.; Barsanti, Kelley C.

    2017-01-01

    Multiple trace-gas instruments were deployed during the fourth Fire Lab at Missoula Experiment (FLAME-4), including the first application of proton-transfer-reaction time-of-flight mass spectrometry (PTR-TOFMS) and comprehensive two-dimensional gas chromatography-time-of-flight mass spectrometry (GC × GC-TOFMS) for laboratory biomass burning (BB) measurements. Open-path Fourier transform infrared spectroscopy (OP-FTIR) was also deployed, as well as whole-air sampling (WAS) with one-dimensional gas chromatography-mass spectrometry (GC-MS) analysis. This combination of instruments provided an unprecedented level of detection and chemical speciation. The chemical composition and emission factors (EFs) determined by these four analytical techniques were compared for four representative fuels. The results demonstrate that the instruments are highly complementary, with each covering some unique and important ranges of compositional space, thus demonstrating the need for multi-instrument approaches to adequately characterize BB smoke emissions. Emission factors for overlapping compounds generally compared within experimental uncertainty, despite some outliers, including monoterpenes. Data from all measurements were synthesized into a single EF database that includes over 500 non-methane organic gases (NMOGs) to provide a comprehensive picture of speciated, gaseous BB emissions. The identified compounds were assessed as a function of volatility; 6-11 % of the total NMOG EF was associated with intermediate-volatility organic compounds (IVOCs). These atmospherically relevant compounds historically have been unresolved in BB smoke measurements and thus are largely missing from emission inventories. Additionally, the identified compounds were screened for published secondary organic aerosol (SOA) yields. Of the total reactive carbon (defined as EF scaled by the OH rate constant and carbon number of each compound) in the BB emissions, 55-77 % was associated with compounds for

  17. Validation of Born Traveltime Kernels

    NASA Astrophysics Data System (ADS)

    Baig, A. M.; Dahlen, F. A.; Hung, S.

    2001-12-01

    Most inversions for Earth structure using seismic traveltimes rely on linear ray theory to translate observed traveltime anomalies into seismic velocity anomalies distributed throughout the mantle. However, ray theory is not an appropriate tool to use when velocity anomalies have scale lengths less than the width of the Fresnel zone. In the presence of these structures, we need to turn to a scattering theory in order to adequately describe all of the features observed in the waveform. By coupling the Born approximation to ray theory, the first-order dependence of the cross-correlated traveltimes on heterogeneity (described by the Fréchet derivative or, more colourfully, the banana-doughnut kernel) may be determined. To determine for what range of parameters these banana-doughnut kernels outperform linear ray theory, we generate several random media specified by their statistical properties, namely the RMS slowness perturbation and the scale length of the heterogeneity. Acoustic waves are numerically generated from a point source using a 3-D pseudo-spectral wave propagation code. These waves are then recorded at a variety of propagation distances from the source, introducing a third parameter to the problem: the number of wavelengths traversed by the wave. When all of the heterogeneity has scale lengths larger than the width of the Fresnel zone, ray theory does as good a job at predicting the cross-correlated traveltime as the banana-doughnut kernels do. Below this limit, wavefront healing becomes a significant effect and ray theory ceases to be effective even though the kernels remain relatively accurate provided the heterogeneity is weak. The study of wave propagation in random media is of more general interest, and we will also show how our measurements of the velocity shift and the variance of traveltime compare to various theoretical predictions in a given regime.

  18. End-use quality of soft kernel durum wheat

    USDA-ARS?s Scientific Manuscript database

    Kernel texture is a major determinant of end-use quality of wheat. Durum wheat is known for its very hard texture, which influences how it is milled and for what products it is well suited. We developed soft kernel durum wheat lines via Ph1b-mediated homoeologous recombination with Dr. Leonard Joppa...

  19. Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery.

    PubMed

    Feng, Yunlong; Lv, Shao-Gao; Hang, Hanyuan; Suykens, Johan A K

    2016-03-01

    Kernelized elastic net regularization (KENReg) is a kernelization of the well-known elastic net regularization (Zou & Hastie, 2005). The kernel in KENReg is not required to be a Mercer kernel since it learns from a kernelized dictionary in the coefficient space. Feng, Yang, Zhao, Lv, and Suykens (2014) showed that KENReg has some nice properties including stability, sparseness, and generalization. In this letter, we continue our study on KENReg by conducting a refined learning theory analysis. This letter makes the following three main contributions. First, we present refined error analysis on the generalization performance of KENReg. The main difficulty of analyzing the generalization error of KENReg lies in characterizing the population version of its empirical target function. We overcome this by introducing a weighted Banach space associated with the elastic net regularization. We are then able to conduct elaborated learning theory analysis and obtain fast convergence rates under proper complexity and regularity assumptions. Second, we study the sparse recovery problem in KENReg with fixed design and show that the kernelization may improve the sparse recovery ability compared to the classical elastic net regularization. Finally, we discuss the interplay among different properties of KENReg that include sparseness, stability, and generalization. We show that the stability of KENReg leads to generalization, and its sparseness confidence can be derived from generalization. Moreover, KENReg is stable and can be simultaneously sparse, which makes it attractive theoretically and practically.
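    A rough sketch of the idea of learning from a kernelized dictionary in the coefficient space: the columns of a kernel matrix serve as the dictionary, and the coefficient vector receives the elastic-net (l1 plus l2) penalty. The data, kernel choice, and penalty settings below are assumptions made for illustration and do not reproduce the analysis in the letter.

```python
import numpy as np
from sklearn.linear_model import ElasticNet
from sklearn.metrics.pairwise import rbf_kernel

# Elastic-net regression over a kernelized dictionary: columns of the kernel
# matrix K(x_i, x_j) act as features, and the coefficients are regularized with
# the combined l1 + l2 elastic-net penalty.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (120, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(120)

K = rbf_kernel(X, X, gamma=1.0)            # kernelized dictionary (illustrative kernel)
model = ElasticNet(alpha=0.01, l1_ratio=0.5, max_iter=10000).fit(K, y)

print("nonzero coefficients:", np.count_nonzero(model.coef_), "of", K.shape[1])
```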

  20. Health Harms from Secondhand Smoke

    MedlinePlus

    ... an acute coronary event.” According to the report, experimental studies have found that secondhand smoke exposure causes ... air contaminant and therefore be subject to emissions control regulations to be promulgated by the State of California. In this report, ... the IARC concluded that there ...

  1. Fire, Fuel, and Smoke Science Program 2015 Research Accomplishments

    Treesearch

    Faith Ann Heinsch; Charles W. McHugh; Colin C. Hardy

    2016-01-01

    The Fire, Fuel, and Smoke Science Program (FFS) of the U.S. Forest Service, Rocky Mountain Research Station focuses on fundamental and applied research in wildland fire, from fire physics and fire ecology to fuels management and smoke emissions. Located at the Missoula Fire Sciences Laboratory in Montana, the scientists, engineers, technicians, and support...

  2. Important parameters for smoke plume rise simulation with Daysmoke

    Treesearch

    L. Liu; G.L. Achtemeier; S.L. Goodrick; W. Jackson

    2010-01-01

    Daysmoke is a local smoke transport model and has been used to provide smoke plume rise information. It includes a large number of parameters describing the dynamic and stochastic processes of particle upward movement, fallout, fluctuation, and burn emissions. This study identifies the important parameters for Daysmoke simulations of plume rise and seeks to understand...

  3. Prioritizing individual genetic variants after kernel machine testing using variable selection.

    PubMed

    He, Qianchuan; Cai, Tianxi; Liu, Yang; Zhao, Ni; Harmon, Quaker E; Almli, Lynn M; Binder, Elisabeth B; Engel, Stephanie M; Ressler, Kerry J; Conneely, Karen N; Lin, Xihong; Wu, Michael C

    2016-12-01

    Kernel machine learning methods, such as the SNP-set kernel association test (SKAT), have been widely used to test associations between traits and genetic polymorphisms. In contrast to traditional single-SNP analysis methods, these methods are designed to examine the joint effect of a set of related SNPs (such as a group of SNPs within a gene or a pathway) and are able to identify sets of SNPs that are associated with the trait of interest. However, as with many multi-SNP testing approaches, kernel machine testing can draw conclusions only at the SNP-set level and does not directly indicate which SNP(s) within an identified set are actually driving the associations. A recently proposed procedure, KerNel Iterative Feature Extraction (KNIFE), provides a general framework for incorporating variable selection into kernel machine methods. In this article, we focus on quantitative traits and relatively common SNPs, adapt the KNIFE procedure to genetic association studies, and propose an approach to identify driver SNPs after the application of SKAT to gene set analysis. Our approach accommodates several kernels that are widely used in SNP analysis, such as the linear kernel and the Identity by State (IBS) kernel. The proposed approach provides practically useful utilities to prioritize SNPs, and fills the gap between SNP set analysis and biological functional studies. Both simulation studies and a real data application are used to demonstrate the proposed approach. © 2016 WILEY PERIODICALS, INC.
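    The two kernels named in the abstract are straightforward to compute from an additively coded genotype matrix, as in the sketch below; the genotype matrix here is simulated purely for illustration.

```python
import numpy as np

# Sketch of the linear and IBS kernels computed from an additively coded
# genotype matrix G (n individuals x p SNPs, entries 0/1/2), simulated here.
rng = np.random.default_rng(0)
G = rng.integers(0, 3, size=(50, 10)).astype(float)
n, p = G.shape

# Linear kernel: K_ij = sum_k g_ik * g_jk
K_linear = G @ G.T

# Identity-by-state (IBS) kernel: average allele sharing,
# K_ij = (1 / (2p)) * sum_k (2 - |g_ik - g_jk|)
K_ibs = (2 * p - np.abs(G[:, None, :] - G[None, :, :]).sum(axis=2)) / (2 * p)

print(K_linear.shape, K_ibs.shape, K_ibs.diagonal()[:3])  # IBS diagonal is 1
```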

  4. Nonlinear PET parametric image reconstruction with MRI information using kernel method

    NASA Astrophysics Data System (ADS)

    Gong, Kuang; Wang, Guobao; Chen, Kevin T.; Catana, Ciprian; Qi, Jinyi

    2017-03-01

    Positron Emission Tomography (PET) is a functional imaging modality widely used in oncology, cardiology, and neurology. It is highly sensitive, but suffers from relatively poor spatial resolution, as compared with anatomical imaging modalities, such as magnetic resonance imaging (MRI). With the recent development of combined PET/MR systems, we can improve the PET image quality by incorporating MR information. Previously we have used kernel learning to embed MR information in static PET reconstruction and direct Patlak reconstruction. Here we extend this method to direct reconstruction of nonlinear parameters in a compartment model by using the alternating direction of multiplier method (ADMM) algorithm. Simulation studies show that the proposed method can produce superior parametric images compared with existing methods.

  5. Recent emissions research in southwestern shrub and grassland fuels

    Treesearch

    David R. Weise; Wayne Miller; David R. Cocker; Heejung Jung; Seyedehsan Hosseini; Marko Princevac; Robert J. Yokelson; Ian Burling; Sheryl Akagi; Shawn Urbanski; WeiMin Hao

    2015-01-01

    While it is currently challenging to use prescribed burning in chaparral and other southwestern shrub fuel types due to many constraints, any such activities require smoke management planning. Information on fuels and emissions from chaparral was limited and based on older sampling systems. The DoD SERDP program funded a project to measure fuels and smoke emissions in...

  6. Multitasking kernel for the C and Fortran programming languages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, E.D. III

    1984-09-01

    A multitasking kernel for the C and Fortran programming languages which runs on the Unix operating system is presented. The kernel provides a multitasking environment which serves two purposes. The first is to provide an efficient portable environment for the coding, debugging and execution of production multiprocessor programs. The second is to provide a means of evaluating the performance of a multitasking program on model multiprocessors. The performance evaluation features require no changes in the source code of the application and are implemented as a set of compile and run time options in the kernel.

  7. Deep kernel learning method for SAR image target recognition

    NASA Astrophysics Data System (ADS)

    Chen, Xiuyuan; Peng, Xiyuan; Duan, Ran; Li, Junbao

    2017-10-01

    With the development of deep learning, research on image target recognition has made great progress in recent years. Remote sensing detection urgently requires target recognition for military, geographic, and other scientific research. This paper aims to solve the synthetic aperture radar image target recognition problem by combining deep and kernel learning. The model, which has a multilayer multiple kernel structure, is optimized layer by layer with the parameters of Support Vector Machine and a gradient descent algorithm. This new deep kernel learning method improves accuracy and achieves competitive recognition results compared with other learning methods.

  8. PERI - Auto-tuning Memory Intensive Kernels for Multicore

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H; Williams, Samuel; Datta, Kaushik

    2008-06-24

    We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to Sparse Matrix Vector Multiplication (SpMV), the explicit heat equation PDE on a regular grid (Stencil), and a lattice Boltzmann application (LBMHD). We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Xeon Clovertown, AMD Opteron Barcelona, Sun Victoria Falls, and the Sony-Toshiba-IBM (STI) Cell. Rather than hand-tuning each kernel for each system, we develop a code generator for each kernel that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned kernel applications often achieve a better than 4X improvement compared with the original code. Additionally, we analyze a Roofline performance model for each platform to reveal hardware bottlenecks and software challenges for future multicore systems and applications.

  9. An Enhanced Smoke Detection Using MODIS Measurements

    NASA Astrophysics Data System (ADS)

    Xie, Y.; Qu, J.; Xiong, X.; Hao, X.; Wang, W.; Wang, L.

    2005-12-01

    Smoke emitted from wildland fires or prescribed fires is one of the major pollutants that pose a risk to human health and significantly affect air quality. Remote sensing has been demonstrated as an efficient approach for detecting and tracing smoke plumes. As a mixture pollutant, smoke does not have a stable spectral signature because its components mix in different proportions in different situations, but it has particular characteristics that distinguish it from cloud, soil, water, and other surfaces. In earlier studies, we developed a multi-threshold algorithm to detect smoke in the eastern United States by combining MODIS reflective solar band and thermal emissive band measurements. In order to apply our approach at the global scale, we have enhanced the smoke detection algorithm by taking the land surface type into account. The final result outputs smoke pixels along with a confidence estimate of product quality. In addition, smoke detection is also helpful for fire detection. With the current fire detection algorithm, some small and cool fires cannot be detected. However, understanding the features and spread direction of smoke provides a potential way to identify these fires.
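    The multi-threshold idea can be illustrated with a toy decision rule over a few MODIS-like bands, as sketched below; the band choices and threshold values are illustrative assumptions and are not the thresholds of the algorithm described in the abstract.

```python
import numpy as np

def flag_smoke(refl_047, refl_086, refl_213, bt_11um):
    """Toy multi-threshold smoke test on MODIS-like measurements.
    refl_* are reflectances in the 0.47, 0.86 and 2.13 micron bands; bt_11um is
    the 11 micron brightness temperature (K). All thresholds are illustrative
    placeholders, not the values used in the cited algorithm."""
    bright_blue = refl_047 > 0.15                 # smoke scatters strongly at short wavelengths
    low_swir = refl_213 < 0.12                    # unlike cloud, smoke is dark in the SWIR
    ratio_test = refl_047 / np.maximum(refl_086, 1e-6) > 1.1
    warm_enough = bt_11um > 270.0                 # screens out cold, thick cloud tops
    return bright_blue & low_swir & ratio_test & warm_enough

# Example pixel values (made up): likely smoke, cloud, clear land.
r047 = np.array([0.22, 0.55, 0.06])
r086 = np.array([0.15, 0.50, 0.20])
r213 = np.array([0.08, 0.35, 0.18])
bt11 = np.array([285.0, 240.0, 295.0])
print(flag_smoke(r047, r086, r213, bt11))   # -> [ True False False]
```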

  10. An Ensemble Approach to Building Mercer Kernels with Prior Information

    NASA Technical Reports Server (NTRS)

    Srivastava, Ashok N.; Schumann, Johann; Fischer, Bernd

    2005-01-01

    This paper presents a new methodology for automatic knowledge-driven data mining based on the theory of Mercer Kernels, which are highly nonlinear symmetric positive definite mappings from the original image space to a very high, possibly infinite-dimensional feature space. We describe a new method called Mixture Density Mercer Kernels to learn the kernel function directly from data, rather than using pre-defined kernels. These data-adaptive kernels can encode prior knowledge in the kernel using a Bayesian formulation, thus allowing for physical information to be encoded in the model. Specifically, we demonstrate the use of the algorithm in situations with extremely small samples of data. We compare the results with existing algorithms on data from the Sloan Digital Sky Survey (SDSS) and demonstrate the method's superior performance against standard methods. The code for these experiments has been generated with the AUTOBAYES tool, which automatically generates efficient and documented C/C++ code from abstract statistical model specifications. The core of the system is a schema library which contains templates for learning and knowledge discovery algorithms like different versions of EM, or numeric optimization methods like conjugate gradient methods. The template instantiation is supported by symbolic-algebraic computations, which allows AUTOBAYES to find closed-form solutions and, where possible, to integrate them into the code.
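    One way to read the construction is that each mixture model in an ensemble maps a point to its vector of cluster posterior probabilities, and the kernel is the ensemble-averaged inner product of those vectors, which is Mercer by construction. The sketch below follows that reading with Gaussian mixtures on bootstrap resamples; the ensemble size, component counts, and data are illustrative assumptions rather than the AUTOBAYES-generated implementation described in the abstract.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def mixture_density_kernel(X, Y, models):
    """Sketch of a mixture-density kernel: each fitted mixture model maps a point
    to its vector of cluster posterior probabilities, and the kernel is the
    ensemble-averaged inner product of those vectors (a valid Mercer kernel,
    since it is an explicit feature-map inner product)."""
    K = np.zeros((len(X), len(Y)))
    for gmm in models:
        PX, PY = gmm.predict_proba(X), gmm.predict_proba(Y)
        K += PX @ PY.T
    return K / len(models)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(4, 1, (30, 2))])

# Ensemble of mixture models fitted to bootstrap resamples of the (small) dataset.
models = [GaussianMixture(n_components=3, random_state=m).fit(
              X[rng.integers(0, len(X), len(X))]) for m in range(5)]

K = mixture_density_kernel(X, X, models)
print(K.shape, np.allclose(K, K.T))   # symmetric Gram matrix
```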

  11. A survey of kernel-type estimators for copula and their applications

    NASA Astrophysics Data System (ADS)

    Sumarjaya, I. W.

    2017-10-01

    Copulas have been widely used to model nonlinear dependence structures. The main applications of copulas include areas such as finance, insurance, hydrology, and rainfall, to name but a few. The flexibility of copulas allows researchers to model dependence structures beyond the Gaussian distribution. Basically, a copula is a function that couples a multivariate distribution function to its one-dimensional marginal distribution functions. In general, there are three methods to estimate a copula: parametric, nonparametric, and semiparametric. In this article we survey kernel-type estimators for copulas, such as the mirror reflection kernel, the beta kernel, the transformation method, and the local likelihood transformation method. Then, we apply these kernel methods to three stock indexes in Asia. The results of our analysis suggest that, despite variation in information criterion values, the local likelihood transformation method performs better than the other kernel methods.
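    Of the kernel-type estimators surveyed, the beta kernel is perhaps the simplest to write down: each pseudo-observation contributes a product of Beta densities, which keeps all mass on the unit square. The sketch below implements that estimator; the simulated returns and the bandwidth are illustrative, not the Asian stock-index data analysed in the survey.

```python
import numpy as np
from scipy.stats import beta, rankdata

def beta_kernel_copula_density(u, v, U, V, h=0.05):
    """Beta-kernel estimate of a bivariate copula density at (u, v) from
    pseudo-observations (U_i, V_i) in (0, 1). Each sample contributes a product
    of Beta kernels, which avoids boundary bias on the unit square.
    The bandwidth h is an illustrative choice."""
    ku = beta.pdf(u, U / h + 1, (1 - U) / h + 1)
    kv = beta.pdf(v, V / h + 1, (1 - V) / h + 1)
    return np.mean(ku * kv)

# Hypothetical dependent returns for two indexes (not the data used in the survey).
rng = np.random.default_rng(0)
z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=500)
U = rankdata(z[:, 0]) / (len(z) + 1)     # pseudo-observations via ranks
V = rankdata(z[:, 1]) / (len(z) + 1)

print(beta_kernel_copula_density(0.5, 0.5, U, V))   # density near the centre
```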

  12. Pathway-Based Kernel Boosting for the Analysis of Genome-Wide Association Studies

    PubMed Central

    Manitz, Juliane; Burger, Patricia; Amos, Christopher I.; Chang-Claude, Jenny; Wichmann, Heinz-Erich; Kneib, Thomas; Bickeböller, Heike

    2017-01-01

    The analysis of genome-wide association studies (GWAS) benefits from the investigation of biologically meaningful gene sets, such as gene-interaction networks (pathways). We propose an extension to a successful kernel-based pathway analysis approach by integrating kernel functions into a powerful algorithmic framework for variable selection, to enable investigation of multiple pathways simultaneously. We employ genetic similarity kernels from the logistic kernel machine test (LKMT) as base-learners in a boosting algorithm. A model to explain case-control status is created iteratively by selecting pathways that improve its prediction ability. We evaluated our method in simulation studies adopting 50 pathways for different sample sizes and genetic effect strengths. Additionally, we included an exemplary application of kernel boosting to a rheumatoid arthritis and a lung cancer dataset. Simulations indicate that kernel boosting outperforms the LKMT in certain genetic scenarios. Applications to GWAS data on rheumatoid arthritis and lung cancer resulted in sparse models which were based on pathways interpretable in a clinical sense. Kernel boosting is highly flexible in terms of considered variables and overcomes the problem of multiple testing. Additionally, it enables the prediction of clinical outcomes. Thus, kernel boosting constitutes a new, powerful tool in the analysis of GWAS data and towards the understanding of biological processes involved in disease susceptibility. PMID:28785300

  13. Pathway-Based Kernel Boosting for the Analysis of Genome-Wide Association Studies.

    PubMed

    Friedrichs, Stefanie; Manitz, Juliane; Burger, Patricia; Amos, Christopher I; Risch, Angela; Chang-Claude, Jenny; Wichmann, Heinz-Erich; Kneib, Thomas; Bickeböller, Heike; Hofner, Benjamin

    2017-01-01

    The analysis of genome-wide association studies (GWAS) benefits from the investigation of biologically meaningful gene sets, such as gene-interaction networks (pathways). We propose an extension to a successful kernel-based pathway analysis approach by integrating kernel functions into a powerful algorithmic framework for variable selection, to enable investigation of multiple pathways simultaneously. We employ genetic similarity kernels from the logistic kernel machine test (LKMT) as base-learners in a boosting algorithm. A model to explain case-control status is created iteratively by selecting pathways that improve its prediction ability. We evaluated our method in simulation studies adopting 50 pathways for different sample sizes and genetic effect strengths. Additionally, we included an exemplary application of kernel boosting to a rheumatoid arthritis and a lung cancer dataset. Simulations indicate that kernel boosting outperforms the LKMT in certain genetic scenarios. Applications to GWAS data on rheumatoid arthritis and lung cancer resulted in sparse models which were based on pathways interpretable in a clinical sense. Kernel boosting is highly flexible in terms of considered variables and overcomes the problem of multiple testing. Additionally, it enables the prediction of clinical outcomes. Thus, kernel boosting constitutes a new, powerful tool in the analysis of GWAS data and towards the understanding of biological processes involved in disease susceptibility.

  14. Fire, Fuel, and Smoke Science Program: 2013 Research accomplishments

    Treesearch

    Faith Ann Heinsch; Robin J. Innes; Colin C. Hardy; Kristine M. Lee

    2014-01-01

    The Fire, Fuel, and Smoke Science Program (FFS) of the U.S. Forest Service, Rocky Mountain Research Station, focuses on fundamental and applied research in wildland fire, from fire physics and fire ecology to fuels management and smoke emissions. Located at the Missoula Fire Sciences Laboratory in Montana, the scientists, engineers, technicians, and support staff in...

  15. Oil point and mechanical behaviour of oil palm kernels in linear compression

    NASA Astrophysics Data System (ADS)

    Kabutey, Abraham; Herak, David; Choteborsky, Rostislav; Mizera, Čestmír; Sigalingging, Riswanti; Akangbe, Olaosebikan Layi

    2017-07-01

    The study describes the oil point and mechanical properties of roasted and unroasted bulk oil palm kernels under compression loading, for which the available literature information is very limited. A universal compression testing machine with a 60 mm diameter vessel and a plunger was used, applying a maximum force of 100 kN at speeds ranging from 5 to 25 mm min-1. The initial pressing height of the bulk kernels was measured at 40 mm. The oil point was determined by a litmus test for each deformation level of 5, 10, 15, 20, and 25 mm at a minimum speed of 5 mm min-1. The measured parameters were the deformation, deformation energy, oil yield, oil point strain, and oil point pressure. Clearly, the roasted bulk kernels required less deformation energy than the unroasted kernels for recovering the kernel oil; however, neither type of kernel was permanently deformed. The average oil point strain was determined to be 0.57. The study is an essential contribution to pursuing innovative methods for processing palm kernel oil in rural areas of developing countries.
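    Deformation energy in such compression tests is the area under the force-deformation curve, which can be approximated numerically as in the sketch below; the force readings are invented for illustration and are not the measured values for palm kernels.

```python
import numpy as np

# Deformation energy as the area under the force-deformation curve recorded
# during linear compression. The force readings below are illustrative, not the
# measured values for roasted or unroasted palm kernels.
deformation_mm = np.array([0, 5, 10, 15, 20, 25], float)
force_kN = np.array([0.0, 4.2, 11.5, 23.0, 41.0, 68.0])

energy_J = np.trapz(force_kN * 1e3, deformation_mm * 1e-3)   # N * m = J
print(f"Deformation energy: {energy_J:.1f} J")
```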

  16. Pressure Sensitivity Kernels Applied to Time-reversal Acoustics

    DTIC Science & Technology

    2009-06-29

    experimental data, along with an internal wave model, using various metrics. The linear limitations of the kernels are explored in the context of time...

  17. Optimal Bandwidth Selection in Observed-Score Kernel Equating

    ERIC Educational Resources Information Center

    Häggström, Jenny; Wiberg, Marie

    2014-01-01

    The selection of bandwidth in kernel equating is important because it has a direct impact on the equated test scores. The aim of this article is to examine the use of double smoothing when selecting bandwidths in kernel equating and to compare double smoothing with the commonly used penalty method. This comparison was made using both an equivalent…

  18. Unconventional Signal Processing Using the Cone Kernel Time-Frequency Representation.

    DTIC Science & Technology

    1992-10-30

    The Wigner-Ville distribution (WVD), the Choi-Williams distribution, and the cone kernel distribution were compared with the spectrograms. Results were... ambiguity function. Figures A-18(c) and (d) are the Wigner-Ville Distribution (WVD) and CK-TFR Doppler maps. In this noiseless case all three exhibit... kernel is the basis for the well-known Wigner-Ville distribution. In A-9(2), the cone kernel defined by Zhao, Atlas and Marks [21] is described.

  19. Kernel structures for Clouds

    NASA Technical Reports Server (NTRS)

    Spafford, Eugene H.; Mckendry, Martin S.

    1986-01-01

    An overview of the internal structure of the Clouds kernel was presented. An indication of how these structures will interact in the prototype Clouds implementation is given. Many specific details have yet to be determined and await experimentation with an actual working system.

  20. Smoke incursions into urban areas: simulation of a Georgia prescribed burn

    Treesearch

    Y. Liu; S. Goodrick; G. Achtemeier

    2009-01-01

    This study investigates smoke incursion into urban areas by examining a prescribed burn in central Georgia, USA, on 28 February 2007. Simulations were conducted with a regional modeling framework to understand transport, dispersion, and structure of smoke plumes, the air quality effects, sensitivity to emissions,...

  1. A transdisciplinary approach to understanding the health effects of wildfire and prescribed fire smoke regimes

    NASA Astrophysics Data System (ADS)

    Williamson, G. J.; Bowman, D. M. J. S.; Price, O. F.; Henderson, S. B.; Johnston, F. H.

    2016-12-01

    Prescribed burning is used to reduce the occurrence, extent and severity of uncontrolled fires in many flammable landscapes. However, epidemiologic evidence of the human health impacts of landscape fire smoke emissions is shaping fire management practice through increasingly stringent environmental regulation and public health policy. An unresolved question, critical for sustainable fire management, concerns the comparative human health effects of smoke from wild and prescribed fires. Here we review current knowledge of the health effects of landscape fire emissions and consider the similarities and differences in smoke from wild and prescribed fires with respect to the typical combustion conditions and fuel properties, the quality and magnitude of air pollution emissions, and the potential for dispersion to large populations. We further examine the interactions between these considerations, and how they may shape the longer term smoke regimes to which populations are exposed. We identify numerous knowledge gaps and propose a conceptual framework that describes pathways to better understanding of the health trade-offs of prescribed and wildfire smoke regimes.

  2. Emissions from laboratory combustion of wildland fuels: Emission factors and source profiles

    Treesearch

    L.-W. Anthony Chen; Hans Moosmuller; W. Patrick Arnott; Judith C. Chow; John G. Watson; Ronald A. Susott; Ronald E. Babbitt; Cyle E. Wold; Emily N. Lincoln; Wei Min Hao

    2007-01-01

    Combustion of wildland fuels represents a major source of particulate matter (PM) and light-absorbing elemental carbon (EC) on a national and global scale, but the emission factors and source profiles have not been well characterized with respect to different fuels and combustion phases. These uncertainties limit the accuracy of current emission inventories, smoke...

  3. TICK: Transparent Incremental Checkpointing at Kernel Level

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Petrini, Fabrizio; Gioiosa, Roberto

    2004-10-25

    TICK is a software package implemented in Linux 2.6 that allows user processes to be saved and restored without any change to the user code or binary. With TICK, a process can be suspended by the Linux kernel upon receiving an interrupt and saved to a file. This file can later be thawed on another computer running Linux (potentially the same computer). TICK is implemented as a kernel module for Linux version 2.6.5.

  4. Phenolic constituents of shea (Vitellaria paradoxa) kernels.

    PubMed

    Maranz, Steven; Wiesman, Zeev; Garti, Nissim

    2003-10-08

    Analysis of the phenolic constituents of shea (Vitellaria paradoxa) kernels by LC-MS revealed eight catechin compounds-gallic acid, catechin, epicatechin, epicatechin gallate, gallocatechin, epigallocatechin, gallocatechin gallate, and epigallocatechin gallate-as well as quercetin and trans-cinnamic acid. The mean kernel content of the eight catechin compounds was 4000 ppm (0.4% of kernel dry weight), with a 2100-9500 ppm range. Comparison of the profiles of the six major catechins from 40 Vitellaria provenances from 10 African countries showed that the relative proportions of these compounds varied from region to region. Gallic acid was the major phenolic compound, comprising an average of 27% of the measured total phenols and exceeding 70% in some populations. Colorimetric analysis (101 samples) of total polyphenols extracted from shea butter into hexane gave an average of 97 ppm, with the values for different provenances varying between 62 and 135 ppm of total polyphenols.

  5. Occurrence of 'super soft' wheat kernel texture in hexaploid and tetraploid wheats

    USDA-ARS?s Scientific Manuscript database

    Wheat kernel texture is a key trait that governs milling performance, flour starch damage, flour particle size, flour hydration properties, and baking quality. Kernel texture is commonly measured using the Perten Single Kernel Characterization System (SKCS). The SKCS returns texture values (Hardness...

  6. Finite-frequency sensitivity kernels for head waves

    NASA Astrophysics Data System (ADS)

    Zhang, Zhigang; Shen, Yang; Zhao, Li

    2007-11-01

    Head waves are extremely important in determining the structure of the predominantly layered Earth. While several recent studies have shown the diffractive nature and the 3-D Fréchet kernels of finite-frequency turning waves, analogues of head waves in a continuous velocity structure, the finite-frequency effects and sensitivity kernels of head waves are yet to be carefully examined. We present the results of a numerical study focusing on the finite-frequency effects of head waves. Our model has a low-velocity layer over a high-velocity half-space and a cylindrical-shaped velocity perturbation placed beneath the interface at different locations. A 3-D finite-difference method is used to calculate synthetic waveforms. Traveltime and amplitude anomalies are measured by the cross-correlation of synthetic seismograms from models with and without the velocity perturbation and are compared to the 3-D sensitivity kernels constructed from full waveform simulations. The results show that the head wave arrival-time and amplitude are influenced by the velocity structure surrounding the ray path in a pattern that is consistent with the Fresnel zones. Unlike the `banana-doughnut' traveltime sensitivity kernels of turning waves, the traveltime sensitivity of the head wave along the ray path below the interface is weak, but non-zero. Below the ray path, the traveltime sensitivity reaches the maximum (absolute value) at a depth that depends on the wavelength and propagation distance. The sensitivity kernels vary with the vertical velocity gradient in the lower layer, but the variation is relatively small at short propagation distances when the vertical velocity gradient is within the range of the commonly accepted values. Finally, the depression or shoaling of the interface results in increased or decreased sensitivities, respectively, beneath the interface topography.
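
    As a small illustration of the traveltime-anomaly measurement described above, the sketch below cross-correlates a perturbed synthetic waveform against a reference and converts the best-fit lag to a delay; the waveforms and sampling interval are placeholders, not the finite-difference synthetics used in the study.

```python
import numpy as np

def cc_traveltime_shift(ref, pert, dt):
    """Traveltime anomaly (s) of a perturbed waveform relative to a reference,
    taken as the lag that maximizes their cross-correlation (positive = delayed)."""
    cc = np.correlate(pert, ref, mode="full")
    lag = np.argmax(cc) - (len(ref) - 1)
    return lag * dt

# Placeholder synthetics: a Gaussian pulse delayed by 0.2 s.
dt = 0.01
t = np.arange(0, 10, dt)
ref = np.exp(-((t - 5.0) / 0.3) ** 2)
pert = np.exp(-((t - 5.2) / 0.3) ** 2)
print(cc_traveltime_shift(ref, pert, dt))   # ~0.2
```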

  7. DNA sequence+shape kernel enables alignment-free modeling of transcription factor binding.

    PubMed

    Ma, Wenxiu; Yang, Lin; Rohs, Remo; Noble, William Stafford

    2017-10-01

    Transcription factors (TFs) bind to specific DNA sequence motifs. Several lines of evidence suggest that TF-DNA binding is mediated in part by properties of the local DNA shape: the width of the minor groove, the relative orientations of adjacent base pairs, etc. Several methods have been developed to jointly account for DNA sequence and shape properties in predicting TF binding affinity. However, a limitation of these methods is that they typically require a training set of aligned TF binding sites. We describe a sequence + shape kernel that leverages DNA sequence and shape information to better understand protein-DNA binding preference and affinity. This kernel extends an existing class of k-mer based sequence kernels, based on the recently described di-mismatch kernel. Using three in vitro benchmark datasets, derived from universal protein binding microarrays (uPBMs), genomic context PBMs (gcPBMs) and SELEX-seq data, we demonstrate that incorporating DNA shape information improves our ability to predict protein-DNA binding affinity. In particular, we observe that (i) the k-spectrum + shape model performs better than the classical k-spectrum kernel, particularly for small k values; (ii) the di-mismatch kernel performs better than the k-mer kernel, for larger k; and (iii) the di-mismatch + shape kernel performs better than the di-mismatch kernel for intermediate k values. The software is available at https://bitbucket.org/wenxiu/sequence-shape.git. rohs@usc.edu or william-noble@uw.edu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
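
    For reference, a minimal k-spectrum kernel (the sequence-only baseline that the shape-augmented and di-mismatch kernels described above extend) might be written as follows; the DNA shape features themselves are not modeled here.

```python
from collections import Counter
import numpy as np

def kmer_counts(seq, k):
    # Count all overlapping k-mers in a sequence.
    return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

def spectrum_kernel(seqs, k=3):
    """Classical k-spectrum kernel: inner product of k-mer count vectors."""
    counts = [kmer_counts(s, k) for s in seqs]
    n = len(seqs)
    K = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):
            # Sparse dot product over the k-mers present in sequence i.
            K[i, j] = K[j, i] = sum(counts[i][m] * counts[j].get(m, 0)
                                    for m in counts[i])
    return K

print(spectrum_kernel(["ACGTACGT", "ACGTTTGT", "GGGGCCCC"], k=3))
```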

  8. Multiple Kernel Sparse Representation based Orthogonal Discriminative Projection and Its Cost-Sensitive Extension.

    PubMed

    Zhang, Guoqing; Sun, Huaijiang; Xia, Guiyu; Sun, Quansen

    2016-07-07

    Sparse representation based classification (SRC) has been developed and has shown great potential for real-world applications. Based on SRC, Yang et al. [10] devised an SRC-steered discriminative projection (SRC-DP) method. However, as a linear algorithm, SRC-DP cannot handle data with a highly nonlinear distribution. The kernel sparse representation-based classifier (KSRC) is a nonlinear extension of SRC that can remedy this drawback, but KSRC requires a predetermined kernel function, and selecting the kernel function and its parameters is difficult. Recently, multiple kernel learning for SRC (MKL-SRC) [22] has been proposed to learn a kernel from a set of base kernels. However, MKL-SRC considers only the within-class reconstruction residual while ignoring the between-class relationship when learning the kernel weights. In this paper, we propose a novel multiple kernel sparse representation-based classifier (MKSRC), and then use it as a criterion to design a multiple kernel sparse representation based orthogonal discriminative projection method (MK-SR-ODP). The proposed algorithm aims at learning a projection matrix and a corresponding kernel from the given base kernels such that, in the low-dimensional subspace, the between-class reconstruction residual is maximized and the within-class reconstruction residual is minimized. Furthermore, to achieve a minimum overall loss by performing recognition in the learned low-dimensional subspace, we introduce cost information into the dimensionality reduction method. The solutions for the proposed method can be found efficiently using the trace ratio optimization method [33]. Extensive experimental results demonstrate the superiority of the proposed algorithm when compared with state-of-the-art methods.

  9. Mixed kernel function support vector regression for global sensitivity analysis

    NASA Astrophysics Data System (ADS)

    Cheng, Kai; Lu, Zhenzhou; Wei, Yuhao; Shi, Yan; Zhou, Yicheng

    2017-11-01

    Global sensitivity analysis (GSA) plays an important role in exploring the respective effects of input variables on an assigned output response. Among the many sensitivity analysis approaches in the literature, the Sobol indices have attracted much attention since they provide accurate information for most models. In this paper, a mixed kernel function (MKF) based support vector regression (SVR) model is employed to evaluate the Sobol indices at low computational cost. With the proposed derivation, estimates of the Sobol indices can be obtained by post-processing the coefficients of the SVR meta-model. The MKF combines an orthogonal polynomial kernel function and a Gaussian radial basis kernel function, so it possesses both the global characteristic of the polynomial kernel and the local characteristic of the Gaussian radial basis kernel. The proposed approach is suitable for high-dimensional and non-linear problems. Its performance is validated on various analytical functions and compared with the popular polynomial chaos expansion (PCE). Results demonstrate that the proposed approach is an efficient method for global sensitivity analysis.
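
    A rough sketch of the mixed-kernel idea, combining a polynomial kernel (global trend) with a Gaussian RBF kernel (local behaviour) as a weighted sum and passing it to an SVR as a callable kernel; the orthogonal-polynomial kernel and the Sobol-index post-processing from the paper are not reproduced, and the weights and data are illustrative.

```python
import numpy as np
from sklearn.svm import SVR

def mixed_kernel(X, Y, w=0.5, degree=3, gamma=0.5):
    """Weighted sum of a polynomial kernel (global trend) and a Gaussian RBF
    kernel (local behaviour)."""
    poly = (X @ Y.T + 1.0) ** degree
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    rbf = np.exp(-gamma * sq)
    return w * poly + (1 - w) * rbf

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 3))
y = np.sin(np.pi * X[:, 0]) + 0.5 * X[:, 1] ** 2

svr = SVR(kernel=lambda A, B: mixed_kernel(A, B)).fit(X, y)
print(svr.predict(X[:5]))
```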

  10. Semisupervised kernel marginal Fisher analysis for face recognition.

    PubMed

    Wang, Ziqiang; Sun, Xia; Sun, Lijun; Huang, Yuchun

    2013-01-01

    Dimensionality reduction is a key problem in face recognition due to the high dimensionality of face images. To cope with this problem effectively, a novel dimensionality reduction algorithm called semisupervised kernel marginal Fisher analysis (SKMFA) for face recognition is proposed in this paper. SKMFA can make use of both labelled and unlabelled samples to learn the projection matrix for nonlinear dimensionality reduction. Meanwhile, it successfully avoids the singularity problem by not calculating the matrix inverse. In addition, in order to make the nonlinear structure captured by the data-dependent kernel consistent with the intrinsic manifold structure, a manifold-adaptive nonparametric kernel is incorporated into the learning process of SKMFA. Experimental results on three face image databases demonstrate the effectiveness of our proposed algorithm.

  11. Searching remote homology with spectral clustering with symmetry in neighborhood cluster kernels.

    PubMed

    Maulik, Ujjwal; Sarkar, Anasua

    2013-01-01

    Remote homology detection among proteins utilizing only the unlabelled sequences is a central problem in comparative genomics. The existing cluster kernel methods based on neighborhoods and profiles, and the Markov clustering algorithms, are currently the most popular methods for protein family recognition. Their deviation from random walks with inflation, or their dependency on a hard threshold in the similarity measure, calls for an enhancement of homology detection among multi-domain proteins. We propose to combine spectral clustering with neighborhood kernels in Markov similarity to enhance sensitivity in detecting homology independent of "recent" paralogs. The spectral clustering approach with the new combined local alignment kernels more effectively exploits the unsupervised protein sequences globally, reducing inter-cluster walks. When combined with corrections based on a modified symmetry-based proximity norm that de-emphasizes outliers, the technique proposed in this article outperforms the other state-of-the-art cluster kernels among all twelve implemented kernels. The comparison with state-of-the-art string and mismatch kernels also shows the superior performance scores provided by the proposed kernels. A similar performance improvement is also found on an existing large dataset. Therefore, the proposed spectral clustering framework over combined local alignment kernels with modified symmetry-based correction achieves superior performance for unsupervised remote homolog detection, even in multi-domain and promiscuous-domain proteins from Genolevures database families, with better biological relevance. Source code available upon request. sarkar@labri.fr.

  12. A dry-inoculation method for nut kernels.

    PubMed

    Blessington, Tyann; Theofel, Christopher G; Harris, Linda J

    2013-04-01

    A dry-inoculation method for almonds and walnuts was developed to eliminate the need for the postinoculation drying required for wet-inoculation methods. The survival of Salmonella enterica Enteritidis PT 30 on wet- and dry-inoculated almond and walnut kernels stored under ambient conditions (average: 23 °C; 41 or 47% RH) was then compared over 14 weeks. For wet inoculation, an aqueous Salmonella preparation was added directly to almond or walnut kernels, which were then dried under ambient conditions (3 or 7 days, respectively) to initial nut moisture levels. For the dry inoculation, liquid inoculum was mixed with sterilized sand and dried for 24 h at 40 °C. The dried inoculated sand was mixed with kernels, and the sand was removed by shaking the mixture in a sterile sieve. Mixing procedures to optimize the bacterial transfer from sand to kernel were evaluated; in general, similar levels were achieved on walnuts (4.8-5.2 log CFU/g) and almonds (4.2-5.1 log CFU/g). The decline of Salmonella Enteritidis populations was similar during ambient storage (98 days) for both wet-and dry-inoculation methods for both almonds and walnuts. The dry-inoculation method mimics some of the suspected routes of contamination for tree nuts and may be appropriate for some postharvest challenge studies. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Differential evolution algorithm-based kernel parameter selection for Fukunaga-Koontz Transform subspaces construction

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin

    2015-10-01

    The performance of kernel-based techniques depends on the selection of kernel parameters; suitable parameter selection is therefore an important problem for many kernel-based techniques. This article presents a novel technique to learn the kernel parameters of a kernel Fukunaga-Koontz Transform based (KFKT) classifier. The proposed approach determines appropriate values of the kernel parameters by optimizing an objective function constructed from the discrimination ability of KFKT. For this purpose we utilize a differential evolution algorithm (DEA). The new technique overcomes disadvantages such as the high time consumption of the traditional cross-validation method, and it can be applied to any type of data. Experiments on target detection applications with hyperspectral images verify the effectiveness of the proposed method.
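
    The KFKT objective itself is not given above, so the sketch below only illustrates the general pattern: differential evolution searching a kernel parameter against a label-based discrimination score, with kernel-target alignment used as a simple stand-in objective; the data and bounds are placeholders.

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.spatial.distance import cdist

def neg_alignment(log_gamma, X, y):
    """Negative kernel-target alignment of an RBF kernel; a simple stand-in
    for the KFKT discrimination objective, which is not reproduced here."""
    K = np.exp(-(10.0 ** log_gamma[0]) * cdist(X, X, "sqeuclidean"))
    Y = np.outer(y, y)                       # ideal kernel for labels in {-1, +1}
    return -(K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (50, 5)), rng.normal(2, 1, (50, 5))])
y = np.r_[-np.ones(50), np.ones(50)]

res = differential_evolution(neg_alignment, bounds=[(-3, 3)], args=(X, y), seed=1)
print("selected gamma:", 10.0 ** res.x[0])
```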

  14. Design of a multiple kernel learning algorithm for LS-SVM by convex programming.

    PubMed

    Jian, Ling; Xia, Zhonghang; Liang, Xijun; Gao, Chuanhou

    2011-06-01

    As a kernel based method, the performance of least squares support vector machine (LS-SVM) depends on the selection of the kernel as well as the regularization parameter (Duan, Keerthi, & Poo, 2003). Cross-validation is efficient in selecting a single kernel and the regularization parameter; however, it suffers from heavy computational cost and is not flexible enough to deal with multiple kernels. In this paper, we address the issue of multiple kernel learning for LS-SVM by formulating it as semidefinite programming (SDP). Furthermore, we show that the regularization parameter can be optimized in a unified framework with the kernel, which leads to an automatic process for model selection. Extensive experimental validations are performed and analyzed. Copyright © 2011 Elsevier Ltd. All rights reserved.
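
    The semidefinite programming step that learns the kernel weights is beyond a short sketch; the code below only shows the LS-SVM dual solve for a given (here fixed, illustratively weighted) combined kernel matrix, which is the subproblem the multiple kernel formulation wraps around.

```python
import numpy as np

def combined_kernel(K1, K2, mu=0.5):
    # Fixed convex combination of two base Gram matrices (the paper instead
    # learns such weights via semidefinite programming).
    return mu * K1 + (1 - mu) * K2

def lssvm_train(K, y, gamma=1.0):
    """Solve the LS-SVM dual linear system for a given Gram matrix K:
       [0, 1^T; 1, K + I/gamma] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                   # bias b, dual coefficients alpha

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (40, 2)), rng.normal(1, 1, (40, 2))])
y = np.r_[-np.ones(40), np.ones(40)]
sq = np.sum(X**2, 1)[:, None] + np.sum(X**2, 1)[None, :] - 2 * X @ X.T
K = combined_kernel(np.exp(-0.5 * sq), X @ X.T)       # RBF + linear base kernels
b, alpha = lssvm_train(K, y)
print("training accuracy:", (np.sign(K @ alpha + b) == y).mean())
```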

  15. Chemical components of cold pressed kernel oils from different Torreya grandis cultivars.

    PubMed

    He, Zhiyong; Zhu, Haidong; Li, Wangling; Zeng, Maomao; Wu, Shengfang; Chen, Shangwei; Qin, Fang; Chen, Jie

    2016-10-15

    The chemical compositions of cold pressed kernel oils of seven Torreya grandis cultivars from China were analyzed in this study. The contents of the chemical components of T. grandis kernels and kernel oils varied to different extents with the cultivar. The T. grandis kernels contained relatively high oil and protein contents (45.80-53.16% and 10.34-14.29%, respectively). The kernel oils were rich in unsaturated fatty acids including linoleic (39.39-47.77%), oleic (30.47-37.54%) and eicosatrienoic acid (6.78-8.37%). The kernel oils also contained abundant bioactive substances such as tocopherols (0.64-1.77 mg/g) consisting of α-, β-, γ- and δ-isomers; sterols including β-sitosterol (0.90-1.29 mg/g), campesterol (0.06-0.32 mg/g) and stigmasterol (0.04-0.18 mg/g); and polyphenols (9.22-22.16 μg GAE/g). The results revealed that the T. grandis kernel oils possess potentially important nutritional and health benefits and could be used as oils in the human diet or as functional ingredients in the food industry. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Quasi-kernel polynomials and convergence results for quasi-minimal residual iterations

    NASA Technical Reports Server (NTRS)

    Freund, Roland W.

    1992-01-01

    Recently, Freund and Nachtigal have proposed a novel polynomial-based iteration, the quasi-minimal residual algorithm (QMR), for solving general nonsingular non-Hermitian linear systems. Motivated by the QMR method, we have introduced the general concept of quasi-kernel polynomials, and we have shown that the QMR algorithm is based on a particular instance of quasi-kernel polynomials. In this paper, we continue our study of quasi-kernel polynomials. In particular, we derive bounds for the norms of quasi-kernel polynomials. These results are then applied to obtain convergence theorems both for the QMR method and for a transpose-free variant of QMR, the TFQMR algorithm.

  17. Mapping QTLs controlling kernel dimensions in a wheat inter-varietal RIL mapping population.

    PubMed

    Cheng, Ruiru; Kong, Zhongxin; Zhang, Liwei; Xie, Quan; Jia, Haiyan; Yu, Dong; Huang, Yulong; Ma, Zhengqiang

    2017-07-01

    Seven kernel dimension QTLs were identified in wheat, and kernel thickness was found to be the most important dimension for grain weight improvement. Kernel morphology and weight of wheat (Triticum aestivum L.) affect both yield and quality; however, the genetic basis of these traits and their interactions has not been fully understood. In this study, to investigate the genetic factors affecting kernel morphology and the association of kernel morphology traits with kernel weight, kernel length (KL), width (KW) and thickness (KT) were evaluated, together with hundred-grain weight (HGW), in a recombinant inbred line population derived from Nanda2419 × Wangshuibai, with data from five trials (two different locations over 3 years). The results showed that HGW was more closely correlated with KT and KW than with KL. A whole genome scan revealed four QTLs for KL, one for KW and two for KT, distributed on five different chromosomes. Of them, QKl.nau-2D for KL, and QKt.nau-4B and QKt.nau-5A for KT were newly identified major QTLs for the respective traits, explaining up to 32.6 and 41.5% of the phenotypic variations, respectively. Increase of KW and KT and reduction of KL/KT and KW/KT ratios always resulted in significantly higher grain weight. Lines combining the Nanda 2419 alleles of the 4B and 5A intervals had wider, thicker, rounder kernels and a 14% higher grain weight in the genotype-based analysis. A strong, negative linear relationship of the KW/KT ratio with grain weight was observed. It thus appears that kernel thickness is the most important kernel dimension factor in wheat improvement for higher yield. Mapping and marker identification of the kernel dimension-related QTLs definitely help realize the breeding goals.

  18. MR-guided dynamic PET reconstruction with the kernel method and spectral temporal basis functions

    NASA Astrophysics Data System (ADS)

    Novosad, Philip; Reader, Andrew J.

    2016-06-01

    Recent advances in dynamic positron emission tomography (PET) reconstruction have demonstrated that it is possible to achieve markedly improved end-point kinetic parameter maps by incorporating a temporal model of the radiotracer directly into the reconstruction algorithm. In this work we have developed a highly constrained, fully dynamic PET reconstruction algorithm incorporating both spectral analysis temporal basis functions and spatial basis functions derived from the kernel method applied to a co-registered T1-weighted magnetic resonance (MR) image. The dynamic PET image is modelled as a linear combination of spatial and temporal basis functions, and a maximum likelihood estimate for the coefficients can be found using the expectation-maximization (EM) algorithm. Following reconstruction, kinetic fitting using any temporal model of interest can be applied. Based on a BrainWeb T1-weighted MR phantom, we performed a realistic dynamic [18F]FDG simulation study with two noise levels, and investigated the quantitative performance of the proposed reconstruction algorithm, comparing it with reconstructions incorporating either spectral analysis temporal basis functions alone or kernel spatial basis functions alone, as well as with conventional frame-independent reconstruction. Compared to the other reconstruction algorithms, the proposed algorithm achieved superior performance, offering a decrease in spatially averaged pixel-level root-mean-square-error on post-reconstruction kinetic parametric maps in the grey/white matter, as well as in the tumours when they were present on the co-registered MR image. When the tumours were not visible in the MR image, reconstruction with the proposed algorithm performed similarly to reconstruction with spectral temporal basis functions and was superior to both conventional frame-independent reconstruction and frame-independent reconstruction with kernel spatial basis functions. Furthermore, we demonstrate that a joint spectral/kernel

  19. MR-guided dynamic PET reconstruction with the kernel method and spectral temporal basis functions.

    PubMed

    Novosad, Philip; Reader, Andrew J

    2016-06-21

    Recent advances in dynamic positron emission tomography (PET) reconstruction have demonstrated that it is possible to achieve markedly improved end-point kinetic parameter maps by incorporating a temporal model of the radiotracer directly into the reconstruction algorithm. In this work we have developed a highly constrained, fully dynamic PET reconstruction algorithm incorporating both spectral analysis temporal basis functions and spatial basis functions derived from the kernel method applied to a co-registered T1-weighted magnetic resonance (MR) image. The dynamic PET image is modelled as a linear combination of spatial and temporal basis functions, and a maximum likelihood estimate for the coefficients can be found using the expectation-maximization (EM) algorithm. Following reconstruction, kinetic fitting using any temporal model of interest can be applied. Based on a BrainWeb T1-weighted MR phantom, we performed a realistic dynamic [(18)F]FDG simulation study with two noise levels, and investigated the quantitative performance of the proposed reconstruction algorithm, comparing it with reconstructions incorporating either spectral analysis temporal basis functions alone or kernel spatial basis functions alone, as well as with conventional frame-independent reconstruction. Compared to the other reconstruction algorithms, the proposed algorithm achieved superior performance, offering a decrease in spatially averaged pixel-level root-mean-square-error on post-reconstruction kinetic parametric maps in the grey/white matter, as well as in the tumours when they were present on the co-registered MR image. When the tumours were not visible in the MR image, reconstruction with the proposed algorithm performed similarly to reconstruction with spectral temporal basis functions and was superior to both conventional frame-independent reconstruction and frame-independent reconstruction with kernel spatial basis functions. Furthermore, we demonstrate that a joint spectral/kernel

  20. Weighted Feature Gaussian Kernel SVM for Emotion Recognition

    PubMed Central

    Jia, Qingxuan

    2016-01-01

    Emotion recognition with weighted features based on facial expression is a challenging research topic and has attracted great attention in the past few years. This paper presents a novel method that utilizes the subregion recognition rate to weight the kernel function. First, we divide the facial expression image into uniform subregions and calculate the corresponding recognition rate and weight of each. Then, we obtain a weighted feature Gaussian kernel function and construct a classifier based on the Support Vector Machine (SVM). Finally, the experimental results suggest that the approach based on the weighted feature Gaussian kernel function performs well in terms of the correct recognition rate in emotion recognition. The experiments on the extended Cohn-Kanade (CK+) dataset show that our method achieves encouraging recognition results compared to state-of-the-art methods. PMID:27807443
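
    A minimal sketch of the weighting idea: features (standing in for subregion descriptors) are scaled inside a Gaussian kernel by per-feature weights, e.g. normalized recognition rates, and the resulting kernel is passed to an SVM; the subregion partitioning and the actual recognition-rate estimation are omitted, and all numbers are placeholders.

```python
import numpy as np
from sklearn.svm import SVC

def weighted_rbf(weights, gamma=0.1):
    """Gaussian kernel over features scaled by per-feature weights (e.g.,
    normalized subregion recognition rates)."""
    w = np.sqrt(np.asarray(weights, float))
    def kernel(X, Y):
        Xw, Yw = X * w, Y * w
        sq = (np.sum(Xw**2, 1)[:, None] + np.sum(Yw**2, 1)[None, :]
              - 2 * Xw @ Yw.T)
        return np.exp(-gamma * sq)
    return kernel

# Placeholder features and weights standing in for subregion descriptors
# and their recognition rates.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(120, 6)), rng.integers(0, 2, 120)
rates = np.array([0.9, 0.8, 0.3, 0.5, 0.7, 0.4])
clf = SVC(kernel=weighted_rbf(rates / rates.sum())).fit(X, y)
print(clf.score(X, y))
```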

  1. Searching Remote Homology with Spectral Clustering with Symmetry in Neighborhood Cluster Kernels

    PubMed Central

    Maulik, Ujjwal; Sarkar, Anasua

    2013-01-01

    Remote homology detection among proteins utilizing only the unlabelled sequences is a central problem in comparative genomics. The existing cluster kernel methods based on neighborhoods and profiles, and the Markov clustering algorithms, are currently the most popular methods for protein family recognition. Their deviation from random walks with inflation, or their dependency on a hard threshold in the similarity measure, calls for an enhancement of homology detection among multi-domain proteins. We propose to combine spectral clustering with neighborhood kernels in Markov similarity to enhance sensitivity in detecting homology independent of “recent” paralogs. The spectral clustering approach with the new combined local alignment kernels more effectively exploits the unsupervised protein sequences globally, reducing inter-cluster walks. When combined with corrections based on a modified symmetry-based proximity norm that de-emphasizes outliers, the technique proposed in this article outperforms the other state-of-the-art cluster kernels among all twelve implemented kernels. The comparison with state-of-the-art string and mismatch kernels also shows the superior performance scores provided by the proposed kernels. A similar performance improvement is also found on an existing large dataset. Therefore, the proposed spectral clustering framework over combined local alignment kernels with modified symmetry-based correction achieves superior performance for unsupervised remote homolog detection, even in multi-domain and promiscuous-domain proteins from Genolevures database families, with better biological relevance. Source code available upon request. Contact: sarkar@labri.fr. PMID:23457439

  2. Celluclast 1.5L pretreatment enhanced aroma of palm kernels and oil after kernel roasting.

    PubMed

    Zhang, Wencan; Zhao, Fangju; Yang, Tiankui; Zhao, Feifei; Liu, Shaoquan

    2017-12-01

    The aroma of palm kernel oil (PKO) affects its applications. Little information is available on how enzymatic modification of palm kernels (PK) affects PK and PKO aroma after kernel roasting. Celluclast (cellulase) pretreatment of PK resulted in a 2.4-fold increment in the concentration of soluble sugars, with glucose being increased by 6.0-fold. Levels of O-heterocyclic volatile compounds were 1.7-, 1.8- and 1.9-fold higher in the treated PK after roasting at 180 °C for 8, 14 and 20 min, respectively, relative to the corresponding control, with furfural, 5-methyl-2-furancarboxaldehyde, 2-furanmethanol and maltol in particularly higher amounts. Volatile differences between PKOs from control and treated PK were also found, though less obvious owing to the aqueous extraction process. Principal component analysis based on aroma-active compounds revealed that, as roasting proceeded, the differentiation between control and treated PK was enlarged while that of the corresponding PKOs was less clear-cut. Celluclast pretreatment enabled the medium roasted PK to impart more nutty, roasty and caramelic odor and the corresponding PKO to impart more caramelic but less roasty and burnt notes. Celluclast pretreatment of PK followed by roasting may be a promising new way of improving PKO aroma. © 2017 Society of Chemical Industry.

  3. 40 CFR 87.21 - Standards for exhaust emissions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... aircraft gas turbine engine of class T8 manufactured on or after February 1, 1974, shall not exceed: Smoke number of 30. (b) Exhaust emissions of smoke from each new aircraft gas turbine engine of class TF and of... gas turbine engine of class T3 manufactured on or after January 1, 1978, shall not exceed: Smoke...

  4. 40 CFR 87.21 - Standards for exhaust emissions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... aircraft gas turbine engine of class T8 manufactured on or after February 1, 1974, shall not exceed: Smoke number of 30. (b) Exhaust emissions of smoke from each new aircraft gas turbine engine of class TF and of... gas turbine engine of class T3 manufactured on or after January 1, 1978, shall not exceed: Smoke...

  5. 14 CFR 34.31 - Standards for exhaust emissions.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Exhaust Emissions (In-use Aircraft Gas Turbine Engines) § 34.31 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of Class T8, beginning February 1, 1974, shall...

  6. 14 CFR 34.21 - Standards for exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Exhaust Emissions (New Aircraft Gas Turbine Engines) § 34.21 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each new aircraft gas turbine engine of class T8 manufactured on or after February 1, 1974...

  7. 14 CFR 34.31 - Standards for exhaust emissions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Exhaust Emissions (In-use Aircraft Gas Turbine Engines) § 34.31 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of Class T8, beginning February 1, 1974, shall...

  8. 14 CFR 34.31 - Standards for exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... FUEL VENTING AND EXHAUST EMISSION REQUIREMENTS FOR TURBINE ENGINE POWERED AIRPLANES Exhaust Emissions (In-use Aircraft Gas Turbine Engines) § 34.31 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of Class T8, beginning February 1, 1974, shall...

  9. Multiple kernel learning in protein-protein interaction extraction from biomedical literature.

    PubMed

    Yang, Zhihao; Tang, Nan; Zhang, Xiao; Lin, Hongfei; Li, Yanpeng; Yang, Zhiwei

    2011-03-01

    Knowledge about protein-protein interactions (PPIs) unveils the molecular mechanisms of biological processes. The volume and content of published biomedical literature on protein interactions is expanding rapidly, making it increasingly difficult for interaction database administrators, responsible for content input and maintenance, to detect and manually update protein interaction information. The objective of this work is to develop an effective approach to automatic extraction of PPI information from biomedical literature. We present a weighted multiple kernel learning-based approach for automatic PPI extraction from biomedical literature. The approach combines the following kernels: feature-based, tree, graph and part-of-speech (POS) path. In particular, we extend the shortest path-enclosed tree (SPT) and dependency path tree to capture richer contextual information. Our experimental results show that the combination of SPT and dependency path tree extensions contributes to an improvement in performance of almost 0.7 percentage units in F-score and 2 percentage units in area under the receiver operating characteristics curve (AUC). Combining two or more appropriately weighted individual kernels further improves the performance. On both individual-corpus and cross-corpus evaluation, our combined kernel achieves state-of-the-art performance with respect to comparable evaluations, with 64.41% F-score and 88.46% AUC on the AImed corpus. As different kernels calculate the similarity between two sentences from different aspects, our combined kernel can reduce the risk of missing important features. More specifically, we use a weighted linear combination of individual kernels instead of assigning the same weight to each individual kernel, thus allowing each kernel to incrementally contribute to the performance improvement. In addition, the SPT and dependency path tree extensions improve performance by including richer context information.

  10. Relationship between processing score and kernel-fraction particle size in whole-plant corn silage.

    PubMed

    Dias Junior, G S; Ferraretto, L F; Salvati, G G S; de Resende, L C; Hoffman, P C; Pereira, M N; Shaver, R D

    2016-04-01

    Kernel processing increases starch digestibility in whole-plant corn silage (WPCS). Corn silage processing score (CSPS), the percentage of starch passing through a 4.75-mm sieve, is widely used to assess degree of kernel breakage in WPCS. However, the geometric mean particle size (GMPS) of the kernel-fraction that passes through the 4.75-mm sieve has not been well described. Therefore, the objectives of this study were (1) to evaluate particle size distribution and digestibility of kernels cut in varied particle sizes; (2) to propose a method to measure GMPS in WPCS kernels; and (3) to evaluate the relationship between CSPS and GMPS of the kernel fraction in WPCS. Composite samples of unfermented, dried kernels from 110 corn hybrids commonly used for silage production were kept whole (WH) or manually cut in 2, 4, 8, 16, 32 or 64 pieces (2P, 4P, 8P, 16P, 32P, and 64P, respectively). Dry sieving to determine GMPS, surface area, and particle size distribution using 9 sieves with nominal square apertures of 9.50, 6.70, 4.75, 3.35, 2.36, 1.70, 1.18, and 0.59 mm and pan, as well as ruminal in situ dry matter (DM) digestibilities were performed for each kernel particle number treatment. Incubation times were 0, 3, 6, 12, and 24 h. The ruminal in situ DM disappearance of unfermented kernels increased with the reduction in particle size of corn kernels. Kernels kept whole had the lowest ruminal DM disappearance for all time points with maximum DM disappearance of 6.9% at 24 h and the greatest disappearance was observed for 64P, followed by 32P and 16P. Samples of WPCS (n=80) from 3 studies representing varied theoretical length of cut settings and processor types and settings were also evaluated. Each WPCS sample was divided in 2 and then dried at 60 °C for 48 h. The CSPS was determined in duplicate on 1 of the split samples, whereas on the other split sample the kernel and stover fractions were separated using a hydrodynamic separation procedure. After separation, the
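
    Assuming the usual log-normal sieving convention (e.g., an ASABE S424-style geometric mean of adjacent sieve openings, mass-weighted on a log scale), the kernel-fraction GMPS could be computed roughly as below; the top-sieve and pan conventions vary between standards, and the mass fractions shown are illustrative, not data from the study.

```python
import numpy as np

def gmps(apertures_mm, mass_g, pan_size_mm=0.3):
    """Geometric mean particle size (mm) from dry sieving.

    apertures_mm : sieve openings, largest to smallest (pan excluded)
    mass_g       : mass retained on each sieve plus the pan (one extra entry)
    pan_size_mm  : assumed nominal size of material collected on the pan
    """
    apertures = np.asarray(apertures_mm, float)
    mass = np.asarray(mass_g, float)
    # Nominal particle size on each sieve: geometric mean of that sieve's opening
    # and the next larger opening (the top sieve uses its own opening).
    upper = np.r_[apertures[0], apertures[:-1]]
    d = np.r_[np.sqrt(apertures * upper), pan_size_mm]
    w = mass / mass.sum()
    return float(10 ** np.sum(w * np.log10(d)))

sieves = [9.50, 6.70, 4.75, 3.35, 2.36, 1.70, 1.18, 0.59]   # mm, as in the study
mass = [1, 4, 12, 25, 30, 15, 8, 3, 2]                      # g, illustrative only
print(round(gmps(sieves, mass), 2))
```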

  11. Boundary conditions for gas flow problems from anisotropic scattering kernels

    NASA Astrophysics Data System (ADS)

    To, Quy-Dong; Vu, Van-Huyen; Lauriat, Guy; Léonard, Céline

    2015-10-01

    The paper presents an interface model for gas flowing through a channel constituted of anisotropic wall surfaces. Using anisotropic scattering kernels and the Chapman-Enskog phase density, the boundary conditions (BCs) for velocity, temperature, and discontinuities including velocity slip and temperature jump at the wall are obtained. Two scattering kernels, the Dadzie and Méolans (DM) kernel and the generalized anisotropic Cercignani-Lampis (ACL) kernel, are examined in the present paper, yielding simple BCs at the wall-fluid interface. With these two kernels, we rigorously recover the analytical expression for orientation-dependent slip shown in our previous works [Pham et al., Phys. Rev. E 86, 051201 (2012) and To et al., J. Heat Transfer 137, 091002 (2015)], which is in good agreement with molecular dynamics simulation results. More importantly, our models include both the thermal transpiration effect and new equations for the temperature jump. While the same expression depending on the two tangential accommodation coefficients is obtained for the slip velocity, the DM and ACL temperature equations are significantly different. The derived BC equations associated with these two kernels are of interest for gas simulations since they are able to capture the direction-dependent slip behavior of anisotropic interfaces.

  12. Structured Kernel Dictionary Learning with Correlation Constraint for Object Recognition.

    PubMed

    Wang, Zhengjue; Wang, Yinghua; Liu, Hongwei; Zhang, Hao

    2017-06-21

    In this paper, we propose a new discriminative non-linear dictionary learning approach, called correlation constrained structured kernel KSVD, for object recognition. The objective function for dictionary learning contains a reconstructive term and a discriminative term. In the reconstructive term, signals are implicitly non-linearly mapped into a space where a structured kernel dictionary, each sub-dictionary of which lies in the span of the mapped signals from the corresponding class, is established. In the discriminative term, by analyzing the classification mechanism, the correlation constraint is proposed in kernel form, constraining the correlations between different discriminative codes and restricting the coefficient vectors to be transformed into a feature space where the features are highly correlated within classes and nearly independent between classes. The objective function is optimized by the proposed structured kernel KSVD. During the classification stage, the specific form of the discriminative feature need not be known, since the inner product of the discriminative feature, with the kernel matrix embedded, is available and is suitable for a linear SVM classifier. Experimental results demonstrate that the proposed approach outperforms many state-of-the-art dictionary learning approaches for face, scene and synthetic aperture radar (SAR) vehicle target recognition.

  13. Ambered kernels in stenospermocarpic fruit of eastern black walnut

    Treesearch

    Michele R. Warmund; J.W. Van Sambeek

    2014-01-01

    "Ambers" is a term used to describe poorly filled, shriveled eastern black walnut (Juglans nigra L.) kernels with a dark brown or black-colored pellicle that are unmarketable. Studies were conducted to determine the incidence of ambered black walnut kernels and to ascertain when symptoms were apparent in specific tissues. The occurrence of...

  14. Antioxidant and antimicrobial activities of bitter and sweet apricot (Prunus armeniaca L.) kernels.

    PubMed

    Yiğit, D; Yiğit, N; Mavi, A

    2009-04-01

    The present study describes the in vitro antimicrobial and antioxidant activity of methanol and water extracts of sweet and bitter apricot (Prunus armeniaca L.) kernels. The antioxidant properties of apricot kernels were evaluated by determining radical scavenging power, lipid peroxidation inhibition activity and total phenol content measured with a DPPH test, the thiocyanate method and the Folin method, respectively. In contrast to extracts of the bitter kernels, both the water and methanol extracts of sweet kernels have antioxidant potential. The highest percent inhibition of lipid peroxidation (69%) and total phenolic content (7.9 +/- 0.2 microg/mL) were detected in the methanol extract of sweet kernels (Hasanbey) and in the water extract of the same cultivar, respectively. The antimicrobial activities of the above extracts were also tested against human pathogenic microorganisms using a disc-diffusion method, and the minimal inhibitory concentration (MIC) values of each active extract were determined. The most effective antibacterial activity was observed in the methanol and water extracts of bitter kernels and in the methanol extract of sweet kernels against the Gram-positive bacterium Staphylococcus aureus. Additionally, the methanol extracts of the bitter kernels were very potent against the Gram-negative bacterium Escherichia coli (0.312 mg/mL MIC value). Significant anti-Candida activity was also observed with the methanol extract of bitter apricot kernels against Candida albicans, with an inhibition zone of 14 mm in diameter and a MIC value of 0.625 mg/mL.

  15. Acute cyanide toxicity caused by apricot kernel ingestion.

    PubMed

    Suchard, J R; Wallace, K L; Gerkin, R D

    1998-12-01

    A 41-year-old woman ingested apricot kernels purchased at a health food store and became weak and dyspneic within 20 minutes. The patient was comatose and hypothermic on presentation but responded promptly to antidotal therapy for cyanide poisoning. She was later treated with a continuous thiosulfate infusion for persistent metabolic acidosis. This is the first reported case of cyanide toxicity from apricot kernel ingestion in the United States since 1979.

  16. Nutrition quality of extraction mannan residue from palm kernel cake on broiler chicken

    NASA Astrophysics Data System (ADS)

    Tafsin, M.; Hanafi, N. D.; Kejora, E.; Yusraini, E.

    2018-02-01

    This study aims to determine the nutritional quality of mannan-extraction residue from palm kernel cake for broiler chickens by evaluating physical quality (specific gravity, bulk density and compacted bulk density), chemical quality (proximate analysis and Van Soest test) and a biological test (metabolizable energy). Treatments comprised T0: palm kernel cake extracted with aquadest (control); T1: palm kernel cake extracted with 1% acetic acid (CH3COOH); T2: palm kernel cake extracted with aquadest + mannanase enzyme 100 u/l; and T3: palm kernel cake extracted with 1% acetic acid (CH3COOH) + mannanase enzyme 100 u/l. The results showed that mannan extraction had a significant effect (P<0.05) in improving physical quality, and it numerically increased the crude protein value and decreased the NDF (Neutral Detergent Fiber) value. The treatments had a highly significant influence (P<0.01) on the metabolizable energy value of palm kernel cake residue in broiler chickens. It can be concluded that extraction with aquadest + mannanase enzyme 100 u/l yields the best nutrient quality of palm kernel cake residue for broiler chicken.

  17. The site, size, spatial stability, and energetics of an X-ray flare kernel

    NASA Technical Reports Server (NTRS)

    Petrasso, R.; Gerassimenko, M.; Nolte, J.

    1979-01-01

    The site, size evolution, and energetics of an X-ray kernel that dominated a solar flare during its rise and somewhat during its peak are investigated. The position of the kernel remained stationary to within about 3 arc sec over the 30-min interval of observations, despite pulsations in the kernel X-ray brightness in excess of a factor of 10. This suggests a tightly bound, deeply rooted magnetic structure, more plausibly associated with the near chromosphere or low corona rather than with the high corona. The H-alpha flare onset coincided with the appearance of the kernel, again suggesting a close spatial and temporal coupling between the chromospheric H-alpha event and the X-ray kernel. At the first kernel brightness peak its size was no larger than about 2 arc sec, when it accounted for about 40% of the total flare flux. In the second rise phase of the kernel, a source power input of order 2 × 10^24 erg/s is minimally required.

  18. Reduction of Aflatoxins in Apricot Kernels by Electronic and Manual Color Sorting.

    PubMed

    Zivoli, Rosanna; Gambacorta, Lucia; Piemontese, Luca; Solfrizzo, Michele

    2016-01-19

    The efficacy of color sorting on reducing aflatoxin levels in shelled apricot kernels was assessed. Naturally-contaminated kernels were submitted to an electronic optical sorter or blanched, peeled, and manually sorted to visually identify and sort discolored kernels (dark and spotted) from healthy ones. The samples obtained from the two sorting approaches were ground, homogenized, and analysed by HPLC-FLD for their aflatoxin content. A mass balance approach was used to measure the distribution of aflatoxins in the collected fractions. Aflatoxin B₁ and B₂ were identified and quantitated in all collected fractions at levels ranging from 1.7 to 22,451.5 µg/kg of AFB₁ + AFB₂, whereas AFG₁ and AFG₂ were not detected. Excellent results were obtained by manual sorting of peeled kernels since the removal of discolored kernels (2.6%-19.9% of total peeled kernels) removed 97.3%-99.5% of total aflatoxins. The combination of peeling and visual/manual separation of discolored kernels is a feasible strategy to remove 97%-99% of aflatoxins accumulated in naturally-contaminated samples. Electronic optical sorter gave highly variable results since the amount of AFB₁ + AFB₂ measured in rejected fractions (15%-18% of total kernels) ranged from 13% to 59% of total aflatoxins. An improved immunoaffinity-based HPLC-FLD method having low limits of detection for the four aflatoxins (0.01-0.05 µg/kg) was developed and used to monitor the occurrence of aflatoxins in 47 commercial products containing apricot kernels and/or almonds commercialized in Italy. Low aflatoxin levels were found in 38% of the tested samples and ranged from 0.06 to 1.50 μg/kg for AFB₁ and from 0.06 to 1.79 μg/kg for total aflatoxins.

  19. Reduction of Aflatoxins in Apricot Kernels by Electronic and Manual Color Sorting

    PubMed Central

    Zivoli, Rosanna; Gambacorta, Lucia; Piemontese, Luca; Solfrizzo, Michele

    2016-01-01

    The efficacy of color sorting on reducing aflatoxin levels in shelled apricot kernels was assessed. Naturally-contaminated kernels were submitted to an electronic optical sorter or blanched, peeled, and manually sorted to visually identify and sort discolored kernels (dark and spotted) from healthy ones. The samples obtained from the two sorting approaches were ground, homogenized, and analysed by HPLC-FLD for their aflatoxin content. A mass balance approach was used to measure the distribution of aflatoxins in the collected fractions. Aflatoxin B1 and B2 were identified and quantitated in all collected fractions at levels ranging from 1.7 to 22,451.5 µg/kg of AFB1 + AFB2, whereas AFG1 and AFG2 were not detected. Excellent results were obtained by manual sorting of peeled kernels since the removal of discolored kernels (2.6%–19.9% of total peeled kernels) removed 97.3%–99.5% of total aflatoxins. The combination of peeling and visual/manual separation of discolored kernels is a feasible strategy to remove 97%–99% of aflatoxins accumulated in naturally-contaminated samples. Electronic optical sorter gave highly variable results since the amount of AFB1 + AFB2 measured in rejected fractions (15%–18% of total kernels) ranged from 13% to 59% of total aflatoxins. An improved immunoaffinity-based HPLC-FLD method having low limits of detection for the four aflatoxins (0.01–0.05 µg/kg) was developed and used to monitor the occurrence of aflatoxins in 47 commercial products containing apricot kernels and/or almonds commercialized in Italy. Low aflatoxin levels were found in 38% of the tested samples and ranged from 0.06 to 1.50 μg/kg for AFB1 and from 0.06 to 1.79 μg/kg for total aflatoxins. PMID:26797635

  20. Gaussian processes with optimal kernel construction for neuro-degenerative clinical onset prediction

    NASA Astrophysics Data System (ADS)

    Canas, Liane S.; Yvernault, Benjamin; Cash, David M.; Molteni, Erika; Veale, Tom; Benzinger, Tammie; Ourselin, Sébastien; Mead, Simon; Modat, Marc

    2018-02-01

    Gaussian Processes (GP) are a powerful tool to capture the complex time variations of a dataset. In the context of medical imaging analysis, they allow robust modelling even in the case of highly uncertain or incomplete datasets. Predictions from a GP depend on the covariance kernel function selected to explain the data variance; to overcome this sensitivity to the kernel choice, we propose a framework to identify the optimal covariance kernel function to model the data. The optimal kernel is defined as a composition of base kernel functions used to identify correlation patterns between data points. Our approach includes a modified version of the Compositional Kernel Learning (CKL) algorithm, in which we score the kernel families using a new energy function that depends on both the Bayesian Information Criterion (BIC) and the explained variance score. We applied the proposed framework to model the progression of neurodegenerative diseases over time, in particular the progression of autosomal dominantly inherited Alzheimer's disease, and used it to predict the time to clinical onset of subjects carrying the genetic mutation.
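
    A simplified, greedy flavour of compositional kernel construction can be sketched with off-the-shelf Gaussian-process kernels, scoring each candidate composition by BIC alone (the paper's energy function also incorporates the explained variance score, which is omitted here); the base kernel set, search depth, and data are illustrative.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, DotProduct, ExpSineSquared, WhiteKernel

def bic(gp, n):
    # BIC from the fitted GP's log marginal likelihood and hyperparameter count.
    k = gp.kernel_.theta.size
    return -2.0 * gp.log_marginal_likelihood_value_ + k * np.log(n)

def greedy_kernel_search(X, y, depth=2):
    """Greedily grow a composite kernel (sum of base kernels), keeping the
    candidate with the lowest BIC at each step."""
    bases = [RBF(), DotProduct(), ExpSineSquared()]
    current, best_score = WhiteKernel(), np.inf
    for _ in range(depth):
        best = None
        for b in bases:
            gp = GaussianProcessRegressor(kernel=current + b, normalize_y=True).fit(X, y)
            score = bic(gp, len(y))
            if score < best_score:
                best, best_score = gp.kernel_, score
        if best is None:
            break
        current = best
    return current, best_score

rng = np.random.default_rng(0)
X = np.linspace(0, 5, 60)[:, None]
y = np.sin(2 * X[:, 0]) + 0.3 * X[:, 0] + rng.normal(0, 0.1, 60)
kernel, score = greedy_kernel_search(X, y)
print(kernel, round(score, 1))
```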

  1. A Comparative Study of Pairwise Learning Methods Based on Kernel Ridge Regression.

    PubMed

    Stock, Michiel; Pahikkala, Tapio; Airola, Antti; De Baets, Bernard; Waegeman, Willem

    2018-06-12

    Many machine learning problems can be formulated as predicting labels for a pair of objects. Problems of that kind are often referred to as pairwise learning, dyadic prediction, or network inference problems. During the past decade, kernel methods have played a dominant role in pairwise learning. They still obtain a state-of-the-art predictive performance, but a theoretical analysis of their behavior has been underexplored in the machine learning literature. In this work we review and unify kernel-based algorithms that are commonly used in different pairwise learning settings, ranging from matrix filtering to zero-shot learning. To this end, we focus on closed-form efficient instantiations of Kronecker kernel ridge regression. We show that independent task kernel ridge regression, two-step kernel ridge regression, and a linear matrix filter arise naturally as a special case of Kronecker kernel ridge regression, implying that all these methods implicitly minimize a squared loss. In addition, we analyze universality, consistency, and spectral filtering properties. Our theoretical results provide valuable insights into assessing the advantages and limitations of existing pairwise learning methods.
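
    As a concrete illustration of one of the special cases discussed above, a minimal two-step kernel ridge regression for dyadic prediction can be written in closed form as below; the kernels, regularization values, and data are synthetic placeholders.

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def two_step_krr(Ku, Kv, Y, lam_u=1.0, lam_v=1.0):
    """Two-step kernel ridge regression for pairwise learning: ridge over the
    first object set, then over the second. Returns dual coefficients W such
    that f(u, v) = k_u^T W k_v."""
    W = np.linalg.solve(Ku + lam_u * np.eye(Ku.shape[0]), Y)
    W = np.linalg.solve(Kv + lam_v * np.eye(Kv.shape[0]), W.T).T
    return W

# Synthetic dyadic data: labels for all pairs of 30 "u" objects and 20 "v" objects.
rng = np.random.default_rng(0)
U, V = rng.normal(size=(30, 4)), rng.normal(size=(20, 4))
Y = U @ rng.normal(size=(4, 4)) @ V.T + 0.1 * rng.normal(size=(30, 20))
W = two_step_krr(rbf(U, U), rbf(V, V), Y)
print((rbf(U, U) @ W @ rbf(V, V)).shape)    # in-sample predictions for all pairs
```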

  2. Quantifying the impact of smoke aerosol on the UV radiation

    NASA Astrophysics Data System (ADS)

    Sokolik, I. N.; Tatarskii, V.; Hall, S. R.; Petropavlovskikh, I. V.

    2017-12-01

    We present an analysis of the impact of smoke on UV radiation, performed for a case study by combining modeling and measurements. The case study focuses on wildfires that occurred in California in ????. The fires have been affecting the environment in the region, posing a serious threat to human well-being. The modeling is performed using a fully coupled WRF-Chem-SMOKE model. The model uses MODIS fire radiative power (FRP) satellite data to generate the smoke emissions for an actual event. The smoke aerosol is treated in a size- and composition-resolved manner. The optical properties are computed online and provided to the TUV model that is incorporated in the WRF-Chem-SMOKE model. We analyze the impact of smoke on UV radiation and assess its impact on the TOA radiative forcing. Our results show a significant impact of smoke on the radiative regime of the atmosphere.

  3. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach

    PubMed Central

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-01-01

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Being different from the existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of base kernels are regarded as external parameters of single-hidden layer feedforward neural networks (SLFNs). The combination coefficients of base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is also compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radical basis function neural network (RBFNN), and probabilistic neural network (PNN). The results have demonstrated that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision, but also in efficiency for gas classification. PMID:28629202

  4. A Novel Extreme Learning Machine Classification Model for e-Nose Application Based on the Multiple Kernel Approach.

    PubMed

    Jian, Yulin; Huang, Daoyu; Yan, Jia; Lu, Kun; Huang, Ying; Wen, Tailai; Zeng, Tanyue; Zhong, Shijie; Xie, Qilong

    2017-06-19

    A novel classification model, named the quantum-behaved particle swarm optimization (QPSO)-based weighted multiple kernel extreme learning machine (QWMK-ELM), is proposed in this paper. Experimental validation is carried out with two different electronic nose (e-nose) datasets. Being different from the existing multiple kernel extreme learning machine (MK-ELM) algorithms, the combination coefficients of base kernels are regarded as external parameters of single-hidden layer feedforward neural networks (SLFNs). The combination coefficients of base kernels, the model parameters of each base kernel, and the regularization parameter are optimized by QPSO simultaneously before implementing the kernel extreme learning machine (KELM) with the composite kernel function. Four types of common single kernel functions (Gaussian kernel, polynomial kernel, sigmoid kernel, and wavelet kernel) are utilized to constitute different composite kernel functions. Moreover, the method is also compared with other existing classification methods: extreme learning machine (ELM), kernel extreme learning machine (KELM), k-nearest neighbors (KNN), support vector machine (SVM), multi-layer perceptron (MLP), radial basis function neural network (RBFNN), and probabilistic neural network (PNN). The results have demonstrated that the proposed QWMK-ELM outperforms the aforementioned methods, not only in precision, but also in efficiency for gas classification.

  5. Improved scatter correction using adaptive scatter kernel superposition

    NASA Astrophysics Data System (ADS)

    Sun, M.; Star-Lack, J. M.

    2010-11-01

    Accurate scatter correction is required to produce high-quality reconstructions of x-ray cone-beam computed tomography (CBCT) scans. This paper describes new scatter kernel superposition (SKS) algorithms for deconvolving scatter from projection data. The algorithms are designed to improve upon the conventional approach whose accuracy is limited by the use of symmetric kernels that characterize the scatter properties of uniform slabs. To model scatter transport in more realistic objects, nonstationary kernels, whose shapes adapt to local thickness variations in the projection data, are proposed. Two methods are introduced: (1) adaptive scatter kernel superposition (ASKS) requiring spatial domain convolutions and (2) fast adaptive scatter kernel superposition (fASKS) where, through a linearity approximation, convolution is efficiently performed in Fourier space. The conventional SKS algorithm, ASKS, and fASKS were tested with Monte Carlo simulations and with phantom data acquired on a table-top CBCT system matching the Varian On-Board Imager (OBI). All three models accounted for scatter point-spread broadening due to object thickening, object edge effects, detector scatter properties and an anti-scatter grid. Hounsfield unit (HU) errors in reconstructions of a large pelvis phantom with a measured maximum scatter-to-primary ratio over 200% were reduced from -90 ± 58 HU (mean ± standard deviation) with no scatter correction to 53 ± 82 HU with SKS, to 19 ± 25 HU with fASKS and to 13 ± 21 HU with ASKS. HU accuracies and measured contrast were similarly improved in reconstructions of a body-sized elliptical Catphan phantom. The results show that the adaptive SKS methods offer significant advantages over the conventional scatter deconvolution technique.
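
    For orientation, the following is a much-simplified, stationary scatter-kernel-superposition correction (a single fixed kernel applied by FFT convolution, in the spirit of the fASKS frequency-domain step). The thickness-adaptive kernel shaping that distinguishes ASKS/fASKS is deliberately omitted, and the kernel and phantom below are synthetic assumptions:

    ```python
    import numpy as np

    def stationary_sks_correct(projection, kernel, n_iter=3):
        """Estimate scatter as a convolution of the current primary estimate with a
        fixed scatter kernel and subtract it from the measured projection."""
        K = np.fft.fft2(np.fft.ifftshift(kernel), s=projection.shape)
        primary = projection.copy()
        for _ in range(n_iter):
            scatter = np.real(np.fft.ifft2(np.fft.fft2(primary) * K))
            scatter = np.clip(scatter, 0, None)
            primary = np.clip(projection - scatter, 0, None)
        return primary

    # toy usage: broad Gaussian scatter kernel applied to a synthetic projection
    y, x = np.mgrid[-64:64, -64:64]
    kernel = np.exp(-(x**2 + y**2) / (2 * 20.0**2))
    kernel *= 0.2 / kernel.sum()                     # scatter-to-primary amplitude ~0.2
    projection = np.ones((128, 128)) + 0.5 * (np.hypot(x, y) < 30)
    corrected = stationary_sks_correct(projection, kernel)
    ```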

  6. Notes on a storage manager for the Clouds kernel

    NASA Technical Reports Server (NTRS)

    Pitts, David V.; Spafford, Eugene H.

    1986-01-01

    The Clouds project is research directed towards producing a reliable distributed computing system. The initial goal is to produce a kernel which provides a reliable environment with which a distributed operating system can be built. The Clouds kernel consists of a set of replicated subkernels, each of which runs on a machine in the Clouds system. Each subkernel is responsible for the management of resources on its machine; the subkernel components communicate to provide the cooperation necessary to meld the various machines into one kernel. The implementation of a kernel-level storage manager that supports reliability is documented. The storage manager is a part of each subkernel and maintains the secondary storage residing at each machine in the distributed system. In addition to providing the usual data transfer services, the storage manager ensures that data being stored survives machine and system crashes, and that the secondary storage of a failed machine is recovered (made consistent) automatically when the machine is restarted. Since the storage manager is part of the Clouds kernel, efficiency of operation is also a concern.

  7. Metabolite identification through multiple kernel learning on fragmentation trees.

    PubMed

    Shen, Huibin; Dührkop, Kai; Böcker, Sebastian; Rousu, Juho

    2014-06-15

    Metabolite identification from tandem mass spectrometric data is a key task in metabolomics. Various computational methods have been proposed for the identification of metabolites from tandem mass spectra. Fragmentation tree methods explore the space of possible ways in which the metabolite can fragment, and base the metabolite identification on scoring of these fragmentation trees. Machine learning methods have been used to map mass spectra to molecular fingerprints; predicted fingerprints, in turn, can be used to score candidate molecular structures. Here, we combine fragmentation tree computations with kernel-based machine learning to predict molecular fingerprints and identify molecular structures. We introduce a family of kernels capturing the similarity of fragmentation trees, and combine these kernels using recently proposed multiple kernel learning approaches. Experiments on two large reference datasets show that the new methods significantly improve molecular fingerprint prediction accuracy. These improvements result in better metabolite identification, doubling the number of metabolites ranked at the top position of the candidates list. © The Author 2014. Published by Oxford University Press.

  8. Efficient Multiple Kernel Learning Algorithms Using Low-Rank Representation.

    PubMed

    Niu, Wenjia; Xia, Kewen; Zu, Baokai; Bai, Jianchuan

    2017-01-01

    Unlike the Support Vector Machine (SVM), Multiple Kernel Learning (MKL) allows the useful kernels to be chosen freely according to the data's distribution characteristics rather than committing to a single, precisely specified kernel. It has been shown in the literature that MKL achieves superior recognition accuracy compared with SVM, however at the expense of time-consuming computations. This creates analytical and computational difficulties in solving MKL algorithms. To overcome this issue, we first develop a novel kernel approximation approach for MKL and then propose an efficient Low-Rank MKL (LR-MKL) algorithm by using the Low-Rank Representation (LRR). It is well-acknowledged that LRR can reduce dimension while retaining the data features under a global low-rank constraint. Furthermore, we redesign the binary-class MKL as the multiclass MKL based on a pairwise strategy. Finally, the recognition effect and efficiency of LR-MKL are verified on the datasets Yale, ORL, LSVT, and Digit. Experimental results show that the proposed LR-MKL algorithm is an efficient kernel weights allocation method in MKL and considerably boosts the performance of MKL.
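
    The low-rank idea can be illustrated with a Nyström-style factorization of each base kernel before the weighted combination, so downstream solves scale with the rank rather than the sample count; note that this is a generic stand-in, not the LRR construction used in the paper, and all sizes are illustrative:

    ```python
    import numpy as np

    def nystrom_factor(K, rank, rng):
        """Low-rank factor L such that K is approximated by L @ L.T (Nystrom method)."""
        idx = rng.choice(K.shape[0], size=rank, replace=False)
        C = K[:, idx]                        # n x r sampled columns
        W = K[np.ix_(idx, idx)]              # r x r core block
        evals, evecs = np.linalg.eigh(W)
        evals = np.clip(evals, 1e-10, None)
        return C @ evecs / np.sqrt(evals)    # L L^T ~= C W^{-1} C^T

    def combined_lowrank_kernel(factors, weights):
        """Weighted multiple-kernel combination built from the low-rank factors."""
        K = np.zeros((factors[0].shape[0],) * 2)
        for L, w in zip(factors, weights):
            K += w * (L @ L.T)
        return K

    # toy usage: linear + RBF base kernels compressed to rank 30, then combined
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    K_lin = X @ X.T
    K_rbf = np.exp(-0.1 * ((X[:, None] - X[None, :]) ** 2).sum(-1))
    factors = [nystrom_factor(K, rank=30, rng=rng) for K in (K_lin, K_rbf)]
    K_mkl = combined_lowrank_kernel(factors, weights=[0.4, 0.6])
    ```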

  9. Methods for reducing pollutant emissions from jet aircraft

    NASA Technical Reports Server (NTRS)

    Butze, H. F.

    1971-01-01

    Pollutant emissions from jet aircraft and combustion research aimed at reducing these emissions are defined. The problem of smoke formation and results achieved in smoke reduction from commercial combustors are discussed. Experimental results of parametric tests performed on both conventional and experimental combustors over a range of combustor-inlet conditions are presented. Combustor design techniques for reducing pollutant emissions are discussed. Improved fuel atomization resulting from the use of air-assist fuel nozzles has brought about significant reductions in hydrocarbon and carbon monoxide emissions at idle. Diffuser tests have shown that the combustor-inlet airflow profile can be controlled through the use of diffuser-wall bleed and that it may thus be possible to reduce emissions by controlling combustor airflow distribution. Emissions of nitric oxide from a short-length annular swirl-can combustor were significantly lower than those from a conventional combustor operating at similar conditions.

  10. Classification of corn kernels contaminated with aflatoxins using fluorescence and reflectance hyperspectral images analysis

    NASA Astrophysics Data System (ADS)

    Zhu, Fengle; Yao, Haibo; Hruska, Zuzana; Kincaid, Russell; Brown, Robert; Bhatnagar, Deepak; Cleveland, Thomas

    2015-05-01

    Aflatoxins are secondary metabolites produced by certain fungal species of the Aspergillus genus. Aflatoxin contamination remains a problem in agricultural products due to its toxic and carcinogenic properties. Conventional chemical methods for aflatoxin detection are time-consuming and destructive. This study employed fluorescence and reflectance visible near-infrared (VNIR) hyperspectral images to classify aflatoxin-contaminated corn kernels rapidly and non-destructively. Corn ears were artificially inoculated in the field with toxigenic A. flavus spores at the early dough stage of kernel development. After harvest, a total of 300 kernels were collected from the inoculated ears. Fluorescence hyperspectral imagery with UV excitation and reflectance hyperspectral imagery with halogen illumination were acquired on both endosperm and germ sides of kernels. All kernels were then subjected to chemical analysis individually to determine aflatoxin concentrations. A region of interest (ROI) was created for each kernel to extract averaged spectra. Compared with healthy kernels, fluorescence spectral peaks for contaminated kernels shifted to longer wavelengths with lower intensity, and reflectance values for contaminated kernels were lower with a different spectral shape in the 700-800 nm region. Principal component analysis was applied for data compression before classifying kernels into contaminated and healthy based on a 20 ppb threshold utilizing the K-nearest neighbors algorithm. The best overall accuracy achieved was 92.67% for the germ side in the fluorescence data analysis. The germ side generally performed better than the endosperm side. Fluorescence and reflectance image data achieved similar accuracy.
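
    A compact sketch of the classification step described above (PCA compression followed by K-nearest neighbors at the 20 ppb threshold). The spectra below are synthetic stand-ins, and the number of components and neighbors are illustrative choices rather than the study's settings:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline

    # spectra: (n_kernels, n_bands) averaged ROI spectra; aflatoxin_ppb: measured per kernel.
    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(300, 150))
    aflatoxin_ppb = rng.gamma(shape=1.0, scale=30.0, size=300)

    # Label kernels as contaminated (1) or healthy (0) at the 20 ppb threshold
    labels = (aflatoxin_ppb > 20).astype(int)

    # PCA for data compression followed by a K-nearest-neighbors classifier
    model = make_pipeline(PCA(n_components=10), KNeighborsClassifier(n_neighbors=5))
    accuracy = cross_val_score(model, spectra, labels, cv=5).mean()
    print(f"cross-validated accuracy: {accuracy:.3f}")
    ```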

  11. Influence of Kernel Age on Fumonisin B1 Production in Maize by Fusarium moniliforme

    PubMed Central

    Warfield, Colleen Y.; Gilchrist, David G.

    1999-01-01

    Production of fumonisins by Fusarium moniliforme on naturally infected maize ears is an important food safety concern due to the toxic nature of this class of mycotoxins. Assessing the potential risk of fumonisin production in developing maize ears prior to harvest requires an understanding of the regulation of toxin biosynthesis during kernel maturation. We investigated the developmental-stage-dependent relationship between maize kernels and fumonisin B1 production by using kernels collected at the blister (R2), milk (R3), dough (R4), and dent (R5) stages following inoculation in culture at their respective field moisture contents with F. moniliforme. Highly significant differences (P ≤ 0.001) in fumonisin B1 production were found among kernels at the different developmental stages. The highest levels of fumonisin B1 were produced on the dent stage kernels, and the lowest levels were produced on the blister stage kernels. The differences in fumonisin B1 production among kernels at the different developmental stages remained significant (P ≤ 0.001) when the moisture contents of the kernels were adjusted to the same level prior to inoculation. We concluded that toxin production is affected by substrate composition as well as by moisture content. Our study also demonstrated that fumonisin B1 biosynthesis on maize kernels is influenced by factors which vary with the developmental age of the tissue. The risk of fumonisin contamination may begin early in maize ear development and increases as the kernels reach physiological maturity. PMID:10388675

  12. Aerosol transport model evaluation of an extreme smoke episode in Southeast Asia

    NASA Astrophysics Data System (ADS)

    Hyer, Edward J.; Chew, Boon Ning

    2010-04-01

    Biomass burning is one of many sources of particulate pollution in Southeast Asia, but its irregular spatial and temporal patterns mean that large episodes can cause acute air quality problems in urban areas. Fires in Sumatra and Borneo during September and October 2006 contributed to 24-h mean PM10 concentrations above 150 μg m⁻³ at multiple locations in Singapore and Malaysia over several days. We use the FLAMBE model of biomass burning emissions and the NAAPS model of aerosol transport and evolution to simulate these events, and compare our simulation results to 24-h average PM10 measurements from 54 stations in Singapore and Malaysia. The model simulation, including the FLAMBE smoke source as well as dust, sulfate, and sea salt aerosol species, was able to explain 50% or more of the variance in 24-h PM10 observations at 29 of 54 sites. Simulation results indicated that biomass burning smoke contributed to nearly all of the extreme PM10 observations during September-November 2006, but the exact contribution of smoke was unclear because the model severely underestimated total smoke emissions. Using regression analysis at each site, the bias in the smoke aerosol flux was determined to be a factor of between 2.5 and 10, and an overall factor of 3.5 was estimated. After application of this factor, the simulated smoke aerosol concentration averaged 20% of observed PM10, and 40% of PM10 for days with 24-h average concentrations above 150 μg m⁻³. These results suggest that aerosol transport models can aid analysis of severe pollution events in Southeast Asia, but that improvements are needed in models of biomass burning smoke emissions.
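
    The per-site bias estimation can be pictured as a one-parameter least-squares fit of a smoke scaling factor against observed PM10. This is a simplified stand-in for the regression analysis described above, and all numbers below are synthetic:

    ```python
    import numpy as np

    def smoke_scaling_factor(obs_pm10, model_smoke, model_other):
        """Least-squares scale factor a in: obs ~= a * model_smoke + model_other."""
        residual = obs_pm10 - model_other
        return (model_smoke @ residual) / (model_smoke @ model_smoke)

    # toy usage for one site's daily 24-h means (synthetic values)
    rng = np.random.default_rng(0)
    model_smoke = rng.gamma(2.0, 10.0, size=90)      # underestimated smoke PM10
    model_other = rng.gamma(2.0, 5.0, size=90)       # dust + sulfate + sea salt
    obs_pm10 = 3.5 * model_smoke + model_other + rng.normal(0, 5, size=90)
    print(f"estimated smoke bias factor: {smoke_scaling_factor(obs_pm10, model_smoke, model_other):.2f}")
    ```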

  13. Differential metabolome analysis of field-grown maize kernels in response to drought stress

    USDA-ARS?s Scientific Manuscript database

    Drought stress constrains maize kernel development and can exacerbate aflatoxin contamination. In order to identify drought responsive metabolites and explore pathways involved in kernel responses, a metabolomics analysis was conducted on kernels from a drought tolerant line, Lo964, and a sensitive ...

  14. Considering causal genes in the genetic dissection of kernel traits in common wheat.

    PubMed

    Mohler, Volker; Albrecht, Theresa; Castell, Adelheid; Diethelm, Manuela; Schweizer, Günther; Hartl, Lorenz

    2016-11-01

    Genetic factors controlling thousand-kernel weight (TKW) were characterized for their association with other seed traits, including kernel width, kernel length, ratio of kernel width to kernel length (KW/KL), kernel area, and spike number per m² (SN). For this purpose, a genetic map was established utilizing a doubled haploid population derived from a cross between German winter wheat cultivars Pamier and Format. Association studies in a diversity panel of elite cultivars supplemented genetic analysis of kernel traits. In both populations, genomic signatures of 13 candidate genes for TKW and kernel size were analyzed. Major quantitative trait loci (QTL) for TKW were identified on chromosomes 1B, 2A, 2D, and 4D, and their locations coincided with major QTL for kernel size traits, supporting the common belief that TKW is a function of other kernel traits. The QTL on chromosome 2A was associated with TKW candidate gene TaCwi-A1 and the QTL on chromosome 4D was associated with dwarfing gene Rht-D1. A minor QTL for TKW on chromosome 6B coincided with TaGW2-6B. The QTL for kernel dimensions that did not affect TKW were detected on eight chromosomes. A major QTL for KW/KL located at the distal tip of chromosome arm 5AS is being reported for the first time. TaSus1-7A and TaSAP-A1, closely linked to each other on chromosome 7A, could be related to a minor QTL for KW/KL. Genetic analysis of SN confirmed its negative correlation with TKW in this cross. In the diversity panel, TaSus1-7A was associated with TKW. Compared to the Pamier/Format bi-parental population where TaCwi-A1a was associated with higher TKW, the same allele reduced grain yield in the diversity panel, suggesting opposite effects of TaCwi-A1 on these two traits.

  15. Kernel machines for epilepsy diagnosis via EEG signal classification: a comparative study.

    PubMed

    Lima, Clodoaldo A M; Coelho, André L V

    2011-10-01

    We carry out a systematic assessment on a suite of kernel-based learning machines while coping with the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of the criteria of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the features extracted. Four wavelet basis functions were considered in this study. Then, we provide the average accuracy (i.e., cross-validation error) values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated models of the standard and least squares SVMs reached 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations whereby one can visually inspect their levels of sensitiveness to the type of feature and to the kernel function/parameter value. Overall, the results evidence that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs

  16. Sparse Event Modeling with Hierarchical Bayesian Kernel Methods

    DTIC Science & Technology

    2016-01-05

    The research objective of this proposal was to develop a predictive Bayesian kernel approach to model count data based on ... several predictive variables. Such an approach, which we refer to as the Poisson Bayesian kernel model, is able to model the rate of occurrence of ... which adds specificity to the model and can make nonlinear data more manageable. Early results show that the ...

  17. Fire and Smoke Model Evaluation Experiment: Coordination of a study to improve smoke modeling for fire operations within the United States

    NASA Astrophysics Data System (ADS)

    French, N. H. F.; Ottmar, R. D.; Brown, T. J.; Larkin, N. K.

    2017-12-01

    The Fire and Smoke Model Evaluation Experiment (FASMEE) is an integrative research effort to identify and collect critical measurements to improve operational wildland fire and smoke prediction systems. FASMEE has two active phases and one suggested phase. Phase 1 is the analysis and planning process to assess the current state of fire-plume-smoke modeling and to determine the critical measurements required to evaluate and improve these operational fire and smoke models. As the major deliverable for Phase 1, a study plan has been completed that describes the measurement needs, field campaigns, and command, safety and air space de-confliction plans necessary to complete the FASMEE project. Phase 2 is a set of field campaigns to collect data during 2019-2022. Future Improvements would be a set of analyses and model improvements based on the data collected within Phase 2, which is dependent on identifying future funding sources. In this presentation, we will review the FASMEE Study Plan and detailed measurements and conditions expected for the four to five proposed research burns. The recommended measurements during Phase 2 span the four interrelated disciplines of FASMEE: fuels and consumption, fire behavior and energy, plume dynamics and meteorology, and smoke emissions, chemistry, and transport. Fuel type, condition, and consumption during wildland fire relate to several fire impacts, including radiative heating, which provides the energy that drives fire dynamics. Local-scale meteorology is an important factor which relates to atmospheric chemistry, dispersion, and transport. Plume dynamics provide the connection between fire behavior and far-field smoke dispersion, because they determine the vertical distribution of the emissions. Guided by the data needs and science questions generated during Phase 1, three wildland fire campaigns were selected. These included the western wildfire campaign (rapid deployment aimed at western wildfires supporting NOAA, NASA, and NSF

  18. Mapping Fire Severity Using Imaging Spectroscopy and Kernel Based Image Analysis

    NASA Astrophysics Data System (ADS)

    Prasad, S.; Cui, M.; Zhang, Y.; Veraverbeke, S.

    2014-12-01

    Improved spatial representation of within-burn heterogeneity after wildfires is paramount to effective land management decisions and more accurate fire emissions estimates. In this work, we demonstrate the feasibility and efficacy of airborne imaging spectroscopy (hyperspectral imagery) for quantifying wildfire burn severity, using kernel-based image analysis techniques. Two different airborne hyperspectral datasets, acquired over the 2011 Canyon and 2013 Rim fires in California using the Airborne Visible InfraRed Imaging Spectrometer (AVIRIS) sensor, were used in this study. The Rim Fire, covering parts of Yosemite National Park, started on August 17, 2013, and was the third largest fire in California's history. The Canyon Fire occurred in the Tehachapi Mountains and started on September 4, 2011. In addition to post-fire data for both fires, half of the Rim fire was also covered with pre-fire images. Fire severity was measured in the field using the Geo Composite Burn Index (GeoCBI). The field data were utilized to train and validate our models, wherein the trained models, in conjunction with imaging spectroscopy data, were used for GeoCBI estimation over wide geographical regions. This work presents an approach for using remotely sensed imagery combined with GeoCBI field data to map fire scars based on a non-linear (kernel-based) epsilon-Support Vector Regression (e-SVR), which was used to learn the relationship between spectra and GeoCBI in a kernel-induced feature space. Classification of healthy vegetation versus fire-affected areas based on morphological multi-attribute profiles was also studied. The availability of pre- and post-fire imaging spectroscopy data over the Rim Fire provided a unique opportunity to evaluate the performance of bi-temporal imaging spectroscopy for assessing post-fire effects. This type of data is currently constrained because of limited airborne acquisitions before a fire, but will become widespread with future spaceborne sensors such as those on

  19. Omnibus Risk Assessment via Accelerated Failure Time Kernel Machine Modeling

    PubMed Central

    Sinnott, Jennifer A.; Cai, Tianxi

    2013-01-01

    Summary Integrating genomic information with traditional clinical risk factors to improve the prediction of disease outcomes could profoundly change the practice of medicine. However, the large number of potential markers and possible complexity of the relationship between markers and disease make it difficult to construct accurate risk prediction models. Standard approaches for identifying important markers often rely on marginal associations or linearity assumptions and may not capture non-linear or interactive effects. In recent years, much work has been done to group genes into pathways and networks. Integrating such biological knowledge into statistical learning could potentially improve model interpretability and reliability. One effective approach is to employ a kernel machine (KM) framework, which can capture nonlinear effects if nonlinear kernels are used (Scholkopf and Smola, 2002; Liu et al., 2007, 2008). For survival outcomes, KM regression modeling and testing procedures have been derived under a proportional hazards (PH) assumption (Li and Luan, 2003; Cai et al., 2011). In this paper, we derive testing and prediction methods for KM regression under the accelerated failure time model, a useful alternative to the PH model. We approximate the null distribution of our test statistic using resampling procedures. When multiple kernels are of potential interest, it may be unclear in advance which kernel to use for testing and estimation. We propose a robust Omnibus Test that combines information across kernels, and an approach for selecting the best kernel for estimation. The methods are illustrated with an application in breast cancer. PMID:24328713

  20. Omnibus risk assessment via accelerated failure time kernel machine modeling.

    PubMed

    Sinnott, Jennifer A; Cai, Tianxi

    2013-12-01

    Integrating genomic information with traditional clinical risk factors to improve the prediction of disease outcomes could profoundly change the practice of medicine. However, the large number of potential markers and possible complexity of the relationship between markers and disease make it difficult to construct accurate risk prediction models. Standard approaches for identifying important markers often rely on marginal associations or linearity assumptions and may not capture non-linear or interactive effects. In recent years, much work has been done to group genes into pathways and networks. Integrating such biological knowledge into statistical learning could potentially improve model interpretability and reliability. One effective approach is to employ a kernel machine (KM) framework, which can capture nonlinear effects if nonlinear kernels are used (Scholkopf and Smola, 2002; Liu et al., 2007, 2008). For survival outcomes, KM regression modeling and testing procedures have been derived under a proportional hazards (PH) assumption (Li and Luan, 2003; Cai, Tonini, and Lin, 2011). In this article, we derive testing and prediction methods for KM regression under the accelerated failure time (AFT) model, a useful alternative to the PH model. We approximate the null distribution of our test statistic using resampling procedures. When multiple kernels are of potential interest, it may be unclear in advance which kernel to use for testing and estimation. We propose a robust Omnibus Test that combines information across kernels, and an approach for selecting the best kernel for estimation. The methods are illustrated with an application in breast cancer. © 2013, The International Biometric Society.

  1. Kernel Wiener filter and its application to pattern recognition.

    PubMed

    Yoshino, Hirokazu; Dong, Chen; Washizawa, Yoshikazu; Yamashita, Yukihiko

    2010-11-01

    The Wiener filter (WF) is widely used for inverse problems. From an observed signal, it provides the best estimated signal with respect to the squared error averaged over the original and the observed signals among linear operators. The kernel WF (KWF), extended directly from WF, has the problem that additive noise has to be handled through samples. Since the computational complexity of kernel methods depends on the number of samples, a huge computational cost is incurred in that case. By using the first-order approximation of kernel functions, we realize a KWF that can handle such noise not through samples but as a random variable. We also propose an error estimation method for kernel filters using these approximations. In order to show the advantages of the proposed methods, we conducted experiments to denoise images and estimate errors. We also apply KWF to classification since KWF can provide an approximated result of the maximum a posteriori classifier that provides the best recognition accuracy. The noise term in the criterion can be used for classification in the presence of noise or as a new regularization to suppress changes in the input space, whereas the ordinary regularization for the kernel method suppresses changes in the feature space. In order to show the advantages of the proposed methods, we conducted experiments of binary and multiclass classifications and classification in the presence of noise.
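
    For reference, the plain (linear) Wiener filter that KWF generalizes can be written down directly from sample covariances; the kernel version applies the same construction after mapping the observations into an RKHS. The following sketch is a generic illustration under that standard formulation, not the paper's implementation:

    ```python
    import numpy as np

    def wiener_filter(S, Y):
        """Linear Wiener filter estimated from paired samples.

        S : (n_samples, d_s) original signals
        Y : (n_samples, d_y) observed (e.g. noisy/blurred) signals
        Returns W minimizing E||s - W y||^2 over linear operators,
        i.e. W = C_sy @ C_yy^{-1} with sample covariances.
        """
        S0 = S - S.mean(axis=0)
        Y0 = Y - Y.mean(axis=0)
        C_sy = S0.T @ Y0 / len(S)
        C_yy = Y0.T @ Y0 / len(Y)
        return C_sy @ np.linalg.pinv(C_yy)

    # toy usage: recover a signal observed through a random mixing matrix plus noise
    rng = np.random.default_rng(0)
    S = rng.normal(size=(2000, 8))
    A = rng.normal(size=(12, 8))
    Y = S @ A.T + 0.1 * rng.normal(size=(2000, 12))
    W = wiener_filter(S, Y)
    S_hat = (Y - Y.mean(axis=0)) @ W.T + S.mean(axis=0)
    ```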

  2. Combined multi-kernel head computed tomography images optimized for depicting both brain parenchyma and bone.

    PubMed

    Takagi, Satoshi; Nagase, Hiroyuki; Hayashi, Tatsuya; Kita, Tamotsu; Hayashi, Katsumi; Sanada, Shigeru; Koike, Masayuki

    2014-01-01

    The hybrid convolution kernel technique for computed tomography (CT) is known to enable the depiction of an image set using different window settings. Our purpose was to decrease the number of artifacts in the hybrid convolution kernel technique for head CT and to determine whether our improved combined multi-kernel head CT images enabled diagnosis as a substitute for both brain (low-pass kernel-reconstructed) and bone (high-pass kernel-reconstructed) images. Forty-four patients with nondisplaced skull fractures were included. Our improved multi-kernel images were generated so that pixels of >100 Hounsfield units in both brain and bone images took the CT values of the bone images, while all other pixels took the CT values of the brain images. Three radiologists compared the improved multi-kernel images with bone images. The improved multi-kernel images and brain images were identically displayed on the brain window settings. All three radiologists agreed that the improved multi-kernel images on the bone window settings were sufficient for diagnosing skull fractures in all patients. This improved multi-kernel technique has a simple algorithm and is practical for clinical use. Thus, simplified head CT examinations and fewer stored images can be expected.
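
    The combination rule described above is simple enough to state directly in code; the array shapes and HU values below are synthetic, while the 100 HU threshold is the one given in the abstract:

    ```python
    import numpy as np

    def combine_multikernel_ct(brain_img, bone_img, threshold_hu=100):
        """Combined multi-kernel head CT: pixels above the threshold in BOTH
        reconstructions take the bone-kernel value, all others the brain-kernel value."""
        use_bone = (brain_img > threshold_hu) & (bone_img > threshold_hu)
        return np.where(use_bone, bone_img, brain_img)

    # toy usage with synthetic HU images (same slice reconstructed with two kernels)
    rng = np.random.default_rng(0)
    brain = rng.normal(40, 10, size=(512, 512))      # soft-tissue range
    bone = brain + rng.normal(0, 5, size=(512, 512))
    brain[100:110, :] = 900                          # a synthetic "skull" band
    bone[100:110, :] = 1200
    combined = combine_multikernel_ct(brain, bone)
    ```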

  3. Kernelization

    NASA Astrophysics Data System (ADS)

    Fomin, Fedor V.

    Preprocessing (data reduction or kernelization) as a strategy of coping with hard problems is universally used in almost every implementation. The history of preprocessing, such as applying reduction rules that simplify truth functions, can be traced back to the 1950s [6]. A natural question in this regard is how to measure the quality of preprocessing rules proposed for a specific problem. For a long time the mathematical analysis of polynomial-time preprocessing algorithms was neglected. The basic reason for this anomaly was that if we start with an instance I of an NP-hard problem and can show that in polynomial time we can replace it with an equivalent instance I' with |I'| < |I|, then that would imply P=NP in classical complexity: repeating such a reduction at most |I| times would shrink any instance to constant size in polynomial time, at which point it can be solved outright.
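
    A classic concrete example of such reduction rules is the Buss kernel for k-Vertex-Cover; the sketch below is included only to make the notion of kernelization tangible and is not drawn from the cited work:

    ```python
    def vertex_cover_kernel(edges, k):
        """Buss kernelization for k-Vertex-Cover.

        Returns (reduced_edges, k_remaining, forced_vertices), or None when the
        reduced instance already certifies that no cover of size <= k exists.
        """
        edges = {frozenset(e) for e in edges}
        forced = set()
        changed = True
        while changed:
            changed = False
            deg = {}
            for e in edges:
                for v in e:
                    deg[v] = deg.get(v, 0) + 1
            for v, d in deg.items():
                if d > k:                      # v must be in every cover of size <= k
                    forced.add(v)
                    edges = {e for e in edges if v not in e}
                    k -= 1
                    changed = True
                    break
            if k < 0:
                return None
        # a graph with maximum degree <= k and a cover of size <= k has <= k*k edges
        if len(edges) > k * k:
            return None
        return edges, k, forced

    # usage: a star with 6 leaves and k = 2 kernelizes to the empty graph
    star = [(0, i) for i in range(1, 7)]
    print(vertex_cover_kernel(star, 2))
    ```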

  4. Chemical fields during Southeast Nexus (SENEX) field experiment and design of verification metrics for efficacy of capturing wild fire emissions

    NASA Astrophysics Data System (ADS)

    Lee, P.

    2016-12-01

    Wildfires are commonplace in North America. Air pollution resulting from wildfires poses a significant risk to human health and crops. The pollutants alter the vertical distribution of many atmospheric constituents including O3 and many fine particulate (PM) species. Compared to anthropogenic emissions of air pollutants, emissions from wildfires are largely uncontrolled and unpredictable. Therefore, quantitatively describing wildfire emissions and their contributions to air pollution remains a substantial challenge for atmospheric modelers and air quality forecasters. In this study, we investigated the modification and redistribution of atmospheric composition within the Conterminous U.S. (CONUS) by wildfire plumes originating within and outside of the CONUS. We used the National Air Quality Forecasting Capability (NAQFC) to conduct the investigation. NAQFC uses dynamic lateral chemical boundary conditions derived from the National Weather Service experimental global aerosol tracer model accounting for intrusion of fire-associated aerosol species. Within CONUS, the NAQFC derives both gaseous and aerosol wildfire-associated species from the National Environmental Satellite, Data, and Information Service (NESDIS) hazard mapping system (HMS) hot-spot detection, the US Forest Service BlueSky protocol for quantifying fire characteristics, and the US EPA Sparse Matrix Object Kernel Emission (SMOKE) calculation for plume rise. Attributions of both of these wildfire influences inherently reflect the aged plumes that intruded into the CONUS through the model boundaries as well as the fresher emissions from sources within the CONUS. Both emission sources contribute significantly to the vertical structure modification of the atmosphere. We conducted case studies within the fire-active seasons to demonstrate some possible impacts on the vertical structures of O3 and PM species by the wildfire activities.

  5. CALIOP-based Biomass Burning Smoke Plume Injection Height

    NASA Astrophysics Data System (ADS)

    Soja, A. J.; Choi, H. D.; Fairlie, T. D.; Pouliot, G.; Baker, K. R.; Winker, D. M.; Trepte, C. R.; Szykman, J.

    2017-12-01

    Carbon and aerosols are cycled between terrestrial and atmospheric environments during fire events, and these emissions have strong feedbacks to near-field weather, air quality, and longer-term climate systems. Fire severity and burned area are under the control of weather and climate, and fire emissions have the potential to alter numerous land and atmospheric processes that, in turn, feed back to and interact with climate systems (e.g., changes in patterns of precipitation, black/brown carbon deposition on ice/snow, alteration in landscape and atmospheric/cloud albedo). If plume injection height is incorrectly estimated, then the transport and deposition of those emissions will also be incorrect. The height to which smoke is injected governs short- or long-range transport, which influences surface pollution and cloud interaction (altered albedo) and modifies patterns of precipitation (cloud condensation nuclei). We are working with the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) science team and other stakeholder agencies, primarily the Environmental Protection Agency and regional partners, to generate a biomass burning (BB) plume injection height database using multiple platforms, sensors and models (CALIOP, MODIS, NOAA HMS, Langley Trajectory Model). These data have the capacity to provide enhanced smoke plume injection height parameterization in regional, national and international scientific and air quality models. Statistics that link fire behavior and weather to plume rise are crucial for verifying and enhancing plume rise parameterization in local-, regional- and global-scale models used for air quality, chemical transport and climate. Specifically, we will present: (1) a methodology that links BB injection height and CALIOP air parcels to specific fires; (2) the daily evolution of smoke plumes for specific fires; (3) plume transport and deposition on the Greenland Ice Sheet; and (4) a comparison of CALIOP-derived smoke plume injection

  6. Introducing etch kernels for efficient pattern sampling and etch bias prediction

    NASA Astrophysics Data System (ADS)

    Weisbuch, François; Lutich, Andrey; Schatz, Jirka

    2018-01-01

    Successful patterning requires good control of the photolithography and etch processes. While compact litho models, mainly based on rigorous physics, can predict very well the contours printed in photoresist, pure empirical etch models are less accurate and more unstable. Compact etch models are based on geometrical kernels to compute the litho-etch biases that measure the distance between litho and etch contours. The definition of the kernels, as well as the choice of calibration patterns, is critical to get a robust etch model. This work proposes to define a set of independent and anisotropic etch kernels ("internal", "external", "curvature", "Gaussian", "z_profile") designed to represent the finest details of the resist geometry and to characterize precisely the etch bias at any point along a resist contour. By evaluating the etch kernels on various structures, it is possible to map their etch signatures in a multidimensional space and analyze them to find an optimal sampling of structures. The etch kernels evaluated on these structures were combined with experimental etch bias derived from scanning electron microscope contours to train artificial neural networks to predict etch bias. The method applied to contact and line/space layers shows an improvement in etch model prediction accuracy over the standard etch model. This work emphasizes the importance of the etch kernel definition to characterize and predict complex etch effects.

  7. Three-Dimensional Sensitivity Kernels of Z/H Amplitude Ratios of Surface and Body Waves

    NASA Astrophysics Data System (ADS)

    Bao, X.; Shen, Y.

    2017-12-01

    The ellipticity of Rayleigh wave particle motion, or Z/H amplitude ratio, has received increasing attention in inversion for shallow Earth structures. Previous studies of the Z/H ratio assumed one-dimensional (1D) velocity structures beneath the receiver, ignoring the effects of three-dimensional (3D) heterogeneities on wave amplitudes. This simplification may introduce bias in the resulting models. Here we present 3D sensitivity kernels of the Z/H ratio to Vs, Vp, and density perturbations, based on finite-difference modeling of wave propagation in 3D structures and the scattering-integral method. Our full-wave approach overcomes two main issues in previous studies of Rayleigh wave ellipticity: (1) the finite-frequency effects of wave propagation in 3D Earth structures, and (2) isolation of the fundamental mode Rayleigh waves from Rayleigh wave overtones and converted Love waves. In contrast to the 1D depth sensitivity kernels in previous studies, our 3D sensitivity kernels exhibit patterns that vary with azimuths and distances to the receiver. The laterally-summed 3D sensitivity kernels and 1D depth sensitivity kernels, based on the same homogeneous reference model, are nearly identical with small differences that are attributable to the single period of the 1D kernels and a finite period range of the 3D kernels. We further verify the 3D sensitivity kernels by comparing the predictions from the kernels with the measurements from numerical simulations of wave propagation for models with various small-scale perturbations. We also calculate and verify the amplitude kernels for P waves. This study shows that both Rayleigh and body wave Z/H ratios provide vertical and lateral constraints on the structure near the receiver. With seismic arrays, the 3D kernels afford a powerful tool to use the Z/H ratios to obtain accurate and high-resolution Earth models.

  8. Multiple kernel learning using single stage function approximation for binary classification problems

    NASA Astrophysics Data System (ADS)

    Shiju, S.; Sumitra, S.

    2017-12-01

    In this paper, multiple kernel learning (MKL) is formulated as a supervised classification problem. We deal with binary classification data, and hence the data modelling problem involves the computation of two decision boundaries, one related to kernel learning and the other to the input data. In our approach, both are found with the aid of a single cost function by constructing a global reproducing kernel Hilbert space (RKHS) as the direct sum of the RKHSs corresponding to the decision boundaries of kernel learning and input data, and by searching for that function in the global RKHS which can be represented as the direct sum of the decision boundaries under consideration. In our experimental analysis, the proposed model showed superior performance in comparison with the existing two-stage function approximation formulation of MKL, where the decision functions of kernel learning and input data are found separately using two different cost functions. This is because the single-stage representation allows knowledge transfer between the computation procedures for finding the decision boundaries of kernel learning and input data, which in turn boosts the generalisation capacity of the model.

  9. On supervised graph Laplacian embedding CA model & kernel construction and its application

    NASA Astrophysics Data System (ADS)

    Zeng, Junwei; Qian, Yongsheng; Wang, Min; Yang, Yongzhong

    2017-01-01

    There are many methods to construct a kernel from given data attribute information. The Gaussian radial basis function (RBF) kernel is one of the most popular ways to construct a kernel. The key observation is that in real-world data, besides the data attribute information, data label information also exists, which indicates the data class. In order to make use of both data attribute information and data label information, in this work we propose a supervised kernel construction method. Supervised information from training data is integrated into the standard kernel construction process to improve the discriminative property of the resulting kernel. A supervised Laplacian embedding cellular automaton model is another key application developed for two-lane heterogeneous traffic flow with safe distances and large-scale trucks. Based on the properties of traffic flow in China, we re-calibrate the cell length, velocity, random slowing mechanism and lane-change conditions and use simulation tests to study the relationships among the speed, density and flux. The numerical results show that large-scale trucks have great effects on the traffic flow, which are relevant to the proportion of large-scale trucks, the random slowing rate and the number of lane changes.

  10. Dynamic PET Image reconstruction for parametric imaging using the HYPR kernel method

    NASA Astrophysics Data System (ADS)

    Spencer, Benjamin; Qi, Jinyi; Badawi, Ramsey D.; Wang, Guobao

    2017-03-01

    Dynamic PET image reconstruction is a challenging problem because of the ill-conditioned nature of PET and the low counting statistics resulting from short time frames in dynamic imaging. The kernel method for image reconstruction has been developed to improve image reconstruction of low-count PET data by incorporating prior information derived from high-count composite data. In contrast to most of the existing regularization-based methods, the kernel method embeds image prior information in the forward projection model and does not require an explicit regularization term in the reconstruction formula. Inspired by the existing highly constrained back-projection (HYPR) algorithm for dynamic PET image denoising, we propose in this work a new type of kernel that is simpler to implement and further improves the kernel-based dynamic PET image reconstruction. Our evaluation study using a physical phantom scan with synthetic FDG tracer kinetics has demonstrated that the new HYPR kernel-based reconstruction can achieve a better region-of-interest (ROI) bias versus standard deviation trade-off for dynamic PET parametric imaging than the post-reconstruction HYPR denoising method and the previously used nonlocal-means kernel.
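
    To show where the kernel enters the forward model, here is a minimal kernelized MLEM loop in the spirit of the kernel reconstruction framework the paper builds on; the HYPR-specific kernel construction is not reproduced, and the system matrix and kernel below are toy stand-ins:

    ```python
    import numpy as np

    def kernel_mlem(P, K, y, n_iter=50):
        """Kernelized MLEM: the image is represented as x = K @ alpha, so the kernel
        (built from high-count composite data) enters the forward model P @ K."""
        PK = P @ K
        alpha = np.ones(K.shape[1])
        sens = PK.T @ np.ones(P.shape[0])           # sensitivity term (P K)^T 1
        for _ in range(n_iter):
            ybar = PK @ alpha + 1e-12               # expected counts
            alpha *= (PK.T @ (y / ybar)) / sens
        return K @ alpha                            # reconstructed image x = K alpha

    # toy usage: tiny random system matrix, identity-plus-smoothing kernel
    rng = np.random.default_rng(0)
    n_pix, n_det = 64, 96
    P = rng.uniform(0, 1, size=(n_det, n_pix))
    K = 0.8 * np.eye(n_pix) + 0.2 * rng.dirichlet(np.ones(n_pix), size=n_pix)
    x_true = rng.gamma(2.0, 1.0, size=n_pix)
    y = rng.poisson(P @ x_true).astype(float)
    x_hat = kernel_mlem(P, K, y)
    ```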

  11. Pollen source effects on growth of kernel structures and embryo chemical compounds in maize.

    PubMed

    Tanaka, W; Mantese, A I; Maddonni, G A

    2009-08-01

    Previous studies have reported effects of pollen source on the oil concentration of maize (Zea mays) kernels through modifications to both the embryo/kernel ratio and embryo oil concentration. The present study expands upon previous analyses by addressing pollen source effects on the growth of kernel structures (i.e. pericarp, endosperm and embryo), allocation of embryo chemical constituents (i.e. oil, protein, starch and soluble sugars), and the anatomy and histology of the embryos. Maize kernels with different oil concentration were obtained from pollinations with two parental genotypes of contrasting oil concentration. The dynamics of the growth of kernel structures and allocation of embryo chemical constituents were analysed during the post-flowering period. Mature kernels were dissected to study the anatomy (embryonic axis and scutellum) and histology [cell number and cell size of the scutellums, presence of sub-cellular structures in scutellum tissue (starch granules, oil and protein bodies)] of the embryos. Plants of all crosses exhibited a similar kernel number and kernel weight. Pollen source modified neither the growth period of kernel structures, nor pericarp growth rate. By contrast, pollen source determined a trade-off between embryo and endosperm growth rates, which impacted on the embryo/kernel ratio of mature kernels. Modifications to the embryo size were mediated by scutellum cell number. Pollen source also affected (P < 0.01) allocation of embryo chemical compounds. Negative correlations among embryo oil concentration and those of starch (r = 0.98, P < 0.01) and soluble sugars (r = 0.95, P < 0.05) were found. Coincidently, embryos with low oil concentration had an increased (P < 0.05-0.10) scutellum cell area occupied by starch granules and fewer oil bodies. The effects of pollen source on both embryo/kernel ratio and allocation of embryo chemicals seems to be related to the early established sink strength (i.e. sink size and sink activity) of the

  12. Reconstruction of noisy and blurred images using blur kernel

    NASA Astrophysics Data System (ADS)

    Ellappan, Vijayan; Chopra, Vishal

    2017-11-01

    Blur is common in many digital images. It can be caused by motion of the camera or of objects in the scene. In this work we propose a new method for deblurring images that uses sparse representation to identify the blur kernel. By analyzing the image coordinates in a coarse-to-fine manner, we estimate the kernel-based image coordinates and, from that observation, obtain the motion angle of the shaken or blurred image. We then calculate the length of the motion kernel using the Radon transform and a Fourier analysis of the image, and apply the Lucy-Richardson algorithm, a non-blind deconvolution (NBID) algorithm, to obtain a cleaner, less noisy output image. All of these operations are performed in the MATLAB IDE.
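
    The non-blind restoration step can be sketched as follows, assuming the motion angle and length have already been estimated (the Radon/Fourier estimation step described above is not reproduced); this is a generic Python illustration rather than the MATLAB implementation used in the paper:

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def motion_psf(length, angle_deg, size=31):
        """Linear motion-blur kernel for a given length (pixels) and angle (degrees)."""
        psf = np.zeros((size, size))
        c = size // 2
        theta = np.deg2rad(angle_deg)
        for t in np.linspace(-length / 2, length / 2, 4 * int(length) + 1):
            r = int(round(c + t * np.sin(theta)))
            col = int(round(c + t * np.cos(theta)))
            if 0 <= r < size and 0 <= col < size:
                psf[r, col] = 1.0
        return psf / psf.sum()

    def richardson_lucy(blurred, psf, n_iter=30):
        """Non-blind Richardson-Lucy deconvolution (the NBID step described above)."""
        estimate = np.full_like(blurred, blurred.mean())
        psf_mirror = psf[::-1, ::-1]
        for _ in range(n_iter):
            relative_blur = blurred / (fftconvolve(estimate, psf, mode="same") + 1e-12)
            estimate *= fftconvolve(relative_blur, psf_mirror, mode="same")
        return estimate

    # toy usage: blur a synthetic image, then deblur with the known motion kernel
    rng = np.random.default_rng(0)
    image = np.zeros((128, 128)); image[40:90, 40:90] = 1.0
    psf = motion_psf(length=15, angle_deg=30)
    blurred = fftconvolve(image, psf, mode="same") + 0.01 * rng.normal(size=image.shape)
    restored = richardson_lucy(np.clip(blurred, 1e-6, None), psf)
    ```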

  13. Novel near-infrared sampling apparatus for single kernel analysis of oil content in maize.

    PubMed

    Janni, James; Weinstock, B André; Hagen, Lisa; Wright, Steve

    2008-04-01

    A method of rapid, nondestructive chemical and physical analysis of individual maize (Zea mays L.) kernels is needed for the development of high value food, feed, and fuel traits. Near-infrared (NIR) spectroscopy offers a robust nondestructive method of trait determination. However, traditional NIR bulk sampling techniques cannot be applied successfully to individual kernels. Obtaining optimized single kernel NIR spectra for applied chemometric predictive analysis requires a novel sampling technique that can account for the heterogeneous forms, morphologies, and opacities exhibited in individual maize kernels. In this study such a novel technique is described and compared to less effective means of single kernel NIR analysis. Results of the application of a partial least squares (PLS) derived model for predictive determination of percent oil content per individual kernel are shown.

  14. OVOC Emissions and Atmospheric Transformations.

    NASA Astrophysics Data System (ADS)

    Yokelson, R. J.; Christian, T. J.; Bertschi, I. T.; Ward, D. E.; Field, R. J.; Hobbs, P. V.; Goode, J.; Mason, S.; Susott, R.; Babbitt, R.; Hao, W. M.

    2002-12-01

    We quantified the main emissions from a few vegetation samples and many biomass fires using ground-based, open-path FTIR and airborne, closed-cell FTIR. The two instruments have been rigorously compared to each other and to PTR-MS and canister sampling. OVOC are major emissions from plants. OVOC account for about 70 percent of NMOC from savanna fires (the largest type of biomass burning) and 70-80 percent of NMOC from production and use of domestic biofuels (the second largest type of biomass burning). A table of average biofuel emissions is presented. Data from laboratory and free-burning fires, obtained from Alaska to South Africa, are used to develop equations that predict OVOC emissions from a wide variety of global fires. The impact of OVOC on smoke plume chemistry and the post-emission transformations of OVOC were investigated with two models. Addition of HCHO alone to the simple chemistry used in some global models dramatically reduces NOx lifetime and speeds up O3 formation rates in plumes. A detailed model verifies these effects and shows that OVOC profoundly affect formation of HOx, peroxide, and nitrogen reservoir species. The modeled photochemical transformations of OVOC are diverse, but some key pathways are unknown. We observed rapid production of O3 and additional OVOC, along with OH levels of 1.7E7, in smoke plumes in Alaska and Africa, all reasonably consistent with model predictions. In addition, we found that cloud processing caused large post-emission changes in smoke trace gases including removal of nearly all methanol, a decrease in acetic acid, and a large increase in HCHO. These observations suggest that OVOC could react in cloud droplets and lead to production of modified aerosol. In addition, transport of OVOC by deep convection may be associated with large effects not explained by solubility alone.

  15. Investigating smoke's influence on primary production throughout the Amazon

    NASA Astrophysics Data System (ADS)

    Flanner, M. G.; Mahowald, N. M.; Zender, C. S.; Randerson, J. T.; Tosca, M. G.

    2007-12-01

    Smoke from annual burning in the Amazon causes a large reduction in surface insolation and increases the diffuse fraction of photosynthetically-active radiation (PAR). These effects have competing influences on gross primary production (GPP). Recent studies indicate that the sign of the net influence depends on aerosol optical depth, but the magnitude of smoke's effect on continental-scale carbon cycling is very poorly constrained and may constitute an important term in fire's net impact on carbon storage. To investigate widespread effects of Amazon smoke on surface radiation properties, we apply a version of the NCAR Community Atmosphere Model with prognostic aerosol transport, driven with re-analysis winds. Carbon aerosol emissions are derived from the Global Fire Emissions Database (GFED). We use AERONET observations to identify model biases in aerosol optical depth, single-scatter albedo, and surface radiative forcing, and prescribe new aerosol optical properties based on field observations to improve model agreement with AERONET data. Finally, we quantify a potential range of smoke-induced change in large-scale GPP based on: 1) ground measurements of GPP in the Amazon as a function of aerosol optical depth and diffuse fraction of PAR, and 2) empirical functions of ecosystem-scale photosynthesis rates currently employed in models such as the Community Land Model (CLM).

  16. Structured Kernel Subspace Learning for Autonomous Robot Navigation.

    PubMed

    Kim, Eunwoo; Choi, Sungjoon; Oh, Songhwai

    2018-02-14

    This paper considers two important problems for autonomous robot navigation in a dynamic environment, where the goal is to predict pedestrian motion and control a robot with the prediction for safe navigation. While there are several methods for predicting the motion of a pedestrian and controlling a robot to avoid incoming pedestrians, it is still difficult to safely navigate in a dynamic environment due to challenges such as the varying quality and complexity of training data with unwanted noise. This paper addresses these challenges simultaneously by proposing a robust kernel subspace learning algorithm based on the recent advances in nuclear-norm and l1-norm minimization. We model the motion of a pedestrian and the robot controller using Gaussian processes. The proposed method efficiently approximates a kernel matrix used in Gaussian process regression by learning a low-rank structured matrix (with symmetric positive semi-definiteness) to find an orthogonal basis, which eliminates the effects of erroneous and inconsistent data. Based on structured kernel subspace learning, we propose a robust motion model and motion controller for safe navigation in dynamic environments. We evaluate the proposed robust kernel learning in various tasks, including regression, motion prediction, and motion control problems, and demonstrate that the proposed learning-based systems are robust against outliers and outperform existing regression and navigation methods.

  17. An Adaptive Genetic Association Test Using Double Kernel Machines

    PubMed Central

    Zhan, Xiang; Epstein, Michael P.; Ghosh, Debashis

    2014-01-01

    Recently, gene set-based approaches have become very popular in gene expression profiling studies for assessing how genetic variants are related to disease outcomes. Since most genes are not differentially expressed, existing pathway tests considering all genes within a pathway suffer from considerable noise and power loss. Moreover, for a differentially expressed pathway, it is of interest to select important genes that drive the effect of the pathway. In this article, we propose an adaptive association test using double kernel machines (DKM), which can both select important genes within the pathway as well as test for the overall genetic pathway effect. This DKM procedure first uses the garrote kernel machines (GKM) test for the purposes of subset selection and then the least squares kernel machine (LSKM) test for testing the effect of the subset of genes. An appealing feature of the kernel machine framework is that it can provide a flexible and unified method for multi-dimensional modeling of the genetic pathway effect allowing for both parametric and nonparametric components. This DKM approach is illustrated with application to simulated data as well as to data from a neuroimaging genetics study. PMID:26640602

  18. An Adaptive Genetic Association Test Using Double Kernel Machines.

    PubMed

    Zhan, Xiang; Epstein, Michael P; Ghosh, Debashis

    2015-10-01

    Recently, gene set-based approaches have become very popular in gene expression profiling studies for assessing how genetic variants are related to disease outcomes. Since most genes are not differentially expressed, existing pathway tests considering all genes within a pathway suffer from considerable noise and power loss. Moreover, for a differentially expressed pathway, it is of interest to select important genes that drive the effect of the pathway. In this article, we propose an adaptive association test using double kernel machines (DKM), which can both select important genes within the pathway as well as test for the overall genetic pathway effect. This DKM procedure first uses the garrote kernel machines (GKM) test for the purposes of subset selection and then the least squares kernel machine (LSKM) test for testing the effect of the subset of genes. An appealing feature of the kernel machine framework is that it can provide a flexible and unified method for multi-dimensional modeling of the genetic pathway effect allowing for both parametric and nonparametric components. This DKM approach is illustrated with application to simulated data as well as to data from a neuroimaging genetics study.

  19. Salt stress reduces kernel number of corn by inhibiting plasma membrane H+-ATPase activity.

    PubMed

    Jung, Stephan; Hütsch, Birgit W; Schubert, Sven

    2017-04-01

    Salt stress affects yield formation of corn (Zea mays L.) at various physiological levels resulting in an overall grain yield decrease. In this study we investigated how salt stress affects kernel development of two corn cultivars (cvs. Pioneer 3906 and Fabregas) at and shortly after pollination. In an earlier study, we found an accumulation of hexoses in the kernel tissue. Therefore, it was hypothesized that hexose uptake into developing endosperm and embryo might be inhibited. Hexoses are transported into the developing endosperm by carriers localized in the plasma membrane (PM). The transport is driven by the pH gradient which is built up by the PM H + -ATPase. It was investigated whether the PM H + -ATPase activity in developing corn kernels was inhibited by salt stress, which would cause a lower pH gradient resulting in impaired hexose import and finally in kernel abortion. Corn grown under control and salt stress conditions was harvested 0 and 2 days after pollination (DAP). Under salt stress sucrose and hexose concentrations in kernel tissue were higher 0 and 2 DAP. Kernel PM H + -ATPase activity was not affected at 0 DAP, but it was reduced at 2 DAP. This is in agreement with the finding, that kernel growth and thus kernel setting was not affected in the salt stress treatment at pollination, but it was reduced 2 days later. It is concluded that inhibition of PM H + -ATPase under salt stress impaired the energization of hexose transporters into the cells, resulting in lower kernel growth and finally in kernel abortion. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  20. Adaptive kernel regression for freehand 3D ultrasound reconstruction

    NASA Astrophysics Data System (ADS)

    Alshalalfah, Abdel-Latif; Daoud, Mohammad I.; Al-Najar, Mahasen

    2017-03-01

    Freehand three-dimensional (3D) ultrasound imaging enables low-cost and flexible 3D scanning of arbitrary-shaped organs, where the operator can freely move a two-dimensional (2D) ultrasound probe to acquire a sequence of tracked cross-sectional images of the anatomy. Often, the acquired 2D ultrasound images are irregularly and sparsely distributed in the 3D space. Several 3D reconstruction algorithms have been proposed to synthesize 3D ultrasound volumes based on the acquired 2D images. A challenging task during the reconstruction process is to preserve the texture patterns in the synthesized volume and ensure that all gaps in the volume are correctly filled. This paper presents an adaptive kernel regression algorithm that can effectively reconstruct high-quality freehand 3D ultrasound volumes. The algorithm employs a kernel regression model that enables nonparametric interpolation of the voxel gray-level values. The kernel size of the regression model is adaptively adjusted based on the characteristics of the voxel that is being interpolated. In particular, when the algorithm is employed to interpolate a voxel located in a region with dense ultrasound data samples, the size of the kernel is reduced to preserve the texture patterns. On the other hand, the size of the kernel is increased in areas that include large gaps to enable effective gap filling. The performance of the proposed algorithm was compared with seven previous interpolation approaches by synthesizing freehand 3D ultrasound volumes of a benign breast tumor. The experimental results show that the proposed algorithm outperforms the other interpolation approaches.
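
    A minimal sketch of the density-adaptive idea described above: Nadaraya-Watson interpolation of a voxel value from scattered samples, with the kernel bandwidth tied to local sampling density (small where data are dense, large where gaps occur). The k-nearest-neighbor bandwidth rule and all parameters are illustrative assumptions, not the paper's exact adaptation law.

```python
# Density-adaptive kernel interpolation of one voxel from scattered samples.
import numpy as np

def adaptive_interpolate(voxel_xyz, sample_xyz, sample_val, k=10):
    d = np.linalg.norm(sample_xyz - voxel_xyz, axis=1)
    # Adapt the kernel size to local sampling density: small where samples are
    # dense (preserve texture), large where they are sparse (fill gaps).
    h = np.sort(d)[min(k, len(d) - 1)]
    w = np.exp(-0.5 * (d / max(h, 1e-6)) ** 2)
    return float(w @ sample_val / w.sum())

rng = np.random.default_rng(0)
pts = rng.uniform(0, 1, size=(500, 3))           # tracked 2-D pixels mapped into 3-D space
vals = np.cos(6 * pts[:, 0]) + 0.05 * rng.standard_normal(500)
print(adaptive_interpolate(np.array([0.5, 0.5, 0.5]), pts, vals))
```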

  1. A locally adaptive kernel regression method for facies delineation

    NASA Astrophysics Data System (ADS)

    Fernàndez-Garcia, D.; Barahona-Palomo, M.; Henri, C. V.; Sanchez-Vila, X.

    2015-12-01

    Facies delineation is defined as the separation of geological units with distinct intrinsic characteristics (grain size, hydraulic conductivity, mineralogical composition). A major challenge in this area stems from the fact that only a few scattered pieces of hydrogeological information are available to delineate geological facies. Several methods to delineate facies are available in the literature, ranging from those based only on existing hard data, to those including secondary data or external knowledge about sedimentological patterns. This paper describes a methodology to use kernel regression methods as an effective tool for facies delineation. The method uses both the spatial locations and the actual sampled values to produce, for each individual hard data point, a locally adaptive steering kernel function, self-adjusting the principal directions of the local anisotropic kernels to the direction of highest local spatial correlation. The method is shown to outperform the nearest neighbor classification method in a number of synthetic aquifers whenever the available number of hard data is small and randomly distributed in space. In the case of exhaustive sampling, the steering kernel regression method converges to the true solution. Simulations run on a suite of synthetic examples are used to explore the selection of kernel parameters in typical field settings. It is shown that, in practice, a rule of thumb can be used to obtain suboptimal results. The performance of the method is demonstrated to significantly improve when external information regarding facies proportions is incorporated. Remarkably, the method allows for a reasonable reconstruction of the facies connectivity patterns, shown in terms of breakthrough curve performance.
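
    The following is a rough, simplified illustration (not a reimplementation of the paper's method) of a locally adaptive anisotropic kernel classifier: each hard data point gets a steering matrix estimated from the local covariance of nearby same-facies points, and a target location is assigned the facies with the largest kernel-weighted vote. The neighborhood size, ridge term, and synthetic field are assumptions.

```python
# Locally adaptive (steering-like) anisotropic kernel voting for facies labels.
import numpy as np

def local_inverse_covariances(xy, facies, k=8, ridge=1e-3):
    inv_covs = []
    for i in range(len(xy)):
        same = np.where(facies == facies[i])[0]
        d = np.linalg.norm(xy[same] - xy[i], axis=1)
        nbr = xy[same[np.argsort(d)[:k]]]
        C = np.cov(nbr.T) + ridge * np.eye(2)    # local steering matrix for point i
        inv_covs.append(np.linalg.inv(C))
    return inv_covs

def classify(x0, xy, facies, inv_covs):
    scores = {}
    for xi, Ci, f in zip(xy, inv_covs, facies):
        dx = x0 - xi
        w = np.exp(-0.5 * dx @ Ci @ dx)          # anisotropic Gaussian weight
        scores[f] = scores.get(f, 0.0) + w
    return max(scores, key=scores.get)

rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(40, 2))            # scattered hard data locations
facies = (xy[:, 0] + 0.3 * xy[:, 1] > 6).astype(int)   # synthetic two-facies field
inv_covs = local_inverse_covariances(xy, facies)
print(classify(np.array([5.0, 5.0]), xy, facies, inv_covs))
```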

  2. Kernel analysis in TeV gamma-ray selection

    NASA Astrophysics Data System (ADS)

    Moriarty, P.; Samuelson, F. W.

    2000-06-01

    We discuss the use of kernel analysis as a technique for selecting gamma-ray candidates in Atmospheric Cherenkov astronomy. The method is applied to observations of the Crab Nebula and Markarian 501 recorded with the Whipple 10 m Atmospheric Cherenkov imaging system, and the results are compared with the standard Supercuts analysis. Since kernel analysis is computationally intensive, we examine approaches to reducing the computational load. Extension of the technique to estimate the energy of the gamma-ray primary is considered.

  3. Surface and top-of-atmosphere radiative feedback kernels for CESM-CAM5

    NASA Astrophysics Data System (ADS)

    Pendergrass, Angeline G.; Conley, Andrew; Vitt, Francis M.

    2018-02-01

    Radiative kernels at the top of the atmosphere are useful for decomposing changes in atmospheric radiative fluxes due to feedbacks from atmosphere and surface temperature, water vapor, and surface albedo. Here we describe and validate radiative kernels calculated with the large-ensemble version of CAM5, CESM1.1.2, at the top of the atmosphere and the surface. Estimates of the radiative forcing from greenhouse gases and aerosols in RCP8.5 in the CESM large-ensemble simulations are also diagnosed. As an application, feedbacks are calculated for the CESM large ensemble. The kernels are freely available at https://doi.org/10.5065/D6F47MT6, and accompanying software can be downloaded from https://github.com/apendergrass/cam5-kernels.
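
    For readers unfamiliar with how such kernels are applied, the hedged sketch below shows the standard kernel-times-response calculation of a temperature feedback (vertical sum of kernel multiplied by the temperature change, normalized by surface warming). Array shapes, variable names, and the omission of layer-thickness weighting are simplifications for illustration only and do not describe the released dataset's format.

```python
# Illustrative use of a radiative kernel: feedback = sum(kernel * response) / dTs.
import numpy as np

def temperature_feedback(kernel_Ta, dTa, dTs):
    # kernel_Ta: W m^-2 K^-1 per model level (lev, lat, lon); dTa: K change per level.
    # NOTE: real kernels are typically given per 100 hPa and require
    # layer-thickness weighting, which is omitted here for brevity.
    dR = (kernel_Ta * dTa).sum(axis=0)           # vertical sum of kernel * response
    return dR / dTs                              # W m^-2 per K of surface warming

lev, lat, lon = 30, 4, 8
rng = np.random.default_rng(0)
kernel_Ta = -0.1 * rng.random((lev, lat, lon))   # extra emission to space per K of warming
dTa = 1.0 + 0.5 * rng.random((lev, lat, lon))    # simulated atmospheric warming
print(temperature_feedback(kernel_Ta, dTa, dTs=1.2).mean())
```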

  4. Kernel-Correlated Levy Field Driven Forward Rate and Application to Derivative Pricing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bo Lijun; Wang Yongjin; Yang Xuewei, E-mail: xwyangnk@yahoo.com.cn

    2013-08-01

    We propose a term structure of forward rates driven by a kernel-correlated Levy random field under the HJM framework. The kernel-correlated Levy random field is composed of a kernel-correlated Gaussian random field and a centered Poisson random measure. We shall give a criterion to preclude arbitrage under the risk-neutral pricing measure. As applications, an interest rate derivative with general payoff functional is priced under this pricing measure.

  5. Smoke-Point Properties of Nonbuoyant Round Laminar Jet Diffusion Flames

    NASA Technical Reports Server (NTRS)

    Urban, D. L.; Yuan, Z.-G.; Sunderland, R. B.; Lin, K.-C.; Dai, Z.; Faeth, G. M.

    2000-01-01

    The laminar smoke-point properties of nonbuoyant round laminar jet diffusion flames were studied emphasizing results from long duration (100-230 s) experiments at microgravity carried out on orbit in the Space Shuttle Columbia. Experimental conditions included ethylene- and propane-fueled flames burning in still air at an ambient temperature of 300 K, initial jet exit diameters of 1.6 and 2.7 mm, jet exit velocities of 170-1630 mm/s, jet exit Reynolds numbers of 46-172, characteristic flame residence times of 40-302 ms, and luminous flame lengths of 15-63 mm. The onset of laminar smoke-point conditions involved two flame configurations: closed-tip flames with first soot emissions along the flame axis and open-tip flames with first soot emissions from an annular ring about the flame axis. Open-tip flames were observed at large characteristic flame residence times with the onset of soot emissions associated with radiative quenching near the flame tip; nevertheless, unified correlations of laminar smoke-point properties were obtained that included both flame configurations. Flame lengths at laminar smoke-point conditions were well-correlated in terms of a corrected fuel flow rate suggested by a simplified analysis of flame shape. The present steady and nonbuoyant flames emitted soot more readily than earlier tests of nonbuoyant flames at microgravity using ground-based facilities and of buoyant flames at normal gravity due to reduced effects of unsteadiness, flame disturbances and buoyant motion. For example, laminar smoke-point flame lengths from ground-based microgravity measurements were up to 2.3 times longer and from buoyant flame measurements were up to 6.4 times longer than the present measurements at comparable conditions. Finally, present laminar smoke-point flame lengths were roughly inversely proportional to pressure, which is a somewhat slower variation than observed during earlier tests both at microgravity using ground-based facilities and at normal gravity.

  6. Smoke emissions due to burning of green waste in the Mediterranean area: Influence of fuel moisture content and fuel mass

    NASA Astrophysics Data System (ADS)

    Tihay-Felicelli, V.; Santoni, P. A.; Gerandi, G.; Barboni, T.

    2017-06-01

    The aim of this study was to investigate emission characteristics in relation to differences in fuel moisture content (FMC) and initial dry mass. For this purpose, branches and twigs with leaves of Cistus monspeliensis were burned in a Large Scale Heat Release apparatus coupled to a Fourier Transform Infrared Spectrometer. A smoke analysis was conducted and the results highlighted the presence of CO2, H2O, CO, CH4, NO, NO2, NH3, SO2, and non-methane organic compounds (NMOC). CO2, NO, and NO2 species are mainly released during flaming combustion, whereas CO, CH4, NH3, and NMOC are emitted during both flaming and smoldering combustion. The emission of these compounds during flaming combustion is due to a rich fuel to air mixture, leading to incomplete combustion. The fuel moisture content and initial dry mass influence the flame residence time, the duration of smoldering combustion, the combustion efficiency, and the emission factors. By increasing the initial dry mass, the emission factors of NO, NO2, and CO2 decrease, whereas those of CO and CH4 increase. The increase of FMC induces an increase of the emission factors of CO, CH4, NH3, NMOC, and aerosols, and a decrease of those of CO2, NO, and NO2. Increasing fuel moisture content reduces fuel consumption, duration of smoldering, and peak heat release rate, but simultaneously increases the duration of propagation within the packed bed and the flame residence time. Increasing the initial dry mass causes all of the previous combustion parameters to increase. These findings have implications for modeling biomass burning emissions and impacts.

  7. SOME ENGINEERING PROPERTIES OF SHELLED AND KERNEL TEA (Camellia sinensis) SEEDS.

    PubMed

    Altuntas, Ebubekir; Yildiz, Merve

    2017-01-01

    Camellia sinensis is the source of tea leaves and it is an economic crop now grown around the world. Tea seed oil has been used for cooking in China and other Asian countries for more than a thousand years. Tea is the most widely consumed beverage after water in the world. It is mainly produced in Asia and central Africa, and exported throughout the world. Some engineering properties (size dimensions, sphericity, volume, bulk and true densities, friction coefficient, colour characteristics, and mechanical behaviour as rupture force) of shelled and kernel tea (Camellia sinensis) seeds were determined in this study. This research was carried out for shelled and kernel tea seeds. The shelled tea seeds used in this study were obtained from the East-Black Sea Tea Cooperative Institution in Rize city of Turkey. Shelled and kernel tea seeds were characterized as large and small sizes. The average geometric mean diameter and seed mass of the shelled tea seeds were 15.8 mm and 1.47 g for the large size and 10.7 mm and 0.49 g for the small size, while those of the kernel tea seeds were 11.8 mm and 0.97 g for the large size and 8 mm and 0.31 g for the small size, respectively. The sphericity, surface area, and volume values were found to be higher for the large size than the small size for both the shelled and kernel tea samples. The colour intensity (Chroma) of the shelled tea seeds was found between 59.31 and 64.22 for the large size, while the chroma values of the kernel tea seeds were found between 56.04 and 68.34 for the large size, respectively. The rupture force values of kernel tea seeds were higher than those of shelled tea seeds for the large size along the X axis, whereas the rupture force values along the X axis were higher than along the Y axis for the large size of shelled tea seeds. The static coefficients of friction of shelled and kernel tea seeds for the large and small sizes had higher values for rubber than for the other friction surfaces. Some engineering properties, such as geometric mean diameter, sphericity, volume, bulk

  8. Learning a peptide-protein binding affinity predictor with kernel ridge regression

    PubMed Central

    2013-01-01

    Background The cellular function of a vast majority of proteins is performed through physical interactions with other biomolecules, which, most of the time, are other proteins. Peptides represent templates of choice for mimicking a secondary structure in order to modulate protein-protein interaction. They are thus an interesting class of therapeutics since they also display strong activity, high selectivity, low toxicity and few drug-drug interactions. Furthermore, predicting peptides that would bind to specific MHC alleles would be of tremendous benefit to improve vaccine-based therapy and possibly generate antibodies with greater affinity. Modern computational methods have the potential to accelerate and lower the cost of drug and vaccine discovery by selecting potential compounds for testing in silico prior to biological validation. Results We propose a specialized string kernel for small bio-molecules, peptides and pseudo-sequences of binding interfaces. The kernel incorporates physico-chemical properties of amino acids and elegantly generalizes eight kernels, including the Oligo, the Weighted Degree, the Blended Spectrum, and the Radial Basis Function kernels. We provide a low complexity dynamic programming algorithm for the exact computation of the kernel and a linear time algorithm for its approximation. Combined with kernel ridge regression and SupCK, a novel binding pocket kernel, the proposed kernel yields biologically relevant and good prediction accuracy on the PepX database. For the first time, a machine learning predictor is capable of predicting the binding affinity of any peptide to any protein with reasonable accuracy. The method was also applied to both single-target and pan-specific Major Histocompatibility Complex class II benchmark datasets and three Quantitative Structure Affinity Model benchmark datasets. Conclusion On all benchmarks, our method significantly (p-value ≤ 0.057) outperforms the current state-of-the-art methods at predicting
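
    A minimal kernel ridge regression sketch in the spirit of the approach above; an RBF kernel on numeric peptide descriptors stands in for the specialized string kernel and binding pocket kernel, which are not reproduced here, and all parameter values are illustrative.

```python
# Kernel ridge regression: fit dual coefficients, predict with the kernel.
import numpy as np

def rbf(X, Y, gamma=0.5):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, lam=1e-2):
    K = rbf(X, X)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)   # dual coefficients alpha

def krr_predict(Xtrain, alpha, Xtest):
    return rbf(Xtest, Xtrain) @ alpha

rng = np.random.default_rng(0)
X = rng.standard_normal((80, 5))          # e.g. physico-chemical peptide descriptors
y = X[:, 0] ** 2 - X[:, 1] + 0.1 * rng.standard_normal(80)   # toy binding affinities
alpha = krr_fit(X, y)
print(krr_predict(X, alpha, X[:3]))
```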

  9. Sliding Window Generalized Kernel Affine Projection Algorithm Using Projection Mappings

    NASA Astrophysics Data System (ADS)

    Slavakis, Konstantinos; Theodoridis, Sergios

    2008-12-01

    Very recently, a solution to the kernel-based online classification problem has been given by the adaptive projected subgradient method (APSM). The developed algorithm can be considered as a generalization of a kernel affine projection algorithm (APA) and the kernel normalized least mean squares (NLMS). Furthermore, sparsification of the resulting kernel series expansion was achieved by imposing a closed ball (convex set) constraint on the norm of the classifiers. This paper presents another sparsification method for the APSM approach to the online classification task by generating a sequence of linear subspaces in a reproducing kernel Hilbert space (RKHS). To cope with the inherent memory limitations of online systems and to embed tracking capabilities to the design, an upper bound on the dimension of the linear subspaces is imposed. The underlying principle of the design is the notion of projection mappings. Classification is performed by metric projection mappings, sparsification is achieved by orthogonal projections, while the online system's memory requirements and tracking are attained by oblique projections. The resulting sparsification scheme shows strong similarities with the classical sliding window adaptive schemes. The proposed design is validated by the adaptive equalization problem of a nonlinear communication channel, and is compared with classical and recent stochastic gradient descent techniques, as well as with the APSM's solution where sparsification is performed by a closed ball constraint on the norm of the classifiers.
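
    As a loose illustration of bounding an online kernel expansion with a sliding window (the flavor of the memory constraint discussed above), the sketch below uses a plain kernel LMS-style update over a fixed-size dictionary; it is not the projection-based APSM algorithm of the paper, and the step size, kernel width, and window length are arbitrary choices.

```python
# Online kernel learner whose expansion is capped by a sliding window.
import numpy as np

class SlidingWindowKernelLMS:
    def __init__(self, gamma=1.0, step=0.2, window=50):
        self.gamma, self.step, self.window = gamma, step, window
        self.centers, self.coeffs = [], []

    def _k(self, x, y):
        return np.exp(-self.gamma * np.sum((x - y) ** 2))

    def predict(self, x):
        return sum(a * self._k(x, c) for a, c in zip(self.coeffs, self.centers))

    def update(self, x, y):
        err = y - self.predict(x)
        self.centers.append(x)
        self.coeffs.append(self.step * err)      # grow the kernel expansion
        if len(self.centers) > self.window:      # discard the oldest term
            self.centers.pop(0)
            self.coeffs.pop(0)

rng = np.random.default_rng(0)
model = SlidingWindowKernelLMS()
for _ in range(300):
    x = rng.uniform(-1, 1, size=2)
    y = np.sign(x[0] * x[1])                     # nonlinear decision boundary
    model.update(x, y)
print(np.sign(model.predict(np.array([0.5, 0.5]))))
```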

  10. Detoxification of Jatropha curcas kernel cake by a novel Streptomyces fimicarius strain.

    PubMed

    Wang, Xing-Hong; Ou, Lingcheng; Fu, Liang-Liang; Zheng, Shui; Lou, Ji-Dong; Gomes-Laranjo, José; Li, Jiao; Zhang, Changhe

    2013-09-15

    A huge amount of kernel cake, which contains a variety of toxins including phorbol esters (tumor promoters), is projected to be generated yearly in the near future by the Jatropha biodiesel industry. We showed that the kernel cake strongly inhibited plant seed germination and root growth and was highly toxic to carp fingerlings, even though phorbol esters were undetectable by HPLC. Therefore it must be detoxified before disposal to the environment. A mathematic model was established to estimate the general toxicity of the kernel cake by determining the survival time of carp fingerling. A new strain (Streptomyces fimicarius YUCM 310038) capable of degrading the total toxicity by more than 97% in a 9-day solid state fermentation was screened out from 578 strains including 198 known strains and 380 strains isolated from air and soil. The kernel cake fermented by YUCM 310038 was nontoxic to plants and carp fingerlings and significantly promoted tobacco plant growth, indicating its potential to transform the toxic kernel cake to bio-safe animal feed or organic fertilizer to remove the environmental concern and to reduce the cost of the Jatropha biodiesel industry. Microbial strain profile essential for the kernel cake detoxification was discussed. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Biomass Burning Smoke Climatology of the United States: Implications for Particulate Matter Air Quality.

    PubMed

    Kaulfus, Aaron S; Nair, Udaysankar; Jaffe, Daniel; Christopher, Sundar A; Goodrick, Scott

    2017-10-17

    We utilize the NOAA Hazard Mapping System smoke product for the period of 2005 to 2016 to develop a climatology of smoke occurrence over the Continental United States (CONUS) region and to study the impact of wildland fires on particulate matter air quality at the surface. Our results indicate that smoke is most frequently found over the Great Plains and western states during the summer months. Other hotspots of smoke occurrence are found over state and national parks in the southeast during winter and spring, in the Gulf of Mexico south of the Texas and Louisiana coastline during the spring season, and along the Mississippi River Delta during the fall season. A substantial portion (20%) of the exceedance events of the 24-h federal standard for particulate pollution in the CONUS region occur when smoke is present. If U.S. Environmental Protection Agency regulations continue to reduce anthropogenic emissions, wildland fire emissions will become the major contributor to particulate pollution and exceedance events. In this context, we show that the HMS smoke product is a valuable tool for the analysis of exceptional events caused by wildland fires, and our results indicate that these tools can be valuable for policy and decision makers.

  12. Comparison of Kernel Equating and Item Response Theory Equating Methods

    ERIC Educational Resources Information Center

    Meng, Yu

    2012-01-01

    The kernel method of test equating is a unified approach to test equating with some advantages over traditional equating methods. Therefore, it is important to evaluate in a comprehensive way the usefulness and appropriateness of the Kernel equating (KE) method, as well as its advantages and disadvantages compared with several popular item…

  13. 40 CFR 1033.525 - Smoke testing.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... measure smoke emissions using a full-flow, open path light extinction smokemeter. A light extinction meter... path length equal to the hydraulic diameter. The light extinction meter must meet the requirements of... apertures (or windows and lenses) and on the axis of the light beam. (8) You may use light extinction meters...

  14. 40 CFR 1033.525 - Smoke testing.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... measure smoke emissions using a full-flow, open path light extinction smokemeter. A light extinction meter... path length equal to the hydraulic diameter. The light extinction meter must meet the requirements of... apertures (or windows and lenses) and on the axis of the light beam. (8) You may use light extinction meters...

  15. 40 CFR 1033.525 - Smoke testing.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... measure smoke emissions using a full-flow, open path light extinction smokemeter. A light extinction meter... path length equal to the hydraulic diameter. The light extinction meter must meet the requirements of... apertures (or windows and lenses) and on the axis of the light beam. (8) You may use light extinction meters...

  16. 40 CFR 1033.525 - Smoke testing.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... measure smoke emissions using a full-flow, open path light extinction smokemeter. A light extinction meter... path length equal to the hydraulic diameter. The light extinction meter must meet the requirements of... apertures (or windows and lenses) and on the axis of the light beam. (8) You may use light extinction meters...

  17. A Fast Reduced Kernel Extreme Learning Machine.

    PubMed

    Deng, Wan-Yu; Ong, Yew-Soon; Zheng, Qing-Hua

    2016-04-01

    In this paper, we present a fast and accurate kernel-based supervised algorithm referred to as the Reduced Kernel Extreme Learning Machine (RKELM). In contrast to the work on Support Vector Machine (SVM) or Least Square SVM (LS-SVM), which identifies the support vectors or weight vectors iteratively, the proposed RKELM randomly selects a subset of the available data samples as support vectors (or mapping samples). By avoiding the iterative steps of SVM, significant cost savings in the training process can be readily attained, especially on big datasets. RKELM is established based on a rigorous proof of universal learning involving a reduced kernel-based SLFN. In particular, we prove that RKELM can approximate any nonlinear function accurately given a sufficient number of support vectors. Experimental results on a wide variety of real-world small and large instance size applications in the context of binary classification, multi-class problems, and regression are then reported to show that RKELM can achieve generalization performance competitive with SVM/LS-SVM at only a fraction of the computational effort. Copyright © 2015 Elsevier Ltd. All rights reserved.
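
    A short sketch of the reduced-kernel idea as described in the abstract: draw a random subset of training samples as mapping points, form the rectangular kernel matrix, and solve a regularized least-squares problem for the output weights. The RBF kernel and hyperparameters are illustrative assumptions.

```python
# Reduced-kernel regression: random support subset + regularized least squares.
import numpy as np

def rbf(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def rkelm_train(X, y, n_support=30, lam=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    S = X[rng.choice(len(X), size=n_support, replace=False)]   # random support set
    K = rbf(X, S)                                               # n x n_support kernel matrix
    beta = np.linalg.solve(K.T @ K + lam * np.eye(n_support), K.T @ y)
    return S, beta

def rkelm_predict(S, beta, Xtest):
    return rbf(Xtest, S) @ beta

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = np.sin(X[:, 0]) + X[:, 1] ** 2
S, beta = rkelm_train(X, y)
print(np.abs(rkelm_predict(S, beta, X) - y).mean())   # mean training error
```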

  18. Single aflatoxin contaminated corn kernel analysis with fluorescence hyperspectral image

    NASA Astrophysics Data System (ADS)

    Yao, Haibo; Hruska, Zuzana; Kincaid, Russell; Ononye, Ambrose; Brown, Robert L.; Cleveland, Thomas E.

    2010-04-01

    Aflatoxins are toxic secondary metabolites of the fungi Aspergillus flavus and Aspergillus parasiticus, among others. Aflatoxin contaminated corn is toxic to domestic animals when ingested in feed and is a known carcinogen associated with liver and lung cancer in humans. Consequently, aflatoxin levels in food and feed are regulated by the Food and Drug Administration (FDA) in the US, allowing 20 ppb (parts per billion) limits in food and 100 ppb in feed for interstate commerce. Currently, aflatoxin detection and quantification methods are based on analytical tests including thin-layer chromatography (TLC) and high performance liquid chromatography (HPLC). These analytical tests require the destruction of samples, and are costly and time consuming. Thus, the ability to detect aflatoxin in a rapid, nondestructive way is crucial to the grain industry, particularly to the corn industry. Hyperspectral imaging technology offers a non-invasive approach toward screening for food safety inspection and quality control based on its spectral signature. The focus of this paper is to classify aflatoxin contaminated single corn kernels using fluorescence hyperspectral imagery. Field inoculated corn kernels were used in the study. Contaminated and control kernels under long wavelength ultraviolet excitation were imaged using a visible near-infrared (VNIR) hyperspectral camera. The imaged kernels were chemically analyzed to provide reference information for image analysis. This paper describes a procedure to process corn kernels located in different images for statistical training and classification. Two classification algorithms, Maximum Likelihood and Binary Encoding, were used to classify each corn kernel into "control" or "contaminated" through pixel classification. The Binary Encoding approach had a slightly better performance, with accuracy equal to 87% or 88% when 20 ppb or 100 ppb was used as the classification threshold, respectively.

  19. Validation of smoke plume rise models using ground based lidar

    Treesearch

    Cyle E. Wold; Shawn Urbanski; Vladimir Kovalev; Alexander Petkov; Wei Min Hao

    2010-01-01

    Biomass fires can significantly degrade regional air quality. Plume rise height is one of the critical factors determining the impact of fire emissions on air quality. Plume rise models are used to prescribe the vertical distribution of fire emissions which are critical input for smoke dispersion and air quality models. The poor state of model evaluation is due in...

  20. [Smoking history worldwide--cigarette smoking, passive smoking and smoke free environment in Switzerland].

    PubMed

    Brändli, Otto

    2010-08-01

    After the invention of the cigarette in 1881, the health consequences of active smoking only became fully known in 1964. Since 1986, research findings have allowed increasingly stronger conclusions about the impact of passive smoking on health, especially for lung cancer, cardiovascular and respiratory disease in adults and children, and the sudden infant death syndrome. On the basis of current consumption patterns, approximately 450 million adults will be killed by smoking between 2000 and 2050. At least half of these adults will die between the ages of 30 and 69. Cancer and total deaths due to smoking have so far fallen only in men in high-income countries but will rise globally unless current smokers stop smoking before or during middle age. Higher taxes, regulations on smoking, including 100% smoke-free indoor spaces, and information for consumers could avoid smoking-associated deaths. In 2004, Ireland became the first country worldwide to introduce smoke-free bars and restaurants, with positive effects on compliance, the health of employees, and business. In the first year after their introduction, these policies resulted in a 10-20% reduction of acute coronary events. In Switzerland, smoke-free regulations were first accepted by popular vote in the canton of Ticino in 2006 and since then in 15 more cantons. The smoking rate has dropped from 33% to 27% since 2001.

  1. Multiscale Support Vector Learning With Projection Operator Wavelet Kernel for Nonlinear Dynamical System Identification.

    PubMed

    Lu, Zhao; Sun, Jing; Butts, Kenneth

    2016-02-03

    A giant leap has been made in the past couple of decades with the introduction of kernel-based learning as a mainstay for designing effective nonlinear computational learning algorithms. In view of the geometric interpretation of conditional expectation and the ubiquity of multiscale characteristics in highly complex nonlinear dynamic systems [1]-[3], this paper presents a new orthogonal projection operator wavelet kernel, aiming at developing an efficient computational learning approach for nonlinear dynamical system identification. In the framework of multiresolution analysis, the proposed projection operator wavelet kernel can fulfill the multiscale, multidimensional learning to estimate complex dependencies. The special advantage of the projection operator wavelet kernel developed in this paper lies in the fact that it has a closed-form expression, which greatly facilitates its application in kernel learning. To the best of our knowledge, it is the first closed-form orthogonal projection wavelet kernel reported in the literature. It provides a link between grid-based wavelets and mesh-free kernel-based methods. Simulation studies for identifying the parallel models of two benchmark nonlinear dynamical systems confirm its superiority in model accuracy and sparsity.

  2. Surface and top-of-atmosphere radiative feedback kernels for CESM-CAM5

    DOE PAGES

    Pendergrass, Angeline G.; Conley, Andrew; Vitt, Francis M.

    2018-02-21

    Radiative kernels at the top of the atmosphere are useful for decomposing changes in atmospheric radiative fluxes due to feedbacks from atmosphere and surface temperature, water vapor, and surface albedo. Here we describe and validate radiative kernels calculated with the large-ensemble version of CAM5, CESM1.1.2, at the top of the atmosphere and the surface. Estimates of the radiative forcing from greenhouse gases and aerosols in RCP8.5 in the CESM large-ensemble simulations are also diagnosed. As an application, feedbacks are calculated for the CESM large ensemble. The kernels are freely available at https://doi.org/10.5065/D6F47MT6, and accompanying software can be downloaded from https://github.com/apendergrass/cam5-kernels.

  3. Surface and top-of-atmosphere radiative feedback kernels for CESM-CAM5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pendergrass, Angeline G.; Conley, Andrew; Vitt, Francis M.

    Radiative kernels at the top of the atmosphere are useful for decomposing changes in atmospheric radiative fluxes due to feedbacks from atmosphere and surface temperature, water vapor, and surface albedo. Here we describe and validate radiative kernels calculated with the large-ensemble version of CAM5, CESM1.1.2, at the top of the atmosphere and the surface. Estimates of the radiative forcing from greenhouse gases and aerosols in RCP8.5 in the CESM large-ensemble simulations are also diagnosed. As an application, feedbacks are calculated for the CESM large ensemble. The kernels are freely available at https://doi.org/10.5065/D6F47MT6, and accompanying software can be downloaded from https://github.com/apendergrass/cam5-kernels.

  4. 40 CFR 87.31 - Standards for exhaust emissions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) Definitions. Exhaust Emissions (In-Use Aircraft Gas Turbine Engines) § 87.31 Standards for exhaust emissions. (a) Exhaust emissions of smoke from each in-use aircraft gas turbine engine of Class T8... in-use aircraft gas turbine engine of class TF and of rated output of 129 kilonewtons thrust or...

  5. Quantitative estimation of the energy flux during an explosive chromospheric evaporation in a white light flare kernel observed by Hinode, IRIS, SDO, and RHESSI

    NASA Astrophysics Data System (ADS)

    Lee, Kyoung-Sun; Imada, Shinsuke; Watanabe, Kyoko; Bamba, Yumi; Brooks, David H.

    2016-10-01

    An X1.6 flare that occurred in AR 12192 on 2014 October 22 at 14:02 UT was observed by Hinode, IRIS, SDO, and RHESSI. We analyze a bright kernel which produces a white light (WL) flare with continuum enhancement and a hard X-ray (HXR) peak. Taking advantage of the spectroscopic observations of IRIS and Hinode/EIS, we measure the temporal variation of the plasma properties in the bright kernel in the chromosphere and corona. We found that explosive evaporation was observed when the WL emission occurred, even though the intensity enhancement in hotter lines is quite weak. The temporal correlation of the WL emission, HXR peak, and evaporation flows indicates that the WL emission was produced by accelerated electrons. To understand the white light emission processes, we calculated the deposited energy flux from the non-thermal electrons observed by RHESSI and compared it to the dissipated energy estimated from the chromospheric line (Mg II triplet) observed by IRIS. The deposited energy flux from the non-thermal electrons is about 3.1 × 10^10 erg cm^-2 s^-1 when we consider a cut-off energy of 20 keV. The estimated energy flux from the temperature changes in the chromosphere measured from the Mg II subordinate line is about 4.6-6.7 × 10^9 erg cm^-2 s^-1, ~15-22% of the deposited energy. By comparing these estimated energy fluxes, we conclude that the continuum enhancement was directly produced by the non-thermal electrons.

  6. Adaptive Shape Kernel-Based Mean Shift Tracker in Robot Vision System

    PubMed Central

    2016-01-01

    This paper proposes an adaptive shape kernel-based mean shift tracker using a single static camera for the robot vision system. The question that we address in this paper is how to construct a kernel shape that is adaptive to the object shape. We use a nonlinear manifold learning technique to obtain the low-dimensional shape space, which is trained on data with the same view as the tracking video. The proposed kernel searches the shape in the low-dimensional shape space obtained by the nonlinear manifold learning technique and constructs the adaptive kernel shape in the high-dimensional shape space. This improves the mean shift tracker's ability to track the object position and object contour and to avoid background clutter. In the experimental part, we take a walking human as an example to validate that our method is accurate and robust in tracking the human position and describing the human contour. PMID:27379165

  7. Secondhand Tobacco Smoke (Environmental Tobacco Smoke)

    Cancer.gov

    Learn about secondhand tobacco smoke, which can raise your risk of lung cancer. Secondhand tobacco smoke is the combination of the smoke given off by a burning tobacco product and the smoke exhaled by a smoker. Also called environmental tobacco smoke, involuntary smoke, and passive smoke.

  8. Classification of Microarray Data Using Kernel Fuzzy Inference System

    PubMed Central

    Kumar Rath, Santanu

    2014-01-01

    The DNA microarray classification technique has gained popularity in both research and practice. In real data analysis, such as microarray data, the dataset contains a huge number of insignificant and irrelevant features that tend to obscure useful information. The selected features are generally those with high relevance to the classes and high significance, and they determine the classification of the samples into their respective classes. In this paper, the kernel fuzzy inference system (K-FIS) algorithm is applied to classify the microarray data (leukemia), using the t-test as a feature selection method. Kernel functions are used to map original data points into a higher-dimensional (possibly infinite-dimensional) feature space defined by a (usually nonlinear) function ϕ through a mathematical process called the kernel trick. This paper also presents a comparative study for classification using K-FIS along with a support vector machine (SVM) for different sets of features (genes). Performance parameters available in the literature, such as precision, recall, specificity, F-measure, ROC curve, and accuracy, are considered to analyze the efficiency of the classification model. From the proposed approach, it is apparent that the K-FIS model obtains results similar to the SVM model. This indicates that the performance of the proposed approach relies on the kernel function. PMID:27433543

  9. Kernels, Degrees of Freedom, and Power Properties of Quadratic Distance Goodness-of-Fit Tests

    PubMed Central

    Lindsay, Bruce G.; Markatou, Marianthi; Ray, Surajit

    2014-01-01

    In this article, we study the power properties of quadratic-distance-based goodness-of-fit tests. First, we introduce the concept of a root kernel and discuss the considerations that enter the selection of this kernel. We derive an easy to use normal approximation to the power of quadratic distance goodness-of-fit tests and base the construction of a noncentrality index, an analogue of the traditional noncentrality parameter, on it. This leads to a method akin to the Neyman-Pearson lemma for constructing optimal kernels for specific alternatives. We then introduce a midpower analysis as a device for choosing optimal degrees of freedom for a family of alternatives of interest. Finally, we introduce a new diffusion kernel, called the Pearson-normal kernel, and study the extent to which the normal approximation to the power of tests based on this kernel is valid. Supplementary materials for this article are available online. PMID:24764609

  10. Integrating semantic information into multiple kernels for protein-protein interaction extraction from biomedical literatures.

    PubMed

    Li, Lishuang; Zhang, Panpan; Zheng, Tianfu; Zhang, Hongying; Jiang, Zhenchao; Huang, Degen

    2014-01-01

    Protein-Protein Interaction (PPI) extraction is an important task in the biomedical information extraction. Presently, many machine learning methods for PPI extraction have achieved promising results. However, the performance is still not satisfactory. One reason is that the semantic resources were basically ignored. In this paper, we propose a multiple-kernel learning-based approach to extract PPIs, combining the feature-based kernel, tree kernel and semantic kernel. Particularly, we extend the shortest path-enclosed tree kernel (SPT) by a dynamic extended strategy to retrieve the richer syntactic information. Our semantic kernel calculates the protein-protein pair similarity and the context similarity based on two semantic resources: WordNet and Medical Subject Heading (MeSH). We evaluate our method with Support Vector Machine (SVM) and achieve an F-score of 69.40% and an AUC of 92.00%, which show that our method outperforms most of the state-of-the-art systems by integrating semantic information.

  11. Many Molecular Properties from One Kernel in Chemical Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Raghunathan; von Lilienfeld, O. Anatole

    We introduce property-independent kernels for machine learning modeling of arbitrarily many molecular properties. The kernels encode molecular structures for training sets of varying size, as well as similarity measures sufficiently diffuse in chemical space to sample over all training molecules. Corresponding molecular reference properties provided, they enable the instantaneous generation of ML models which can systematically be improved through the addition of more data. This idea is exemplified for single-kernel-based modeling of internal energy, enthalpy, free energy, heat capacity, polarizability, electronic spread, zero-point vibrational energy, energies of frontier orbitals, HOMO-LUMO gap, and the highest fundamental vibrational wavenumber. Models of these properties are trained and tested using 112 kilo organic molecules of similar size. Resulting models are discussed as well as the kernels' use for generating and using other property models.

  12. A method of smoothed particle hydrodynamics using spheroidal kernels

    NASA Technical Reports Server (NTRS)

    Fulbright, Michael S.; Benz, Willy; Davies, Melvyn B.

    1995-01-01

    We present a new method of three-dimensional smoothed particle hydrodynamics (SPH) designed to model systems dominated by deformation along a preferential axis. These systems cause severe problems for SPH codes using spherical kernels, which are best suited for modeling systems which retain rough spherical symmetry. Our method allows the smoothing length in the direction of the deformation to evolve independently of the smoothing length in the perpendicular plane, resulting in a kernel with a spheroidal shape. As a result the spatial resolution in the direction of deformation is significantly improved. As a test case we present the one-dimensional homologous collapse of a zero-temperature, uniform-density cloud, which serves to demonstrate the advantages of spheroidal kernels. We also present new results on the problem of the tidal disruption of a star by a massive black hole.

  13. Phylodynamic Inference with Kernel ABC and Its Application to HIV Epidemiology.

    PubMed

    Poon, Art F Y

    2015-09-01

    The shapes of phylogenetic trees relating virus populations are determined by the adaptation of viruses within each host, and by the transmission of viruses among hosts. Phylodynamic inference attempts to reverse this flow of information, estimating parameters of these processes from the shape of a virus phylogeny reconstructed from a sample of genetic sequences from the epidemic. A key challenge to phylodynamic inference is quantifying the similarity between two trees in an efficient and comprehensive way. In this study, I demonstrate that a new distance measure, based on a subset tree kernel function from computational linguistics, confers a significant improvement over previous measures of tree shape for classifying trees generated under different epidemiological scenarios. Next, I incorporate this kernel-based distance measure into an approximate Bayesian computation (ABC) framework for phylodynamic inference. ABC bypasses the need for an analytical solution of model likelihood, as it only requires the ability to simulate data from the model. I validate this "kernel-ABC" method for phylodynamic inference by estimating parameters from data simulated under a simple epidemiological model. Results indicate that kernel-ABC attained greater accuracy for parameters associated with virus transmission than leading software on the same data sets. Finally, I apply the kernel-ABC framework to study a recent outbreak of a recombinant HIV subtype in China. Kernel-ABC provides a versatile framework for phylodynamic inference because it can fit a broader range of models than methods that rely on the computation of exact likelihoods. © The Author 2015. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.
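
    The following is a generic rejection-ABC sketch intended only to illustrate the simulation-based inference loop described above; the tree-shape kernel distance of the paper is replaced by a plain Euclidean distance on summary statistics, and the toy simulator, prior, and tolerance are assumptions.

```python
# Rejection ABC: keep parameter draws whose simulated summaries lie close to the data.
import numpy as np

def simulate(rate, rng, n=200):
    # Toy stand-in for an epidemic/tree simulator: exponential waiting times.
    return rng.exponential(1.0 / rate, size=n)

def summaries(x):
    return np.array([x.mean(), x.std()])

rng = np.random.default_rng(0)
observed = simulate(rate=2.0, rng=rng)
s_obs = summaries(observed)

accepted = []
for _ in range(5000):
    rate = rng.uniform(0.1, 5.0)                 # draw from the prior
    s_sim = summaries(simulate(rate, rng))
    if np.linalg.norm(s_sim - s_obs) < 0.05:     # tolerance on the distance
        accepted.append(rate)
print(len(accepted), np.mean(accepted) if accepted else None)   # approximate posterior mean
```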

  14. Data-Driven Hierarchical Structure Kernel for Multiscale Part-Based Object Recognition

    PubMed Central

    Wang, Botao; Xiong, Hongkai; Jiang, Xiaoqian; Zheng, Yuan F.

    2017-01-01

    Detecting generic object categories in images and videos is a fundamental issue in computer vision. However, it faces challenges from inter- and intraclass diversity, as well as distortions caused by viewpoints, poses, deformations, and so on. To address object variations, this paper constructs a structure kernel and proposes a multiscale part-based model incorporating the discriminative power of kernels. The structure kernel measures the resemblance of part-based objects in three aspects: 1) the global similarity term to measure the resemblance of the global visual appearance of relevant objects; 2) the part similarity term to measure the resemblance of the visual appearance of distinctive parts; and 3) the spatial similarity term to measure the resemblance of the spatial layout of parts. In essence, the deformation of parts in the structure kernel is penalized in a multiscale space with respect to horizontal displacement, vertical displacement, and scale difference. Part similarities are combined with different weights, which are optimized efficiently to maximize the intraclass similarities and minimize the interclass similarities by the normalized stochastic gradient ascent algorithm. In addition, the parameters of the structure kernel are learned during the training process with regard to the distribution of the data in a more discriminative way. With flexible part sizes on scale and displacement, it can be more robust to intraclass variations, poses, and viewpoints. Theoretical analysis and experimental evaluations demonstrate that the proposed multiscale part-based representation model with structure kernel exhibits accurate and robust performance, and outperforms state-of-the-art object classification approaches. PMID:24808345

  15. Assessment of Particle Pollution from Jetliners: from Smoke Visibility to Nanoparticle Counting.

    PubMed

    Durdina, Lukas; Brem, Benjamin T; Setyan, Ari; Siegerist, Frithjof; Rindlisbacher, Theo; Wang, Jing

    2017-03-21

    Aviation is a substantial and a fast growing emissions source. Besides greenhouse gases, aircraft engines emit black carbon (BC), a climate forcer and air pollutant. Aviation BC emissions have been regulated and estimated through exhaust smoke visibility (smoke number). Their impacts are poorly understood because emission inventories lack representative data. Here, we measured BC mass and number-based emissions of the most popular airliner's engines according to a new emission standard. We used a calibrated engine performance model to determine the emissions on the ground, at cruise altitude, and over entire flight missions. Compared to previous estimates, we found up to a factor of 4 less BC mass emitted from the standardized landing and takeoff cycle and up to a factor of 40 less during taxiing. However, the taxi phase accounted for up to 30% of the total BC number emissions. Depending on the fuel composition and flight distance, the mass and number-based emission indices (/kg fuel burned) were 6.2-14.7 mg and 2.8 × 10^14 to 8.7 × 10^14, respectively. The BC mass emissions per passenger-km were similar to gasoline vehicles, but the number-based emissions were relatively higher, comparable to old diesel vehicles. This study provides representative data for models and will lead to more accurate assessments of environmental impacts of aviation.

  16. Emission of methyl bromide from biomass burning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manoe, S.; Andreae, M.O.

    1994-03-04

    Bromine is, per atom, far more efficient than chlorine in destroying stratospheric ozone, and methyl bromide is the single largest source of stratospheric bromine. The two main previously known sources of this compound are emissions from the ocean and from the compound's use as an agricultural pesticide. Laboratory biomass combustion experiments showed that methyl bromide was emitted in the smoke from various fuels tested. Methyl bromide was also found in smoke plumes from wildfires in savannas, chaparral, and boreal forest. Global emissions of methyl bromide from biomass burning are estimated to be in the range of 10 to 50 gigagrams per year, which is comparable to the amount produced by ocean emission and pesticide use and represents a major contribution (approximately 30 percent) to the stratospheric bromine budget.

  17. Kernel-based whole-genome prediction of complex traits: a review.

    PubMed

    Morota, Gota; Gianola, Daniel

    2014-01-01

    Prediction of genetic values has been a focus of applied quantitative genetics since the beginning of the 20th century, with renewed interest following the advent of the era of whole genome-enabled prediction. Opportunities offered by the emergence of high-dimensional genomic data fueled by post-Sanger sequencing technologies, especially molecular markers, have driven researchers to extend Ronald Fisher and Sewall Wright's models to confront new challenges. In particular, kernel methods are gaining consideration as a regression method of choice for genome-enabled prediction. Complex traits are presumably influenced by many genomic regions working in concert with others (clearly so when considering pathways), thus generating interactions. Motivated by this view, a growing number of statistical approaches based on kernels attempt to capture non-additive effects, either parametrically or non-parametrically. This review centers on whole-genome regression using kernel methods applied to a wide range of quantitative traits of agricultural importance in animals and plants. We discuss various kernel-based approaches tailored to capturing total genetic variation, with the aim of arriving at an enhanced predictive performance in the light of available genome annotation information. Connections between prediction machines born in animal breeding, statistics, and machine learning are revisited, and their empirical prediction performance is discussed. Overall, while some encouraging results have been obtained with non-parametric kernels, recovering non-additive genetic variation in a validation dataset remains a challenge in quantitative genetics.

  18. Volterra series truncation and kernel estimation of nonlinear systems in the frequency domain

    NASA Astrophysics Data System (ADS)

    Zhang, B.; Billings, S. A.

    2017-02-01

    The Volterra series model is a direct generalisation of the linear convolution integral and is capable of displaying the intrinsic features of a nonlinear system in a simple and easy to apply way. Nonlinear system analysis using Volterra series is normally based on the analysis of its frequency-domain kernels and a truncated description. But the estimation of Volterra kernels and the truncation of the Volterra series are coupled with each other. In this paper, a novel complex-valued orthogonal least squares algorithm is developed. The new algorithm provides a powerful tool to determine which terms should be included in the Volterra series expansion and to estimate the kernels, and thus solves the two problems together. The estimated results are compared with those determined using the analytical expressions of the kernels to validate the method. To further evaluate the effectiveness of the method, the physical parameters of the system are also extracted from the measured kernels. Simulation studies demonstrate that the new approach not only can truncate the Volterra series expansion and estimate the kernels of a weakly nonlinear system, but also can indicate the applicability of the Volterra series analysis in a severely nonlinear system case.
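
    As a hedged illustration of the kind of kernel expansion being estimated, the sketch below fits a truncated second-order Volterra model by ordinary least squares; the paper's complex-valued orthogonal least squares term selection and frequency-domain analysis are not reproduced, and the toy system and memory length are assumptions.

```python
# Least-squares fit of a truncated (second-order) Volterra model.
import numpy as np

def volterra_design(x, memory=3):
    rows = []
    for n in range(memory, len(x)):
        lags = x[n - memory:n][::-1]                        # x(n-1), ..., x(n-memory)
        quad = np.outer(lags, lags)[np.triu_indices(memory)]  # second-order products
        rows.append(np.concatenate(([1.0], lags, quad)))    # constant + 1st + 2nd order terms
    return np.array(rows)

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = 0.8 * x[:-1] + 0.3 * x[:-1] ** 2                        # toy nonlinear system output
y = np.concatenate(([0.0], y))

memory = 3
Phi = volterra_design(x, memory)
theta, *_ = np.linalg.lstsq(Phi, y[memory:], rcond=None)    # estimated kernel coefficients
print(theta[:4])                                            # constant and first-order kernel
```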

  19. Epileptic Seizure Detection with Log-Euclidean Gaussian Kernel-Based Sparse Representation.

    PubMed

    Yuan, Shasha; Zhou, Weidong; Wu, Qi; Zhang, Yanli

    2016-05-01

    Epileptic seizure detection plays an important role in the diagnosis of epilepsy and reducing the massive workload of reviewing electroencephalography (EEG) recordings. In this work, a novel algorithm is developed to detect seizures employing log-Euclidean Gaussian kernel-based sparse representation (SR) in long-term EEG recordings. Unlike the traditional SR for vector data in Euclidean space, the log-Euclidean Gaussian kernel-based SR framework is proposed for seizure detection in the space of the symmetric positive definite (SPD) matrices, which form a Riemannian manifold. Since the Riemannian manifold is nonlinear, the log-Euclidean Gaussian kernel function is applied to embed it into a reproducing kernel Hilbert space (RKHS) for performing SR. The EEG signals of all channels are divided into epochs and the SPD matrices representing EEG epochs are generated by covariance descriptors. Then, the testing samples are sparsely coded over the dictionary composed of training samples utilizing log-Euclidean Gaussian kernel-based SR. The classification of testing samples is achieved by computing the minimal reconstruction residuals. The proposed method is evaluated on the Freiburg EEG dataset of 21 patients and shows notable performance on both epoch-based and event-based assessments. Moreover, this method handles multiple channels of EEG recordings synchronously, which is faster and more efficient than traditional seizure detection methods.
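
    A small sketch of the log-Euclidean Gaussian kernel between SPD covariance descriptors, the building block named above; the sparse-coding and classification stages are omitted, and the epoch size, regularization, and kernel width are illustrative assumptions.

```python
# Log-Euclidean Gaussian kernel between SPD covariance descriptors of EEG epochs.
import numpy as np

def spd_logm(C):
    # Matrix logarithm of a symmetric positive definite matrix via eigendecomposition.
    w, V = np.linalg.eigh(C)
    return (V * np.log(w)) @ V.T

def covariance_descriptor(epoch):
    # epoch: channels x samples EEG segment -> regularized SPD covariance matrix.
    C = np.cov(epoch)
    return C + 1e-6 * np.eye(C.shape[0])

def log_euclidean_gaussian(C1, C2, sigma=1.0):
    d = np.linalg.norm(spd_logm(C1) - spd_logm(C2))   # Frobenius norm of the log difference
    return np.exp(-d**2 / (2 * sigma**2))

rng = np.random.default_rng(0)
e1, e2 = rng.standard_normal((8, 256)), rng.standard_normal((8, 256))   # two 8-channel epochs
print(log_euclidean_gaussian(covariance_descriptor(e1), covariance_descriptor(e2)))
```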

  20. Study of the convergence behavior of the complex kernel least mean square algorithm.

    PubMed

    Paul, Thomas K; Ogunfunmi, Tokunbo

    2013-09-01

    The complex kernel least mean square (CKLMS) algorithm is recently derived and allows for online kernel adaptive learning for complex data. Kernel adaptive methods can be used in finding solutions for neural network and machine learning applications. The derivation of CKLMS involved the development of a modified Wirtinger calculus for Hilbert spaces to obtain the cost function gradient. We analyze the convergence of the CKLMS with different kernel forms for complex data. The expressions obtained enable us to generate theory-predicted mean-square error curves considering the circularity of the complex input signals and their effect on nonlinear learning. Simulations are used for verifying the analysis results.

  1. Influence of maladjustment on emissions from two heavy-duty diesel bus engines

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ullman, T.L.; Hare, C.T.; Baines, T.M.

    1984-01-01

    Diesel engines are adjusted to manufacturers' specifications when produced and placed in service, but varying degrees of maintenance and wear cause changes in engine performance and exhaust emissions. Maladjustments were made on two heavy-duty diesel engines typically used in buses in an effort to simulate some degree of wear and/or lack of maintenance. Emissions were characterized over steady-state and transient engine operation, in both baseline and maladjusted configurations. Selected maladjustments of the Cummins VTB-903 substantially increased HC, smoke and particulate emission levels. Maladjustments of the Detroit Diesel 6V-71 coach engine resulted in lower HC and NOx emission levels, but higher CO emissions, smoke, and particulate.

  2. Moisture Adsorption Isotherm and Storability of Hazelnut Inshells and Kernels Produced in Oregon, USA.

    PubMed

    Jung, Jooyeoun; Wang, Wenjie; McGorrin, Robert J; Zhao, Yanyun

    2018-02-01

    Moisture adsorption isotherms and storability of dried hazelnut inshells and kernels produced in Oregon were evaluated and compared among cultivars, including Barcelona, Yamhill, and Jefferson. Experimental moisture adsorption data were fitted to the Guggenheim-Anderson-de Boer (GAB) model, showing less hygroscopic properties in Yamhill than other cultivars of inshells and kernels due to lower content of carbohydrate and protein, but higher content of fat. The safe levels of moisture content (MC, dry basis) of dried inshells and kernels for reaching kernel water activity (a_w) ≤ 0.65 were estimated using the GAB model as 11.3% and 5.0% for Barcelona, 9.4% and 4.2% for Yamhill, and 10.7% and 4.9% for Jefferson, respectively. Storage conditions (2 °C at 85% to 95% relative humidity [RH], 10 °C at 65% to 75% RH, and 27 °C at 35% to 45% RH), times (0, 4, 8, or 12 mo), and packaging methods (atmosphere vs. vacuum) affected MC, a_w, bioactive compounds, lipid oxidation, and enzyme activity of dried hazelnut inshells or kernels. For inshells packaged in woven polypropylene bags, MC and a_w of inshells and kernels (inside shells) increased at 2 and 10 °C, but decreased at 27 °C during storage. For kernels, lipid oxidation and polyphenol oxidase activity also increased with extended storage time (P < 0.05), and MC and a_w of vacuum packaged samples were more stable during storage than those of atmospherically packaged ones. Principal component analysis showed correlation of kernel qualities with storage condition, time, and packaging method. This study demonstrated that the ideal storage condition or packaging method varied among cultivars due to their different moisture adsorption and physicochemical and enzymatic stability during storage. Moisture adsorption isotherm of hazelnut inshells and kernels is useful for predicting the storability of nuts. This study found that water adsorption and storability varied among the different cultivars of nuts, in which Yamhill was
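
    For reference, the GAB isotherm mentioned above in its standard three-parameter form; the parameter values in the example are made up for illustration and are not the fitted values from this study.

```python
# Standard Guggenheim-Anderson-de Boer (GAB) moisture adsorption isotherm.
def gab_moisture(aw, m0, C, K):
    """Equilibrium moisture content (dry basis) at water activity aw."""
    return m0 * C * K * aw / ((1 - K * aw) * (1 - K * aw + C * K * aw))

# Example: predicted moisture content over a range of water activities
# (m0, C, K chosen arbitrarily for illustration).
for aw in (0.2, 0.4, 0.65):
    print(aw, round(gab_moisture(aw, m0=0.04, C=10.0, K=0.8), 4))
```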

  3. The NAS kernel benchmark program

    NASA Technical Reports Server (NTRS)

    Bailey, D. H.; Barton, J. T.

    1985-01-01

    A collection of benchmark test kernels that measure supercomputer performance has been developed for the use of the NAS (Numerical Aerodynamic Simulation) program at the NASA Ames Research Center. This benchmark program is described in detail and the specific ground rules are given for running the program as a performance test.

  4. Finite-frequency sensitivity kernels for global seismic wave propagation based upon adjoint methods

    NASA Astrophysics Data System (ADS)

    Liu, Qinya; Tromp, Jeroen

    2008-07-01

    We determine adjoint equations and Fréchet kernels for global seismic wave propagation based upon a Lagrange multiplier method. We start from the equations of motion for a rotating, self-gravitating earth model initially in hydrostatic equilibrium, and derive the corresponding adjoint equations that involve motions on an earth model that rotates in the opposite direction. Variations in the misfit function χ may then be expressed as δχ = ∫_V K_m δln m d³x + ∫_Σ K_d δln d d²x + ∫_ΣFS K_∇d ⋅ ∇_Σ δln d d²x, where δln m = δm/m denotes relative model perturbations in the volume V, δln d denotes relative topographic variations on solid-solid or fluid-solid boundaries Σ, and ∇_Σ δln d denotes surface gradients in relative topographic variations on fluid-solid boundaries Σ_FS. The 3-D Fréchet kernel K_m determines the sensitivity to model perturbations δln m, and the 2-D kernels K_d and K_∇d determine the sensitivity to topographic variations δln d. We also demonstrate how anelasticity may be incorporated within the framework of adjoint methods. Finite-frequency sensitivity kernels are calculated by simultaneously computing the adjoint wavefield forward in time and reconstructing the regular wavefield backward in time. Both the forward and adjoint simulations are based upon a spectral-element method. We apply the adjoint technique to generate finite-frequency traveltime kernels for global seismic phases (P, Pdiff, PKP, S, SKS, depth phases, surface-reflected phases, surface waves, etc.) in both 1-D and 3-D earth models. For 1-D models these adjoint-generated kernels generally agree well with results obtained from ray-based methods. However, adjoint methods do not have the same theoretical limitations as ray-based methods, and can produce sensitivity kernels for any given phase in any 3-D earth model. The Fréchet kernels presented in this paper illustrate the sensitivity of seismic observations to structural parameters and topography on internal discontinuities. These kernels form the basis of future 3-D tomographic inversions.
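
    The kernel calculation described above, in which the adjoint wavefield is advanced forward in time while the regular wavefield is reconstructed backward in time, reduces at each time step to accumulating the pointwise interaction of the two fields. A schematic sketch under strong simplifying assumptions (1-D arrays of snapshots and a plain product interaction, rather than the spectral-element strain-based integrands of the paper) is:

    ```python
    import numpy as np

    def accumulate_kernel(forward_snapshots, adjoint_snapshots, dt):
        """Schematic kernel accumulation K(x) ~ sum_t f(x, T - t) * a(x, t) * dt.

        forward_snapshots: array (nt, nx), forward wavefield stored (or reconstructed)
                           so that index i corresponds to time T - t_i.
        adjoint_snapshots: array (nt, nx), adjoint wavefield at time t_i.
        """
        nt, nx = adjoint_snapshots.shape
        kernel = np.zeros(nx)
        for i in range(nt):
            # interaction of the two fields at each spatial point, integrated over time
            kernel += forward_snapshots[nt - 1 - i] * adjoint_snapshots[i] * dt
        return kernel
    ```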

  5. IDENTIFICATION AND EMISSION RATES OF MOLECULAR TRACERS IN COAL SMOKE PARTICULATE MATTER. (R823990)

    EPA Science Inventory

    The abundances and distributions of organic constituents in coal smoke particulate matter are dependent on thermal combustion temperature, ventilation, burn time, and coal rank (geologic maturity). Important coal rank indicators from smoke include (1) the decreases in CPIs of ...

  6. Direct Patlak Reconstruction From Dynamic PET Data Using the Kernel Method With MRI Information Based on Structural Similarity.

    PubMed

    Gong, Kuang; Cheng-Liao, Jinxiu; Wang, Guobao; Chen, Kevin T; Catana, Ciprian; Qi, Jinyi

    2018-04-01

    Positron emission tomography (PET) is a functional imaging modality widely used in oncology, cardiology, and neuroscience. It is highly sensitive, but suffers from relatively poor spatial resolution, as compared with anatomical imaging modalities, such as magnetic resonance imaging (MRI). With the recent development of combined PET/MR systems, we can improve the PET image quality by incorporating MR information into image reconstruction. Previously, kernel learning has been successfully embedded into static and dynamic PET image reconstruction using either PET temporal or MRI information. Here, we combine both PET temporal and MRI information adaptively to improve the quality of direct Patlak reconstruction. We examined different approaches to combine the PET and MRI information in kernel learning to address the issue of potential mismatches between MRI and PET signals. Computer simulations and hybrid real-patient data acquired on a simultaneous PET/MR scanner were used to evaluate the proposed methods. Results show that the method that combines PET temporal information and MRI spatial information adaptively based on the structure similarity index has the best performance in terms of noise reduction and resolution improvement.
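
    In kernel-method reconstruction of this type, the image is represented as x = Kα, where the kernel matrix K is built from anatomical (here MRI-derived) feature vectors. The following is a minimal, brute-force sketch of such a kernel matrix; the Gaussian kernel, neighbour count, and row normalization are illustrative assumptions, and the paper's adaptive PET/MRI combination based on the structural similarity index is not implemented here.

    ```python
    import numpy as np

    def build_kernel_matrix(features, n_neighbors=20, sigma=1.0):
        """Kernel matrix with K[i, j] = exp(-||f_i - f_j||^2 / (2 sigma^2)),
        restricted to each voxel's nearest neighbours in feature space."""
        n = features.shape[0]
        K = np.zeros((n, n))
        for i in range(n):
            d2 = np.sum((features - features[i]) ** 2, axis=1)
            nbrs = np.argsort(d2)[:n_neighbors]
            K[i, nbrs] = np.exp(-d2[nbrs] / (2.0 * sigma ** 2))
        # row-normalise so that x = K @ alpha preserves the overall intensity scale
        K /= K.sum(axis=1, keepdims=True)
        return K

    # usage sketch: image estimate from kernel coefficients
    # features: (n_voxels, n_features) MRI-derived patches; alpha: kernel coefficients
    # x = build_kernel_matrix(features) @ alpha
    ```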

  7. Dielectric relaxation measurement and analysis of restricted water structure in rice kernels

    NASA Astrophysics Data System (ADS)

    Yagihara, Shin; Oyama, Mikio; Inoue, Akio; Asano, Megumi; Sudo, Seiichi; Shinyashiki, Naoki

    2007-04-01

    Dielectric relaxation measurements were performed on rice kernels by time domain reflectometry (TDR) with flat-end coaxial electrodes. Difficulties in obtaining good contact between the surfaces of the electrodes and the kernels were eliminated by a TDR set-up with a sample holder for a kernel, and the water content could be evaluated from the relaxation curves. Dielectric measurements were performed on rice kernels, rice flour and boiled rice with various water contents, and the water amount and dynamic behaviour of water molecules were explained in terms of the restricted dynamics of water molecules and of the τ-β diagram (relaxation time versus the relaxation-time distribution parameter of the Cole-Cole equation). In comparison with other aqueous systems, the dynamic structure of water in moist rice is more similar to that of aqueous dispersion systems than to that of aqueous solutions.
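
    The τ-β diagram mentioned above is built from fits of the Cole-Cole equation, ε*(ω) = ε∞ + Δε / (1 + (iωτ)^β), to the measured relaxation curves. A minimal sketch of evaluating this function is given below; the parameter values are placeholders rather than fitted rice-kernel values.

    ```python
    import numpy as np

    def cole_cole(omega, eps_inf, delta_eps, tau, beta):
        """Complex permittivity from the Cole-Cole equation:
        eps*(omega) = eps_inf + delta_eps / (1 + (1j * omega * tau)**beta)."""
        return eps_inf + delta_eps / (1.0 + (1j * omega * tau) ** beta)

    # Placeholder parameters, purely illustrative.
    omega = 2 * np.pi * np.logspace(6, 10, 200)        # angular frequency, rad/s
    eps = cole_cole(omega, eps_inf=5.0, delta_eps=60.0, tau=1e-8, beta=0.8)
    eps_real, eps_loss = eps.real, -eps.imag           # dispersion and loss spectra
    ```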

  8. Fruit position within the canopy affects kernel lipid composition of hazelnuts.

    PubMed

    Pannico, Antonio; Cirillo, Chiara; Giaccone, Matteo; Scognamiglio, Pasquale; Romano, Raffaele; Caporaso, Nicola; Sacchi, Raffaele; Basile, Boris

    2017-11-01

    The aim of this research was to study the variability in kernel composition within the canopy of hazelnut trees. Kernel fresh and dry weight increased linearly with fruit height above the ground. Fat content decreased, while protein and ash content increased, from the bottom to the top layers of the canopy. The level of unsaturation of fatty acids decreased from the bottom to the top of the canopy. Thus, the kernels located in the bottom layers of the canopy appear to be more interesting from a nutritional point of view, but their lipids may be more exposed to oxidation. The content of different phytosterols increased progressively from bottom to top canopy layers. Most of these effects correlated with the pattern in light distribution inside the canopy. The results of this study indicate that fruit position within the canopy is an important factor in determining hazelnut kernel growth and composition. © 2017 Society of Chemical Industry.

  9. Sensitivity Kernels of Seismic Traveltimes and Amplitudes for Quality Factor and Boundary Topography

    NASA Astrophysics Data System (ADS)

    Hsieh, M.; Zhao, L.; Ma, K.

    2010-12-01

    The finite-frequency approach enables seismic tomography to fully utilize the spatial and temporal distributions of the seismic wavefield to improve resolution. In achieving this goal, one of the most important tasks is to compute efficiently and accurately the (Fréchet) sensitivity kernels of finite-frequency seismic observables such as traveltime and amplitude with respect to perturbations of the model parameters. In the scattering-integral approach, the Fréchet kernels are expressed in terms of the strain Green tensors (SGTs), and a pre-established SGT database is necessary to achieve practical efficiency for a three-dimensional reference model in which the SGTs must be calculated numerically. Methods for computing Fréchet kernels for seismic velocities have long been established. In this study, we develop algorithms based on the finite-difference method for calculating Fréchet kernels for the quality factor Qμ and for seismic boundary topography. Kernels for the quality factor can be obtained in a way similar to those for seismic velocities with the help of the Hilbert transform. The effects of seismic velocities and the quality factor on either traveltime or amplitude are coupled. Kernels for boundary topography involve the spatial gradient of the SGTs, and they also exhibit interesting finite-frequency characteristics. Examples of quality-factor and boundary-topography kernels will be shown for a realistic model of the Taiwan region with three-dimensional velocity variation as well as surface and Moho discontinuity topography.
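
    The quality-factor kernels above rely on the Hilbert transform of quantities already computed for the velocity kernels. Purely as a reminder of that operation, a minimal sketch of forming a Hilbert transform with SciPy (the imaginary part of the analytic signal) is:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def hilbert_transform(x):
        """Return the Hilbert transform of a real time series x.
        scipy.signal.hilbert returns the analytic signal x + i*H[x],
        so its imaginary part is the Hilbert transform itself."""
        return np.imag(hilbert(x))
    ```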

  10. Wilson Dslash Kernel From Lattice QCD Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joo, Balint; Smelyanskiy, Mikhail; Kalamkar, Dhiraj D.

    2015-07-01

    Lattice Quantum Chromodynamics (LQCD) is a numerical technique used for calculations in theoretical nuclear and high-energy physics. LQCD is traditionally one of the first applications ported to many new high-performance computing architectures, and indeed LQCD practitioners have been known to design and build custom LQCD computers. Lattice QCD kernels are frequently used as benchmarks (e.g., 168.wupwise in the SPEC suite) and are generally well understood, and as such are ideal for illustrating several optimization techniques. In this chapter we detail our work in optimizing the Wilson-Dslash kernels for the Intel Xeon Phi; however, as we will show, the technique gives excellent performance on regular Xeon architectures as well.

  11. Anelastic sensitivity kernels with parsimonious storage for adjoint tomography and full waveform inversion

    NASA Astrophysics Data System (ADS)

    Komatitsch, Dimitri; Xie, Zhinan; Bozdaǧ, Ebru; Sales de Andrade, Elliott; Peter, Daniel; Liu, Qinya; Tromp, Jeroen

    2016-09-01

    We introduce a technique to compute exact anelastic sensitivity kernels in the time domain using parsimonious disk storage. The method is based on a reordering of the time loop of time-domain forward/adjoint wave propagation solvers combined with the use of a memory buffer. It avoids instabilities that occur when time-reversing dissipative wave propagation simulations. The total number of required time steps is unchanged compared to usual acoustic or elastic approaches. The cost is reduced by a factor of 4/3 compared to the case in which anelasticity is partially accounted for by accommodating the effects of physical dispersion. We validate our technique by performing a test in which we compare the Kα sensitivity kernel to the exact kernel obtained by saving the entire forward calculation. This benchmark confirms that our approach is also exact. We illustrate the importance of including full attenuation in the calculation of sensitivity kernels by showing significant differences with physical-dispersion-only kernels.
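
    The parsimonious-storage idea can be sketched, in heavily simplified form, as a snapshot-buffer pattern: the forward (attenuating) run stores wavefield snapshots in memory at a coarse interval, and the adjoint pass reads them back instead of time-reversing the dissipative simulation. The callback names, the buffering interval, and the dictionary-based buffer below are illustrative assumptions and do not reproduce the exact time-loop reordering of the paper.

    ```python
    import numpy as np

    def run_with_snapshot_buffer(step_forward, step_adjoint, accumulate,
                                 nt, save_every):
        """Forward run stores wavefield snapshots in memory every `save_every` steps;
        the adjoint run then reads them back (most recent first) to build kernels.
        All callbacks are placeholders standing in for a real solver."""
        buffer = {}
        state = None
        for it in range(nt):
            state = step_forward(state, it)
            if it % save_every == 0:
                buffer[it] = np.copy(state)            # parsimonious snapshot storage

        adj_state = None
        for it in reversed(range(nt)):
            adj_state = step_adjoint(adj_state, it)
            if it % save_every == 0:
                accumulate(buffer.pop(it), adj_state)  # kernel contribution at this step
        return adj_state
    ```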

  12. Kernel-based least squares policy iteration for reinforcement learning.

    PubMed

    Xu, Xin; Hu, Dewen; Lu, Xicheng

    2007-07-01

    In this paper, we present a kernel-based least squares policy iteration (KLSPI) algorithm for reinforcement learning (RL) in large or continuous state spaces, which can be used to realize adaptive feedback control of uncertain dynamic systems. By using KLSPI, near-optimal control policies can be obtained without much a priori knowledge of the dynamic models of control plants. In KLSPI, Mercer kernels are used in the policy evaluation of a policy iteration process, where a new kernel-based least squares temporal-difference algorithm called KLSTD-Q is proposed for efficient policy evaluation. To preserve sparsity and improve the generalization ability of the KLSTD-Q solutions, a kernel sparsification procedure based on approximate linear dependency (ALD) is performed. Compared to previous work on approximate RL methods, KLSPI makes two advances that eliminate the main difficulties of existing results. One is a better convergence and (near-)optimality guarantee obtained by using the KLSTD-Q algorithm for high-precision policy evaluation. The other is automatic feature selection using the ALD-based kernel sparsification. Therefore, the KLSPI algorithm provides a general RL method with generalization performance and convergence guarantees for large-scale Markov decision problems (MDPs). Experimental results on a typical RL task for a stochastic chain problem demonstrate that KLSPI can consistently achieve better learning efficiency and policy quality than the previous least squares policy iteration (LSPI) algorithm. Furthermore, the KLSPI method was also evaluated on two nonlinear feedback control problems, including a ship heading control problem and the swing-up control of a double-link underactuated pendulum called the acrobot. Simulation results illustrate that the proposed method can optimize controller performance using little a priori information about uncertain dynamic systems. It is also demonstrated that KLSPI can be applied to online learning control by incorporating
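
    The ALD-based sparsification mentioned above admits a compact implementation: a sample is added to the kernel dictionary only when its feature-space projection error δ = k(x,x) − k_tᵀ K_t⁻¹ k_t exceeds a threshold ν. A minimal sketch, assuming a Gaussian kernel and an incrementally updated inverse kernel matrix, is:

    ```python
    import numpy as np

    def gaussian_kernel(x, y, sigma=1.0):
        return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

    def ald_sparsify(samples, nu=0.01, sigma=1.0):
        """Approximate linear dependency (ALD) dictionary construction.
        A sample x is added when delta = k(x,x) - k_t^T K_t^{-1} k_t exceeds nu."""
        dictionary = []
        K_inv = None
        for x in samples:
            if not dictionary:
                dictionary.append(x)
                K_inv = np.array([[1.0 / gaussian_kernel(x, x, sigma)]])
                continue
            k_vec = np.array([gaussian_kernel(d, x, sigma) for d in dictionary])
            a = K_inv @ k_vec                          # best approximation coefficients
            delta = gaussian_kernel(x, x, sigma) - k_vec @ a
            if delta > nu:
                # grow the dictionary and update the inverse kernel matrix (block inverse)
                n = len(dictionary)
                K_new_inv = np.zeros((n + 1, n + 1))
                K_new_inv[:n, :n] = K_inv + np.outer(a, a) / delta
                K_new_inv[:n, n] = -a / delta
                K_new_inv[n, :n] = -a / delta
                K_new_inv[n, n] = 1.0 / delta
                K_inv = K_new_inv
                dictionary.append(x)
        return dictionary
    ```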

  13. The Genetic Basis of Natural Variation in Kernel Size and Related Traits Using a Four-Way Cross Population in Maize.

    PubMed

    Chen, Jiafa; Zhang, Luyan; Liu, Songtao; Li, Zhimin; Huang, Rongrong; Li, Yongming; Cheng, Hongliang; Li, Xiantang; Zhou, Bo; Wu, Suowei; Chen, Wei; Wu, Jianyu; Ding, Junqiang

    2016-01-01

    Kernel size is an important component of grain yield in maize breeding programs. To extend the understanding of the genetic basis of kernel size traits (i.e., kernel length, kernel width and kernel thickness), we developed a four-way cross mapping population derived from four maize inbred lines with varied kernel sizes. In the present study, we investigated the genetic basis of natural variation in seed size and other components of maize yield (e.g., hundred-kernel weight, number of rows per ear, number of kernels per row). In total, ten QTL affecting kernel size were identified, three of which (two for kernel length and one for kernel width) had stable expression in other components of maize yield. The possible genetic mechanism behind the trade-off between kernel size and yield components is discussed.

  14. The Genetic Basis of Natural Variation in Kernel Size and Related Traits Using a Four-Way Cross Population in Maize

    PubMed Central

    Liu, Songtao; Li, Zhimin; Huang, Rongrong; Li, Yongming; Cheng, Hongliang; Li, Xiantang; Zhou, Bo; Wu, Suowei; Chen, Wei; Wu, Jianyu; Ding, Junqiang

    2016-01-01

    Kernel size is an important component of grain yield in maize breeding programs. To extend the understanding of the genetic basis of kernel size traits (i.e., kernel length, kernel width and kernel thickness), we developed a four-way cross mapping population derived from four maize inbred lines with varied kernel sizes. In the present study, we investigated the genetic basis of natural variation in seed size and other components of maize yield (e.g., hundred-kernel weight, number of rows per ear, number of kernels per row). In total, ten QTL affecting kernel size were identified, three of which (two for kernel length and one for kernel width) had stable expression in other components of maize yield. The possible genetic mechanism behind the trade-off between kernel size and yield components is discussed. PMID:27070143

  15. Regional air quality impacts of future fire emissions in Sumatra and Kalimantan

    NASA Astrophysics Data System (ADS)

    Marlier, Miriam E.; DeFries, Ruth S.; Kim, Patrick S.; Gaveau, David L. A.; Koplitz, Shannon N.; Jacob, Daniel J.; Mickley, Loretta J.; Margono, Belinda A.; Myers, Samuel S.

    2015-05-01

    Fire emissions associated with land cover change and land management contribute to the concentrations of atmospheric pollutants, which can affect regional air quality and climate. Mitigating these impacts requires a comprehensive understanding of the relationship between fires and different land cover change trajectories and land management strategies. We develop future fire emissions inventories from 2010-2030 for Sumatra and Kalimantan (Indonesian Borneo) to assess the impact of varying levels of forest and peatland conservation on air quality in Equatorial Asia. To compile these inventories, we combine detailed land cover information from published maps of forest extent, satellite fire radiative power observations, fire emissions from the Global Fire Emissions Database, and spatially explicit future land cover projections using a land cover change model. We apply the sensitivities of mean smoke concentrations to Indonesian fire emissions, calculated by the GEOS-Chem adjoint model, to our scenario-based future fire emissions inventories to quantify the different impacts of fires on surface air quality across Equatorial Asia. We find that public health impacts are highly sensitive to the location of fires, with emissions from Sumatra contributing more to smoke concentrations at population centers across the region than Kalimantan, which had higher emissions by more than a factor of two. Compared to business-as-usual projections, protecting peatlands from fires reduces smoke concentrations in the cities of Singapore and Palembang by 70% and 40%, and by 60% for the Equatorial Asian region, weighted by the population in each grid cell. Our results indicate the importance of focusing conservation priorities on protecting both forested (intact or logged) peatlands and non-forested peatlands from fire, even after considering potential leakage of deforestation pressure to other areas, in order to limit the impact of fire emissions on atmospheric smoke concentrations and
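
    The final step described above, applying adjoint-derived sensitivities to scenario emissions and weighting by population, amounts to a matrix-vector product followed by a weighted average. A schematic sketch is given below; the array names, shapes, and the assumption of a linear response are illustrative, and the GEOS-Chem adjoint itself is of course not reproduced here.

    ```python
    import numpy as np

    def population_weighted_smoke(sensitivity, emissions, population):
        """sensitivity: (n_receptor_cells, n_source_cells) d[concentration]/d[emission]
        emissions:   (n_source_cells,) scenario fire emissions
        population:  (n_receptor_cells,) population in each receptor grid cell
        Returns the population-weighted mean smoke concentration."""
        conc = sensitivity @ emissions          # concentration at each receptor cell
        weights = population / population.sum()
        return float(weights @ conc)

    # usage: compare a business-as-usual scenario with a peatland-protection scenario
    # reduction = 1 - population_weighted_smoke(S, e_protect, pop) / \
    #                 population_weighted_smoke(S, e_bau, pop)
    ```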

  16. Chemical Composition of Aerosol from an E-Cigarette: A Quantitative Comparison with Cigarette Smoke.

    PubMed

    Margham, Jennifer; McAdam, Kevin; Forster, Mark; Liu, Chuan; Wright, Christopher; Mariner, Derek; Proctor, Christopher

    2016-10-17

    There is interest in the relative toxicities of emissions from electronic cigarettes and tobacco cigarettes. Lists of cigarette smoke priority toxicants have been developed to focus regulatory initiatives. However, a comprehensive assessment of e-cigarette chemical emissions including all tobacco smoke Harmful and Potentially Harmful Constituents, and additional toxic species reportedly present in e-cigarette emissions, is lacking. We examined 150 chemical emissions from an e-cigarette (Vype ePen), a reference tobacco cigarette (Ky3R4F), and laboratory air/method blanks. All measurements were conducted by a contract research laboratory using ISO 17025 accredited methods. The data show that it is essential to conduct laboratory air/method measurements when measuring e-cigarette emissions, owing to the combination of low emissions and the associated impact of laboratory background that can lead to false-positive results and overestimates. Of the 150 measurands examined in the e-cigarette aerosol, 104 were not detected and 21 were present due to laboratory background. Of the 25 detected aerosol constituents, 9 were present at levels too low to be quantified and 16 were generated in whole or in part by the e-cigarette. These comprised major e-liquid constituents (nicotine, propylene glycol, and glycerol), recognized impurities in Pharmacopoeia-quality nicotine, and eight thermal decomposition products of propylene glycol or glycerol. By contrast, approximately 100 measurands were detected in mainstream cigarette smoke. Depending on the regulatory list considered and the puffing regime used, the emissions of toxicants identified for regulation were from 82 to >99% lower on a per-puff basis from the e-cigarette compared with those from Ky3R4F. Thus, the aerosol from the e-cigarette is compositionally less complex than cigarette smoke and contains significantly lower levels of toxicants. These data demonstrate that e-cigarettes can be developed that offer the potential

  17. An iterative kernel based method for fourth order nonlinear equation with nonlinear boundary condition

    NASA Astrophysics Data System (ADS)

    Azarnavid, Babak; Parand, Kourosh; Abbasbandy, Saeid

    2018-06-01

    This article discusses an iterative reproducing kernel method with respect to its effectiveness and capability for solving a fourth-order boundary value problem with nonlinear boundary conditions modeling beams on elastic foundations. Since there is no method of obtaining a reproducing kernel that satisfies nonlinear boundary conditions, standard reproducing kernel methods cannot be used directly to solve boundary value problems with nonlinear boundary conditions, as there is no knowledge about the existence and uniqueness of the solution. The aim of this paper is, therefore, to construct an iterative method through a combination of the reproducing kernel Hilbert space method and a shooting-like technique to solve the mentioned problems. Error estimation for reproducing kernel Hilbert space methods for nonlinear boundary value problems has yet to be discussed in the literature. In this paper, we present error estimation for the reproducing kernel method for solving nonlinear boundary value problems, probably for the first time. Some numerical results are given to demonstrate the applicability of the method.
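
    Only the shooting-like half of the method lends itself to a short illustration; the reproducing kernel Hilbert space component is omitted. The sketch below treats a placeholder fourth-order beam equation with linear conditions at one end and nonlinear conditions at the other, and solves for the unknown initial curvature and shear so that the nonlinear right-end conditions are met. The equation, boundary functions, and parameter values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import fsolve

    # Illustrative fourth-order problem (beam on an elastic foundation):
    #   u''''(x) = q(x) - k * u(x)   on [0, 1]
    # with linear conditions u(0) = u'(0) = 0 at the left end and, for illustration,
    # nonlinear conditions u''(1) = -u(1)**3 and u'''(1) = 0 at the right end.
    k_found, q = 1.0, lambda x: 1.0

    def rhs(x, y):
        # y = [u, u', u'', u''']
        return [y[1], y[2], y[3], q(x) - k_found * y[0]]

    def shoot(unknowns):
        """Residuals of the right-end boundary conditions as a function of the
        unknown initial curvature and shear (the shooting parameters)."""
        u2_0, u3_0 = unknowns
        sol = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0, u2_0, u3_0])
        u, up, upp, uppp = sol.y[:, -1]
        return [upp + u ** 3,      # u''(1) + u(1)^3 = 0
                uppp]              # u'''(1) = 0

    u2_0, u3_0 = fsolve(shoot, x0=[0.0, 0.0])
    solution = solve_ivp(rhs, (0.0, 1.0), [0.0, 0.0, u2_0, u3_0], max_step=0.01)
    ```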

  18. CLAss-Specific Subspace Kernel Representations and Adaptive Margin Slack Minimization for Large Scale Classification.

    PubMed

    Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan

    2018-02-01

    In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework that uses kernel models for sequential data processing. The framework is based on two components that both aim at enhancing classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm called adaptive margin slack minimization, which iteratively improves the classification accuracy through adaptive data selection. We motivate each part separately, and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: memory-efficient sequential processing for sequential data processing, and parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed approaches.

  19. Estimation of fire emissions from satellite-based measurements

    NASA Astrophysics Data System (ADS)

    Ichoku, C. M.; Kaufman, Y. J.

    2004-12-01

    Biomass burning is a worldwide phenomenon that regularly affects many vegetated parts of the globe. Fires emit large quantities of aerosol and trace gases into the atmosphere, thus influencing atmospheric chemistry and climate. Traditional methods of fire emissions estimation achieved only limited success, because they were based on peripheral information such as rainfall patterns, vegetation types and changes, agricultural practices, and surface ozone concentrations. During the last several years, rapid developments in satellite remote sensing have allowed more direct estimation of smoke emissions using remotely sensed fire data. However, current methods use fire pixel counts or burned areas, and thereby depend on the accuracy of independent estimates of the biomass fuel loadings, combustion efficiency, and emission factors. With the enhanced radiometric range of its 4-micron fire channel, the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor, which flies aboard both the Earth Observing System (EOS) Terra and Aqua satellites, is able to measure the rate of release of fire radiative energy (FRE) in MJ/s (something that older sensors could not do). MODIS also measures aerosol distribution. Taking advantage of these new resources, we have developed a procedure combining MODIS fire and aerosol products to derive FRE-based smoke emission coefficients (Ce, in kg/MJ) for different regions of the globe. These coefficients simply multiply the FRE from MODIS to yield the emitted smoke aerosol mass. Results from this novel methodology are very encouraging. For instance, it was found that the smoke total particulate mass emission coefficient for the Brazilian Cerrado ecosystem (approximately 0.022 kg/MJ) is about twice the value for North America or Australia, but about 50 percent lower than the value for Zambia in southern Africa.

  20. Estimation of Fire Emissions from Satellite-Based Measurements

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Kaufman, Yoram J.

    2004-01-01

    Biomass burning is a worldwide phenomenon that regularly affects many vegetated parts of the globe. Fires emit large quantities of aerosol and trace gases into the atmosphere, thus influencing atmospheric chemistry and climate. Traditional methods of fire emissions estimation achieved only limited success, because they were based on peripheral information such as rainfall patterns, vegetation types and changes, agricultural practices, and surface ozone concentrations. During the last several years, rapid developments in satellite remote sensing have allowed more direct estimation of smoke emissions using remotely sensed fire data. However, current methods use fire pixel counts or burned areas, and thereby depend on the accuracy of independent estimates of the biomass fuel loadings, combustion efficiency, and emission factors. With the enhanced radiometric range of its 4-micron fire channel, the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor, which flies aboard both the Earth Observing System (EOS) Terra and Aqua satellites, is able to measure the rate of release of fire radiative energy (FRE) in MJ/s (something that older sensors could not do). MODIS also measures aerosol distribution. Taking advantage of these new resources, we have developed a procedure combining MODIS fire and aerosol products to derive FRE-based smoke emission coefficients (Ce, in kg/MJ) for different regions of the globe. These coefficients simply multiply the FRE from MODIS to yield the emitted smoke aerosol mass. Results from this novel methodology are very encouraging. For instance, it was found that the smoke total particulate mass emission coefficient for the Brazilian Cerrado ecosystem (approximately 0.022 kg/MJ) is about twice the value for North America, Western Europe, or Australia, but about 50% lower than the value for southern Africa.
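
    The emission estimate described in these two records reduces to multiplying the regional coefficient Ce (kg/MJ) by the fire radiative energy, obtained by integrating the measured fire radiative power over time. A minimal sketch, with an illustrative one-hour FRP record and the Cerrado coefficient quoted above, is:

    ```python
    import numpy as np

    def smoke_mass_from_frp(times_s, frp_mw, ce_kg_per_mj):
        """Emitted smoke aerosol mass (kg) = Ce (kg/MJ) * FRE (MJ),
        with FRE obtained by integrating fire radiative power over time.
        frp_mw is given in MW = MJ/s, so the time integral is already in MJ."""
        fre_mj = np.trapz(frp_mw, times_s)
        return ce_kg_per_mj * fre_mj

    # Example with Ce ~ 0.022 kg/MJ (Cerrado, as quoted in the abstract)
    # and an illustrative one-hour FRP record.
    t = np.linspace(0, 3600, 7)                    # seconds
    frp = np.array([0, 50, 120, 200, 150, 60, 0])  # MW, i.e. MJ/s (illustrative values)
    print(smoke_mass_from_frp(t, frp, 0.022), "kg of smoke aerosol")
    ```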