Advanced Optimal Extraction for the Spitzer/IRS
NASA Astrophysics Data System (ADS)
Lebouteiller, V.; Bernard-Salas, J.; Sloan, G. C.; Barry, D. J.
2010-02-01
We present new advances in the spectral extraction of pointlike sources adapted to the Infrared Spectrograph (IRS) on board the Spitzer Space Telescope. For the first time, we created a supersampled point-spread function of the low-resolution modules. We describe how to use the point-spread function to perform optimal extraction of a single source and of multiple sources within the slit. We also examine the case of the optimal extraction of one or several sources with a complex background. The new algorithms are gathered in a plug-in called AdOpt which is part of the SMART data analysis software.
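The abstract describes PSF-based optimal extraction of one or several blended sources; as a rough illustration of that idea (not the AdOpt implementation), the sketch below fits multiple known spatial profiles plus a flat background at each wavelength by weighted linear least squares. All names and the data layout are hypothetical.

```python
import numpy as np

def extract_blended_sources(frame, variance, profiles):
    """Fit several point sources plus a flat background at each wavelength.

    frame, variance : 2-D arrays (n_wave, n_spatial) of counts and variances.
    profiles        : 3-D array (n_src, n_wave, n_spatial); each PSF
                      cross-section is normalized to unit sum per wavelength.
    Returns fluxes (n_src, n_wave) and a background level per wavelength
    from a weighted linear least-squares fit, in the spirit of PSF-based
    optimal extraction of blended sources.
    """
    n_src, n_wave, n_spatial = profiles.shape
    fluxes = np.zeros((n_src, n_wave))
    background = np.zeros(n_wave)
    for j in range(n_wave):
        # Design matrix: one column per source profile plus a constant term.
        A = np.column_stack([profiles[k, j] for k in range(n_src)]
                            + [np.ones(n_spatial)])
        w = 1.0 / np.sqrt(variance[j])          # inverse-sigma weights
        coeffs, *_ = np.linalg.lstsq(A * w[:, None], frame[j] * w, rcond=None)
        fluxes[:, j] = coeffs[:n_src]
        background[j] = coeffs[-1]
    return fluxes, background
```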
Mainhagu, Jon; Morrison, C.; Truex, Michael J.; ...
2014-08-05
A method termed vapor-phase tomography has recently been proposed to characterize the distribution of volatile organic contaminant mass in vadose-zone source areas and to measure associated three-dimensional distributions of local contaminant mass discharge. The method is based on measuring the spatial variability of vapor flux; inherent to its effectiveness is the premise that the magnitudes and temporal variability of vapor concentrations measured at different monitoring points within the interrogated area will be a function of the geospatial positions of the points relative to the source location. A series of flow-cell experiments was conducted to evaluate this premise. A well-defined source zone was created by injection and extraction of a non-reactive gas (SF6). Spatial and temporal concentration distributions obtained from the tests were compared to simulations produced with a mathematical model describing advective and diffusive transport. Tests were conducted to characterize both areal and vertical components of the application. Decreases in concentration over time were observed for monitoring points located on the opposite side of the source zone from the local extraction point, whereas increases were observed for monitoring points located between the local extraction point and the source zone. The results illustrate that comparison of temporal concentration profiles obtained at various monitoring points gives a general indication of the source location with respect to the extraction and monitoring points.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... ENVIRONMENTAL PROTECTION AGENCY [FRL-9650-8] Draft NPDES General Permit for Discharges From the Oil and Gas Extraction Point Source Category to Coastal Waters in Texas (TXG330000) AGENCY: Environmental Protection Agency (EPA). ACTION: Proposal of NPDES General Permit Renewal. SUMMARY: EPA Region 6...
Innovations in the Analysis of Chandra-ACIS Observations
NASA Astrophysics Data System (ADS)
Broos, Patrick S.; Townsley, Leisa K.; Feigelson, Eric D.; Getman, Konstantin V.; Bauer, Franz E.; Garmire, Gordon P.
2010-05-01
As members of the instrument team for the Advanced CCD Imaging Spectrometer (ACIS) on NASA's Chandra X-ray Observatory and as Chandra General Observers, we have developed a wide variety of data analysis methods that we believe are useful to the Chandra community, and have constructed a significant body of publicly available software (the ACIS Extract package) addressing important ACIS data and science analysis tasks. This paper seeks to describe these data analysis methods for two purposes: to document the data analysis work performed in our own science projects and to help other ACIS observers judge whether these methods may be useful in their own projects (regardless of what tools and procedures they choose to implement those methods). The ACIS data analysis recommendations we offer here address much of the workflow in a typical ACIS project, including data preparation, point source detection via both wavelet decomposition and image reconstruction, masking point sources, identification of diffuse structures, event extraction for both point and diffuse sources, merging extractions from multiple observations, nonparametric broadband photometry, analysis of low-count spectra, and automation of these tasks. Many of the innovations presented here arise from several, often interwoven, complications that are found in many Chandra projects: large numbers of point sources (hundreds to several thousand), faint point sources, misaligned multiple observations of an astronomical field, point source crowding, and scientifically relevant diffuse emission.
Light extraction block with curved surface
Levermore, Peter; Krall, Emory; Silvernail, Jeffrey; Rajan, Kamala; Brown, Julia J.
2016-03-22
Light extraction blocks, and OLED lighting panels using light extraction blocks, are described, in which the light extraction blocks include various curved shapes that provide improved light extraction properties compared to a parallel emissive surface, and a thinner form factor and better light extraction than a hemisphere. Lighting systems described herein may include a light source with an OLED panel. A light extraction block with a three-dimensional light emitting surface may be optically coupled to the light source. The three-dimensional light emitting surface of the block may include a substantially curved surface, with further characteristics related to the curvature of the surface at given points. A first radius of curvature corresponding to a maximum principal curvature k1 at a point p on the substantially curved surface may be greater than a maximum height of the light extraction block. A maximum height of the light extraction block may be less than 50% of a maximum width of the light extraction block. Surfaces with cross sections made up of line segments and inflection points may also be fit to approximated curves for calculating the radius of curvature.
Time-frequency approach to underdetermined blind source separation.
Xie, Shengli; Yang, Liu; Yang, Jun-Mei; Zhou, Guoxu; Xiang, Yong
2012-02-01
This paper presents a new time-frequency (TF) underdetermined blind source separation approach based on the Wigner-Ville distribution (WVD) and the Khatri-Rao product to separate N non-stationary sources from M (M < N) mixtures. First, an improved method is proposed for estimating the mixing matrix, in which the negative values of the auto WVD of the sources are fully considered. Then, after extracting all the auto-term TF points, the auto WVD value of the sources at every auto-term TF point can be determined exactly with the proposed approach, regardless of how many sources are active, as long as N ≤ 2M-1. The extraction of auto-term TF points is discussed further, and numerical simulation results are presented to show the superiority of the proposed algorithm over existing ones.
40 CFR 439.21 - Special definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... STANDARDS PHARMACEUTICAL MANUFACTURING POINT SOURCE CATEGORY Extraction Products § 439.21 Special definitions. For the purpose of this subpart: (a) Extraction means process operations that derive pharmaceutically active ingredients from natural sources such as plant roots and leaves, animal glands, and...
40 CFR 439.21 - Special definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... STANDARDS PHARMACEUTICAL MANUFACTURING POINT SOURCE CATEGORY Extraction Products § 439.21 Special definitions. For the purpose of this subpart: (a) Extraction means process operations that derive pharmaceutically active ingredients from natural sources such as plant roots and leaves, animal glands, and...
The VLITE Post-Processing Pipeline
NASA Astrophysics Data System (ADS)
Richards, Emily E.; Clarke, Tracy; Peters, Wendy; Polisensky, Emil; Kassim, Namir E.
2018-01-01
A post-processing pipeline to adaptively extract and catalog point sources is being developed to enhance the scientific value and accessibility of data products generated by the VLA Low-band Ionosphere and Transient Experiment (VLITE;
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-02
... ENVIRONMENTAL PROTECTION AGENCY [FRL-9903-65-Region-6] Draft NPDES General Permit Modification for Discharges From the Oil and Gas Extraction Point Source Category to Coastal Waters in Texas and Onshore Stripper Well Category East of The 98th Meridian (TXG330000) AGENCY: Environmental Protection Agency (EPA...
Characterization of mercury contamination in the Androscoggin River, Coos County, New Hampshire
Chalmers, Ann; Marvin-DiPasquale, Mark C.; Degnan, James R.; Coles, James; Agee, Jennifer L.; Luce, Darryl
2013-01-01
Concentrations of total mercury (THg) and methylmercury (MeHg) in sediment, pore water, and biota in the Androscoggin River were elevated downstream from the former chloralkali facility compared with those at upstream reference sites. Sequential extraction of surface sediment showed a distinct difference in Hg speciation upstream compared with downstream from the contamination site. An upstream site was dominated by potassium hydroxide-extractable forms (for example, organic Hg or particle-bound Hg(II)), whereas sites downstream from the point source were dominated by more chemically recalcitrant forms (largely concentrated nitric acid-extractable), indicative of elemental mercury or mercurous chloride. At all sites, only a minor fraction (less than 0.1 percent) of THg existed in chemically labile forms (for example, water extractable or weak acid extractable). All metrics indicated that a greater percentage of mercury at an upstream site was available for Hg(II)-methylation compared with sites downstream from the point source, but the absolute concentration of bioavailable Hg(II) was greater downstream from the point source. In addition, the concentration of tin-reducible inorganic reactive mercury, a surrogate measure of bioavailable Hg(II), generally increased with distance downstream from the point source. Whereas concentrations of mercury species on a sediment-dry-weight basis generally reflected the relative location of the sample to the point source, river-reach-integrated mercury-species inventories and MeHg production potential (MPP) rates reflected the amount of fine-grained sediment in a given reach. THg concentrations in biota were significantly higher downstream from the point source compared with upstream reference sites for smallmouth bass, white sucker, crayfish, oligochaetes, bat fur, nestling tree swallow blood and feathers, adult tree swallow blood, and tree swallow eggs. As with tin-reducible inorganic reactive mercury, THg in smallmouth bass also increased with distance downstream from the point source. Toxicity tests and invertebrate community assessments suggested that invertebrates were not impaired at the current (2009 and 2010) levels of mercury contamination downstream from the point source. Concentrations of THg and MeHg in most water and sediment samples from the Androscoggin River were below U.S. Environmental Protection Agency (USEPA), Canadian Council of Ministers of the Environment, and probable effects level guidelines. Surface-water and sediment samples from the Androscoggin River had similar THg concentrations but lower MeHg concentrations compared with other rivers in the region. Concentrations of THg in fish tissue were all above regional and USEPA guidelines. Moreover, median THg concentrations in smallmouth bass from the Androscoggin River were significantly higher than those reported in regional surveys of rivers and streams nationwide and in the Northeastern United States and Canada. The higher concentrations of mercury in smallmouth bass suggest conditions may be more favorable for Hg(II)-methylation and bioaccumulation in the Androscoggin River compared with many other rivers in the United States and Canada.
HerMES: point source catalogues from Herschel-SPIRE observations II
NASA Astrophysics Data System (ADS)
Wang, L.; Viero, M.; Clarke, C.; Bock, J.; Buat, V.; Conley, A.; Farrah, D.; Guo, K.; Heinis, S.; Magdis, G.; Marchetti, L.; Marsden, G.; Norberg, P.; Oliver, S. J.; Page, M. J.; Roehlly, Y.; Roseboom, I. G.; Schulz, B.; Smith, A. J.; Vaccari, M.; Zemcov, M.
2014-11-01
The Herschel Multi-tiered Extragalactic Survey (HerMES) is the largest Guaranteed Time Key Programme on the Herschel Space Observatory. With a wedding-cake survey strategy, it consists of nested fields with varying depth and area totalling ~380 deg2. In this paper, we present deep point source catalogues extracted from Herschel-Spectral and Photometric Imaging Receiver (SPIRE) observations of all HerMES fields, except for the later addition of the 270 deg2 HerMES Large-Mode Survey (HeLMS) field. These catalogues constitute the second Data Release (DR2) made in 2013 October. A subset of these catalogues, consisting of bright sources extracted from Herschel-SPIRE observations completed by 2010 May 1 (covering ~74 deg2), was released earlier in the first extensive data release in 2012 March. Two different methods are used to generate the point source catalogues: the SUSSEXTRACTOR point source extractor used in two earlier data releases (EDR and EDR2), and a new source detection and photometry method. The latter combines an iterative source detection algorithm, STARFINDER, and a De-blended SPIRE Photometry algorithm. We use end-to-end Herschel-SPIRE simulations with realistic number counts and clustering properties to characterize basic properties of the point source catalogues, such as the completeness, reliability, and photometric and positional accuracy. Over 500 000 catalogue entries in HerMES fields (except HeLMS) are released to the public through the HeDAM (Herschel Database in Marseille) website (http://hedam.lam.fr/HerMES).
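Completeness is characterized here with end-to-end simulations; a generic, hypothetical sketch of the underlying idea (injecting artificial point sources into a map and counting recoveries per flux bin) is given below. The `detect` callable stands in for whatever extraction pipeline is used and is not part of the HerMES software.

```python
import numpy as np

def completeness_curve(sky_map, psf, detect, flux_bins, n_trials=200,
                       match_radius=2.0, rng=None):
    """Estimate catalogue completeness by injecting fake point sources.

    sky_map : 2-D image into which sources are injected.
    psf     : small 2-D kernel (sums to 1) stamped at the injection position.
    detect  : callable returning an (N, 2) array of (y, x) detections;
              stands in for the real extraction pipeline.
    Returns the recovered fraction per flux bin.
    """
    rng = np.random.default_rng() if rng is None else rng
    ny, nx = sky_map.shape
    ky, kx = psf.shape
    recovered = np.zeros(len(flux_bins))
    for i, flux in enumerate(flux_bins):
        hits = 0
        for _ in range(n_trials):
            y = rng.integers(ky, ny - ky)
            x = rng.integers(kx, nx - kx)
            test = sky_map.copy()
            test[y:y + ky, x:x + kx] += flux * psf   # add a fake source
            positions = detect(test)
            centre = np.array([y + ky // 2, x + kx // 2])
            if len(positions) and np.min(
                    np.hypot(*(positions - centre).T)) < match_radius:
                hits += 1
        recovered[i] = hits / n_trials
    return recovered
```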
40 CFR 435.60 - Applicability; description of the stripper subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Stripper... with recognized conservation practices. These facilities are engaged in production, and well treatment in the oil and gas extraction industry. ...
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Offshore... 40 CFR 125.30-32, any existing point source subject to this subpart must achieve the following... Minimum of 1 mg/l and maintained as close to this concentration as possible. Sanitary M91M Floating solids...
Convex Hull Aided Registration Method (CHARM).
Fan, Jingfan; Yang, Jian; Zhao, Yitian; Ai, Danni; Liu, Yonghuai; Wang, Ge; Wang, Yongtian
2017-09-01
Non-rigid registration finds many applications such as photogrammetry, motion tracking, model retrieval, and object recognition. In this paper we propose a novel convex hull aided registration method (CHARM) to match two point sets subject to a non-rigid transformation. First, two convex hulls are extracted from the source and target respectively. Then, all points of the point sets are projected onto the reference plane through each triangular facet of the hulls. From these projections, invariant features are extracted and matched optimally. The matched feature point pairs are mapped back onto the triangular facets of the convex hulls to remove outliers that are outside any relevant triangular facet. The rigid transformation from the source to the target is robustly estimated by the random sample consensus (RANSAC) scheme through minimizing the distance between the matched feature point pairs. Finally, these feature points are utilized as the control points to achieve non-rigid deformation in the form of thin-plate spline of the entire source point set towards the target one. The experimental results based on both synthetic and real data show that the proposed algorithm outperforms several state-of-the-art ones with respect to sampling, rotational angle, and data noise. In addition, the proposed CHARM algorithm also shows higher computational efficiency compared to these methods.
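This is not the authors' CHARM code, but a minimal sketch of two of its ingredients: extracting convex-hull vertices with SciPy and estimating a rigid transform from already-matched point pairs with the SVD-based Kabsch method. The RANSAC outlier-rejection loop and the final thin-plate-spline deformation are omitted.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_vertices(points):
    """Return the convex-hull vertices of an (N, 3) point set."""
    return points[ConvexHull(points).vertices]

def rigid_from_matches(src, dst):
    """Least-squares rigid transform (R, t) mapping matched src -> dst,
    estimated with the SVD-based Kabsch method."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t
```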
Speech-Message Extraction from Interference Introduced by External Distributed Sources
NASA Astrophysics Data System (ADS)
Kanakov, V. A.; Mironov, N. A.
2017-08-01
This study addresses the extraction of a speech signal originating from a given spatial point and the calculation of the intelligibility of the extracted voice message. The problem is solved by reducing the influence of interfering speech-message sources on the extracted signal. The method is based on introducing time delays, which depend on the spatial coordinates, into the recording channels. Audio recordings of the voices of eight different people were used as test objects during the study. The results show that increasing the number of microphones improves the intelligibility of the speech message extracted from the interference.
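The coordinate-dependent delay compensation described above is, in essence, delay-and-sum beamforming. A minimal sketch follows, assuming free-field propagation, known microphone positions, and a single focus point; it is illustrative only and not the authors' method.

```python
import numpy as np

def delay_and_sum(signals, mic_positions, focus_point, fs, c=343.0):
    """Steer a microphone array to one spatial point by delay compensation.

    signals       : (n_mics, n_samples) array of recordings.
    mic_positions : (n_mics, 3) coordinates in metres.
    focus_point   : (3,) coordinates of the talker being extracted.
    fs            : sample rate in Hz; c is the speed of sound in m/s.
    """
    dists = np.linalg.norm(mic_positions - focus_point, axis=1)
    delays = (dists - dists.min()) / c          # relative propagation delays
    n = signals.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    out = np.zeros(n)
    for sig, tau in zip(signals, delays):
        # Advance each channel by its delay with a linear phase shift,
        # so the contribution from the focus point adds coherently.
        spectrum = np.fft.rfft(sig) * np.exp(2j * np.pi * freqs * tau)
        out += np.fft.irfft(spectrum, n)
    return out / len(signals)
```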
40 CFR 439.27 - Pretreatment standards for new sources (PSNS).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for new sources (PSNS). 439.27 Section 439.27 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PHARMACEUTICAL MANUFACTURING POINT SOURCE CATEGORY Extraction Products...
40 CFR 439.26 - Pretreatment standards for existing sources (PSES).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards for existing sources (PSES). 439.26 Section 439.26 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PHARMACEUTICAL MANUFACTURING POINT SOURCE CATEGORY Extraction Products...
Code of Federal Regulations, 2010 CFR
2010-07-01
... PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY... provided in 40 CFR 125.30-32, any existing point source subject to this subpart must achieve the following... maintained as close to this concentration as possible. 3 There shall be no floating solids as a result of the...
40 CFR 439.25 - New source performance standards (NSPS).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false New source performance standards (NSPS). 439.25 Section 439.25 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS PHARMACEUTICAL MANUFACTURING POINT SOURCE CATEGORY Extraction Products § 439.25 New...
40 CFR 435.45 - Standards of performance for new sources (NSPS).
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Standards of performance for new sources (NSPS). 435.45 Section 435.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
40 CFR 435.45 - Standards of performance for new sources (NSPS).
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Standards of performance for new sources (NSPS). 435.45 Section 435.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
40 CFR 435.45 - Standards of performance for new sources (NSPS).
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Standards of performance for new sources (NSPS). 435.45 Section 435.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fortgang, C. M., E-mail: cfortgang@lanl.gov; Batygin, Y. K.; Draganic, I. N.
The 750-keV H+ Cockcroft-Walton at LANSCE will be replaced with a recently fabricated 4-rod Radio Frequency Quadrupole (RFQ) with an injection energy of 35 keV. The existing duoplasmatron source extraction optics need to be modified to produce up to 35 mA of H+ current with an emittance <0.02 π-cm-mrad (rms, norm) for injection into the RFQ. Parts for the new source have been fabricated and assembly is in process. We will use the existing duoplasmatron source with a newly designed extraction system and low energy beam transport (LEBT) for beam injection into the RFQ. In addition to source modifications, we need a new LEBT for transport and matching into the RFQ. The LEBT uses two magnetic solenoids with enough drift space between them to accommodate diagnostics and a beam deflector. The LEBT is designed to work over a range of space-charge-neutralized currents and emittances. The LEBT is optimized in the sense that it minimizes the beam size in both solenoids for a point design of a given neutralized current and emittance. Special attention has been given to estimating emittance growth due to source extraction optics and solenoid aberrations. Examples of source-to-RFQ matching and emittance growth (due to both non-linear space charge and solenoid aberrations) are presented over a range of currents and emittances about the design point. A mechanical layout drawing will be presented along with the status of the source and LEBT design and fabrication.
Hanjabam, Mandakini Devi; Kannaiyan, Sathish Kumar; Kamei, Gaihiamngam; Jakhar, Jitender Kumar; Chouksey, Mithlesh Kumar; Gudipati, Venkateshwarlu
2015-02-01
Physical properties of gelatin extracted from Unicorn leatherjacket (Aluterus monoceros) skin, which is generated as waste by fish processing industries, were optimised using Response Surface Methodology (RSM). A Box-Behnken design was used to study the combined effects of three independent variables, namely phosphoric acid (H3PO4) concentration (0.15-0.25 M), extraction temperature (40-50 °C), and extraction time (4-12 h), on responses such as yield, gel strength, and melting point of the gelatin. The optimum conditions derived by RSM for the yield (10.58%) were 0.2 M H3PO4, 9.01 h of extraction time, and hot-water extraction at 45.83 °C. The maximum achieved gel strength and melting point were 138.54 g and 22.61 °C, respectively. Extraction time was found to be the most influential variable, with a positive coefficient on yield and negative coefficients on gel strength and melting point. The results indicate that Unicorn leatherjacket skins can be a source of gelatin having mild gel strength and melting point.
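A Box-Behnken RSM analysis ultimately reduces to fitting a second-order polynomial to the measured responses and locating its stationary point. The sketch below shows that step with plain numpy least squares for three factors; it is a generic illustration, not the software or the coefficients used in the study.

```python
import numpy as np

def fit_quadratic_surface(X, y):
    """Fit y ~ b0 + sum b_i x_i + sum b_ii x_i^2 + sum b_ij x_i x_j
    for three factors (e.g. acid concentration, temperature, time)."""
    x1, x2, x3 = X.T
    A = np.column_stack([np.ones(len(y)), x1, x2, x3,
                         x1**2, x2**2, x3**2,
                         x1*x2, x1*x3, x2*x3])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def stationary_point(coef):
    """Solve grad = 0 for the fitted quadratic (candidate optimum)."""
    b = coef[1:4]
    Q = np.array([[2*coef[4], coef[7],   coef[8]],
                  [coef[7],   2*coef[5], coef[9]],
                  [coef[8],   coef[9],   2*coef[6]]])
    return np.linalg.solve(Q, -b)
```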
40 CFR 435.45 - Standards of performance for new sources (NSPS).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Standards of performance for new sources (NSPS). 435.45 Section 435.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal Subcategory § 435.45...
40 CFR 435.47 - Pretreatment standards of performance for new sources (PSNS).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards of performance for new sources (PSNS). 435.47 Section 435.47 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
40 CFR 435.47 - Pretreatment standards of performance for new sources (PSNS).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Pretreatment standards of performance for new sources (PSNS). 435.47 Section 435.47 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
40 CFR 435.45 - Standards of performance for new sources (NSPS).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Standards of performance for new sources (NSPS). 435.45 Section 435.45 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal Subcategory § 435.45...
40 CFR 435.46 - Pretreatment standards of performance for existing sources (PSES).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Pretreatment standards of performance for existing sources (PSES). 435.46 Section 435.46 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
40 CFR 435.46 - Pretreatment standards of performance for existing sources (PSES).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Pretreatment standards of performance for existing sources (PSES). 435.46 Section 435.46 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
The Herschel-SPIRE Point Source Catalog Version 2
NASA Astrophysics Data System (ADS)
Schulz, Bernhard; Marton, Gábor; Valtchanov, Ivan; María Pérez García, Ana; Pintér, Sándor; Appleton, Phil; Kiss, Csaba; Lim, Tanya; Lu, Nanyao; Papageorgiou, Andreas; Pearson, Chris; Rector, John; Sánchez Portal, Miguel; Shupe, David; Tóth, Viktor L.; Van Dyk, Schuyler; Varga-Verebélyi, Erika; Xu, Kevin
2018-01-01
The Herschel-SPIRE instrument mapped about 8% of the sky in submillimeter broad-band filters centered at 250, 350, and 500 microns (1199, 857, 600 GHz) with spatial resolutions of 17.9”, 24.2”, and 35.4”, respectively. Here we present the second version of the SPIRE Point Source Catalog (SPSC). Stacking on WISE 22 micron catalog sources led to the identification of 108 maps, out of 6878, that had astrometry offsets greater than 5”. After fixing these deviations and re-deriving all affected map mosaics, we repeated the systematic and homogeneous source extraction performed on all maps, using an improved version of the four photometry extraction methods already employed in the generation of the first-version catalog. Only regions affected by strong Galactic emission, mostly in the Galactic Plane, were excluded, as they exceeded the limits of the available source extraction methods. Aimed primarily at point sources, which allow for the best photometric accuracy, the catalog also contains a significant fraction of slightly extended sources. With most SPIRE maps being confusion limited, uncertainties in flux densities were established as a function of structure noise and flux density, based on the results of artificial-source-insertion experiments into real data over a range of celestial backgrounds. Sources that do not pass the imposed SNR threshold have been rejected, especially at flux densities approaching the extragalactic confusion limit. A range of additional flags provides information on the reliability of the flux information, as well as the spatial extent and orientation of a source. The catalog should be particularly helpful for determining cold dust content in extragalactic and galactic sources with low to moderate background confusion. We present an overview of catalog construction, detailed content, and validation results, with a focus on the improvements achieved in the second version, which is soon to be released.
76 FR 66286 - Notice of Final 2010 Effluent Guidelines Program Plan
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-26
... Coalbed Methane Extraction (CBM) industry and will develop pretreatments requirements for discharges of...) industry. EPA is also issuing the detailed study report for the Coalbed Methane Extraction and the... Methane Point Source Category: Detailed Study Report, EPA-820-R-10-022, DCN 09999; Draft Guidance Document...
Jelicic Kadic, Antonia; Vucic, Katarina; Dosenovic, Svjetlana; Sapunar, Damir; Puljak, Livia
2016-06-01
To compare the speed and accuracy of graphical data extraction using manual estimation and open source software. Data points from eligible graphs/figures published in randomized controlled trials (RCTs) from 2009 to 2014 were extracted by two authors independently, both by manual estimation and with Plot Digitizer, an open source software package. Corresponding authors of each RCT were contacted up to four times via e-mail to obtain the exact numbers that were used to create the graphs. The accuracy of each method was compared against the source data from which the original graphs were produced. Software data extraction was significantly faster, reducing extraction time by 47%. Percent agreement between the two raters was 51% for manual and 53.5% for software data extraction. Percent agreement between the raters and the original data was 66% vs. 75% for the first rater and 69% vs. 73% for the second rater, for manual and software extraction, respectively. Data extraction from figures should be conducted using software, whereas manual estimation should be avoided. Using software for extraction of data presented only in figures is faster and enables higher interrater reliability. Copyright © 2016 Elsevier Inc. All rights reserved.
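The abstract does not say how agreement was computed; purely as an illustration, a percent-agreement calculation within a relative tolerance might look like the following (the tolerance and example numbers are hypothetical, not values from the study).

```python
import numpy as np

def percent_agreement(values_a, values_b, rel_tol=0.05):
    """Fraction of extracted data points on which two passes agree
    to within a relative tolerance (5% by default)."""
    a, b = np.asarray(values_a, float), np.asarray(values_b, float)
    scale = np.maximum(np.abs(a), np.abs(b))
    scale[scale == 0] = 1.0                     # avoid division by zero
    agree = np.abs(a - b) / scale <= rel_tol
    return 100.0 * agree.mean()

# Example: manual estimates vs. software readings of the same graph points.
manual   = [12.1, 8.4, 15.0, 22.7]
software = [12.0, 8.9, 15.1, 22.5]
print(percent_agreement(manual, software))
```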
Halfon, Philippe; Ouzan, Denis; Khiri, Hacène; Pénaranda, Guillaume; Castellani, Paul; Oulès, Valerie; Kahloun, Asma; Amrani, Nolwenn; Fanteria, Lise; Martineau, Agnès; Naldi, Lou; Bourlière, Marc
2012-01-01
Background & Aims: Point mutations in the coding region of the interleukin 28 gene (rs12979860) have recently been identified as predicting the outcome of treatment of hepatitis C virus infection. This polymorphism detection was based on whole-blood DNA extraction. Alternatively, DNA for genetic diagnosis has been derived from buccal epithelial cells (BEC), dried blood spots (DBS), and genomic DNA from serum. The aim of the study was to investigate the reliability and accuracy of alternative routes of testing for single nucleotide polymorphism allele rs12979860CC. Methods: Blood, plasma, and sera samples from 200 patients were extracted (400 µL). Buccal smears were tested using an FTA card. To simulate postal delay, we tested the influence of storage at ambient temperature on the different sources of DNA at five time points (baseline, 48 h, 6 days, 9 days, and 12 days). Results: There was 100% concordance between blood, plasma, sera, and BEC, validating the use of DNA extracted from BEC collected on cytology brushes for genetic testing. Genetic variations in the HPTR1 gene were detected using the smear technique in blood smears (3620 copies) as well as in buccal smears (5870 copies). These results are similar to those for whole blood diluted at 1/10. A minimum of 0.04 µL, 4 µL, and 40 µL was necessary to obtain exploitable results for whole blood, sera, and plasma, respectively. No significant variation between time points was observed for the different sources of DNA. IL28B SNP analysis at these different time points showed the same results using the four sources of DNA. Conclusion: We demonstrated that genomic DNA extraction from buccal cells, small amounts of serum, and dried blood spots is an alternative to DNA extracted from peripheral blood cells and is helpful in retrospective and prospective studies of multiple genetic markers, specifically in hard-to-reach individuals. PMID:22412970
SpecOp: Optimal Extraction Software for Integral Field Unit Spectrographs
NASA Astrophysics Data System (ADS)
McCarron, Adam; Ciardullo, Robin; Eracleous, Michael
2018-01-01
The Hobby-Eberly Telescope’s new low resolution integral field spectrographs, LRS2-B and LRS2-R, each cover a 12”x6” area on the sky with 280 fibers and generate spectra with resolutions between R=1100 and R=1900. To extract 1-D spectra from the instrument’s 3D data cubes, a program is needed that is flexible enough to work for a wide variety of targets, including continuum point sources, emission line sources, and compact sources embedded in complex backgrounds. We therefore introduce SpecOp, a user-friendly python program for optimally extracting spectra from integral-field unit spectrographs. As input, SpecOp takes a sky-subtracted data cube consisting of images at each wavelength increment set by the instrument’s spectral resolution, and an error file for each count measurement. All of these files are generated by the current LRS2 reduction pipeline. The program then collapses the cube in the image plane using the optimal extraction algorithm detailed by Keith Horne (1986). The various user-selected options include the fraction of the total signal enclosed in a contour-defined region, the wavelength range to analyze, and the precision of the spatial profile calculation. SpecOp can output the weighted counts and errors at each wavelength in various table formats using python’s astropy package. We outline the algorithm used for extraction and explain how the software can be used to easily obtain high-quality 1-D spectra. We demonstrate the utility of the program by applying it to spectra of a variety of quasars and AGNs. In some of these targets, we extract the spectrum of a nuclear point source that is superposed on a spatially extended galaxy.
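SpecOp itself is not reproduced here; the following is a minimal sketch of the Horne (1986) optimal-extraction weighting applied to a sky-subtracted cube, which is the core operation the abstract describes. The function names, the assumption of a wavelength-independent spatial profile, and the omission of the contour-defined aperture and table output are simplifications.

```python
import numpy as np

def optimal_extract_cube(cube, variance, profile):
    """Collapse an IFU cube to a 1-D spectrum with inverse-variance,
    profile-weighted (Horne 1986 style) sums.

    cube, variance : (n_wave, ny, nx) sky-subtracted data and its variance.
    profile        : (ny, nx) spatial profile of the source, assumed
                     wavelength-independent here; normalized to unit sum.
    """
    P = profile / profile.sum()
    w = P / variance                              # optimal weights P/V
    num = (w * cube).sum(axis=(1, 2))             # sum of P*D/V per wavelength
    den = (w * P).sum(axis=(1, 2))                # sum of P^2/V per wavelength
    flux = num / den
    flux_var = P.sum() / den                      # propagated variance (sum P = 1)
    return flux, flux_var
```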
Ultracompact/ultralow power electron cyclotron resonance ion source for multipurpose applications.
Sortais, P; Lamy, T; Médard, J; Angot, J; Latrasse, L; Thuillier, T
2010-02-01
In order to drastically reduce the power consumption of a microwave ion source, we have studied specific discharge cavity geometries in order to reduce the operating point below 1 W of microwave power (at 2.45 GHz). We show that it is possible to drive an electron cyclotron resonance ion source with a transmitter technology similar to that used for cellular phones. By reducing the size and the required microwave power, we have developed a new type of ultralow-cost ion source. This microwave discharge system (called COMIC, for COmpact MIcrowave and Coaxial) can be used as a source of light, plasma, or ions. We will show geometries of conductive cavities where it is possible, in a 20 mm diameter chamber, to reduce the ignition of the plasma below 100 mW and define typical operating points around 5 W. Inside a simple vacuum chamber it is easy to place the source and its extraction system anywhere, fully under vacuum. In that case, current densities from 0.1 to 10 mA/cm² (Ar, extraction 4 mm, 1 mAe, 20 kV) have been observed. Preliminary measurements and calculations show the possibility, with a two-electrode system, of extracting beams with low emittance. The first applications for these ion sources are ion injection for charge breeding, surface analysis systems, and surface treatment. For this purpose, a very small extraction hole is used (typically 3/10 mm for a 3 µA extracted current with 2 W of HF power). Mass spectrum and emittance measurements will be presented. Under these conditions, values down to 1 π mm mrad at 15 kV (1σ) are observed, very close to those currently observed for a surface ionization source. A major interest of this approach is the possibility of connecting several COMIC devices together. We will introduce some new on-going developments such as sources for high-voltage implantation platforms, a fully-quartz radioactive ion source at ISOLDE, and large plasma generators for plasma immersion and broad or ribbon beam generation.
40 CFR 439.20 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-07-01
... PHARMACEUTICAL MANUFACTURING POINT SOURCE CATEGORY Extraction Products § 439.20 Applicability. This subpart applies to discharges of process wastewater resulting from the manufacture of pharmaceutical products by...
Wavelet-based techniques for the gamma-ray sky
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDermott, Samuel D.; Fox, Patrick J.; Cholis, Ilias
2016-07-01
Here, we demonstrate how the image analysis technique of wavelet decomposition can be applied to the gamma-ray sky to separate emission on different angular scales. New structures on scales that differ from the scales of the conventional astrophysical foreground and background uncertainties can be robustly extracted, allowing a model-independent characterization with no presumption of exact signal morphology. As a test case, we generate mock gamma-ray data to demonstrate our ability to extract extended signals without assuming a fixed spatial template. For some point source luminosity functions, our technique also allows us to differentiate a diffuse signal in gamma-rays from dark matter annihilation and extended gamma-ray point source populations in a data-driven way.
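As a generic illustration of scale separation by wavelet decomposition (not the authors' pipeline), the sketch below assumes the PyWavelets package is available: it decomposes a 2-D map, zeroes out all but the chosen detail levels, and reconstructs. The wavelet choice and level count are arbitrary.

```python
import numpy as np
import pywt

def keep_scales(sky_map, keep_levels, wavelet="bior3.3", levels=6):
    """Decompose a 2-D map into wavelet scales and reconstruct it
    using only the detail levels listed in keep_levels (1 = finest)."""
    coeffs = pywt.wavedec2(sky_map, wavelet, level=levels)
    filtered = [np.zeros_like(coeffs[0])]        # drop the coarse approximation
    for i, detail in enumerate(coeffs[1:], start=1):
        level = levels - i + 1                   # wavedec2 lists coarse -> fine
        if level in keep_levels:
            filtered.append(detail)              # keep this angular scale
        else:
            filtered.append(tuple(np.zeros_like(d) for d in detail))
    return pywt.waverec2(filtered, wavelet)
```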
NASA Astrophysics Data System (ADS)
Choi, Woo Young; Woo, Dong-Soo; Choi, Byung Yong; Lee, Jong Duk; Park, Byung-Gook
2004-04-01
We propose a stable threshold-voltage extraction algorithm based on the transconductance change method with an optimized node interval. With the algorithm, noise-free gm2 (=dgm/dVGS) profiles can be extracted to within one percent error, which leads to a more physically meaningful threshold-voltage calculation by the transconductance change method. The extracted threshold voltage predicts the gate-to-source voltage at which the surface potential is within kT/q of φs=2φf+VSB. Our algorithm makes the transconductance change method more practical by overcoming the noise problem. This threshold-voltage extraction algorithm yields the threshold roll-off behavior of nanoscale metal-oxide-semiconductor field-effect transistors (MOSFETs) accurately and makes it possible to calculate the surface potential φs at any other point on the drain-to-source current (IDS) versus gate-to-source voltage (VGS) curve. It will provide a useful analysis tool in the field of device modeling, simulation, and characterization.
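For readers unfamiliar with the transconductance change method, a bare-bones numerical version is sketched below: Vth is taken at the peak of d²IDS/dVGS². The light smoothing shown is a stand-in; the paper's node-interval optimization is not reproduced here.

```python
import numpy as np

def vth_transconductance_change(vgs, ids, smooth=5):
    """Estimate threshold voltage as the gate voltage at which
    d(gm)/dVGS = d2(IDS)/dVGS2 peaks (transconductance change method)."""
    # Light moving-average smoothing to suppress measurement noise.
    ids = np.convolve(ids, np.ones(smooth) / smooth, mode="same")
    gm = np.gradient(ids, vgs)          # transconductance gm
    gm2 = np.gradient(gm, vgs)          # d(gm)/dVGS
    return vgs[np.argmax(gm2)]
```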
Simulation and Spectrum Extraction in the Spectroscopic Channel of the SNAP Experiment
NASA Astrophysics Data System (ADS)
Tilquin, Andre; Bonissent, A.; Gerdes, D.; Ealet, A.; Prieto, E.; Macaire, C.; Aumenier, M. H.
2007-05-01
Pixel-level simulation software is described. It is composed of two modules. The first module applies Fourier optics at each active element of the system to construct the PSF for a large variety of wavelengths and spatial locations of the point source. The input is provided by the engineer's design program (Zemax), which describes the optical path and the distortions. The PSF properties are compressed and interpolated using shapelet decomposition and neural network techniques. A second module is used for production jobs. It uses the output of the first module to reconstruct the relevant PSF and integrate it over the detector pixels. Extended and polychromatic sources are approximated by a combination of monochromatic point sources. For the spectrum extraction, we use a fast simulator based on a multidimensional linear interpolation of the pixel response tabulated on a grid of values of wavelength, position on the sky, and slice number. The prediction of the fast simulator is compared to the observed pixel content, and a chi-square minimization in which the parameters are the bin contents is used to build the extracted spectrum. The visible and infrared arms are combined in the same chi-square, providing a single spectrum.
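Because the model is linear in the spectral bin contents, the chi-square minimization reduces to a weighted linear least-squares solve. A generic sketch follows, with a hypothetical response matrix standing in for the fast simulator; it is not the SNAP pipeline code.

```python
import numpy as np

def extract_spectrum(pixels, sigma, response):
    """Chi-square spectrum extraction when the model is linear in bin contents.

    pixels   : (n_pix,) observed pixel values (visible + IR arms stacked).
    sigma    : (n_pix,) per-pixel uncertainties.
    response : (n_pix, n_bins) matrix; column k is the detector response
               (from the fast simulator) to unit flux in spectral bin k.
    Minimizing chi^2 = ||(pixels - response @ s) / sigma||^2 over s gives
    the weighted least-squares solution computed below.
    """
    W = 1.0 / sigma
    A = response * W[:, None]
    b = pixels * W
    spectrum, *_ = np.linalg.lstsq(A, b, rcond=None)
    return spectrum
```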
40 CFR 435.11 - Specialized definitions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Extraction Point Source Category,” EPA-821-R-11-004. See paragraph (uu) of this section. (e) Biodegradation... Bottle Biodegradation Test System: Modified ISO 11734:1995,” EPA Method 1647, supplemented with...
40 CFR 435.11 - Specialized definitions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Extraction Point Source Category,” EPA-821-R-11-004. See paragraph (uu) of this section. (e) Biodegradation... Bottle Biodegradation Test System: Modified ISO 11734:1995,” EPA Method 1647, supplemented with...
40 CFR 435.11 - Specialized definitions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Extraction Point Source Category,” EPA-821-R-11-004. See paragraph (uu) of this section. (e) Biodegradation... Bottle Biodegradation Test System: Modified ISO 11734:1995,” EPA Method 1647, supplemented with...
VizieR Online Data Catalog: The Chandra Source Catalog, Release 1.1 (Evans+ 2012)
NASA Astrophysics Data System (ADS)
Evans, I. N.; Primini, F. A.; Glotfelty, C. S.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G.; Grier, J. D.; Hain, R. M.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Kashyap, V. L.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Mossman, A. E.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.
2014-01-01
This version of the catalog is release 1.1. It includes the information contained in release 1.0.1, plus point and compact source data extracted from HRC imaging observations, and catch-up ACIS observations released publicly prior to the end of 2009. (1 data file).
The Raptor Real-Time Processing Architecture
NASA Astrophysics Data System (ADS)
Galassi, M.; Starr, D.; Wozniak, P.; Brozdin, K.
The primary goal of Raptor is ambitious: to identify interesting optical transients from very wide field of view telescopes in real time, and then to quickly point the higher resolution Raptor ``fovea'' cameras and spectrometer to the location of the optical transient. The most interesting of Raptor's many applications is the real-time search for orphan optical counterparts of Gamma Ray Bursts. The sequence of steps (data acquisition, basic calibration, source extraction, astrometry, relative photometry, the smarts of transient identification and elimination of false positives, telescope pointing feedback, etc.) is implemented with a ``component'' approach. All basic elements of the pipeline functionality have been written from scratch or adapted (as in the case of SExtractor for source extraction) to form a consistent modern API operating on memory resident images and source lists. The result is a pipeline which meets our real-time requirements and which can easily operate as a monolithic or distributed processing system. Finally, the Raptor architecture is entirely based on free software (sometimes referred to as ``open source'' software). In this paper we also discuss the interplay between various free software technologies in this type of astronomical problem.
Raptor -- Mining the Sky in Real Time
NASA Astrophysics Data System (ADS)
Galassi, M.; Borozdin, K.; Casperson, D.; McGowan, K.; Starr, D.; White, R.; Wozniak, P.; Wren, J.
2004-06-01
The primary goal of Raptor is ambitious: to identify interesting optical transients from very wide field of view telescopes in real time, and then to quickly point the higher resolution Raptor ``fovea'' cameras and spectrometer to the location of the optical transient. The most interesting of Raptor's many applications is the real-time search for orphan optical counterparts of Gamma Ray Bursts. The sequence of steps (data acquisition, basic calibration, source extraction, astrometry, relative photometry, the smarts of transient identification and elimination of false positives, telescope pointing feedback...) is implemented with a ``component'' approach. All basic elements of the pipeline functionality have been written from scratch or adapted (as in the case of SExtractor for source extraction) to form a consistent modern API operating on memory resident images and source lists. The result is a pipeline which meets our real-time requirements and which can easily operate as a monolithic or distributed processing system. Finally, the Raptor architecture is entirely based on free software (sometimes referred to as "open source" software). In this paper we also discuss the interplay between various free software technologies in this type of astronomical problem.
Building Facade Reconstruction by Fusing Terrestrial Laser Points and Images
Pu, Shi; Vosselman, George
2009-01-01
Laser data and optical data have a complementary nature for three-dimensional feature extraction. Efficient integration of the two data sources will lead to a more reliable and automated extraction of three-dimensional features. This paper presents a semiautomatic building facade reconstruction approach, which efficiently combines information from terrestrial laser point clouds and close-range images. A building facade's general structure is discovered and established using the planar features from the laser data. Then strong lines in the images are extracted using the Canny extractor and the Hough transform, and compared with current model edges for necessary improvement. Finally, textures with optimal visibility are selected and applied according to accurate image orientations. Solutions to several challenging problems throughout the combined reconstruction, such as referencing between laser points and multiple images and automated texturing, are described. The limitations and remaining work of this approach are also discussed. PMID:22408539
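The "strong lines" step (Canny edges followed by a Hough transform) can be illustrated with a short OpenCV sketch; the thresholds below are arbitrary, and the laser-to-image referencing and texturing stages of the paper are not shown.

```python
import cv2
import numpy as np

def strong_lines(image_path, canny_lo=80, canny_hi=160):
    """Detect prominent straight edges in a facade image."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(gray, canny_lo, canny_hi)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=80, minLineLength=40, maxLineGap=5)
    # Each entry is (x1, y1, x2, y2); these segments would then be compared
    # against model edges projected from the laser-derived facade planes.
    return [] if lines is None else lines.reshape(-1, 4)
```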
NASA Astrophysics Data System (ADS)
Fang, Huaiyang; Lu, Qingshui; Gao, Zhiqiang; Shi, Runhe; Gao, Wei
2013-09-01
China's economy has grown rapidly since 1978. Rapid economic growth has led to fast growth in fertilizer and pesticide consumption, and a significant portion of these fertilizers and pesticides enters the water and degrades water quality. At the same time, rapid economic growth has caused more and more point source pollution to be discharged into the water, and eutrophication has become a major threat to water bodies. Worsening environmental problems have forced governments to take measures to control water pollution. We extracted land cover from Landsat TM images, calculated point source pollution with the export coefficient method, and then ran the SWAT model to simulate non-point source pollution. We found that the annual TP load from industrial pollution into rivers is 115.0 t in the entire watershed. Average annual TP loads from each sub-basin ranged from 0 to 189.4 tons. Higher TP loads from livestock and human activity mainly occur in areas that are far from large towns or cities and where the TP loads from industry are relatively low. The mean annual TP load delivered to the streams was 246.4 tons; the highest TP loads occurred in the north part of this area, and the lowest TP loads are mainly distributed in the middle part. Point source pollution therefore accounts for a high proportion of the load in this area, and governments should take measures to control it.
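The export coefficient method mentioned above is simple bookkeeping: each source type contributes (number of units) x (export coefficient). The coefficients and numbers in the sketch below are placeholders, not values from the study.

```python
# Export-coefficient bookkeeping: load = sum over sources of (units * coefficient).
# Coefficients below are hypothetical placeholders, not values from the study.
tp_export_coeff = {           # kg TP per unit per year (hypothetical)
    "livestock_head": 0.6,
    "rural_resident": 0.25,
    "industrial_outfall_m3": 0.0012,
}

def annual_tp_load(units_by_source):
    """Sum the total phosphorus load (kg/yr) for one sub-basin."""
    return sum(tp_export_coeff[src] * n for src, n in units_by_source.items())

subbasin = {"livestock_head": 1.2e4, "rural_resident": 3.5e4,
            "industrial_outfall_m3": 2.0e6}
print(annual_tp_load(subbasin) / 1000.0, "t TP per year")
```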
Prediction of the Critical Curvature for LX-17 with the Time of Arrival Data from DNS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yao, Jin; Fried, Laurence E.; Moss, William C.
2017-01-10
We extract the detonation shock front velocity, curvature, and acceleration from time-of-arrival data measured at grid points in direct numerical simulations of a 50 mm rate stick lit by a disk source, with the ignition-and-growth reaction model and a JWL equation of state calibrated for LX-17. We compute the quasi-steady (D, κ) relation based on the extracted properties and predict the critical curvatures of LX-17. We also propose an explicit formula that contains the failure turning point, obtained from optimization, for the (D, κ) relation of LX-17.
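One common way to recover front speed and curvature from a gridded arrival-time field t(x, y) is D = 1/|∇t| and κ = ∇·(∇t/|∇t|); the numpy sketch below illustrates that relation and is not necessarily the exact post-processing used in the report.

```python
import numpy as np

def front_speed_and_curvature(t, dx, dy):
    """Detonation-front diagnostics from a gridded time-of-arrival field t(x, y).

    Speed normal to the front:  D = 1 / |grad t|
    Front curvature:            kappa = div( grad t / |grad t| )
    """
    ty, tx = np.gradient(t, dy, dx)            # gradient returns d/dy first
    norm = np.hypot(tx, ty)
    D = 1.0 / norm
    nx, ny = tx / norm, ty / norm              # unit normal components
    _, dnx_dx = np.gradient(nx, dy, dx)
    dny_dy, _ = np.gradient(ny, dy, dx)
    kappa = dnx_dx + dny_dy
    return D, kappa
```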
Place in Perspective: Extracting Online Information about Points of Interest
NASA Astrophysics Data System (ADS)
Alves, Ana O.; Pereira, Francisco C.; Rodrigues, Filipe; Oliveirinha, João
Over the last few years, the amount of online descriptive information about places has reached a reasonable size for many cities in the world. Because such information is mostly natural-language text, Information Extraction techniques are needed to obtain the meaning of places that underlies these massive commonsense and user-generated sources. In this article, we show how we automatically label places using Information Extraction techniques applied to online resources such as Wikipedia, Yellow Pages, and Yahoo!.
Discrimination between diffuse and point sources of arsenic at Zimapán, Hidalgo state, Mexico.
Sracek, Ondra; Armienta, María Aurora; Rodríguez, Ramiro; Villaseñor, Guadalupe
2010-01-01
There are two principal sources of arsenic in Zimapán. Point sources are linked to mining and smelting activities and especially to mine tailings. Diffuse sources are not well defined and are linked to regional flow systems in carbonate rocks. Both sources are caused by the oxidation of arsenic-rich sulfidic mineralization. Point sources are characterized by Ca-SO4-HCO3 type ground water and relatively enriched values of δD, δ18O, and δ34S(SO4). Diffuse sources are characterized by Ca-Na-HCO3 type ground water and more depleted values of δD, δ18O, and δ34S(SO4). Values of δD and δ18O indicate a similar altitude of recharge for both arsenic sources and a stronger impact of evaporation for point sources in mine tailings. There are also different values of δ34S(SO4) for the two sources, presumably due to different types of mineralization or isotopic zonality in deposits. In Principal Component Analysis (PCA), principal component 1 (PC1), which describes the impact of sulfide oxidation and neutralization by the dissolution of carbonates, has higher values in samples from point sources. In spite of similar concentrations of As in ground water affected by diffuse sources and point sources (mean values 0.21 mg/L and 0.31 mg/L, respectively, in the years from 2003 to 2008), the diffuse sources have more impact on the health of the population in Zimapán. This is caused by the extraction of ground water from wells tapping the regional flow system. In contrast, wells located in the proximity of mine tailings are generally not used for water supply.
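Purely as an illustration of the PCA step used to separate the two source signatures, a minimal scikit-learn sketch is shown below; the variable list, table layout, and interpretation of PC1 follow the abstract loosely and are not the authors' code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def pca_scores(chem_table, n_components=2):
    """Standardize hydrochemical variables (e.g. SO4, HCO3, Ca, Na, As,
    d34S) and return principal-component scores plus explained variance.
    Samples with high PC1 would be read against the sulfide-oxidation /
    carbonate-dissolution axis described in the abstract."""
    X = StandardScaler().fit_transform(np.asarray(chem_table, float))
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)
    return scores, pca.explained_variance_ratio_
```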
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-15
... Protection Agency. ACTION: Notice of Final NPDES General Permit. SUMMARY: The Director of the Water Quality... Extraction Point Source Category as authorized by section 402 of the Clean Water Act, 33 U.S.C. 1342 (CWA... change to the proposed permit. A copy of the Region's responses to comments and the final permit may be...
Correcting STIS CCD Point-Source Spectra for CTE Loss
NASA Technical Reports Server (NTRS)
Goudfrooij, Paul; Bohlin, Ralph C.; Maiz-Apellaniz, Jesus
2006-01-01
We review the on-orbit spectroscopic observations that are being used to characterize the Charge Transfer Efficiency (CTE) of the STIS CCD in spectroscopic mode. We parameterize the CTE-related loss for spectrophotometry of point sources in terms of dependencies on the brightness of the source, the background level, the signal in the PSF outside the standard extraction box, and the time of observation. Primary constraints on our correction algorithm are provided by measurements of the CTE loss rates for simulated spectra (images of a tungsten lamp taken through slits oriented along the dispersion axis) combined with estimates of CTE losses for actual spectra of spectrophotometric standard stars in the first-order CCD modes. For point-source spectra at the standard reference position at the CCD center, CTE losses as large as 30% are corrected to within approximately 1% RMS after application of the algorithm presented here, rendering the Poisson noise associated with the source detection itself the dominant contributor to the total flux calibration uncertainty.
Winkelmann, Tim; Cee, Rainer; Haberer, Thomas; Naas, Bernd; Peters, Andreas; Schreiner, Jochen
2014-02-01
The clinical operation at the Heidelberg Ion Beam Therapy Center (HIT) started in November 2009; since then more than 1600 patients have been treated. In a 24/7 operation scheme, two 14.5 GHz electron cyclotron resonance ion sources are routinely used to produce protons and carbon ions. The modification of the low energy beam transport line and the integration of a third ion source into the therapy facility will be shown. In the last year we implemented a new extraction system at all three sources to enhance the lifetime of extraction parts and reduce preventive and corrective maintenance. The new four-electrode design provides electron suppression as well as lower beam emittance. Unwanted beam sputtering effects, which typically lead to contamination of the insulator ceramics and subsequent high-voltage breakdowns, are minimized by the beam guidance of the new extraction system. By this measure the service interval can be increased significantly. As a side effect, the beam emittance can be reduced, allowing a less challenging working point for the ion sources without reducing the effective beam performance. This paper also gives an outlook on further enhancements at the HIT ion source test bench.
A quantitative evaluation of two methods for preserving hair samples
Roon, David A.; Waits, L.P.; Kendall, K.C.
2003-01-01
Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.
40 CFR 125.133 - What special definitions apply to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... Subcategories of the Oil and Gas Extraction Point Source Category Effluent Guidelines in 40 CFR 435.10 or 40 CFR..., floating, mobile, facility engaged in the processing of fresh, frozen, canned, smoked, salted or pickled...
Towards 3D Matching of Point Clouds Derived from Oblique and Nadir Airborne Imagery
NASA Astrophysics Data System (ADS)
Zhang, Ming
Because of the low-expense, high-efficiency image collection process and the rich 3D and texture information presented in the images, the combined use of 2D airborne nadir and oblique images to reconstruct 3D geometric scenes has a promising market for future commercial uses such as urban planning or first response. The methodology introduced in this thesis provides a feasible way towards fully automated 3D city modeling from oblique and nadir airborne imagery. In this thesis, the difficulty of matching 2D images with large disparity is avoided by grouping the images first and applying the 3D registration afterward. The procedure starts with the extraction of point clouds using a modified version of the RIT 3D Extraction Workflow. Then the point clouds are refined by noise removal and surface smoothing processes. Since the point clouds extracted from different image groups use independent coordinate systems, translation, rotation, and scale differences exist between them. To determine these differences, 3D keypoints and their features are extracted. For each pair of point clouds, an initial alignment and a more accurate registration are applied in succession. The final transform matrix contains the parameters describing the required translation, rotation, and scale. The methodology presented in the thesis has been shown to behave well for test data. The robustness of this method is discussed by adding artificial noise to the test data. For Pictometry oblique aerial imagery, the initial alignment provides a rough alignment result, which contains a larger offset compared to that of the test data because of the lower quality of the point clouds themselves, but it can be further refined through the final optimization. The accuracy of the final registration result is evaluated by comparing it to the result obtained from manual selection of matched points. Using the method introduced, point clouds extracted from different image groups can be combined with each other to build a more complete point cloud, or be used as a complement to existing point clouds extracted from other sources. This research will both improve the state of the art of 3D city modeling and inspire new ideas in related fields.
Jones-Lepp, Tammy L.; Sanchez, Charles; Alvarez, David A.; Wilson, Doyle C.; Taniguchi-Fu, Randi-Laurant
2012-01-01
Emerging contaminants (ECs) (e.g., pharmaceuticals, illicit drugs, personal care products) have been detected in waters across the United States. The objective of this study was to evaluate point sources of ECs along the Colorado River, from the headwaters in Colorado to the Gulf of California. At selected locations in the Colorado River Basin (sites in Colorado, Utah, Nevada, Arizona, and California), waste stream tributaries and receiving surface waters were sampled using either grab sampling or polar organic chemical integrative samplers (POCIS). The grab samples were extracted using solid-phase cartridge extraction (SPE), and the POCIS sorbents were transferred into empty SPEs and eluted with methanol. All extracts were prepared for, and analyzed by, liquid chromatography–electrospray-ion trap mass spectrometry (LC–ESI-ITMS). Log DOW values were calculated for all ECs in the study and compared to the empirical data collected. POCIS extracts were screened for the presence of estrogenic chemicals using the yeast estrogen screen (YES) assay. Extracts from the 2008 POCIS deployment in the Las Vegas Wash showed the second highest estrogenicity response. In the grab samples, azithromycin (an antibiotic) was detected in all but one urban waste stream, with concentrations ranging from 30 ng/L to 2800 ng/L. Concentration levels of azithromycin, methamphetamine and pseudoephedrine showed temporal variation from the Tucson WWTP. Those ECs that were detected in the main surface water channels (those that are diverted for urban use and irrigation along the Colorado River) were in the region of the limit-of-detection (e.g., 10 ng/L), but most were below detection limits.
40 CFR 435.41 - Specialized definitions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Specialized definitions. 435.41 Section 435.41 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal Subcategory § 435.41...
40 CFR 435.41 - Specialized definitions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Specialized definitions. 435.41 Section 435.41 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal Subcategory § 435.41...
40 CFR 435.41 - Specialized definitions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Specialized definitions. 435.41 Section 435.41 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal Subcategory § 435.41...
40 CFR 435.31 - Specialized definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Specialized definitions. 435.31 Section 435.31 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Onshore Subcategory § 435.31 Specialized...
40 CFR 435.61 - Specialized definitions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Specialized definitions. 435.61 Section 435.61 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Stripper Subcategory § 435.61...
40 CFR 435.61 - Specialized definitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Specialized definitions. 435.61 Section 435.61 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Stripper Subcategory § 435.61 Specialized...
40 CFR 435.61 - Specialized definitions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Specialized definitions. 435.61 Section 435.61 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Stripper Subcategory § 435.61...
40 CFR 435.61 - Specialized definitions.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Specialized definitions. 435.61 Section 435.61 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Stripper Subcategory § 435.61...
Distribution, behavior, and transport of inorganic and methylmercury in a high gradient stream
Flanders, J.R.; Turner, R.R.; Morrison, T.; Jensen, R.; Pizzuto, J.; Skalak, K.; Stahl, R.
2010-01-01
Concentrations of Hg remain elevated in physical and biological media of the South River (Virginia, USA), despite the cessation of the industrial use of Hg in its watershed nearly six decades ago, and physical characteristics that would not seem to favor Hg(II)-methylation. A 3-year study of inorganic Hg (IHg) and methylmercury (MeHg) was conducted in physical media (soil, sediment, surface water, porewater and soil/sediment extracts) to identify non-point sources, transport mechanisms, and potential controls on Hg(II)-methylation. Data collected from surface water and sediment indicate that the majority of the non-point sources of IHg to the South River are within the first 14 km downstream from the historic point source. Partitioning data indicate that particle-bound IHg is introduced in this reach, releasing dissolved and colloidal-bound IHg, which is transported downstream. Extraction experiments revealed that floodplain soils released a higher fraction of their IHg content in aqueous extractions than fine-grained sediment (FGS). Based on ultrafiltration [<5000 nominal molecular weight cutoff (NMWC)], the majority of soil IHg released was colloidal in nature, providing evidence for the continued evolution of IHg available for Hg(II)-methylation from soil. Strong seasonal patterns in MeHg concentrations were observed in surface water and sediment. The highest concentrations of MeHg in surface water were observed at moderate temperatures, suggesting that other factors limit net Hg(II)-methylation. Seasonal changes in sediment organic content and the fraction of 1 N KOH-extractable THg were also observed and may be important factors in controlling net Hg(II)-methylation rates. Sulfate concentrations in surface water are low and the evidence suggests that Fe reduction may be an important Hg(II)-methylation process. The highest sediment MeHg concentrations were observed in habitats with large amounts of FGS, which are more prevalent in the upper half of the study area due to the lower hydrologic gradient and agricultural impacts. Past and present land use practices and other geomorphologic controls contribute to the erosion of banks and accumulation of fine-grained sediment in this section of the river, acting as sources of IHg. © 2010 Elsevier Ltd.
Towards a realistic 3D simulation of the extraction region in ITER NBI relevant ion source
NASA Astrophysics Data System (ADS)
Mochalskyy, S.; Wünderlich, D.; Fantz, U.; Franzen, P.; Minea, T.
2015-03-01
The development of negative ion (NI) sources for ITER is strongly accompanied by modelling activities. The ONIX code addresses the physics of formation and extraction of negative hydrogen ions at caesiated sources as well as the amount of co-extracted electrons. In order to be closer to the experimental conditions, the code has been improved: it now includes the bias potential applied to the first grid (plasma grid) of the extraction system and the presence of Cs+ ions in the plasma. The simulation results show that such aspects play an important role in the formation of an ion-ion plasma in the boundary region by reducing the depth of the negative potential well in the vicinity of the plasma grid, which limits the extraction of the NIs produced at the Cs-covered plasma grid surface. The influence of the initial temperature and emission rate of the surface-produced NIs on the NI density in the bulk plasma, which in turn affects the beam formation region, was analysed. The formation of the plasma meniscus, the boundary between the plasma and the beam, was investigated for extraction potentials of 5 and 10 kV. At the smaller extraction potential the meniscus moves closer to the plasma grid, but, as in the 10 kV case, the deepest bend point of the meniscus is still outside the aperture. Finally, a plasma containing the same amount of NIs and electrons (nH- = ne = 10^17 m^-3), representing good source conditioning, was simulated. It is shown that under such conditions the extracted NI current can reach values of ~32 mA cm^-2 using the ITER-relevant extraction potential of 10 kV and ~19 mA cm^-2 at 5 kV. These results are in good agreement with experimental measurements performed at the small-scale ITER prototype source at the test facility BATMAN.
Discovery of three strongly lensed quasars in the Sloan Digital Sky Survey
NASA Astrophysics Data System (ADS)
Williams, P. R.; Agnello, A.; Treu, T.; Abramson, L. E.; Anguita, T.; Apostolovski, Y.; Chen, G. C.-F.; Fassnacht, C. D.; Hsueh, J. W.; Lemaux, B. C.; Motta, V.; Oldham, L.; Rojas, K.; Rusu, C. E.; Shajib, A. J.; Wang, X.
2018-06-01
We present the discovery of three quasar lenses in the Sloan Digital Sky Survey, selected using two novel photometry-based selection techniques. The J0941+0518 system, with two point sources separated by 5.46 arcsec on either side of a galaxy, has source and lens redshifts 1.54 and 0.343. Images of J2257+2349 show two point sources separated by 1.67 arcsec on either side of an E/S0 galaxy. The extracted spectra show two images of the same quasar at zs = 2.10. SDSS J1640+1045 has two quasar spectra at zs = 1.70 and fits to the SDSS and Pan-STARRS images confirm the presence of a galaxy between the two point sources. We observed 56 photometrically selected lens candidates in this follow-up campaign, confirming three new lenses, re-discovering one known lens, and ruling out 36 candidates, with 16 still inconclusive. This initial campaign demonstrates the power of purely photometric selection techniques in finding lensed quasars.
Airborne LiDAR : a new source of traffic flow data.
DOT National Transportation Integrated Search
2005-10-01
LiDAR (or airborne laser scanning) systems became a dominant player in high-precision spatial data acquisition to efficiently create DEM/DSM in the late 90s. With increasing point density, new systems are now able to support object extraction, s...
40 CFR 435.60 - Applicability; description of the stripper subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... stripper subcategory. 435.60 Section 435.60 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Stripper Subcategory § 435.60 Applicability; description of the stripper subcategory. The provisions of this subpart...
Extracting the Essential Cartographic Functionality of Programs on the Web
NASA Astrophysics Data System (ADS)
Ledermann, Florian
2018-05-01
Following Aristotle, F. P. Brooks (1987) emphasizes the distinction between "essential difficulties" and "accidental difficulties" as a key challenge in software engineering. From the point of view of cartography, it would be desirable to identify the cartographic essence of a program and subject it to additional scrutiny, while its accidental properties, again from the point of view of cartography, are usually of lesser relevance to cartographic analysis. In this paper, two methods that facilitate extracting the cartographic essence of programs are presented: close reading of their source code, and the automated analysis of their runtime behavior. The advantages and shortcomings of both methods are discussed, followed by an outlook on future developments and potential applications.
Hęś, Marzanna; Gliszczyńska-Świgło, Anna; Gramza-Michałowska, Anna
2017-01-01
Plants are an important source of phenolic compounds. The antioxidant capacities of green tea, thyme and rosemary extracts that contain these compounds have been reported earlier. However, there is a lack of accessible information about their activity against lipid oxidation in emulsions and their ability to inhibit the interaction of lipid oxidation products with amino acids. Therefore, the influence of green tea, thyme and rosemary extracts and BHT (butylated hydroxytoluene) on quantitative changes in lysine and methionine in linoleic acid emulsions, at a pH equal to and at a pH lower than the isoelectric point of the amino acids, was investigated. Total phenolic contents in the plant extracts were determined spectrophotometrically using Folin-Ciocalteu's reagent, and individual phenols by HPLC. The level of oxidation of the emulsion was determined by measuring peroxides and TBARS (thiobarbituric acid reactive substances). Methionine and lysine in the system were reacted with sodium nitroprusside and trinitrobenzenesulphonic acid, respectively, and the absorbance of the complexes was measured. The green tea extract had the highest total polyphenol content. Systems containing both antioxidants and amino acid protected linoleic acid more efficiently than the addition of antioxidants alone. Lysine and methionine losses in samples without added antioxidants were lower at their isoelectric points than below these points. Antioxidants decreased the loss of amino acids. The protective properties of the antioxidants towards methionine were higher at the isoelectric point, whereas towards lysine they were higher at pH below this point. Green tea, thyme and rosemary extracts exhibit antioxidant activity in linoleic acid emulsions. Moreover, they can be used to inhibit quantitative changes in amino acids in lipid emulsions. However, the antioxidant efficiency of these extracts seems to depend on pH conditions. Further investigations should be carried out to clarify this issue.
NASA Astrophysics Data System (ADS)
Zhang, Yuanyuan; Gao, Zhiqiang; Liu, Xiangyang; Xu, Ning; Liu, Chaoshun; Gao, Wei
2016-09-01
Reclamation has caused significant dynamic change in the coastal zone, and the tidal flat zone is an unstable reserve land resource, so its study has important significance. In order to extract tidal flat area information efficiently, this paper takes Rudong County in Jiangsu Province as the research area and uses HJ1A/1B images as the data source. On the basis of previous research experience and a literature review, the method of object-oriented classification is chosen as a semi-automatic extraction method to generate waterlines. The waterlines are then analyzed with the DSAS software to obtain tide points, and the outer boundary points are extracted automatically using Python to determine the extent of the tidal flats of Rudong County in 2014. The extracted area was 55,182 hm2; a confusion matrix was used to verify the accuracy, and the result shows a kappa coefficient of 0.945. The method addresses deficiencies of previous studies, and its reliance on data and tools freely available on the Internet makes it readily generalizable.
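The abstract above mentions a Python step that automatically extracts the outer boundary points from the tide points. The following is a minimal, hypothetical sketch of such a step using a convex hull; the actual study may use a different (for example concave) boundary criterion, and the function name is an assumption.

```python
import numpy as np
from scipy.spatial import ConvexHull

def outer_boundary_points(points):
    """Return the subset of 2D tide points lying on the outer (convex) boundary.

    points: (N, 2) array of easting/northing coordinates. A convex hull is used
    here for simplicity; the published workflow may apply a different criterion.
    """
    hull = ConvexHull(points)
    return points[hull.vertices]              # ordered counter-clockwise

# Illustrative usage with synthetic tide points
rng = np.random.default_rng(1)
pts = rng.uniform(0, 1000, size=(500, 2))
boundary = outer_boundary_points(pts)
print(f"{len(boundary)} of {len(pts)} points lie on the outer boundary")
```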
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torcellini, Paul A.; Bonnema, Eric; Goldwasser, David
Building energy consumption can only be measured at the site or at the point of utility interconnection with a building. Often, to evaluate the total energy impact, this site-based energy consumption is translated into source energy, that is, the energy at the point of fuel extraction. Consistent with this approach, the U.S. Department of Energy's (DOE) definition of zero energy buildings uses source energy as the metric to account for energy losses from the extraction, transformation, and delivery of energy. Other organizations, as well, use source energy to characterize energy impacts. Four methods of making the conversion from site energy to source energy were investigated in the context of the DOE definition of zero energy buildings. These methods were evaluated based on three guiding principles--improve energy efficiency, reduce and stabilize power demand, and use power from nonrenewable energy sources as efficiently as possible. This study examines relative trends between strategies as they are implemented on very low-energy buildings to achieve zero energy. A typical office building was modeled and variations of this model were performed. The photovoltaic output required to create a zero energy building was calculated. Trends were examined across these variations to study the impacts of the calculation method on the building's ability to achieve zero energy status. The paper highlights the different methods and gives conclusions on the advantages and disadvantages of the methods studied.
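A minimal sketch of the kind of site-to-source conversion the abstract describes: each fuel's site energy is multiplied by a site-to-source factor, and renewable exports are credited with the same factors. The factor values, fuel names, and the zero-energy balance shown here are illustrative placeholders, not the methods or numbers evaluated in the paper.

```python
# Illustrative site-to-source factors (placeholders, not the study's values)
SITE_TO_SOURCE = {"electricity": 3.0, "natural_gas": 1.1}

def source_energy(site_energy_kbtu):
    """site_energy_kbtu: dict mapping fuel name -> annual site energy (kBtu)."""
    return sum(SITE_TO_SOURCE[fuel] * kbtu for fuel, kbtu in site_energy_kbtu.items())

# Under this metric a building reaches zero energy if exported renewable
# generation, converted with the same factors, offsets the imported source energy.
imports = {"electricity": 120_000, "natural_gas": 40_000}
exports = {"electricity": 135_000}
print(source_energy(imports) - source_energy(exports))   # <= 0 meets the target
```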
40 CFR 435.40 - Applicability; description of the coastal subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Applicability; description of the coastal subcategory. 435.40 Section 435.40 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
40 CFR 435.40 - Applicability; description of the coastal subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Applicability; description of the coastal subcategory. 435.40 Section 435.40 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
40 CFR 435.70 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-07-01
... disposal or treatment and disposal, provided: (i) If an Oil and Gas facility, operator or its agent or... AND GAS EXTRACTION POINT SOURCE CATEGORY General Provisions § 435.70 Applicability. (a) Purpose. This subpart is intended to prevent oil and gas facilities, for which effluent limitations guidelines and...
40 CFR 435.10 - Applicability; description of the offshore subcategory.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Applicability; description of the offshore subcategory. 435.10 Section 435.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
40 CFR 435.10 - Applicability; description of the offshore subcategory.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Applicability; description of the offshore subcategory. 435.10 Section 435.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
40 CFR 435.10 - Applicability; description of the offshore subcategory.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Applicability; description of the offshore subcategory. 435.10 Section 435.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
40 CFR 435.10 - Applicability; description of the offshore subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Applicability; description of the offshore subcategory. 435.10 Section 435.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Offshore...
40 CFR 435.10 - Applicability; description of the offshore subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Applicability; description of the offshore subcategory. 435.10 Section 435.10 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Offshore...
40 CFR 435.51 - Specialized definitions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Specialized definitions. 435.51... AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Agricultural and Wildlife Water Use Subcategory § 435.51 Specialized definitions. For the purpose of this subpart: (a) Except as provided below...
NASA Astrophysics Data System (ADS)
Lee, I.-Chieh
Shoreline delineation and shoreline change detection are expensive processes, both in data source acquisition and in manual shoreline delineation. These costs confine the frequency and interval of shoreline mapping. In this dissertation, a new shoreline delineation approach was developed that targets lower data source costs and reduced human labor. To lower the cost of data sources, we used public domain LiDAR data sets and satellite images to delineate shorelines without requiring the data sets to be acquired simultaneously, which is a new concept in this field. To reduce the labor cost, we made improvements in classifying LiDAR points and satellite images. Analyzing shadow relations with topography to improve satellite image classification performance is also a brand-new concept. The extracted shoreline of the proposed approach achieved an accuracy of 1.495 m RMSE, or 4.452 m at the 95% confidence level. Consequently, the proposed approach can successfully lower the cost and shorten the processing time, in other words, increase the shoreline mapping frequency with reasonable accuracy. However, the extracted shoreline may not compete in accuracy with a shoreline extracted by aerial photogrammetric procedures; hence, this is a trade-off between cost and accuracy. The approach consists of three phases: first, a shoreline extraction procedure based mainly on LiDAR point cloud data with multispectral information from satellite images; second, an object-oriented shoreline extraction procedure to delineate the shoreline solely from satellite images, in this case WorldView-2 images; third, a shoreline integration procedure combining these two shorelines based on actual shoreline changes and physical terrain properties. The actual data source cost would come only from the acquisition of satellite images. On the other hand, only two processes needed human attention. First, the shoreline within harbor areas needed to be manually connected, although its length was less than 3% of the total shoreline length in our dataset. Second, the parameters for satellite image classification needed to be determined manually. The need for manpower was significantly less than for ground surveying or aerial photogrammetry. The first phase of shoreline extraction utilized the Normalized Difference Vegetation Index (NDVI) and mean-shift segmentation on the coordinates (X, Y, Z) and attributes (multispectral bands from satellite images) of the LiDAR points to classify each LiDAR point as land or water surface. The boundary of the land points was then traced to create the shoreline. The second phase, shoreline extraction solely from satellite images, utilized spectrum, NDVI, and shadow analysis to classify the satellite images into classes. These classes were then refined by mean-shift segmentation on the panchromatic band. By tracing the boundary of the water surface, the shoreline was created. Since these two shorelines may represent different shoreline instances in time, the changes between them were evaluated first. Then an independent scenario analysis and procedure were performed for the shoreline under each of three conditions: undergoing erosion, undergoing accretion, and remaining the same. With these three conditions, we could analyze the actual terrain type and correct classification errors to obtain a more accurate shoreline. Methods of evaluating the quality of shorelines are also discussed.
The experiments showed that three indicators best represent the quality of the shoreline: (1) shoreline accuracy, (2) the land area difference between the extracted shoreline and the ground truth shoreline, and (3) the bias factor from the shoreline quality metrics.
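The first-phase land/water labelling described above combines NDVI with mean-shift segmentation; the sketch below shows only the NDVI part, assuming each LiDAR point already carries NIR and red reflectance attributes transferred from the satellite image. The threshold value and the function name are illustrative assumptions, not the dissertation's parameters.

```python
import numpy as np

def classify_land_water(nir, red, ndvi_threshold=0.1):
    """Label points as land (True) or water (False) from spectral attributes.

    nir, red: per-point reflectance transferred from the satellite image. The
    dissertation additionally uses mean-shift segmentation on (X, Y, Z); only
    the NDVI cut is sketched here, with an illustrative threshold.
    """
    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)   # guard against 0/0
    return ndvi > ndvi_threshold

# Example with synthetic per-point attributes
rng = np.random.default_rng(2)
nir = rng.uniform(0.0, 0.6, 10_000)
red = rng.uniform(0.0, 0.4, 10_000)
land_mask = classify_land_water(nir, red)
print(f"{land_mask.mean():.1%} of points classified as land")
```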
NASA Technical Reports Server (NTRS)
Lucas, J.
1979-01-01
Thermal or electrical power generated from the sun's radiated energy through Point-Focusing Distributed Receiver technology is the goal of this Project. The energy thus produced must be economically competitive with other sources. The Project supports the industrial development of technology and hardware for extracting energy from solar power to achieve the stated goal. Present studies work to concentrate the solar energy, through mirrors or lenses, onto a working fluid or gas and, through a power converter, change it into an energy form useful to man. Rankine-cycle and Brayton-cycle engines are currently being developed as the most promising energy converters for near-future needs.
Luminosity limits for liquid argon calorimetry
NASA Astrophysics Data System (ADS)
Rutherfoord, J.; Walker, R. B.
2012-12-01
We have irradiated liquid argon ionization chambers with betas using high-activity Strontium-90 sources. The radiation environment is comparable to that in the liquid argon calorimeters which are part of the ATLAS detector installed at CERN's Large Hadron Collider. We measure the ionization current over a wide range of applied potential for two different source activities and for three different chamber gaps. These studies provide operating experience at exceptionally high ionization rates. We can operate these chambers either in the normal mode or in the space-charge limited regime and thereby determine the transition point between the two. From the transition point we indirectly extract the positive argon ion mobility.
NASA Astrophysics Data System (ADS)
Gilles, Antonin; Gioia, Patrick; Cozot, Rémi; Morin, Luce
2015-09-01
The hybrid point-source/wave-field method is a newly proposed approach for Computer-Generated Hologram (CGH) calculation, based on the slicing of the scene into several depth layers parallel to the hologram plane. The complex wave scattered by each depth layer is then computed using either a wave-field or a point-source approach according to a threshold criterion on the number of points within the layer. Finally, the complex waves scattered by all the depth layers are summed up in order to obtain the final CGH. Although outperforming both point-source and wave-field methods without producing any visible artifact, this approach has not yet been used for animated holograms, and the possible exploitation of temporal redundancies has not been studied. In this paper, we propose a fast computation of video holograms by taking into account those redundancies. Our algorithm consists of three steps. First, intensity and depth data of the current 3D video frame are extracted and compared with those of the previous frame in order to remove temporally redundant data. Then the CGH pattern for this compressed frame is generated using the hybrid point-source/wave-field approach. The resulting CGH pattern is finally transmitted to the video output and stored in the previous frame buffer. Experimental results reveal that our proposed method is able to produce video holograms at interactive rates without producing any visible artifact.
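A minimal sketch of the first step of the video-hologram pipeline described above: comparing the intensity and depth of the current frame with those of the previous frame so that only changed scene points are passed on to the hybrid CGH computation. The tolerances and the array layout are assumptions, not values from the paper.

```python
import numpy as np

def changed_points(intensity, depth, prev_intensity, prev_depth,
                   tol_int=1.0 / 255.0, tol_depth=1e-3):
    """Boolean mask of pixels whose intensity or depth changed since the
    previous frame; only these points need a fresh CGH contribution."""
    return (np.abs(intensity - prev_intensity) > tol_int) | \
           (np.abs(depth - prev_depth) > tol_depth)

# Example on synthetic frames of a 3D video
rng = np.random.default_rng(3)
prev_i, prev_d = rng.random((256, 256)), rng.random((256, 256))
cur_i, cur_d = prev_i.copy(), prev_d.copy()
cur_i[100:120, 100:120] += 0.2          # a small moving patch
mask = changed_points(cur_i, cur_d, prev_i, prev_d)
print(f"{mask.mean():.2%} of scene points require recomputation")
```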
The ISOPHOT far-infrared serendipity north ecliptic pole minisurvey
NASA Astrophysics Data System (ADS)
Stickel, M.; Bogun, S.; Lemke, D.; Klaas, U.; Toth, L. V.; Herbstmeier, U.; Richter, G.; Assendorp, R.; Laureijs, R.; Kessler, M. F.; Burgdorf, M.; Beichman, C. A.; Rowan-Robinson, M.; Efstathiou, A.
1998-08-01
The ISOPHOT Serendipity Survey fills the otherwise unused slew time between ISO's fine pointings with measurements in an unexplored wavelength regime near 200 microns. In order to test the point source extraction software, the completeness of the detected objects, and the astrophysical content, we investigate a 100 square degree field near the North Ecliptic Pole, dubbed the ISOPHOT Serendipity Minisurvey field. A total of 21 IRAS point sources were detected on the Serendipity slews crossing the field. 19 of these objects are galaxies, one is a planetary nebula and one is an empty field without a bright optical counterpart. The detection completeness is better than 90% for IRAS sources brighter than 2 Jy at 100 microns and better than 80% for sources brighter than 1.5 Jy. The source detection frequency is about 1 per 40 degrees of slew length, in agreement with previous estimates based on galaxy number counts. After the end of the ISO mission, about 4000 point sources are expected to be found in the Serendipity slews. Based on observations with ISO, an ESA project with instruments funded by ESA Member States (especially the PI countries: France, Germany, the Netherlands and the United Kingdom) and with the participation of ISAS and NASA. Members of the Consortium on the ISOPHOT Serendipity Survey (CISS) are MPIA Heidelberg, ESA ISO SOC Villafranca, AIP Potsdam, IPAC Pasadena, and Imperial College London.
Extracting spatial information from large aperture exposures of diffuse sources
NASA Technical Reports Server (NTRS)
Clarke, J. T.; Moos, H. W.
1981-01-01
The spatial properties of large aperture exposures of diffuse emission can be used both to investigate spatial variations in the emission and to filter out camera noise in exposures of weak emission sources. Spatial imaging can be accomplished both parallel and perpendicular to dispersion with a resolution of 5-6 arc sec, and a narrow median filter running perpendicular to dispersion across a diffuse image selectively filters out point source features, such as reseaux marks and fast particle hits. Spatial information derived from observations of solar system objects is presented.
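A minimal sketch of the narrow median filter the abstract describes, run perpendicular to dispersion to suppress point-like features (reseaux marks, fast-particle hits) while preserving the diffuse emission. The array layout, window width, and synthetic data are assumptions for illustration only.

```python
import numpy as np
from scipy.ndimage import median_filter

# Narrow median filter running perpendicular to dispersion (assumed to be axis 0
# here): a 5-pixel window suppresses point-like features such as reseaux marks
# and fast-particle hits while preserving the smoothly varying diffuse emission.
rng = np.random.default_rng(4)
image = 100.0 + rng.normal(0.0, 2.0, size=(64, 512))     # diffuse emission + noise
image[30, 250] += 500.0                                   # a fast-particle hit
cleaned = median_filter(image, size=(5, 1))               # 5 px across, 1 px along dispersion
print(image[30, 250] - cleaned[30, 250])                  # the spike is removed
```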
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knuth, Eldon L.; Miller, David R.; Even, Uzi
2014-12-09
Data extracted from time-of-flight (TOF) measurements made on steady-state He free jets at Göttingen already in 1986 and for pulsed Ne free jets investigated recently at Tel Aviv have been added to an earlier plot of terminal condensed-phase mass fraction x₂∞ as a function of the dimensionless scaling parameter Γ. Γ characterizes the source (fluid species, temperature, pressure and throat diameter); values of x₂∞ are extracted from TOF measurements using conservation of energy in the free-jet expansion. For nozzles consisting of an orifice in a thin plate, the extracted data yield 22 data points which are correlated satisfactorily by a single curve. The Ne free jets were expanded from a conical nozzle with a 20° half angle; the three extracted data points stand together but apart from the aforementioned curve, indicating that the presence of the conical wall significantly influences the expansion and hence the condensation. The 22 data points for the expansions via an orifice consist of 15 measurements with expansions from the gas-phase side of the binodal curve which crossed the binodal curve downstream from the sonic point and 7 measurements with expansions of the gas-phase product of the flashing which occurred after an expansion from the liquid-phase side of the binodal curve crossed the binodal curve upstream from the sonic point. The association of these 22 points with a single curve supports the alternating-phase model proposed earlier for flows with flashing upstream from the sonic point. In order to assess the role of the spinodal curve in such expansions, the spinodal curves for He and Ne were computed using general multi-parameter Helmholtz-free-energy equation-of-state formulations. Then, for the several sets of source-chamber conditions used in the free-jet measurements, thermodynamic states at key locations in the free-jet expansions (binodal curve, sonic point and spinodal curve) were evaluated, with the expansion presumed to be metastable from the binodal curve to the spinodal curve. TOF distributions with more than two peaks (interpreted earlier as superimposed alternating-state TOF distributions) indicated flashing of the metastable flow downstream from the binodal curve but upstream from the sonic point. This relatively early flashing is apparently due to destabilizing interactions with the walls of the source. If the expansion crosses the binodal curve downstream from the nozzle, the metastable fluid does not interact with surfaces and flashing might be delayed until the expansion reaches the spinodal curve. It is concluded that, if the expansion crosses the binodal curve before reaching the sonic point, the resulting metastable fluid downstream from the binodal curve interacts with the adjacent surfaces and flashes into liquid and vapor phases which expand alternately through the nozzle; the two associated alternating TOF distributions are superposed by the chopping process so that the result has the appearance of a single distribution with three peaks.
In-Situ Wave Observations in the High Resolution Air-Sea Interaction DRI
2007-09-30
Directional spectra were extracted from the Coastal Data Information Program (CDIP) Harvest buoy located in 204 m depth off Point Conception. The initial sea... [Figure caption: frequency-directional wave spectra (source: CDIP); upper panels show a typical summer-time South swell in the presence of a light North-West wind sea.]
Utilizing water treatment residuals to reduce phosphorus runoff from biosolids
USDA-ARS?s Scientific Manuscript database
Approximately 40% of biosolids (sewage sludge) produced in the U.S. are incinerated or landfilled rather than land applied due to concern over non-point source phosphorus (P) runoff. The objective of this study was to determine the impact of chemical amendments on water-extractable P (WEP) in appli...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Effluent limitations guidelines... economically achievable (BAT). 435.43 Section 435.43 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Effluent limitations guidelines... control technology (BCT). 435.44 Section 435.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Effluent limitations guidelines... technology currently available (BPT). 435.42 Section 435.42 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Effluent limitations guidelines... control technology (BCT). 435.44 Section 435.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Effluent limitations guidelines... economically achievable (BAT). 435.43 Section 435.43 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Coastal...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Effluent limitations guidelines... technology currently available (BPT). 435.42 Section 435.42 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Effluent limitations guidelines... technology currently available. 435.52 Section 435.52 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE...
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 30 2014-07-01 2014-07-01 false Effluent limitations guidelines... technology currently available. 435.32 Section 435.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Effluent limitations guidelines... technology currently available. 435.52 Section 435.52 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE...
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 31 2012-07-01 2012-07-01 false Effluent limitations guidelines... technology currently available. 435.32 Section 435.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Effluent limitations guidelines... technology currently available. 435.52 Section 435.52 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Effluent limitations guidelines... technology currently available (BPT). 435.12 Section 435.12 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Effluent limitations guidelines... technology currently available. 435.32 Section 435.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Effluent limitations guidelines... technology currently available. 435.32 Section 435.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Onshore...
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 31 2013-07-01 2013-07-01 false Effluent limitations guidelines... technology currently available. 435.52 Section 435.52 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE...
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Effluent limitations guidelines... technology currently available. 435.32 Section 435.32 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Onshore...
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Effluent limitations guidelines... technology currently available. 435.52 Section 435.52 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
Excess TDS, major ionic stress, and elevated conductivities appear to be increasing in streams in Central and Eastern Appalachia. Direct discharges from permitted point sources and regional interest in setting eco-based effluent guidelines/aquatic life criteria, as well as potential differ...
Fractal Complexity-Based Feature Extraction Algorithm of Communication Signals
NASA Astrophysics Data System (ADS)
Wang, Hui; Li, Jingchao; Guo, Lili; Dou, Zheng; Lin, Yun; Zhou, Ruolin
How to analyze and identify the characteristics of radiation sources and estimate the threat level by means of detection, interception and location has been the central issue of electronic support in electronic warfare, and communication signal recognition is one of the key points in solving this issue. Aiming at accurately extracting the individual characteristics of the radiation source in the increasingly complex communication electromagnetic environment, a novel feature extraction algorithm for the individual characteristics of the communication radiation source, based on the fractal complexity of the signal, is proposed. According to the complexity of the received signal and the environmental noise, fractal dimension characteristics of different complexity are used to depict the subtle characteristics of the signal and to establish a characteristic database; different broadcasting stations are then identified using grey relational analysis. The simulation results demonstrate that the algorithm can achieve a recognition rate of 94% even in an environment with an SNR of -10 dB, and this provides an important theoretical basis for the accurate identification of the subtle features of the signal at low SNR in the field of information confrontation.
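The abstract does not state which fractal-complexity estimator is used, so the sketch below shows one common choice, Higuchi's fractal dimension of a 1-D signal, purely as an illustration of turning a received waveform into a scalar complexity feature.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Estimate the fractal dimension of a 1-D signal with Higuchi's method.

    Shown as one common fractal-complexity estimator; the paper does not
    specify its exact estimator, so this is illustrative only.
    """
    x = np.asarray(x, dtype=float)
    N = len(x)
    log_inv_k, log_L = [], []
    for k in range(1, k_max + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)
            if len(idx) < 2:
                continue
            # Length-normalized mean absolute increment of the down-sampled curve
            Lk.append(np.abs(np.diff(x[idx])).sum() * (N - 1)
                      / ((len(idx) - 1) * k * k))
        log_inv_k.append(np.log(1.0 / k))
        log_L.append(np.log(np.mean(Lk)))
    # The fractal dimension is the slope of log L(k) versus log(1/k)
    return np.polyfit(log_inv_k, log_L, 1)[0]

# White noise should give a dimension near 2, a smooth sine near 1
rng = np.random.default_rng(5)
print(round(higuchi_fd(rng.normal(size=2000)), 2),
      round(higuchi_fd(np.sin(np.linspace(0, 8 * np.pi, 2000))), 2))
```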
LESTO: an Open Source GIS-based toolbox for LiDAR analysis
NASA Astrophysics Data System (ADS)
Franceschi, Silvia; Antonello, Andrea; Tonon, Giustino
2015-04-01
During the last five years different research institutes and private companies started to implement new algorithms to analyze and extract features from LiDAR data, but only a few of them also created publicly available software. In the field of forestry there are several examples of software that can be used to extract vegetation parameters from LiDAR data; unfortunately most of them are closed source (even if free), which means that the source code is not shared with the public for anyone to look at or make changes to. In 2014 we started the development of the library LESTO (LiDAR Empowered Sciences Toolbox Opensource): a set of modules for the analysis of LiDAR point clouds with an Open Source approach, with the aim of improving the performance of the extraction of the volume of biomass and other vegetation parameters over large areas for mixed forest structures. LESTO contains a set of modules for data handling and analysis implemented within the JGrassTools spatial processing library. The main subsections are dedicated to 1) preprocessing of LiDAR raw data, mainly in LAS format (utilities and filtering); 2) creation of raster derived products; 3) flight-line identification and normalization of the intensity values; 4) tools for extraction of vegetation and buildings. The core of the LESTO library is the extraction of vegetation parameters. We decided to follow the single-tree-based approach, starting with the implementation of some of the most used algorithms in the literature. These have been tweaked and applied to LiDAR-derived raster datasets (DTM, DSM) as well as to point clouds of raw data. The methods range from the simple extraction of tree tops and crowns from local maxima, to the region growing method, the watershed method, and individual tree segmentation on point clouds. The validation procedure consists of matching field and LiDAR-derived measurements at the individual-tree and plot level. An automatic validation procedure has been developed, based on a Particle Swarm (PS) optimizer and a matching procedure that compares the position and height of the extracted trees with the measured ones and iteratively improves the candidate solution by changing the models' parameters. Examples of application of the LESTO tools will be presented for test sites. The test area consists of a series of circular sampling plots randomly selected from a 50x50 m regular grid within a buffer zone of 150 m from the forest road. Other studies on the same sites provide reference measurements of position, diameter, species and height and proposed allometric relationships. These allometric relationships were obtained for each species by deriving the stem volume of single trees based on height and diameter at breast height. LESTO is integrated in the JGrassTools project and available for download at www.jgrasstools.org. A simple and easy to use graphical interface to run the models is available at https://github.com/moovida/STAGE/releases.
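LESTO itself is implemented in Java within JGrassTools; as an illustration of the simplest of the single-tree approaches listed above (extraction of tree tops from local maxima of a canopy height model), here is a short, hypothetical NumPy/SciPy sketch with made-up parameter values, not the LESTO code.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def tree_tops(chm, window=5, min_height=2.0):
    """Return (row, col) indices of local maxima of a canopy height model.

    chm: 2-D array of canopy heights (DSM - DTM). The window size and minimum
    height are illustrative parameters, not LESTO defaults.
    """
    local_max = (chm == maximum_filter(chm, size=window)) & (chm > min_height)
    return np.argwhere(local_max)

# Synthetic CHM with two Gaussian crowns
y, x = np.mgrid[0:100, 0:100]
chm = 15 * np.exp(-((x - 30) ** 2 + (y - 40) ** 2) / 50.0) \
    + 12 * np.exp(-((x - 70) ** 2 + (y - 60) ** 2) / 80.0)
print(tree_tops(chm))   # expected near (40, 30) and (60, 70)
```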
NASA Astrophysics Data System (ADS)
Tomljenovic, Ivan; Tiede, Dirk; Blaschke, Thomas
2016-10-01
In the past two decades Object-Based Image Analysis (OBIA) established itself as an efficient approach for the classification and extraction of information from remote sensing imagery and, increasingly, from non-image based sources such as Airborne Laser Scanner (ALS) point clouds. ALS data is represented in the form of a point cloud with recorded multiple returns and intensities. In our work, we combined OBIA with ALS point cloud data in order to identify and extract buildings as 2D polygons representing roof outlines in a top-down mapping approach. We rasterized the ALS data into a height raster to generate a Digital Surface Model (DSM) and a derived Digital Elevation Model (DEM). Further objects were generated in conjunction with point statistics from the linked point cloud. Using class modelling methods, we generated the final target class of objects representing buildings. The approach was developed for a test area in Biberach an der Riß (Germany). In order to demonstrate adaptation-free transferability to another data set, the algorithm was applied "as is" to the ISPRS benchmark data set of Toronto (Canada). The obtained results show high accuracies for the initial study area (thematic accuracies of around 98%, geometric accuracy above 80%). The very high performance within the ISPRS benchmark, without any modification of the algorithm and without any adaptation of parameters, is particularly noteworthy.
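A minimal sketch of the underlying idea of the building-candidate step described above (above-ground objects from the normalized DSM, i.e. DSM minus DEM, filtered by size). The published workflow additionally uses OBIA class modelling and point-cloud statistics to separate buildings from vegetation; all function names and thresholds here are assumptions.

```python
import numpy as np
from scipy import ndimage

def building_candidates(dsm, dem, cell_size=1.0, min_height=2.5, min_area=25.0):
    """Label connected above-ground regions of the normalized DSM as candidate
    building objects. Thresholds are illustrative only."""
    ndsm = dsm - dem                                   # normalized DSM (heights above ground)
    mask = ndsm > min_height
    labels, n = ndimage.label(mask)
    # Drop objects smaller than min_area (in squared map units)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1)) * cell_size ** 2
    keep = np.flatnonzero(areas >= min_area) + 1
    return np.isin(labels, keep), len(keep)

# Example: a 10 m x 10 m block standing 6 m above flat ground
dem = np.zeros((50, 50))
dsm = dem.copy()
dsm[20:30, 20:30] = 6.0
mask, n = building_candidates(dsm, dem)
print(n, mask.sum())   # 1 candidate covering 100 cells
```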
SPITZER MIPS 24 and 70 {mu}m IMAGING NEAR THE SOUTH ECLIPTIC POLE: MAPS AND SOURCE CATALOGS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scott, Kimberly S.; Stabenau, Hans F.; Devlin, Mark J.
2010-12-15
We have imaged an 11.5 deg² region of sky toward the South Ecliptic Pole (R.A. = 04h43m, decl. = -53°40', J2000) at 24 and 70 μm with MIPS, the Multiband Imaging Photometer for Spitzer. This region is coincident with a field mapped at longer wavelengths by AKARI and BLAST. We discuss our data reduction and source extraction procedures. The median 1σ depths of the maps are 47 μJy beam⁻¹ at 24 μm and 4.3 mJy beam⁻¹ at 70 μm. At 24 μm, we identify 93,098 point sources with signal-to-noise ratio (S/N) ≥ 5 and an additional 63 resolved galaxies; at 70 μm we identify 891 point sources with S/N ≥ 6. From simulations, we determine a false detection rate of 1.8% (1.1%) for the 24 μm (70 μm) catalog. The 24 and 70 μm point-source catalogs are 80% complete at 230 μJy and 11 mJy, respectively. These mosaic images and source catalogs will be available to the public through the NASA/IPAC Infrared Science Archive.
40 CFR 436.20 - Applicability; description of the crushed stone subcategory.
Code of Federal Regulations, 2010 CFR
2010-07-01
... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS MINERAL MINING AND PROCESSING POINT SOURCE CATEGORY Crushed... stone and riprap. This subpart includes all types of rock and stone. Rock and stone that is crushed or broken prior to the extraction of a mineral are elsewhere covered. The processing of calcite, however, in...
40 CFR 436.20 - Applicability; description of the crushed stone subcategory.
Code of Federal Regulations, 2012 CFR
2012-07-01
... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) MINERAL MINING AND PROCESSING POINT SOURCE CATEGORY... stone and riprap. This subpart includes all types of rock and stone. Rock and stone that is crushed or broken prior to the extraction of a mineral are elsewhere covered. The processing of calcite, however, in...
40 CFR 436.20 - Applicability; description of the crushed stone subcategory.
Code of Federal Regulations, 2013 CFR
2013-07-01
... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) MINERAL MINING AND PROCESSING POINT SOURCE CATEGORY... stone and riprap. This subpart includes all types of rock and stone. Rock and stone that is crushed or broken prior to the extraction of a mineral are elsewhere covered. The processing of calcite, however, in...
40 CFR 436.20 - Applicability; description of the crushed stone subcategory.
Code of Federal Regulations, 2011 CFR
2011-07-01
... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS MINERAL MINING AND PROCESSING POINT SOURCE CATEGORY Crushed... stone and riprap. This subpart includes all types of rock and stone. Rock and stone that is crushed or broken prior to the extraction of a mineral are elsewhere covered. The processing of calcite, however, in...
40 CFR 436.20 - Applicability; description of the crushed stone subcategory.
Code of Federal Regulations, 2014 CFR
2014-07-01
... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) MINERAL MINING AND PROCESSING POINT SOURCE CATEGORY... stone and riprap. This subpart includes all types of rock and stone. Rock and stone that is crushed or broken prior to the extraction of a mineral are elsewhere covered. The processing of calcite, however, in...
40 CFR 125.81 - Who is subject to this subpart?
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 22 2014-07-01 2013-07-01 true Who is subject to this subpart? 125.81 Section 125.81 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS... coastal subcategories of the oil and gas extraction point source category as defined under 40 CFR 435.10...
40 CFR 125.81 - Who is subject to this subpart?
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 23 2012-07-01 2012-07-01 false Who is subject to this subpart? 125.81 Section 125.81 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS... coastal subcategories of the oil and gas extraction point source category as defined under 40 CFR 435.10...
40 CFR 125.81 - Who is subject to this subpart?
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 23 2013-07-01 2013-07-01 false Who is subject to this subpart? 125.81 Section 125.81 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS... coastal subcategories of the oil and gas extraction point source category as defined under 40 CFR 435.10...
Code of Federal Regulations, 2013 CFR
2013-07-01
... located beyond 3 miles from shore: Water-based drilling fluids and associated drill cuttings Free Oil No... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY... parameter BCT effluent limitation Produced water Oil & grease The maximum for any one day shall not exceed...
Code of Federal Regulations, 2012 CFR
2012-07-01
... located beyond 3 miles from shore: Water-based drilling fluids and associated drill cuttings Free Oil No... (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS (CONTINUED) OIL AND GAS EXTRACTION POINT SOURCE CATEGORY... parameter BCT effluent limitation Produced water Oil & grease The maximum for any one day shall not exceed...
40 CFR Appendix 3 to Subpart A of... - Procedure for Mixing Base Fluids With Sediments
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Procedure for Mixing Base Fluids With Sediments 3 Appendix 3 to Subpart A of Part 435 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Offshore...
40 CFR Appendix 3 to Subpart A of... - Procedure for Mixing Base Fluids With Sediments
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Procedure for Mixing Base Fluids With Sediments 3 Appendix 3 to Subpart A of Part 435 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY Offshore...
40 CFR 125.81 - Who is subject to this subpart?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 21 2010-07-01 2010-07-01 false Who is subject to this subpart? 125.81 Section 125.81 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS... coastal subcategories of the oil and gas extraction point source category as defined under 40 CFR 435.10...
40 CFR 125.81 - Who is subject to this subpart?
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 22 2011-07-01 2011-07-01 false Who is subject to this subpart? 125.81 Section 125.81 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) WATER PROGRAMS... coastal subcategories of the oil and gas extraction point source category as defined under 40 CFR 435.10...
Costa, Margarida; Garcia, Mónica; Costa-Rodrigues, João; Costa, Maria Sofia; Ribeiro, Maria João; Fernandes, Maria Helena; Barros, Piedade; Barreiro, Aldo; Vasconcelos, Vitor; Martins, Rosário
2013-01-01
The oceans remain a major source of natural compounds with potential in pharmacology. In particular, during the last few decades, marine cyanobacteria have been in focus as producers of interesting bioactive compounds, especially for the treatment of cancer. In this study, the anticancer potential of extracts from twenty-eight marine cyanobacteria strains, belonging to the underexplored picoplanktonic genera Cyanobium, Synechocystis and Synechococcus, and the filamentous genera Nodosilinea, Leptolyngbya, Pseudanabaena and Romeria, was assessed in eight human tumor cell lines. First, a crude extract was obtained by dichloromethane:methanol extraction, and from it three fractions were separated by Si column chromatography. The crude extract and fractions were tested in eight human cancer cell lines for cell viability/toxicity, assessed with the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl tetrazolium bromide (MTT) and lactic dehydrogenase release (LDH) assays. Strong cytotoxicity was observed for 8.9% of the strains, moderate cytotoxicity for 17.8%, and low cytotoxicity for 14.3%. The results obtained reveal that the studied genera of marine cyanobacteria are a promising source of novel compounds with potential anticancer activity and highlight the interest in also exploring the smaller filamentous and picoplanktonic genera of cyanobacteria. PMID:24384871
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mathew, Jose V.; Paul, Samit; Bhattacharjee, Sudeep
2010-05-15
An earlier study of the axial ion energy distribution in the extraction region (plasma meniscus) of a compact microwave plasma ion source showed that the axial ion energy spread near the meniscus is small (~5 eV) and comparable to that of a liquid metal ion source, making it a promising candidate for focused ion beam (FIB) applications [J. V. Mathew and S. Bhattacharjee, J. Appl. Phys. 105, 96101 (2009)]. In the present work we have investigated the radial ion energy distribution (IED) under the influence of beam extraction. Initially a single Einzel lens system was used for beam extraction with potentials up to -6 kV for obtaining parallel beams. In situ measurements of the IED with extraction voltages up to -5 kV indicate that beam extraction has a weak influence on the energy spread (±0.5 eV), which is significant from the point of view of FIB applications. It is found that by reducing the geometrical acceptance angle at the ion energy analyzer probe, a close to unidirectional distribution can be obtained with a spread that is smaller by at least 1 eV.
NASA Astrophysics Data System (ADS)
Setiyoko, A.; Dharma, I. G. W. S.; Haryanto, T.
2017-01-01
Multispectral and hyperspectral data acquired from satellite sensors have the ability to detect various objects on the earth, ranging from low-scale to high-scale modeling. These data are increasingly being used to produce geospatial information for rapid analysis by running feature extraction or classification processes. Applying the most suitable model for this data mining is still challenging because of issues regarding accuracy and computational cost. The aim of this research is to develop a better understanding of object feature extraction and classification applied to satellite imagery by systematically reviewing related recent research projects. The method used in this research is based on the PRISMA statement. After deriving important points from trusted sources, pixel-based and texture-based feature extraction techniques emerge as promising techniques for further analysis in recent developments of feature extraction and classification.
Production Strategies and Applications of Microbial Single Cell Oils
Ochsenreither, Katrin; Glück, Claudia; Stressler, Timo; Fischer, Lutz; Syldatk, Christoph
2016-01-01
Polyunsaturated fatty acids (PUFAs) of the ω-3 and ω-6 class (e.g., α-linolenic acid, linoleic acid) are essential for maintaining biofunctions in mammals such as humans. Because humans cannot synthesize these essential fatty acids, they must be taken up from different food sources. Classical sources for these fatty acids are porcine liver and fish oil. However, microbial lipids or single cell oils, produced by oleaginous microorganisms such as algae, fungi and bacteria, are a promising source as well. These single cell oils can be used for many valuable chemicals with applications not only for nutrition but also for fuels and are therefore an ideal basis for a bio-based economy. A crucial point for establishing the utilization of microbial lipids is the cost-effective production and purification of fuels or higher-value products. The fermentative production can be realized by submerged (SmF) or solid state fermentation (SSF). The yield and the composition of the obtained microbial lipids depend on the type of fermentation and the particular conditions (e.g., medium, pH-value, temperature, aeration, nitrogen source). From an economic point of view, waste or by-product streams can be used as cheap and renewable carbon and nitrogen sources. In general, downstream processing costs are one of the major obstacles to be solved for full economic efficiency of microbial lipids. For the extraction of lipids from microbial biomass, cell disruption is most important, because the efficiency of cell disruption directly influences subsequent downstream operations and overall extraction efficiencies. A multitude of cell disruption and lipid extraction methods are available, conventional as well as newly emerging methods, which will be described and discussed in terms of large scale applicability, their potential in a modern biorefinery and their influence on product quality. Furthermore, an overview is given of applications of microbial lipids or derived fatty acids with emphasis on food applications. PMID:27761130
User's Guide for the Agricultural Non-Point Source (AGNPS) Pollution Model Data Generator
Finn, Michael P.; Scheidt, Douglas J.; Jaromack, Gregory M.
2003-01-01
BACKGROUND Throughout this user guide, we refer to datasets that we used in conjunction with the development of this software for supporting cartographic research and producing the datasets to conduct research. However, this software can be used with these datasets or with more 'generic' versions of data of the appropriate type. For example, throughout the guide, we refer to national land cover data (NLCD) and digital elevation model (DEM) data from the U.S. Geological Survey (USGS) at a 30-m resolution, but any digital terrain model or land cover data at any appropriate resolution will produce results. Another key point to keep in mind is to use a consistent data resolution for all the datasets per model run. The U.S. Department of Agriculture (USDA) developed the Agricultural Nonpoint Source (AGNPS) pollution model of watershed hydrology in response to the complex problem of managing nonpoint sources of pollution. AGNPS simulates the behavior of runoff, sediment, and nutrient transport from watersheds that have agriculture as their prime use. The model operates on a cell basis and is a distributed parameter, event-based model. The model requires 22 input parameters. Output parameters are grouped primarily by hydrology, sediment, and chemical output (Young and others, 1995). Elevation, land cover, and soil are the base data from which to extract the 22 input parameters required by the AGNPS. For automatic parameter extraction, follow the general process described in this guide of extraction from the geospatial data through the AGNPS Data Generator to generate input parameters required by the pollution model (Finn and others, 2002).
2009-02-01
All Sky Survey (2MASS) coordinates of the nucleus were used to verify the coordinates of each observation. The SH and LH staring observations include... isolate the nuclear region in the mapping observations, fluxes were extracted from a single slit coinciding with the radio or 2MASS nuclear... presence of a hard X-ray point source coincident with either the radio or 2MASS nucleus and log(LX) 38 erg s−1. The resulting subsample consists of
NASA Astrophysics Data System (ADS)
Chmiel, Malgorzata; Roux, Philippe; Herrmann, Philippe; Rondeleux, Baptiste; Wathelet, Marc
2018-05-01
We investigated the construction of diffraction kernels for surface waves using two-point convolution and/or correlation from land active seismic data recorded in the context of exploration geophysics. The high density of controlled sources and receivers, combined with the application of the reciprocity principle, allows us to retrieve two-dimensional phase-oscillation diffraction kernels (DKs) of surface waves between any two source or receiver points in the medium at each frequency (up to 15 Hz, at least). These DKs are purely data-based as no model calculations and no synthetic data are needed. They naturally emerge from the interference patterns of the recorded wavefields projected on the dense array of sources and/or receivers. The DKs are used to obtain multi-mode dispersion relations of Rayleigh waves, from which near-surface shear velocity can be extracted. Using convolution versus correlation with a grid of active sources is an important step in understanding the physics of the retrieval of surface wave Green's functions. This provides the foundation for future studies based on noise sources or active sources with a sparse spatial distribution.
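As a minimal orientation to the two-point correlation step described above (not the authors' processing chain), the sketch below stacks cross-correlations between two receiver gathers over a dense set of sources; in a synthetic test the peak lag of the stack recovers the imposed inter-receiver delay. Trace content, sampling and variable names are illustrative assumptions.

```python
# Hedged sketch: stacked two-point cross-correlation over many source records.
import numpy as np

def correlation_gather(gather_a, gather_b, dt):
    """Stack of cross-correlations between two receivers over all sources.

    gather_a, gather_b : (n_sources, n_samples) arrays of recorded traces.
    Returns lag times and the stacked correlation.
    """
    n_src, n_t = gather_a.shape
    n_lag = 2 * n_t - 1
    stack = np.zeros(n_lag)
    for s in range(n_src):
        stack += np.correlate(gather_a[s], gather_b[s], mode="full")
    lags = (np.arange(n_lag) - (n_t - 1)) * dt
    return lags, stack / n_src

# Synthetic example: receiver B records the same random wavefields as
# receiver A, delayed by a fixed number of samples.
rng = np.random.default_rng(0)
dt, n_t, n_src, delay_samples = 0.004, 500, 64, 25
a = rng.standard_normal((n_src, n_t))
b = np.roll(a, delay_samples, axis=1) + 0.1 * rng.standard_normal((n_src, n_t))
lags, stack = correlation_gather(a, b, dt)
# Magnitude of the peak lag equals the imposed delay (sign depends on the
# correlation convention): 25 * 0.004 s = 0.1 s.
print("peak lag magnitude [s]:", abs(lags[np.argmax(stack)]))
```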
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobulnicky, Henry A.; Alexander, Michael J.; Babler, Brian L.
We characterize the completeness of point source lists from Spitzer Space Telescope surveys in the four Infrared Array Camera (IRAC) bandpasses, emphasizing the Galactic Legacy Infrared Mid-Plane Survey Extraordinaire (GLIMPSE) programs (GLIMPSE I, II, 3D, 360; Deep GLIMPSE) and their resulting point source Catalogs and Archives. The analysis separately addresses effects of incompleteness resulting from high diffuse background emission and incompleteness resulting from point source confusion (i.e., crowding). An artificial star addition and extraction analysis demonstrates that completeness is strongly dependent on local background brightness and structure, with high-surface-brightness regions suffering up to five magnitudes of reduced sensitivity to point sources. This effect is most pronounced at the IRAC 5.8 and 8.0 μm bands where UV-excited polycyclic aromatic hydrocarbon emission produces bright, complex structures (photodissociation regions). With regard to diffuse background effects, we provide the completeness as a function of stellar magnitude and diffuse background level in graphical and tabular formats. These data are suitable for estimating completeness in the low-source-density limit in any of the four IRAC bands in GLIMPSE Catalogs and Archives and some other Spitzer IRAC programs that employ similar observational strategies and are processed by the GLIMPSE pipeline. By performing the same analysis on smoothed images we show that the point source incompleteness is primarily a consequence of structure in the diffuse background emission rather than photon noise. With regard to source confusion in the high-source-density regions of the Galactic Plane, we provide figures illustrating the 90% completeness levels as a function of point source density at each band. We caution that completeness of the GLIMPSE 360/Deep GLIMPSE Catalogs is suppressed relative to the corresponding Archives as a consequence of rejecting stars that lie in the point-spread function wings of saturated sources. This effect is minor in regions of low saturated star density, such as toward the Outer Galaxy; this effect is significant along sightlines having a high density of saturated sources, especially for Deep GLIMPSE and other programs observing closer to the Galactic center using 12 s or longer exposure times.
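The artificial-star test described above can be illustrated with a hedged sketch: synthetic Gaussian point sources are injected into an image with a structured diffuse background and "recovered" with a simple aperture signal-to-noise criterion. The PSF width, flux scale and detection threshold below are assumptions, not the GLIMPSE pipeline's values.

```python
# Hedged sketch of an artificial-source completeness estimate.
import numpy as np

rng = np.random.default_rng(1)
ny = nx = 256
yy, xx = np.mgrid[0:ny, 0:nx]
background = 50.0 + 30.0 * np.sin(xx / 15.0) * np.cos(yy / 20.0)  # structured diffuse emission
image = rng.poisson(background).astype(float)

def inject(img, x0, y0, flux, sigma=1.5):
    """Add a Gaussian point source of total 'flux' at (x0, y0)."""
    psf = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * sigma ** 2))
    return img + flux * psf / psf.sum()

def recovered(img, x0, y0, r=3, snr_min=5.0):
    """Simple matched-aperture S/N detection criterion (illustrative only)."""
    ap = (xx - x0) ** 2 + (yy - y0) ** 2 <= r ** 2
    sky = np.median(img[~ap])
    signal = (img[ap] - sky).sum()
    noise = np.sqrt(np.maximum(signal + ap.sum() * sky, 1.0))
    return signal / noise >= snr_min

fluxes = np.array([50, 100, 200, 400, 800, 1600], dtype=float)
n_trials = 200
for flux in fluxes:
    hits = 0
    for _ in range(n_trials):
        x0, y0 = rng.uniform(10, nx - 10), rng.uniform(10, ny - 10)
        hits += recovered(inject(image, x0, y0, flux), x0, y0)
    print(f"flux {flux:6.0f}: completeness {hits / n_trials:.2f}")
```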
Kamali, Tschackad; Považay, Boris; Kumar, Sunil; Silberberg, Yaron; Hermann, Boris; Werkmeister, René; Drexler, Wolfgang; Unterhuber, Angelika
2014-10-01
We demonstrate a multimodal optical coherence tomography (OCT) and online Fourier transform coherent anti-Stokes Raman scattering (FTCARS) platform using a single sub-12 femtosecond (fs) Ti:sapphire laser enabling simultaneous extraction of structural and chemical ("morphomolecular") information of biological samples. Spectral domain OCT prescreens the specimen providing a fast ultrahigh (4×12 μm axial and transverse) resolution wide field morphologic overview. Additional complementary intrinsic molecular information is obtained by zooming into regions of interest for fast label-free chemical mapping with online FTCARS spectroscopy. Background-free CARS is based on a Michelson interferometer in combination with a highly linear piezo stage, which allows for quick point-to-point extraction of CARS spectra in the fingerprint region in less than 125 ms with a resolution better than 4 cm(-1) without the need for averaging. OCT morphology and CARS spectral maps indicating phosphate and carbonate bond vibrations from human bone samples are extracted to demonstrate the performance of this hybrid imaging platform.
D-D neutron generator development at LBNL.
Reijonen, J; Gicquel, F; Hahto, S K; King, M; Lou, T-P; Leung, K-N
2005-01-01
The plasma and ion source technology group in Lawrence Berkeley National Laboratory is developing advanced, next generation D-D neutron generators. There are three distinctive developments, which are discussed in this presentation, namely a multi-stage, accelerator-based axial neutron generator, a high-output co-axial neutron generator and a point source neutron generator. These generators employ RF-induction discharge to produce deuterium ions. The distinctive feature of RF-discharge is its capability to generate high atomic hydrogen species, high current densities and stable, long-life operation. The axial neutron generator is designed for applications that require fast pulsing together with medium to high D-D neutron output. The co-axial neutron generator is aimed at high neutron output with cw or pulsed operation, using either the D-D or D-T fusion reaction. The point source neutron generator is a new concept, utilizing a toroidal-shaped plasma generator. The beam is extracted from multiple apertures and focused onto the target tube, which is located at the middle of the generator. This will generate a point source of D-D, T-T or D-T neutrons with high output flux. The latest developments, together with measured data, are discussed in this article.
SIFT optimization and automation for matching images from multiple temporal sources
NASA Astrophysics Data System (ADS)
Castillo-Carrión, Sebastián; Guerrero-Ginel, José-Emilio
2017-05-01
Scale Invariant Feature Transformation (SIFT) was applied to extract tie-points from multiple source images. Although SIFT is reported to perform reliably under widely different radiometric and geometric conditions, using the default input parameters resulted in too few points being found. We found that the best solution was to focus on large features as these are more robust and not prone to scene changes over time, which constitutes a first approach to the automation of processes using mapping applications such as geometric correction, creation of orthophotos and 3D models generation. The optimization of five key SIFT parameters is proposed as a way of increasing the number of correct matches; the performance of SIFT is explored in different images and parameter values, finding optimization values which are corroborated using different validation imagery. The results show that the optimization model improves the performance of SIFT in correlating multitemporal images captured from different sources.
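A minimal sketch of SIFT tie-point extraction with a Lowe ratio test and optional RANSAC filtering is given below, assuming OpenCV ≥ 4.4 (where SIFT ships in the main package). The five tunable parameters exposed by cv2.SIFT_create are shown with their default values to indicate the kind of tuning discussed; they are not the authors' optimized values, and the file names are hypothetical.

```python
# Hedged sketch of SIFT-based tie-point matching between two epochs/sources.
import cv2
import numpy as np

def sift_tie_points(img1_path, img2_path, ratio=0.75):
    img1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)
    # The five tunable SIFT parameters (defaults shown, not optimized values).
    sift = cv2.SIFT_create(nfeatures=0, nOctaveLayers=3,
                           contrastThreshold=0.04, edgeThreshold=10, sigma=1.6)
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des1, des2, k=2)
    # Lowe ratio test to keep distinctive correspondences.
    good = [m[0] for m in matches
            if len(m) == 2 and m[0].distance < ratio * m[1].distance]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    # Optional geometric filtering of the tie points with a RANSAC homography.
    if len(good) >= 4:
        _, mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, 3.0)
        if mask is not None:
            keep = mask.ravel() == 1
            pts1, pts2 = pts1[keep], pts2[keep]
    return pts1, pts2

# Usage (hypothetical file names):
# tie1, tie2 = sift_tie_points("epoch_2010.tif", "epoch_2017.tif")
```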
Choi, WooJhon; Baumann, Bernhard; Swanson, Eric A.; Fujimoto, James G.
2012-01-01
We present a numerical approach to extract the dispersion mismatch in ultrahigh-resolution Fourier domain optical coherence tomography (OCT) imaging of the retina. The method draws upon an analogy with a Shack-Hartmann wavefront sensor. By exploiting mathematical similarities between the expressions for aberration in optical imaging and dispersion mismatch in spectral / Fourier domain OCT, Shack-Hartmann principles can be extended from the two-dimensional paraxial wavevector space (or the x-y plane in the spatial domain) to the one-dimensional wavenumber space (or the z-axis in the spatial domain). For OCT imaging of the retina, different retinal layers, such as the retinal nerve fiber layer (RNFL), the photoreceptor inner and outer segment junction (IS/OS), or all the retinal layers near the retinal pigment epithelium (RPE) can be used as point source beacons in the axial direction, analogous to point source beacons used in conventional two-dimensional Shack-Hartmann wavefront sensors for aberration characterization. Subtleties regarding speckle phenomena in optical imaging, which affect the Shack-Hartmann wavefront sensor used in adaptive optics, also occur analogously in this application. Using this approach and carefully suppressing speckle, the dispersion mismatch in spectral / Fourier domain OCT retinal imaging can be successfully extracted numerically and used for numerical dispersion compensation to generate sharper, ultrahigh-resolution OCT images. PMID:23187353
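The following hedged sketch illustrates the general idea of numerical dispersion compensation in spectral/Fourier domain OCT, not the paper's Shack-Hartmann-analogy estimator: a second- and third-order phase is applied to a synthetic complex fringe before the FFT, and the coefficients are chosen by maximizing an axial-sharpness metric. All numbers are synthetic assumptions.

```python
# Hedged sketch of numerical dispersion compensation by sharpness maximization.
import numpy as np

n = 2048
k = np.linspace(-1.0, 1.0, n)                     # normalized wavenumber axis
true_a2, true_a3 = 40.0, 15.0                     # simulated dispersion mismatch
depth = 300                                       # reflector position (FFT bin)
fringe = np.exp(1j * (2 * np.pi * depth * np.arange(n) / n
                      + true_a2 * k**2 + true_a3 * k**3))

def sharpness(a2, a3):
    """Peakedness of the axial PSF after applying a candidate phase correction."""
    corrected = fringe * np.exp(-1j * (a2 * k**2 + a3 * k**3))
    psf = np.abs(np.fft.fft(corrected))
    psf /= psf.sum()
    return np.sum(psf**2)                         # higher = sharper axial peak

# Coarse grid search; a real implementation would optimize (e.g. scipy.optimize)
# and use a retinal layer as the axial "point source beacon".
best = max((sharpness(a2, a3), a2, a3)
           for a2 in np.arange(0.0, 80.0, 1.0)
           for a3 in np.arange(0.0, 30.0, 1.0))
print("estimated a2, a3:", best[1], best[2])      # should be close to 40, 15
```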
40 CFR Appendix 8 to Subpart A of... - Reference C16-C18 Internal Olefin Drilling Fluid Formulation
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 29 2010-07-01 2010-07-01 false Reference C16-C18 Internal Olefin Drilling Fluid Formulation 8 Appendix 8 to Subpart A of Part 435 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
40 CFR Appendix 8 to Subpart A of... - Reference C16-C18 Internal Olefin Drilling Fluid Formulation
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 30 2011-07-01 2011-07-01 false Reference C16-C18 Internal Olefin Drilling Fluid Formulation 8 Appendix 8 to Subpart A of Part 435 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) EFFLUENT GUIDELINES AND STANDARDS OIL AND GAS EXTRACTION POINT SOURCE CATEGORY...
NASA Astrophysics Data System (ADS)
Delgado, Carlos; Cátedra, Manuel Felipe
2018-05-01
This work presents a technique that allows a very noticeable relaxation of the computational requirements for full-wave electromagnetic simulations based on the Method of Moments. A ray-tracing analysis of the geometry is performed in order to extract the critical points with significant contributions. These points are then used to generate a reduced mesh, considering the regions of the geometry that surround each critical point and taking into account the electrical path followed from the source. The electromagnetic analysis of the reduced mesh produces very accurate results, requiring a fraction of the resources that the conventional analysis would utilize.
The Spitzer-IRAC Point-source Catalog of the Vela-D Cloud
NASA Astrophysics Data System (ADS)
Strafella, F.; Elia, D.; Campeggio, L.; Giannini, T.; Lorenzetti, D.; Marengo, M.; Smith, H. A.; Fazio, G.; De Luca, M.; Massi, F.
2010-08-01
This paper presents the observations of Cloud D in the Vela Molecular Ridge, obtained with the Infrared Array Camera (IRAC) on board the Spitzer Space Telescope at the wavelengths λ = 3.6, 4.5, 5.8, and 8.0 μm. A photometric catalog of point sources, covering a field of approximately 1.2 deg2, has been extracted and complemented with additional available observational data in the millimeter region. Previous observations of the same region, obtained with the Spitzer MIPS camera in the photometric bands at 24 μm and 70 μm, have also been reconsidered to allow an estimate of the spectral slope of the sources in a wider spectral range. A total of 170,299 point sources, detected at the 5σ sensitivity level in at least one of the IRAC bands, have been reported in the catalog. There were 8796 sources for which good quality photometry was obtained in all four IRAC bands. For this sample, a preliminary characterization of the young stellar population based on the determination of spectral slope is discussed; combining this with diagnostics in the color-magnitude and color-color diagrams, the relative population of young stellar objects (YSOs) in different evolutionary classes has been estimated and a total of 637 candidate YSOs have been selected. The main differences in their relative abundances have been highlighted and a brief account of their spatial distribution is given. The star formation rate has also been estimated and compared with the values derived for other star-forming regions. Finally, an analysis of the spatial distribution of the sources by means of the two-point correlation function shows that the younger population, constituted by the Class I and flat-spectrum sources, is significantly more clustered than the Class II and III sources.
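The spectral-slope computation underlying such classifications can be sketched as a least-squares fit of log(λF_λ) against log(λ) across the IRAC bands, followed by commonly used class boundaries; the cuts and fluxes below are illustrative and may differ from the paper's adopted criteria.

```python
# Hedged sketch of the spectral index alpha = d log(lambda*F_lambda) / d log(lambda).
import numpy as np

IRAC_WAVELENGTHS_UM = np.array([3.6, 4.5, 5.8, 8.0])

def spectral_index(flux_jy, wavelengths_um=IRAC_WAVELENGTHS_UM):
    """Least-squares slope of log(lambda*F_lambda) vs log(lambda).

    lambda*F_lambda = nu*F_nu is proportional to F_nu / lambda, so the
    constant conversion factors drop out of the slope.
    """
    lam_flam = flux_jy / wavelengths_um
    alpha, _ = np.polyfit(np.log10(wavelengths_um), np.log10(lam_flam), 1)
    return alpha

def classify(alpha):
    """Commonly used boundaries (assumed here, not necessarily the paper's)."""
    if alpha >= 0.3:
        return "Class I"
    if alpha >= -0.3:
        return "flat-spectrum"
    if alpha >= -1.6:
        return "Class II"
    return "Class III"

# Example with made-up IRAC fluxes (Jy):
fluxes = np.array([0.010, 0.014, 0.020, 0.035])
a = spectral_index(fluxes)
print(f"alpha = {a:.2f} -> {classify(a)}")
```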
Yttrium recovery from primary and secondary sources: A review of main hydrometallurgical processes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Innocenzi, Valentina, E-mail: valentina.innocenzi1@univaq.it; De Michelis, Ida; Kopacek, Bernd
2014-07-15
Highlights: • Review of the main hydrometallurgical processes to recover yttrium. • Recovery of yttrium from primary sources. • Recovery of yttrium from e-waste and other types of waste. - Abstract: Yttrium is an important rare earth (RE) used in numerous fields, mainly in phosphor powders for low-energy lighting. The uses of these elements, especially for high-tech products, have increased in recent years; combined with the scarcity of the resources and the environmental impact of the technologies to extract them from ores, this makes recycling of waste that contains Y and other REs a priority. The present review summarizes the main hydrometallurgical technologies to extract Y from ores, contaminated solutions, WEEE and generic wastes. Before discussing the work on the treatment of wastes, the processes to recover Y from ores are discussed, since the processes are similar and those for waste are derived from those already developed for extraction from primary sources. Particular attention was given to the recovery of Y from WEEE because its recycling is important not only from an economic point of view, considering its value, but also for the environmental impact that could be generated if it is not properly disposed of.
Comparison of Psilocybe cubensis spore and mycelium allergens.
Helbling, A; Horner, W E; Lehrer, S B
1993-05-01
Basidiospores are an important cause of respiratory allergy in mold-sensitive atopic subjects. Collection of the large amounts of spores required for extract preparation is tedious and difficult. A desirable alternative could be mycelium grown in vitro if it is allergenically similar to spores. Therefore this study compared the allergen contents of Psilocybe cubensis spore and mycelium extracts by different techniques with the use of pooled sera from subjects who had skin test and RAST results that were positive to P. cubensis spores. Isoelectric focusing immunoprints revealed six common IgE-binding bands at isoelectric points 4.7, 5.0, 5.5, 5.6, 8.7, and 9.3. Two additional bands at isoelectric points 3.9 and 5.7 were detected only in the spore extract. Sodium dodecylsulfate-polyacrylamide gel electrophoresis immunoblots exhibited six common IgE-binding bands at 16, 35, 487, 52, 62, and 76 kd; 20 and 40 kd bands were present only in the spore extract. Although RAST and isoelectric focusing inhibition demonstrated that P. cubensis spore and mycelium extracts share many allergens, spores were allergenically more potent than mycelium. The results indicate that mycelium is a useful source of P. cubensis allergen, even though several spore allergens were not detected in mycelium.
Ion Diode Experiments on PBFA-X
NASA Astrophysics Data System (ADS)
Lockner, Thomas
1996-05-01
The PBFA-II pulsed power accelerator at Sandia National Laboratories has been modified to replace the radially focusing ion diode with an extraction ion diode. In the extraction diode mode (PBFA X) the ion beam is generated on the surface of an annular disk and extracted along the cylindrical axis. An additional magnetically insulated transmission line (MITL) has been installed to transmit power from the center to the bottom of the accelerator, where it drives a magnetically insulated extraction ion diode. The modification increases access to the diode and the diagnostics, permitting a higher shot rate, and allows us to study extraction diode technology at a power level near what is required for a high yield facility. The modification also includes reversing the polarity of the top half of the accelerator to permit operation at twice the previous source voltage. In the new configuration the diode could operate at 15 MV and 0.8 MA. This operating point is near the 30 MV, 1.0 MA operating point envisioned for one module of a high yield facility, and will allow the study of intense extraction ion diodes at power levels relevant to such a facility. Experimental results will be presented including MITL coupling studies, beam current density control, discharge cleaning of diode surfaces to reduce the presence of contaminant ions in the source beam, and the effect of anode substrate materials on the purity of the lithium beam. A comparison between predicted and measured radial beam profiles will also be presented, with the predicted profiles obtained from the ATHETA code that solves magnetostatics problems in two dimensions. This work was supported by the US/DOE under contract No. DE-AC04-94AL85000. In collaboration with R. S. Coats, M. E. Cuneo, M. P. Desjarlias, D. J. Johnson, T. A. Mehlhorn, C. W. Mendel, Jr., P. Menge, and W. J. Poukey.
Dust Storm over the Middle East: Retrieval Approach, Source Identification, and Trend Analysis
NASA Astrophysics Data System (ADS)
Moridnejad, A.; Karimi, N.; Ariya, P. A.
2014-12-01
The Middle East region has been considered responsible for approximately 25% of the Earth's global emissions of dust particles. By developing the Middle East Dust Index (MEDI) and applying it to 70 dust storms characterized on MODIS images during the period between 2001 and 2012, we present a new high-resolution mapping of the major atmospheric dust source points in this region. To assist environmental managers and decision makers in taking proper and prioritized measures, we then categorize the identified sources in terms of intensity, based on indices extracted for the Deep Blue algorithm, and also use a frequency-of-occurrence approach to find the sensitive sources. In the next step, by applying spectral mixture analysis to Landsat TM images (1984 and 2012), a novel desertification map is presented. The aim is to understand how human perturbations and land-use change have influenced the dust storm points in the region. Preliminary results of this study indicate for the first time that ca. 39% of all detected source points are located in this newly anthropogenically desertified area. A large number of low-frequency sources are located within or close to the newly desertified areas. These severely desertified regions require immediate concern at a global scale. During the next six months, further research will be performed to confirm these preliminary results.
Munier, B; Bendell, L I
2018-01-01
Nine urban intertidal regions in Burrard Inlet, Vancouver, British Columbia, Canada, were sampled for plastic debris. Debris included macro and micro plastics and originated from a wide diversity of uses ranging from personal hygiene to solar cells. Debris was characterized for its polymer through standard physiochemical characteristics, then subject to a weak acid extraction to remove the metals, zinc, copper, cadmium and lead from the polymer. Recently manufactured low density polyethylene (LDPE), nylon, polyethylene terephthalate (PET), polypropylene (PP), polystyrene (PS) and polyvinyl chloride (PVC) were subject to the same extraction. Data was statistically analyzed by appropriate parametric and non-parametric tests when needed with significance set at P < 0.05. Polymers identified in field samples in order of abundance were; PVC (39), LDPE (28), PS (18), polyethylene (PE, 9), PP (8), nylon (8), high density polyethylene (HDPE, 7), polycarbonate (PC, 6), PET (6), polyurethane (PUR, 3) and polyoxymethylene (POM, 2). PVC and LDPE accounted for 46% of all samples. Field samples of PVC, HDPE and LDPE had significantly greater amounts of acid extracted copper and HDPE, LDPE and PUR significantly greater amounts of acid extracted zinc. PVC and LDPE had significantly greater amounts of acid extracted cadmium and PVC tended to have greater levels of acid extracted lead, significantly so for HDPE. Five of the collected items demonstrated extreme levels of acid extracted metal; greatest concentrations were 188, 6667, 698,000 and 930 μgg-1 of copper, zinc, lead and cadmium respectively recovered from an unidentified object comprised of PVC. Comparison of recently manufactured versus field samples indicated that recently manufactured samples had significantly greater amounts of acid extracted cadmium and zinc and field samples significantly greater amounts of acid extracted copper and lead which was primarily attributed to metal extracted from field samples of PVC. Plastic debris will affect metals within coastal ecosystems by; 1) providing a sorption site (copper and lead), notably for PVC 2) desorption from the plastic i.e., the "inherent" load (cadmium and zinc) and 3) serving as a point source of acute trace metal exposure to coastal ecosystems. All three mechanisms will put coastal ecosystems at risk to the toxic effects of these metals.
Markov Logic Networks for Adverse Drug Event Extraction from Text.
Natarajan, Sriraam; Bangera, Vishal; Khot, Tushar; Picado, Jose; Wazalwar, Anurag; Costa, Vitor Santos; Page, David; Caldwell, Michael
2017-05-01
Adverse drug events (ADEs) are a major concern and point of emphasis for the medical profession, government, and society. A diverse set of techniques from epidemiology, statistics, and computer science are being proposed and studied for ADE discovery from observational health data (e.g., EHR and claims data), social network data (e.g., Google and Twitter posts), and other information sources. Methodologies are needed for evaluating, quantitatively measuring, and comparing the ability of these various approaches to accurately discover ADEs. This work is motivated by the observation that text sources such as the Medline/Medinfo library provide a wealth of information on human health. Unfortunately, ADEs often result from unexpected interactions, and the connection between conditions and drugs is not explicit in these sources. Thus, in this work we address the question of whether we can quantitatively estimate relationships between drugs and conditions from the medical literature. This paper proposes and studies a state-of-the-art NLP-based extraction of ADEs from text.
Road and Roadside Feature Extraction Using Imagery and LIDAR Data for Transportation Operation
NASA Astrophysics Data System (ADS)
Ural, S.; Shan, J.; Romero, M. A.; Tarko, A.
2015-03-01
Transportation agencies require up-to-date, reliable, and feasibly acquired information on road geometry and features within proximity to the roads as input for evaluating and prioritizing new or improved road projects. The information needed for a robust evaluation of road projects includes road centerline, width, and extent, together with the average grade, cross-sections, and obstructions near the travelled way. Remote sensing offers a large collection of data and well-established tools for acquiring this information and extracting the aforementioned road features at various levels and scopes. Even with many remote sensing data sources and methods available for road extraction, transportation operation requires more than the centerlines. Acquiring information that is spatially coherent at the operational level for the entire road system is challenging and needs multiple data sources to be integrated. In the presented study, we established a framework that used data from multiple sources, including one-foot resolution color infrared orthophotos, airborne LiDAR point clouds, and existing spatially non-accurate ancillary road networks. We were able to extract 90.25% of a total of 23.6 miles of road networks together with estimated road width, average grade along the road, and cross sections at specified intervals. We also extracted buildings and vegetation within a predetermined proximity to the extracted road extent. 90.6% of 107 existing buildings were correctly identified, with a 31% false detection rate.
Automated Mounting Bias Calibration for Airborne LIDAR System
NASA Astrophysics Data System (ADS)
Zhang, J.; Jiang, W.; Jiang, S.
2012-07-01
Mounting bias is the major error source of an airborne LiDAR system. In this paper, an automated calibration method for estimating LiDAR system mounting parameters is introduced. The LiDAR direct geo-referencing model is used to calculate systematic errors. Because LiDAR footprints are discretely sampled, truly corresponding laser points hardly exist among different strips, so the traditional corresponding-point methodology does not apply directly to LiDAR strip registration. We propose a Virtual Corresponding Point Model (VCPM) to resolve the correspondence problem among discrete laser points. Each VCPM contains a corresponding point and three real laser footprints, and two rules are defined to calculate the tie point coordinate from the real laser footprints. The Scale Invariant Feature Transform (SIFT) is used to extract corresponding points in LiDAR strips, and the automatic workflow of LiDAR system calibration based on the VCPM is described in detail. Practical examples illustrate the feasibility and effectiveness of the proposed calibration method.
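For orientation, a hedged sketch of the LiDAR direct geo-referencing model on which mounting-bias (boresight) calibration rests is given below; the rotation conventions, parameter names and the numerical example are assumptions rather than the paper's exact formulation.

```python
# Hedged sketch of LiDAR direct geo-referencing with a boresight rotation.
import numpy as np

def rot_zyx(roll, pitch, yaw):
    """Rotation matrix from body to mapping frame, Z-Y-X (yaw-pitch-roll) order."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rz @ ry @ rx

def georeference(gnss_pos, attitude, boresight, lever_arm, scan_vector):
    """Ground point = GNSS position + R_nav @ (R_boresight @ x_scan + lever_arm)."""
    r_nav = rot_zyx(*attitude)          # navigation (roll, pitch, yaw), radians
    r_bore = rot_zyx(*boresight)        # small mounting-bias angles to be calibrated
    return gnss_pos + r_nav @ (r_bore @ scan_vector + lever_arm)

# A boresight error of 0.1 deg displaces a nadir point at 1000 m range by ~1.7 m.
scan = np.array([0.0, 0.0, -1000.0])               # nadir return, scanner frame
truth = georeference(np.zeros(3), (0, 0, 0), (0, 0, 0), np.zeros(3), scan)
biased = georeference(np.zeros(3), (0, 0, 0), (np.radians(0.1), 0, 0), np.zeros(3), scan)
print("offset [m]:", np.linalg.norm(biased - truth))
```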
NASA Astrophysics Data System (ADS)
Galvao, Diogo
2013-04-01
As a result of various economic, social and environmental factors, we can all experience the increase in importance of water resources at a global scale. As a consequence, we can also notice the increasing need for methods and systems capable of efficiently managing and combining the rich and heterogeneous data available that concerns, directly or indirectly, these water resources, such as in-situ monitoring station data, Earth Observation images and measurements, meteorological modeling forecasts and hydrological modeling. Under the scope of the MyWater project, we developed a water management system capable of satisfying just such needs, under a flexible platform capable of accommodating future challenges, not only in terms of sources of data but also in terms of applicable models to extract information from them. From a methodological point of view, the MyWater platform obtains data from distinct sources, and in distinct formats, be they satellite images or meteorological model forecasts, and transforms and combines them in ways that allow them to be fed to a variety of hydrological models (such as MOHID Land, SIMGRO, etc…), which themselves can also be combined, using such approaches as those advocated by the OpenMI standard, to extract information in an automated and time-efficient manner. Such an approach brings its own set of challenges, and further research was developed under this project on the best ways to combine such data and on novel approaches to hydrological modeling (like the PriceXD model). From a technical point of view, the MyWater platform is structured according to a classical SOA architecture, with a flexible object oriented modular backend service responsible for all the model process management and data treatment, while the information extracted can be interacted with using a variety of frontends, from a web portal, including also a desktop client, down to mobile phone and tablet applications. From an operational point of view, a user can not only see these model results on graphically rich user interfaces, but also interact with them in ways that allow them to extract their own information. This platform was then applied to a variety of case studies in the Netherlands, Greece, Portugal, Brazil and Africa, to verify the practicality, accuracy and value that it brings to end users and stakeholders.
A case for classifying the Rio Grande silvery minnow (Hybognathus amarus) as an omnivore
Hugo A. Magana
2007-01-01
The Rio Grande has been identified as one of the most endangered rivers in the United States by American Rivers. Water impoundment, water extraction, and point-source pollution have likely contributed to the decline of the federally endangered Rio Grande silvery minnow (Hybognathus amarus). The overall goal of this study was to locate, identify, and...
Code of Federal Regulations, 2011 CFR
2011-07-01
... which employ dump, heap, in situ leach or vat leach processes for the extraction of copper from ores or... as provided in subpart L of this part and 40 CFR 125.30 through 125.32, any existing point source... available (BPT): (a) The concentration of pollutants discharged in mine drainage from mines operated to...
Code of Federal Regulations, 2010 CFR
2010-07-01
... which employ dump, heap, in situ leach or vat leach processes for the extraction of copper from ores or... as provided in subpart L of this part and 40 CFR 125.30 through 125.32, any existing point source... available (BPT): (a) The concentration of pollutants discharged in mine drainage from mines operated to...
A New Method for Calculating Counts in Cells
NASA Astrophysics Data System (ADS)
Szapudi, István
1998-04-01
In the near future, a new generation of CCD-based galaxy surveys will enable high-precision determination of the N-point correlation functions. The resulting information will help to resolve the ambiguities associated with two-point correlation functions, thus constraining theories of structure formation, biasing, and Gaussianity of initial conditions independently of the value of Ω. As one of the most successful methods of extracting the amplitude of higher order correlations is based on measuring the distribution of counts in cells, this work presents an advanced way of measuring it with unprecedented accuracy. Szapudi & Colombi identified the main sources of theoretical errors in extracting counts in cells from galaxy catalogs. One of these sources, termed as measurement error, stems from the fact that conventional methods use a finite number of sampling cells to estimate counts in cells. This effect can be circumvented by using an infinite number of cells. This paper presents an algorithm, which in practice achieves this goal; that is, it is equivalent to throwing an infinite number of sampling cells in finite time. The errors associated with sampling cells are completely eliminated by this procedure, which will be essential for the accurate analysis of future surveys.
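For contrast, the conventional finite-sampling estimator of the counts-in-cells distribution is sketched below on a mock 2D catalog; the measurement error the paper eliminates is precisely the sampling noise introduced by the finite number of random cells used here. Cell size, catalog size and geometry are illustrative assumptions.

```python
# Conventional finite-sampling counts-in-cells estimator (sketch only).
import numpy as np

rng = np.random.default_rng(2)
n_gal, box = 20000, 1.0
catalog = rng.uniform(0, box, size=(n_gal, 2))          # mock "galaxy" positions

def counts_in_cells(catalog, cell_size, n_cells=2000, box=1.0, rng=rng):
    """Estimate P_N by throwing a finite number of random square cells."""
    lows = rng.uniform(0, box - cell_size, size=(n_cells, 2))
    counts = np.empty(n_cells, dtype=int)
    for i, (x0, y0) in enumerate(lows):                  # brute force; fine for a sketch
        inside = ((catalog[:, 0] >= x0) & (catalog[:, 0] < x0 + cell_size) &
                  (catalog[:, 1] >= y0) & (catalog[:, 1] < y0 + cell_size))
        counts[i] = inside.sum()
    values, freq = np.unique(counts, return_counts=True)
    return values, freq / n_cells                        # estimate of P_N

values, p_n = counts_in_cells(catalog, cell_size=0.02)
print("estimated mean count:", (values * p_n).sum(),
      " (Poisson expectation:", n_gal * 0.02**2, ")")
```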
Photogrammetric Method and Software for Stream Planform Identification
NASA Astrophysics Data System (ADS)
Stonedahl, S. H.; Stonedahl, F.; Lohberg, M. M.; Lusk, K.; Miller, D.
2013-12-01
Accurately characterizing the planform of a stream is important for many purposes, including recording measurement and sampling locations, monitoring change due to erosion or volumetric discharge, and spatial modeling of stream processes. While expensive surveying equipment or high resolution aerial photography can be used to obtain planform data, our research focused on developing a close-range photogrammetric method (and accompanying free/open-source software) to serve as a cost-effective alternative. This method involves securing and floating a wooden square frame on the stream surface at several locations, taking photographs from numerous angles at each location, and then post-processing and merging data from these photos using the corners of the square for reference points, unit scale, and perspective correction. For our test field site we chose a ~35m reach along Black Hawk Creek in Sunderbruch Park (Davenport, IA), a small, slow-moving stream with overhanging trees. To quantify error we measured 88 distances between 30 marked control points along the reach. We calculated error by comparing these 'ground truth' distances to the corresponding distances extracted from our photogrammetric method. We placed the square at three locations along our reach and photographed it from multiple angles. The square corners, visible control points, and visible stream outline were hand-marked in these photos using the GIMP (open-source image editor). We wrote an open-source GUI in Java (hosted on GitHub), which allows the user to load marked-up photos, designate square corners and label control points. The GUI also extracts the marked pixel coordinates from the images. We also wrote several scripts (currently in MATLAB) that correct the pixel coordinates for radial distortion using Brown's lens distortion model, correct for perspective by forcing the four square corner pixels to form a parallelogram in 3-space, and rotate the points in order to correctly orient all photos of the same square location. Planform data from multiple photos (and multiple square locations) are combined using weighting functions that mitigate the error stemming from the markup-process, imperfect camera calibration, etc. We have used our (beta) software to mark and process over 100 photos, yielding an average error of only 1.5% relative to our 88 measured lengths. Next we plan to translate the MATLAB scripts into Python and release their source code, at which point only free software, consumer-grade digital cameras, and inexpensive building materials will be needed for others to replicate this method at new field sites. (Figure: three sample photographs of the square with the created planform and control points.)
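A hedged sketch of the radial part of Brown's lens distortion model, as used in this kind of markup post-processing, is shown below; the inversion is by fixed-point iteration and the calibration coefficients are placeholders, not values from the study.

```python
# Hedged sketch: undistort pixel coordinates with Brown's radial model.
import numpy as np

def undistort_radial(points_px, principal_point, k1, k2, iterations=10):
    """Invert x_d = x_u * (1 + k1*r^2 + k2*r^4) by fixed-point iteration.

    points_px : (N, 2) distorted pixel coordinates.
    Returns (N, 2) undistorted pixel coordinates.
    """
    pts = np.asarray(points_px, dtype=float) - principal_point
    und = pts.copy()
    for _ in range(iterations):
        r2 = np.sum(und**2, axis=1, keepdims=True)
        und = pts / (1.0 + k1 * r2 + k2 * r2**2)
    return und + principal_point

# Example with placeholder principal point and distortion coefficients
# (expressed in units consistent with the pixel radius).
pp = np.array([1632.0, 1224.0])
pts = np.array([[2500.0, 2000.0], [100.0, 150.0]])
print(undistort_radial(pts, pp, k1=-2.0e-8, k2=0.0))
```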
A new morphology algorithm for shoreline extraction from DEM data
NASA Astrophysics Data System (ADS)
Yousef, Amr H.; Iftekharuddin, Khan; Karim, Mohammad
2013-03-01
Digital elevation models (DEMs) are a digital representation of elevations at regularly spaced points. They provide an accurate tool to extract shoreline profiles. One of the emerging sources for creating them is light detection and ranging (LiDAR), which can capture highly dense point clouds, with resolutions that can reach 15 cm and 100 cm in the vertical and horizontal directions respectively, in short periods of time. In this paper we present a multi-step morphological algorithm to extract shoreline locations from DEM data and a predefined tidal datum. Unlike similar approaches, it utilizes Lowess nonparametric regression to estimate the missing values within the DEM file. It also detects and eliminates the outliers and errors that result from waves, ships, etc. by means of an anomaly test with neighborhood constraints. Because there might be some significant broken regions, such as branches and islands, it utilizes a constrained morphological open and close to reduce these artifacts that can affect the extracted shorelines. In addition, it eliminates docks, bridges and fishing piers along the extracted shorelines by means of the Hough transform. Based on a specific tidal datum, the algorithm segments the DEM data into water and land objects. Without sacrificing the accuracy and the spatial details of the extracted boundaries, the algorithm smooths and extracts the shoreline profiles by tracing the boundary pixels between the land and the water segments. For given tidal values, we qualitatively assess the visual quality of the extracted shorelines by superimposing them on the available aerial photographs.
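The segmentation and morphological cleanup at the core of such an algorithm can be sketched as follows, assuming scipy and scikit-image are available; the Lowess gap filling, anomaly test and Hough-based pier removal steps are omitted and the DEM is synthetic.

```python
# Hedged sketch: land/water segmentation of a DEM and boundary tracing.
import numpy as np
from scipy import ndimage
from skimage import measure

# Synthetic DEM (meters): a gently sloping beach with small noise.
ny = nx = 300
yy, xx = np.mgrid[0:ny, 0:nx]
dem = 0.02 * (xx - 100) + 0.05 * np.sin(yy / 12.0)
dem += np.random.default_rng(3).normal(0, 0.03, dem.shape)

tidal_datum = 0.0                              # e.g. mean high water, in DEM units
land = dem > tidal_datum

# Morphological open/close to suppress small artifacts (waves, isolated
# returns) without destroying the main shoreline.
structure = np.ones((5, 5), dtype=bool)
land = ndimage.binary_opening(land, structure=structure)
land = ndimage.binary_closing(land, structure=structure)

# Trace the land/water boundary as sub-pixel contours at the 0.5 level.
contours = measure.find_contours(land.astype(float), 0.5)
shoreline = max(contours, key=len)             # keep the longest boundary
print("shoreline vertices:", shoreline.shape)
```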
NASA Astrophysics Data System (ADS)
Dogon-Yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.
2016-09-01
Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from such detailed, up-to-date data sources. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work and high financial requirements, which can be overcome by means of integrated LiDAR and digital image datasets. Compared to predominant studies on tree extraction, mainly in purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper shows that the integrated datasets are a suitable technology and a viable source of information for urban tree management. In conclusion, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.
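A hedged sketch of the kind of fusion rule such a semi-automated workflow relies on is given below: a vegetation index from the multispectral bands combined with a LiDAR-derived normalized height (nDSM = DSM − DTM). Band names, thresholds and array shapes are assumptions, not the paper's parameters.

```python
# Hedged sketch: candidate tree pixels from co-registered NDVI and nDSM rasters.
import numpy as np

def tree_mask(red, nir, dsm, dtm, ndvi_min=0.3, height_min=2.0):
    """Boolean mask of likely tree pixels: vegetated AND taller than shrubs."""
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)
    ndsm = dsm - dtm                          # canopy height model, meters
    return (ndvi > ndvi_min) & (ndsm > height_min)

# Toy example on a 4x4 tile: one tall vegetated pixel qualifies.
red = np.full((4, 4), 0.10)
nir = np.full((4, 4), 0.12)
nir[1, 2] = 0.40                              # vegetated pixel
dsm = np.zeros((4, 4))
dtm = np.zeros((4, 4))
dsm[1, 2] = 8.0                               # 8 m canopy height
print(tree_mask(red, nir, dsm, dtm))
```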
A Registration Method Based on Contour Point Cloud for 3D Whole-Body PET and CT Images
Yang, Qiyao; Wang, Zhiguo; Zhang, Guoxu
2017-01-01
The PET and CT fusion image, combining anatomical and functional information, has important clinical meaning. An effective registration of PET and CT images is the basis of image fusion. This paper presents a multithread registration method based on contour point clouds for 3D whole-body PET and CT images. Firstly, a geometric feature-based segmentation (GFS) method and a dynamic threshold denoising (DTD) method are proposed to preprocess CT and PET images, respectively. Next, a new automated trunk slice extraction method is presented for extracting feature point clouds. Finally, the multithread Iterative Closest Point algorithm is adopted to drive an affine transform. We compare our method with a multiresolution registration method based on Mattes Mutual Information on 13 pairs (246~286 slices per pair) of 3D whole-body PET and CT data. Experimental results demonstrate the registration effectiveness of our method, with lower negative normalization correlation (NC = −0.933) on feature images and less Euclidean distance error (ED = 2.826) on landmark points, outperforming the source data (NC = −0.496, ED = 25.847) and the compared method (NC = −0.614, ED = 16.085). Moreover, our method is about ten times faster than the compared one. PMID:28316979
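A minimal rigid ICP iteration (KD-tree nearest neighbours plus a closed-form Kabsch update) is sketched below for orientation; the paper's method is a multithreaded ICP driving an affine transform on contour point clouds, so this single-threaded rigid variant only illustrates the core loop on synthetic data.

```python
# Hedged sketch of a rigid Iterative Closest Point loop.
import numpy as np
from scipy.spatial import cKDTree

def icp_rigid(source, target, n_iter=30):
    src = source.copy()
    tree = cKDTree(target)
    r_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(n_iter):
        _, idx = tree.query(src)                    # closest target point per source point
        matched = target[idx]
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        h = (src - mu_s).T @ (matched - mu_t)
        u, _, vt = np.linalg.svd(h)
        d = np.sign(np.linalg.det(vt.T @ u.T))
        r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T     # Kabsch rotation
        t = mu_t - r @ mu_s
        src = src @ r.T + t
        r_total, t_total = r @ r_total, r @ t_total + t
    return r_total, t_total

# Synthetic test: recover a known small rotation and translation.
rng = np.random.default_rng(4)
source = rng.uniform(-1.0, 1.0, size=(2000, 3))
angle = np.radians(5.0)
r_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.05, -0.03, 0.02])
target = source @ r_true.T + t_true
r_est, t_est = icp_rigid(source, target)
# Both errors should be small once the iteration locks onto true correspondences.
print("rotation error:", np.linalg.norm(r_est - r_true),
      "translation error:", np.linalg.norm(t_est - t_true))
```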
Model for Semantically Rich Point Cloud Data
NASA Astrophysics Data System (ADS)
Poux, F.; Neuville, R.; Hallot, P.; Billen, R.
2017-10-01
This paper proposes an interoperable model for managing high dimensional point clouds while integrating semantics. Point clouds from sensors are a direct source of information physically describing the 3D state of the recorded environment. As such, they are an exhaustive representation of the real world at every scale: 3D reality-based spatial data. Their generation is increasingly fast, but processing routines and data models lack the knowledge to reason from information extraction rather than interpretation. The enhanced Smart Point Cloud model developed here brings intelligence to point clouds via three connected meta-models, linking available knowledge and classification procedures that permit semantic injection. Interoperability drives the model's adaptation to potentially many applications through specialized domain ontologies. A first prototype is implemented in Python and a PostgreSQL database and allows semantic and spatial concepts to be combined for basic hybrid queries on different point clouds.
Sambucus nigra extracts inhibit infectious bronchitis virus at an early point during replication
2014-01-01
Background Infectious bronchitis virus (IBV) is a pathogenic chicken coronavirus. Currently, vaccination against IBV is only partially protective; therefore, better preventions and treatments are needed. Plants produce antimicrobial secondary compounds, which may be a source for novel anti-viral drugs. Non-cytotoxic, crude ethanol extracts of Rhodiola rosea roots, Nigella sativa seeds, and Sambucus nigra fruit were tested for anti-IBV activity, since these safe, widely used plant tissues contain polyphenol derivatives that inhibit other viruses. Results Dose–response cytotoxicity curves on Vero cells using trypan blue staining determined the highest non-cytotoxic concentrations of each plant extract. To screen for IBV inhibition, cells and virus were pretreated with extracts, followed by infection in the presence of extract. Viral cytopathic effect was assessed visually following an additional 24 h incubation with extract. Cells and supernatants were harvested separately and virus titers were quantified by plaque assay. Variations of this screening protocol determined the effects of a number of shortened S. nigra extract treatments. Finally, S. nigra extract-treated virions were visualized by transmission electron microscopy with negative staining. Virus titers from infected cells treated with R. rosea and N. sativa extracts were not substantially different from infected cells treated with solvent alone. However, treatment with S. nigra extracts reduced virus titers by four orders of magnitude at a multiplicity of infection (MOI) of 1 in a dose-responsive manner. Infection at a low MOI reduced viral titers by six orders of magnitude and pretreatment of virus was necessary, but not sufficient, for full virus inhibition. Electron microscopy of virions treated with S. nigra extract showed compromised envelopes and the presence of membrane vesicles, which suggested a mechanism of action. Conclusions These results demonstrate that S. nigra extract can inhibit IBV at an early point in infection, probably by rendering the virus non-infectious. They also suggest that future studies using S. nigra extract to treat or prevent IBV or other coronaviruses are warranted. PMID:24433341
SGR 1822-1606: Constant Spin Period
NASA Astrophysics Data System (ADS)
Serim, M.; Baykal, A.; Inam, S. C.
2011-08-01
We have analyzed the light curve of the new source SGR 1822-1606 (Cummings et al., GCN 12159) using the real-time data of RXTE observations. We have extracted light curves for 11 pointings spanning about 20 days and employed a pulse timing analysis based on the harmonic representation of pulses. Using the cross correlation of the harmonic representations of the pulses, we have obtained pulse arrival times.
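The harmonic-representation timing step can be sketched as follows: each folded profile is expanded in a few Fourier harmonics and the phase offset relative to a template follows from the cross terms of the harmonic coefficients. The profile, harmonic count and weighting are illustrative assumptions, not the analysis actually applied to the RXTE data.

```python
# Hedged sketch of Fourier-domain pulse phase (arrival time) estimation.
import numpy as np

def harmonic_coeffs(profile, n_harm=4):
    return np.fft.rfft(profile)[1:n_harm + 1]        # drop the DC term

def phase_shift(profile, template, n_harm=4):
    """Phase (in cycles) by which `profile` lags `template`, amplitude-weighted."""
    c_p = harmonic_coeffs(profile, n_harm)
    c_t = harmonic_coeffs(template, n_harm)
    k = np.arange(1, n_harm + 1)
    # Harmonic k measures k * delta_phi; combine estimates with |c_t|^2 weights.
    dphi_k = np.angle(c_t * np.conj(c_p)) / (2 * np.pi * k)
    weights = np.abs(c_t) ** 2
    return np.sum(weights * dphi_k) / np.sum(weights)

nbins = 64
phase = np.arange(nbins) / nbins
template = 1.0 + 0.5 * np.cos(2 * np.pi * phase) + 0.2 * np.cos(4 * np.pi * phase)
true_shift = 0.05                                     # cycles
shifted = (1.0 + 0.5 * np.cos(2 * np.pi * (phase - true_shift))
               + 0.2 * np.cos(4 * np.pi * (phase - true_shift)))
print("recovered shift (cycles):", phase_shift(shifted, template))  # ~0.05
```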
Therapies from Fucoidan: An Update
Fitton, Janet Helen; Stringer, Damien N.; Karpiniec, Samuel S.
2015-01-01
Fucoidans are a class of sulfated fucose-rich polysaccharides found in brown marine algae and echinoderms. Fucoidans have an attractive array of bioactivities and potential applications including immune modulation, cancer inhibition, and pathogen inhibition. Research into fucoidan has continued to gain pace over the last few years and point towards potential therapeutic or adjunct roles. The source, extraction, characterization and detection of fucoidan is discussed. PMID:26389927
Zhang, Mingyuan; Fiol, Guilherme Del; Grout, Randall W.; Jonnalagadda, Siddhartha; Medlin, Richard; Mishra, Rashmi; Weir, Charlene; Liu, Hongfang; Mostafa, Javed; Fiszman, Marcelo
2014-01-01
Online knowledge resources such as Medline can address most clinicians' patient care information needs. Yet, significant barriers, notably lack of time, limit the use of these sources at the point of care. The most common information needs raised by clinicians are treatment-related. Comparative effectiveness studies allow clinicians to consider multiple treatment alternatives for a particular problem. Still, solutions are needed to enable efficient and effective consumption of comparative effectiveness research at the point of care. Objective: Design and assess an algorithm for automatically identifying comparative effectiveness studies and extracting the interventions investigated in these studies. Methods: The algorithm combines semantic natural language processing, Medline citation metadata, and machine learning techniques. We assessed the algorithm in a case study of treatment alternatives for depression. Results: Both precision and recall for identifying comparative studies were 0.83. A total of 86% of the interventions extracted perfectly or partially matched the gold standard. Conclusion: Overall, the algorithm achieved reasonable performance. The method provides building blocks for the automatic summarization of comparative effectiveness research to inform point-of-care decision-making. PMID:23920677
Single Crystal Diamond Needle as Point Electron Source.
Kleshch, Victor I; Purcell, Stephen T; Obraztsov, Alexander N
2016-10-12
Diamond has been considered one of the most attractive materials for cold-cathode applications during the past two decades. However, its real application is hampered by the necessity to provide an appropriate amount and transport of electrons to the emitter surface, which is usually achieved by using nanometer-size or highly defective crystallites having much lower physical characteristics than ideal diamond. Here, for the first time, the use of a single crystal diamond emitter with high aspect ratio as a point electron source is reported. Single crystal diamond needles were obtained by selective oxidation of polycrystalline diamond films produced by plasma enhanced chemical vapor deposition. Field emission currents and total electron energy distributions were measured for individual diamond needles as functions of extraction voltage and temperature. The needles demonstrate a current saturation phenomenon and sensitivity of emission to temperature. The analysis of the voltage drops measured via an electron energy analyzer shows that conduction proceeds along the surface of the diamond needles and is governed by the Poole-Frenkel transport mechanism with a characteristic trap energy of 0.2-0.3 eV. The temperature-sensitive FE characteristics of the diamond needles are of great interest for the production of point electron beam sources and sensors for vacuum electronics.
NASA Technical Reports Server (NTRS)
Heyer, Mark H.; Graham, J. A.
1990-01-01
Imaging and spectroscopic observations of HH55 in the Lupus molecular cloud are presented. Cohen and Schwartz (1987) have shown that HH55 is apparently not excited by the nearby T Tau star RU Lup as once thought but rather by the coincident FIR point source 15533 - 3742 extracted from IRAS coadded images. The optical counterpart of this IR source is identified as an active, relatively unobscured M-dwarf star. The forbidden emission lines observed in the stellar spectrum exhibit slight asymmetries to blueshifted velocities. Deconvolution of the emission lines reveals a weak moderate-velocity (-100 km/sec) wind component and a stronger emission component whose velocity is very close to that of the star.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pierret, C.; Maunoury, L.; Biri, S.
The goal of this article is to present simulations of the extraction from an electron cyclotron resonance ion source (ECRIS). The aim of this work is to find an extraction system that reduces the emittances and increases the current of the extracted ion beam at the focal point of the analyzing dipole. But first, we had to identify software able to reproduce the specific physics of an ion beam. To perform the simulations, the following software packages were tested: SIMION 3D, AXCEL, CPO 3D, and, especially for the magnetic field calculation, MATHEMATICA coupled with the RADIA module. Emittance calculations have been done for two types of ECRIS, one with a hexapole and one without, and the differences are discussed.
Modular 3D-Printed Soil Gas Probes
NASA Astrophysics Data System (ADS)
Good, S. P.; Selker, J. S.; Al-Qqaili, F.; Lopez, M.; Kahel, L.
2016-12-01
ABSTRACT: Extraction of soil gas is required for a variety of applications in earth sciences and environmental engineering. However, commercially available probes can be costly and are typically limited to a single depth. Here, we present the open-source design and lab testing of a soil gas probe with modular capabilities that allow for the vertical stacking of gas extraction points at different depths in the soil column. The probe modules consist of a 3D printed spacer unit and hydrophobic gas permeable membrane made of high density Polyethylene with pore sizes 20-40 microns. Each of the modular spacer units contain both a gas extraction line and gas input line for the dilution of soil gases if needed. These 2-inch diameter probes can be installed in the field quickly with a hand auger and returned to at any frequency to extract soil gas from desired soil depths. The probes are tested through extraction of soil pore water vapors with distinct stable isotope ratios.
Scan Line Based Road Marking Extraction from Mobile LiDAR Point Clouds.
Yan, Li; Liu, Hua; Tan, Junxiang; Li, Zan; Xie, Hong; Chen, Changjun
2016-06-17
Mobile Mapping Technology (MMT) is one of the most important 3D spatial data acquisition technologies. The state-of-the-art mobile mapping systems, equipped with laser scanners and named Mobile LiDAR Scanning (MLS) systems, have been widely used in a variety of areas, especially in road mapping and road inventory. With the commercialization of Advanced Driving Assistance Systems (ADASs) and self-driving technology, there will be a great demand for lane-level detailed 3D maps, and MLS is the most promising technology to generate such lane-level detailed 3D maps. Road markings and road edges are necessary information in creating such lane-level detailed 3D maps. This paper proposes a scan line based method to extract road markings from mobile LiDAR point clouds in three steps: (1) preprocessing; (2) road points extraction; (3) road markings extraction and refinement. In the preprocessing step, the isolated LiDAR points in the air are removed from the LiDAR point clouds and the point clouds are organized into scan lines. In the road points extraction step, seed road points are first extracted by the Height Difference (HD) between the trajectory data and the road surface, then full road points are extracted from the point clouds by moving least squares line fitting. In the road markings extraction and refinement step, the intensity values of road points in a scan line are first smoothed by a dynamic window median filter to suppress intensity noise, then road markings are extracted by the Edge Detection and Edge Constraint (EDEC) method, and the Fake Road Marking Points (FRMPs) are eliminated from the detected road markings by segment and dimensionality feature-based refinement. The performance of the proposed method is evaluated by three data samples and the experimental results indicate that road points are well extracted from MLS data and road markings are well extracted from road points by the applied method. A quantitative study shows that the proposed method achieves an average completeness, correctness, and F-measure of 0.96, 0.93, and 0.94, respectively. The time complexity analysis shows that the scan line based road markings extraction method proposed in this paper provides a promising alternative for offline road markings extraction from MLS data.
Scan Line Based Road Marking Extraction from Mobile LiDAR Point Clouds†
Yan, Li; Liu, Hua; Tan, Junxiang; Li, Zan; Xie, Hong; Chen, Changjun
2016-01-01
Mobile Mapping Technology (MMT) is one of the most important 3D spatial data acquisition technologies. The state-of-the-art mobile mapping systems, equipped with laser scanners and named Mobile LiDAR Scanning (MLS) systems, have been widely used in a variety of areas, especially in road mapping and road inventory. With the commercialization of Advanced Driving Assistance Systems (ADASs) and self-driving technology, there will be a great demand for lane-level detailed 3D maps, and MLS is the most promising technology to generate such lane-level detailed 3D maps. Road markings and road edges are necessary information in creating such lane-level detailed 3D maps. This paper proposes a scan line based method to extract road markings from mobile LiDAR point clouds in three steps: (1) preprocessing; (2) road points extraction; (3) road markings extraction and refinement. In the preprocessing step, the isolated LiDAR points in the air are removed from the LiDAR point clouds and the point clouds are organized into scan lines. In the road points extraction step, seed road points are first extracted by the Height Difference (HD) between the trajectory data and the road surface, then full road points are extracted from the point clouds by moving least squares line fitting. In the road markings extraction and refinement step, the intensity values of road points in a scan line are first smoothed by a dynamic window median filter to suppress intensity noise, then road markings are extracted by the Edge Detection and Edge Constraint (EDEC) method, and the Fake Road Marking Points (FRMPs) are eliminated from the detected road markings by segment and dimensionality feature-based refinement. The performance of the proposed method is evaluated by three data samples and the experimental results indicate that road points are well extracted from MLS data and road markings are well extracted from road points by the applied method. A quantitative study shows that the proposed method achieves an average completeness, correctness, and F-measure of 0.96, 0.93, and 0.94, respectively. The time complexity analysis shows that the scan line based road markings extraction method proposed in this paper provides a promising alternative for offline road markings extraction from MLS data. PMID:27322279
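The intensity-smoothing and edge-pairing steps described in the two records above lend themselves to a compact illustration. The sketch below is only a rough Python stand-in, assuming a range-dependent window rule and ad hoc jump/width thresholds; it is not the authors' EDEC implementation and the function names are invented.

```python
import numpy as np

def smooth_scanline_intensity(intensity, ranges, base_window=5, ref_range=10.0):
    """Median-filter per-point intensities along one scan line.

    The window grows with range so that sparser, more distant points are
    smoothed over a comparable ground distance (an assumed heuristic,
    standing in for the paper's dynamic-window median filter).
    """
    smoothed = np.empty(len(intensity), dtype=float)
    n = len(intensity)
    for i in range(n):
        half = max(1, int(round(base_window * ranges[i] / ref_range)) // 2)
        lo, hi = max(0, i - half), min(n, i + half + 1)
        smoothed[i] = np.median(intensity[lo:hi])
    return smoothed

def detect_marking_candidates(smoothed, jump=15.0, max_width=40):
    """Flag road-marking candidates between a rising and a falling intensity edge."""
    diff = np.diff(smoothed)
    mask = np.zeros(len(smoothed), dtype=bool)
    rising = np.where(diff > jump)[0]
    falling = np.where(diff < -jump)[0]
    for r in rising:
        f = falling[(falling > r) & (falling <= r + max_width)]
        if f.size:                      # edge constraint: paired edges close together
            mask[r + 1:f[0] + 1] = True
    return mask
```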
THE SPITZER-IRAC POINT-SOURCE CATALOG OF THE VELA-D CLOUD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strafella, F.; Elia, D.; Campeggio, L., E-mail: francesco.strafella@le.infn.i, E-mail: loretta.campeggio@le.infn.i, E-mail: eliad@oal.ul.p
2010-08-10
This paper presents the observations of Cloud D in the Vela Molecular Ridge, obtained with the Infrared Array Camera (IRAC) on board the Spitzer Space Telescope at the wavelengths λ = 3.6, 4.5, 5.8, and 8.0 μm. A photometric catalog of point sources, covering a field of approximately 1.2 deg², has been extracted and complemented with additional available observational data in the millimeter region. Previous observations of the same region, obtained with the Spitzer MIPS camera in the photometric bands at 24 μm and 70 μm, have also been reconsidered to allow an estimate of the spectral slope of the sources in a wider spectral range. A total of 170,299 point sources, detected at the 5σ sensitivity level in at least one of the IRAC bands, have been reported in the catalog. There were 8796 sources for which good quality photometry was obtained in all four IRAC bands. For this sample, a preliminary characterization of the young stellar population based on the determination of the spectral slope is discussed; combining this with diagnostics in the color-magnitude and color-color diagrams, the relative population of young stellar objects (YSOs) in different evolutionary classes has been estimated and a total of 637 candidate YSOs have been selected. The main differences in their relative abundances have been highlighted and a brief account of their spatial distribution is given. The star formation rate has also been estimated and compared with the values derived for other star-forming regions. Finally, an analysis of the spatial distribution of the sources by means of the two-point correlation function shows that the younger population, constituted by the Class I and flat-spectrum sources, is significantly more clustered than the Class II and III sources.
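The spectral slope used above to sort candidate YSOs into evolutionary classes is conventionally defined over the infrared bands as below; the class boundaries quoted are the commonly adopted ones and may differ slightly from the exact cuts used for this catalog:

\alpha = \frac{d\log(\lambda F_\lambda)}{d\log\lambda}, \qquad \text{Class I: } \alpha \gtrsim 0.3, \quad \text{flat-spectrum: } -0.3 \lesssim \alpha \lesssim 0.3, \quad \text{Class II: } -1.6 \lesssim \alpha \lesssim -0.3, \quad \text{Class III: } \alpha \lesssim -1.6.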
VizieR Online Data Catalog: ROSAT HRI Pointed Observations (1RXH) (ROSAT Team, 2000)
NASA Astrophysics Data System (ADS)
ROSAT Scientific Team
2000-05-01
The hricat.dat table contains a list of sources detected by the Standard Analysis Software System (SASS) in reprocessed, public High Resolution Imager (HRI) datasets. In addition to the parameters returned by SASS (like position, count rate, signal-to-noise, etc.) each source in the table has associated with it a set of source and sequence "flags". These flags are provided by the ROSAT data centers in the US, Germany and the UK to help the user of the ROSHRI database judge the reliability of a given source. These data have been screened by ROSAT data centers in the US, Germany, and the UK as a step in the production of the Rosat Results Archive (RRA). The RRA contains extracted source and associated products with an indication of reliability for the primary parameters. (3 data files).
NASA Astrophysics Data System (ADS)
Gong, Y.; Yang, Y.; Yang, X.
2018-04-01
To extract the productions of specific branching plants effectively and to realize their 3D reconstruction, terrestrial LiDAR data were used as the source for production extraction, and a 3D reconstruction method combining terrestrial LiDAR technology with the L-system is proposed in this article. The topological structure of the plant architecture was extracted from the point cloud of the target plant using a space-level segmentation mechanism. Subsequently, L-system productions were obtained, and the structural parameters and production rules of the branches that fit the given plant were generated. Finally, a three-dimensional simulation model of the target plant was established with a computer visualization algorithm. The results suggest that the method can effectively extract the topology of a given branching plant and describe its productions, realizing the extraction of the topological structure by a computer algorithm and simplifying the derivation of branching plant productions, which would otherwise be complex and time-consuming with the L-system alone. It improves the degree of automation in extracting L-system productions for specific branching plants, providing a new way to derive branching plant production rules.
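As a reminder of how extracted productions are used, the minimal sketch below expands an L-system string by parallel rewriting; the axiom and rules are invented placeholders, not the productions derived from the point cloud in this work.

```python
def expand_lsystem(axiom, rules, iterations):
    """Iteratively rewrite an L-system string using context-free productions."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Hypothetical productions of the kind a point-cloud-derived L-system might contain:
# F = grow a branch segment, [ ] = push/pop branching state, + / - = rotate.
rules = {"A": "F[+A][-A]FA", "F": "FF"}
print(expand_lsystem("A", rules, 3))
```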
Preview of the BATSE Earth Occultation Catalog of Low Energy Gamma Ray Sources
NASA Technical Reports Server (NTRS)
Harmon, B. A.; Wilson, C. A.; Fishman, G. J.; McCollough, M. L.; Robinson, C. R.; Sahi, M.; Paciesas, W. S.; Zhang, S. N.
1999-01-01
The Burst and Transient Source Experiment (BATSE) aboard the Compton Gamma Ray Observatory (CGRO) has been detecting and monitoring point sources in the high energy sky since 1991. Although BATSE is best known for gamma ray bursts, it also monitors the sky for longer-lived sources of radiation. Using the Earth occultation technique to extract flux information, a catalog is being prepared of about 150 sources with potential emission in the large area detectors (20-1000 keV). The catalog will contain light curves, representative spectra, and parametric data for black hole and neutron star binaries, active galaxies, and supernova remnants. In this preview, we present light curves for persistent and transient sources, and also show examples of the type of information that can be obtained from the BATSE Earth occultation database. Options for making the data easily accessible as an "on line" WWW document are being explored.
New low-resolution spectrometer spectra for IRAS sources
NASA Astrophysics Data System (ADS)
Volk, Kevin; Kwok, Sun; Stencel, R. E.; Brugel, E.
1991-12-01
Low-resolution spectra of 486 IRAS point sources with Fnu(12 microns) in the range 20-40 Jy are presented. This is part of an effort to extract and classify spectra that were not included in the Atlas of Low-Resolution Spectra and represents an extension of the earlier work by Volk and Cohen which covers sources with Fnu(12 microns) greater than 40 Jy. The spectra have been examined by eye and classified into nine groups based on the spectral morphology. This new classification scheme is compared with the mechanical classification of the Atlas, and the differences are noted. Oxygen-rich stars of the asymptotic giant branch make up 33 percent of the sample. Solid state features dominate the spectra of most sources. It is found that the nature of the sources as implied by the present spectral classification is consistent with the classifications based on broad-band colors of the sources.
The Red Seaweed Gracilaria gracilis as a Multi Products Source
Francavilla, Matteo; Franchi, Massimo; Monteleone, Massimo; Caroppo, Carmela
2013-01-01
In recent years seaweeds have increasingly attracted interest in the search for new drugs and have been shown to be a primary source of bioactive natural compounds and biomaterials. In the present investigation, the biochemical composition of the red seaweed Gracilaria gracilis, collected seasonally in the Lesina Lagoon (Southern Adriatic Sea, Lesina, Italy), was assayed by means of advanced analytical techniques, such as gas chromatography coupled with mass spectrometry and spectrophotometric tests. In particular, analyses of lipids, fatty acids, sterols, proteins, phycobiliproteins and carbohydrates, as well as of phenolic content, antioxidant and radical scavenging activity, were performed. In winter extracts of G. gracilis, a high content of R-phycoerythrin together with other valuable products such as arachidonic acid (PUFA ω-6), proteins and carbohydrates was observed. High antioxidant and radical scavenging activities were also detected in summer extracts of the seaweed, together with a high content of total phenols. In conclusion, this study points out the possibility of using Gracilaria gracilis as a multi-product source for biotechnological, nutraceutical and pharmaceutical applications, even though more investigation is required for separating, purifying and characterizing these bioactive compounds. PMID:24084791
NASA Astrophysics Data System (ADS)
Li, Jun; Qin, Qiming; Xie, Chao; Zhao, Yue
2012-10-01
The update frequency of digital road maps influences the quality of road-dependent services. However, digital road maps surveyed by probe vehicles or extracted from remotely sensed images still have a long updating cycle, and their cost remains high. With GPS technology and wireless communication technology maturing and their costs decreasing, floating car technology has been used in traffic monitoring and management, and the dynamic positioning data from floating cars have become a new data source for updating road maps. In this paper, we aim to update digital road maps using the floating car data from China's National Commercial Vehicle Monitoring Platform, and present an incremental road network extraction method suitable for the platform's GPS data, whose sampling frequency is low and which cover a large area. Based on both the spatial and semantic relationships between a trajectory point and its associated road segment, the method classifies each trajectory point, and then merges every trajectory point into the candidate road network through an adding or modifying process according to its type. The road network is gradually updated until all trajectories have been processed. Finally, this method is applied in the updating process of major roads in North China, and the experimental results reveal that it can accurately derive the geometric information of roads in various scenes. This paper provides a highly efficient, low-cost approach to updating digital road maps.
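A minimal sketch of the spatial side of the point-to-segment association described above is given below; the distance and heading thresholds, the labels, and the planar coordinates are illustrative assumptions rather than the parameters of the platform's method.

```python
import math

def point_segment_distance(p, a, b):
    """Perpendicular (clamped) distance from GPS point p to road segment a-b (planar coords)."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        t = 0.0
    else:
        t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def classify_point(p, heading, segment, seg_heading, d_max=20.0, h_max=30.0):
    """Label a trajectory point relative to its nearest candidate road segment.

    'match' -> point refines the geometry of an existing road,
    'new'   -> point likely belongs to a road missing from the map.
    Assumed thresholds: 20 m offset and 30 degrees heading difference.
    """
    d = point_segment_distance(p, *segment)
    dh = abs((heading - seg_heading + 180) % 360 - 180)
    return "match" if d <= d_max and dh <= h_max else "new"
```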
A new network of faint calibration stars from the near infrared spectrometer (NIRS) on the IRTS
NASA Technical Reports Server (NTRS)
Freund, Minoru M.; Matsuura, Mikako; Murakami, Hiroshi; Cohen, Martin; Noda, Manabu; Matsuura, Shuji; Matsumoto, Toshio
1997-01-01
The point source extraction and calibration of the near infrared spectrometer (NIRS) onboard the Infrared Telescope in Space (IRTS) is described. About 7 percent of the sky was observed during a one month mission in the range of 1.4 micrometers to 4 micrometers. The accuracy of the spectral shape and absolute values of calibration stars provided by the NIRS/IRTS were validated.
A Survey of Insider Attack Detection Research
2008-08-25
modeling of statistical features, such as the frequency of events, the duration of events, the co-occurrence of multiple events combined through ... forms of attack that have been reported. For example: • Unauthorized extraction, duplication, or exfiltration ... network level. Schultz pointed out that not one approach will work but solutions need to be based on multiple sensors to be able to find any combination
Martins, Ana; Barros, Lillian; Carvalho, Ana Maria; Santos-Buelga, Celestino; Fernandes, Isabel P; Barreiro, Filomena; Ferreira, Isabel C F R
2014-06-01
Rubus ulmifolius Schott (Rosaceae), known as wild blackberry, is a perennial shrub found in wild and cultivated habitats in Europe, Asia and North Africa. Traditionally, it is used for homemade remedies because of its medicinal properties, including antioxidant activity. In the present work, phenolic extracts of R. ulmifolius flower buds obtained by decoction and hydroalcoholic extraction were chemically and biologically characterized. Several phenolic compounds were identified in both decoction and hydroalcoholic extracts of flowers, ellagitannin derivatives being the most abundant ones, namely the sanguiin H-10 isomer and lambertianin. Additionally, compared with the decoction form, the hydroalcoholic extract presented both a higher phenolic content and a higher antioxidant activity. The hydroalcoholic extract was thereafter microencapsulated in an alginate-based matrix and incorporated into a yogurt to achieve antioxidant benefits. Regarding the incorporation tests, the results showed that, among the tested samples, the yogurt containing the microencapsulated extract presented a slightly higher antioxidant activity, and that both forms (free and microencapsulated extracts) gave rise to products with higher activity than the control. In conclusion, this study demonstrated the antioxidant potential of the R. ulmifolius hydroalcoholic extract and the effectiveness of the microencapsulation technique used for its preservation, thus opening new prospects for the exploitation of these natural phenolic extracts in food applications.
Dependence of the source performance on plasma parameters at the BATMAN test facility
NASA Astrophysics Data System (ADS)
Wimmer, C.; Fantz, U.
2015-04-01
The investigation of the dependence of the source performance (high jH-, low je) for optimum Cs conditions on the plasma parameters at the BATMAN (Bavarian Test MAchine for Negative hydrogen ions) test facility is desirable in order to find key parameters for the operation of the source as well as to deepen the physical understanding. The most relevant source physics takes place in the extended boundary layer, which is the plasma layer with a thickness of several cm in front of the plasma grid: the production of H-, its transport through the plasma and its extraction, inevitably accompanied by the co-extraction of electrons. Hence, a link of the source performance with the plasma parameters in the extended boundary layer is expected. In order to characterize electron and negative hydrogen ion fluxes in the extended boundary layer, Cavity Ring-Down Spectroscopy and Langmuir probes have been applied for the measurement of the H- density and the determination of the plasma density, the plasma potential and the electron temperature, respectively. The plasma potential is of particular importance as it determines the sheath potential profile at the plasma grid: depending on the plasma grid bias relative to the plasma potential, a transition in the plasma sheath from an electron repelling to an electron attracting sheath takes place, influencing strongly the electron fraction of the bias current and thus the amount of co-extracted electrons. Dependencies of the source performance on the determined plasma parameters are presented for the comparison of two source pressures (0.6 Pa, 0.45 Pa) in hydrogen operation. The higher source pressure of 0.6 Pa is a standard point of operation at BATMAN with external magnets, whereas the lower pressure of 0.45 Pa is closer to the ITER requirements (p ≤ 0.3 Pa).
Computer-assisted 3D kinematic analysis of all leg joints in walking insects.
Bender, John A; Simpson, Elaine M; Ritzmann, Roy E
2010-10-26
High-speed video can provide fine-scaled analysis of animal behavior. However, extracting behavioral data from video sequences is a time-consuming, tedious, subjective task. These issues are exacerbated where accurate behavioral descriptions require analysis of multiple points in three dimensions. We describe a new computer program written to assist a user in simultaneously extracting three-dimensional kinematics of multiple points on each of an insect's six legs. Digital video of a walking cockroach was collected in grayscale at 500 fps from two synchronized, calibrated cameras. We improved the legs' visibility by painting white dots on the joints, similar to techniques used for digitizing human motion. Compared to manual digitization of 26 points on the legs over a single, 8-second bout of walking (or 106,496 individual 3D points), our software achieved approximately 90% of the accuracy with 10% of the labor. Our experimental design reduced the complexity of the tracking problem by tethering the insect and allowing it to walk in place on a lightly oiled glass surface, but in principle, the algorithms implemented are extensible to free walking. Our software is free and open-source, written in the free language Python and including a graphical user interface for configuration and control. We encourage collaborative enhancements to make this tool both better and widely utilized.
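For readers unfamiliar with the geometry behind such tools, the sketch below shows the standard linear (DLT) triangulation of one painted joint marker from two calibrated, synchronized cameras; the projection matrices are placeholders and this is not code from the published package.

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two calibrated cameras.

    P1, P2 : 3x4 camera projection matrices (from calibration)
    x1, x2 : (u, v) pixel coordinates of the same marker in each synchronized frame
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)   # least-squares solution is the last right singular vector
    X = vt[-1]
    return X[:3] / X[3]           # homogeneous -> Euclidean coordinates
```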
NASA Astrophysics Data System (ADS)
Alaraj, Muhannad; Radenkovic, Miloje; Park, Jae-Do
2017-02-01
Microbial fuel cells (MFCs) are renewable and sustainable energy sources that can be used for various applications. The MFC output power depends on its biochemical conditions as well as the terminal operating point in terms of output voltage and current. There exists one operating point that gives the maximum possible power from the MFC, the maximum power point (MPP), for a given operating condition. However, this MPP may vary and needs to be tracked in order to maintain maximum power extraction from the MFC. Furthermore, MFC reactors often develop voltage overshoots that cause drastic drops in the terminal voltage, current, and output power. When a voltage overshoot happens, an additional control measure is necessary, as conventional MPPT algorithms will fail because of the change in the voltage-current relationship. In this paper, the extremum seeking (ES) algorithm was used to track the varying MPP, and a voltage overshoot avoidance (VOA) algorithm was developed to manage voltage overshoot conditions. The proposed ES-MPPT with VOA algorithm was able to extract 197.2 mJ during 10-min operation while avoiding voltage overshoot, whereas the ES MPPT-only scheme stopped harvesting after only 18.75 mJ because of the voltage overshoot that occurred at 0.4 min.
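A heavily simplified sketch of the idea, combining a perturbation-based hill climb with a crude voltage guard, is given below; it is not the authors' ES-MPPT/VOA controller, and the callbacks, thresholds, and step sizes are invented for illustration.

```python
def mppt_with_overshoot_guard(measure_vi, set_current, i0=0.01, step=0.001,
                              v_min=0.2, backoff=0.5, steps=200):
    """Toy hill-climbing maximum power point tracker with a voltage guard.

    measure_vi()    : returns (voltage, current) at the MFC terminals
    set_current(i)  : commands the load/converter current draw
    If the terminal voltage collapses below v_min (an overshoot symptom),
    the demanded current is backed off instead of following the power gradient.
    """
    i_cmd, p_prev, direction = i0, 0.0, +1
    for _ in range(steps):
        set_current(i_cmd)
        v, i = measure_vi()
        p = v * i
        if v < v_min:                 # assumed voltage-overshoot avoidance rule
            i_cmd *= backoff
            p_prev = 0.0
            continue
        if p < p_prev:                # power dropped: reverse the perturbation direction
            direction = -direction
        i_cmd = max(0.0, i_cmd + direction * step)
        p_prev = p
    return i_cmd
```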
NASA Astrophysics Data System (ADS)
Dogon-yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.
2016-10-01
Timely and accurate acquisition of information on the condition and structural changes of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting tree features include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work, high cost, and the influence of weather conditions and topographic cover, which can be overcome by means of integrated airborne LiDAR and very high resolution digital image datasets. This study presents a semi-automated approach for extracting urban trees from integrated airborne LiDAR and multispectral digital image datasets over Istanbul city of Turkey. The scheme includes the detection and extraction of shadow-free vegetation features based on the spectral properties of the digital images, using shadow index and NDVI techniques, and the automated extraction of 3D information about vegetation features from the integrated processing of the shadow-free vegetation image and the LiDAR point cloud datasets. The developed algorithms show promising results as an automated and cost-effective approach to estimating and delineating 3D information of urban trees. The research also shows that the integrated datasets are a suitable technology and a viable source of information for city managers to use in urban tree management.
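The NDVI step mentioned above reduces to a simple band ratio; the sketch below assumes co-registered red and near-infrared arrays and an illustrative 0.3 threshold, which need not match the values used in the study.

```python
import numpy as np

def ndvi_vegetation_mask(red, nir, threshold=0.3):
    """NDVI = (NIR - Red) / (NIR + Red); pixels above the threshold are treated as vegetation."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)   # guard against division by zero
    return ndvi > threshold, ndvi
```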
Jochmann, A; Irman, A; Bussmann, M; Couperus, J P; Cowan, T E; Debus, A D; Kuntzsch, M; Ledingham, K W D; Lehnert, U; Sauerbrey, R; Schlenvoigt, H P; Seipt, D; Stöhlker, Th; Thorn, D B; Trotsenko, S; Wagner, A; Schramm, U
2013-09-13
Thomson backscattering of intense laser pulses from relativistic electrons not only allows for the generation of bright x-ray pulses but also for the investigation of the complex particle dynamics at the interaction point. For this purpose a complete spectral characterization of a Thomson source powered by a compact linear electron accelerator is performed with unprecedented angular and energy resolution. A rigorous statistical analysis comparing experimental data to 3D simulations enables, e.g., the extraction of the angular distribution of electrons with 1.5% accuracy and, in total, provides predictive capability for the future high brightness hard x-ray source PHOENIX (photon electron collider for narrow bandwidth intense x rays) and potential gamma-ray sources.
Joint classification and contour extraction of large 3D point clouds
NASA Astrophysics Data System (ADS)
Hackel, Timo; Wegner, Jan D.; Schindler, Konrad
2017-08-01
We present an effective and efficient method for point-wise semantic classification and extraction of object contours of large-scale 3D point clouds. What makes point cloud interpretation challenging is the sheer size of several millions of points per scan and the non-grid, sparse, and uneven distribution of points. Standard image processing tools like texture filters, for example, cannot handle such data efficiently, which calls for dedicated point cloud labeling methods. It turns out that one of the major drivers for efficient computation and handling of strong variations in point density is a careful formulation of per-point neighborhoods at multiple scales. This allows us both to define an expressive feature set and to extract topologically meaningful object contours. Semantic classification and contour extraction are interlaced problems. Point-wise semantic classification enables extracting a meaningful candidate set of contour points while contours help generating a rich feature representation that benefits point-wise classification. These methods are tailored to have fast run time and small memory footprint for processing large-scale, unstructured, and inhomogeneous point clouds, while still achieving high classification accuracy. We evaluate our methods on the semantic3d.net benchmark for terrestrial laser scans with >10⁹ points.
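One common way to realize per-point neighborhoods at multiple scales is through covariance (eigenvalue) features computed at several radii, sketched below; the radii and the particular feature set are assumptions for illustration, not necessarily those used by the authors.

```python
import numpy as np
from scipy.spatial import cKDTree

def multiscale_features(points, radii=(0.25, 0.5, 1.0, 2.0)):
    """Eigenvalue-based geometric features of each point at several neighborhood radii."""
    points = np.asarray(points, dtype=float)       # shape (N, 3)
    tree = cKDTree(points)
    feats = np.zeros((len(points), 3 * len(radii)))
    for s, r in enumerate(radii):
        for i, p in enumerate(points):
            idx = tree.query_ball_point(p, r)
            if len(idx) < 3:
                continue                            # too few neighbors at this scale
            w = np.linalg.eigvalsh(np.cov(points[idx].T))[::-1]   # l1 >= l2 >= l3
            l1, l2, l3 = np.maximum(w, 1e-12)
            feats[i, 3 * s:3 * s + 3] = ((l1 - l2) / l1,   # linearity (edges/contours)
                                         (l2 - l3) / l1,   # planarity (facades, ground)
                                         l3 / l1)          # scattering (vegetation)
    return feats
```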
IN VIVO STUDIES AND STABILITY STUDY OF CLADOPHORA GLOMERATA EXTRACT AS A COSMETIC ACTIVE INGREDIENT.
Fabrowska, Joanna; Kapuscinska, Alicja; Leska, Boguslawa; Feliksik-Skrobich, Katarzyna; Nowak, Izabela
2017-03-01
Marine algae are widely used as cosmetics raw materials. Likewise, the freshwater alga Cladophora glomerata may be a good source of fatty acids and other bioactive agents. The aims of this study were to find out whether the addition of the extract from the freshwater C. glomerata affects the stability of prepared cosmetic emulsions and to investigate the in vivo effects of the extract in cosmetic formulations on the hydration and elasticity of human skin. The extract from the freshwater C. glomerata was obtained using supercritical fluid extraction (SFE). Two forms of O/W emulsions were prepared: a placebo and an emulsion containing 0.5% of Cladophora SFE extract. The stability of the obtained emulsions was investigated using a Turbiscan Lab Expert. Emulsions were applied by volunteers daily. A corneometer was used to evaluate skin hydration and a cutometer to examine skin elasticity. Measurements were conducted at the reference point (week 0) and after the 1st, 2nd, 3rd and 4th week of application. The addition of the Cladophora extract did not significantly affect the stability of the emulsion. The extract from C. glomerata in the emulsion improved both skin hydration and its elasticity. Thus, the freshwater C. glomerata extract prepared via the SFE method may be considered an effective cosmetic raw material used as a moisturizing and firming agent.
Klassen, Tara L.; von Rüden, Eva-Lotta; Drabek, Janice; Noebels, Jeffrey L.; Goldman, Alica M.
2013-01-01
Genetic testing and research have increased the demand for high-quality DNA that has traditionally been obtained by venipuncture. However, venous blood collection may prove difficult in special populations and when large-scale specimen collection or exchange is prerequisite for international collaborative investigations. Guthrie/FTA card–based blood spots, buccal scrapes, and finger nail clippings are DNA-containing specimens that are uniquely accessible and thus attractive as alternative tissue sources (ATS). The literature details a variety of protocols for extraction of nucleic acids from a singular ATS type, but their utility has not been systematically analyzed in comparison with conventional sources such as venous blood. Additionally, the efficacy of each protocol is often equated with the overall nucleic acid yield but not with the analytical performance of the DNA during mutation detection. Together with a critical in-depth literature review of published extraction methods, we developed and evaluated an all-inclusive approach for serial, systematic, and direct comparison of DNA utility from multiple biological samples. Our results point to the often underappreciated value of these alternative tissue sources and highlight ways to maximize the ATS-derived DNA for optimal quantity, quality, and utility as a function of extraction method. Our comparative analysis clarifies the value of ATS in genomic analysis projects for population-based screening, diagnostics, molecular autopsy, medico-legal investigations, or multi-organ surveys of suspected mosaicisms. PMID:22796560
NASA Astrophysics Data System (ADS)
Capuano, P.; De Lauro, E.; De Martino, S.; Falanga, M.
2016-04-01
This work is devoted to the analysis of seismic signals continuously recorded at Campi Flegrei Caldera (Italy) during the entire year 2006. The radiation pattern associated with the Long-Period energy release is investigated. We adopt an innovative Independent Component Analysis algorithm for convolutive seismic series, adapted and improved to give automatic procedures for detecting seismic events often buried in the high-level ambient noise. The extracted waveforms, characterized by an improved signal-to-noise ratio, allow the recognition of Long-Period precursors, evidencing that the seismic activity accompanying the mini-uplift crisis (in 2006), which climaxed in the three days from 26-28 October, had already started at the beginning of October and lasted until mid-November. Hence, a more complete seismic catalog is provided, which can be used to properly quantify the seismic energy release. To better ground our results, we first check the robustness of the method by comparing it with other blind source separation methods based on higher order statistics; secondly, we reconstruct the radiation patterns of the extracted Long-Period events in order to link the individuated signals directly to the sources. We take advantage of Convolutive Independent Component Analysis (CICA), which provides basic signals along the three directions of motion so that a direct polarization analysis can be performed with no other filtering procedures. We show that the extracted signals are mainly composed of P waves with radial polarization pointing to the seismic source of the main LP swarm, i.e. a small area in the Solfatara, also in the case of the small events that both precede and follow the main activity. From a dynamical point of view, they can be described by two degrees of freedom, indicating a low level of complexity associated with the vibrations from a superficial hydrothermal system. Our results allow us to move towards a full description of the complexity of the source, which can be used, by means of the small-intensity precursors, for hazard-model development and forecast-model testing, showing an illustrative example of the applicability of the CICA method to regions with low seismicity in high ambient noise.
Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction
NASA Astrophysics Data System (ADS)
Zang, Y.; Yang, B.
2018-04-01
3D laser scanning technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. To solve this problem, most existing methods extract important points based on a fixed scale. However, the geometric features of a 3D object come from various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply a perceptual metric, the Just-Noticeable-Difference, to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for optimal information extraction from objects.
NASA Astrophysics Data System (ADS)
Nettke, Will; Scott, Douglas; Gibb, Andy G.; Thompson, Mark; Chrysostomou, Antonio; Evans, A.; Hill, Tracey; Jenness, Tim; Joncas, Gilles; Moore, Toby; Serjeant, Stephen; Urquhart, James; Vaccari, Mattia; Weferling, Bernd; White, Glenn; Zhu, Ming
2017-06-01
The SCUBA-2 Ambitious Sky Survey (SASSy) is composed of shallow 850-μm imaging using the Submillimetre Common-User Bolometer Array 2 (SCUBA-2) on the James Clerk Maxwell Telescope. Here we describe the extraction of a catalogue of beam-sized sources from a roughly 120 deg² region of the Galactic plane mapped uniformly (to an rms level of about 40 mJy), covering longitude 120° < l < 140° and latitude |b| < 2.9°. We used a matched-filtering approach to increase the signal-to-noise ratio (S/N) in these noisy maps and tested the efficiency of our extraction procedure through estimates of the false discovery rate, as well as by adding artificial sources to the real images. The primary catalogue contains a total of 189 sources at 850 μm, down to an S/N threshold of approximately 4.6. Additionally, we list 136 sources detected down to S/N = 4.3, but recognize that as we go lower in S/N, the reliability of the catalogue rapidly diminishes. We perform follow-up observations of some of our lower significance sources through small targeted SCUBA-2 images and list 265 sources detected in these maps down to S/N = 5. This illustrates the real power of SASSy: inspecting the shallow maps for regions of 850-μm emission and then using deeper targeted images to efficiently find fainter sources. We also perform a comparison of the SASSy sources with the Planck Catalogue of Compact Sources and the IRAS Point Source Catalogue, to determine which sources discovered in this field might be new, and hence potentially cold regions at an early stage of star formation.
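A minimal sketch of beam-matched filtering followed by an S/N cut is shown below; the Gaussian kernel width, peak-finding window, and threshold are placeholders, and the real SASSy pipeline involves noise treatment beyond this simple uncorrelated-pixel-noise propagation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter

def matched_filter_detect(image, noise, beam_sigma_pix=2.0, snr_cut=4.6):
    """Beam-matched filtering followed by a simple peak/S/N selection.

    A Gaussian of the beam width is used as the matched kernel; the noise map
    is propagated through the squared kernel, assuming uncorrelated pixel noise.
    """
    smooth = gaussian_filter(image, beam_sigma_pix)
    # Var[sum(w_i x_i)] = sum(w_i^2 sigma_i^2); w^2 is a Gaussian of width sigma/sqrt(2)
    var = gaussian_filter(noise**2, beam_sigma_pix / np.sqrt(2)) / (4 * np.pi * beam_sigma_pix**2)
    snr = smooth / np.sqrt(np.maximum(var, 1e-30))
    peaks = (snr == maximum_filter(snr, size=5)) & (snr > snr_cut)
    return np.argwhere(peaks), snr
```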
Electrostatic ion thruster optics calculations
NASA Technical Reports Server (NTRS)
Whealton, John H.; Kirkman, David A.; Raridon, R. J.
1992-01-01
Calculations have been performed which encompass both a self-consistent ion source extraction plasma sheath and the primary ion optics, including sheath and electrode-induced aberrations. Particular attention is given to the effects of beam space charge, accelerator geometry, and properties of the downstream plasma sheath on the position of the electrostatic potential saddle point near the extractor electrode. The electron blocking potential is described as a function of electrode thickness and secondary plasma processes.
Frequency Domain Fluorimetry Using a Mercury Vapor Lamp
2009-04-07
independence from light scatter and excitation/emission intensity variations in order to extract the sample's fluorescent lifetime. Mercury vapor lamps ... the modulation amplitude of the lamp, A_n, via: I_fluorescence(t) ∝ B_0 + Σ_{n=1}^{n_max} B_n sin(2π n f t + θ_n)   (8) ... lamp is estimated by assuming the lamp is emitting as a point source of uniform intensity into the lower hemisphere and has a reflector collecting
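For a single-exponential fluorophore, the lifetime is then commonly recovered from the measured phase shift φ_ω and demodulation m_ω at each harmonic angular frequency ω = 2πnf using the standard frequency-domain relations below (textbook relations, not necessarily quoted in this report):

\tan\phi_\omega = \omega\tau_\phi, \qquad m_\omega = \frac{1}{\sqrt{1+\omega^2\tau_m^2}} \;\Rightarrow\; \tau_\phi = \frac{\tan\phi_\omega}{\omega}, \qquad \tau_m = \frac{1}{\omega}\sqrt{\frac{1}{m_\omega^2}-1}.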
Improvement of submerged culture conditions to produce colorants by Penicillium purpurogenum
Santos-Ebinuma, Valéria Carvalho; Roberto, Inês Conceição; Teixeira, Maria Francisca Simas; Pessoa, Adalberto
2014-01-01
Safety issues related to the employment of synthetic colorants in different industrial segments have increased the interest in the production of colorants from natural sources, such as microorganisms. Improved cultivation technologies have allowed the use of microorganisms as an alternative source of natural colorants. The objective of this work was to evaluate the influence of some factors on natural colorant production by Penicillium purpurogenum DPUA 1275, a strain recently isolated from the Amazon Forest, employing statistical tools. For this purpose, the following variables were studied: orbital stirring speed, pH, temperature, sucrose and yeast extract concentrations, and incubation time, through two fractional factorial designs, one full factorial design and a central composite design. The regression analysis pointed out that sucrose and yeast extract concentrations were the variables that most influenced colorant production. Under the best conditions (yeast extract concentration around 10 g/L and sucrose concentration of 50 g/L), increases of 10, 33 and 23% in the absorbance of the yellow, orange and red colorants, respectively, were achieved. These results show that P. purpurogenum is an alternative colorant producer and that the production of these biocompounds can be improved by employing statistical tools. PMID:25242965
D'Avolio, Leonard W; Nguyen, Thien M; Goryachev, Sergey; Fiore, Louis D
2011-01-01
Despite at least 40 years of promising empirical performance, very few clinical natural language processing (NLP) or information extraction systems currently contribute to medical science or care. The authors address this gap by reducing the need for custom software and rules development with a graphical user interface-driven, highly generalizable approach to concept-level retrieval. A 'learn by example' approach combines features derived from open-source NLP pipelines with open-source machine learning classifiers to automatically and iteratively evaluate top-performing configurations. The Fourth i2b2/VA Shared Task Challenge's concept extraction task provided the data sets and metrics used to evaluate performance. Top F-measure scores for each of the tasks were medical problems (0.83), treatments (0.82), and tests (0.83). Recall lagged precision in all experiments. Precision was near or above 0.90 in all tasks. With no customization for the tasks and less than 5 min of end-user time to configure and launch each experiment, the average F-measure was 0.83, one point behind the mean F-measure of the 22 entrants in the competition. Strong precision scores indicate the potential of applying the approach to more specific clinical information extraction tasks. There was not one best configuration, supporting an iterative approach to model creation. Acceptable levels of performance can be achieved using fully automated and generalizable approaches to concept-level information extraction. The described implementation and related documentation are available for download.
Single Crystal Diamond Needle as Point Electron Source
Kleshch, Victor I.; Purcell, Stephen T.; Obraztsov, Alexander N.
2016-01-01
Diamond has been considered to be one of the most attractive materials for cold-cathode applications during the past two decades. However, its real application is hampered by the necessity to provide an appropriate amount and transport of electrons to the emitter surface, which is usually achieved by using nanometer-size or highly defective crystallites with much poorer physical characteristics than ideal diamond. Here, the use of a high-aspect-ratio single crystal diamond emitter as a point electron source is reported for the first time. Single crystal diamond needles were obtained by selective oxidation of polycrystalline diamond films produced by plasma enhanced chemical vapor deposition. Field emission currents and total electron energy distributions were measured for individual diamond needles as functions of extraction voltage and temperature. The needles demonstrate a current saturation phenomenon and temperature-sensitive emission. Analysis of the voltage drops measured with the electron energy analyzer shows that conduction occurs along the surface of the diamond needles and is governed by the Poole-Frenkel transport mechanism with a characteristic trap energy of 0.2–0.3 eV. The temperature-sensitive FE characteristics of the diamond needles are of great interest for the production of point electron beam sources and sensors for vacuum electronics. PMID:27731379
Second ROSAT all-sky survey (2RXS) source catalogue
NASA Astrophysics Data System (ADS)
Boller, Th.; Freyberg, M. J.; Trümper, J.; Haberl, F.; Voges, W.; Nandra, K.
2016-04-01
Aims: We present the second ROSAT all-sky survey source catalogue, hereafter referred to as the 2RXS catalogue. This is the second publicly released ROSAT catalogue of point-like sources obtained from the ROSAT all-sky survey (RASS) observations performed with the position-sensitive proportional counter (PSPC) between June 1990 and August 1991, and is an extended and revised version of the bright and faint source catalogues. Methods: We used the latest version of the RASS processing to produce overlapping X-ray images of 6.4° × 6.4° sky regions. To create a source catalogue, a likelihood-based detection algorithm was applied to these, which accounts for the variable point-spread function (PSF) across the PSPC field of view. Improvements in the background determination compared to 1RXS were also implemented. X-ray control images showing the source and background extraction regions were generated, which were visually inspected. Simulations were performed to assess the spurious source content of the 2RXS catalogue. X-ray spectra and light curves were extracted for the 2RXS sources, with spectral and variability parameters derived from these products. Results: We obtained about 135 000 X-ray detections in the 0.1-2.4 keV energy band down to a likelihood threshold of 6.5, as adopted in the 1RXS faint source catalogue. Our simulations show that the expected spurious content of the catalogue is a strong function of detection likelihood, and the full catalogue is expected to contain about 30% spurious detections. A more conservative likelihood threshold of 9, on the other hand, yields about 71 000 detections with a 5% spurious fraction. We recommend thresholds appropriate to the scientific application. X-ray images and overlaid X-ray contour lines provide an additional user product to evaluate the detections visually, and we performed our own visual inspections to flag uncertain detections. Intra-day variability in the X-ray light curves was quantified based on the normalised excess variance and a maximum amplitude variability analysis. X-ray spectral fits were performed using three basic models, a power law, a thermal plasma emission model, and black-body emission. Thirty-two large extended regions with diffuse emission and embedded point sources were identified and excluded from the present analysis. Conclusions: The 2RXS catalogue provides the deepest and cleanest X-ray all-sky survey catalogue in advance of eROSITA. The catalogue is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/588/A103
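The intra-day variability statistic mentioned above, the normalised excess variance, is conventionally defined for a light curve of N flux points x_i with measurement errors σ_i and mean μ as

\sigma^2_{\rm NXS} = \frac{1}{N\mu^2}\sum_{i=1}^{N}\left[(x_i-\mu)^2 - \sigma_i^2\right];

this is the standard definition used in X-ray variability studies and is quoted here only as background, since the catalogue paper may apply additional corrections alongside its maximum amplitude variability analysis.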
The Herschel Virgo Cluster Survey. XVII. SPIRE point-source catalogs and number counts
NASA Astrophysics Data System (ADS)
Pappalardo, Ciro; Bendo, George J.; Bianchi, Simone; Hunt, Leslie; Zibetti, Stefano; Corbelli, Edvige; di Serego Alighieri, Sperello; Grossi, Marco; Davies, Jonathan; Baes, Maarten; De Looze, Ilse; Fritz, Jacopo; Pohlen, Michael; Smith, Matthew W. L.; Verstappen, Joris; Boquien, Médéric; Boselli, Alessandro; Cortese, Luca; Hughes, Thomas; Viaene, Sebastien; Bizzocchi, Luca; Clemens, Marcel
2015-01-01
Aims: We present three independent catalogs of point-sources extracted from SPIRE images at 250, 350, and 500 μm, acquired with the Herschel Space Observatory as a part of the Herschel Virgo Cluster Survey (HeViCS). The catalogs have been cross-correlated to consistently extract the photometry at SPIRE wavelengths for each object. Methods: Sources have been detected using an iterative loop. The source positions are determined by estimating the likelihood to be a real source for each peak on the maps, according to the criterion defined in the sourceExtractorSussextractor task. The flux densities are estimated using the sourceExtractorTimeline, a timeline-based point source fitter that also determines the fitting procedure with the width of the Gaussian that best reproduces the source considered. Afterwards, each source is subtracted from the maps, removing a Gaussian function in every position with the full width half maximum equal to that estimated in sourceExtractorTimeline. This procedure improves the robustness of our algorithm in terms of source identification. We calculate the completeness and the flux accuracy by injecting artificial sources in the timeline and estimate the reliability of the catalog using a permutation method. Results: The HeViCS catalogs contain about 52 000, 42 200, and 18 700 sources selected at 250, 350, and 500 μm above 3σ and are ~75%, 62%, and 50% complete at flux densities of 20 mJy at 250, 350, 500 μm, respectively. We then measured source number counts at 250, 350, and 500 μm and compare them with previous data and semi-analytical models. We also cross-correlated the catalogs with the Sloan Digital Sky Survey to investigate the redshift distribution of the nearby sources. From this cross-correlation, we select ~2000 sources with reliable fluxes and a high signal-to-noise ratio, finding an average redshift z ~ 0.3 ± 0.22 and 0.25 (16-84 percentile). Conclusions: The number counts at 250, 350, and 500 μm show an increase in the slope below 200 mJy, indicating a strong evolution in number density for galaxies at these fluxes. In general, models tend to overpredict the counts at brighter flux densities, underlining the importance of studying the Rayleigh-Jeans part of the spectral energy distribution to refine the theoretical recipes of the models. Our iterative method for source identification allowed the detection of a family of 500 μm sources that are not foreground objects belonging to Virgo and not found in other catalogs. Herschel is an ESA space observatory with science instruments provided by European-led principal investigator consortia and with important participation from NASA. The 250, 350, 500 μm, and the total catalogs are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/573/A129
Reaction Wheel Disturbance Model Extraction Software - RWDMES
NASA Technical Reports Server (NTRS)
Blaurock, Carl
2009-01-01
The RWDMES is a tool for modeling the disturbances imparted on spacecraft by spinning reaction wheels. Reaction wheels are usually the largest disturbance source on a precision pointing spacecraft, and can be the dominating source of pointing error. Accurate knowledge of the disturbance environment is critical to accurate prediction of the pointing performance. In the past, it has been difficult to extract an accurate wheel disturbance model since the forcing mechanisms are difficult to model physically, and the forcing amplitudes are filtered by the dynamics of the reaction wheel. RWDMES captures the wheel-induced disturbances using a hybrid physical/empirical model that is extracted directly from measured forcing data. The empirical models capture the tonal forces that occur at harmonics of the spin rate, and the broadband forces that arise from random effects. The empirical forcing functions are filtered by a physical model of the wheel structure that includes spin-rate-dependent moments (gyroscopic terms). The resulting hybrid model creates a highly accurate prediction of wheel-induced forces. It accounts for variation in disturbance frequency, as well as the shifts in structural amplification by the whirl modes, as the spin rate changes. This software provides a point-and-click environment for producing accurate models with minimal user effort. Where conventional approaches may take weeks to produce a model of variable quality, RWDMES can create a demonstrably high accuracy model in two hours. The software consists of a graphical user interface (GUI) that enables the user to specify all analysis parameters, to evaluate analysis results and to iteratively refine the model. Underlying algorithms automatically extract disturbance harmonics, initialize and tune harmonic models, and initialize and tune broadband noise models. The component steps are described in the RWDMES user's guide and include: converting time domain data to waterfall PSDs (power spectral densities); converting PSDs to order analysis data; extracting harmonics; initializing and simultaneously tuning a harmonic model and a wheel structural model; initializing and tuning a broadband model; and verifying the harmonic/broadband/structural model against the measurement data. Functional operation is through a MATLAB GUI that loads test data, performs the various analyses, plots evaluation data for assessment and refinement of analysis parameters, and exports the data to documentation or downstream analysis code. The harmonic models are defined as specified functions of frequency, typically speed-squared. The reaction wheel structural model is realized as mass, damping, and stiffness matrices (typically from a finite element analysis package) with the addition of a gyroscopic forcing matrix. The broadband noise model is realized as a set of speed-dependent filters. The tuning of the combined model is performed using nonlinear least squares techniques. RWDMES is implemented as a MATLAB toolbox comprising the Fit Manager for performing the model extraction, Data Manager for managing input data and output models, the Gyro Manager for modifying wheel structural models, and the Harmonic Editor for evaluating and tuning harmonic models. This software was validated using data from Goodrich E wheels, and from GSFC Lunar Reconnaissance Orbiter (LRO) wheels. The validation testing proved that RWDMES has the capability to extract accurate disturbance models from flight reaction wheels with minimal user effort.
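As an illustration of the "harmonic amplitude as a specified function of frequency, typically speed-squared" step, the short Python sketch below fits the coefficient C in A(Ω) = CΩ² for a single wheel harmonic by least squares; the data layout, units, and example numbers are assumptions for illustration, not the RWDMES (MATLAB) interfaces.

```python
import numpy as np

def fit_speed_squared_coefficient(wheel_speeds_hz, harmonic_amplitudes_n):
    """Least-squares fit of A(omega) = C * omega**2 for one wheel harmonic.

    wheel_speeds_hz       : wheel spin rates at which amplitudes were measured [rev/s]
    harmonic_amplitudes_n : measured force amplitude of this harmonic [N]
    Returns the coefficient C in N/(rad/s)^2.
    """
    omega = 2 * np.pi * np.asarray(wheel_speeds_hz, dtype=float)
    x = omega**2
    y = np.asarray(harmonic_amplitudes_n, dtype=float)
    return float(x @ y / (x @ x))      # closed-form single-parameter least squares

# Example with made-up numbers that roughly follow a speed-squared trend:
speeds = np.array([10.0, 20.0, 30.0, 40.0])   # rev/s
amps = np.array([0.02, 0.08, 0.19, 0.33])     # N
print(fit_speed_squared_coefficient(speeds, amps))
```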
The Third Fermi LAT Catalog of High-Energy Gamma-ray Sources
NASA Astrophysics Data System (ADS)
Thompson, David J.; Ballet, J.; Burnett, T.; Fermi Large Area Telescope Collaboration
2014-01-01
The Fermi Gamma-ray Space Telescope Large Area Telescope (LAT) has been gathering science data since August 2008, surveying the full sky every three hours. The second source catalog (2FGL, Nolan et al 2012, ApJS 199, 31) was based on 2 years of data. We are preparing a third source catalog (3FGL) based on 4 years of reprocessed data. The reprocessing introduced a more accurate description of the instrument, which resulted in a narrower point spread function. Both the localization and the detection threshold for hard-spectrum sources have been improved. The new catalog also relies on a refined model of Galactic diffuse emission, particularly important for low-latitude soft-spectrum sources. The process for associating LAT sources with those at other wavelengths has also improved, thanks to dedicated multiwavelength follow-up, new surveys and better ways to extract sources likely to be gamma-ray counterparts. We describe the construction of this new catalog, its characteristics, and its remaining limitations.
The Third Fermi-LAT Catalog of High-Energy Gamma-ray Sources
NASA Astrophysics Data System (ADS)
Burnett, Toby
2014-03-01
The Fermi Gamma-ray Space Telescope Large Area Telescope (LAT) has been gathering science data since August 2008, surveying the full sky every three hours. The second source catalog (2FGL, Nolan et al. 2012, ApJS 199, 31) was based on 2 years of data. We are preparing a third source catalog (3FGL) based on 4 years of reprocessed data. The reprocessing introduced a more accurate description of the instrument, which resulted in a narrower point spread function. Both the localization and the detection threshold for hard-spectrum sources have been improved. The new catalog also relies on a refined model of Galactic diffuse emission, particularly important for low-latitude soft-spectrum sources. The process for associating LAT sources with those at other wavelengths has also improved, thanks to dedicated multiwavelength follow-up, new surveys and better ways to extract sources likely to be gamma-ray counterparts. We describe the construction of this new catalog, its characteristics, and its remaining limitations.
Versatile plasma ion source with an internal evaporator
NASA Astrophysics Data System (ADS)
Turek, M.; Prucnal, S.; Drozdziel, A.; Pyszniak, K.
2011-04-01
A novel construction of an ion source with an evaporator placed inside a plasma chamber is presented. The crucible is heated to high temperatures directly by the arc discharge, which makes the ion source suitable for substances with high melting points. The compact ion source enables the production of intense ion beams for a wide spectrum of solid elements, with typical separated beam currents of ~100-150 μA for Al+, Mn+ and As+ (which corresponds to emission current densities of 15-25 mA/cm²) at an extraction voltage of 25 kV. The ion source works for approximately 50-70 h at 100% duty cycle, which enables high ion dose implantation. The typical power consumption of the ion source is 350-400 W. The paper presents detailed experimental data (e.g. dependences of ion currents and anode voltages on discharge and filament currents and magnetic flux densities) for Cr, Fe, Al, As, Mn and In. The discussion is supported by the results of a Monte Carlo method based numerical simulation of ionisation in the ion source.
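As a rough consistency check of the quoted figures, the beam currents and emission current densities imply an effective extraction aperture area of order A = I/j; this is only back-of-the-envelope arithmetic, since the actual aperture geometry is not stated in the abstract:

A \approx \frac{150\,\mu\mathrm{A}}{25\,\mathrm{mA/cm^2}} = 6\times10^{-3}\,\mathrm{cm^2} \;\Rightarrow\; d = 2\sqrt{A/\pi} \approx 0.9\,\mathrm{mm}.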
Tinkelman, Igor; Melamed, Timor
2005-06-01
In Part I of this two-part investigation [J. Opt. Soc. Am. A 22, 1200 (2005)], we presented a theory for phase-space propagation of time-harmonic electromagnetic fields in an anisotropic medium characterized by a generic wave-number profile. In this Part II, these investigations are extended to transient fields, setting a general analytical framework for local analysis and modeling of radiation from time-dependent extended-source distributions. In this formulation the field is expressed as a superposition of pulsed-beam propagators that emanate from all space-time points in the source domain and in all directions. Using time-dependent quadratic-Lorentzian windows, we represent the field by a phase-space spectral distribution in which the propagating elements are pulsed beams, which are formulated by a transient plane-wave spectrum over the extended-source plane. By applying saddle-point asymptotics, we extract the beam phenomenology in the anisotropic environment resulting from short-pulsed processing. Finally, the general results are applied to the special case of uniaxial crystal and compared with a reference solution.
Searching for minimum in dependence of squared speed-of-sound on collision energy
Liu, Fu-Hu; Gao, Li-Na; Lacey, Roy A.
2016-01-01
Experimental results of the rapidity distributions of negatively charged pions produced in proton-proton (p-p) and beryllium-beryllium (Be-Be) collisions at different beam momenta, measured by the NA61/SHINE Collaboration at the super proton synchrotron (SPS), are described by a revised (three-source) Landau hydrodynamic model. The squared speed-of-sound parameter c_s² is then extracted from the width of the rapidity distribution. There is a local minimum (knee point), indicating a softest point in the equation of state (EoS), appearing at about 40 A GeV/c (or 8.8 GeV) in the c_s² excitation function (the dependence of c_s² on incident beam momentum or center-of-mass energy). This knee point should be related to the search for the onset of quark deconfinement and the critical point of the quark-gluon plasma (QGP) phase transition.
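For orientation, in the original single-source Landau picture the rapidity distribution is Gaussian with a width tied to the speed of sound, and it is this type of relation that is inverted to extract c_s²; the paper's revised three-source model modifies the details, so the expression below is only the standard textbook form, with m_p the proton mass:

\sigma_y^2 = \frac{8}{3}\,\frac{c_s^2}{1-c_s^4}\,\ln\!\left(\frac{\sqrt{s_{NN}}}{2m_p}\right).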
Wilkinson, J R; Yu, J; Abbas, H K; Scheffler, B E; Kim, H S; Nierman, W C; Bhatnagar, D; Cleveland, T E
2007-10-01
Aflatoxins are toxic and carcinogenic polyketide metabolites produced by fungal species, including Aspergillus flavus and A. parasiticus. The biosynthesis of aflatoxins is modulated by many environmental factors, including the availability of a carbon source. The gene expression profile of A. parasiticus was evaluated during a shift from a medium with low concentration of simple sugars, yeast extract (YE), to a similar medium with sucrose, yeast extract sucrose (YES). Gene expression and aflatoxins (B1, B2, G1, and G2) were quantified from fungal mycelia harvested pre- and post-shifting. When compared with YE media, YES caused temporary reduction of the aflatoxin levels detected at 3-h post-shifting and they remained low well past 12 h post-shift. Aflatoxin levels did not exceed the levels in YE until 24 h post-shift, at which time point a tenfold increase was observed over YE. Microarray analysis comparing the RNA samples from the 48-h YE culture to the YES samples identified a total of 2120 genes that were expressed across all experiments, including most of the aflatoxin biosynthesis genes. One-way analysis of variance (ANOVA) identified 56 genes that were expressed with significant variation across all time points. Three genes responsible for converting norsolorinic acid to averantin were identified among these significantly expressed genes. The potential involvement of these genes in the regulation of aflatoxin biosynthesis is discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharp, G.J.; Samant, H.S.; Vaidya, O.C.
1988-06-01
The harvesting of marine plants on a commercial scale was a significant industry in the Maritime Provinces of Canada by the end of World War II. These seaweeds have been traditionally utilized as foodstuffs, either as a processed extract or a semi-processed plant. The Maritime coastline is becoming industrialized; there is also potential for expansion of the marine plant industry beyond traditional harvest areas. Therefore, the quality of material from new areas must be examined prior to exploitation, as well as monitoring of traditional areas. The bioaccumulation of metals by marine plants was recognized in early measurements of trace element concentrations which were above ambient water values. Before growth and reproductive inhibition are caused by severe effects of heavy metal pollution, food quality changes may occur. The Food Chemical Code (U.S.A.) limits heavy metals in the extracts of seaweeds. Sediment and water samples taken in connection with the Ocean Dumping Control Act of Canada have identified several sites with elevated heavy metal content in the Maritimes. The purpose of this study was to examine heavy metal levels in commercially important seaweeds from traditional harvest areas and areas near point sources of pollution. The authors wished to provide a baseline for the future and identify existing problem areas.
Identifying equivalent sound sources from aeroacoustic simulations using a numerical phased array
NASA Astrophysics Data System (ADS)
Pignier, Nicolas J.; O'Reilly, Ciarán J.; Boij, Susann
2017-04-01
An application of phased array methods to numerical data is presented, aimed at identifying equivalent flow sound sources from aeroacoustic simulations. Based on phased array data extracted from compressible flow simulations, sound source strengths are computed on a set of points in the source region using phased array techniques assuming monopole propagation. Two phased array techniques are used to compute the source strengths: an approach using a Moore-Penrose pseudo-inverse and a beamforming approach using dual linear programming (dual-LP) deconvolution. The first approach gives a model of correlated sources for the acoustic field generated from the flow expressed in a matrix of cross- and auto-power spectral values, whereas the second approach results in a model of uncorrelated sources expressed in a vector of auto-power spectral values. The accuracy of the equivalent source model is estimated by computing the acoustic spectrum at a far-field observer. The approach is tested first on an analytical case with known point sources. It is then applied to the example of the flow around a submerged air inlet. The far-field spectra obtained from the source models for two different flow conditions are in good agreement with the spectra obtained with a Ffowcs Williams-Hawkings integral, showing the accuracy of the source model from the observer's standpoint. Various configurations for the phased array and for the sources are used. The dual-LP beamforming approach shows better robustness to changes in the number of probes and sources than the pseudo-inverse approach. The good results obtained with this simulation case demonstrate the potential of the phased array approach as a modelling tool for aeroacoustic simulations.
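As a concrete illustration of the pseudo-inverse step described above, the following Python sketch estimates a correlated-source cross-spectral matrix from a microphone cross-spectral matrix under free-field monopole propagation. The geometry, frequency and synthetic data are placeholders, and the dual-LP deconvolution variant is not reproduced.

import numpy as np

def monopole_steering_matrix(mic_xyz, src_xyz, freq, c0=340.0):
    # G[m, s]: free-field monopole Green's function from source point s to microphone m.
    k = 2.0 * np.pi * freq / c0
    r = np.linalg.norm(mic_xyz[:, None, :] - src_xyz[None, :, :], axis=2)
    return np.exp(-1j * k * r) / (4.0 * np.pi * r)

def source_csm_pseudo_inverse(csm_mics, mic_xyz, src_xyz, freq):
    # Correlated-source model: Q = G+ C (G+)^H with the Moore-Penrose pseudo-inverse.
    G_pinv = np.linalg.pinv(monopole_steering_matrix(mic_xyz, src_xyz, freq))
    return G_pinv @ csm_mics @ G_pinv.conj().T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mics = rng.uniform(-0.5, 0.5, size=(32, 3)) + np.array([0.0, 0.0, 1.0])
    srcs = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0]])   # candidate source points
    G = monopole_steering_matrix(mics, srcs, freq=2000.0)
    csm = G @ G.conj().T                  # two uncorrelated unit-strength monopoles
    Q = source_csm_pseudo_inverse(csm, mics, srcs, freq=2000.0)
    print(np.real(np.diag(Q)))            # recovered auto-powers, close to 1 each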
NASA Astrophysics Data System (ADS)
Henri, C. V.; Harter, T.
2017-12-01
Agricultural activities are recognized as the preeminent origin of non-point source (NPS) contamination of water bodies through the leakage of nitrate, salt and agrochemicals. A large fraction of world agricultural activities, and therefore of NPS contamination, occurs over unconsolidated alluvial deposit basins offering soil composition and topography favorable to productive farming. These basins also represent important groundwater reservoirs. The over-exploitation of aquifers coupled with groundwater pollution by agriculture-related NPS contaminants has led to a rapid deterioration of the quality of these groundwater basins. The management of groundwater contamination from NPS is challenged by the inherent complexity of aquifer systems. Contaminant transport dynamics are highly uncertain due to the heterogeneity of the hydraulic parameters controlling groundwater flow. Well characteristics are also key uncertain elements affecting pollutant transport and NPS management, but quantifying uncertainty in NPS management under these conditions is not well documented. Our work focuses on better understanding the joint impact of aquifer heterogeneity and pumping well characteristics (extraction rate and depth) on (1) the transport of contaminants from NPS and (2) the spatio-temporal extension of the capture zone. To do so, we generate a series of geostatistically equivalent 3D heterogeneous aquifers and simulate the flow and non-reactive solute transport from NPS to extraction wells within a stochastic framework. The propagation of the uncertainty on the hydraulic conductivity field is systematically analyzed. A sensitivity analysis of the impact of extraction well characteristics (pumping rate and screen depth) is also conducted. Results highlight the significant role that heterogeneity and well characteristics play in management metrics. We finally show that, in the case of NPS contamination, the joint impact of regional longitudinal and transverse vertical hydraulic gradients and well depth strongly constrains the average travel times and the extension of the contributing area.
Draganic, I N
2016-02-01
Basic vacuum calculations were performed for various operating conditions of the Los Alamos National Neutron Science H(-) Cockcroft-Walton (CW) injector and the Ion Source Test Stand (ISTS). The vacuum pressure was estimated for both the CW and ISTS at five different points: (1) inside the H(-) ion source, (2) in front of the Pierce electrode, (3) at the extraction electrode, (4) at the column electrode, and (5) at the ground electrode. A static vacuum analysis of residual gases and the working hydrogen gas was completed for the normal ion source working regime. Gas density and partial pressure were estimated for the injected hydrogen gas. The attenuation of H(-) beam current and generation of electron current in the high voltage acceleration columns and low energy beam transport lines were calculated. The interaction of H(-) ions on molecular hydrogen (H2) is discussed as a dominant collision process in describing electron stripping rates. These results are used to estimate the observed increase in the ratio of electrons to H(-) ion beam in the ISTS beam transport line.
OLED lighting devices having multi element light extraction and luminescence conversion layer
Krummacher, Benjamin Claus; Antoniadis, Homer
2010-11-16
An apparatus such as a light source has a multi-element light extraction and luminescence conversion layer disposed over a transparent layer of the light source and on the exterior of said light source. The multi-element light extraction and luminescence conversion layer includes a plurality of light extraction elements and a plurality of luminescence conversion elements. The light extraction elements diffuse the light from the light source, while the luminescence conversion elements absorb a first spectrum of light from said light source and emit a second spectrum of light.
Martinez-Avila, G C G; Aguilera, A F; Saucedo, S; Rojas, R; Rodriguez, R; Aguilar, C N
2014-01-01
Agro-industrial by-products are important sources of potent bioactive phenolic compounds. These compounds are of great relevance for the food and pharmaceutical industries due to their wide variety of biological activities. Fermentation represents an environmentally clean technology for the production and extraction of these bioactive compounds, providing high-quality and highly active extracts, which can be incorporated into foods using wax-based coatings/films in order to avoid alterations in their quality. This document presents an overview of the importance and benefits of solid-state fermentation, pointing out this bioprocess as an alternative technology for using agro-industrial by-products as substrates to produce valuable secondary metabolites, and discusses their applications as food quality preservatives.
The cardioprotective power of leaves
Boncler, Magdalena; Watala, Cezary
2015-01-01
Lack of physical activity, smoking and/or inappropriate diet can contribute to the increase of oxidative stress, in turn affecting the pathophysiology of cardiovascular diseases. Strong anti-oxidant properties of plant polyphenolic compounds might underlie their cardioprotective activity. This paper reviews recent findings on the anti-oxidant activity of plant leaf extracts and emphasizes their effects on blood platelets, leukocytes and endothelial cells – the targets orchestrating the development and progression of cardiovascular diseases. We also review the evidence linking supplementation with plant leaf extracts and the risk factors defining the metabolic syndrome. The data point to the importance of leaves as an alternative source of polyphenolic compounds in the human diet and their role in the prevention of cardiovascular diseases. PMID:26322095
Producing data-based sensitivity kernels from convolution and correlation in exploration geophysics.
NASA Astrophysics Data System (ADS)
Chmiel, M. J.; Roux, P.; Herrmann, P.; Rondeleux, B.
2016-12-01
Many studies have shown that seismic interferometry can be used to estimate surface wave arrivals by correlation of seismic signals recorded at a pair of locations. In the case of ambient noise sources, convergence towards the surface wave Green's function is obtained under the criterion of equipartitioned energy. However, seismic acquisition with active, controlled sources offers more possibilities for interferometry. The use of controlled sources makes it possible to recover the surface wave Green's function between two points using either correlation or convolution. We investigate the convolutional and correlational approaches using land active-seismic data from exploration geophysics. The data were recorded on 10,710 vertical receivers using 51,808 sources (seismic vibrator trucks). The source spacing is the same in both the X and Y directions (30 m), which is known as "carpet shooting". The receivers are placed in parallel lines with a spacing of 150 m in the X direction and 30 m in the Y direction. Invoking spatial reciprocity between sources and receivers, correlation and convolution functions can thus be constructed between either pairs of receivers or pairs of sources. Benefiting from the dense acquisition, we extract sensitivity kernels from correlation and convolution measurements of the seismic data. These sensitivity kernels are subsequently used to produce phase-velocity dispersion curves between two points and to separate the higher mode from the fundamental mode for surface waves. Potential application to surface wave cancellation is also envisaged.
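The core correlation-and-stack operation behind such interferometric processing can be sketched as follows; this is a generic illustration, not the authors' processing chain, and the convolutional variant simply replaces the time-reversed trace with the trace itself.

import numpy as np
from scipy.signal import fftconvolve

def interferometric_stack(traces_a, traces_b, dt, mode="correlation"):
    # traces_a, traces_b: (n_sources, n_samples) records at receivers A and B
    # for the same set of shots.  Returns the lag axis (s) and the stack of
    # per-shot correlations (or convolutions) over all sources.
    n = traces_a.shape[1]
    stack = np.zeros(2 * n - 1)
    for a, b in zip(traces_a, traces_b):
        other = b[::-1] if mode == "correlation" else b   # correlation via convolution
        stack += fftconvolve(a, other, mode="full")
    lags = (np.arange(2 * n - 1) - (n - 1)) * dt
    return lags, stack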
Vaccinium meridionale Swartz extracts and their addition in beef burgers as antioxidant ingredient.
López-Padilla, Alexis; Martín, Diana; Villanueva Bermejo, David; Jaime, Laura; Ruiz-Rodriguez, Alejandro; Restrepo Flórez, Claudia Estela; Rivero Barrios, Diana Marsela; Fornari, Tiziana
2018-01-01
Vaccinium meridionale Swartz (mortiño) constitutes a source of bioactive phytochemicals, but reports related to its efficient and green production are scarce. In this study, pressurized liquid extraction (PLE) and ultrasound-assisted extraction of mortiño were compared. Total phenolic content (TPC) and antioxidant capacity (ABTS•+) were determined. Beef burgers with 20 g kg⁻¹ of mortiño (MM) or its PLE extract (ME) were manufactured. Lipid oxidation (TBARS) and instrumental color changes were measured after refrigerated storage. High TPC (up to 72 g gallic acid equivalent kg⁻¹ extract) was determined in mortiño extracts, which was positively correlated with antioxidant activity. TBARS values of beef burgers containing either MM or ME did not change after refrigerated storage, whereas lipid oxidation of control burgers increased significantly. The color of burgers with added MM or ME was different (lower b* and a* values) from that of control burgers. However, the evolution of color after storage was similar between control and ME samples. Mortiño extracts with high TPC can be obtained by PLE. Both mortiño and its PLE extract are able to control lipid oxidation of beef burgers, but the extract is preferred from the color quality point of view. © 2017 Society of Chemical Industry.
Microlensing as a possible probe of event-horizon structure in quasars
NASA Astrophysics Data System (ADS)
Tomozeiu, Mihai; Mohammed, Irshad; Rabold, Manuel; Saha, Prasenjit; Wambsganss, Joachim
2018-04-01
In quasars which are lensed by galaxies, the point-like images sometimes show sharp and uncorrelated brightness variations (microlensing). These brightness changes are associated with the innermost region of the quasar passing through a complicated pattern of caustics produced by the stars in the lensing galaxy. In this paper, we study whether the universal properties of optical caustics could enable extraction of shape information about the central engine of quasars. We present a toy model with a crescent-shaped source crossing a fold caustic. The silhouette of a black hole over an accretion disc tends to produce roughly crescent sources. When a crescent-shaped source crosses a fold caustic, the resulting light curve is noticeably different from the case of a circular luminosity profile or Gaussian source. With good enough monitoring data, the crescent parameters, apart from one degeneracy, can be recovered.
Microlensing as a Possible Probe of Event-Horizon Structure in Quasars
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tomozeiu, Mihai; Mohammed, Irshad; Rabold, Manuel
In quasars which are lensed by galaxies, the point-like images sometimes show sharp and uncorrelated brightness variations (microlensing). These brightness changes are associated with the innermost region of the quasar passing through a complicated pattern of caustics produced by the stars in the lensing galaxy. In this paper, we study whether the universal properties of optical caustics could enable extraction of shape information about the central engine of quasars. We present a toy model with a crescent-shaped source crossing a fold caustic. The silhouette of a black hole over an accretion disk tends to produce roughly crescent sources. When a crescent-shaped source crosses a fold caustic, the resulting light curve is noticeably different from the case of a circular luminosity profile or Gaussian source. With good enough monitoring data, the crescent parameters, apart from one degeneracy, can be recovered.
ProFound: Source Extraction and Application to Modern Survey Data
NASA Astrophysics Data System (ADS)
Robotham, A. S. G.
2018-04-01
ProFound detects sources in noisy images, generates segmentation maps identifying the pixels belonging to each source, and measures statistics like flux, size, and ellipticity. These inputs are key requirements of ProFit (ascl:1612.004), our galaxy profiling package; these two packages used in unison semi-automatically profile large samples of galaxies. The key novel feature introduced in ProFound is that all photometry is executed on dilated segmentation maps that fully contain the identifiable flux, rather than using more traditional circular or ellipse-based photometry. Also, to be less sensitive to pathological segmentation issues, the de-blending is made across saddle points in flux. ProFound offers good initial parameter estimation for ProFit, and also segmentation maps that follow the sometimes complex geometry of resolved sources, whilst capturing nearly all of the flux. A number of bulge-disc decomposition projects are already making use of the ProFound and ProFit pipeline.
Microlensing as a Possible Probe of Event-Horizon Structure in Quasars
Tomozeiu, Mihai; Mohammed, Irshad; Rabold, Manuel; ...
2017-12-08
In quasars which are lensed by galaxies, the point-like images sometimes show sharp and uncorrelated brightness variations (microlensing). These brightness changes are associated with the innermost region of the quasar passing through a complicated pattern of caustics produced by the stars in the lensing galaxy. In this paper, we study whether the universal properties of optical caustics could enable extraction of shape information about the central engine of quasars. We present a toy model with a crescent-shaped source crossing a fold caustic. The silhouette of a black hole over an accretion disk tends to produce roughly crescent sources. When a crescent-shaped source crosses a fold caustic, the resulting light curve is noticeably different from the case of a circular luminosity profile or Gaussian source. With good enough monitoring data, the crescent parameters, apart from one degeneracy, can be recovered.
NASA Astrophysics Data System (ADS)
Mehta, Neha; Lasagna, Manuela; Antonella Dino, Giovanna; De Luca, Domenico Antonio
2017-04-01
Extractive activities present a threat to natural water systems, and their effects are observed even after the cessation of activities. The harmful effects of extractive activities, such as the deterioration of water sources by low-quality waters or by the leaching of metals into groundwater, make it necessary to carry out careful, scientific and comprehensive studies on this subject. Consequently, this problem statement was chosen as part of a PhD research project. The PhD research is part of the REMEDIATE project (A Marie Sklodowska-Curie Action Initial Training Network for Improved decision making in contaminated land site investigation and risk assessment, Grant Agreement No. 643087). The current work thus focuses on the contamination of groundwater sources due to past mining activities in the area; contaminated groundwater may in turn act as a contamination source for surface water. The impacts on water systems connected to mining activities depend on the ore type, the metal being extracted, the exploitation method, ore processing, pollution control efforts, and the geochemical and hydrogeochemical conditions of the water and its surroundings. To evaluate the effects posed by past metal-extracting activities, the study was carried out at an abandoned site used for extracting nickel in Campello Monti (Valstrona municipality, Piedmont region, Italy). Campello Monti is located in the basement of the Southern Italian Alps, in the Ivrea-Verbano Zone. The area is composed of mafic rocks intruded by mantle peridotite. The mafic formation consists of peridotites, pyroxenites, gabbros, anorthosites, gabbro-norite, gabbro-diorite and diorite. The mines were used for nickel exploitation from the 9th century until the 1940s. The long history of nickel extraction has left waste contaminated with Ni and Co in the mountains, along with the tunnels used for carrying out metal-extracting activities. The area around the site is used for housing, hosts domestic animals and is crossed by the Strona creek. Groundwater circulation takes place in fractured rocks, in waste dumps and in the tunnels used for extracting metal. Thus the abandoned site may contaminate local water sources. To study the impacts on local water sources, water sampling and analysis were performed. Three sampling campaigns in June, July and October 2016 yielded 16 groundwater samples (4 tap water samples, 3 samples from tunnels and 9 from springs) and 6 surface water samples. The samples were analyzed to measure alkalinity, electrolytic conductivity, pH, temperature, metals (Hg, Tl, Cd, Cr (total), Cr(VI), Ag, As, Pb, Se, Ni, Co, Mn, Al, Fe, Cu, Zn, B) and ions (CN-, F-, Mg2+, Na+, SO42-, NO3-, Cl-). The water samples collected from tunnels showed nickel concentrations ranging from 31.9 µg/l to as high as 304 µg/l (the permissible limit for Ni in Italy according to DLgs. 152/06 is 20 µg/l). These groundwaters, being in close association with minerals containing heavy metals, tend to dissolve such elements. The springs in the mountains also contained Ni above 20 µg/l. All of these groundwater systems act as sources to the Strona creek, which showed a Ni concentration of 512 µg/l.
Ultra-short ion and neutron pulse production
Leung, Ka-Ngo; Barletta, William A.; Kwan, Joe W.
2006-01-10
An ion source has an extraction system configured to produce ultra-short ion pulses, i.e. pulses with pulse width of about 1 µs or less, and a neutron source based on the ion source produces correspondingly ultra-short neutron pulses. To form a neutron source, a neutron generating target is positioned to receive an accelerated extracted ion beam from the ion source. To produce the ultra-short ion or neutron pulses, the apertures in the extraction system of the ion source are suitably sized to prevent ion leakage, the electrodes are suitably spaced, and the extraction voltage is controlled. The ion beam current leaving the source is regulated by applying ultra-short voltage pulses of a suitable voltage on the extraction electrode.
Antoniadis, Homer [Mountain View, CA]; Krummacher, Benjamin Claus [Regensburg, DE]
2008-01-22
An apparatus such as a light source has a multi-element light extraction and luminescence conversion layer disposed over a transparent layer of the light source and on the exterior of said light source. The multi-element light extraction and luminescence conversion layer includes a plurality of light extraction elements and a plurality of luminescence conversion elements. The light extraction elements diffuse the light from the light source, while the luminescence conversion elements absorb a first spectrum of light from said light source and emit a second spectrum of light.
The modification at CSNS ion source
NASA Astrophysics Data System (ADS)
Liu, S.; Ouyang, H.; Huang, T.; Xiao, Y.; Cao, X.; Lv, Y.; Xue, K.; Chen, W.
2017-08-01
The commissioning of the CSNS front end has been finished. A beam intensity above 15 mA is obtained at the end of the RFQ. The CSNS ion source is a Penning surface-plasma ion source, similar to the ISIS ion source. To improve operational stability and reduce the spark rate, some modifications have been performed, including changes to the Penning field, the extraction optics and the post-acceleration. PBGUNS is applied to optimize the beam extraction. The co-extracted electrons are considered in the PBGUNS simulations, and various extraction structures are simulated with the aim of passing the beam through the extraction electrode without loss. The stability of the ion source is further improved.
Text-in-Context: A Method for Extracting Findings in Mixed-Methods Mixed Research Synthesis Studies
Leeman, Jennifer; Knafl, Kathleen; Crandell, Jamie L.
2012-01-01
Aim: Our purpose in this paper is to propose a new method for extracting findings from research reports included in mixed-methods mixed research synthesis studies. Background: International initiatives in the domains of systematic review and evidence synthesis have been focused on broadening the conceptualization of evidence, increased methodological inclusiveness and the production of evidence syntheses that will be accessible to and usable by a wider range of consumers. Initiatives in the general mixed-methods research field have been focused on developing truly integrative approaches to data analysis and interpretation. Data source: The data extraction challenges described here were encountered, and the method proposed for addressing these challenges was developed, in the first year of the ongoing (2011–2016) study: Mixed-Methods Synthesis of Research on Childhood Chronic Conditions and Family. Discussion: To preserve the text-in-context of findings in research reports, we describe a method whereby findings are transformed into portable statements that anchor results to relevant information about sample, source of information, time, comparative reference point, magnitude and significance and study-specific conceptions of phenomena. Implications for nursing: The data extraction method featured here was developed specifically to accommodate mixed-methods mixed research synthesis studies conducted in nursing and other health sciences, but reviewers might find it useful in other kinds of research synthesis studies. Conclusion: This data extraction method itself constitutes a type of integration to preserve the methodological context of findings when statements are read individually and in comparison to each other. PMID:22924808
Automated extraction and semantic analysis of mutation impacts from the biomedical literature
2012-01-01
Background Mutations as sources of evolution have long been the focus of attention in the biomedical literature. Accessing the mutational information and their impacts on protein properties facilitates research in various domains, such as enzymology and pharmacology. However, manually curating the rich and fast growing repository of biomedical literature is expensive and time-consuming. As a solution, text mining approaches have increasingly been deployed in the biomedical domain. While the detection of single-point mutations is well covered by existing systems, challenges still exist in grounding impacts to their respective mutations and recognizing the affected protein properties, in particular kinetic and stability properties together with physical quantities. Results We present an ontology model for mutation impacts, together with a comprehensive text mining system for extracting and analysing mutation impact information from full-text articles. Organisms, as sources of proteins, are extracted to help disambiguation of genes and proteins. Our system then detects mutation series to correctly ground detected impacts using novel heuristics. It also extracts the affected protein properties, in particular kinetic and stability properties, as well as the magnitude of the effects and validates these relations against the domain ontology. The output of our system can be provided in various formats, in particular by populating an OWL-DL ontology, which can then be queried to provide structured information. The performance of the system is evaluated on our manually annotated corpora. In the impact detection task, our system achieves a precision of 70.4%-71.1%, a recall of 71.3%-71.5%, and grounds the detected impacts with an accuracy of 76.5%-77%. The developed system, including resources, evaluation data and end-user and developer documentation is freely available under an open source license at http://www.semanticsoftware.info/open-mutation-miner. Conclusion We present Open Mutation Miner (OMM), the first comprehensive, fully open-source approach to automatically extract impacts and related relevant information from the biomedical literature. We assessed the performance of our work on manually annotated corpora and the results show the reliability of our approach. The representation of the extracted information into a structured format facilitates knowledge management and aids in database curation and correction. Furthermore, access to the analysis results is provided through multiple interfaces, including web services for automated data integration and desktop-based solutions for end user interactions. PMID:22759648
Using Deep Space Climate Observatory Measurements to Study the Earth as an Exoplanet
NASA Astrophysics Data System (ADS)
Jiang, Jonathan H.; Zhai, Albert J.; Herman, Jay; Zhai, Chengxing; Hu, Renyu; Su, Hui; Natraj, Vijay; Li, Jiazheng; Xu, Feng; Yung, Yuk L.
2018-07-01
Even though it was not designed as an exoplanetary research mission, the Deep Space Climate Observatory (DSCOVR) has been opportunistically used for a novel experiment in which Earth serves as a proxy exoplanet. More than 2 yr of DSCOVR Earth images were employed to produce time series of multiwavelength, single-point light sources in order to extract information on planetary rotation, cloud patterns, surface type, and orbit around the Sun. In what follows, we assume that these properties of the Earth are unknown and instead attempt to derive them from first principles. These conclusions are then compared with known data about our planet. We also used the DSCOVR data to simulate phase-angle changes, as well as the minimum data collection rate needed to determine the rotation period of an exoplanet. This innovative method of using the time evolution of a multiwavelength, reflected single-point light source can be deployed for retrieving a range of intrinsic properties of an exoplanet around a distant star.
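One simple way to recover a rotation period from such a single-point light curve is a Lomb-Scargle periodogram; the sketch below assumes the astropy package and is only a generic illustration, not the analysis actually used in the paper.

import numpy as np
from astropy.timeseries import LombScargle

def rotation_period_hours(times_days, reflectance):
    # Strongest periodogram peak in a single-point light curve; times are in days,
    # so frequencies are in cycles per day and the returned period is in hours.
    frequency, power = LombScargle(times_days, reflectance).autopower(
        minimum_frequency=0.5, maximum_frequency=10.0)   # search periods from 2.4 h to 48 h
    return 24.0 / frequency[np.argmax(power)]

# For an Earth-like light curve the dominant peak should sit near 24 hours.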
An atlas of H-alpha-emitting regions in M33: A systematic search for SS433 star candidates
NASA Technical Reports Server (NTRS)
Calzetti, Daniela; Kinney, Anne L.; Ford, Holland; Doggett, Jesse; Long, Knox S.
1995-01-01
We report finding charts and accurate positions for 432 compact H-alpha emitting regions in the Local Group galaxy M33 (NGC 598), in an effort to isolate candidates for an SS433-like stellar system. The objects were extracted from narrow-band images, centered on the rest-frame H-alpha line (λ6563 Å) and on the red continuum at 6100 Å. The atlas is complete down to V ≈ 20 and includes 279 compact HII regions and 153 line-emitting point-like sources. The point-like sources undoubtedly include a variety of objects: very small HII regions, early-type stars with intense stellar winds, and Wolf-Rayet stars, but should also contain objects with the characteristics of SS433. This extensive survey of compact H-alpha regions in M33 is a first step towards the identification of peculiar stellar systems like SS433 in external galaxies.
Advanced control of neutral beam injected power in DIII-D
Pawley, Carl J.; Crowley, Brendan J.; Pace, David C.; ...
2017-03-23
In the DIII-D tokamak, one of the most powerful techniques to control the density, temperature and plasma rotation is by eight independently modulated neutral beam sources with a total power of 20 MW. The rapid modulation requires a high degree of reproducibility and precise control of the ion source plasma and beam acceleration voltage. Recent changes have been made to the controls to provide a new capability to smoothly vary the beam current and beam voltage during a discharge, while maintaining the modulation capability. The ion source plasma inside the arc chamber is controlled through feedback from the Langmuir probes measuring plasma density near the extraction end. To provide the new capability, the plasma control system (PCS) has been enabled to change the Langmuir probe set point and the beam voltage set point in real time. When the PCS varies the Langmuir set point, the plasma density is directly controlled in the arc chamber, thus changing the beam current (perveance) and power going into the tokamak. Alternately, the PCS can sweep the beam voltage set point by 20 kV or more and adjust the Langmuir probe setting to match, keeping the perveance constant and beam divergence at a minimum. This changes the beam power and average neutral particle energy, which changes deposition in the tokamak plasma. The ion separating magnetic field must accurately match the beam voltage to protect the beam line. To do this, the magnet current control accurately tracks the beam voltage set point. In conclusion, these new capabilities allow continuous in-shot variation of neutral beam ion energy to complement
Hierarchical extraction of urban objects from mobile laser scanning data
NASA Astrophysics Data System (ADS)
Yang, Bisheng; Dong, Zhen; Zhao, Gang; Dai, Wenxia
2015-01-01
Point clouds collected in urban scenes contain a huge number of points (e.g., billions), numerous objects with significant size variability, complex and incomplete structures, and variable point densities, raising great challenges for the automated extraction of urban objects in the field of photogrammetry, computer vision, and robotics. This paper addresses these challenges by proposing an automated method to extract urban objects robustly and efficiently. The proposed method generates multi-scale supervoxels from 3D point clouds using the point attributes (e.g., colors, intensities) and spatial distances between points, and then segments the supervoxels rather than individual points by combining graph based segmentation with multiple cues (e.g., principal direction, colors) of the supervoxels. The proposed method defines a set of rules for merging segments into meaningful units according to types of urban objects and forms the semantic knowledge of urban objects for the classification of objects. Finally, the proposed method extracts and classifies urban objects in a hierarchical order ranked by the saliency of the segments. Experiments show that the proposed method is efficient and robust for extracting buildings, streetlamps, trees, telegraph poles, traffic signs, cars, and enclosures from mobile laser scanning (MLS) point clouds, with an overall accuracy of 92.3%.
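The first stage of such a pipeline, grouping the raw points into coarse seed cells before any graph-based segmentation, can be sketched as follows; this simple voxel grouping is only a stand-in for the multi-scale supervoxels described in the paper.

import numpy as np

def voxel_seeds(points, colors, voxel_size=0.5):
    # Group a point cloud into coarse voxel cells (supervoxel-like seeds).
    # points: (N, 3) coordinates; colors: (N, 3) RGB values.  Returns a dict
    # mapping a voxel index to (centroid, mean colour, original point indices),
    # which can later serve as the nodes of a segmentation graph.
    keys = np.floor(points / voxel_size).astype(np.int64)
    order = np.lexsort((keys[:, 2], keys[:, 1], keys[:, 0]))
    keys_s, pts_s, col_s = keys[order], points[order], colors[order]
    seeds = {}
    start, n = 0, len(points)
    for i in range(1, n + 1):
        if i == n or np.any(keys_s[i] != keys_s[start]):
            block = slice(start, i)
            seeds[tuple(keys_s[start])] = (pts_s[block].mean(axis=0),
                                           col_s[block].mean(axis=0),
                                           order[block])
            start = i
    return seeds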
Two frameworks for integrating knowledge in induction
NASA Technical Reports Server (NTRS)
Rosenbloom, Paul S.; Hirsh, Haym; Cohen, William W.; Smith, Benjamin D.
1994-01-01
The use of knowledge in inductive learning is critical for improving the quality of the concept definitions generated, reducing the number of examples required in order to learn effective concept definitions, and reducing the computation needed to find good concept definitions. Relevant knowledge may come in many forms (such as examples, descriptions, advice, and constraints) and from many sources (such as books, teachers, databases, and scientific instruments). How to extract the relevant knowledge from this plethora of possibilities, and then to integrate it together so as to appropriately affect the induction process is perhaps the key issue at this point in inductive learning. Here the focus is on the integration part of this problem; that is, how induction algorithms can, and do, utilize a range of extracted knowledge. Preliminary work on a transformational framework for defining knowledge-intensive inductive algorithms out of relatively knowledge-free algorithms is described, as is a more tentative problems-space framework that attempts to cover all induction algorithms within a single general approach. These frameworks help to organize what is known about current knowledge-intensive induction algorithms, and to point towards new algorithms.
A scrutiny of heterogeneity at the TCE Source Area BioREmediation (SABRE) test site
NASA Astrophysics Data System (ADS)
Rivett, M.; Wealthall, G. P.; Mcmillan, L. A.; Zeeb, P.
2015-12-01
A scrutiny of heterogeneity at the UK's Source Area BioREmediation (SABRE) test site is presented to better understand how spatial heterogeneity in subsurface properties and process occurrence may constrain the performance of enhanced in-situ bioremediation (EISB). The industrial site contained a 25 to 45 year old trichloroethene (TCE) dense non-aqueous phase liquid (DNAPL) that was exceptionally well monitored via a network of multilevel samplers and high-resolution core sampling. Moreover, monitoring was conducted within a 3-sided sheet-pile cell that allowed a controlled streamtube of flow to be drawn through the source zone by an extraction well. We primarily focus on the longitudinal transect of monitoring along the length of the cell, which provides a 200-point groundwater sample slice along the streamtube of flow through the DNAPL source zone. TCE dechlorination is shown to be significant throughout the cell domain, but spatially heterogeneous in occurrence and in the progress of dechlorination to lesser chlorinated ethenes; it is this heterogeneity in dechlorination that we primarily scrutinise. We illustrate the diagnostic use of the relative occurrence of TCE parent and daughter compounds to confirm: dechlorination in close proximity to DNAPL and enhanced during the bioremediation; persistent layers of DNAPL into which gradients of dechlorination products are evident; fast flowpaths through the source zone where dechlorination is less evident; and the importance of underpinning flow-regime understanding for EISB performance. Still, even with such spatial detail, there remains uncertainty over the dataset interpretation. This includes poor closure of the mass balance along the cell length for the multilevel-sampler-based monitoring, and points to the need to better understand lateral flows (even in the constrained cell), the need for even greater spatial resolution of point monitoring and, potentially, ethene degradation losses that are not easily proven.
NASA Astrophysics Data System (ADS)
Ajitanand, N. N.; Phenix Collaboration
2014-11-01
Two-pion interferometry measurements in d+Au and Au+Au collisions at √s_NN = 200 GeV are used to extract and compare the Gaussian source radii R_out, R_side and R_long, which characterize the space-time extent of the emission sources. The comparisons, which are performed as a function of collision centrality and the mean transverse momentum for pion pairs, indicate strikingly similar patterns for the d+Au and Au+Au systems. They also indicate a linear dependence of R_side on the initial transverse geometric size R̄, as well as a smaller freeze-out size for the d+Au system. These patterns point to the important role of final-state re-scattering effects in the reaction dynamics of d+Au collisions.
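For orientation, the Gaussian radii quoted above are, in standard two-pion interferometry analyses, defined through a correlation-function fit of roughly the following Bertsch-Pratt form (cross terms and Coulomb corrections omitted); this is a generic parametrization, not necessarily the exact fit function of this measurement:

C_2(q_{\mathrm{out}}, q_{\mathrm{side}}, q_{\mathrm{long}}) \simeq 1 + \lambda \,
\exp\!\left( - R_{\mathrm{out}}^{2} q_{\mathrm{out}}^{2}
             - R_{\mathrm{side}}^{2} q_{\mathrm{side}}^{2}
             - R_{\mathrm{long}}^{2} q_{\mathrm{long}}^{2} \right)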
NASA Technical Reports Server (NTRS)
Rao, D. B.; Choudary, U. V.; Erstfeld, T. E.; Williams, R. J.; Chang, Y. A.
1979-01-01
The suitability of existing terrestrial extractive metallurgical processes for the production of Al, Ti, Fe, Mg, and O2 from nonterrestrial resources is examined from both thermodynamic and kinetic points of view. Carbochlorination of lunar anorthite concentrate in conjunction with the Alcoa electrolysis process is suggested for Al; carbochlorination of lunar ilmenite concentrate followed by Ca reduction of TiO2, and subsequent reduction of Fe2O3 by H2, are suggested for Ti and Fe, respectively. Silicothermic reduction of olivine concentrate was found to be attractive for the extraction of Mg because of the technological know-how of the process. Aluminothermic reduction of olivine is the other possible alternative for the production of magnesium. The large quantities of carbon monoxide generated in the metal extraction processes can be used to recover carbon and oxygen by a combination of the following methods: (1) simple disproportionation of CO, (2) methanation of CO and electrolysis of H2O, and (3) solid-state electrolysis of gas mixtures containing CO, CO2, and H2O. The research needed for the adoption of earth-based extraction processes for lunar and asteroidal minerals is outlined.
Method for contour extraction for object representation
Skourikhine, Alexei N.; Prasad, Lakshman
2005-08-30
Contours are extracted for representing a pixelated object in a background pixel field. An object pixel that is the start of a new contour for the object is located and identified as the first pixel of the new contour. A first contour point is then located at the mid-point of a transition edge of the first pixel. A tracing direction from the first contour point is determined for tracing the new contour. Contour points on mid-points of pixel transition edges are sequentially located along the tracing direction until the first contour point is again encountered, completing the tracing of the new contour. The new contour is then added to a list of extracted contours that represent the object. The contour extraction process associates regions and contours by labeling all the contours belonging to the same object with the same label.
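The point-location step of this scheme (finding contour points at the mid-points of pixel transition edges) can be sketched in Python as follows; the ordered tracing and object labeling described above are separate passes that this illustration does not reproduce.

import numpy as np

def transition_edge_midpoints(mask):
    # mask: 2D boolean array, True for object pixels.  Pixel (r, c) is taken to
    # occupy the unit square [c, c+1] x [r, r+1], so the midpoint of its top
    # edge is (c + 0.5, r).  Returns all midpoints of object/background edges.
    padded = np.pad(mask, 1, constant_values=False)
    pts = []
    for r, c in zip(*np.nonzero(mask)):
        pr, pc = r + 1, c + 1                                  # indices into the padded mask
        if not padded[pr - 1, pc]: pts.append((c + 0.5, r))        # top edge
        if not padded[pr + 1, pc]: pts.append((c + 0.5, r + 1))    # bottom edge
        if not padded[pr, pc - 1]: pts.append((c, r + 0.5))        # left edge
        if not padded[pr, pc + 1]: pts.append((c + 1, r + 0.5))    # right edge
    return np.array(pts)

# Example: a 1x2 pixel object yields the 6 edge midpoints of its outline.
demo = np.zeros((3, 4), dtype=bool)
demo[1, 1:3] = True
print(transition_edge_midpoints(demo))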
The critical distance in laser-induced plasmas: an operative definition
NASA Astrophysics Data System (ADS)
Delle Side, D.; Giuffreda, E.; Nassisi, V.
2016-05-01
We propose a method to estimate a precise value for the critical distance Lcr after which three-body recombination stops producing charge losses in an expanding laser-induced plasma. We show in particular that the total charge collected has a "reversed sigmoid" shape as a function of the target-to-detector distance. By fitting the total charge data with a logistic-related function, we take as Lcr the intercept of the tangent to this curve at its inflection point. Furthermore, this value scales well with theoretical predictions. From the application point of view, this could be of great practical interest, since it provides a reliable way to precisely determine the geometry of the extraction system in laser ion sources.
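A minimal sketch of the fitting procedure is given below, assuming a decreasing logistic model and reading Lcr as the distance at which the tangent at the inflection point reaches the lower plateau; this particular intercept reading and the data are placeholders, not the authors' definition or measurements.

import numpy as np
from scipy.optimize import curve_fit

def reversed_sigmoid(d, q_hi, q_lo, d0, w):
    # Logistic curve decreasing from q_hi to q_lo with its inflection point at d0.
    return q_lo + (q_hi - q_lo) / (1.0 + np.exp((d - d0) / w))

def critical_distance(distances, charges):
    # Fit the reversed sigmoid; the tangent at the inflection point has slope
    # -(q_hi - q_lo) / (4 w), so starting from the mid value it reaches the
    # lower plateau after an additional horizontal run of 2 w.
    p0 = [charges.max(), charges.min(), np.median(distances), 1.0]
    (q_hi, q_lo, d0, w), _ = curve_fit(reversed_sigmoid, distances, charges, p0=p0)
    return d0 + 2.0 * w

if __name__ == "__main__":
    d = np.linspace(2.0, 30.0, 30)                        # cm, placeholder distances
    rng = np.random.default_rng(1)
    q = reversed_sigmoid(d, 10.0, 4.0, 12.0, 2.0) + rng.normal(0, 0.1, d.size)
    print(f"L_cr ~ {critical_distance(d, q):.1f} cm")     # ~16 cm for these synthetic data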
Fromm, Matthias; Bayha, Sandra; Carle, Reinhold; Kammerer, Dietmar R
2012-02-08
The phenolic constituents of seeds of 12 different apple cultivars were fractionated by sequential extraction with aqueous acetone (30:70, v/v) and ethyl acetate after hexane extraction of the lipids. Low molecular weight phenolic compounds were individually quantitated by RP-HPLC-DAD. The contents of extractable and nonextractable procyanidins were determined by applying RP-HPLC following thiolysis and n-butanol/HCl hydrolysis, respectively. As expected, the results revealed marked differences of the ethyl acetate extracts, aqueous acetone extracts, and insoluble residues with regard to contents and mean degrees of polymerization of procyanidins. Total phenolic contents in the defatted apple seed residues ranged between 18.4 and 99.8 mg/g. Phloridzin was the most abundant phenolic compound, representing 79-92% of monomeric polyphenols. Yields of phenolic compounds significantly differed among the cultivars under study, with seeds of cider apples generally being richer in phloridzin and catechins than seeds of dessert apple cultivars. This is the first study presenting comprehensive data on the contents of phenolic compounds in apple seeds comprising extractable and nonextractable procyanidins. Furthermore, the present work points out a strategy for the sustainable and complete exploitation of apple seeds as valuable agro-industrial byproducts, in particular as a rich source of phloridzin and antioxidant flavanols.
Extracting cross sections and water levels of vegetated ditches from LiDAR point clouds
NASA Astrophysics Data System (ADS)
Roelens, Jennifer; Dondeyne, Stefaan; Van Orshoven, Jos; Diels, Jan
2016-12-01
The hydrologic response of a catchment is sensitive to the morphology of the drainage network. The dimensions of larger channels are usually well known; however, geometrical data for man-made ditches are often missing, as they are numerous and small. Aerial LiDAR data offer the possibility to extract these small geometrical features. Analysing the three-dimensional point clouds directly maintains the highest degree of information. A longitudinal and a cross-sectional buffer were used to extract the cross-sectional profile points from the LiDAR point cloud. The profile was represented by spline functions fitted through the minimum envelope of the extracted points. The cross-sectional ditch profiles were classified for the presence of water and vegetation based on the normalized difference water index and the spatial characteristics of the points along the profile. The normalized difference water index was created using the RGB and intensity data coupled to the LiDAR points. The mean vertical deviation of 0.14 m found between the extracted and reference cross sections could mainly be attributed to the occurrence of water and partly to vegetation on the banks. In contrast to the cross-sectional area, the extracted width was not influenced by the environment (coefficient of determination R² = 0.87). Water and vegetation influence the extracted ditch characteristics, but the proposed method is still robust and therefore facilitates input data acquisition and improves the accuracy of spatially explicit hydrological models.
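The minimum-envelope-plus-spline step can be sketched as follows; the bin width, smoothing factor and the omission of the water/vegetation classification are simplifications of the method described above.

import numpy as np
from scipy.interpolate import UnivariateSpline

def minimum_envelope_spline(offsets, elevations, bin_width=0.1, smoothing=0.01):
    # offsets: 1D distances (m) of the extracted LiDAR points along the cross
    # section; elevations: corresponding heights (m).  Take the lowest point in
    # each bin (the minimum envelope) and fit a smoothing spline through it.
    bins = np.floor(offsets / bin_width).astype(int)
    xs, zs = [], []
    for b in np.unique(bins):
        sel = bins == b
        i = np.argmin(elevations[sel])
        xs.append(offsets[sel][i])
        zs.append(elevations[sel][i])
    xs, zs = np.array(xs), np.array(zs)
    order = np.argsort(xs)
    return UnivariateSpline(xs[order], zs[order], s=smoothing * len(xs))

# Usage (needs at least four envelope points for the default cubic spline):
#   profile = minimum_envelope_spline(offsets, elevations)
#   z = profile(np.linspace(offsets.min(), offsets.max(), 200))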
Tiwari, Mayank; Gupta, Bhupendra
2018-04-01
For source camera identification (SCI), photo response non-uniformity (PRNU) has been widely used as the fingerprint of the camera. The PRNU is extracted from the image by applying a de-noising filter and then taking the difference between the original image and the de-noised image. However, it is observed that intensity-based features and high-frequency details (edges and texture) of the image affect the quality of the extracted PRNU. This affects the correlation calculation and creates problems in SCI. To solve this problem, we propose a weighting function based on image features. We experimentally identify the effect of image features (intensity and high-frequency content) on the estimated PRNU, and then develop a weighting function which gives higher weights to image regions that yield reliable PRNU and comparatively lower weights to regions that do not. Experimental results show that the proposed weighting function is able to improve the accuracy of SCI to a great extent. Copyright © 2018 Elsevier B.V. All rights reserved.
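A generic version of this pipeline is sketched below: a noise residual obtained with a stand-in Gaussian denoiser, an illustrative intensity/texture weight (not the paper's weighting function), and a weighted normalized correlation against a camera reference pattern.

import numpy as np
from scipy.ndimage import gaussian_filter

def noise_residual(image, sigma=1.0):
    # PRNU-style residual: original image minus a denoised version of it.
    image = image.astype(np.float64)
    return image - gaussian_filter(image, sigma)

def illustrative_weights(image, sigma=1.0, alpha=2.0):
    # Down-weight very dark/saturated pixels and strong edges (illustrative only).
    image = image.astype(np.float64) / 255.0
    intensity_w = 4.0 * image * (1.0 - image)               # peaks at mid-gray
    grad = np.hypot(*np.gradient(gaussian_filter(image, sigma)))
    texture_w = 1.0 / (1.0 + alpha * grad / (grad.mean() + 1e-12))
    return intensity_w * texture_w

def weighted_correlation(residual, reference, weights):
    # Weighted normalized correlation between a residual and a camera fingerprint.
    w = weights / weights.sum()
    r = residual - (w * residual).sum()
    k = reference - (w * reference).sum()
    num = (w * r * k).sum()
    den = np.sqrt((w * r * r).sum() * (w * k * k).sum())
    return num / den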
When Dijkstra Meets Vanishing Point: A Stereo Vision Approach for Road Detection.
Zhang, Yigong; Su, Yingna; Yang, Jian; Ponce, Jean; Kong, Hui
2018-05-01
In this paper, we propose a vanishing-point constrained Dijkstra road model for road detection in a stereo-vision paradigm. First, the stereo camera is used to generate the u- and v-disparity maps of the road image, from which the horizon can be extracted. With the horizon and ground region constraints, we can robustly locate the vanishing point of the road region. Second, a weighted graph is constructed using all pixels of the image, and the detected vanishing point is treated as the source node of the graph. By computing a vanishing-point constrained Dijkstra minimum-cost map, where both the disparity and the gradient of the gray image are used to calculate the cost between two neighboring pixels, the problem of detecting road borders in the image is transformed into that of finding two shortest paths that originate from the vanishing point and end at two pixels in the last row of the image. The proposed approach has been implemented and tested over 2600 grayscale images of different road scenes in the KITTI data set. The experimental results demonstrate that this training-free approach can detect the horizon, vanishing point, and road regions very accurately and robustly. It can achieve promising performance.
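The shortest-path step can be sketched with a plain Dijkstra search on the pixel grid, seeded at the vanishing point and stopped at the bottom image row; the cost-map construction from gray-image and disparity gradients, and the horizon/vanishing-point detection, are assumed given.

import heapq
import numpy as np

def border_path(cost, vp):
    # Cheapest 8-connected path from the vanishing point vp = (row, col) to the
    # last image row on a per-pixel cost map.  Returns a list of (row, col) pixels.
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[vp] = 0.0
    heap = [(0.0, vp)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist[r, c]:
            continue
        if r == h - 1:                        # reached the bottom row: rebuild the path
            path = [(r, c)]
            while path[-1] in prev:
                path.append(prev[path[-1]])
            return path[::-1]
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w:
                    nd = d + cost[nr, nc]
                    if nd < dist[nr, nc]:
                        dist[nr, nc] = nd
                        prev[(nr, nc)] = (r, c)
                        heapq.heappush(heap, (nd, (nr, nc)))
    return []

# Usage idea: cost = 1.0 + grad_gray + grad_disparity; run border_path once for
# the left border and once for the right (e.g. by masking half of the image).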
Developing a system for blind acoustic source localization and separation
NASA Astrophysics Data System (ADS)
Kulkarni, Raghavendra
This dissertation presents innovative methodologies for locating, extracting, and separating multiple incoherent sound sources in three-dimensional (3D) space, and applications of the time reversal (TR) algorithm to pinpoint the hyperactive neural activities inside the brain auditory structure that are correlated to the tinnitus pathology. Specifically, an acoustic modeling based method is developed for locating arbitrary and incoherent sound sources in 3D space in real time by using a minimal number of microphones, and the Point Source Separation (PSS) method is developed for extracting target signals from directly measured mixed signals. Combining these two approaches leads to a novel technology known as Blind Sources Localization and Separation (BSLS) that enables one to locate multiple incoherent sound signals in 3D space and separate original individual sources simultaneously, based on the directly measured mixed signals. These technologies have been validated through numerical simulations and experiments conducted in various non-ideal environments where there are non-negligible, unspecified sound reflections and reverberation as well as interferences from random background noise. Another innovation presented in this dissertation is concerned with applications of the TR algorithm to pinpoint the exact locations of hyper-active neurons in the brain auditory structure that are directly correlated to the tinnitus perception. Benchmark tests conducted on normal rats have confirmed the localization results provided by the TR algorithm. Results demonstrate that the spatial resolution of this source localization can be as high as the micrometer level. This high precision localization may lead to a paradigm shift in tinnitus diagnosis, which may in turn produce a more cost-effective treatment for tinnitus than any of the existing ones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Draganic, I. N., E-mail: draganic@lanl.gov
Basic vacuum calculations were performed for various operating conditions of the Los Alamos National Neutron Science H⁻ Cockcroft-Walton (CW) injector and the Ion Source Test Stand (ISTS). The vacuum pressure was estimated for both the CW and ISTS at five different points: (1) inside the H⁻ ion source, (2) in front of the Pierce electrode, (3) at the extraction electrode, (4) at the column electrode, and (5) at the ground electrode. A static vacuum analysis of residual gases and the working hydrogen gas was completed for the normal ion source working regime. Gas density and partial pressure were estimated for the injected hydrogen gas. The attenuation of H⁻ beam current and generation of electron current in the high voltage acceleration columns and low energy beam transport lines were calculated. The interaction of H⁻ ions on molecular hydrogen (H₂) is discussed as a dominant collision process in describing electron stripping rates. These results are used to estimate the observed increase in the ratio of electrons to H⁻ ion beam in the ISTS beam transport line.
Terrain Extraction by Integrating Terrestrial Laser Scanner Data and Spectral Information
NASA Astrophysics Data System (ADS)
Lau, C. L.; Halim, S.; Zulkepli, M.; Azwan, A. M.; Tang, W. L.; Chong, A. K.
2015-10-01
The extraction of true terrain points from unstructured laser point cloud data is an important process in order to produce an accurate digital terrain model (DTM). However, most spatial filtering methods utilize only the geometrical data to discriminate terrain points from non-terrain points. Point cloud filtering can also be improved by using the spectral information available with some scanners. Therefore, the objective of this study is to investigate the effectiveness of using the three channels (red, green and blue) of the colour image captured by the built-in digital camera available in some Terrestrial Laser Scanners (TLS) for terrain extraction. In this study, data acquisition was conducted at a mini replica landscape at the Universiti Teknologi Malaysia (UTM) Skudai campus using a Leica ScanStation C10. The spectral information of the coloured point clouds from selected sample classes was extracted for spectral analysis. Coloured points whose values fall within the corresponding preset spectral thresholds are identified as belonging to that specific feature class. This terrain extraction process is implemented in Matlab code developed for the study. Results demonstrate that a passive image of higher spectral resolution is required in order to improve the output, because the low quality of the colour images captured by the sensor contributes to low separability in spectral reflectance. In conclusion, this study shows that spectral information can be used as a parameter for terrain extraction.
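The thresholding step can be sketched in Python as follows (the study's own implementation is in Matlab); the RGB windows below are placeholders, not the thresholds derived from the spectral analysis in the paper.

import numpy as np

# Illustrative RGB windows per class; the actual values would come from the
# spectral analysis of the sampled classes, not from these numbers.
CLASS_THRESHOLDS = {
    "terrain":    {"r": (90, 160), "g": (70, 140), "b": (50, 120)},
    "vegetation": {"r": (30, 110), "g": (90, 200), "b": (30, 110)},
}

def classify_colored_points(rgb):
    # Assign each coloured point (N x 3 array of 0-255 RGB) to the first class
    # whose preset spectral window contains it; unmatched points get "other".
    labels = np.full(len(rgb), "other", dtype=object)
    for name, t in CLASS_THRESHOLDS.items():
        mask = ((rgb[:, 0] >= t["r"][0]) & (rgb[:, 0] <= t["r"][1]) &
                (rgb[:, 1] >= t["g"][0]) & (rgb[:, 1] <= t["g"][1]) &
                (rgb[:, 2] >= t["b"][0]) & (rgb[:, 2] <= t["b"][1]) &
                (labels == "other"))
        labels[mask] = name
    return labels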
Efficient Open Source Lidar for Desktop Users
NASA Astrophysics Data System (ADS)
Flanagan, Jacob P.
Lidar (Light Detection and Ranging) is a remote sensing technology that utilizes a device similar to a rangefinder to determine the distance to a target. A laser pulse is shot at an object and the time it takes for the pulse to return is measured. The distance to the object is easily calculated using the speed of light. For lidar, this laser is moved (primarily in a rotational movement, usually accompanied by a translational movement) and the distances to objects are recorded several thousands of times per second. From this, a 3-dimensional structure can be procured in the form of a point cloud. A point cloud is a collection of 3-dimensional points with at least an x, a y and a z attribute. These 3 attributes represent the position of a single point in 3-dimensional space. Other attributes can be associated with the points, such as the intensity of the return pulse, the color of the target or even the time the point was recorded. Another very useful, post-processed attribute is point classification, where a point is associated with the type of object the point represents (e.g., ground). Lidar has gained popularity, and advancements in the technology have made its collection easier and cheaper, creating larger and denser datasets. The need to handle these data more efficiently has become a necessity; processing, visualizing or even simply loading lidar can be computationally intensive due to its very large size. Standard remote sensing and geographical information systems (GIS) software (ENVI, ArcGIS, etc.) was not originally built for optimized point cloud processing; its implementation is an afterthought and therefore inefficient. Newer, more optimized software for point cloud processing (QTModeler, TopoDOT, etc.) usually lacks more advanced processing tools, requires higher-end computers and is very costly. Existing open source lidar tools approach the loading and processing of lidar in an iterative fashion that requires batch coding and processing time that could take months for a standard lidar dataset. This project attempts to build software with the best approach for creating, importing and exporting, manipulating and processing lidar, especially in the environmental field. Development of this software is described in 3 sections: (1) explanation of the search methods for efficiently extracting the "area of interest" (AOI) data from disk (file space); (2) using file space (for storage), budgeting memory space (for efficient processing) and moving between the two; and (3) method development for creating lidar products (usually raster based) used in environmental modeling and analysis (e.g., hydrology feature extraction, geomorphological studies, ecology modeling).
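A minimal sketch of the AOI extraction idea, assuming the laspy package (2.x API); the file name and bounding box are placeholders, and a production tool would stream chunks rather than read an entire tile when it does not fit in memory.

import numpy as np
import laspy  # assumed available; laspy 2.x API

def extract_aoi(las_path, xmin, ymin, xmax, ymax):
    # Return the x, y, z coordinates of the points falling inside the AOI.
    las = laspy.read(las_path)
    x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)
    mask = (x >= xmin) & (x <= xmax) & (y >= ymin) & (y <= ymax)
    return np.column_stack([x[mask], y[mask], z[mask]])

# xyz = extract_aoi("tile.las", 552000, 4182000, 552500, 4182500)  # placeholder bounds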
Montoro, Paola; Maldini, Mariateresa; Luciani, Leonilda; Tuberoso, Carlo I G; Congiu, Francesca; Pizza, Cosimo
2012-08-01
The radical scavenging activities of Crocus sativus petals, stamens and entire flowers, which are waste products in the production of the spice saffron, were determined by employing the ABTS radical scavenging method. At the same time, the metabolic profiles of the different extracts (obtained from petals, stamens and flowers) were obtained by LC-ESI-IT MS (liquid chromatography coupled with electrospray mass spectrometry equipped with an ion trap analyser). LC-ESI-MS is a technique widely used for the qualitative fingerprinting of herbal extracts, particularly of phenolic compounds. To compare the different extracts from an analytical point of view, a specific method for qualitative LC-MS analysis was developed. The high variety of glycosylated flavonoids found in the metabolic profiles could give value to C. sativus petals, stamens and entire flowers. Waste products obtained during saffron production could thus represent an interesting source of phenolic compounds, given the high variety of compounds and their free radical scavenging activity. © 2012 Institute of Food Technologists®
NMR analysis of seven selections of vermentino grape berry: metabolites composition and development.
Mulas, Gilberto; Galaffu, Maria Grazia; Pretti, Luca; Nieddu, Gianni; Mercenaro, Luca; Tonelli, Roberto; Anedda, Roberto
2011-02-09
The goal of this work was to study via NMR the unaltered metabolic profile of Sardinian Vermentino grape berry. Seven selections of Vermentino were harvested from the same vineyard. Berries were stored and extracted following an unbiased extraction protocol. Extracts were analyzed to investigate variability in metabolite concentrations as a function of the clone, the position of berries in the bunch or the growing area within the vineyard. Quantitative NMR and statistical analysis (PCA, correlation analysis, ANOVA) of the experimental data point out that, among the investigated sources of variation, the position of the berries within the bunch mainly influences the metabolic profile of berries, while the metabolic profile does not seem to be significantly influenced by growing area and clone. Significant variability of amino acids such as arginine and proline, and of organic acids (malic and citric), characterizes the rapid rearrangements of the metabolic profile in response to environmental stimuli. Finally, an application is described on the analysis of metabolite variation throughout the physiological development of berries.
NASA Astrophysics Data System (ADS)
Lv, Zheng; Sui, Haigang; Zhang, Xilin; Huang, Xianfeng
2007-11-01
As one of the most important geospatial objects and military installations, an airport is always a key target in the fields of transportation and military affairs. Therefore, automatic recognition and extraction of airports from remote sensing images is very important and urgent for civil aviation updating and military applications. In this paper, a new multi-source data fusion approach to automatic airport information extraction, updating and 3D modeling is addressed. The corresponding key technologies are discussed in detail, including feature extraction of airport information based on a modified Otsu algorithm, automatic change detection based on a new parallel-lines-based buffer detection algorithm, 3D modeling based on a gradual elimination of non-building points algorithm, 3D change detection between the old airport model and LIDAR data, and the import of typical CAD models. Finally, based on these technologies, we developed a prototype system, and the results show that our method achieves good results.
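The abstract's feature-extraction step builds on a modified Otsu algorithm. The sketch below shows only the standard Otsu thresholding baseline with OpenCV, not the paper's modification; the input image path is a placeholder.

```python
# Minimal sketch of standard Otsu thresholding with OpenCV, as a baseline for
# the "modified Otsu" feature-extraction step described above.
import cv2

img = cv2.imread("airport_scene.tif", cv2.IMREAD_GRAYSCALE)
if img is None:
    raise FileNotFoundError("airport_scene.tif not found")

# Otsu picks the threshold that maximizes the between-class variance.
thresh_value, binary = cv2.threshold(
    img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU
)
print(f"Otsu threshold selected: {thresh_value}")
cv2.imwrite("airport_binary.png", binary)
```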
Rodsamran, Pattrathip; Sothornvit, Rungsinee
2018-02-15
Coconut cake, a by-product from milk and oil extractions, contains a high amount of protein. Protein extraction from coconut milk cake and coconut oil cake was investigated. The supernatant and precipitate protein powders from both coconut milk and oil cakes were compared based on their physicochemical and functional properties. Glutelin was the predominant protein fraction in both coconut cakes. Protein powders from milk cake presented higher water and oil absorption capacities than those from oil cake. Both protein powders from oil cake exhibited better foaming capacity and a better emulsifying activity index than those from milk cake. Coconut proteins were mostly solubilized in strong acidic and alkaline solutions. Minimum solubility was observed at pH 4, confirming the isoelectric point of coconut protein. Therefore, the coconut residues after extractions might be a potential alternative renewable plant protein source to use as a food ingredient to enhance food nutrition and quality. Copyright © 2017 Elsevier Ltd. All rights reserved.
Mottier, Pascal; Frank, Nancy; Dubois, Mathieu; Tarres, Adrienne; Bessaire, Thomas; Romero, Roman; Delatour, Thierry
2014-01-01
Tris(nonylphenyl)phosphite (TNPP), an antioxidant used in polyethylene resins for food applications, is problematic since it is a source of the endocrine-disrupting chemicals 4-nonylphenols (4NP) upon migration into packaged foods. As a response to concerns surrounding the presence of 4NP-based compounds in packaging materials, some resin producers and additive suppliers have decided to eliminate TNPP from formulations. This paper describes an analytical procedure to verify the "TNPP-free" statement in multilayer laminates used for bag-in-box packaging. The method involves extraction of TNPP from laminates with organic solvents followed by detection/quantification by LC-MS/MS using the atmospheric pressure chemical ionisation (APCI) mode. A further acidic treatment of the latter extract allows the release of 4NP from potentially extracted TNPP. 4NP is then analysed by LC-MS/MS using electrospray ionisation (ESI) mode. This two-step analytical procedure ensures not only TNPP quantification in laminates, but also allows the flagging of other possible sources of 4NP in such packaging materials, typically as non-intentionally added substances (NIAS). The limits of quantification were 0.50 and 0.48 µg dm⁻² for TNPP and 4NP in laminates, respectively, with recoveries ranging between 87% and 114%. Usage of such analytical methodologies in quality control operations has pointed to a lack of traceability at the packaging supplier level and cross-contamination of extrusion equipment at the converter level, when TNPP-containing laminates are processed on the same machine beforehand.
Favre-Réguillon, Alain; Draye, Micheline; Lebuzit, Gérard; Thomas, Sylvie; Foos, Jacques; Cote, Gérard; Guy, Alain
2004-06-17
Cloud point extraction (CPE) was used to extract and separate lanthanum(III) and gadolinium(III) nitrate from an aqueous solution. The methodology used is based on the formation of lanthanide(III)-8-hydroxyquinoline (8-HQ) complexes soluble in a micellar phase of non-ionic surfactant. The lanthanide(III) complexes are then extracted into the surfactant-rich phase at a temperature above the cloud point temperature (CPT). The structure of the non-ionic surfactant, and the chelating agent-metal molar ratio are identified as factors determining the extraction efficiency and selectivity. In an aqueous solution containing equimolar concentrations of La(III) and Gd(III), extraction efficiency for Gd(III) can reach 96% with a Gd(III)/La(III) selectivity higher than 30 using Triton X-114. Under those conditions, a Gd(III) decontamination factor of 50 is obtained.
A method for automatic feature points extraction of human vertebrae three-dimensional model
NASA Astrophysics Data System (ADS)
Wu, Zhen; Wu, Junsheng
2017-05-01
A method for automatic extraction of the feature points of the human vertebrae three-dimensional model is presented. Firstly, the statistical model of vertebrae feature points is established based on the results of manual vertebrae feature points extraction. Then anatomical axial analysis of the vertebrae model is performed according to the physiological and morphological characteristics of the vertebrae. Using the axial information obtained from the analysis, a projection relationship between the statistical model and the vertebrae model to be extracted is established. According to the projection relationship, the statistical model is matched with the vertebrae model to get the estimated position of the feature point. Finally, by analyzing the curvature in the spherical neighborhood with the estimated position of feature points, the final position of the feature points is obtained. According to the benchmark result on multiple test models, the mean relative errors of feature point positions are less than 5.98%. At more than half of the positions, the error rate is less than 3% and the minimum mean relative error is 0.19%, which verifies the effectiveness of the method.
Comparison of Moringa Oleifera seeds oil characterization produced chemically and mechanically
NASA Astrophysics Data System (ADS)
Eman, N. A.; Muhamad, K. N. S.
2016-06-01
It is established that virtually every part of the Moringa oleifera tree (leaves, stem, bark, root, flowers, seeds, and seed oil) is beneficial in some way, with great benefits to human beings. The tree is rich in proteins, vitamins and minerals. All Moringa oleifera food products have a very high nutritional value. They are eaten directly as food, as supplements, and as seasonings, as well as used as fodder for animals. The purpose of this research is to investigate the effect of seed particle size on oil extraction using a chemical method (solvent extraction), and to compare the properties of Moringa oleifera seed oil produced chemically (solvent extraction) and mechanically (mechanical press). The Moringa oleifera seeds were ground and sieved, and the oil was extracted using the Soxhlet extraction technique with n-hexane on three different sample sizes (2 mm, 1 mm, and 500 μm). The average oil yield was 36.1%, 40.8%, and 41.5% for the 2 mm, 1 mm, and 500 μm particle sizes, respectively. The properties of the Moringa oleifera seed oil were: density of 873 kg/m3 and 880 kg/m3, and kinematic viscosity of 42.2 mm2/s and 9.12 mm2/s, for the mechanical and chemical methods, respectively. The pH, cloud point and pour point were the same for oil produced by both methods: 6, 18°C and 12°C, respectively. Among the fatty acids, oleic acid is present at a high percentage of 75.39% and 73.60% for the chemical and mechanical methods, respectively. Other fatty acids (gadoleic acid, behenic acid, palmitic acid) are present as well in both samples, at lower percentages of 2.54%, 5.83%, and 5.73%, respectively, in the chemical-method oil, and 2.40%, 6.73%, and 6.04%, respectively, in the mechanical-method oil. In conclusion, the results showed that both methods can produce oil of high quality. Moringa oleifera seed oil appears to be an acceptable source of oil rich in oleic acid, comparable to olive oil in quality, that can be consumed in Malaysia, where olive oil is imported at high prices. At the same time, cultivation of the Moringa oleifera tree is considered to be a new source of income for the country and can provide more job opportunities.
Obtention and characterization of phenolic extracts from different cocoa sources.
Ortega, Nàdia; Romero, Maria-Paz; Macià, Alba; Reguant, Jordi; Anglès, Neus; Morelló, José-Ramón; Motilva, Maria-Jose
2008-10-22
The aim of this study was to evaluate several cocoa sources to obtain a rich phenol extract for use as an ingredient in the food industry. Two types of phenolic extracts, complete and purified, from different cocoa sources (beans, nibs, liquor, and cocoa powder) were investigated. UPLC-MS/MS was used to identify and quantify the phenolic composition of the extracts, and the Folin-Ciocalteu and vanillin assays were used to determine the total phenolic and flavan-3-ol contents, respectively. The DPPH and ORAC assays were used to measure their antioxidant activity. The results of the analysis of the composition of the extracts revealed that the major fraction was procyanidins, followed by flavones and phenolic acids. From the obtained results, the nib could be considered the most interesting source for obtaining a rich phenolic cocoa extract because of its rich phenolic profile content and high antioxidant activity in comparison with the other cocoa sources.
NASA Astrophysics Data System (ADS)
Mangeney, A.; Kuehnert, J.; Capdeville, Y.; Durand, V.; Stutzmann, E.; Kone, E. H.; Sethi, S.
2017-12-01
During their flow along the topography, landslides generate seismic waves in a wide frequency range. These so-called landquakes can be recorded at very large distances (a few hundred km for large landslides). The recorded signals depend on the landslide seismic source and on the seismic wave propagation. If the wave propagation is well understood, the seismic signals can be inverted for the seismic source and thus can be used to obtain information on the landslide properties and dynamics. Analysis and modeling of long-period seismic signals (10-150 s) have helped in this way to discriminate between different landslide scenarios and to constrain rheological parameters (e.g. Favreau et al., 2010). This was possible because topography poorly affects wave propagation at these long periods and the landslide seismic source can be approximated as a point source. In the near-field and at higher frequencies (> 1 Hz), the spatial extent of the source has to be taken into account and the influence of the topography on the recorded seismic signal should be quantified in order to extract information on the landslide properties and dynamics. The characteristic signature of distributed sources and varying topographies is studied as a function of frequency and recording distance. The time-dependent spatial distribution of the forces applied to the ground by the landslide is obtained using granular flow numerical modeling on 3D topography. The generated seismic waves are simulated using the spectral element method. The simulated seismic signal is compared to observed seismic data from rockfalls at the Dolomieu Crater of Piton de la Fournaise (La Réunion). Favreau, P., Mangeney, A., Lucas, A., Crosta, G., and Bouchut, F. (2010). Numerical modeling of landquakes. Geophysical Research Letters, 37(15):1-5.
Development open source microcontroller based temperature data logger
NASA Astrophysics Data System (ADS)
Abdullah, M. H.; Che Ghani, S. A.; Zaulkafilai, Z.; Tajuddin, S. N.
2017-10-01
This article discusses the development stages in designing, prototyping, testing and deploying a portable open source microcontroller based temperature data logger for use in a rough industrial environment. The 5 V powered prototype of the data logger is equipped with an open source Arduino microcontroller integrating multiple thermocouple sensors with their modules, secure digital (SD) card storage, a liquid crystal display (LCD), a real time clock, and an electronic enclosure made of acrylic. The data logger is programmed so that 8 thermocouple readings can be acquired within a 3 s interval and displayed on the LCD simultaneously. The recorded temperature readings at four different points on both hydrodistillation units show similar profile patterns, and the highest yield of extracted oil was achieved on hydrodistillation unit 2 at 0.004%. From the obtained results, this study achieved the objective of developing an inexpensive, portable and robust eight-channel temperature measuring module with capabilities to monitor and store real time data.
Creation and Delivery of New Superpixelized DIRBE Map Products
NASA Technical Reports Server (NTRS)
Weiland, J.
1998-01-01
Phase 1 called for the following tasks: (1) completion of code to generate intermediate files containing the individual DIRBE observations which would be used to make the superpixelized maps; (2) completion of code necessary to generate the maps themselves; and (3) quality control on test-case maps in the form of point-source extraction and photometry. Items 1 and 2 are well in hand and the tested code is nearly complete. A few test maps have been generated for the tests mentioned in item 3. Map generation is not in production mode yet.
NASA Astrophysics Data System (ADS)
Franzen, P.; Gutser, R.; Fantz, U.; Kraus, W.; Falter, H.; Fröschle, M.; Heinemann, B.; McNeely, P.; Nocentini, R.; Riedl, R.; Stäbler, A.; Wünderlich, D.
2011-07-01
The ITER neutral beam system requires a negative hydrogen ion beam of 48 A with an energy of 0.87 MeV, and a negative deuterium beam of 40 A with an energy of 1 MeV. The beam is extracted from a large ion source of dimension 1.9 × 0.9 m2 by an acceleration system consisting of seven grids with 1280 apertures each. Currently, apertures with a diameter of 14 mm in the first grid are foreseen. In 2007, the IPP RF source was chosen as the ITER reference source due to its reduced maintenance compared with arc-driven sources and the successful development at the BATMAN test facility, which is equipped with the small IPP prototype RF source (about 1/8 of the area of the ITER NBI source). These results, however, were obtained with an extraction system with 8 mm diameter apertures. This paper reports on the comparison at BATMAN of the source performance of an ITER-relevant extraction system equipped with chamfered apertures of 14 mm diameter and of the 8 mm diameter aperture extraction system. The most important result is that there is almost no difference in the achieved current density, which is consistent with ion trajectory calculations, or in the amount of co-extracted electrons. Furthermore, some aspects of the beam optics of both extraction systems are discussed.
CSF Based Non-Ground Points Extraction from LIDAR Data
NASA Astrophysics Data System (ADS)
Shen, A.; Zhang, W.; Shi, H.
2017-09-01
Region growing is a classical method of point cloud segmentation. Based on the idea of collecting pixels with similar properties to form regions, region growing is widely used in many fields such as medicine, forestry and remote sensing. The algorithm has two core problems: the selection of seed points and the setting of the growth constraints, of which the selection of seed points is the foundation. In this paper, we propose a CSF (Cloth Simulation Filtering) based method to extract non-ground seed points effectively. The experiments have shown that this method can obtain a reliable group of seed points compared with traditional methods. It is a new attempt at seed point extraction.
LIDAR Point Cloud Data Extraction and Establishment of 3D Modeling of Buildings
NASA Astrophysics Data System (ADS)
Zhang, Yujuan; Li, Xiuhai; Wang, Qiang; Liu, Jiang; Liang, Xin; Li, Dan; Ni, Chundi; Liu, Yan
2018-01-01
This paper applies Shepard's method to the original LIDAR point cloud data to generate a regular grid DSM, and filters the ground and non-ground point clouds through a double least squares method to obtain the regularized DSM. The regularized DSM is then segmented using a region growing method, the non-building point cloud is removed, and the building point cloud information is obtained. The Canny operator is used to extract the edges of the buildings after segmentation, and Hough transform line detection is used to extract regularized building edges that are smooth and uniform. At last, the E3De3 software is used to establish the 3D models of the buildings.
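For the edge and line steps (Canny operator followed by Hough line detection), a minimal OpenCV sketch is given below. The thresholds and file names are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of the edge-and-line step described above: Canny edges
# followed by probabilistic Hough line detection with OpenCV.
import cv2
import numpy as np

img = cv2.imread("building_segment.png", cv2.IMREAD_GRAYSCALE)
if img is None:
    raise FileNotFoundError("building_segment.png not found")

edges = cv2.Canny(img, 50, 150)                      # candidate building edge pixels

lines = cv2.HoughLinesP(
    edges, rho=1, theta=np.pi / 180, threshold=60,
    minLineLength=30, maxLineGap=5
)

overlay = cv2.cvtColor(img, cv2.COLOR_GRAY2BGR)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(overlay, (x1, y1), (x2, y2), (0, 0, 255), 1)
cv2.imwrite("building_edges_lines.png", overlay)
```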
Ghelichi, Sakhi; Shabanpour, Bahareh; Pourashouri, Parastoo; Hajfathalian, Mona; Jacobsen, Charlotte
2018-03-01
Common carp roe is a rich protein and oil source, which is usually discarded with no specific use. The aims of this study were to extract oil from the discarded roe and examine functional, antioxidant, and antibacterial properties of defatted roe hydrolysates (CDRHs) at various degrees of hydrolysis (DH). Gas chromatography of fatty acid methyl esters revealed that common carp roe oil contained high levels of unsaturated fatty acids. The results of high-performance liquid chromatography-mass spectrometry indicated that enzymatic hydrolysis of defatted roe yielded higher content of essential amino acids. CDRHs displayed higher solubility than untreated defatted roe, which increased with DH. Better emulsifying and foaming properties were observed at lower DH and non-isoelectric points. Furthermore, water and oil binding capacity decreased with DH. CDRHs exhibited antioxidant activity both in vitro and in 5% roe oil-in-water emulsions and inhibited the growth of certain bacterial strains. Common carp roe could be a promising source of unsaturated fatty acids and functional bioactive agents. Unsaturated fatty acid-rich oil extracted from common carp roe can be delivered into food systems by roe oil-in-water emulsions fortified by functional, antioxidant, and antibacterial hydrolysates from the defatted roe. © 2017 Society of Chemical Industry.
Roads Data Conflation Using Update High Resolution Satellite Images
NASA Astrophysics Data System (ADS)
Abdollahi, A.; Riyahi Bakhtiari, H. R.
2017-11-01
Urbanization, industrialization and modernization are rapidly growing in developing countries. New industrial cities, with all the problems brought on by rapid population growth, need infrastructure to support the growth. This has led to the expansion and development of the road network. A great deal of road network data has been made using traditional methods in past years. Over time, a large amount of descriptive information has been assigned to these map data, but their geometric accuracy and precision are not adequate for today's needs. In this regard, it is necessary to improve the geometric accuracy of the road network data while preserving the descriptive data attributed to them, and to update the existing geodatabases. Due to the size and extent of the country, updating the road network maps using traditional methods is time consuming and costly. Conversely, using remote sensing technology and geographic information systems can reduce costs, save time and increase accuracy and speed. With the increasing availability of high resolution satellite imagery and geospatial datasets, there is an urgent need to combine geographic information from overlapping sources to retain accurate data, minimize redundancy, and reconcile data conflicts. In this research, an innovative method for vector-to-imagery conflation, integrating several image-based and vector-based algorithms, is presented. The SVM method was used for image classification and the Level Set method to extract the roads; the different types of road intersections were extracted from the imagery using morphological operators. To match the extracted points and find the corresponding points, a matching function based on the nearest-neighbour method was applied. Finally, after identifying the matching points, the rubber-sheeting method was used to align the two datasets. Residual and RMSE criteria were used to evaluate accuracy. The results demonstrated excellent performance: the average root-mean-square error decreased from 11.8 to 4.1 m.
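The matching step (pairing intersection points extracted from imagery with the legacy vector points by nearest-neighbour search, then evaluating residuals and RMSE) can be sketched as below. This is a toy illustration with synthetic coordinates, not the study's data or exact procedure.

```python
# Minimal sketch of nearest-neighbour point matching and RMSE reporting.
# Coordinates are synthetic placeholders.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
vector_pts = rng.uniform(0, 1000, size=(50, 2))           # legacy road nodes
image_pts = vector_pts + rng.normal(0, 5, size=(50, 2))   # points extracted from imagery

tree = cKDTree(image_pts)
dist, idx = tree.query(vector_pts, k=1)        # nearest image point for each node

max_dist = 20.0                                # reject implausible pairs
matched = dist < max_dist
rmse = np.sqrt(np.mean(dist[matched] ** 2))
print(f"matched {matched.sum()}/{len(vector_pts)} points, RMSE = {rmse:.2f} m")
```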
Chemicals identified in human biological media: a data base. Third annual report, October 1981
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cone, M.V.; Baldauf, M.F.; Martin, F.M.
1981-12-01
Part 2 contains the data base in tabular format. There are two sections, the first with records on nondrug substances, and the second with records on drugs. Chemicals in each section are arranged alphabetically by CAS preferred name, CAS registry number, formula, atomic weight, melting point, boiling point, and vapor pressure. Tissues are listed alphabetically with exposure route, analytical method, number of cases, range, and mean, when available in the source document. A variety of information may also be included that is pertinent to the range and mean as well as experimental design, demography, health effects, pathology, morphology, and toxicity. Review articles are included in the data base; however, no data have been extracted from such documents because the original research articles are included.
Obtaining lutein-rich extract from microalgal biomass at preparative scale.
Fernández-Sevilla, José M; Fernández, F Gabriel Acién; Grima, Emilio Molina
2012-01-01
Lutein extracts are in increasing demand due to their alleged role in the prevention of degenerative disorders such as age-related macular degeneration (AMD). Lutein extracts are currently obtained from plant sources, but microalgae have been demonstrated to be a competitive source likely to become an alternative. The extraction of lutein from microalgae poses specific problems that arise from the different structure and composition of the source biomass. Here, a method is presented for the recovery of lutein-rich carotenoid extracts from microalgal biomass at the kilogram scale.
A CMB foreground study in WMAP data: Extragalactic point sources and zodiacal light emission
NASA Astrophysics Data System (ADS)
Chen, Xi
The Cosmic Microwave Background (CMB) radiation is the remnant heat from the Big Bang. It serves as a primary tool to understand the global properties, content and evolution of the universe. Since 2001, NASA's Wilkinson Microwave Anisotropy Probe (WMAP) satellite has been mapping the full-sky anisotropy with unprecedented accuracy, precision and reliability. The CMB angular power spectrum calculated from the WMAP full-sky maps not only enables accurate testing of cosmological models, but also places significant constraints on model parameters. The CMB signal in the WMAP sky maps is contaminated by microwave emission from the Milky Way and from extragalactic sources. Therefore, in order to use the maps reliably for cosmological studies, the foreground signals must be well understood and removed from the maps. This thesis focuses on the separation of two foreground contaminants from the WMAP maps: extragalactic point sources and zodiacal light emission. Extragalactic point sources constitute the most important foreground on small angular scales. Various methods have been applied to the WMAP single-frequency maps to extract sources. However, due to the limited angular resolution of WMAP, it is possible to confuse positive CMB excursions with point sources or to miss sources that are embedded in negative CMB fluctuations. We present a novel CMB-free source finding technique that utilizes the spectral difference between point sources and the CMB to form internal linear combinations of multifrequency maps that suppress the CMB and better reveal sources. When applied to the WMAP 41, 61 and 94 GHz maps, this technique has not only enabled detection of sources that were previously cataloged by independent methods, but has also allowed the disclosure of new sources. Without the noise contribution from the CMB, this method responds rapidly with integration time. The number of detections varies as t^0.72 in the two-band search and t^0.70 in the three-band search from one year to five years, in comparison to t^0.40 for the WMAP catalogs. Our source catalogs are a good supplement to the existing WMAP source catalogs, and the method itself is proven to be both complementary to and competitive with all current source finding techniques in WMAP maps. Scattered light and thermal emission from the interplanetary dust (IPD) within our Solar System are major contributors to the diffuse sky brightness at most infrared wavelengths. For wavelengths longer than 3.5 μm, the thermal emission of the IPD dominates over scattering, and the emission is often referred to as the Zodiacal Light Emission (ZLE). To set a limit on the ZLE contribution to the WMAP data, we have performed a simultaneous fit of the yearly WMAP time-ordered data to the time variation of the ZLE predicted by the DIRBE IPD model (Kelsall et al. 1998) evaluated at 240 μm, plus ℓ = 1-4 CMB components. It is found that although this fitting procedure can successfully recover the CMB dipole to 0.5% accuracy, it is not sensitive enough to determine the ZLE signal or the other multipole moments very accurately.
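The CMB-free combination can be sketched as a constrained minimum-variance internal linear combination: per-band weights are chosen to null a component with the CMB spectrum while keeping unit response to an assumed point-source spectrum. The sketch below is not the thesis' pipeline; the band noise levels and source spectral index are toy assumptions.

```python
# Minimal sketch of a "CMB-nulling" internal linear combination.
# Minimize w^T N w subject to A^T w = e, giving w = N^-1 A (A^T N^-1 A)^-1 e.
import numpy as np

freqs = np.array([41.0, 61.0, 94.0])            # GHz bands used in the search
a_cmb = np.ones(3)                              # CMB: flat in thermodynamic temperature
a_src = (freqs / freqs[0]) ** -2.0              # assumed point-source spectrum (toy)
noise = np.diag([1.0, 1.2, 2.0]) ** 2           # per-band noise variances (toy)

A = np.column_stack([a_cmb, a_src])             # mixing matrix: 3 bands x 2 components
e = np.array([0.0, 1.0])                        # null the CMB, unit gain on sources

Ninv = np.linalg.inv(noise)
w = Ninv @ A @ np.linalg.solve(A.T @ Ninv @ A, e)

print("weights:", w)
print("CMB response (should be ~0):", w @ a_cmb)
print("source response (should be ~1):", w @ a_src)
```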
Hybrid Automatic Building Interpretation System
NASA Astrophysics Data System (ADS)
Pakzad, K.; Klink, A.; Müterthies, A.; Gröger, G.; Stroh, V.; Plümer, L.
2011-09-01
HABIS (Hybrid Automatic Building Interpretation System) is a system for the automatic reconstruction of building roofs used in virtual 3D building models. Unlike most of the commercially available systems, HABIS is able to work to a high degree automatically. The hybrid method uses different sources, intending to exploit the advantages of each particular source: 3D point clouds usually provide good height and surface data, whereas spatially high-resolution aerial images provide important information for edges and detail information for roof objects like dormers or chimneys. The cadastral data provide important basic information about the building ground plans. The approach used in HABIS works with a multi-stage process, which starts with a coarse roof classification based on 3D point clouds. After that, it continues with an image-based verification of these predicted roofs. In a further step, a final classification and adjustment of the roofs is done. In addition, some roof objects like dormers and chimneys are also extracted based on aerial images and added to the models. In this paper the methods used are described and some results are presented.
The Chandra Source Catalog: Source Properties and Data Products
NASA Astrophysics Data System (ADS)
Rots, Arnold; Evans, Ian N.; Glotfelty, Kenny J.; Primini, Francis A.; Zografou, Panagoula; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.
2009-09-01
The Chandra Source Catalog (CSC) is breaking new ground in several areas. There are two aspects that are of particular interest to the users: its evolution and its contents. The CSC will be a living catalog that becomes richer, bigger, and better in time while still remembering its state at each point in time. This means that users will be able to take full advantage of new additions to the catalog, while retaining the ability to back-track and return to what was extracted in the past. The CSC sheds the limitations of flat-table catalogs. Its sources will be characterized by a large number of properties, as usual, but each source will also be associated with its own specific data products, allowing users to perform mini custom analysis on the sources. Source properties fall in the spatial (position, extent), photometric (fluxes, count rates), spectral (hardness ratios, standard spectral fits), and temporal (variability probabilities) domains, and are all accompanied by error estimates. Data products cover the same coordinate space and include event lists, images, spectra, and light curves. In addition, the catalog contains data products covering complete observations: event lists, background images, exposure maps, etc. This work is supported by NASA contract NAS8-03060 (CXC).
GPU surface extraction using the closest point embedding
NASA Astrophysics Data System (ADS)
Kim, Mark; Hansen, Charles
2015-01-01
Isosurface extraction is a fundamental technique used for both surface reconstruction and mesh generation. One method to extract well-formed isosurfaces is a particle system; unfortunately, particle systems can be slow. In this paper, we introduce an enhanced parallel particle system that uses the closest point embedding as the surface representation to speed up the particle system for isosurface extraction. The closest point embedding is used in the Closest Point Method (CPM), a technique that uses a standard three-dimensional numerical PDE solver on two-dimensional embedded surfaces. To fully take advantage of the closest point embedding, it is coupled with a Barnes-Hut tree code on the GPU. This new technique produces well-formed, conformal unstructured triangular and tetrahedral meshes from labeled multi-material volume datasets. Further, this new parallel implementation of the particle system is faster than any known method for conformal multi-material mesh extraction. The resulting speed-ups gained in this implementation can reduce the time from labeled data to mesh from hours to minutes and benefit users, such as bioengineers, who employ triangular and tetrahedral meshes.
Diagnosing malignant melanoma in ambulatory care: a systematic review of clinical prediction rules
Harrington, Emma; Clyne, Barbara; Wesseling, Nieneke; Sandhu, Harkiran; Armstrong, Laura; Bennett, Holly; Fahey, Tom
2017-01-01
Objectives Malignant melanoma has high morbidity and mortality rates. Early diagnosis improves prognosis. Clinical prediction rules (CPRs) can be used to stratify patients with symptoms of suspected malignant melanoma to improve early diagnosis. We conducted a systematic review of CPRs for melanoma diagnosis in ambulatory care. Design Systematic review. Data sources A comprehensive search of PubMed, EMBASE, PROSPERO, CINAHL, the Cochrane Library and SCOPUS was conducted in May 2015, using combinations of keywords and medical subject headings (MeSH) terms. Study selection and data extraction Studies deriving and validating, validating or assessing the impact of a CPR for predicting melanoma diagnosis in ambulatory care were included. Data extraction and methodological quality assessment were guided by the CHARMS checklist. Results From 16 334 studies reviewed, 51 were included, validating the performance of 24 unique CPRs. Three impact analysis studies were identified. Five studies were set in primary care. The most commonly evaluated CPRs were the Asymmetry, Border irregularity, more than one or uneven distribution of Colour, or a large (greater than 6 mm) Diameter (ABCD) dermoscopy rule (at a cut-point of >4.75; 8 studies; pooled sensitivity 0.85, 95% CI 0.73 to 0.93, specificity 0.72, 95% CI 0.65 to 0.78) and the 7-point dermoscopy checklist (at a cut-point of ≥1 recommending ruling in melanoma; 11 studies; pooled sensitivity 0.77, 95% CI 0.61 to 0.88, specificity 0.80, 95% CI 0.59 to 0.92). The methodological quality of the studies varied. Conclusions At their recommended cut-points, the ABCD dermoscopy rule is more useful for ruling out melanoma than the 7-point dermoscopy checklist. A focus on impact analysis will help translate melanoma risk prediction rules into useful tools for clinical practice. PMID:28264830
NASA Astrophysics Data System (ADS)
Wünderlich, D.; Mochalskyy, S.; Montellano, I. M.; Revel, A.
2018-05-01
Particle-in-cell (PIC) codes have been used since the early 1960s for calculating self-consistently the motion of charged particles in plasmas, taking into account external electric and magnetic fields as well as the fields created by the particles themselves. Due to the very small time steps (on the order of the inverse plasma frequency) and mesh sizes used, the computational requirements can be very high, and they increase drastically with increasing plasma density and size of the calculation domain. Thus, usually small computational domains and/or reduced dimensionality are used. In the last years, the available central processing unit (CPU) power has strongly increased. Together with a massive parallelization of the codes, it is now possible to describe in 3D the extraction of charged particles from a plasma, using calculation domains with an edge length of several centimeters, consisting of one extraction aperture, the plasma in the direct vicinity of the aperture, and a part of the extraction system. Large negative hydrogen or deuterium ion sources are essential parts of the neutral beam injection (NBI) system in future fusion devices like the international fusion experiment ITER and the demonstration reactor (DEMO). For ITER NBI, RF driven sources with a source area of 0.9 × 1.9 m2 and 1280 extraction apertures will be used. The extraction of negative ions is accompanied by the co-extraction of electrons, which are deflected onto an electron dump. Typically, the maximum extracted negative ion current is limited by the amount and the temporal instability of the co-extracted electrons, especially for operation in deuterium. Different PIC codes are available for the extraction region of large negative ion sources for fusion. Additionally, some effort is ongoing in developing codes that describe in a simplified manner (coarser mesh or reduced dimensionality) the plasma of the whole ion source. The presentation first gives a brief overview of the current status of the ion source development for ITER NBI and of the PIC method. Different PIC codes for the extraction region are introduced, as well as their coupling to codes describing the whole source (PIC codes or fluid codes). Different physical and numerical aspects of applying PIC codes to negative hydrogen ion sources for fusion are presented and discussed, as well as selected code results. The main focus of future calculations will be the meniscus formation and identifying measures for reducing the co-extracted electrons, in particular for deuterium operation. The recent results of the 3D PIC code ONIX (calculation domain: one extraction aperture and its vicinity) for the ITER prototype source (1/8 size of the ITER NBI source) are presented.
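To make the PIC cycle concrete (charge deposition, field solve, interpolation, particle push), the sketch below implements a toy 1D electrostatic PIC loop with periodic boundaries and normalized units. It is not ONIX or any of the ion-source codes discussed; all parameters are illustrative.

```python
# Toy 1D electrostatic particle-in-cell loop (normalized units, periodic domain).
import numpy as np

ng, L = 64, 2 * np.pi               # grid cells, domain length
npart, dt, steps = 10000, 0.1, 200
dx = L / ng

rng = np.random.default_rng(1)
x = rng.uniform(0, L, npart)                     # electron positions
v = rng.normal(0, 1, npart) + 0.1 * np.sin(x)    # thermal spread + small perturbation
q_over_m = -1.0
weight = L / npart                               # so the mean electron density is 1

k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)        # angular wavenumbers for the field solve

for _ in range(steps):
    # 1) deposit charge density on the grid (nearest-grid-point weighting)
    cell = (x / dx).astype(int) % ng
    rho = 1.0 - np.bincount(cell, minlength=ng) * weight / dx   # + neutralizing ion background

    # 2) solve phi'' = -rho with FFTs, then E = -dphi/dx
    rho_k = np.fft.rfft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:] ** 2
    E = -np.fft.irfft(1j * k * phi_k, n=ng)

    # 3) interpolate E to the particles and push them (leapfrog-style update)
    v += q_over_m * E[cell] * dt
    x = (x + v * dt) % L

print("final mean kinetic energy:", 0.5 * np.mean(v ** 2))
```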
Emissions estimation from satellite retrievals: A review of current capability
NASA Astrophysics Data System (ADS)
Streets, David G.; Canty, Timothy; Carmichael, Gregory R.; de Foy, Benjamin; Dickerson, Russell R.; Duncan, Bryan N.; Edwards, David P.; Haynes, John A.; Henze, Daven K.; Houyoux, Marc R.; Jacob, Daniel J.; Krotkov, Nickolay A.; Lamsal, Lok N.; Liu, Yang; Lu, Zifeng; Martin, Randall V.; Pfister, Gabriele G.; Pinder, Robert W.; Salawitch, Ross J.; Wecht, Kevin J.
2013-10-01
Since the mid-1990s a new generation of Earth-observing satellites has been able to detect tropospheric air pollution at increasingly high spatial and temporal resolution. Most primary emitted species can be measured by one or more of the instruments. This review article addresses the question of how well we can relate the satellite measurements to quantification of primary emissions and what advances are needed to improve the usability of the measurements by U.S. air quality managers. Built on a comprehensive literature review and comprising input by both satellite experts and emission inventory specialists, the review identifies several targets that seem promising: large point sources of NOx and SO2, species that are difficult to measure by other means (NH3 and CH4, for example), area sources that cannot easily be quantified by traditional bottom-up methods (such as unconventional oil and gas extraction, shipping, biomass burning, and biogenic sources), and the temporal variation of emissions (seasonal, diurnal, episodic). Techniques that enhance the usefulness of current retrievals (data assimilation, oversampling, multi-species retrievals, improved vertical profiles, etc.) are discussed. Finally, we point out the value of having new geostationary satellites like GEO-CAPE and TEMPO over North America that could provide measurements at high spatial (few km) and temporal (hourly) resolution.
A TV Camera System Which Extracts Feature Points For Non-Contact Eye Movement Detection
NASA Astrophysics Data System (ADS)
Tomono, Akira; Iida, Muneo; Kobayashi, Yukio
1990-04-01
This paper proposes a highly efficient camera system which extracts, irrespective of background, feature points such as the pupil, the corneal reflection image and dot-marks pasted on a human face, in order to detect human eye movement by image processing. Two eye movement detection methods are suggested: one utilizing face orientation as well as pupil position, the other utilizing pupil and corneal reflection images. A method of extracting these feature points using LEDs as illumination devices and a new TV camera system designed to record eye movement are proposed. Two kinds of infra-red LEDs are used. These LEDs are set up a short distance apart and emit polarized light of different wavelengths. One light source beams from near the optical axis of the lens and the other is some distance from the optical axis. The LEDs are operated in synchronization with the camera. The camera includes 3 CCD image pick-up sensors and a prism system with 2 boundary layers. Incident rays are separated into 2 wavelengths by the first boundary layer of the prism. One set of rays forms an image on CCD-3. The other set is split by the half-mirror layer of the prism and forms an image including the regularly reflected component, obtained by placing a polarizing filter in front of CCD-1, or an image not including that component, obtained by not placing a polarizing filter in front of CCD-2. Thus, three images with different reflection characteristics are obtained by the three CCDs. The experiment shows that two kinds of subtraction operations between the three images output from the CCDs accentuate three kinds of feature points: the pupil and corneal reflection images and the dot-marks. Since the S/N ratio of the subtracted image is extremely high, the thresholding process is simple and allows reducing the intensity of the infra-red illumination. A high speed image processing apparatus using this camera system is described. Real-time processing of the subtraction, thresholding and center-of-gravity calculation of the feature points is possible.
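The subtraction-and-thresholding idea can be illustrated with the minimal sketch below: subtract two differently illuminated frames, threshold the difference, and take the centroid of the resulting bright region. The frame file names and the threshold are placeholder assumptions; the three-CCD optics themselves are not modeled.

```python
# Minimal sketch: frame subtraction, thresholding, and centroid (center of gravity).
import cv2

frame_coaxial = cv2.imread("ccd_near_axis.png", cv2.IMREAD_GRAYSCALE)   # bright-pupil frame
frame_offaxis = cv2.imread("ccd_off_axis.png", cv2.IMREAD_GRAYSCALE)    # dark-pupil frame

diff = cv2.subtract(frame_coaxial, frame_offaxis)     # pupil region stands out
_, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)

m = cv2.moments(mask, binaryImage=True)
if m["m00"] > 0:
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    print(f"pupil centroid at ({cx:.1f}, {cy:.1f}) pixels")
else:
    print("no pupil-like region found above threshold")
```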
Shi, Zhihong; Zhu, Xiaomin; Zhang, Hongyi
2007-08-15
In this paper, a micelle-mediated extraction and cloud point preconcentration method was developed for the determination of the less hydrophobic compounds aesculin and aesculetin in Cortex fraxini by HPLC. The non-ionic surfactant oligoethylene glycol monoalkyl ether (Genapol X-080) was employed as the extraction solvent. Various experimental conditions were investigated to optimize the extraction process. Under optimum conditions, i.e. 5% Genapol X-080 (w/v), pH 1.0, a liquid/solid ratio of 400:1 (ml/g) and ultrasonic-assisted extraction for 30 min, the extraction yield reached its highest value. For the preconcentration of aesculin and aesculetin by cloud point extraction (CPE), the solution was incubated in a thermostatic water bath at 55 degrees C for 30 min, and 20% NaCl (w/v) was added to the solution to facilitate the phase separation and increase the preconcentration factor during the CPE process. Compared with methanol, which is used in the Chinese Pharmacopoeia (2005 edition) for the extraction of C. fraxini, 5% Genapol X-080 reached a higher extraction efficiency.
NASA Astrophysics Data System (ADS)
Hild, Kenneth E.; Alleva, Giovanna; Nagarajan, Srikantan; Comani, Silvia
2007-01-01
In this study we compare the performance of six independent components analysis (ICA) algorithms on 16 real fetal magnetocardiographic (fMCG) datasets for the application of extracting the fetal cardiac signal. We also compare the extraction results for real data with the results previously obtained for synthetic data. The six ICA algorithms are FastICA, CubICA, JADE, Infomax, MRMI-SIG and TDSEP. The results obtained using real fMCG data indicate that the FastICA method consistently outperforms the others in regard to separation quality and that the performance of an ICA method that uses temporal information suffers in the presence of noise. These two results confirm the previous results obtained using synthetic fMCG data. There were also two notable differences between the studies based on real and synthetic data. The differences are that all six ICA algorithms are independent of gestational age and sensor dimensionality for synthetic data, but depend on gestational age and sensor dimensionality for real data. It is possible to explain these differences by assuming that the number of point sources needed to completely explain the data is larger than the dimensionality used in the ICA extraction.
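As a minimal illustration of the kind of blind source separation compared in the study, the sketch below applies scikit-learn's FastICA (the method that performed best here) to synthetic mixtures; the signals are toy sinusoids, not fetal magnetocardiographic data.

```python
# Minimal sketch of FastICA applied to synthetic multichannel mixtures.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
s1 = np.sin(2 * np.pi * 1.2 * t)                 # stand-in for a maternal-like rhythm
s2 = np.sign(np.sin(2 * np.pi * 2.4 * t))        # stand-in for a fetal-like rhythm
S = np.c_[s1, s2] + 0.05 * rng.normal(size=(t.size, 2))

A = np.array([[1.0, 0.6], [0.4, 1.0], [0.8, 0.3]])   # unknown sensor mixing matrix
X = S @ A.T                                          # three "sensor" channels

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(X)                       # estimated independent components
print("estimated sources shape:", sources.shape)     # (2000, 2)
```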
Automatic drawing for traffic marking with MMS LIDAR intensity
NASA Astrophysics Data System (ADS)
Takahashi, G.; Takeda, H.; Shimano, Y.
2014-05-01
Upgrading of the database of CYBER JAPAN has been strategically promoted since the "Basic Act on Promotion of Utilization of Geographical Information" was enacted in May 2007. In particular, there is high demand for the road information that forms a framework in this database. Therefore, road inventory mapping work has to be accurate and must eliminate variation caused by individual human operators. Further, the large number of traffic markings that are periodically maintained and possibly changed requires an efficient method for updating spatial data. Currently, we apply manual photogrammetric drawing for mapping traffic markings. However, this method is not sufficiently efficient in terms of the required productivity, and data variation can arise from individual operators. In contrast, Mobile Mapping Systems (MMS) and high-density Laser Imaging Detection and Ranging (LIDAR) scanners are rapidly gaining popularity. The aim of this study is to build an efficient method for automatically drawing traffic markings using MMS LIDAR data. The key idea of the method is extracting lines using a Hough transform strategically focused on changes in local reflection intensity along scan lines. Note, however, that this method processes every traffic marking. In this paper, we discuss a highly accurate and non-human-operator-dependent method that applies the following steps: (1) binarizing LIDAR points by intensity and extracting higher-intensity points; (2) generating a Triangulated Irregular Network (TIN) from the higher-intensity points; (3) deleting arcs by length and generating outline polygons on the TIN; (4) generating buffers from the outline polygons; (5) extracting points from the buffers using the original LIDAR points; (6) extracting local-intensity-change points along scan lines using the extracted points; (7) extracting lines from the intensity-change points through a Hough transform; and (8) connecting lines to generate automated traffic marking mapping data.
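Step (7) applies a Hough transform directly to the extracted intensity-change points. The sketch below shows a plain accumulator-based Hough vote over 2D points; the point set is synthetic and the bin sizes are arbitrary choices, not the paper's settings.

```python
# Minimal sketch of a Hough transform applied directly to 2D points.
import numpy as np

rng = np.random.default_rng(0)
# synthetic points roughly on the line y = 0.5*x + 2, plus random clutter
xs = rng.uniform(0, 50, 200)
pts = np.c_[xs, 0.5 * xs + 2 + rng.normal(0, 0.2, xs.size)]
pts = np.vstack([pts, rng.uniform(0, 50, (50, 2))])

thetas = np.deg2rad(np.arange(0, 180, 1.0))          # candidate line orientations
rho_res = 0.5
rho_max = np.hypot(*pts.max(axis=0)) + 1
n_rho = int(2 * rho_max / rho_res) + 1

acc = np.zeros((n_rho, thetas.size), dtype=int)      # (rho, theta) accumulator
for x, y in pts:
    rhos = x * np.cos(thetas) + y * np.sin(thetas)   # rho for every theta
    r_idx = ((rhos + rho_max) / rho_res).astype(int)
    acc[r_idx, np.arange(thetas.size)] += 1

best_r, best_t = np.unravel_index(acc.argmax(), acc.shape)
rho = best_r * rho_res - rho_max
print(f"dominant line: rho={rho:.2f}, theta={np.degrees(thetas[best_t]):.1f} deg")
```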
Modulation of antimicrobial metabolites production by the fungus Aspergillus parasiticus
Bracarense, Adriana A.P.; Takahashi, Jacqueline A.
2014-01-01
Biosynthesis of active secondary metabolites by fungi occurs as a specific response to different growing environments. Changes in this environment alter the chemical and biological profiles, leading to metabolite diversification and consequently to novel pharmacological applications. In this work, the influence of three parameters (fermentation length, medium composition and aeration) on the biosynthesis of antimicrobial metabolites by the fungus Aspergillus parasiticus was studied over 10 distinct fermentation periods. Metabolism modulation in two culture media, CYA and YES, was evaluated by a 2² full factorial design (ANOVA), and in a 2³ factorial design the roles of aeration, medium composition and carbohydrate concentration were also evaluated. In total, 120 different extracts were prepared, their HPLC profiles were obtained, and the antimicrobial activity of all extracts against A. flavus, C. albicans, E. coli and S. aureus was evaluated by a microdilution bioassay. The yield of kojic acid, a fine chemical produced by the fungus A. parasiticus, was determined in all extracts. Statistical analyses pointed to thirteen conditions able to modulate the production of bioactive metabolites by A. parasiticus. The effect of carbon source on metabolite diversification was significant, as shown by the changes in the HPLC profiles of the extracts. Most of the extracts presented inhibition rates higher than that of kojic acid, such as the extract obtained after 6 days of fermentation in YES medium under stirring. Kojic acid was not the only metabolite responsible for the activity, since some highly active extracts were shown to possess low amounts of this compound, as determined by HPLC. PMID:24948950
Temperature effect of natural organic extraction upon light absorbance in dye-sensitized solar cells
NASA Astrophysics Data System (ADS)
Suhaimi, Suriati; Mohamed Siddick, Siti Zubaidah; Retnasamy, Vithyacharan; Abdul Wahid, Mohamad Halim; Ahmad Hambali, Nor Azura Malini; Mohamad Shahimin, Mukhzeer
2017-02-01
Natural organic dyes contain pigments which, when safely extracted from plants, have the potential to be used as sensitizers, promising low-cost, environmentally friendly dye-sensitized solar cells (DSSCs). Ardisia, Bawang Sabrang, Harum Manis mango, Oxalis Triangularis and Rosella showed different absorption peaks when the extraction process was carried out at different temperatures. Hence, these were used as the basis to determine the conversion efficiency as a function of the dye extraction temperature. In this study, all dyes extracted in water showed the best performance at a temperature of 100°C, except for Harum Manis mango, while in ethanol the optimum temperature was between room temperature (25°C) and 50°C. The absorption spectra in water showed broader absorption wavelengths than in the ethanol solvent, resulting in absorption peaks for Ardisia, Harum Manis mango and Rosella between 450 nm and 550 nm. The highest conversion efficiency, approximately 0.96%, was achieved by Oxalis Triangularis extracted in water at 100°C, which corresponds to the broader absorbance trends in the literature. Thus, the optimum extraction temperatures for the dyes are the boiling point of water for aqueous extraction and room temperature for ethanol extraction. Hence, Ardisia, Bawang Sabrang, Harum Manis mango, Oxalis Triangularis and Rosella can be alternative sources of photosensitizers, and the impact of temperature upon light absorbance can be further investigated to produce the ultimate natural dye based solar cells.
Mechanism of trail following by the arboreal termite Nasutitermes corniger (Isoptera: Termitidae).
Gazai, Vinícius; Bailez, Omar; Viana-Bailez, Ana Maria
2014-01-01
In this study, we investigated the mechanisms used by the arboreal termite Nasutitermes corniger (Motschulsky, 1855) to follow trails from the nest to sources of food. A plate containing one of seven trail types was used to connect an artificial nest of N. corniger with an artificial foraging arena. The trail types were: termite trail; paraffined termite trail; trail made of paraffin; rectal fluid extract trail; sternal gland extract trail; feces extract trail; and solvent trail (control). In each test, the time was recorded from the start of the test until the occurrence of trail following, at which point the number of termites that followed the trail for at least 5 cm in the first 3 min of observation was recorded. The delay for termites initiating trail following along the termite trail was lower (0.55 ± 0.16 min) than on the trails of sternal gland extract (1.05 ± 0.08 min) and termite feces extract (1.57 ± 0.21 min) (F(2,48) = 22.59, P < 0.001). The number of termites that followed the termite trail was greater (207.3 ± 17.3) than the number that followed the trail of termite feces extract (102.5 ± 9.4) or sternal gland extract (36.9 ± 1.6) (F(2,48) = 174.34, P < 0.001). Therefore, feces on the trail may play an important role alongside sternal gland pheromones in increasing the persistence of the trail.
An Efficient Method for Automatic Road Extraction Based on Multiple Features from LiDAR Data
NASA Astrophysics Data System (ADS)
Li, Y.; Hu, X.; Guan, H.; Liu, P.
2016-06-01
Road extraction in urban areas is a difficult task due to complicated patterns and many contextual objects. LiDAR data directly provide three-dimensional (3D) points with fewer occlusions and smaller shadows. The elevation information and surface roughness are distinguishing features for separating roads. However, LiDAR data also have some disadvantages for object extraction, such as the irregular distribution of point clouds and the lack of clear road edges. To address these problems, this paper proposes an automatic road centerline extraction method with three major steps: (1) road center point detection based on multiple-feature spatial clustering for separating road points from ground points, (2) local principal component analysis with least squares fitting for extracting the primitives of road centerlines, and (3) hierarchical grouping for connecting the primitives into a complete road network. Compared with MTH (consisting of the Mean shift algorithm, Tensor voting, and the Hough transform), proposed in our previous article, this method greatly reduces the computational cost. To evaluate the proposed method, the Vaihingen data set, a benchmark dataset provided by ISPRS for the "Urban Classification and 3D Building Reconstruction" project, was selected. The experimental results show that our method achieves the same performance in less time in road extraction using LiDAR data.
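Step (2) fits straight-line primitives to local neighbourhoods of road-centre points with principal component analysis. A minimal sketch of that fit is given below, using synthetic points; the least squares refinement and the hierarchical grouping of step (3) are not reproduced.

```python
# Minimal sketch of local PCA line fitting: the line direction is the
# eigenvector associated with the largest eigenvalue of the covariance matrix.
import numpy as np

rng = np.random.default_rng(42)
t = rng.uniform(0, 30, 120)
pts = np.c_[t, 0.3 * t + 5] + rng.normal(0, 0.3, (t.size, 2))  # noisy road-centre points

centroid = pts.mean(axis=0)
cov = np.cov((pts - centroid).T)
eigvals, eigvecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
direction = eigvecs[:, -1]                       # principal direction of the primitive

# Line primitive: centroid + direction; linearity measure from the eigenvalue ratio
linearity = 1.0 - eigvals[0] / eigvals[1]
print("centroid:", centroid, "direction:", direction, f"linearity={linearity:.3f}")
```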
Zhu, Hai-Zhen; Liu, Wei; Mao, Jian-Wei; Yang, Ming-Min
2008-04-28
4-Amino-4'-nitrobiphenyl, which is formed by the catalytic effect of trichlorfon on the oxidation of benzidine by sodium perborate, is extracted with a cloud point extraction method and then detected using high performance liquid chromatography with ultraviolet detection (HPLC-UV). Under the optimum experimental conditions, there was a linear relationship between trichlorfon in the concentration range of 0.01-0.2 mg L(-1) and the peak areas of 4-amino-4'-nitrobiphenyl (r=0.996). The limit of detection was 2.0 microg L(-1), and recoveries from spiked water and cabbage samples ranged between 95.4-103% and 85.2-91.2%, respectively. It was shown that the cloud point extraction (CPE) method was simpler, cheaper, and more environmentally friendly than extraction with organic solvents and gave a more effective extraction yield.
Comparison of Protein Extracts from Various Unicellular Green Sources.
Teuling, Emma; Wierenga, Peter A; Schrama, Johan W; Gruppen, Harry
2017-09-13
Photosynthetic unicellular organisms are considered as promising alternative protein sources. The aim of this study is to understand the extent to which these green sources differ with respect to their gross composition and how these differences affect the final protein isolate. Using mild isolation techniques, proteins were extracted and isolated from four different unicellular sources (Arthrospira (spirulina) maxima, Nannochloropsis gaditana, Tetraselmis impellucida, and Scenedesmus dimorphus). Despite differences in protein contents of the sources (27-62% w/w) and in protein extractability (17-74% w/w), final protein isolates were obtained that had similar protein contents (62-77% w/w) and protein yields (3-9% w/w). Protein solubility as a function of pH was different between the sources and in ionic strength dependency, especially at pH < 4.0. Overall, the characterization and extraction protocol used allows a relatively fast and well-described isolation of purified proteins from novel protein sources.
Riley, Sean P; Covington, Kyle; Landry, Michel D; McCallum, Christine; Engelhard, Chalee; Cook, Chad E
2016-01-01
This study aimed to compare selectivity characteristics among institutions to determine whether they differ by institutional funding source (public vs. private) or research activity level (research vs. non-research). This study included information provided by the Commission on Accreditation in Physical Therapy Education (CAPTE) and the Federation of State Boards of Physical Therapy. Data were extracted for all students who graduated in 2011 from accredited physical therapy programs in the United States. The public and private designations of the institutions were taken directly from the classifications in the 'CAPTE annual accreditation report,' and high and low research activity was determined based on Carnegie classifications. The institutions were classified into four groups: public/research intensive, public/non-research intensive, private/research intensive, and private/non-research intensive. Descriptive and comparison analyses with post hoc testing were performed to determine whether there were statistically significant differences among the four groups. Although there were statistically significant baseline grade point average differences among the four groups, there were no significant differences in licensure pass rates or in any of the selectivity variables of interest. Selectivity characteristics did not differ by institutional funding source (public vs. private) or research activity level (research vs. non-research). This suggests that concerns about reduced selectivity among physiotherapy programs, specifically the types that are experiencing the largest proliferation, appear less warranted.
The algorithm of fast image stitching based on multi-feature extraction
NASA Astrophysics Data System (ADS)
Yang, Chunde; Wu, Ge; Shi, Jing
2018-05-01
This paper proposes an improved image registration method combining Hu-based invariant moment contour information with feature point detection, aiming to solve the problems of traditional image stitching algorithms, such as a time-consuming feature point extraction process, redundant invalid information, and inefficiency. First, the neighborhood of each pixel is used to extract contour information, and the Hu invariant moments serve as a similarity measure so that SIFT feature points are extracted only in similar regions. Then the Euclidean distance is replaced with a Hellinger kernel function to improve the initial matching efficiency and reduce the number of mismatched points, and the affine transformation matrix between the images is estimated. Finally, a local color mapping method is adopted to correct uneven exposure, and an improved multiresolution fusion algorithm is used to blend the mosaic images and achieve seamless stitching. Experimental results confirm the high accuracy and efficiency of the proposed method.
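As a rough illustration of the Hellinger-kernel matching step described above, the sketch below uses the widely known "RootSIFT" transformation (L1-normalise each SIFT descriptor, then take its square root) so that ordinary L2 matching approximates a Hellinger kernel comparison, followed by RANSAC estimation of an affine transform. This is a generic OpenCV sketch, not the authors' code; the image file names and the 0.7 ratio-test threshold are assumptions.

```python
import cv2
import numpy as np

def root_sift(descriptors, eps=1e-7):
    """Map SIFT descriptors so that L2 distance approximates a Hellinger kernel comparison."""
    descriptors = descriptors / (descriptors.sum(axis=1, keepdims=True) + eps)  # L1 normalise
    return np.sqrt(descriptors).astype(np.float32)

img1 = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)    # placeholder file names
img2 = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(root_sift(des1), root_sift(des2), k=2)

# Lowe ratio test to discard ambiguous matches before estimating the affine transform
good = [m for m, n in matches if m.distance < 0.7 * n.distance]
src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
M, inliers = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)
print("Estimated affine transform:\n", M)
```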
EarthServer2 : The Marine Data Service - Web based and Programmatic Access to Ocean Colour Open Data
NASA Astrophysics Data System (ADS)
Clements, Oliver; Walker, Peter
2017-04-01
The ESA Ocean Colour - Climate Change Initiative (ESA OC-CCI) has produced a long-term high quality global dataset with associated per-pixel uncertainty data. This dataset has now grown to several hundred terabytes (uncompressed) and is freely available to download. However, the sheer size of the dataset can act as a barrier to many users; large network bandwidth, local storage and processing requirements can prevent researchers without the backing of a large organisation from taking advantage of this raw data. The EC H2020 project, EarthServer2, aims to create a federated data service providing access to more than 1 petabyte of earth science data. Within this federation the Marine Data Service already provides an innovative on-line tool-kit for filtering, analysing and visualising OC-CCI data. Data are made available, filtered and processed at source through a standards-based interface, the Open Geospatial Consortium Web Coverage Service and Web Coverage Processing Service. This work was initiated in the EC FP7 EarthServer project, where it was found that the unfamiliarity and complexity of these interfaces themselves created a barrier to wider uptake. The continuation project, EarthServer2, addresses these issues by providing higher level tools for working with these data. We will present some examples of these tools. Many researchers wish to extract time series data from discrete points of interest. We will present a web based interface, based on NASA/ESA WebWorldWind, for selecting points of interest and plotting time series from a chosen dataset. In addition, a CSV file of locations and times, such as a ship's track, can be uploaded and these points extracted and returned in a CSV file, allowing researchers to work with the extracted data locally, for example in a spreadsheet. We will also present a set of Python and JavaScript APIs that have been created to complement and extend the web based GUI. These APIs allow the selection of single points and areas for extraction. The extracted data is returned as structured data (for instance a Python array) which can then be passed directly to local processing code. We will highlight how the libraries can be used by the community and integrated into existing systems, for instance by the use of Jupyter notebooks to share Python code examples which can then be used by other researchers as a basis for their own work.
3D local feature BKD to extract road information from mobile laser scanning point clouds
NASA Astrophysics Data System (ADS)
Yang, Bisheng; Liu, Yuan; Dong, Zhen; Liang, Fuxun; Li, Bijun; Peng, Xiangyang
2017-08-01
Extracting road information from point clouds obtained through mobile laser scanning (MLS) is essential for autonomous vehicle navigation, and has hence garnered a growing amount of research interest in recent years. However, the performance of such systems is seriously affected by varying point density and noise. This paper proposes a novel three-dimensional (3D) local feature called the binary kernel descriptor (BKD) to extract road information from MLS point clouds. The BKD consists of Gaussian kernel density estimation and binarization components to encode the shape and intensity information of the 3D point clouds, which are fed to a random forest classifier to extract curbs and markings on the road. These are then used to derive road information, such as the number of lanes, the lane width, and intersections. In experiments, the precision and recall of the proposed feature for the detection of curbs and road markings on an urban dataset and a highway dataset were as high as 90%, showing that the BKD is accurate and robust against varying point density and noise.
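The sketch below is a much-simplified stand-in for the BKD idea, not the authors' implementation: a binarised neighbourhood histogram of height and intensity is built per point and fed to a random forest. The radius, bin count, and input file names are assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree
from sklearn.ensemble import RandomForestClassifier

def binary_kernel_descriptor(points, intensities, tree, index, radius=0.5, bins=8):
    """Binarised height/intensity histograms of one point's neighbourhood."""
    idx = tree.query_ball_point(points[index], r=radius)
    z_hist, _ = np.histogram(points[idx, 2] - points[index, 2],
                             bins=bins, range=(-radius, radius), density=True)
    i_hist, _ = np.histogram(intensities[idx], bins=bins, range=(0.0, 1.0), density=True)
    desc = np.concatenate([z_hist, i_hist])
    return (desc > desc.mean()).astype(np.uint8)   # binarisation step

# placeholder inputs: (N, 3) coordinates, (N,) intensities in [0, 1], (N,) class labels
points = np.load("mls_points.npy")
intensities = np.load("mls_intensity.npy")
labels = np.load("mls_labels.npy")                 # e.g. curb / marking / other

tree = cKDTree(points)
X = np.array([binary_kernel_descriptor(points, intensities, tree, i) for i in range(len(points))])
clf = RandomForestClassifier(n_estimators=200).fit(X, labels)
```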
NASA Astrophysics Data System (ADS)
Poux, F.; Neuville, R.; Billen, R.
2017-08-01
Reasoning from the information extracted by point cloud data mining allows contextual adaptation and fast decision making. However, to achieve this perceptive level, a point cloud must be semantically rich, retaining relevant information for the end user. This paper presents an automatic knowledge-based method for pre-processing multi-sensory data and classifying a hybrid point cloud from both terrestrial laser scanning and dense image matching. Using 18 features, including the sensor's bias data, each tessera in the high-density point cloud from the 3D-captured complex mosaics of Germigny-des-prés (France) is segmented via a colour multi-scale abstraction-based feature extracting connectivity. A 2D surface and outline polygon of each tessera is generated by a RANSAC plane extraction and convex hull fitting. Knowledge is then used to classify every tessera based on its size, surface, shape, material properties and its neighbours' classes. The detection and semantic enrichment method shows promising results of 94% correct semantization, a first step toward the creation of an archaeological smart point cloud.
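For the per-tessera geometry step (RANSAC plane extraction followed by convex hull fitting), a generic sketch is given below. It is not the authors' pipeline; the inlier tolerance, random seed and input array are placeholders.

```python
import numpy as np
from scipy.spatial import ConvexHull

def ransac_plane(points, n_iter=200, tol=0.002):
    """Return (normal, d, inlier_mask) of the best plane n.p + d = 0 found by basic RANSAC."""
    rng = np.random.default_rng(0)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_n, best_d = np.array([0.0, 0.0, 1.0]), 0.0
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        if np.linalg.norm(n) < 1e-12:
            continue
        n /= np.linalg.norm(n)
        d = -n @ sample[0]
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_n, best_d = inliers, n, d
    return best_n, best_d, best_inliers

pts = np.load("tessera_points.npy")                 # (N, 3) points of one segmented tessera
n, d, inl = ransac_plane(pts)

# project the inliers into the plane and take their 2D convex hull as the outline polygon
ref = np.array([0.0, 0.0, 1.0]) if abs(n[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
u = np.cross(n, ref); u /= np.linalg.norm(u)
v = np.cross(n, u)
uv = np.c_[pts[inl] @ u, pts[inl] @ v]
outline = uv[ConvexHull(uv).vertices]
```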
Fiandaca, Massimo S.; Kapogiannis, Dimitrios; Mapstone, Mark; Boxer, Adam; Eitan, Erez; Schwartz, Janice B.; Abner, Erin L.; Petersen, Ronald C.; Federoff, Howard J.; Miller, Bruce L.; Goetzl, Edward J.
2014-01-01
Background Proteins pathogenic in Alzheimer’s disease (AD) were extracted from neurally-derived blood exosomes and quantified to develop biomarkers for the staging of sporadic AD. Methods Blood exosomes obtained at one time-point from patients with AD (n=57) or frontotemporal dementia (FTD) (n=16), and at two time-points from others (n=24), when cognitively normal and one to ten years later when diagnosed with AD, were enriched for neural sources by immunoabsorption. AD-pathogenic exosomal proteins were extracted and quantified by ELISAs. Results Mean exosomal levels of total Tau, P-T181-tau, P-S396-tau and Aβ1-42 for AD and levels of P-T181-tau and Aβ1-42 for FTD were significantly higher than for case-controls. Stepwise discriminant modeling incorporated P-T181-tau, P-S396-tau and Aβ1-42 in AD, but only P-T181-tau in FTD. Classification of 96.4% of AD patients and 87.5% of FTD patients was correct. In 24 AD patients, exosomal levels of P-S396-tau, P-T181-tau and Aβ1-42 were significantly higher than for controls both one to ten years before and when diagnosed with AD. Conclusions Levels of P-S396-tau, P-T181-tau and Aβ1-42 in extracts of neurally-derived blood exosomes predict the development of AD up to 10 years prior to clinical onset. PMID:25130657
NASA Astrophysics Data System (ADS)
Caceres, Jhon
Three-dimensional (3D) models of urban infrastructure comprise critical data for planners working on problems in wireless communications, environmental monitoring, civil engineering, and urban planning, among other tasks. Photogrammetric methods have been the most common approach to date to extract building models. However, Airborne Laser Swath Mapping (ALSM) observations offer a competitive alternative because they overcome some of the ambiguities that arise when trying to extract 3D information from 2D images. Regardless of the source data, the building extraction process requires segmentation and classification of the data and building identification. In this work, approaches for classifying ALSM data, separating building and tree points, and delineating ALSM footprints from the classified data are described. Digital aerial photographs are used in some cases to verify results, but the objective of this work is to develop methods that can work on ALSM data alone. A robust approach for separating tree and building points in ALSM data is presented. The method is based on supervised learning of the classes (tree vs. building) in a high-dimensional feature space that yields good class separability. Features used for classification are based on the generation of local mappings, from three-dimensional space to two-dimensional space, known as "spin images" for each ALSM point to be classified. The method discriminates ALSM returns in compact spaces and even where the classes are very close together or overlapping spatially. A modified algorithm of the Hough Transform is used to orient the spin images, and the spin image parameters are specified such that the mutual information between the spin image pixel values and class labels is maximized. This new approach to ALSM classification allows us to fully exploit the 3D point information in the ALSM data while still achieving good class separability, which has been a difficult trade-off in the past. Supported by the spin image analysis for obtaining an initial classification, an automatic approach for delineating accurate building footprints is presented. The physical fact that laser pulses that happen to strike building edges can produce very different 1st and last return elevations has been long recognized. However, in older generation ALSM systems (<50 kHz pulse rates) such points were too few and far between to delineate building footprints precisely. Furthermore, without the robust separation of nearby trees and vegetation from the buildings, simply extracting ALSM shots where the elevation of the first return was much higher than the elevation of the last return was not a reliable means of identifying building footprints. However, with the advent of ALSM systems with pulse rates in excess of 100 kHz, and by using spin-image-based segmentation, it is now possible to extract building edges from the point cloud. A refined classification resulting from incorporating "on-edge" information is developed for obtaining quadrangular footprints. The footprint fitting process involves line generalization, least-squares-based clustering and dominant-point finding for segmenting individual building edges. In addition, an algorithm for fitting complex footprints using the segmented edges and data inside footprints is also proposed.
Point Cloud Oriented Shoulder Line Extraction in Loess Hilly Area
NASA Astrophysics Data System (ADS)
Min, Li; Xin, Yang; Liyang, Xiong
2016-06-01
The shoulder line is a significant terrain line in the hilly area of the Loess Plateau in China, dividing the surface into positive and negative terrain (P-N terrains). Because the point cloud vegetation removal methods for P-N terrains are different, there is an imperative need for shoulder line extraction. In this paper, we propose an automatic shoulder line extraction method based on point clouds. The workflow is as follows: (i) ground points are selected using a grid filter in order to remove most of the noisy points. (ii) Based on a DEM interpolated from those ground points, slope is mapped and classified into two classes (P-N terrains) using the Natural Breaks classification method. (iii) The common boundary between the two slope classes is extracted as the shoulder line candidate. (iv) The filter grid size is adjusted and steps i-iii are repeated until the shoulder line candidate matches its real location. (v) The shoulder line of the whole area is generated. The test area is located in Madigou, Jingbian County of Shaanxi Province, China. A total of 600 million points were acquired in the 0.23 km² test area using a Riegl VZ400 3D laser scanner in August 2014. Due to limited computing performance, the test area was divided into 60 blocks, and 13 of them around the shoulder line were selected for filter grid size optimization. The experimental results show that the optimal filter grid size varies across sample areas, and that a power function relation exists between filter grid size and point density. The optimal grid size was determined by the above relation and the shoulder lines of the 60 blocks were then extracted. Compared with the manual interpretation results, the accuracy of the whole result reaches 85%. This method can be applied to shoulder line extraction in hilly areas, which is crucial for point cloud denoising and high accuracy DEM generation.
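A minimal sketch of steps (ii)-(iii) of the workflow above is given below: a slope raster derived from the filtered ground points is split into two classes (1-D k-means is used here as a stand-in for the Natural Breaks split) and the common boundary between the classes is taken as the shoulder line candidate. The file names and the two-class split are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from skimage.segmentation import find_boundaries

slope = np.load("slope_from_dem.npy")          # slope raster interpolated from filtered ground points

# two-class split of slope values; 1-D k-means approximates a two-class natural-breaks split
km = KMeans(n_clusters=2, n_init=10).fit(slope.reshape(-1, 1))
classes = km.labels_.reshape(slope.shape)

# the boundary between the positive- and negative-terrain classes is the shoulder line candidate
candidate = find_boundaries(classes, mode="inner")
np.save("shoulder_line_candidate.npy", candidate)
```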
In-flight calibration of the Hitomi Soft X-ray Spectrometer. (2) Point spread function
NASA Astrophysics Data System (ADS)
Maeda, Yoshitomo; Sato, Toshiki; Hayashi, Takayuki; Iizuka, Ryo; Angelini, Lorella; Asai, Ryota; Furuzawa, Akihiro; Kelley, Richard; Koyama, Shu; Kurashima, Sho; Ishida, Manabu; Mori, Hideyuki; Nakaniwa, Nozomi; Okajima, Takashi; Serlemitsos, Peter J.; Tsujimoto, Masahiro; Yaqoob, Tahir
2018-03-01
We present results of inflight calibration of the point spread function of the Soft X-ray Telescope that focuses X-rays onto the pixel array of the Soft X-ray Spectrometer system. We make a full array image of a point-like source by extracting a pulsed component of the Crab nebula emission. Within the limited statistics afforded by an exposure time of only 6.9 ks and limited knowledge of the systematic uncertainties, we find that the raytracing model of 1.2 arcmin half-power diameter is consistent with an image of the observed event distributions across pixels. The ratio between the Crab pulsar image and the raytracing shows scatter from pixel to pixel that is 40% or less in all except one pixel. The pixel-to-pixel ratio has a spread of 20%, on average, for the 15 edge pixels, with an averaged statistical error of 17% (1 σ). In the central 16 pixels, the corresponding ratio is 15% with an error of 6%.
Preliminary GAOFEN-3 Insar dem Accuracy Analysis
NASA Astrophysics Data System (ADS)
Chen, Q.; Li, T.; Tang, X.; Gao, X.; Zhang, X.
2018-04-01
GF-3, the first Chinese C-band, full-polarization SAR satellite with a spatial resolution of 1 m, was successfully launched in August 2016. In this paper we analyze the error sources of the GF-3 satellite and provide an interferometric calibration model based on the range function, the Doppler shift equation and the interferometric phase function, with the interferometric parameters calibrated using the three-dimensional coordinates of ground control points. We then conduct experiments on two pairs of images acquired in fine stripmap I mode covering Songshan in Henan Province and Tangshan in Hebei Province, respectively. The DEM data are assessed using the SRTM DEM, ICESat-GLAS points, and a ground control point database obtained from the ZY-3 satellite to validate the accuracy of the DEM elevation. The experimental results show that the accuracy of the DEM extracted from GF-3 SAR data can meet the requirements of topographic mapping in mountain and alpine regions at the 1:50000 scale in China. In addition, the results demonstrate that the GF-3 satellite has potential for interferometry.
A method of PSF generation for 3D brightfield deconvolution.
Tadrous, P J
2010-02-01
This paper addresses the problem of 3D deconvolution of through focus widefield microscope datasets (Z-stacks). One of the most difficult stages in brightfield deconvolution is finding the point spread function. A theoretically calculated point spread function (called a 'synthetic PSF' in this paper) requires foreknowledge of many system parameters and still gives only approximate results. A point spread function measured from a sub-resolution bead suffers from low signal-to-noise ratio, compounded in the brightfield setting (by contrast to fluorescence) by absorptive, refractive and dispersal effects. This paper describes a method of point spread function estimation based on measurements of a Z-stack through a thin sample. This Z-stack is deconvolved by an idealized point spread function derived from the same Z-stack to yield a point spread function of high signal-to-noise ratio that is also inherently tailored to the imaging system. The theory is validated by a practical experiment comparing the non-blind 3D deconvolution of the yeast Saccharomyces cerevisiae with the point spread function generated using the method presented in this paper (called the 'extracted PSF') to a synthetic point spread function. Restoration of both high- and low-contrast brightfield structures is achieved with fewer artefacts using the extracted point spread function obtained with this method. Furthermore the deconvolution progresses further (more iterations are allowed before the error function reaches its nadir) with the extracted point spread function compared to the synthetic point spread function indicating that the extracted point spread function is a better fit to the brightfield deconvolution model than the synthetic point spread function.
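A hedged sketch of the core deconvolution step is given below: a through-focus Z-stack of a thin sample is deconvolved by an idealised PSF to yield an "extracted" PSF. The narrow 3-D Gaussian used as the idealised PSF and the file names are stand-ins, not the paper's recipe, and Richardson-Lucy is used here only as one convenient iterative deconvolution.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.restoration import richardson_lucy

zstack = np.load("thin_sample_zstack.npy").astype(float)   # (z, y, x) through-focus stack
zstack /= zstack.max()

# idealised PSF: a narrow 3D Gaussian as a crude stand-in for a diffraction-limited spot
ideal = np.zeros((9, 9, 9))
ideal[4, 4, 4] = 1.0
ideal = gaussian_filter(ideal, sigma=(1.5, 1.0, 1.0))
ideal /= ideal.sum()

# deconvolve the stack by the idealised PSF; the result is treated as the extracted PSF
# to be used later for non-blind deconvolution of real specimens
extracted_psf = richardson_lucy(zstack, ideal, 30)
extracted_psf /= extracted_psf.sum()
np.save("extracted_psf.npy", extracted_psf)
```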
Automatic extraction of protein point mutations using a graph bigram association.
Lee, Lawrence C; Horn, Florence; Cohen, Fred E
2007-02-02
Protein point mutations are an essential component of the evolutionary and experimental analysis of protein structure and function. While many manually curated databases attempt to index point mutations, most experimentally generated point mutations and the biological impacts of the changes are described in the peer-reviewed published literature. We describe an application, Mutation GraB (Graph Bigram), that identifies, extracts, and verifies point mutations from biomedical literature. The principal problem of point mutation extraction is to link the point mutation with its associated protein and organism of origin. Our algorithm uses a graph-based bigram traversal to identify these relevant associations and exploits the Swiss-Prot protein database to verify this information. The graph bigram method is different from other models for point mutation extraction in that it incorporates frequency and positional data of all terms in an article to drive the point mutation-protein association. Our method was tested on 589 articles describing point mutations from the G protein-coupled receptor (GPCR), tyrosine kinase, and ion channel protein families. We evaluated our graph bigram metric against a word-proximity metric for term association on datasets of full-text literature in these three different protein families. Our testing shows that the graph bigram metric achieves a higher F-measure for the GPCRs (0.79 versus 0.76), protein tyrosine kinases (0.72 versus 0.69), and ion channel transporters (0.76 versus 0.74). Importantly, in situations where more than one protein can be assigned to a point mutation and disambiguation is required, the graph bigram metric achieves a precision of 0.84 compared with the word distance metric precision of 0.73. We believe the graph bigram search metric to be a significant improvement over previous search metrics for point mutation extraction and to be applicable to text-mining applications requiring the association of words.
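A toy illustration of the two sub-tasks described above, spotting point-mutation strings and associating each with a nearby protein mention by a distance-weighted score, is sketched below. It is not Mutation GraB; the regular expression, the protein lexicon and the scoring rule are illustrative assumptions.

```python
import re
from collections import defaultdict

MUT = re.compile(r"([ACDEFGHIKLMNPQRSTVWY])(\d{1,4})([ACDEFGHIKLMNPQRSTVWY])")
PROTEINS = {"rhodopsin", "CXCR4", "EGFR"}          # hypothetical protein lexicon

def associate(text):
    """Return {mutation: best protein} using a distance-weighted co-occurrence score."""
    words = text.split()
    scores = defaultdict(lambda: defaultdict(float))
    for i, w in enumerate(words):
        m = MUT.fullmatch(w.strip(".,;()"))
        if not m:
            continue
        for j, u in enumerate(words):
            name = u.strip(".,;()")
            if name in PROTEINS:
                scores[m.group(0)][name] += 1.0 / (1 + abs(i - j))   # closer mentions score higher
    return {mut: max(s, key=s.get) for mut, s in scores.items()}

print(associate("The D83N mutant of rhodopsin showed reduced activation, unlike wild-type CXCR4."))
```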
A Taxonomy of 3D Occluded Objects Recognition Techniques
NASA Astrophysics Data System (ADS)
Soleimanizadeh, Shiva; Mohamad, Dzulkifli; Saba, Tanzila; Al-ghamdi, Jarallah Saleh
2016-03-01
The overall performance of object recognition techniques under different conditions (e.g., occlusion, viewpoint, and illumination) has improved significantly in recent years. New applications and hardware have shifted towards digital photography and digital media, and the growth in Internet usage requires object recognition for certain applications, particularly for occluded objects. Occlusion, however, remains an unhandled issue that entangles the relations between the feature points extracted from an image, and research is ongoing to develop efficient techniques and easy-to-use algorithms that would help users source images and overcome the problems and issues caused by occlusion. The aim of this research is to review algorithms for recognizing occluded objects and to weigh their pros and cons for solving the occlusion problem, in which features extracted from an occluded object are used to distinguish it from other co-existing objects, and to identify new techniques that can differentiate the occluded fragments and sections inside an image.
Khmyrova, Irina; Watanabe, Norikazu; Kholopova, Julia; Kovalchuk, Anatoly; Shapoval, Sergei
2014-07-20
We develop an analytical and numerical model for simulating light extraction through the planar output interface of light-emitting diodes (LEDs) with nonuniform current injection. Spatial nonuniformity of the injected current is a peculiar feature of LEDs in which the top metal electrode is patterned as a mesh in order to enhance the output power of the light extracted through the top surface. The basic features of the model are a bi-plane computation domain, the relation between the areas of numerical grid (NG) cells in these two planes, the representation of the light-generating layer by an ensemble of point light sources, the numerical "collection" of light photons from the area limited by the acceptance circle, and the adjustment of NG-cell areas in the computation procedure by an angle-tuned aperture function. The developed model and procedure are used to simulate spatial distributions of the output optical power as well as the total output power at different mesh pitches. The proposed model and simulation strategy can be very efficient for evaluating the output optical performance of LEDs with periodic or symmetric electrode configurations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Basset, J.; Stockklauser, A.; Jarausch, D.-D.
2014-08-11
We evaluate the charge noise acting on a GaAs/GaAlAs based semiconductor double quantum dot dipole-coupled to the voltage oscillations of a superconducting transmission line resonator. The in-phase (I) and the quadrature (Q) components of the microwave tone transmitted through the resonator are sensitive to charging events in the surrounding environment of the double dot with an optimum sensitivity of 8.5×10⁻⁵ e/√(Hz). A low frequency 1/f type noise spectrum combined with a white noise level of 6.6×10⁻⁶ e²/Hz above 1 Hz is extracted, consistent with previous results obtained with quantum point contact charge detectors on similar heterostructures. The slope of the 1/f noise allows us to extract a lower bound for the double-dot charge qubit dephasing rate, which we compare to the one extracted from a Jaynes-Cummings Hamiltonian approach. The two rates are found to be similar, emphasizing that charge noise is the main source of dephasing in our system.
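As an illustration of how such a spectrum can be characterised, the sketch below fits a 1/f-plus-white-noise model S(f) = A/f^alpha + B to a measured charge-noise power spectral density with scipy. The data file, initial guesses and units are placeholders, not values from the experiment.

```python
import numpy as np
from scipy.optimize import curve_fit

def noise_model(f, A, alpha, B):
    """S(f) = A / f**alpha + B: a 1/f-type term plus a white-noise floor."""
    return A / f**alpha + B

# placeholder file: frequency (Hz, > 0) and charge-noise PSD (e^2/Hz) columns
f, S = np.loadtxt("charge_noise_spectrum.txt", unpack=True)
popt, pcov = curve_fit(noise_model, f, S, p0=(1e-6, 1.0, 6.6e-6), maxfev=10000)
A, alpha, B = popt
print(f"1/f amplitude A = {A:.3g} e^2, slope alpha = {alpha:.2f}, white floor B = {B:.3g} e^2/Hz")
```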
Grid point extraction and coding for structured light system
NASA Astrophysics Data System (ADS)
Song, Zhan; Chung, Ronald
2011-09-01
A structured light system simplifies three-dimensional reconstruction by illuminating a specially designed pattern to the target object, thereby generating a distinct texture on it for imaging and further processing. Success of the system hinges upon what features are to be coded in the projected pattern, extracted in the captured image, and matched between the projector's display panel and the camera's image plane. The codes have to be such that they are largely preserved in the image data upon illumination from the projector, reflection from the target object, and projective distortion in the imaging process. The features also need to be reliably extracted in the image domain. In this article, a two-dimensional pseudorandom pattern consisting of rhombic color elements is proposed, and the grid points between the pattern elements are chosen as the feature points. We describe how a type classification of the grid points plus the pseudorandomness of the projected pattern can equip each grid point with a unique label that is preserved in the captured image. We also present a grid point detector that extracts the grid points without the need of segmenting the pattern elements, and that localizes the grid points in subpixel accuracy. Extensive experiments are presented to illustrate that, with the proposed pattern feature definition and feature detector, more feature points can be reconstructed with higher accuracy in comparison with the existing pseudorandomly encoded structured light systems.
Application of lasers in endodontics
NASA Astrophysics Data System (ADS)
Ertl, Thomas P.; Benthin, Hartmut; Majaron, Boris; Mueller, Gerhard J.
1997-12-01
Root canal treatment is still a problem in dentistry. Very often the conventional treatment fails and several treatment sessions are necessary to save the tooth from root resection or extraction. The application of lasers may help in this situation. Bacterial reduction has been demonstrated both in vitro and clinically, and is based either on laser-induced thermal effects or on the use of an ultraviolet light source. Root canal cleansing is possible with Er:YAG/YSGG lasers, using the hydrodynamic motion of a fluid filling the canals. However, root canal shaping using lasers is still a problem. Via falsa formation and fiber breakage remain points of research.
The LBM program at the EPFL/LOTUS Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
File, J.; Jassby, D.L.; Tsang, F.Y.
1986-11-01
An experimental program of neutron transport studies of the Lithium Blanket Module (LBM) is being carried out with the LOTUS point-neutron source facility at the Ecole Polytechnique Federale de Lausanne (EPFL), Switzerland. Preliminary experiments use passive neutron dosimetry within the fuel rods in the LBM central zone, as well as both thermal extraction and dissolution methods to assay tritium bred in Li2O diagnostic wafers and LBM pellets. These measurements are being compared and reconciled with each other and with the predictions of two-dimensional discrete-ordinates and continuous-energy Monte-Carlo analyses of the LOTUS/LBM system.
Studies on the Extraction Region of the Type VI RF Driven H- Ion Source
NASA Astrophysics Data System (ADS)
McNeely, P.; Bandyopadhyay, M.; Franzen, P.; Heinemann, B.; Hu, C.; Kraus, W.; Riedl, R.; Speth, E.; Wilhelm, R.
2002-11-01
IPP Garching has spent several years developing a RF driven H- ion source intended to be an alternative to the current ITER (International Thermonuclear Experimental Reactor) reference design ion source. A RF driven source offers a number of advantages to ITER in terms of reduced costs and maintenance requirements. Although the RF driven ion source has shown itself to be competitive with a standard arc filament ion source for positive ions, many questions still remain on the physics behind the production of the H- ion beam extracted from the source. With the improvements that have been implemented to the BATMAN (Bavarian Test Machine for Negative Ions) facility over the last two years, it is now possible to study both the extracted ion beam and the plasma in the vicinity of the extraction grid in greater detail. This paper shows the effect of changing the extraction and acceleration voltage on both the current and shape of the beam as measured on the calorimeter some 1.5 m downstream from the source. The extraction voltage required to operate in the plasma limit is 3 kV. The perveance optimum for the extraction system was determined to be 2.2 × 10⁻⁶ A/V^(3/2) and occurs at 2.7 kV extraction voltage. The horizontal and vertical beam half widths vary as a function of the extracted ion current, and the horizontal half width is generally smaller than the vertical. The effect of reducing the co-extracted electron current via plasma grid biasing on the extractable H- current and on the beam profile from the source is shown. It is possible in the case of a silver contaminated plasma to reduce the co-extracted electron current to 20% of the initial value by applying a bias of 12 V. In the case where argon is present in the plasma, biasing is observed to have minimal effect on the beam half width, but in a pure hydrogen plasma the beam half width increases as the bias voltage increases. New Langmuir probe studies carried out parallel to the plasma grid (in the vicinity of the peak of the external magnetic filter field), and changes to source parameters as a function of power and argon addition, are reported. The behaviour of the electron density is different when the plasma is argon seeded, showing a strong increase with RF power. The plasma potential is decreased by 2 V when argon is added to the plasma. The effect of the presence of unwanted silver sputtered from the Faraday screen by Ar+ ions on both the source performance and the plasma parameters is also presented. The silver dramatically downgraded source performance in terms of current density and produced an early saturation of current with applied RF power. Recently, a collaboration was begun with the Technical University of Augsburg to perform spectroscopic measurements on the Type VI ion source. The final results of this analysis are not yet available, but some interesting initial observations on the gas temperature, dissociation degree and impurity ions will be presented.
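Since perveance is defined as P = I/U^(3/2), the extracted current implied by the quoted perveance optimum follows from simple arithmetic. The short check below uses only the numbers given in the abstract; the implied current itself is an inference, not a reported value.

```python
# perveance is defined as P = I / U**1.5 for a space-charge-limited extraction system
P_opt = 2.2e-6          # A/V^(3/2), perveance optimum quoted in the abstract
U_ext = 2.7e3           # V, extraction voltage at which the optimum occurs

I_implied = P_opt * U_ext**1.5
print(f"Extracted current implied at the perveance optimum: {I_implied:.2f} A")  # about 0.31 A
```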
ChemEngine: harvesting 3D chemical structures of supplementary data from PDF files.
Karthikeyan, Muthukumarasamy; Vyas, Renu
2016-01-01
Digital access to chemical journals has resulted in a vast array of molecular information that is now available in supplementary material files in PDF format. However, extracting this molecular information, generally from a PDF document format, is a daunting task. Here we present an approach to harvest 3D molecular data from the supporting information of scientific research articles that are normally available from publishers' resources. In order to demonstrate the feasibility of extracting truly computable molecules from PDF file formats in a fast and efficient manner, we have developed a Java based application, namely ChemEngine. This program recognizes textual patterns from the supplementary data and generates standard molecular structure data (bond matrix, atomic coordinates) that can be subjected to a multitude of computational processes automatically. The methodology has been demonstrated via several case studies on different formats of coordinate data stored in supplementary information files, wherein ChemEngine selectively harvested the atomic coordinates and interpreted them as molecules with high accuracy. The reusability of the extracted molecular coordinate data was demonstrated by computing Single Point Energies that were in close agreement with the original computed data provided with the articles. It is envisaged that the methodology will enable large scale conversion of molecular information from supplementary files available in PDF format into a collection of ready-to-compute molecular data to create an automated workflow for advanced computational processes. The software along with source code and instructions is available at https://sourceforge.net/projects/chemengine/files/?source=navbar.
Contaminants in ventilated filling boxes
NASA Astrophysics Data System (ADS)
Bolster, D. T.; Linden, P. F.
While energy efficiency is important, the adoption of energy-efficient ventilation systems still requires the provision of acceptable indoor air quality. Many low-energy systems, such as displacement or natural ventilation, rely on temperature stratification within the interior environment, always extracting the warmest air from the top of the room. Understanding buoyancy-driven convection in a confined ventilated space is key to understanding the flow that develops with many of these modern low-energy ventilation schemes. In this work we study the transport of an initially uniformly distributed passive contaminant in a displacement-ventilated space. Representing a heat source as an ideal source of buoyancy, analytical and numerical models are developed that allow us to compare the average efficiency of contaminant removal between traditional mixing and modern low-energy systems. A set of small-scale analogue laboratory experiments was also conducted to further validate our analytical and numerical solutions. We find that on average traditional and low-energy ventilation methods are similar with regard to pollutant flushing efficiency. This is because the concentration being extracted from the system at any given time is approximately the same for both systems. However, very different vertical concentration gradients exist. For the low-energy system, a peak in contaminant concentration occurs at the temperature interface that is established within the space. This interface is typically designed to sit at some intermediate height in the space. Since this peak does not coincide with the extraction point, displacement ventilation does not offer the same benefits for pollutant flushing as it does for buoyancy removal.
Breil, Cassandra; Abert Vian, Maryline; Zemb, Thomas; Kunz, Werner; Chemat, Farid
2017-03-27
Bligh and Dyer (B & D) or Folch procedures for the extraction and separation of lipids from microorganisms and biological tissues using chloroform/methanol/water have been used tens of thousands of times and are "gold standards" for the analysis of extracted lipids. Based on the Conductor-like Screening MOdel for realistic Solvatation (COSMO-RS), we select ethanol and ethyl acetate as being potentially suitable for the substitution of methanol and chloroform. We confirm this by performing solid-liquid extraction of yeast (Yarrowia lipolytica IFP29) and subsequent liquid-liquid partition - the two steps of routine extraction. For this purpose, we consider similar points in the ternary phase diagrams of water/methanol/chloroform and water/ethanol/ethyl acetate, both in the monophasic mixtures and in the liquid-liquid miscibility gap. Based on high performance thin-layer chromatography (HPTLC) to obtain the distribution of lipids classes, and gas chromatography coupled with a flame ionisation detector (GC/FID) to obtain fatty acid profiles, this greener solvents pair is found to be almost as effective as the classic methanol-chloroform couple in terms of efficiency and selectivity of lipids and non-lipid material. Moreover, using these bio-sourced solvents as an alternative system is shown to be as effective as the classical system in terms of the yield of lipids extracted from microorganism tissues, independently of their apparent hydrophilicity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schroeder, Jenna N.; Harto, Christopher B.; Clark, Corrie E.
Extracted water—water brought to the surface of the ground during carbon capture and sequestration (CCS) projects to create additional room for carbon dioxide injection—exists in a murky legal environment. As part of a broader attempt to identify the complex interactions between water resource policies and CCS, an analysis was undertaken at both the state and the federal level to scope the policy environments surrounding extracted water policies and laws. Six states (California, Illinois, Mississippi, Montana, North Dakota, and Texas) were chosen for this analysis because either active CCS work is currently underway, or the potential exists for future work. Although regulation of extracted waters could potentially occur at many points along the CCS life cycle, this paper focuses on regulation that may apply when the water is withdrawn—that is, accessed and removed from the saline aquifer—and when it is re-injected in a close but unconnected aquifer. It was found that no regulations exist for this source specifically. In addition, greater input is needed from regulators and policy makers in terms of defining this resource. In particular, regulation of extracted waters (and CCS activities broadly) often overlaps with the management of fluids produced during oil and gas development. Many regulations would apply to extracted waters if they were classified as such. Therefore, correct categorization is key as the industry in this space continues to grow.
Breil, Cassandra; Abert Vian, Maryline; Zemb, Thomas; Kunz, Werner; Chemat, Farid
2017-01-01
Bligh and Dyer (B & D) or Folch procedures for the extraction and separation of lipids from microorganisms and biological tissues using chloroform/methanol/water have been used tens of thousands of times and are “gold standards” for the analysis of extracted lipids. Based on the Conductor-like Screening MOdel for realistic Solvatation (COSMO-RS), we select ethanol and ethyl acetate as being potentially suitable for the substitution of methanol and chloroform. We confirm this by performing solid–liquid extraction of yeast (Yarrowia lipolytica IFP29) and subsequent liquid–liquid partition—the two steps of routine extraction. For this purpose, we consider similar points in the ternary phase diagrams of water/methanol/chloroform and water/ethanol/ethyl acetate, both in the monophasic mixtures and in the liquid–liquid miscibility gap. Based on high performance thin-layer chromatography (HPTLC) to obtain the distribution of lipids classes, and gas chromatography coupled with a flame ionisation detector (GC/FID) to obtain fatty acid profiles, this greener solvents pair is found to be almost as effective as the classic methanol–chloroform couple in terms of efficiency and selectivity of lipids and non-lipid material. Moreover, using these bio-sourced solvents as an alternative system is shown to be as effective as the classical system in terms of the yield of lipids extracted from microorganism tissues, independently of their apparent hydrophilicity. PMID:28346372
Fourier plane modeling of the jet in the galaxy M81
NASA Astrophysics Data System (ADS)
Ramessur, Arvind; Bietenholz, Michael F.; Leeuw, Lerothodi L.; Bartel, Norbert
2015-03-01
The nearby spiral galaxy M81 has a low-luminosity Active Galactic Nucleus in its center with a core and a one-sided curved jet, dubbed M81*, that is barely resolved with VLBI. To derive basic parameters such as the length of the jet, its orientation and curvature, the usual method of model-fitting with point sources and elliptical Gaussians may not always be the most appropriate one. We are developing Fourier-plane models for such sources, in particular an asymmetric triangle model to fit the extensive set of VLBI data of M81* in the u-v plane. This method may have an advantage over conventional ones in extracting information close to the resolution limit to provide us with a more comprehensive picture of the structure and evolution of the jet. We report on preliminary results.
Silva, Tânia Costa; de Andrade, Paula Branquinho; Paiva-Martins, Fátima; Valentão, Patrícia; Pereira, David Micael
2017-03-17
Marine invertebrates have been attracting the attention of researchers for their applications in nutrition, agriculture, and the pharmaceutical industry, among others. Concerning sea anemones (Cnidaria), little is known regarding their metabolic profiles and potential value as a source of pharmacologically active agents. In this work, the chemical profiles of two species of sea anemones, Actinia equina and Anemonia sulcata, were studied by high-performance liquid chromatography with diode-array detection (HPLC-DAD), and their impact upon immune and gastric cells was evaluated. In both species, the methylpyridinium alkaloid homarine was the major compound in the aqueous extracts. The extracts were effective in reducing lipopolysaccharide (LPS)-induced levels of nitric oxide (NO) and intracellular reactive oxygen species (ROS) in a macrophage model of inflammation. Both the extracts and the alkaloid homarine were effective in inhibiting phospholipase A₂ (PLA₂), a pivotal enzyme in the initial steps of the inflammatory cascade. In order to mimic the oral consumption of these extracts, their effect upon human gastric cells was evaluated. While no caspase-9 activation was detected, the fact that the endoplasmic reticulum-resident caspase-4, and also caspase-3, were activated points to a non-classical mechanism of apoptosis in human gastric cells. This work provides new insights into the toxicity and biological potential of sea anemones, which are increasingly present in human nutrition.
Grzybowski, Adelia; Tiboni, Marcela; Silva, Mário A N; Chitolina, Rodrigo F; Passos, Maurício; Fontana, José D
2013-05-01
Phytopesticide combinations from different botanical sources are seldom reported. Annona muricata seed and Piper nigrum fruit ethanolic extracts, enriched in acetogenins and piperamides respectively, were used synergistically as larvicides against the dengue fever vector Aedes aegypti. Individual bioassays of A. muricata and P. nigrum indicated respective LC50 values of 93.48 and 1.84 µg mL⁻¹ against third-instar larvae. Five combinations of different proportions of the plant extracts pointed to synergism between the extracts. The best A. muricata:P. nigrum extract combination was 90:10, which showed 5.12-fold synergism, as confirmed by statistical equations and by the log of total concentration versus combination proportions. Concerning the morphology, A. muricata caused larval body elongation, mainly in the abdomen, along with the appearance of a cervix. Conversely, P. nigrum induced abdomen and whole-body shortening. The morphological effects of A. muricata were prevalent in all of the combinations tested, irrespective of its proportion in the combination. It is suggested that the different mechanisms of action of the larvicidal actives, A. muricata acetogenins and P. nigrum piperamides, explain the observed synergism. The combination of inexpensive botanicals and a low-cost organosolvent such as ethanol leads to a simple and efficient phytolarvicidal formulation. © 2012 Society of Chemical Industry.
Population Estimation in Singapore Based on Remote Sensing and Open Data
NASA Astrophysics Data System (ADS)
Guo, H.; Cao, K.; Wang, P.
2017-09-01
Population estimation statistics are widely used in government, commercial and educational sectors for a variety of purposes. With a growing emphasis on real-time and detailed population information, data users nowadays have switched from traditional census data to more technology-based data sources such as LiDAR point clouds and high-resolution satellite imagery. Nevertheless, such data are costly and periodically unavailable. In this paper, the authors use West Coast District, Singapore as a case study to investigate the applicability and effectiveness of using satellite imagery from Google Earth for the extraction of building footprints and population estimation. At the same time, volunteered geographic information (VGI) is utilized as ancillary data for building footprint extraction; open data such as OpenStreetMap (OSM) can be employed to enhance the extraction process. In view of the challenges in building shadow extraction, this paper discusses several methods, including buffer, mask and shape index approaches, to improve accuracy. It also illustrates population estimation methods based on building height and number-of-floor estimates. The results show that the accuracy of the housing unit method for population estimation can reach 92.5%, which is remarkably accurate. This paper thus provides insights into techniques for building extraction and fine-scale population estimation, which will benefit users such as urban planners in terms of policymaking and the urban planning of Singapore.
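A minimal sketch of a housing-unit style estimate of the kind described above is given below: floor counts from building height, dwelling units from gross floor area, and population from average household size. The floor height, unit size and household size figures are illustrative assumptions, not values from the study.

```python
import numpy as np

# illustrative inputs per extracted building footprint
footprint_area = np.array([850.0, 1200.0, 640.0])   # m^2
building_height = np.array([36.0, 51.0, 24.0])      # m, e.g. derived from shadow lengths

FLOOR_HEIGHT = 3.0        # m per storey (assumption)
AREA_PER_UNIT = 90.0      # m^2 of gross floor area per dwelling unit (assumption)
PERSONS_PER_UNIT = 3.3    # average household size (assumption)

floors = np.round(building_height / FLOOR_HEIGHT)
units = footprint_area * floors / AREA_PER_UNIT
population = units * PERSONS_PER_UNIT
print(f"Estimated population of the three buildings: {population.sum():.0f}")
```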
Extracting microRNA-gene relations from biomedical literature using distant supervision.
Lamurias, Andre; Clarke, Luka A; Couto, Francisco M
2017-01-01
Many biomedical relation extraction approaches are based on supervised machine learning, requiring an annotated corpus. Distant supervision aims at training a classifier by combining a knowledge base with a corpus, reducing the amount of manual effort necessary. This is particularly useful for biomedicine because many databases and ontologies have been made available for many biological processes, while the availability of annotated corpora is still limited. We studied the extraction of microRNA-gene relations from text. MicroRNA regulation is an important biological process due to its close association with human diseases. The proposed method, IBRel, is based on distantly supervised multi-instance learning. We evaluated IBRel on three datasets, and the results were compared with a co-occurrence approach as well as a supervised machine learning algorithm. While supervised learning outperformed on two of those datasets, IBRel obtained an F-score 28.3 percentage points higher on the dataset for which there was no training set developed specifically. To demonstrate the applicability of IBRel, we used it to extract 27 miRNA-gene relations from recently published papers about cystic fibrosis. Our results demonstrate that our method can be successfully used to extract relations from literature about a biological process without an annotated corpus. The source code and data used in this study are available at https://github.com/AndreLamurias/IBRel.
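The sketch below is a toy illustration of the distant-supervision labelling step only, not IBRel itself: sentence-level miRNA-gene co-occurrences are labelled positive when the pair appears in a knowledge base, so training instances are produced without manual annotation. The knowledge base, gene lexicon and example sentences are made up.

```python
import re

KB = {("mir-155", "SOCS1"), ("mir-21", "PTEN")}      # hypothetical curated miRNA-gene relations
GENES = {"SOCS1", "PTEN", "CFTR"}                    # hypothetical gene lexicon
MIRNA = re.compile(r"\bmiR-\d+\b", re.IGNORECASE)

def label_instances(sentences):
    """Pair every miRNA and gene mentioned in a sentence; the KB supplies the (distant) label."""
    instances = []
    for s in sentences:
        mirnas = [m.lower() for m in MIRNA.findall(s)]
        genes = [g for g in GENES if g in s]
        for mi in mirnas:
            for g in genes:
                instances.append((mi, g, s, (mi, g) in KB))   # last field is the training label
    return instances

for inst in label_instances(["miR-155 directly targets SOCS1 in macrophages.",
                             "Expression of CFTR was unchanged after miR-21 transfection."]):
    print(inst)
```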
Barry, Michael J; Meleth, Sreelatha; Lee, Jeannette Y; Kreder, Karl J; Avins, Andrew L; Nickel, J Curtis; Roehrborn, Claus G; Crawford, E David; Foster, Harris E; Kaplan, Steven A; McCullough, Andrew; Andriole, Gerald L; Naslund, Michael J; Williams, O Dale; Kusek, John W; Meyers, Catherine M; Betz, Joseph M; Cantor, Alan; McVary, Kevin T
2011-09-28
Saw palmetto fruit extracts are widely used for treating lower urinary tract symptoms attributed to benign prostatic hyperplasia (BPH); however, recent clinical trials have questioned their efficacy, at least at standard doses (320 mg/d). To determine the effect of saw palmetto extract (Serenoa repens, from saw palmetto berries) at up to 3 times the standard dose on lower urinary tract symptoms attributed to BPH. A double-blind, multicenter, placebo-controlled randomized trial at 11 North American clinical sites conducted between June 5, 2008, and October 10, 2010, of 369 men aged 45 years or older, with a peak urinary flow rate of at least 4 mL/s, an American Urological Association Symptom Index (AUASI) score of between 8 and 24 at 2 screening visits, and no exclusions. One, 2, and then 3 doses (320 mg/d) of saw palmetto extract or placebo, with dose increases at 24 and 48 weeks. Difference in AUASI score between baseline and 72 weeks. Secondary outcomes included measures of urinary bother, nocturia, peak uroflow, postvoid residual volume, prostate-specific antigen level, participants' global assessments, and indices of sexual function, continence, sleep quality, and prostatitis symptoms. Between baseline and 72 weeks, mean AUASI scores decreased from 14.42 to 12.22 points (-2.20 points; 95% CI, -3.04 to -1.36) [corrected] with saw palmetto extract and from 14.69 to 11.70 points (-2.99 points; 95% CI, -3.81 to -2.17) with placebo. The group mean difference in AUASI score change from baseline to 72 weeks between the saw palmetto extract and placebo groups was 0.79 points favoring placebo (upper bound of the 1-sided 95% CI most favorable to saw palmetto extract was 1.77 points, 1-sided P = .91). Saw palmetto extract was no more effective than placebo for any secondary outcome. No clearly attributable adverse effects were identified. Increasing doses of a saw palmetto fruit extract did not reduce lower urinary tract symptoms more than placebo. clinicaltrials.gov Identifier: NCT00603304.
Consistent criticality and radiation studies of Swiss spent nuclear fuel: The CS2M approach.
Rochman, D; Vasiliev, A; Ferroukhi, H; Pecchia, M
2018-06-15
In this paper, a new method is proposed to systematically calculate at the same time canister loading curves and radiation sources, based on the inventory information from an in-core fuel management system. As a demonstration, the isotopic contents of the assemblies come from a Swiss PWR, considering more than 6000 cases from 34 reactor cycles. The CS2M approach consists in combining four codes: CASMO and SIMULATE to extract the assembly characteristics (based on validated models), the SNF code for source emission and MCNP for criticality calculations for specific canister loadings. The considered cases cover enrichments from 1.9 to 5.0% for the UO2 assemblies and 4.8% for the MOX, with assembly burnup values from 7 to 74 MWd/kgU. Because such a study is based on the individual fuel assembly history, it opens the possibility to optimize canister loadings from the point-of-view of criticality, decay heat and emission sources. Copyright © 2018 Elsevier B.V. All rights reserved.
PyEEG: an open source Python module for EEG/MEG feature extraction.
Bao, Forrest Sheng; Liu, Xin; Zhang, Christina
2011-01-01
Computer-aided diagnosis of neural diseases from EEG signals (or other physiological signals that can be treated as time series, e.g., MEG) is an emerging field that has gained much attention in past years. Extracting features is a key component in the analysis of EEG signals. In our previous works, we have implemented many EEG feature extraction functions in the Python programming language. As Python is gaining more ground in scientific computing, an open source Python module for extracting EEG features has the potential to save much time for computational neuroscientists. In this paper, we introduce PyEEG, an open source Python module for EEG feature extraction.
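A short usage sketch on a single channel is given below. The function names (pfd, hfd, hurst, bin_power) follow the PyEEG documentation, but the exact signatures should be checked against the installed version and are treated here as assumptions; the signal and sampling rate are placeholders.

```python
import numpy as np
import pyeeg

fs = 173.61                                  # sampling rate in Hz (example value)
x = np.random.randn(4097)                    # stand-in for one EEG channel

features = {
    "petrosian_fd": pyeeg.pfd(x),            # Petrosian fractal dimension
    "higuchi_fd": pyeeg.hfd(x, 5),           # Higuchi fractal dimension with Kmax = 5
    "hurst": pyeeg.hurst(x),                 # Hurst exponent
}
# band powers and power ratios over delta/theta/alpha/beta bands
power, power_ratio = pyeeg.bin_power(x, [0.5, 4, 7, 12, 30], fs)
print(features, power_ratio)
```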
Ohashi, Akira; Tsuguchi, Akira; Imura, Hisanori; Ohashi, Kousaburo
2004-07-01
The cloud point extraction behavior of aluminum(III) with 8-quinolinol (HQ) or 2-methyl-8-quinolinol (HMQ) and Triton X-100 was investigated in the absence and presence of 3,5-dichlorophenol (Hdcp). Aluminum(III) was almost completely extracted with HQ and 4% (v/v) Triton X-100 above pH 5.0, but was not extracted with HMQ-Triton X-100. However, in the presence of Hdcp, it was almost quantitatively extracted with HMQ-Triton X-100. The synergistic effect of Hdcp on the extraction of aluminum(III) with HMQ and Triton X-100 may be caused by the formation of a mixed-ligand complex, Al(dcp)(MQ)2.
Comparison of results from simple expressions for MOSFET parameter extraction
NASA Technical Reports Server (NTRS)
Buehler, M. G.; Lin, Y.-S.
1988-01-01
In this paper results are compared from a parameter extraction procedure applied to the linear, saturation, and subthreshold regions for enhancement-mode MOSFETs fabricated in a 3-micron CMOS process. The results indicate that the extracted parameters differ significantly depending on the extraction algorithm and the distribution of I-V data points. It was observed that KP values vary by 30 percent, VT values differ by 50 mV, and Delta L values differ by 1 micron. Thus for acceptance of wafers from foundries and for modeling purposes, the extraction method and data point distribution must be specified. In this paper measurement and extraction procedures that will allow a consistent evaluation of measured parameters are discussed.
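As one concrete example of a linear-region extraction of the kind compared in the paper: for a small fixed V_DS, I_D ≈ KP·(W/L)·(V_GS − V_T)·V_DS, so a straight-line fit of I_D versus V_GS gives V_T from the intercept and KP from the slope. The sketch below uses illustrative device geometry and data, not the paper's measurements.

```python
import numpy as np

W, L = 12e-6, 3e-6          # drawn width and length (m), placeholders
V_DS = 0.1                  # small drain bias (V)

v_gs = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])                 # gate sweep (V)
i_d = np.array([2.1, 13.0, 24.5, 36.2, 47.6, 59.0, 70.3]) * 1e-6     # illustrative currents (A)

# fit only the clearly above-threshold portion to avoid subthreshold points
slope, intercept = np.polyfit(v_gs[1:], i_d[1:], 1)
V_T = -intercept / slope
KP = slope / ((W / L) * V_DS)
print(f"Extracted V_T = {V_T:.2f} V, KP = {KP * 1e6:.1f} uA/V^2")
```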
NASA Astrophysics Data System (ADS)
Mochalskyy, Serhiy; Fantz, Ursel; Wünderlich, Dirk; Minea, Tiberiu
2016-10-01
The development of negative ion (NI) sources for the ITER neutral beam injector is strongly accompanied by modelling activities. The ONIX (Orsay Negative Ion eXtraction) code simulates the formation and extraction of negative hydrogen ions and co-extracted electrons produced in caesiated sources. In this paper the 3D geometry of the BATMAN extraction system, and the source characteristics such as the extraction and bias potential, and the 3D magnetic field were integrated in the model. Calculations were performed using plasma parameters experimentally obtained on BATMAN. The comparison of the ONIX calculated extracted NI density with the experimental results suggests that predictive calculations of the extraction of NIs are possible. The results show that for an ideal status of the Cs conditioning the extracted hydrogen NI current density could reach ~30 mA cm⁻² at 10 kV and ~20 mA cm⁻² at 5 kV extraction potential, with an electron/NI current density ratio of about 1, as measured in the experiments under the same plasma and source conditions. The dependency of the extracted NI current on the NI density in the bulk plasma region from both the modeling and the experiment was investigated. The separate distributions composing the NI beam originating from the plasma bulk region and the PG surface are presented for different NI plasma volume densities and NI emission rates from the plasma grid (PG) wall, respectively. The extracted current from the NIs produced at the Cs covered PG surface, initially moving towards the bulk plasma and then being bent towards the extraction surfaces, is lower than the extracted NI current from directly extracted, surface-produced ions.
Matawle, Jeevan Lal; Pervez, Shamsh; Deb, Manas Kanti; Shrivastava, Anjali; Tiwari, Suresh
2018-02-01
USEPA's UNMIX, positive matrix factorization (PMF) and effective variance-chemical mass balance (EV-CMB) receptor models were applied to the chemically speciated profiles of 125 indoor PM2.5 measurements, sampled longitudinally during 2012-2013 in low-income group households of Central India that use solid fuels for cooking. A three-step source apportionment study was carried out to generate more confident source characterization. First, UNMIX6.0 extracted the initial number of source factors, which were used to execute PMF5.0 to extract source-factor profiles in the second step. Finally, locally derived source profiles analogous to those factors were supplied to EV-CMB8.2, together with the indoor receptor PM2.5 chemical profile, to evaluate source contribution estimates (SCEs). The results of the combined use of the three receptor models clearly show that UNMIX and PMF are useful tools for identifying the source categories present in a small receptor dataset, and that EV-CMB can then use those locally derived source profiles analogous to the PMF-extracted source categories for source apportionment. The source apportionment results also show a threefold higher relative contribution of solid fuel burning emissions to indoor PM2.5 compared with measurements reported for normal households with LPG stoves. The previously reported influential source marker species were found to be comparatively similar to those extracted from the PMF fingerprint plots. The PMF and CMB SCE results were also found to be qualitatively similar. The performance fit measures of all three receptor models were cross-verified and validated and support each other, increasing confidence in the source apportionment results.
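A compact sketch of the effective-variance CMB idea is given below: the receptor concentrations C are fitted as F·s by weighted least squares, with per-species weights rebuilt each iteration from both ambient and profile uncertainties. The array shapes, file names and non-negativity clipping are illustrative; this is not EV-CMB8.2 itself.

```python
import numpy as np

# placeholder inputs
C = np.load("receptor_species.npy")        # (n_species,) ambient PM2.5 species concentrations
sig_C = np.load("receptor_sigma.npy")      # (n_species,) their uncertainties
F = np.load("source_profiles.npy")         # (n_species, n_sources) locally derived source profiles
sig_F = np.load("profile_sigma.npy")       # (n_species, n_sources) profile uncertainties

s = np.zeros(F.shape[1])                   # source contribution estimates
for _ in range(10):                        # effective-variance iteration
    V_eff = sig_C**2 + sig_F**2 @ s**2     # per-species effective variance
    w = 1.0 / np.sqrt(V_eff)
    s, *_ = np.linalg.lstsq(F * w[:, None], C * w, rcond=None)
    s = np.clip(s, 0.0, None)              # keep source contributions non-negative
print(dict(enumerate(np.round(s, 3))))
```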
Hassan, Afifa Afifi
1982-01-01
The gas evolution and the strontium carbonate precipitation techniques for extracting dissolved inorganic carbon (DIC) for stable carbon isotope analysis were investigated. Theoretical considerations, involving thermodynamic calculations and computer simulation, pointed out several possible sources of error in delta carbon-13 measurements of the DIC and demonstrated the need for experimental evaluation of the magnitude of the error. An alternative analytical technique, equilibration with an out-gassed vapor phase, is proposed. The experimental studies revealed that the delta carbon-13 of the DIC extracted from a 0.01 molar NaHCO3 solution agreed within 0.1 per mil for the precipitation technique, with an increase of only 0.27 per mil in that extracted by the gas evolution technique. The efficiency of extraction of DIC decreased with sulfate concentration in the precipitation technique but was independent of sulfate concentration in the gas evolution technique. Both the precipitation and gas evolution techniques were found to be satisfactory for extraction of DIC from different kinds of natural water for stable carbon isotope analysis, provided appropriate precautions are observed in handling the samples. For example, it was found that diffusion of atmospheric carbon dioxide does alter the delta carbon-13 of samples contained in polyethylene bottles; filtration and drying in air change the delta carbon-13 in the precipitation technique; and hot manganese dioxide purification changes the delta carbon-13 of carbon dioxide. (USGS)
NASA Astrophysics Data System (ADS)
Bai, Yang; Wu, Lixin; Zhou, Yuan; Li, Ding
2017-04-01
Nitrogen oxides (NOx) and sulfur dioxide (SO2) emitted from coal combustion, which are oxidized quickly in the atmosphere and lead to secondary aerosol formation and acid deposition, are the main sources of China's regional fog-haze pollution. An extensive literature has quantitatively estimated the lifetimes and emissions of NO2 and SO2 for large point sources such as coal-fired power plants and cities using satellite measurements. However, few of these methods are suitable for sources located in a heterogeneously polluted background. In this work, we present a simplified emission effective-radius extraction model for point sources to study the NO2 and SO2 reduction trend in China with complex polluted sources. First, to find the time range over which actual emissions can be derived from satellite observations, the spatial distribution characteristics of the mean daily, monthly, seasonal and annual concentrations of OMI NO2 and SO2 around a single power plant were analyzed and compared. Then, a 100 km × 100 km geographical grid with a 1 km step was established around the source and the mean concentration of all satellite pixels covering each grid point was calculated by an area-weighted pixel-averaging approach. The emission effective radius is defined by the concentration gradient values near the power plant. Finally, the developed model is employed to investigate the characteristics and evolution of NO2 and SO2 emissions and to verify the effectiveness of flue gas desulfurization (FGD) and selective catalytic reduction (SCR) devices applied in coal-fired power plants over the 10-year period from 2006 to 2015. It can be observed that the spatial distribution pattern of NO2 and SO2 concentrations in the vicinity of a large coal-burning source was affected not only by the emissions of the source itself, but also by pollutant transport and diffusion driven by meteorological factors in different seasons. The proposed model can be used to identify the effective operation time of FGD and SCR devices in coal-fired power plants.
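A minimal sketch, under stated assumptions, of the grid-averaging and effective-radius idea in the abstract above: the pixel concentrations are synthetic, the averaging is simple binning rather than true area weighting, and the gradient threshold is purely illustrative.

```python
# Hedged sketch: pixel concentrations around a point source are averaged onto
# a regular 1 km grid, then a radial profile is used to pick an "effective
# radius" where the concentration gradient flattens. All values are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic satellite pixels (x_km, y_km, column concentration) around a source at (0, 0).
n_pix = 20000
xy = rng.uniform(-50, 50, size=(n_pix, 2))
r = np.hypot(xy[:, 0], xy[:, 1])
conc = 5.0 * np.exp(-r / 12.0) + 1.0 + 0.1 * rng.standard_normal(n_pix)  # plume + background

# 1 km grid averaging (a simple stand-in for area-weighted pixel averaging).
edges = np.arange(-50, 51, 1.0)
grid_sum, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=[edges, edges], weights=conc)
grid_cnt, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=[edges, edges])
grid_mean = np.divide(grid_sum, grid_cnt, out=np.full_like(grid_sum, np.nan), where=grid_cnt > 0)

# Radial profile of the gridded field and the radius where its gradient flattens.
cx = 0.5 * (edges[:-1] + edges[1:])
gx, gy = np.meshgrid(cx, cx, indexing="ij")
rad, val = np.hypot(gx, gy).ravel(), grid_mean.ravel()
rbins = np.arange(0, 51, 2.0)
profile = [np.nanmean(val[(rad >= a) & (rad < b)]) for a, b in zip(rbins[:-1], rbins[1:])]
gradient = np.abs(np.diff(profile))
effective_radius = rbins[1:-1][np.argmax(gradient < 0.05)]   # illustrative threshold
print("effective emission radius ~", effective_radius, "km")
```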
NASA Astrophysics Data System (ADS)
Warneke, C.; Geiger, F.; Zahn, A.; Graus, M.; De Gouw, J. A.; Gilman, J. B.; Lerner, B. M.; Roberts, J. M.; Edwards, P. M.; Dube, W. P.; Brown, S. S.; Peischl, J.; Ryerson, T. B.; Williams, E. J.; Petron, G.; Kofler, J.; Sweeney, C.; Karion, A.; Dlugokencky, E. J.
2012-12-01
Technological advances such as hydraulic fracturing have led to a rapid increase in the production of natural gas from several basins in the Rocky Mountain West, including the Denver-Julesburg basin in Colorado, the Uintah basin in Utah and the Upper Green River basin in Wyoming. There are significant concerns about the impact of natural gas production on the atmosphere, including (1) emissions of methane, which determine the net climate impact of this energy source, (2) emissions of reactive hydrocarbons and nitrogen oxides, and their contribution to photochemical ozone formation, and (3) emissions of air toxics with direct health effects. The Energy & Environment - Uintah Basin Wintertime Ozone Study (UBWOS) in 2012 was focused on addressing these issues. During UBWOS, measurements of volatile organic compounds (VOCs) were made using proton-transfer-reaction mass spectrometry (PTR-MS) instruments from a ground site and a mobile laboratory. Measurements at the ground site showed mixing ratios of VOCs related to oil and gas extraction were greatly enhanced in the Uintah basin, including several days long periods of elevated mixing ratios and concentrated short term plumes. Diurnal variations were observed with large mixing ratios during the night caused by low nighttime mixing heights and a shift in wind direction during the day. The mobile laboratory sampled a wide variety of individual parts of the gas production infrastructure including active gas wells and various processing plants. Included in those point sources was a new well that was sampled by the mobile laboratory 11 times within two weeks. This new well was previously hydraulically fractured and had an active flow-back pond. Very high mixing ratios of aromatics were observed close to the flow-back pond. The measurements of the mobile laboratory are used to determine the source composition of the individual point sources and those are compared to the VOC enhancement ratios observed at the ground site. The source composition of most point sources was similar to the typical enhancement ratios observed at the ground site, whereas the new well with the flow-back pond showed a somewhat different composition.
de Lima, Júlio C.; de Costa, Fernanda; Füller, Thanise N.; Rodrigues-Corrêa, Kelly C. da Silva; Kerber, Magnus R.; Lima, Mariano S.; Fett, Janette P.; Fett-Neto, Arthur G.
2016-01-01
Pine oleoresin is a major source of terpenes, consisting of turpentine (mono- and sesquiterpenes) and rosin (diterpenes) fractions. Higher oleoresin yields are of economic interest, since oleoresin derivatives make up a valuable source of materials for chemical industries. Oleoresin can be extracted from living trees, often by the bark streak method, in which bark removal is done periodically, followed by application of a stimulant paste containing sulfuric acid and other chemicals on the freshly wounded exposed surface. To better understand the molecular basis of chemically stimulated and wound-induced oleoresin production, we evaluated the stability of 11 putative reference genes for normalization purposes when studying Pinus elliottii gene expression during oleoresinosis. Samples for RNA extraction were collected from field-grown adult trees under tapping operations using stimulant pastes with different compositions and at various time points after paste application. Statistical methods implemented in the geNorm, NormFinder, and BestKeeper software packages were consistent in pointing to HISTO3 and UBI as adequate reference genes. To confirm the expression stability of the candidate reference genes, expression profiles of putative P. elliottii orthologs of resin biosynthesis-related genes encoding Pinus contorta β-pinene synthase [PcTPS-(−)β-pin1], P. contorta levopimaradiene/abietadiene synthase (PcLAS1), Pinus taeda α-pinene synthase [PtTPS-(+)αpin], and P. taeda α-farnesene synthase (PtαFS) were examined following stimulant paste application. Increased oleoresin yields observed in stimulated treatments using phytohormone-based pastes were consistent with higher expression of pinene synthases. Overall, the expression of all genes examined matched the expected profiles of oleoresin-related transcript changes reported for previously examined conifers. PMID:27379135
Exact extraction method for road rutting laser lines
NASA Astrophysics Data System (ADS)
Hong, Zhiming
2018-02-01
This paper analyzes the importance of asphalt pavement rutting detection in pavement maintenance and administration, discusses the shortcomings of existing rutting detection methods, and proposes a new rutting line-laser extraction method based on peak intensity and peak continuity. The peak intensity is enhanced by a designed transverse mean filter, and a peak-intensity map computed over the whole road image is used to determine the seed point of the rutting laser line. Starting from the seed point, the light points of the rutting line-laser are extracted based on peak continuity, providing accurate basic data for the subsequent calculation of pavement rutting depths.
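A minimal sketch of the seed-and-track idea described above, under explicit assumptions: the image, line shape, filter size and continuity window below are all illustrative, not the paper's actual parameters.

```python
# Hedged sketch of the peak-intensity / peak-continuity idea: smooth each
# image row transversely, seed at the strongest column peak, then track the
# laser line column by column within a small continuity window.
import numpy as np
from scipy.ndimage import uniform_filter1d

rng = np.random.default_rng(2)
h, w = 200, 400
img = 10.0 * rng.random((h, w))
true_row = (100 + 20 * np.sin(np.linspace(0, np.pi, w))).astype(int)
img[true_row, np.arange(w)] += 120.0              # synthetic laser line

smoothed = uniform_filter1d(img, size=5, axis=1)  # transverse mean filter

# Seed point: column whose maximum response is strongest.
seed_col = int(np.argmax(smoothed.max(axis=0)))
seed_row = int(np.argmax(smoothed[:, seed_col]))

# Track left and right from the seed, constraining row jumps (peak continuity).
line = np.full(w, -1, dtype=int)
line[seed_col] = seed_row
for cols in (range(seed_col + 1, w), range(seed_col - 1, -1, -1)):
    prev = seed_row
    for c in cols:
        lo, hi = max(prev - 3, 0), min(prev + 4, h)
        prev = lo + int(np.argmax(smoothed[lo:hi, c]))
        line[c] = prev
print("mean tracking error [pixels]:", np.abs(line - true_row).mean())
```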
2007-12-14
... contained dried residues from a collection of terrestrial plants, marine invertebrates, and various fungi. NCI plate numbers, sources of extracts, and ... plants), while Fig. 3B displays results from row G of the same plate. In these examples, wells B3, B5, B9, G9, and G12 were selected for further ... Sources of extracts (Plate no. / Source / Extraction solvent): 96110120 / Terrestrial plants / Water; 96110125 / Terrestrial plants / CH3OH-CH2Cl2; 12000707 / Marine ...
NASA Astrophysics Data System (ADS)
Dong, Chun; He, Gen; Mai, Kangsen; Zhou, Huihui; Xu, Wei
2016-06-01
Poor palatability is a limiting factor for replacing fishmeal with other protein sources in aquaculture. Water-soluble molecules with low molecular weights are the major determinants of the palatability of diets. The present study investigated the palatability of water-soluble extracts from single protein sources (single extract pellets) and mixtures of these extracts in different proportions (blended extract pellets) in juvenile turbot (Scophthalmus maximus). Based on the palatability of the blended extract pellets, an optimal mixture proportion was selected, and a new protein source made from raw protein materials in the selected proportion was formulated to replace fishmeal. In summary, the palatability of single extract pellets for turbot decreased in the order fishmeal, pet-food grade poultry by-product meal, wheat gluten meal, soybean meal, peanut meal, meat and bone meal, and corn gluten meal. Subsequently, according to the palatability of the single extract pellets, 52 kinds of blended extract pellets were designed and their palatability tested. The results showed that the pellets presented remarkably different palatability, and the optimal one was Diet 52 (wheat gluten meal : pet-food grade poultry by-product meal : meat and bone meal : corn gluten meal = 1:6:1:2). The highest ingestion ratio (the number of pellets ingested/the number of pellets fed) was 0.73 ± 0.03, observed for Diet 52. Five isonitrogenous (52% crude protein) and isocaloric (20 kJ g-1 gross energy) diets were then formulated by replacing 0 (control), 35%, 50%, 65% and 80% of fishmeal with the No. 52 blending proportion. After a 10-week feeding trial, consistent feed intake was found among all replacement treatments. Replacement of fishmeal up to 35% did not significantly influence the final body weight, specific growth rate, feed efficiency ratio, or protein efficiency ratio of turbot. Therefore, the water-soluble extracts of protein sources play an important role in improving the palatability of non-fishmeal protein sources in aquafeed.
[Cocoa (Theobroma cacao L.) hulls: a possible commercial source of pectins].
Barazarte, Humberto; Sangronis, Elba; Unai, Emaldi
2008-03-01
Commercial exploitation of cocoa (Theobroma cacao L.) generates a volume of hulls that could be used in the production of pectins on an industrial scale. Therefore, pectins from cocoa hulls were extracted at different pH and temperature conditions, and their main chemical characteristics were evaluated. EDTA at 0.5% was used for the extraction at pH 3, 4 and 5 and temperatures of 60, 75 and 90 degrees C, under a 3² factorial design. The response variables were yield, content of anhydrous galacturonic acid (AGA), methoxyl content, degree of esterification and equivalent weight of the pectins extracted. The strength of the pectic gel was determined with a TA-XT2 texturometer. Strawberry jam was made with the pectin extracted, and its acceptability was determined using a 7-point hedonic scale. The results obtained were as follows: an extraction yield from 2.64 to 4.69 g/100 g; an AGA content between 49.8 and 64.06 g/100 g; a methoxyl content between 4.72 and 7.18 g/100 g; a degree of esterification between 37.94 and 52.20%; an equivalent weight from 385.47 to 464.61 g/equivalent of H+; and a degree of gelation between 28.64 and 806.03 g force. The pectin extracted at pH 4 and 90 degrees C showed a gelation power of 422.16 g force, a purity of 62.26 g/100 g of AGA, and an extraction yield of 3.89 g/100 g, and allowed the preparation of a jam with an average level of liking of "like moderately". Pectins from cocoa hulls show potential application in the food industry, but it is necessary to optimize the extraction parameters to increase the yield.
Investigation of automated feature extraction using multiple data sources
NASA Astrophysics Data System (ADS)
Harvey, Neal R.; Perkins, Simon J.; Pope, Paul A.; Theiler, James P.; David, Nancy A.; Porter, Reid B.
2003-04-01
An increasing number and variety of platforms are now capable of collecting remote sensing data over a particular scene. For many applications, the information available from any individual sensor may be incomplete, inconsistent or imprecise. However, other sources may provide complementary and/or additional data. Thus, for an application such as image feature extraction or classification, fusing the multiple data sources can lead to more consistent and reliable results. Unfortunately, with the increased complexity of the fused data, the search space of feature-extraction or classification algorithms also greatly increases. With a single data source, the determination of a suitable algorithm may be a significant challenge for an image analyst. With fused data, the search for suitable algorithms can go far beyond the capabilities of a human in a realistic time frame, and becomes the realm of machine learning, where the computational power of modern computers can be harnessed for the task at hand. We describe experiments in which we investigate the ability of a suite of automated feature extraction tools developed at Los Alamos National Laboratory to make use of multiple data sources for various feature extraction tasks. We compare and contrast this software's capabilities on (1) individual data sets from different data sources, (2) fused data sets from multiple data sources, and (3) fusion of results from multiple individual data sources.
Developing the RAL front end test stand source to deliver a 60 mA, 50 Hz, 2 ms H- beam
NASA Astrophysics Data System (ADS)
Faircloth, Dan; Lawrie, Scott; Letchford, Alan; Gabor, Christoph; Perkins, Mike; Whitehead, Mark; Wood, Trevor; Tarvainen, Olli; Komppula, Jani; Kalvas, Taneli; Dudnikov, Vadim; Pereira, Hugo; Izaola, Zunbeltz; Simkin, John
2013-02-01
All the Front End Test Stand (FETS) beam requirements have been achieved, but not simultaneously [1]. At 50 Hz repetition rates, beam current droop becomes unacceptable for pulse lengths longer than 1 ms. This is a fundamental limitation of the present source design. Previous researchers [2] have demonstrated that using a physically larger Penning surface plasma source should overcome these limitations. The scaled source development strategy is outlined in this paper. A study of time-varying plasma behavior has been performed using a V-UV spectrometer. Initial experiments to test scaled plasma volumes are outlined. A dedicated plasma and extraction test stand (VESPA - Vessel for Extraction and Source Plasma Analysis) is being developed to allow new source and extraction designs to be appraised. The experimental work is backed up by modeling and simulations. A detailed ANSYS thermal model has been developed. IBSimu is being used to design extraction and beam transport. A novel 3D plasma modeling code using beamlets is being developed by Cobham Vector Fields using SCALA OPERA; early source modeling results are very promising. Hardware on FETS is also being developed in preparation for running the scaled source. A new 2 ms, 50 Hz, 25 kV pulsed extraction voltage power supply has been constructed and a new discharge power supply is being designed. The design of the post-acceleration electrode assembly has been improved.
Lawrie, S R; Faircloth, D C; Letchford, A P; Perkins, M; Whitehead, M O; Wood, T; Gabor, C; Back, J
2014-02-01
The ISIS pulsed spallation neutron and muon facility at the Rutherford Appleton Laboratory (RAL) in the UK uses a Penning surface plasma negative hydrogen ion source. Upgrade options for the ISIS accelerator system demand a higher current, lower emittance beam with longer pulse lengths from the injector. The Front End Test Stand is being constructed at RAL to meet the upgrade requirements using a modified ISIS ion source. A new 10% duty cycle 25 kV pulsed extraction power supply has been commissioned and the first meter of 3 MeV radio frequency quadrupole has been delivered. Simultaneously, a Vessel for Extraction and Source Plasma Analyses is under construction in a new laboratory at RAL. The detailed measurements of the plasma and extracted beam characteristics will allow a radical overhaul of the transport optics, potentially yielding a simpler source configuration with greater output and lifetime.
[A landscape ecological approach for urban non-point source pollution control].
Guo, Qinghai; Ma, Keming; Zhao, Jingzhu; Yang, Liu; Yin, Chengqing
2005-05-01
Urban non-point source pollution is a new problem that has appeared with the rapid development of urbanization. The particularity of urban land use and the increase of impervious surface area make urban non-point source pollution differ from agricultural non-point source pollution, and more difficult to control. Best Management Practices (BMPs) are the effective practices commonly applied in controlling urban non-point source pollution, mainly adopting local remediation practices to control the pollutants in surface runoff. Because of the close relationship between urban land use patterns and non-point source pollution, it would be rational to combine landscape ecological planning with local BMPs to control urban non-point source pollution. This requires, first, analyzing and evaluating the influence of landscape structure on water bodies, pollution sources and pollutant removal processes, in order to define the relationships between landscape spatial pattern and non-point source pollution and to identify the key polluted areas; and second, adjusting the existing landscape structures or adding new landscape elements to form a new landscape pattern, and combining landscape planning and management by applying BMPs within planning to improve urban landscape heterogeneity and to control urban non-point source pollution.
1988-10-26
concentrated into this off-axis peak is then considered. Estimates of the source brightness (extraction ion diode source current density divided by the square ... radioactive contamination of the accelerator. One possible scheme for avoiding this problem is to use extraction geometry ion diodes to focus the ion beams ... annular region. These results will be coupled to two simple models of extraction ion diodes to determine the ion source brightness requirements. These
NASA Technical Reports Server (NTRS)
Mclennan, G. A.
1986-01-01
This report describes, and is a User's Manual for, a computer code (ANL/RBC) which calculates cycle performance for Rankine bottoming cycles extracting heat from a specified source gas stream. The code calculates cycle power and efficiency and the sizes for the heat exchangers, using tabular input of the properties of the cycle working fluid. An option is provided to calculate the costs of system components from user defined input cost functions. These cost functions may be defined in equation form or by numerical tabular data. A variety of functional forms have been included for these functions and they may be combined to create very general cost functions. An optional calculation mode can be used to determine the off-design performance of a system when operated away from the design-point, using the heat exchanger areas calculated for the design-point.
NASA Astrophysics Data System (ADS)
Ingacheva, Anastasia; Chukalina, Marina; Khanipov, Timur; Nikolaev, Dmitry
2018-04-01
Motion blur caused by camera vibration is a common source of degradation in photographs. In this paper we study the problem of finding the point spread function (PSF) of a blurred image using a tomography technique. The PSF reconstruction result strongly depends on the particular tomography technique used. We present a tomography algorithm with regularization adapted specifically for this task. We use the algebraic reconstruction technique (ART algorithm) as the starting algorithm and introduce regularization. We use the conjugate gradient method for the numerical implementation of the proposed approach. The algorithm is tested using a dataset which contains 9 kernels extracted from real photographs by Adobe, where the point spread function is known. We also investigate the influence of noise on the quality of image reconstruction and how the number of projections influences the magnitude of the reconstruction error.
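A minimal sketch of the regularized-reconstruction-by-conjugate-gradients idea mentioned above, with loud assumptions: the projection matrix, kernel, noise level and regularization weight are synthetic and illustrative; this is Tikhonov-regularized least squares rather than the paper's specific regularized ART.

```python
# Hedged sketch: a Tikhonov-regularized least-squares reconstruction solved
# with conjugate gradients on the normal equations, as a stand-in for the
# regularized ART-style PSF recovery. The projection matrix A is random.
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(3)
n_pixels, n_rays = 64, 40          # tiny 8x8 kernel, 40 projection rays
A = rng.random((n_rays, n_pixels))
x_true = np.zeros(n_pixels)
x_true[27:37] = 1.0                # simple blur-kernel support
b = A @ x_true + 0.01 * rng.standard_normal(n_rays)

lam = 0.1                          # regularization weight (assumption)
normal_op = LinearOperator(
    (n_pixels, n_pixels),
    matvec=lambda v: A.T @ (A @ v) + lam * v,
    dtype=np.float64,
)
x_rec, info = cg(normal_op, A.T @ b, maxiter=500)
print("CG converged:", info == 0,
      "| reconstruction error:", round(float(np.linalg.norm(x_rec - x_true)), 3))
```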
Dos Santos Nascimento, Luana Beatriz; de Aguiar, Paula Fernandes; Leal-Costa, Marcos Vinicius; Coutinho, Marcela Araújo Soares; Borsodi, Maria Paula Gonçalves; Rossi-Bergmann, Bartira; Tavares, Eliana Schwartz; Costa, Sônia Soares
2018-05-01
The medicinal plant Kalanchoe pinnata is a phenolic-rich species used worldwide. Reports on its pharmacological uses have increased by 70% in the last 10 years. The leaves of this plant are the main source of an unusual quercetin-diglycosyl flavonoid (QAR, quercetin arabinopyranosyl rhamnopyranoside), which can be easily extracted using water. QAR possesses strong in vivo anti-inflammatory activity. The objective was to optimize the aqueous extraction of QAR from K. pinnata leaves using a three-level full factorial design. After a previous screening design, time (x1) and temperature (x2) were chosen as the two independent variables for optimization. Freeze-dried leaves were extracted with water (20% w/v) at 30°C, 40°C or 50°C for 5, 18 or 30 min. The QAR content (determined by HPLC-DAD) and the yield of the extracts were analyzed. The optimized extracts were also evaluated for cytotoxicity. The optimal heating times for extract yield and QAR content were similar in the two-dimensional (2D) response surfaces (between 12.8 and 30 min), but the optimal extraction temperatures ranged between 40°C and 50°C for QAR content and between 30°C and 38°C for extract yield. A compromise region for both parameters was found at the mean points, namely 40°C for the extraction temperature and 18 min for the total time. The optimized process is faster and uses less energy than the previous one (water; 30 min at 55°C) and is therefore greener and more attractive for industrial purposes. This is the first report of extraction optimization for this bioactive flavonoid. Copyright © 2018 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Altmann, Jens; Jansen, Boris; Kalbitz, Karsten; Filley, Timothy
2013-04-01
Dissolved organic matter (DOM) is one of the most dynamic carbon pools linking the terrestrial with the aquatic carbon cycle. Besides the uncertain contribution of terrestrial DOM to the greenhouse effect, DOM also plays an important role in the mobility and availability of heavy metals and organic pollutants in soils. These processes depend strongly on the molecular characteristics of the DOM. Surprisingly, the processes that determine the molecular composition of DOM are only poorly understood. DOM can originate from various sources, which influence its molecular composition. It has been recognized that DOM formation is not a static process and that DOM characteristics vary not only between different carbon sources but also over time. However, the molecular characteristics of DOM extracts have scarcely been studied continuously over a longer period of time. Because of the constant molecular changes of the parent litter material or soil organic matter during microbial degradation, we assumed that the molecular characteristics of litter-derived DOM also vary at different stages of root and needle decomposition. For this study we analyzed the chemical composition of root and leaf samples of 6 temperate tree species during one year of litter decomposition in a laboratory incubation. During this long-term experiment we continuously measured the carbon and nitrogen contents of the water extracts and the remaining residues, C mineralization rates, and the chemical composition of water extracts and residues by Curie-point pyrolysis mass spectrometry with TMAH. We focused on the following questions: (I) How mobile are molecules derived from plant polymers like tannin, lignin, suberin and cutin? (II) How does the composition of root- and leaf-derived DOM change over time depending on the stage of decomposition and the species? Litter-derived DOM was generally dominated by aromatic compounds. Substituted fatty acids, typically derived from cutin or suberin, were not detected in the water extracts. Fresh leaf and needle samples released a much higher amount of tannins than fresh root samples. At later litter decomposition stages the influence of tannins decreased and lignin-derived phenols dominated the extracts. With ongoing litter degradation the degree of oxidation of the litter material increased, which was also reflected in the water-extracted molecules.
Applications of 3D-Edge Detection for ALS Point Clouds
NASA Astrophysics Data System (ADS)
Ni, H.; Lin, X. G.; Zhang, J. X.
2017-09-01
Edge detection has been one of the major issues in the fields of remote sensing and photogrammetry. With the fast development of laser scanning sensor technology, dense point clouds have become increasingly common. Precise 3D edges can be detected from these point clouds, and a great number of edge or feature line extraction methods have been proposed. Among these methods, an easy-to-use 3D edge detection method, AGPN (Analyzing Geometric Properties of Neighborhoods), has been proposed. The AGPN method detects edges based on the analysis of the geometric properties of a query point's neighbourhood. It detects two kinds of 3D edges, boundary elements and fold edges, and has many applications. This paper presents three applications of AGPN: 3D line segment extraction, ground point filtering, and ground breakline extraction. Experiments show that the AGPN method gives a straightforward solution to these applications.
Spatiotemporal attention operator using isotropic contrast and regional homogeneity
NASA Astrophysics Data System (ADS)
Palenichka, Roman; Lakhssassi, Ahmed; Zaremba, Marek
2011-04-01
A multiscale operator for spatiotemporal isotropic attention is proposed to reliably extract attention points during image sequence analysis. Its consecutive local maxima indicate attention points as the centers of image fragments of variable size with high intensity contrast, region homogeneity, regional shape saliency, and presence of temporal change. The scale-adaptive estimation of temporal change (motion) and its aggregation with the regional shape saliency contribute to the accurate determination of attention points in image sequences. Multilocation descriptors of an image sequence are extracted at the attention points in the form of a set of multidimensional descriptor vectors. A fast recursive implementation is also proposed to make the operator's computational complexity independent of the spatial scale size, i.e., the window size of the spatial averaging filter. Experiments on the accuracy of attention-point detection have proved the operator's consistency and its high potential for multiscale feature extraction from image sequences.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warner-Schmid, D.; Hoshi, Suwaru; Armstrong, D.W.
Aqueous solutions of nonionic surfactants are known to undergo phase separation at elevated temperatures. This phenomenon is known as 'clouding', and the temperature at which it occurs is referred to as the cloud point. Permethylhydroxypropyl-[beta]-cyclodextrin (PMHP-[beta]-CD) was synthesized, and aqueous solutions containing it were found to undergo similar cloud-point behavior. Factors that affect the phase separation of PMHP-[beta]-CD were investigated. Subsequently, the cloud-point extractions of several aromatic compounds (i.e., acetanilide, aniline, 2,2[prime]-dihydroxybiphenyl, N-methylaniline, 2-naphthol, o-nitroaniline, m-nitroaniline, p-nitroaniline, nitrobenzene, o-nitrophenol, m-nitrophenol, p-nitrophenol, 4-phenazophenol, 3-phenylphenol, and 2-phenylbenzimidazole) from dilute aqueous solution were evaluated. Although the extraction efficiency of the compounds varied, most can be quantitatively extracted if sufficient PMHP-[beta]-CD is used. For those few compounds that are not extracted (e.g., o-nitroacetanilide), the cloud-point procedure may be an effective one-step isolation or purification method. 18 refs., 2 figs., 3 tabs.
Effects of nitrogen and carbon sources on the production of inulinase from strain Bacillus sp. SG113
NASA Astrophysics Data System (ADS)
Gavrailov, Simeon; Ivanova, Viara
2016-03-01
The effects of carbon and nitrogen substrates on the growth of the Bacillus sp. SG113 strain were studied. The use of organic nitrogen sources (peptone, beef extract, yeast extract, casein) led to rapid cellular growth, and the best results for the Bacillus strain were obtained with casein hydrolysate. Among the inorganic nitrogen sources studied, (NH4)2SO4 proved to be the best nitrogen source. Casein hydrolysate and (NH4)2SO4 stimulated the invertase synthesis. In the presence of Jerusalem artichoke, onion and garlic extracts as carbon sources, the strain synthesized from 6 to 10 times more inulinase.
Self-Similar Spin Images for Point Cloud Matching
NASA Astrophysics Data System (ADS)
Pulido, Daniel
The rapid growth of Light Detection And Ranging (Lidar) technologies that collect, process, and disseminate 3D point clouds has allowed for increasingly accurate spatial modeling and analysis of the real world. Lidar sensors can generate massive 3D point clouds of a collection area that provide highly detailed spatial and radiometric information. However, a Lidar collection can be expensive and time consuming. Simultaneously, the growth of crowdsourced Web 2.0 data (e.g., Flickr, OpenStreetMap) has provided researchers with a wealth of freely available data sources that cover a variety of geographic areas. Crowdsourced data can be of varying quality and density. In addition, since it is typically not collected as part of a dedicated experiment but rather volunteered, when and where the data is collected is arbitrary. The integration of these two sources of geoinformation can provide researchers the ability to generate products and derive intelligence that mitigate their respective disadvantages and combine their advantages. Therefore, this research will address the problem of fusing two point clouds from potentially different sources. Specifically, we will consider two problems: scale matching and feature matching. Scale matching consists of computing feature metrics of each point cloud and analyzing their distributions to determine scale differences. Feature matching consists of defining local descriptors that are invariant to common dataset distortions (e.g., rotation and translation). Additionally, after matching the point clouds they can be registered and processed further (e.g., change detection). The objective of this research is to develop novel methods to fuse and enhance two point clouds from potentially disparate sources (e.g., Lidar and crowdsourced Web 2.0 datasets). The scope of this research is to investigate both scale and feature matching between two point clouds. The specific focus of this research will be on developing a novel local descriptor based on the concept of self-similarity to aid in the scale and feature matching steps. An open problem in fusion is how best to extract features from two point clouds and then perform feature-based matching. The proposed approach for this matching step is the use of local self-similarity as an invariant measure to match features. In particular, the proposed approach is to combine the concept of local self-similarity with a well-known feature descriptor, Spin Images, and thereby define "Self-Similar Spin Images". This approach is then extended to the case of matching two point clouds in very different coordinate systems (e.g., a geo-referenced Lidar point cloud and a stereo-image derived point cloud without geo-referencing). The use of Self-Similar Spin Images is again applied to address this problem by introducing a "Self-Similar Keyscale" that matches the spatial scales of two point clouds. Another open problem is how best to detect changes in content between two point clouds. A method is proposed to find changes between two point clouds by analyzing the order statistics of the nearest neighbors between the two clouds, thereby defining the "Nearest Neighbor Order Statistic" method. Note that the well-known Hausdorff distance is a special case, being just the maximum order statistic. Therefore, by studying the entire histogram of these nearest neighbors, a more robust method to detect points that are present in one cloud but not the other is expected. This approach is applied at multiple resolutions.
Therefore, changes detected at the coarsest level will yield large missing targets and at finer levels will yield smaller targets.
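A minimal sketch of the nearest-neighbor order-statistic idea summarized above, under stated assumptions: the two clouds are synthetic and the 95th-percentile cutoff is purely illustrative, not a value from the dissertation.

```python
# Hedged sketch: nearest-neighbor distances from cloud B to cloud A and their
# order statistics; the maximum recovers the directed Hausdorff distance,
# while lower quantiles give a more robust change indicator.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(4)
cloud_a = rng.random((5000, 3))
cloud_b = rng.random((5000, 3))
cloud_b[:200] += np.array([2.0, 0.0, 0.0])   # 200 "changed" points far from A

dists, _ = cKDTree(cloud_a).query(cloud_b)   # NN distance of each B point to A
order = np.sort(dists)

print("directed Hausdorff (max order statistic):", round(float(order[-1]), 3))
print("95th percentile NN distance:", round(float(np.quantile(dists, 0.95)), 3))
# Points whose NN distance sits far out in the distribution are change candidates.
changed = np.flatnonzero(dists > np.quantile(dists, 0.95))
print("flagged points:", changed.size)
```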
Poerschmann, Juergen; Koschorreck, Matthias; Górecki, Tadeusz
2017-02-01
Natural neutralization of acidic mining lakes is often limited by organic matter. Knowledge of the sources and degradability of organic matter is crucial for understanding alkalinity generation in these lakes. Sediments collected at different depths (surface sediment layer from 0 to 1 cm and deep sediment layer from 4 to 5 cm) from an acidic mining lake were studied in order to characterize sedimentary organic matter based on neutral signature markers. Samples were exhaustively extracted, subjected to pre-chromatographic derivatizations and analyzed by GC/MS. Molecular distributions of diagnostic alkanes/alkenes, terpenes/terpenoids, polycyclic aromatic hydrocarbons, aliphatic alcohols and ketones, sterols, and hopanes/hopanoids were addressed. Characterization of the contribution of natural vs. anthropogenic sources to the sedimentary organic matter in these extreme environments was then possible based on these distributions. With the exception of polycyclic aromatic hydrocarbons, combined concentrations across all marker classes proved higher in the surface sediment layer than in the deep sediment layer. Alkane and aliphatic alcohol distributions pointed to a predominantly allochthonous over autochthonous contribution to sedimentary organic matter. Sterol patterns were dominated by phytosterols of terrestrial plants, including stigmasterol and β-sitosterol. Hopanoid markers with the ββ-biohopanoid "biological" configuration were more abundant in the surface sediment layer, which pointed to higher bacterial activity. The pattern of polycyclic aromatic hydrocarbons pointed to prevailing anthropogenic input. Pyrolytic markers were likely due to atmospheric deposition from a nearby former coal combustion facility. The combined analysis of the array of biomarkers provided new insights into the sources and transformations of organic matter in lake sediments. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Sun, Z.; Xu, Y.; Hoegner, L.; Stilla, U.
2018-05-01
In this work, we propose a classification method designed for the labeling of MLS point clouds, with detrended geometric features extracted from the points of a supervoxel-based local context. To analyze complex 3D urban scenes, the acquired points of the scene must be tagged with individual labels of different classes. Thus, assigning a unique label to the points of an object that belong to the same category plays an essential role in the entire 3D scene analysis workflow. Although plenty of studies in this field have been reported, this is still a challenging task. Specifically, in this work: 1) A novel geometric feature extraction method, which detrends the redundant and non-salient information in the local context, is proposed and proved to be effective for extracting local geometric features from the 3D scene. 2) Instead of using individual points as basic elements, the supervoxel-based local context is designed to encapsulate the geometric characteristics of points, providing a flexible and robust solution for feature extraction. 3) Experiments using a complex urban scene with manually labeled ground truth are conducted, and the performance of the proposed method relative to different methods is analyzed. With the testing dataset, we obtained an overall accuracy of 0.92 for assigning eight semantic classes.
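A minimal sketch, under explicit assumptions, of computing local geometric descriptors per grouped neighborhood as in the abstract above: plain voxels are used as a crude stand-in for supervoxels, the points are synthetic, and the covariance-eigenvalue features (linearity, planarity, scattering) are standard descriptors rather than the paper's detrended features.

```python
# Hedged sketch: points grouped into coarse voxels; per-voxel covariance
# eigenvalues yield linearity / planarity / scattering descriptors.
import numpy as np

rng = np.random.default_rng(5)
pts = rng.random((20000, 3)) * np.array([50.0, 50.0, 10.0])   # synthetic MLS points

voxel = 2.0                                                   # voxel edge length [m]
keys = np.floor(pts / voxel).astype(int)                      # non-negative indices here
flat_keys = keys[:, 0] * 10000 + keys[:, 1] * 100 + keys[:, 2]
_, inverse = np.unique(flat_keys, return_inverse=True)

features = {}
for vid in np.unique(inverse):
    group = pts[inverse == vid]
    if group.shape[0] < 10:                                   # too few points to be stable
        continue
    cov = np.cov(group.T)
    e3, e2, e1 = np.sort(np.linalg.eigvalsh(cov))             # e1 >= e2 >= e3
    features[vid] = {
        "linearity": (e1 - e2) / e1,
        "planarity": (e2 - e3) / e1,
        "scattering": e3 / e1,
    }
print("voxels with features:", len(features))
```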
Development of a Multi-Point Microwave Interferometry (MPMI) Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Specht, Paul Elliott; Cooper, Marcia A.; Jilek, Brook Anton
2015-09-01
A multi-point microwave interferometer (MPMI) concept was developed for non-invasively tracking a shock, reaction, or detonation front in energetic media. Initially, a single-point, heterodyne microwave interferometry capability was established. The design, construction, and verification of the single-point interferometer provided a knowledge base for the creation of the MPMI concept. The MPMI concept uses an electro-optic (EO) crystal to impart a time-varying phase lag onto a laser at the microwave frequency. Polarization optics converts this phase lag into an amplitude modulation, which is analyzed in a heterodyne interferometer to detect Doppler shifts in the microwave frequency. A version of the MPMI was constructed to experimentally measure the frequency of a microwave source through the EO modulation of a laser. The successful extraction of the microwave frequency proved the underlying physical concept of the MPMI design and highlighted the challenges associated with the longer microwave wavelength. The frequency measurements made with the current equipment contained too much uncertainty for an accurate velocity measurement. Potential alterations to the current construction are presented to improve the quality of the measured signal and enable multiple accurate velocity measurements.
Preliminaries toward studying resonant extraction from the Debuncher
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michelotti, Leo; Johnstone, John; /Fermilab
2009-06-01
A recent proposal to detect {mu} {yields} e direct conversion at Fermilab asks for slow extraction of protons from the antiproton source, specifically from the Debuncher. [1] A third-integer resonance originally was considered for this, partly because of the Debuncher's three-fold symmetry and partly because its operational horizontal tune, {nu}{sub x} {approx} 9.765, is already within 0.1 of {nu}{sub x} = 29/3. Using a half-integer resonance, {nu}{sub x} = 19/2, though not part of the original proposal, has been suggested more recently because (a) Fermilab has had a good deal of experience with half-integer extraction from the Tevatron, the Main Injector and the erstwhile Main Ring, and (b) for reasons we shall examine later, it depopulates the entire bunch without an abort at the end. This memo presents considerations preliminary to studying both possibilities. It is meant only as a starting point for investigations to be carried out in the future. The working constraints and assumptions have oscillated between two extremes: (1) making minimal changes in the antiproton source to minimize cost and (2) building another machine in the same tunnel. In this memo we adopt an attitude aligned more toward the first. The assumed parameters are listed in Table 1. A few are not (easily) subject to change, such as those related to the beam's momentum and revolution frequency and the acceptance of the Debuncher. Two resonance exemplars are presented in the next section, with an explanation of the analytic and semi-analytic calculations that can be done for each. Section 3 contains preliminary numerical work that was done to validate the exemplars within the context of extraction from the Debuncher. A final section contains a summary. Following the bibliography, appendices contain (a) a qualitative, conceptual discussion of extraction for the novice, (b) a telegraphic review of the perturbative incantations used to filter the exemplars as principal resonances of quadrupole, sextupole and octupole distributions, (c) a brief discussion of linearly independent control circuits, and (d) two files describing the antiproton source's rings in MAD v.8 format, not readily available elsewhere. All figures are located at the end. We emphasize again, the work reported here barely begins the effort that will be required to design, validate and perform resonant extraction from the Debuncher. Our goal was to compile these preliminary notes in one place for easy future reference, preferably by a young, intelligent, motivated and energetic graduate student.
Analysis of separation test for automatic brake adjuster based on linear Radon transformation
NASA Astrophysics Data System (ADS)
Luo, Zai; Jiang, Wensong; Guo, Bin; Fan, Weijun; Lu, Yi
2015-01-01
The linear Radon transformation is applied to extract inflection points for an online test system under noisy conditions. The linear Radon transformation has a strong ability of anti-noise and anti-interference by fitting the online test curve in several parts, which makes it easy to handle consecutive inflection points. We applied the linear Radon transformation to the separation test system to determine the separating clearance of the automatic brake adjuster. The experimental results show that the feature point extraction error of the gradient maximum optimal method is approximately ±0.100, while the feature point extraction error of the linear Radon transformation method can reach ±0.010, a lower error than the former. In addition, the linear Radon transformation is robust.
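A minimal sketch of the underlying idea of fitting curve segments as lines in a Radon/Hough parameter space and reading the inflection point from their intersection. Everything below is an illustrative assumption: the test curve is synthetic, the accumulator resolution and peak-separation rule are arbitrary, and this is not the paper's implementation.

```python
# Hedged sketch: points of a noisy piecewise-linear curve vote for lines
# rho = x*cos(theta) + y*sin(theta); the two strongest cells are intersected
# to estimate the inflection (knee) point.
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(0.0, 10.0, 400)
y = np.where(x < 6.0, 0.5 * x, 3.0 + 2.0 * (x - 6.0)) + 0.05 * rng.standard_normal(x.size)

thetas = np.deg2rad(np.arange(0.0, 180.0, 0.5))
rho = np.outer(x, np.cos(thetas)) + np.outer(y, np.sin(thetas))   # votes per point
rho_edges = np.linspace(rho.min(), rho.max(), 400)

# Accumulate votes and pick the two strongest, well-separated (theta, rho) cells.
acc = np.zeros((thetas.size, rho_edges.size - 1))
for j in range(thetas.size):
    acc[j], _ = np.histogram(rho[:, j], bins=rho_edges)
flat = np.argsort(acc, axis=None)[::-1]
peaks = []
for idx in flat:
    tj, rj = np.unravel_index(idx, acc.shape)
    if all(abs(tj - t0) > 10 or abs(rj - r0) > 10 for t0, r0 in peaks):
        peaks.append((tj, rj))
    if len(peaks) == 2:
        break

# Intersect the two detected lines to recover the inflection point.
A = np.array([[np.cos(thetas[t]), np.sin(thetas[t])] for t, _ in peaks])
b = np.array([0.5 * (rho_edges[r] + rho_edges[r + 1]) for _, r in peaks])
knee = np.linalg.solve(A, b)
print("estimated inflection point:", np.round(knee, 2), "(true ~ [6.0, 3.0])")
```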
Piedade, Tales Campos; Melo, Vander Freitas; Souza, Luiz Cláudio Paula; Dieckow, Jeferson
2014-09-01
Monitoring of heavy metal contamination plumes in soils can be helpful in establishing strategies to minimize their hazardous impacts on the environment. The objective of this study was to apply a new visualization approach, based on three-dimensional (3D) images, to pseudo-total (extracted with concentrated acids) and exchangeable (extracted with 0.5 mol L(-1) Ca(NO3)2) lead (Pb) concentrations in soils of a mining and metallurgy area, in order to determine the spatial distribution of this pollutant and to estimate the most contaminated soil volumes. Three-dimensional images were obtained after interpolation of the Pb concentrations of 171 soil samples (57 points × 3 depths) with regularized spline with tension in a 3D function version. The three-dimensional visualization showed great potential for use in environmental studies; it made it possible to determine the spatial 3D distribution of the Pb contamination plume in the area and to establish relationships with soil characteristics, landscape, and pollution sources. The most contaminated soil volumes (10,001 to 52,000 mg Pb kg(-1)) occurred near the metallurgy factory. The main contamination sources were attributed to atmospheric emissions of particulate Pb through chimneys. The large soil volume estimated to require removal to industrial landfills or co-processing highlights the difficulties of this practice as a remediation strategy.
Chemical and biological extraction of metals present in E waste: A hybrid technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pant, Deepak, E-mail: deepakpant1@rediffmail.com; Joshi, Deepika; Upreti, Manoj K.
2012-05-15
Highlights: • Hybrid methodology for E-waste management. • Efficient extraction of metals. • Trace metal extraction is possible. - Abstract: Management of metal pollution associated with E-waste is widespread across the globe. Currently used techniques for the extraction of metals from E-waste, using either chemical or biological leaching, have their own limitations. Chemical leaching is rapid and efficient but has its own environmental consequences, and even the future prospects of associated nanoremediation are uncertain. Biological leaching, on the other hand, is a comparatively cost-effective technique, but it is time consuming and complete recovery of the metal by biological leaching alone is not possible in most cases. The current review addresses the individual issues related to chemical and biological extraction techniques and proposes a hybrid methodology which incorporates both, along with safer chemicals and compatible microbes, for better and more efficient extraction of metals from E-waste.
Extracting valley-ridge lines from point-cloud-based 3D fingerprint models.
Pang, Xufang; Song, Zhan; Xie, Wuyuan
2013-01-01
3D fingerprinting is an emerging technology with the distinct advantage of touchless operation. More important, 3D fingerprint models contain more biometric information than traditional 2D fingerprint images. However, current approaches to fingerprint feature detection usually must transform the 3D models to a 2D space through unwrapping or other methods, which might introduce distortions. A new approach directly extracts valley-ridge features from point-cloud-based 3D fingerprint models. It first applies the moving least-squares method to fit a local paraboloid surface and represent the local point cloud area. It then computes the local surface's curvatures and curvature tensors to facilitate detection of the potential valley and ridge points. The approach projects those points to the most likely valley-ridge lines, using statistical means such as covariance analysis and cross correlation. To finally extract the valley-ridge lines, it grows the polylines that approximate the projected feature points and removes the perturbations between the sampled points. Experiments with different 3D fingerprint models demonstrate this approach's feasibility and performance.
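A minimal sketch of the local-quadric-fit-and-curvature step described above, with loud assumptions: the neighborhood is synthetic, the quadric is fitted by ordinary (not moving) least squares, and the surface model is only an illustration of how valley/ridge candidates can be flagged from principal curvatures.

```python
# Hedged sketch: a local quadric z = a*x^2 + b*x*y + c*y^2 + d*x + e*y + f is
# least-squares fitted to a point's neighborhood and principal curvatures are
# read off the Monge patch at the query point (0, 0).
import numpy as np

rng = np.random.default_rng(7)

# Synthetic neighborhood sampled from a "valley"-like surface z = x^2 - 0.2*y^2.
xy = rng.uniform(-0.5, 0.5, size=(200, 2))
z = xy[:, 0] ** 2 - 0.2 * xy[:, 1] ** 2 + 0.005 * rng.standard_normal(200)

# Least-squares fit of the quadric coefficients.
X, Y = xy[:, 0], xy[:, 1]
A = np.column_stack([X**2, X * Y, Y**2, X, Y, np.ones_like(X)])
(a, b, c, d, e, f), *_ = np.linalg.lstsq(A, z, rcond=None)

# Principal curvatures of the Monge patch z(x, y) at the query point.
zx, zy = d, e
zxx, zxy, zyy = 2 * a, b, 2 * c
E, F, G = 1 + zx**2, zx * zy, 1 + zy**2
denom = np.sqrt(1 + zx**2 + zy**2)
L, M, N = zxx / denom, zxy / denom, zyy / denom
H = (E * N - 2 * F * M + G * L) / (2 * (E * G - F**2))   # mean curvature
K = (L * N - M**2) / (E * G - F**2)                      # Gaussian curvature
k1 = H + np.sqrt(max(H**2 - K, 0.0))
k2 = H - np.sqrt(max(H**2 - K, 0.0))
print("principal curvatures:", round(float(k1), 2), round(float(k2), 2))
# A candidate valley (or ridge) point has one dominant curvature of consistent sign.
```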
Line segment extraction for large scale unorganized point clouds
NASA Astrophysics Data System (ADS)
Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan
2015-04-01
Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.
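A minimal sketch of the geometric core behind plane-intersection line segments, under stated assumptions: the two roughly planar patches are synthetic, plane fitting is plain SVD, and the LSHP structure itself is not reproduced here.

```python
# Hedged sketch: two planes are fitted to two point subsets via SVD, and their
# intersection line (direction + a point on it) is computed; nearby points can
# then be attached as the 3D line-support region.
import numpy as np

rng = np.random.default_rng(8)

def fit_plane(points):
    """Return (unit normal, centroid) of the best-fit plane through points."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return vt[-1], centroid            # right singular vector of smallest value = normal

# Two synthetic roughly planar patches (e.g. two building faces).
wall = np.column_stack([rng.uniform(0, 5, 500), np.zeros(500), rng.uniform(0, 3, 500)])
roof = np.column_stack([rng.uniform(0, 5, 500), rng.uniform(0, 2, 500), np.zeros(500)])
wall += 0.01 * rng.standard_normal(wall.shape)
roof += 0.01 * rng.standard_normal(roof.shape)

n1, c1 = fit_plane(wall)
n2, c2 = fit_plane(roof)

direction = np.cross(n1, n2)
direction /= np.linalg.norm(direction)

# A point on both planes: solve the two plane equations plus a gauge constraint.
A = np.vstack([n1, n2, direction])
b = np.array([n1 @ c1, n2 @ c2, 0.0])
point_on_line = np.linalg.solve(A, b)
print("intersection line:", np.round(point_on_line, 3), "+ t *", np.round(direction, 3))
```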
Classification of spatially unresolved objects
NASA Technical Reports Server (NTRS)
Nalepka, R. F.; Horwitz, H. M.; Hyde, P. D.; Morgenstern, J. P.
1972-01-01
A proportion estimation technique for the classification of multispectral scanner images is reported that uses data point averaging to extract and compute estimated proportions for a single average data point in order to classify spatially unresolved areas. Example extraction calculations of spectral signatures for bare soil, weeds, alfalfa, and barley proved quite accurate.
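A minimal sketch of a generic linear-unmixing stand-in for the proportion estimation idea above, with loud assumptions: the class signatures, mixed pixels and sum-to-one weighting are synthetic and illustrative; the original report's exact estimator is not specified here beyond data point averaging.

```python
# Hedged sketch: an averaged multispectral data point is modeled as a convex
# combination of class signatures and the proportions are recovered with
# non-negative least squares plus a sum-to-one row.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(9)
n_bands = 12
signatures = rng.random((n_bands, 4))          # bare soil, weeds, alfalfa, barley
true_props = np.array([0.1, 0.2, 0.4, 0.3])

# Average data point over an unresolved area (many noisy mixed pixels).
pixels = signatures @ true_props + 0.02 * rng.standard_normal((100, n_bands))
avg_point = pixels.mean(axis=0)

# Augment with a sum-to-one constraint (weighted heavily) and solve with NNLS.
w = 10.0
A = np.vstack([signatures, w * np.ones((1, 4))])
b = np.concatenate([avg_point, [w]])
props, _ = nnls(A, b)
print("estimated proportions:", np.round(props, 3), "| true:", true_props)
```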
Mikkonen, Hannah G; Clarke, Bradley O; Dasika, Raghava; Wallis, Christian J; Reichman, Suzie M
2017-02-15
Understanding ambient background concentrations in soil, at a local scale, is an essential part of environmental risk assessment. Where high resolution geochemical soil surveys have not been undertaken, soil data from alternative sources, such as environmental site assessment reports, can be used to support an understanding of ambient background conditions. Concentrations of metals/metalloids (As, Mn, Ni, Pb and Zn) were extracted from open-source environmental site assessment reports, for soils derived from the Newer Volcanics basalt, of Melbourne, Victoria, Australia. A manual screening method was applied to remove samples that were indicated to be contaminated by point sources and hence not representative of ambient background conditions. The manual screening approach was validated by comparison to data from a targeted background soil survey. Statistical methods for exclusion of contaminated samples from background soil datasets were compared to the manual screening method. The statistical methods tested included the Median plus Two Median Absolute Deviations, the upper whisker of a normal and log transformed Tukey boxplot, the point of inflection on a cumulative frequency plot and the 95th percentile. We have demonstrated that where anomalous sample results cannot be screened using site information, the Median plus Two Median Absolute Deviations is a conservative method for derivation of ambient background upper concentration limits (i.e. expected maximums). The upper whisker of a boxplot and the point of inflection on a cumulative frequency plot, were also considered adequate methods for deriving ambient background upper concentration limits, where the percentage of contaminated samples is <25%. Median ambient background concentrations of metals/metalloids in the Newer Volcanic soils of Melbourne were comparable to ambient background concentrations in Europe and the United States, except for Ni, which was naturally enriched in the basalt-derived soils of Melbourne. Copyright © 2016 Elsevier B.V. All rights reserved.
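A minimal sketch of the three screening statistics compared in the study above, computed on synthetic data; the lognormal parameters and contamination fraction are illustrative assumptions, not the Melbourne dataset.

```python
# Hedged sketch: median + 2*MAD, the upper whisker of a Tukey boxplot, and the
# 95th percentile on a synthetic metal-concentration dataset with a
# contaminated tail.
import numpy as np

rng = np.random.default_rng(10)
background = rng.lognormal(mean=3.0, sigma=0.4, size=450)     # ambient Pb, mg/kg (synthetic)
contaminated = rng.lognormal(mean=6.0, sigma=0.5, size=50)    # point-source impacted
conc = np.concatenate([background, contaminated])

median = np.median(conc)
mad = np.median(np.abs(conc - median))
med_2mad = median + 2.0 * mad

q1, q3 = np.percentile(conc, [25, 75])
tukey_upper = q3 + 1.5 * (q3 - q1)

p95 = np.percentile(conc, 95)

print(f"median + 2*MAD      : {med_2mad:7.1f}")
print(f"Tukey upper whisker : {tukey_upper:7.1f}")
print(f"95th percentile     : {p95:7.1f}")
# In this synthetic example the median + 2*MAD threshold is typically the
# lowest, i.e. the most conservative upper concentration limit.
```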
Galhiane, Mário S; Rissato, Sandra R; Chierice, Gilberto O; Almeida, Marcos V; Silva, Letícia C
2006-09-15
This work was carried out on a sylvestral fruit tree native to the Brazilian forest, Eugenia uniflora L., a member of the Myrtaceae family. The main goal of the analytical study was the extraction methods themselves. The method development pointed to Clevenger extraction as giving the best yield compared with SFE and Soxhlet. The SFE method presented a good yield but showed a large number of components in the final extract, demonstrating low selectivity. The essential oil extracted was analyzed by GC/FID, revealing compounds with a wide range of polarities and boiling points, among which linalool, a widely used compound, was identified. Furthermore, an analytical solid phase extraction method was used to clean up the extract and obtain separated classes of compounds that were fractionated and studied by GC/FID and GC/MS.
An inverter/controller subsystem optimized for photovoltaic applications
NASA Technical Reports Server (NTRS)
Pickrell, R. L.; Osullivan, G.; Merrill, W. C.
1978-01-01
Conversion of solar array dc power to ac power stimulated the specification, design, and simulation testing of an inverter/controller subsystem tailored to the photovoltaic power source characteristics. Optimization of the inverter/controller design is discussed as part of an overall photovoltaic power system designed for maximum energy extraction from the solar array. The special design requirements for the inverter/ controller include: a power system controller (PSC) to control continuously the solar array operating point at the maximum power level based on variable solar insolation and cell temperatures; and an inverter designed for high efficiency at rated load and low losses at light loadings to conserve energy.
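A minimal, heavily hedged illustration of tracking the maximum power point of a photovoltaic source, in the spirit of the power system controller described above: the single-diode array model, parameter values and the generic perturb-and-observe loop below are assumptions for illustration only, not the paper's actual control law.

```python
# Hedged illustration: a generic perturb-and-observe loop on a simple PV model.
import numpy as np

def pv_current(v, i_ph=8.0, i_0=1e-9, v_t=1.8):
    """Single-diode PV array model (illustrative): I = Iph - I0*(exp(V/Vt) - 1)."""
    return i_ph - i_0 * (np.exp(v / v_t) - 1.0)

def perturb_and_observe(v0=20.0, dv=0.2, steps=200):
    v, direction = v0, +1.0
    p_prev = v * pv_current(v)
    for _ in range(steps):
        v += direction * dv
        p = v * pv_current(v)
        if p < p_prev:            # power dropped -> reverse the perturbation
            direction = -direction
        p_prev = p
    return v, p_prev

v_mpp, p_mpp = perturb_and_observe()
print(f"operating point settles near V = {v_mpp:.1f} V, P = {p_mpp:.1f} W")
```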
Model-based Bayesian signal extraction algorithm for peripheral nerves
NASA Astrophysics Data System (ADS)
Eggers, Thomas E.; Dweiri, Yazan M.; McCallum, Grant A.; Durand, Dominique M.
2017-10-01
Objective. Multi-channel cuff electrodes have recently been investigated for extracting fascicular-level motor commands from mixed neural recordings. Such signals could provide volitional, intuitive control over a robotic prosthesis for amputee patients. Recent work has demonstrated success in extracting these signals in acute and chronic preparations using spatial filtering techniques. These extracted signals, however, had low signal-to-noise ratios and thus limited their utility to binary classification. In this work a new algorithm is proposed which combines previous source localization approaches to create a model based method which operates in real time. Approach. To validate this algorithm, a saline benchtop setup was created to allow the precise placement of artificial sources within a cuff and interference sources outside the cuff. The artificial source was taken from five seconds of chronic neural activity to replicate realistic recordings. The proposed algorithm, hybrid Bayesian signal extraction (HBSE), is then compared to previous algorithms, beamforming and a Bayesian spatial filtering method, on this test data. An example chronic neural recording is also analyzed with all three algorithms. Main results. The proposed algorithm improved the signal to noise and signal to interference ratio of extracted test signals two to three fold, as well as increased the correlation coefficient between the original and recovered signals by 10-20%. These improvements translated to the chronic recording example and increased the calculated bit rate between the recovered signals and the recorded motor activity. Significance. HBSE significantly outperforms previous algorithms in extracting realistic neural signals, even in the presence of external noise sources. These results demonstrate the feasibility of extracting dynamic motor signals from a multi-fascicled intact nerve trunk, which in turn could extract motor command signals from an amputee for the end goal of controlling a prosthetic limb.
Apu, Apurba Sarker; Liza, Mahmuda Sultana; Jamaluddin, A T M; Howlader, Md Amran; Saha, Repon Kumer; Rizwan, Farhana; Nasrin, Nishat
2012-09-01
To investigate the bioactivities of crude n-hexane, ethyl acetate and methanol extracts of the aerial part of Boerhavia diffusa Linn. (B. diffusa) and to perform its phytochemical analysis. The identification of phytoconstituents and the assays of antioxidant, thrombolytic, cytotoxic and antimicrobial activities were conducted using specific standard in vitro procedures. The results showed that the plant extracts were a rich source of phytoconstituents. The methanol extract showed higher antioxidant and thrombolytic activity, and lower cytotoxic activity, than the n-hexane and ethyl acetate extracts of B. diffusa. Among the bioactivities, the antioxidant activity was the most notable compared to the positive control, and the plant could thus be a potentially rich source of natural antioxidants. In the antimicrobial screening, crude extracts of the plant showed remarkable antibacterial activity against the tested microorganisms. All extracts showed significant inhibitory activity against Candida albicans at a concentration of 1000 µg/disc. The present findings suggest that the plant, widely available in Bangladesh, could be a prominent source of medicinally important natural compounds.
Inferring Models of Bacterial Dynamics toward Point Sources
Jashnsaz, Hossein; Nguyen, Tyler; Petrache, Horia I.; Pressé, Steve
2015-01-01
Experiments have shown that bacteria can be sensitive to small variations in chemoattractant (CA) concentrations. Motivated by these findings, our focus here is on a regime rarely studied in experiments: bacteria tracking point CA sources (such as food patches or even prey). In tracking point sources, the CA detected by bacteria may show very large spatiotemporal fluctuations which vary with distance from the source. We present a general statistical model to describe how bacteria locate point sources of food on the basis of stochastic event detection, rather than CA gradient information. We show how all model parameters can be directly inferred from single cell tracking data even in the limit of high detection noise. Once parameterized, our model recapitulates bacterial behavior around point sources such as the “volcano effect”. In addition, while the search by bacteria for point sources such as prey may appear random, our model identifies key statistical signatures of a targeted search for a point source given any arbitrary source configuration. PMID:26466373
An FBG acoustic emission source locating system based on PHAT and GA
NASA Astrophysics Data System (ADS)
Shen, Jing-shi; Zeng, Xiao-dong; Li, Wei; Jiang, Ming-shun
2017-09-01
Using acoustic emission locating technology to monitor structural health is important for ensuring the continuous, safe operation of complex engineering structures and large mechanical equipment. In this paper, four fiber Bragg grating (FBG) sensors are used to establish a sensor array to locate the acoustic emission source. Firstly, nonlinear locating equations are established based on the principle of acoustic emission, and the solution of these equations is transformed into an optimization problem. Secondly, a time-difference extraction algorithm based on phase transform (PHAT) weighted generalized cross-correlation provides the time delays needed for accurate localization. Finally, a genetic algorithm (GA) is used to solve the optimization model. Twenty points were tested on the surface of a marble plate, and the results show that the absolute locating error is within 10 mm, which demonstrates the accuracy of this locating method.
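As an illustration of the time-difference step described above, the following is a generic GCC-PHAT delay estimator in Python/NumPy; it is a minimal sketch of the standard technique, not the authors' implementation, and the signal pair, sampling rate fs and search window max_tau are assumed inputs.

```python
import numpy as np

def gcc_phat(sig, ref, fs, max_tau=None, interp=16):
    """Estimate the time difference of arrival between sig and ref
    using PHAT-weighted generalized cross-correlation."""
    n = sig.shape[0] + ref.shape[0]
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)
    R /= np.abs(R) + 1e-15                 # PHAT weighting: keep phase information only
    cc = np.fft.irfft(R, n=interp * n)     # zero-padded inverse FFT for sub-sample resolution
    max_shift = interp * n // 2
    if max_tau is not None:
        max_shift = min(int(interp * fs * max_tau), max_shift)
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / float(interp * fs)      # delay in seconds (positive: sig lags ref)
```

The estimated delays for each sensor pair would then feed the nonlinear locating equations solved by the GA.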
Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A
2004-11-01
The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persists as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
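The toolkit itself is a LabVIEW application; purely as an illustration of the kind of phasic heart-rate computation and artifact screening it automates, here is a minimal Python sketch (the R-peak times and the physiological limits are assumed inputs, and the rejection rule is a placeholder, not the toolkit's algorithm).

```python
import numpy as np

def instantaneous_hr(r_peak_times, hr_min=30.0, hr_max=220.0):
    """Instantaneous heart rate (beats per minute) from R-peak times in seconds,
    with a simple physiological-range check standing in for artifact handling."""
    rr = np.diff(np.asarray(r_peak_times, dtype=float))  # R-R intervals (s)
    hr = 60.0 / rr
    valid = (hr >= hr_min) & (hr <= hr_max)              # drop implausible beats
    return hr[valid]
```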
Performance of Four-Leg VSC based DSTATCOM using Single Phase P-Q Theory
NASA Astrophysics Data System (ADS)
Jampana, Bangarraju; Veramalla, Rajagopal; Askani, Jayalaxmi
2017-02-01
This paper presents single-phase P-Q theory for a four-leg VSC based distributed static compensator (DSTATCOM) in the distribution system. The proposed DSTATCOM maintains unity power factor at the source, provides zero voltage regulation, eliminates current harmonics, and performs load balancing and neutral current compensation. The advantage of the four-leg VSC based DSTATCOM is that it eliminates the isolated/non-isolated transformer connection at the point of common coupling (PCC) for neutral current compensation; eliminating the transformer connection at the PCC with the proposed topology reduces the cost of the DSTATCOM. The single-phase P-Q theory control algorithm is used to extract the fundamental components of the active and reactive currents for generation of the reference source currents, based on the indirect current control method. The proposed DSTATCOM is modelled and the results are validated for various consumer loads under unity power factor and zero voltage regulation modes in the MATLAB R2013a environment using the SimPowerSystems toolbox.
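A minimal sketch of the single-phase p-q idea (90-degree-shifted pair, instantaneous power, low-pass filtering, unity-power-factor reference current) is given below in Python; sign conventions and the filtering stage vary between implementations, and this is not the authors' controller code.

```python
import numpy as np
from scipy.signal import hilbert

def reference_source_current(v, i, fs, f_line=50.0):
    """Single-phase p-q sketch: derive the fundamental active component of the
    load power and return a reference source current in phase with the voltage."""
    va, ia = np.asarray(v, float), np.asarray(i, float)
    vb = np.imag(hilbert(va))                 # 90-degree shifted voltage
    ib = np.imag(hilbert(ia))                 # 90-degree shifted current
    p = va * ia + vb * ib                     # instantaneous real power
    win = max(int(fs / f_line), 1)            # average over one line cycle
    p_avg = np.convolve(p, np.ones(win) / win, mode="same")   # dc (fundamental active) part
    v_sq = np.maximum(va**2 + vb**2, 1e-9)
    return p_avg * va / v_sq                  # i_s,ref for unity power factor
```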
Light yield of Kuraray SCSF-78MJ scintillating fibers for the Gluex barrel calorimeter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beattie, T D; Fischer, A P; Krueger, S T
Over three quarters of a million 1-mm-diameter, 4-m-long Kuraray double-clad SCSF-78MJ (blue-green) scintillating fibers have been used in the construction of the GlueX electromagnetic barrel calorimeter for the Hall D experimental program at Jefferson Lab. The quality of a random sample of 4,750 of these fibers was evaluated by exciting the fibers at their midpoint using a 90Sr source in order to determine the light yield, with a calibrated vacuum photomultiplier as the photosensor. A novel methodology was developed to extract the number of photoelectrons detected for measurements where individual photoelectron peaks are not discernible. The average number of photoelectrons from this sample of fibers was 9.17±0.6 at a source distance of 200 cm from the PMT.
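As a point of reference only (this is the textbook mean-over-variance estimator for Poisson photoelectron statistics, not necessarily the novel method developed in the paper), the number of photoelectrons can be estimated from an unresolved charge spectrum roughly as follows; the pedestal parameters are assumed inputs.

```python
import numpy as np

def estimate_npe(charges, pedestal_mean=0.0, pedestal_var=0.0):
    """Rough photoelectron-count estimate from a charge spectrum with
    unresolved single-p.e. peaks, assuming Poisson statistics.
    Neglects the single-p.e. gain spread, which would otherwise require
    multiplying the result by (1 + v1), v1 being the relative variance
    of the single-photoelectron response."""
    mu = np.mean(charges) - pedestal_mean       # pedestal-subtracted mean charge
    var = np.var(charges) - pedestal_var        # pedestal-subtracted variance
    return mu**2 / var
```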
Paladino, Ombretta; Moranda, Arianna; Seyedsalehi, Mahdi
2017-01-01
A procedure for assessing harbour pollution by heavy metals and PAH and the possible sources of contamination is proposed. The procedure is based on a ratio-matching method applied to the results of principal component analysis (PCA), and it allows discrimination between point and nonpoint sources. The approach can be adopted when many sources of pollution, located both inside and just outside the harbour, can contribute within a very narrow coastal ecosystem, and it was used to identify the possible point sources of contamination in a Mediterranean harbour (Port of Vado, Savona, Italy). 235 sediment samples were collected at 81 sampling points during four monitoring campaigns, and 28 chemicals were searched for within the collected samples. PCA of the full sample set indicated 8 main possible point sources, while the refining ratio-matching step identified 1 sampling point as a possible PAH source, 2 sampling points as Cd point sources, and 3 sampling points as C > 12 point sources. A map analysis made it possible to attribute two internal sources of pollution directly to terminal activity. The study is the continuation of a previous work aimed at assessing Savona-Vado Harbour pollution levels and suggesting strategies to regulate the harbour activities.
Javidi, Soroush; Mandic, Danilo P.; Took, Clive Cheong; Cichocki, Andrzej
2011-01-01
A new class of complex domain blind source extraction algorithms suitable for the extraction of both circular and non-circular complex signals is proposed. This is achieved through sequential extraction based on the degree of kurtosis and in the presence of non-circular measurement noise. The existence and uniqueness analysis of the solution is followed by a study of fast converging variants of the algorithm. The performance is first assessed through simulations on well understood benchmark signals, followed by a case study on real-time artifact removal from EEG signals, verified using both qualitative and quantitative metrics. The results illustrate the power of the proposed approach in real-time blind extraction of general complex-valued sources. PMID:22319461
Kishii, Y; Kawasaki, S; Kitagawa, A; Muramatsu, M; Uchida, T
2014-02-01
A compact ECR ion source has been utilized for carbon radiotherapy. In order to increase the beam intensity through a higher electric field at the extraction electrode and to improve the stability of the ion supply over long periods, the geometry and surface conditions of the extraction electrode have been studied. Focusing on the black deposited substances observed around the extraction electrode after long-term use, the relation between these deposits and the electrical insulation property is investigated. The deposits were inspected for thickness, surface roughness, structural arrangement (examined using Raman spectroscopy), and electric-discharge characteristics in a test bench set up to simulate the ECR ion source.
Non-point source pollution is a diffuse source that is difficult to measure and is highly variable due to different rain patterns and other climatic conditions. In many areas, however, non-point source pollution is the greatest source of water quality degradation. Presently, stat...
Limonoids from Melia azedarach Fruits as Inhibitors of Flaviviruses and Mycobacterium tuberculosis.
Sanna, Giuseppina; Madeddu, Silvia; Giliberti, Gabriele; Ntalli, Nikoletta G; Cottiglia, Filippo; De Logu, Alessandro; Agus, Emanuela; Caboni, Pierluigi
2015-01-01
The biological diversity of nature is the source of a wide range of bioactive molecules. Natural products, either as pure compounds or as standardized plant extracts, have been a successful source of inspiration for the development of new drugs. The present work was carried out to investigate the cytotoxicity and the antiviral and antimycobacterial activity of the methanol extract and of four identified limonoids from the fruits of Melia azedarach (Meliaceae). The extract and purified limonoids were tested in cell-based assays for antiviral activity against representatives of ssRNA, dsRNA and dsDNA viruses and against Mycobacterium tuberculosis. Very interestingly, 3-α-tigloyl-melianol and melianone showed potent antiviral activity (EC50 in the range of 3-11 μM) against three important human pathogens of the Flaviviridae family: West Nile virus, Dengue virus and Yellow Fever virus. Mode-of-action studies demonstrated that the title compounds inhibited West Nile virus only when added during infection, acting as inhibitors of entry or of a very early event in the life cycle. Furthermore, 3-α-tigloyl-melianol and methyl kulonate showed interesting antimycobacterial activity (with MIC values of 29 and 70 μM, respectively). Limonoids are typically lipophilic compounds present in the fruits of Melia azedarach; they are known as cytotoxic compounds against different cancer cell lines, while their potential as antiviral and antibacterial agents has been poorly investigated. Our studies show that they may serve as a good starting point for the development of novel drugs for the treatment of infections by Flaviviruses and Mycobacterium tuberculosis, for which there is a continued need.
Phytomedical investigation of Najas minor All. in the view of the chemical constituents
Topuzovic, Marina D.; Radojevic, Ivana D.; Dekic, Milan S.; Radulovic, Niko S.; Vasic, Sava M.; Comic, Ljiljana R.; Licina, Braho Z.
2015-01-01
Plants are an abundant natural source of effective antibiotic compounds. Phytomedical investigations of certain plants have not yet been conducted; one of them is Najas minor (N. minor), an aquatic plant with confirmed allelopathy. The research conducted in this study examined the influence of water and ethyl acetate extracts of N. minor on microorganisms, together with chemical profiling of the volatile constituents and the concentrations of total phenolics, flavonoids and tannins. Antimicrobial activity was defined by determining minimum inhibitory and minimum microbicidal concentrations using the microdilution method. The influence on bacterial biofilm formation was assessed by the tissue culture plate method. Total phenolics, flavonoids and condensed tannins were determined by the Folin-Ciocalteu, aluminum chloride and butanol-HCl colorimetric methods, respectively. Chemical profiling of the volatile constituents was performed by GC and GC-MS. The water extract showed no antimicrobial activity below 5000 µg/mL. The ethyl acetate extract showed strong antimicrobial activity against G+ bacteria - Staphylococcus aureus PMFKGB12 and Bacillus subtilis (MIC < 78.13 µg/mL). The best antibiofilm activity was obtained against Escherichia coli ATCC25922 (BIC50 at 719 µg/mL). The water extract had the higher yield, while the ethyl acetate extract contained a significantly greater amount of total phenolics, flavonoids and tannins. Hexahydrofarnesyl acetone was identified as the major constituent. The ethyl acetate extract affected only G+ bacteria, but biofilm formation by G− bacteria was suppressed; there was a connection between these in vivo and in vitro effects against pathogenic bacterial biofilm formation. All of this points to a so far unexplored potential of N. minor. PMID:26535038
Gangopadhyay, Nirupama; Hossain, Mohammad B; Rai, Dilip K; Brunton, Nigel P
2015-06-12
Oat and barley are cereal crops mainly used as animal feed and for malting and brewing, respectively. Some studies have indicated that consumption of oat- and barley-rich foods may reduce the risk of some chronic diseases such as coronary heart disease, type II diabetes and cancer. Whilst there is no absolute consensus, some of these benefits may be linked to the presence of compounds such as phenolics, vitamin E and β-glucan in these cereals. A number of benefits have also been linked to the lipid component (sterols, fatty acids) and to the proteins and bioactive peptides in oats and barley. Since the available evidence points toward possible health benefits of oat and barley components, a number of authors have examined techniques for recovering them from their native sources. In the present review, we summarise and examine the range of conventional techniques that have been used for the extraction and detection of these bioactives. In addition, recent advances in the use of novel food processing technologies as substitutes for conventional processes for extraction of bioactives from oats and barley are discussed.
Wang, T; Yang, Z; Dong, P; Long, J D; He, X Z; Wang, X; Zhang, K Z; Zhang, L W
2012-06-01
The cold-cathode Penning ion gauge (PIG) type ion source has been used for the generation of negative hydrogen (H(-)) ions as the internal ion source of a compact cyclotron. A novel method called electrical-shielding-box dc beam measurement is described in this paper; the beam intensity was measured under dc extraction inside an electrical shielding box. The results of the trajectory simulation and the dc H(-) beam extraction measurement are presented. The effects of gas flow rate, magnetic field strength, arc current, and extraction voltage are also discussed. In conclusion, a dc H(-) beam current of about 4 mA from the PIG ion source, with a puller voltage of 40 kV and an arc current of 1.31 A, was extrapolated from measurements at low extraction dc voltages.
Development of a helicon ion source: Simulations and preliminary experiments
NASA Astrophysics Data System (ADS)
Afsharmanesh, M.; Habibi, M.
2018-03-01
In the present context, the extraction system of a helicon ion source has been simulated and constructed. Results of the ion source commissioning at up to 20 kV are presented as well as simulations of an ion beam extraction system. Argon current of more than 200 μA at up to 20 kV is extracted and is characterized with a Faraday cup and beam profile monitoring grid. By changing different ion source parameters such as RF power, extraction voltage, and working pressure, an ion beam with current distribution exhibiting a central core has been detected. Jump transition of ion beam current emerges at the RF power near to 700 W, which reveals that the helicon mode excitation has reached this power. Furthermore, measuring the emission line intensity of Ar ii at 434.8 nm is the other way we have used for demonstrating the mode transition from inductively coupled plasma to helicon. Due to asymmetrical longitudinal power absorption of a half-helix helicon antenna, it is used for the ion source development. The modeling of the plasma part of the ion source has been carried out using a code, HELIC. Simulations are carried out by taking into account a Gaussian radial plasma density profile and for plasma densities in range of 1018-1019 m-3. Power absorption spectrum and the excited helicon mode number are obtained. Longitudinal RF power absorption for two different antenna positions is compared. Our results indicate that positioning the antenna near to the plasma electrode is desirable for the ion beam extraction. The simulation of the extraction system was performed with the ion optical code IBSimu, making it the first helicon ion source extraction designed with the code. Ion beam emittance and Twiss parameters of the ellipse emittance are calculated at different iterations and mesh sizes, and the best values of the mesh size and iteration number have been obtained for the calculations. The simulated ion beam extraction system has been evaluated using optimized parameters such as the gap distance between electrodes, electrodes aperture, and extraction voltage. The gap distance, ground electrode aperture, and extraction voltage have been changed between 3 and 9 mm, 2-6.5 mm, and 10-35 kV in the simulations, respectively.
Binary Code Extraction and Interface Identification for Security Applications
2009-10-02
Excerpts: the functions extracted during the end-to-end applications and, in addition, some functions extracted from the OpenSSL library for evaluation purposes (Sections 5.1 through 5.3). For the OpenSSL functions, the false positives and negatives are measured by comparison with the original C source code; for the malware samples, no source is available.
Text-in-context: a method for extracting findings in mixed-methods mixed research synthesis studies.
Sandelowski, Margarete; Leeman, Jennifer; Knafl, Kathleen; Crandell, Jamie L
2013-06-01
Our purpose in this paper is to propose a new method for extracting findings from research reports included in mixed-methods mixed research synthesis studies. International initiatives in the domains of systematic review and evidence synthesis have been focused on broadening the conceptualization of evidence, increased methodological inclusiveness and the production of evidence syntheses that will be accessible to and usable by a wider range of consumers. Initiatives in the general mixed-methods research field have been focused on developing truly integrative approaches to data analysis and interpretation. The data extraction challenges described here were encountered, and the method proposed for addressing these challenges was developed, in the first year of the ongoing (2011-2016) study: Mixed-Methods Synthesis of Research on Childhood Chronic Conditions and Family. To preserve the text-in-context of findings in research reports, we describe a method whereby findings are transformed into portable statements that anchor results to relevant information about sample, source of information, time, comparative reference point, magnitude and significance and study-specific conceptions of phenomena. The data extraction method featured here was developed specifically to accommodate mixed-methods mixed research synthesis studies conducted in nursing and other health sciences, but reviewers might find it useful in other kinds of research synthesis studies. This data extraction method itself constitutes a type of integration to preserve the methodological context of findings when statements are read individually and in comparison to each other. © 2012 Blackwell Publishing Ltd.
Concepts, laboratory, and telescope test results of the plenoptic camera as a wavefront sensor
NASA Astrophysics Data System (ADS)
Rodríguez-Ramos, L. F.; Montilla, I.; Fernández-Valdivia, J. J.; Trujillo-Sevilla, J. L.; Rodríguez-Ramos, J. M.
2012-07-01
The plenoptic camera has been proposed as an alternative wavefront sensor adequate for extended objects within the context of the design of the European Solar Telescope (EST), but it can also be used with point sources. Originating in the field of electronic photography, the plenoptic camera directly samples the light field function, the four-dimensional representation of all the light entering a camera. Image formation can then be seen as the result of the photography operator applied to this function, and many other features of the light field can be exploited to extract information about the scene, such as depth computation for 3D imaging or, as specifically addressed in this paper, wavefront sensing. The underlying concept of the plenoptic camera can be adapted to a telescope by placing a lenslet array of the same f-number at the focal plane, thus obtaining at the detector a set of pupil images corresponding to every sampled point of view. This approach generalizes the Shack-Hartmann, curvature and pyramid wavefront sensors, in the sense that all of those can be considered particular cases of the plenoptic wavefront sensor, because the information they need as a starting point can be derived from the plenoptic image. Laboratory results obtained with extended objects, phase plates and commercial interferometers, and telescope observations using stars and the Moon as an extended object, are presented in the paper, clearly showing the capability of the plenoptic camera to behave as a wavefront sensor.
In-situ continuous water analyzing module
Thompson, Cyril V.; Wise, Marcus B.
1998-01-01
An in-situ continuous liquid analyzing system for continuously analyzing volatile components contained in a water source comprises a carrier gas supply, an extraction container and a mass spectrometer. The carrier gas supply continuously supplies the carrier gas to the extraction container, where it is mixed with a water sample that is continuously drawn into the extraction container. The carrier gas continuously extracts the volatile components out of the water sample, and the water sample is returned to the water source after the volatile components are extracted from it. The extracted volatile components and the carrier gas are delivered continuously to the mass spectrometer, and the volatile components are continuously analyzed by the mass spectrometer.
Automatic Extraction of Road Markings from Mobile Laser Scanning Data
NASA Astrophysics Data System (ADS)
Ma, H.; Pei, Z.; Wei, Z.; Zhong, R.
2017-09-01
Road markings, as critical features in the high-definition maps required by Advanced Driver Assistance Systems (ADAS) and self-driving technology, have important functions in providing guidance and information to moving cars. A mobile laser scanning (MLS) system is an effective way to obtain 3D information of the road surface, including road markings, at highway speeds and at less than traditional survey costs. This paper presents a novel method to automatically extract road markings from MLS point clouds. Ground points are first filtered from the raw input point cloud using a neighborhood elevation consistency method, under the basic assumption that the road surface is smooth: points with small elevation differences from their neighborhood are considered ground points. The ground points are then partitioned into a set of profiles according to the trajectory data. The intensity histogram of the points in each profile is generated to find intensity jumps above a threshold that varies inversely with laser distance. The separated points are used as seeds for intensity-based region growing so as to obtain complete road markings. We use a point cloud template-matching method to refine the road marking candidates by removing noise clusters with low correlation coefficients. In an experiment with an MLS point set covering about 2 kilometres of a city center, our method provides a promising solution to road marking extraction from MLS data.
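Purely as an illustration of the neighbourhood-elevation-consistency idea used for ground filtering (a minimal Python sketch, not the authors' implementation; the search radius and elevation tolerance are assumed parameters):

```python
import numpy as np
from scipy.spatial import cKDTree

def ground_filter(points, radius=0.5, dz_max=0.05):
    """Keep points whose elevation agrees with their horizontal neighbourhood,
    a simple reading of 'neighbourhood elevation consistency' for a smooth road.
    points: (N, 3) array of x, y, z coordinates."""
    tree = cKDTree(points[:, :2])                     # index on horizontal coordinates
    keep = np.zeros(len(points), dtype=bool)
    for i, (xy, z) in enumerate(zip(points[:, :2], points[:, 2])):
        idx = tree.query_ball_point(xy, radius)       # neighbours within the radius
        z_ref = np.median(points[idx, 2])             # robust local elevation estimate
        keep[i] = abs(z - z_ref) < dz_max
    return keep
```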
Extraction of space-charge-dominated ion beams from an ECR ion source: Theory and simulation
NASA Astrophysics Data System (ADS)
Alton, G. D.; Bilheux, H.
2004-05-01
Extraction of high quality space-charge-dominated ion beams from plasma ion sources constitutes an optimization problem centered on finding an optimal concave plasma emission boundary that minimizes half-angular divergence for a given charge state, independent of whether a magnetic field is present in the extraction region. The curvature of the emission boundary acts to converge/diverge the low-velocity beam during extraction. Beams of highest quality are extracted whenever the half-angular divergence, ω, is minimized. Under minimum half-angular divergence conditions, the plasma emission boundary has an optimum curvature, and the perveance, P, current density, j+ext, and extraction gap, d, have optimum values for a given charge state, q. Optimum values for each of the independent variables (P, j+ext and d) are found to be in close agreement with those derived from elementary analytical theory for extraction with a simple two-electrode extraction system, independent of the presence of a magnetic field. The magnetic field only increases the emittances of beams through additional aberrational effects caused by increased angular divergences, through coupling of the longitudinal to the transverse velocity components of particles as they pass through the mirror region of the electron cyclotron resonance (ECR) ion source. This article reviews the underlying theory of elementary extraction optics and presents results derived from simulation studies of the extraction of space-charge-dominated heavy-ion beams of varying mass, charge state, and intensity from an ECR ion source, with emphasis on magnetic-field-induced effects.
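For orientation, a hedged reminder of the elementary planar-diode relations that the "elementary analytical theory" refers to in general terms, the Child-Langmuir space-charge-limited current density and the perveance (the exact optimum expressions used in the paper may differ):

\[
j_{+}^{\mathrm{ext}} \;=\; \frac{4\varepsilon_0}{9}\sqrt{\frac{2qe}{M}}\;\frac{V^{3/2}}{d^{2}},
\qquad
P \;\equiv\; \frac{I}{V^{3/2}} \;=\; \frac{4\varepsilon_0}{9}\sqrt{\frac{2qe}{M}}\;\frac{\pi r^{2}}{d^{2}},
\]

where V is the extraction voltage, d the extraction gap, r the emission-aperture radius, and qe/M the charge-to-mass ratio of the extracted ion.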
Single-trial event-related potential extraction through one-unit ICA-with-reference
NASA Astrophysics Data System (ADS)
Lee, Wee Lih; Tan, Tele; Falkmer, Torbjörn; Leung, Yee Hong
2016-12-01
Objective. In recent years, ICA has been one of the more popular methods for extracting event-related potentials (ERPs) at the single-trial level. It is a blind source separation technique that allows the extraction of an ERP without making strong assumptions on the temporal and spatial characteristics of the ERP. However, the problem with traditional ICA is that the extraction is not direct and is time-consuming due to the need for source selection processing. In this paper, the application of a one-unit ICA-with-Reference (ICA-R), a constrained ICA method, is proposed. Approach. In cases where the time-region of the desired ERP is known a priori, this time information is utilized to generate a reference signal, which is then used to guide the one-unit ICA-R to extract the source signal of the desired ERP directly. Main results. Our results showed that, compared to traditional ICA, ICA-R is a more effective method for analysing ERPs because it avoids manual source selection and requires less computation, resulting in faster ERP extraction. Significance. In addition, since the method is automated, it reduces the risk of subjective bias in the ERP analysis. It is also a potential tool for extracting the ERP in online applications.
Bendif, Hamdi; Adouni, Khaoula; Miara, Mohamed Djamel; Baranauskienė, Renata; Kraujalis, Paulius; Venskutonis, Petras Rimantas; Nabavi, Seyed Mohammad; Maggi, Filippo
2018-09-15
The aim of this study was to demonstrate the potential of extracts from Algerian Thymus munbyanus as a valuable source of antioxidants for use on an industrial level. To this end, a study was conducted on the composition and antioxidant activities of essential oils (EOs), pressurized liquid extracts (PLE) and supercritical fluid extracts (SFE-CO2) obtained from Thymus munbyanus subsp. coloratus (TMC) and subsp. munbyanus (TMM). EOs and SFE-CO2 extracts were analysed by GC-FID and GC×GC-TOFMS, revealing significant differences. The solid SFE-CO2 residue was then successively extracted by PLE with solvents of increasing polarity (acetone, ethanol and water). The extracts were evaluated for total phenolic content by the Folin-Ciocalteu assay, while the antioxidant power was assessed by DPPH, FRAP, and ORAC assays. SFE-CO2 extracts were also analysed for their tocopherol content. The antioxidant activity of the PLE extracts was found to be higher than that of the SFE-CO2 extracts, and it increased with solvent polarity (water > ethanol > acetone). Overall, these results support the use of T. munbyanus as a valuable source of substances to be used on an industrial level as preservative agents. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Tan, J. P.; Jahim, J. M.; Wu, T. Y.; Harun, S.; Mumtaz, T.
2016-06-01
Expensive raw materials are the driving force behind the shift from petroleum-based succinic acid production to bio-based succinic acid production by microorganisms. The cost of the fermentation medium is among the main factors contributing to the total production cost of bio-succinic acid. After the carbon source, the nitrogen source is the second largest component of the fermentation medium, the cost of which has been overlooked in past years. The current study aimed at replacing yeast extract, a costly nitrogen source, with corn steep liquor for economical production of bio-succinic acid by Actinobacillus succinogenes 130Z. In this study, a final succinic acid concentration of 20.6 g/L was obtained using corn steep liquor as the nitrogen source, comparable with the 21.4 g/L obtained using yeast extract as the nitrogen source. Economically, corn steep liquor was priced at 200/ton, one fifth of the cost of yeast extract at 1000/ton. Therefore, corn steep liquor can be considered a potential nitrogen source for biochemical industries in place of the costly yeast extract.
Noradilah, Samseh Abdullah; Lee, Ii Li; Anuar, Tengku Shahrul; Salleh, Fatmah Md; Abdul Manap, Siti Nor Azreen; Mohd Mohtar, Noor Shazleen Husnie; Azrul, Syed Muhamad; Abdullah, Wan Omar
2016-01-01
In the tropics, there are too few studies on the isolation of Blastocystis sp. subtypes from water sources, and there are no reported studies on the occurrence of Blastocystis sp. subtypes in water during different seasons. Therefore, this study aimed to determine the occurrence of Blastocystis sp. subtypes in river water and other water sources draining an aboriginal vicinity highly endemic for intestinal parasitic infections, during the wet and dry seasons. Water samples were collected from six sampling points of Sungai Krau (K1–K6), a point at Sungai Lompat (K7), and other water sources around the aboriginal villages, during both seasons. Filtration of the water samples was carried out using a flatbed membrane filtration system. The DNA extracted from the concentrated water sediment was subjected to single-round polymerase chain reaction, and positive PCR products were sequenced. All samples were also filtered and cultured on membrane lactose glucuronide agar for the detection of faecal coliforms. During the wet season, Blastocystis sp. ST1, ST2 and ST3 were detected in river water samples. The occurrence of Blastocystis sp. ST3 was sustained in the river water samples during the dry season, whereas Blastocystis sp. ST1 and ST2 were absent. Water samples collected from the various other water sources showed contamination with Blastocystis sp. ST1, ST2, ST3 and ST4 during the wet season and Blastocystis sp. ST1, ST3, ST8 and ST10 during the dry season. Water collected from all river sampling points during both seasons showed growth of Escherichia coli and Enterobacter aerogenes, indicating faecal contamination. In this study, Blastocystis sp. ST3 is suggested as the most robust and resistant subtype, able to survive in adverse environmental conditions. Restricting and controlling human and animal faecal contamination of the river and other water sources should prevent the transmission of Blastocystis sp. to humans and animals in this aboriginal community. PMID:27761331
Deuterium results at the negative ion source test facility ELISE
NASA Astrophysics Data System (ADS)
Kraus, W.; Wünderlich, D.; Fantz, U.; Heinemann, B.; Bonomo, F.; Riedl, R.
2018-05-01
The ITER neutral beam system will be equipped with large radio frequency (RF) driven negative ion sources, with a cross section of 0.9 m × 1.9 m, which have to deliver extracted D- ion beams of 57 A at 1 MeV for 1 h. At the Extraction from a Large Ion Source Experiment (ELISE) test facility, a source of half this size has been operational since 2013. The goal of this experiment is to demonstrate high operational reliability and to achieve the extracted current densities and beam properties required for ITER. Technical improvements of the source design and the RF system were necessary to provide reliable operation in steady state with an RF power of up to 300 kW. While in short pulses the required D- current density has almost been reached, the performance in long pulses, particularly in deuterium, is limited by inhomogeneous and unstable currents of co-extracted electrons. By applying refined caesium evaporation and distribution procedures, and by reducing and symmetrizing the electron currents, considerable progress has been made: up to 190 A/m2 of D-, corresponding to 66% of the value required for ITER, has been extracted for 45 min.
An improved DPSM technique for modelling ultrasonic fields in cracked solids
NASA Astrophysics Data System (ADS)
Banerjee, Sourav; Kundu, Tribikram; Placko, Dominique
2007-04-01
In recent years the Distributed Point Source Method (DPSM) has been used for modelling various ultrasonic, electrostatic and electromagnetic field problems. In conventional DPSM, several point sources are placed near the transducer face, interfaces and anomaly boundaries, and the ultrasonic or electromagnetic field at any point is computed by superimposing the contributions of the strategically placed layers of point sources. The conventional DPSM technique is modified in this paper so that the contributions of point sources in the shadow region can be removed from the calculations. For this purpose the conventional point sources that radiate in all directions are replaced by Controlled Space Radiation (CSR) sources, which address the shadow-region problem to some extent; complete removal of the problem is achieved by introducing artificial interfaces. Numerically synthesized fields obtained by the conventional DPSM technique, which gives no special consideration to point sources in the shadow region, and by the proposed modified technique, which nullifies their contributions, are compared. One application of this research is improved modelling of real-time ultrasonic non-destructive evaluation experiments.
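To make the superposition step concrete, a minimal Python sketch of the conventional DPSM synthesis (summing free-space point-source Green's functions) is given below; the source strengths, wavenumber and geometry are assumed inputs, and the CSR modification and shadow-region handling described in the paper are not included.

```python
import numpy as np

def dpsm_field(targets, sources, strengths, k):
    """Superpose spherical point-source contributions exp(ikr)/r at the target
    points, the basic synthesis step of conventional DPSM.
    targets: (M, 3) points, sources: (N, 3) points, strengths: (N,) complex, k: wavenumber."""
    r = np.linalg.norm(targets[:, None, :] - sources[None, :, :], axis=-1)
    r = np.maximum(r, 1e-9)               # guard against the singularity at r = 0
    green = np.exp(1j * k * r) / r        # free-space Green's function for each pair
    return green @ strengths              # complex field value at each target point
```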
On the assessment of spatial resolution of PET systems with iterative image reconstruction
NASA Astrophysics Data System (ADS)
Gong, Kuang; Cherry, Simon R.; Qi, Jinyi
2016-03-01
Spatial resolution is an important metric for performance characterization in PET systems. Measuring spatial resolution is straightforward with a linear reconstruction algorithm, such as filtered backprojection, and can be performed by reconstructing a point source scan and calculating the full-width-at-half-maximum (FWHM) along the principal directions. With the widespread adoption of iterative reconstruction methods, it is desirable to quantify the spatial resolution using an iterative reconstruction algorithm. However, the task can be difficult because the reconstruction algorithms are nonlinear and the non-negativity constraint can artificially enhance the apparent spatial resolution if a point source image is reconstructed without any background. Thus, it was recommended that a background should be added to the point source data before reconstruction for resolution measurement. However, there has been no detailed study on the effect of the point source contrast on the measured spatial resolution. Here we use point source scans from a preclinical PET scanner to investigate the relationship between measured spatial resolution and the point source contrast. We also evaluate whether the reconstruction of an isolated point source is predictive of the ability of the system to resolve two adjacent point sources. Our results indicate that when the point source contrast is below a certain threshold, the measured FWHM remains stable. Once the contrast is above the threshold, the measured FWHM monotonically decreases with increasing point source contrast. In addition, the measured FWHM also monotonically decreases with iteration number for maximum likelihood estimate. Therefore, when measuring system resolution with an iterative reconstruction algorithm, we recommend using a low-contrast point source and a fixed number of iterations.
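For reference, measuring the FWHM of a reconstructed point-source profile is straightforward; the following is a minimal Python sketch (a 1-D profile through the point source and the pixel size are assumed inputs, and the crude background handling is illustrative only).

```python
import numpy as np

def fwhm(profile, pixel_size=1.0):
    """Full width at half maximum of a 1-D point-source profile,
    using linear interpolation at the half-maximum crossings."""
    y = np.asarray(profile, dtype=float)
    y = y - y.min()                          # crude background removal
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i0, i1 = above[0], above[-1]
    left = i0 - (y[i0] - half) / (y[i0] - y[i0 - 1]) if i0 > 0 else float(i0)
    right = i1 + (y[i1] - half) / (y[i1] - y[i1 + 1]) if i1 < len(y) - 1 else float(i1)
    return (right - left) * pixel_size
```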
Neutron sources for investigations on extracted beams in Russia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aksenov, V. L.
An overview is presented of the current status and prospects for the development of neutron sources intended for investigations on extracted beams in Russia. The participation of Russia in international scientific organizations is demonstrated.
NASA Astrophysics Data System (ADS)
Tripathi, Nishant; Pavelyev, Vladimir; Islam, S. S.
2017-11-01
A green catalyst derived from plants, a cheap and abundant natural source, is used for the synthesis of multi-walled carbon nanotubes (MWNTs). The approach is unconventional and is realized by simple CVD growth, while avoiding the potential hazards posed by metal catalysts to the environment and living organisms. The notable points of this growth are: (1) the grown CNTs are free from toxic metal catalyst; (2) a low growth temperature (575 °C) is required and the yield is high compared with any other catalyst reported so far; and (3) no expensive or complex systems are needed for the synthesis. In addition, growth of SWNTs as well as carbon nano-belts with a hollow rectangular cross-section is observed when the growth temperature is increased to 800 °C, specifically for the walnut extract. The samples were characterized by microscopic and spectroscopic analysis, and the results verified our study. The present work provides an innovative technique and may open up new avenues for CNT synthesis and its applications.
Benelli, Giovanni; Chandramohan, Balamurugan; Murugan, Kadarkarai; Madhiyazhagan, Pari; Kovendan, Kalimuthu; Panneerselvam, Chellasamy; Dinesh, Devakumar; Govindarajan, Marimuthu; Higuchi, Akon; Toniolo, Chiara; Canale, Angelo; Nicoletti, Marcello
2017-05-01
Mosquitoes are insects of huge public health importance, since they act as vectors for important pathogens and parasites. Here, we focused on the possibility of using neem cake in the fight against mosquito vectors. The chemical composition of neem cake varies significantly among producers, as evidenced by our HPTLC (high-performance thin-layer chromatography) analyses of different marketed products. Neem cake extracts were tested to evaluate ovicidal, larvicidal and adulticidal activity against the rural malaria vector Anopheles culicifacies. The ovicidal activity of both types of extracts was statistically significant, and 150 ppm completely inhibited egg hatching. LC50 values were extremely low against fourth-instar larvae, ranging from 1.321 (NM1) to 1.818 ppm (NA2). Adulticidal activity was also high, with LC50 ranging from 3.015 (NM1) to 3.637 ppm (NM2). This study points out the utility of neem cake as a source of eco-friendly mosquitocides in Anopheline vector control programmes.
Schultz, M.M.; Furlong, E.T.
2008-01-01
Treated wastewater effluent is a potential environmental point source for antidepressant pharmaceuticals. A quantitative method was developed for the determination of trace levels of antidepressants in environmental aquatic matrixes using solid-phase extraction coupled with liquid chromatography-electrospray ionization tandem mass spectrometry. Recoveries of parent antidepressants from matrix spiking experiments for the individual antidepressants ranged from 72 to 118% at low concentrations (0.5 ng/L) and 70 to 118% at high concentrations (100 ng/L) for the solid-phase extraction method. Method detection limits for the individual antidepressant compounds ranged from 0.19 to 0.45 ng/L. The method was applied to wastewater effluent and samples collected from a wastewater-dominated stream. Venlafaxine was the predominant antidepressant observed in wastewater and river water samples. Individual antidepressant concentrations found in the wastewater effluent ranged from 3 (duloxetine) to 2190 ng/L (venlafaxine), whereas individual concentrations in the waste-dominated stream ranged from 0.72 (norfluoxetine) to 1310 ng/L (venlafaxine). © 2008 American Chemical Society.
Aguirre, Ana-Maria; Bassi, Amarjeet
2014-07-01
Biofuels from algae are considered a technically viable energy source that overcomes several of the problems present in previous generations of biofuels. In this research, high-pressure steaming (HPS) was studied as a hydrothermal pre-treatment for extraction of lipids from Chlorella vulgaris, and analysis by response surface methodology allowed operating points to be identified, in terms of target temperature and algae concentration, for high lipid and glucose yields. Within the range covered by these experiments, the best conditions for high bio-crude yield are temperatures higher than 174°C and low biomass concentrations (<5 g/L). For high glucose yield there are two suitable operational ranges: either low temperatures (<105°C) and low biomass concentrations (<4 g/L), or low temperatures (<105°C) and high biomass concentrations (<110 g/L). High-pressure steaming is a good hydrothermal treatment for lipid recovery and does not significantly change the fatty acid profile over the range of temperatures studied. Copyright © 2014 Elsevier Ltd. All rights reserved.
Automated control of robotic camera tacheometers for measurements of industrial large scale objects
NASA Astrophysics Data System (ADS)
Heimonen, Teuvo; Leinonen, Jukka; Sipola, Jani
2013-04-01
Modern robotic tacheometers equipped with digital cameras (also called imaging total stations) and capable of reflectorless measurement offer new possibilities to gather 3D data. In this paper an automated approach for the tacheometer measurements needed in the dimensional control of industrial large-scale objects is proposed. The approach has two new contributions: the automated extraction of the vital points (i.e. the points to be measured) and the automated fine aiming of the tacheometer. The proposed approach proceeds through the following steps. First, the coordinates of the vital points are automatically extracted from computer-aided design (CAD) data. The extracted design coordinates are then used to aim the tacheometer at the designed location of the points, one after another. However, due to deviations between the designed and the actual location of the points, the aiming needs to be adjusted; an automated, dynamic, image-based look-and-move servoing architecture is proposed for this task. After successful fine aiming, the actual coordinates of the point in question can be measured automatically using the measuring functionalities of the tacheometer. The approach was validated experimentally and found to be feasible. On average, 97% of the points actually measured in four different shipbuilding measurement cases were indeed proposed as vital points by the automated extraction algorithm. The accuracy of the results obtained with the automatic control method of the tacheometer was comparable to that obtained with manual control, and the reliability of the image processing step was found to be high in the laboratory experiments.
Liu, Jing-fu; Liu, Rui; Yin, Yong-guang; Jiang, Gui-bin
2009-03-28
Capable of preserving the sizes and shapes of nanomaterials during the phase transferring, Triton X-114 based cloud point extraction provides a general, simple, and cost-effective route for reversible concentration/separation or dispersion of various nanomaterials in the aqueous phase.
Automatic extraction of discontinuity orientation from rock mass surface 3D point cloud
NASA Astrophysics Data System (ADS)
Chen, Jianqin; Zhu, Hehua; Li, Xiaojun
2016-10-01
This paper presents a new method for extracting discontinuity orientations automatically from a 3D point cloud of a rock mass surface. The proposed method consists of four steps: (1) automatic grouping of discontinuity sets using an improved K-means clustering method, (2) discontinuity segmentation and optimization, (3) discontinuity plane fitting using the Random Sample Consensus (RANSAC) method, and (4) coordinate transformation of the discontinuity plane. The method is first validated on the point cloud of a small piece of a rock slope acquired by photogrammetry, with the extracted discontinuity orientations compared against orientations measured in the field. It is then applied to publicly available LiDAR data of a road-cut rock slope from the Rockbench repository, and the extracted discontinuity orientations are compared with those of the method proposed by Riquelme et al. (2014). The results show that the presented method is reliable, of high accuracy, and able to meet engineering needs.
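A minimal Python sketch of steps (3) and (4), plane fitting with a basic RANSAC loop and conversion of the plane normal to dip and dip direction, is given below; it assumes x = east, y = north, z = up coordinates, and the tolerance and iteration count are illustrative parameters, not the authors' settings.

```python
import numpy as np

def ransac_plane(points, n_iter=500, tol=0.02, seed=0):
    """Fit a plane to (N, 3) points with a basic RANSAC loop.
    Returns the unit normal, a point on the plane and the inlier mask."""
    rng = np.random.default_rng(seed)
    best_n, best_p0 = np.array([0.0, 0.0, 1.0]), points[0]
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iter):
        p0, p1, p2 = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(p1 - p0, p2 - p0)
        if np.linalg.norm(n) < 1e-12:           # degenerate (collinear) sample
            continue
        n = n / np.linalg.norm(n)
        inliers = np.abs((points - p0) @ n) < tol
        if inliers.sum() > best_inliers.sum():
            best_n, best_p0, best_inliers = n, p0, inliers
    return best_n, best_p0, best_inliers

def dip_and_dip_direction(normal):
    """Convert a unit plane normal (x=E, y=N, z=up) to dip angle and dip direction (degrees)."""
    n = normal if normal[2] >= 0 else -normal   # force the upward-pointing normal
    dip = np.degrees(np.arccos(n[2]))
    dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip, dip_dir
```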
Automatic Extraction of Road Markings from Mobile Laser-Point Cloud Using Intensity Data
NASA Astrophysics Data System (ADS)
Yao, L.; Chen, Q.; Qin, C.; Wu, H.; Zhang, S.
2018-04-01
With the development of intelligent transportation, high-precision road information has been widely applied in many fields. This paper proposes a concise and practical way to extract road marking information from point cloud data collected by a mobile mapping system (MMS). The method contains three steps. Firstly, the road surface is segmented through edge detection on scan lines. Then an intensity image is generated by inverse distance weighted (IDW) interpolation, and the road markings are extracted using adaptive threshold segmentation based on an integral image, without intensity calibration; noise is reduced by removing small plaque pixels from the binary image. Finally, the point cloud mapped from the binary image is clustered into marking objects according to Euclidean distance, and a series of algorithms including template matching and feature-attribute filtering is used for the classification of linear markings, arrow markings and guidelines. Processing the point cloud data collected by a RIEGL VUX-1 in the case area shows that the F-score of marking extraction is 0.83 and the average classification rate is 0.9.
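As an illustration of adaptive threshold segmentation driven by an integral image (a minimal sketch of the general Bradley-style technique in Python, not the authors' exact segmentation; the window size and ratio are assumed parameters):

```python
import numpy as np

def adaptive_threshold(img, win=31, ratio=0.85):
    """Mark a pixel as road-marking candidate if its intensity exceeds `ratio`
    times the local mean over a win x win window, computed via an integral image."""
    img = np.asarray(img, dtype=np.float64)
    h, w = img.shape
    pad = win // 2
    ii = np.pad(img, ((1, 0), (1, 0)), mode="constant").cumsum(0).cumsum(1)  # integral image
    y, x = np.mgrid[0:h, 0:w]
    y0, y1 = np.clip(y - pad, 0, h), np.clip(y + pad + 1, 0, h)
    x0, x1 = np.clip(x - pad, 0, w), np.clip(x + pad + 1, 0, w)
    area = (y1 - y0) * (x1 - x0)
    local_sum = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]             # window sums
    return img * area > ratio * local_sum                                     # boolean mask
```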
NASA Astrophysics Data System (ADS)
Kelkboom, Emile J. C.; Breebaart, Jeroen; Buhan, Ileana; Veldhuis, Raymond N. J.
2010-04-01
Template protection techniques are used within biometric systems in order to protect the stored biometric template against privacy and security threats. A great portion of template protection techniques are based on extracting a key from, or binding a key to, a biometric sample. The achieved protection depends on the size of the key and its closeness to being random. In the literature, a large variation in the reported key lengths can be observed at similar classification performance of the same template protection system, even when based on the same biometric modality and database. In this work we determine the analytical relationship between the system performance and the theoretical maximum key size, given a biometric source modeled by parallel Gaussian channels. We consider the case where the source capacity is evenly distributed across all channels and the channels are independent. We also determine the effect of parameters such as the source capacity, the number of enrolment and verification samples, and the operating point selection on the maximum key size. We show that a trade-off exists between the privacy protection of the biometric system and its convenience for its users.
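As a hedged point of reference (this is the textbook parallel-Gaussian-channel capacity, not necessarily the exact bound derived in the paper): for N independent channels with per-channel signal-to-noise ratio SNR_i, the maximum key size is limited by the total capacity,

\[
k_{\max} \;\le\; \sum_{i=1}^{N} \tfrac{1}{2}\log_2\!\left(1 + \mathrm{SNR}_i\right)
\;=\; \tfrac{N}{2}\log_2\!\left(1 + \mathrm{SNR}\right),
\]

where the second equality holds under the paper's assumption of capacity evenly distributed across channels. Averaging multiple enrolment or verification samples effectively raises the per-channel SNR, which is one way the sample counts enter such a bound.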
NASA Astrophysics Data System (ADS)
Eilerman, S. J.; Peischl, J.; Neuman, J. A.; Ryerson, T. B.; Wild, R. J.; Perring, A. E.; Brown, S. S.; Aikin, K. C.; Holloway, M.; Roberts, O.
2014-12-01
Atmospheric emissions from agriculture are important to air quality and climate, yet their representation in inventories is incomplete. Increased fertilizer use has led to increased emissions of nitrogen compounds, which can adversely affect ecosystems and contribute to the formation of fine particulates. Furthermore, extraction and processing of oil and natural gas continue to expand throughout northeastern Colorado; emissions from these operations require ongoing measurement and characterization. This presentation summarizes initial data and analysis from a summer 2014 campaign to study emissions of nitrogen compounds, methane, and other species in northeastern Colorado using a new mobile laboratory. A van was instrumented to measure NH3, N2O, NOx, NOy, CH4, CO, CO2, O3, and bioaerosols with high time resolution. By sampling in close proximity to a variety of emission sources, the mobile laboratory facilitated accurate source identification and quantification of emission ratios. Measurements were obtained near agricultural sites, natural gas and oil operations, and other point sources. Additionally, extensive measurements were obtained downwind from urban areas and along roadways. The relationship between ammonia and other trace gases is used to characterize sources and constrain emissions inventories.
Kachangoon, Rawikan; Vichapong, Jitlada; Burakham, Rodjana; Santaladchaiyakit, Yanawath; Srijaranai, Supalax
2018-05-12
An effective pre-concentration method, namely amended cloud point extraction (CPE), has been developed for the extraction and pre-concentration of neonicotinoid insecticide residues. The studied analytes, including clothianidin, imidacloprid, acetamiprid, thiamethoxam and thiacloprid, were chosen as model compounds. The amended-CPE procedure included two cloud point processes: Triton™ X-114 was used to extract neonicotinoid residues into the surfactant-rich phase, and the analytes were then transferred into an alkaline solution with the help of ultrasound energy. The extracts were analyzed by high-performance liquid chromatography (HPLC) coupled with a monolithic column. Several factors influencing the extraction efficiency were studied, such as the kind and concentration of surfactant, the type and content of salts, the kind and concentration of the back-extraction agent, and the incubation temperature and time. Enrichment factors (EFs) were in the range of 20-333-fold. The limits of detection of the studied neonicotinoids were in the range of 0.0003-0.002 µg mL⁻¹, below the maximum residue limits (MRLs) established by the European Union (EU). Good repeatability was obtained, with relative standard deviations lower than 1.92% and 4.54% for retention time (tR) and peak area, respectively. The developed extraction method was successfully applied to the analysis of water samples. No detectable residues of neonicotinoids were found in the studied samples.
Extraction of Extended Small-Scale Objects in Digital Images
NASA Astrophysics Data System (ADS)
Volkov, V. Y.
2015-05-01
The problem of detecting and localizing extended small-scale objects of different shapes arises in observation systems that use SAR, infrared, lidar and television cameras. An intense non-stationary background is the main difficulty for processing. A further challenge is the low quality of the images, with blobs and blurred boundaries; in addition, SAR images suffer from serious intrinsic speckle noise. The background statistics are not normal, with evident skewness and heavy tails in the probability density, so the background is hard to identify. The problem of extracting small-scale objects is solved here on the basis of directional filtering, adaptive thresholding and morphological analysis. A new kind of mask is used which is open-ended at one side, so that it is possible to extract the ends of line segments of unknown length. An advanced method of dynamic adaptive threshold setting is investigated, based on the extraction of isolated fragments after thresholding. A hierarchy of isolated fragments in the binary image is proposed for the analysis of segmentation results. It includes small-scale objects of different shape, size and orientation. The method extracts isolated fragments in the binary image and counts the points in these fragments. The number of points in the extracted fragments, normalized to the total number of points for a given threshold, is used as the effectiveness of extraction for these fragments. The new method for adaptive threshold setting and control maximizes the effectiveness of extraction. It has optimality properties for object extraction in a normal noise field and shows effective results for real SAR images.
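A minimal sketch of the adaptive-threshold idea described above: for each candidate threshold, label the isolated fragments of the binary image, count the pixels falling into small fragments, normalize by all above-threshold pixels, and keep the threshold that maximizes this effectiveness ratio. The fragment-size cap `max_fragment_px` is an illustrative assumption, not a value from the paper.

```python
import numpy as np
from scipy import ndimage

def best_threshold(image: np.ndarray, thresholds, max_fragment_px: int = 200):
    """Return the threshold maximizing the fraction of above-threshold pixels
    that belong to small isolated fragments (a proxy for small-scale objects)."""
    best_t, best_eff = None, -1.0
    for t in thresholds:
        binary = image > t
        total = binary.sum()
        if total == 0:
            continue
        labels, n = ndimage.label(binary)                       # isolated fragments
        sizes = np.asarray(ndimage.sum(binary, labels, range(1, n + 1)))
        small = sizes[sizes <= max_fragment_px].sum()           # pixels in small fragments
        eff = small / total                                     # effectiveness of extraction
        if eff > best_eff:
            best_t, best_eff = t, eff
    return best_t, best_eff
```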
Local spectrum analysis of field propagation in an anisotropic medium. Part I. Time-harmonic fields.
Tinkelman, Igor; Melamed, Timor
2005-06-01
The phase-space beam summation is a general analytical framework for local analysis and modeling of radiation from extended source distributions. In this formulation, the field is expressed as a superposition of beam propagators that emanate from all points in the source domain and in all directions. In this Part I of a two-part investigation, the theory is extended to include propagation in anisotropic medium characterized by a generic wave-number profile for time-harmonic fields; in a companion paper [J. Opt. Soc. Am. A 22, 1208 (2005)], the theory is extended to time-dependent fields. The propagation characteristics of the beam propagators in a homogeneous anisotropic medium are considered. With use of Gaussian windows for the local processing of either ordinary or extraordinary electromagnetic field distributions, the field is represented by a phase-space spectral distribution in which the propagating elements are Gaussian beams that are formulated by using Gaussian plane-wave spectral distributions over the extended source plane. By applying saddle-point asymptotics, we extract the Gaussian beam phenomenology in the anisotropic environment. The resulting field is parameterized in terms of the spatial evolution of the beam curvature, beam width, etc., which are mapped to local geometrical properties of the generic wave-number profile. The general results are applied to the special case of uniaxial crystal, and it is found that the asymptotics for the Gaussian beam propagators, as well as the physical phenomenology attached, perform remarkably well.
Palmprint verification using Lagrangian decomposition and invariant interest points
NASA Astrophysics Data System (ADS)
Gupta, P.; Rattani, A.; Kisku, D. R.; Hwang, C. J.; Sing, J. K.
2011-06-01
This paper presents a palmprint-based verification system using SIFT features and a Lagrangian network graph technique. SIFT features are extracted from the region of interest (ROI), which is cropped from the wide palm texture at the preprocessing stage, to obtain invariant interest points. Finally, identity is established by finding the permutation matrix for a pair of reference and probe palm graphs drawn on the extracted SIFT features. The permutation matrix is used to minimize the distance between the two graphs. The proposed system has been tested on the CASIA and IITK palmprint databases, and experimental results reveal the effectiveness and robustness of the system.
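A rough sketch of the SIFT stage only, using OpenCV; the Lagrangian graph matching / permutation-matrix step of the paper is not reproduced here, and the ratio-test threshold is an illustrative assumption.

```python
import cv2

def match_palmprints(roi_ref, roi_probe, ratio: float = 0.75) -> int:
    """Count ratio-test matches between two grayscale palmprint ROIs;
    a crude similarity score, not the paper's graph-based decision rule."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(roi_ref, None)
    kp2, des2 = sift.detectAndCompute(roi_probe, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
            if m.distance < ratio * n.distance]          # Lowe's ratio test
    return len(good)
```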
Universal Temporal Profile of Replication Origin Activation in Eukaryotes
NASA Astrophysics Data System (ADS)
Goldar, Arach
2011-03-01
The complete and faithful transmission of the eukaryotic genome to daughter cells involves the timely duplication of the mother cell's DNA. DNA replication starts at multiple chromosomal positions called replication origins. From each activated replication origin, two replication forks progress in opposite directions and duplicate the mother cell's DNA. While it is widely accepted that in eukaryotic organisms replication origins are activated in a stochastic manner, little is known about the sources of the observed stochasticity. It is often attributed to cell-to-cell variability in entering S phase. We extract from a growing Saccharomyces cerevisiae population the average rate of origin activation in a single cell by combining single-molecule measurements and a numerical deconvolution technique. We show that the temporal profile of the rate of origin activation in a single cell is similar to the one extracted from a replicating cell population. Taking this observation into account, we exclude population variability as the origin of the observed stochasticity in origin activation. We confirm that the rate of origin activation increases in the early stage of S phase and decreases in the later stage. The population-averaged activation rate extracted from single-molecule analysis is in perfect agreement with the activation rate extracted from published microarray data, confirming the homogeneity and genome-scale invariance of the dynamics of the replication process. All these observations point toward a possible role of the replication fork in controlling the rate of origin activation.
The negative hydrogen Penning ion gauge ion source for KIRAMS-13 cyclotron
DOE Office of Scientific and Technical Information (OSTI.GOV)
An, D. H.; Jung, I. S.; Kang, J.
2008-02-15
The cold-cathode-type Penning ion gauge (PIG) ion source used as the internal ion source of the KIRAMS-13 cyclotron has been used for the generation of negative hydrogen ions. A dc H− beam current of 650 μA from the PIG ion source, with a dee voltage of 40 kV and an arc current of 1.0 A, is extrapolated from the dc extraction beam currents measured at low extraction dc voltages. The output of the PIG ion source in the cyclotron has been optimized by using various chimneys with different sizes of the expansion gap between the plasma boundary and the chimney wall. This paper presents the results of the dc H− extraction measurements and the expansion gap experiment.
NASA Astrophysics Data System (ADS)
Neri, L.; Celona, L.; Gammino, S.; Miraglia, A.; Leonardi, O.; Castro, G.; Torrisi, G.; Mascali, D.; Mazzaglia, M.; Allegra, L.; Amato, A.; Calabrese, G.; Caruso, A.; Chines, F.; Gallo, G.; Longhitano, A.; Manno, G.; Marletta, S.; Maugeri, A.; Passarello, S.; Pastore, G.; Seminara, A.; Spartà, A.; Vinciguerra, S.
2017-07-01
At the Istituto Nazionale di Fisica Nucleare - Laboratori Nazionali del Sud (INFN-LNS), the beam commissioning of the high-intensity Proton Source for the European Spallation Source (PS-ESS) started in November 2016. Beam stability at high current intensity is one of the most important parameters for the first steps of the ongoing commissioning. Promising results were obtained from the first start of the source with a 6 mm diameter extraction hole. Enlarging the extraction hole to 8 mm improved the PS-ESS performance and yielded the values required by the ESS accelerator. In this work, the extracted beam current characteristics together with Doppler-shift and emittance measurements are presented, as well as a description of the next phases before the installation at ESS in Lund.
Guo, Xueru; Zuo, Rui; Meng, Li; Wang, Jinsheng; Teng, Yanguo; Liu, Xin; Chen, Minhua
2018-01-01
Globally, groundwater resources are deteriorating under rapid social development. Thus, there is an urgent need to assess the combined impacts of natural and enhanced anthropogenic sources on groundwater chemistry. The aim of this study was to identify seasonal characteristics and spatial variations in anthropogenic and natural effects, to improve the understanding of the major hydrogeochemical processes based on source apportionment. Thirty-four groundwater points located in a riverside groundwater resource area in northeast China were sampled during the wet and dry seasons in 2015. Using principal component analysis and factor analysis, 4 principal components (PCs) were extracted from 16 groundwater parameters. Three of the PCs were water-rock interaction (PC1), geogenic Fe and Mn (PC2), and agricultural pollution (PC3). A remarkable difference (PC4) was organic pollution originating from negative anthropogenic effects during the wet season, and geogenic F enrichment during the dry season. Groundwater exploitation resulted in a dramatic depression cone with a higher hydraulic gradient around the water source area. This not only intensified dissolution of calcite, dolomite, gypsum, Fe, Mn and fluorine minerals, but also induced more surface-water recharge into the water source area. The spatial distribution of the PCs also suggested that the center of the study area was extremely vulnerable to contamination by Fe, Mn, COD, and F−. PMID:29415516
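A minimal sketch of the PCA step described above, assuming `data` is a samples-by-16-parameter matrix; the varimax rotation used in classical factor analysis is omitted, and no values here come from the study.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def extract_components(data: np.ndarray, n_components: int = 4):
    """Standardize the hydrochemical parameters, extract principal components,
    and return scores, loadings, and the explained-variance ratios."""
    z = StandardScaler().fit_transform(data)
    pca = PCA(n_components=n_components).fit(z)
    loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
    return pca.transform(z), loadings, pca.explained_variance_ratio_
```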
Dong, Junzi; Colburn, H. Steven
2016-01-01
In multisource, “cocktail party” sound environments, human and animal auditory systems can use spatial cues to effectively separate and follow one source of sound over competing sources. While mechanisms to extract spatial cues such as interaural time differences (ITDs) are well understood in precortical areas, how such information is reused and transformed in higher cortical regions to represent segregated sound sources is not clear. We present a computational model describing a hypothesized neural network that spans spatial cue detection areas and the cortex. This network is based on recent physiological findings that cortical neurons selectively encode target stimuli in the presence of competing maskers based on source locations (Maddox et al., 2012). We demonstrate that key features of cortical responses can be generated by the model network, which exploits spatial interactions between inputs via lateral inhibition, enabling the spatial separation of target and interfering sources while allowing monitoring of a broader acoustic space when there is no competition. We present the model network along with testable experimental paradigms as a starting point for understanding the transformation and organization of spatial information from midbrain to cortex. This network is then extended to suggest engineering solutions that may be useful for hearing-assistive devices in solving the cocktail party problem. PMID:26866056
Dong, Junzi; Colburn, H Steven; Sen, Kamal
2016-01-01
In multisource, "cocktail party" sound environments, human and animal auditory systems can use spatial cues to effectively separate and follow one source of sound over competing sources. While mechanisms to extract spatial cues such as interaural time differences (ITDs) are well understood in precortical areas, how such information is reused and transformed in higher cortical regions to represent segregated sound sources is not clear. We present a computational model describing a hypothesized neural network that spans spatial cue detection areas and the cortex. This network is based on recent physiological findings that cortical neurons selectively encode target stimuli in the presence of competing maskers based on source locations (Maddox et al., 2012). We demonstrate that key features of cortical responses can be generated by the model network, which exploits spatial interactions between inputs via lateral inhibition, enabling the spatial separation of target and interfering sources while allowing monitoring of a broader acoustic space when there is no competition. We present the model network along with testable experimental paradigms as a starting point for understanding the transformation and organization of spatial information from midbrain to cortex. This network is then extended to suggest engineering solutions that may be useful for hearing-assistive devices in solving the cocktail party problem.
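A deliberately simplified firing-rate sketch of the lateral-inhibition idea in the two abstracts above: spatially tuned inputs compete, and cross-channel inhibition suppresses the weaker (masker) channels. The weights and rectification are illustrative assumptions, not the published model's parameters.

```python
import numpy as np

def cortical_response(inputs: np.ndarray, w_inh: float = 0.8) -> np.ndarray:
    """inputs: drive per spatial channel (e.g., per ITD band); returns the
    rectified output after subtracting inhibition from all other channels."""
    inhibition = w_inh * (inputs.sum() - inputs)
    return np.maximum(0.0, inputs - inhibition)

# With a strong target and weaker maskers, only the target channel survives:
print(cortical_response(np.array([1.0, 0.4, 0.1])))   # [0.6, 0.0, 0.0]
```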
Apu, Apurba Sarker; Liza, Mahmuda Sultana; Jamaluddin, A.T.M.; Howlader, Md. Amran; Saha, Repon Kumer; Rizwan, Farhana; Nasrin, Nishat
2012-01-01
Objective To investigate the bioactivities of crude n-hexane, ethyl acetate and methanol extracts of the aerial part of Boerhavia diffusa Linn. (B. diffusa) and to perform its phytochemical analysis. Methods The identification of phytoconstituents and the assays of antioxidant, thrombolytic, cytotoxic and antimicrobial activities were conducted using specific standard in vitro procedures. Results The results showed that the plant extracts were a rich source of phytoconstituents. The methanol extract showed higher antioxidant and thrombolytic activity and lower cytotoxic activity than the n-hexane and ethyl acetate extracts of B. diffusa. Among the bioactivities, the antioxidant activity was the most notable compared with the positive control, and the plant could thus be a potentially rich source of natural antioxidants. In the antimicrobial screening, crude extracts of the plant showed remarkable antibacterial activity against the tested microorganisms. All the extracts showed significant inhibitory activity against Candida albicans at a concentration of 1000 µg/disc. Conclusions The present findings suggest that the plant, widely available in Bangladesh, could be a prominent source of medicinally important natural compounds. PMID:23569993
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A.; Zhang, Wenbo
2016-01-01
Objective Combined source imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a non-invasive fashion. Source imaging techniques have been used successfully to either determine the source of activity or to extract source time-courses for Granger causality analysis, previously. In this work, we utilize source imaging algorithms to both find the network nodes (regions of interest) and then extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Methods Source imaging methods are used to identify network nodes and extract time-courses and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulations studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from inter-ictal and ictal signals recorded by EEG and/or MEG. Results Localization errors of network nodes are less than 5 mm and normalized connectivity errors of ~20% in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Conclusion Our study indicates that combined source imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network nodes location and internodal connectivity). Significance The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions. PMID:27740473
Sohrabpour, Abbas; Ye, Shuai; Worrell, Gregory A; Zhang, Wenbo; He, Bin
2016-12-01
Combined source-imaging techniques and directional connectivity analysis can provide useful information about the underlying brain networks in a noninvasive fashion. Source-imaging techniques have been used successfully to either determine the source of activity or to extract source time-courses for Granger causality analysis, previously. In this work, we utilize source-imaging algorithms to both find the network nodes [regions of interest (ROI)] and then extract the activation time series for further Granger causality analysis. The aim of this work is to find network nodes objectively from noninvasive electromagnetic signals, extract activation time-courses, and apply Granger analysis on the extracted series to study brain networks under realistic conditions. Source-imaging methods are used to identify network nodes and extract time-courses and then Granger causality analysis is applied to delineate the directional functional connectivity of underlying brain networks. Computer simulations studies where the underlying network (nodes and connectivity pattern) is known were performed; additionally, this approach has been evaluated in partial epilepsy patients to study epilepsy networks from interictal and ictal signals recorded by EEG and/or Magnetoencephalography (MEG). Localization errors of network nodes are less than 5 mm and normalized connectivity errors of ∼20% in estimating underlying brain networks in simulation studies. Additionally, two focal epilepsy patients were studied and the identified nodes driving the epileptic network were concordant with clinical findings from intracranial recordings or surgical resection. Our study indicates that combined source-imaging algorithms with Granger causality analysis can identify underlying networks precisely (both in terms of network nodes location and internodal connectivity). The combined source imaging and Granger analysis technique is an effective tool for studying normal or pathological brain conditions.
NASA Astrophysics Data System (ADS)
Liu, Kai; Liu, Yuan; Liu, Yu-Rong; En, Yun-Fei; Li, Bin
2017-07-01
Channel mobility in p-type polycrystalline silicon thin-film transistors (poly-Si TFTs) is extracted using the Hoffman method, the linear-region transconductance method and the multi-frequency C-V method. Because neglecting the dependence of the effective mobility on the gate-source voltage introduces non-negligible errors, the mobility extracted with the linear-region transconductance method and the Hoffman method is overestimated, especially in the lower gate-source voltage region. By considering the distribution of localized states in the band gap, the frequency-independent capacitances due to localized charges in the sub-gap states and due to channel free electron charges in the conduction band were extracted using the multi-frequency C-V method. The channel mobility was therefore extracted accurately on the basis of charge-transport theory. In addition, the effect of electric-field-dependent mobility degradation was also considered in the higher gate-source voltage region. Finally, the mobility results extracted in the poly-Si TFTs using these three methods are compared and analyzed.
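A minimal sketch of the standard linear-region transconductance extraction mentioned above, mu_eff = L·gm / (W·Cox·Vds) with gm = dId/dVgs at a small fixed drain bias; device geometry and Cox are placeholders, and this is the textbook relation rather than the paper's corrected procedure.

```python
import numpy as np

def linear_region_mobility(vgs, ids, W, L, Cox, Vds):
    """vgs, ids: measured transfer curve at small |Vds| (SI units).
    Use magnitudes of currents and voltages for p-type devices."""
    gm = np.gradient(ids, vgs)           # transconductance dId/dVgs
    return L * gm / (W * Cox * Vds)      # effective mobility vs. Vgs
```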
Point source emission reference materials from the Emissions Inventory Improvement Program (EIIP). Provides point source guidance on planning, emissions estimation, data collection, inventory documentation and reporting, and quality assurance/quality control.
Contextual Classification of Point Cloud Data by Exploiting Individual 3d Neigbourhoods
NASA Astrophysics Data System (ADS)
Weinmann, M.; Schmidt, A.; Mallet, C.; Hinz, S.; Rottensteiner, F.; Jutzi, B.
2015-03-01
The fully automated analysis of 3D point clouds is of great importance in photogrammetry, remote sensing and computer vision. For reliably extracting objects such as buildings, road inventory or vegetation, many approaches rely on the results of a point cloud classification, where each 3D point is assigned a respective semantic class label. Such an assignment, in turn, typically involves statistical methods for feature extraction and machine learning. Whereas the different components in the processing workflow have extensively, but separately been investigated in recent years, the respective connection by sharing the results of crucial tasks across all components has not yet been addressed. This connection not only encapsulates the interrelated issues of neighborhood selection and feature extraction, but also the issue of how to involve spatial context in the classification step. In this paper, we present a novel and generic approach for 3D scene analysis which relies on (i) individually optimized 3D neighborhoods for (ii) the extraction of distinctive geometric features and (iii) the contextual classification of point cloud data. For a labeled benchmark dataset, we demonstrate the beneficial impact of involving contextual information in the classification process and that using individual 3D neighborhoods of optimal size significantly increases the quality of the results for both pointwise and contextual classification.
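A compact sketch of per-point geometric features computed from a k-nearest-neighbour 3D neighbourhood (linearity, planarity, sphericity from the covariance eigenvalues). The fixed k is an illustrative simplification of the individually optimized neighbourhoods described above.

```python
import numpy as np
from scipy.spatial import cKDTree

def geometric_features(points: np.ndarray, k: int = 25) -> np.ndarray:
    """points: (N, 3) array; returns (N, 3) features per point."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    feats = np.empty((len(points), 3))
    for i, nbrs in enumerate(idx):
        cov = np.cov(points[nbrs].T)
        l1, l2, l3 = np.sort(np.linalg.eigvalsh(cov))[::-1]   # l1 >= l2 >= l3
        l1 = l1 + 1e-12                                       # guard against degenerate sets
        feats[i] = ((l1 - l2) / l1,   # linearity
                    (l2 - l3) / l1,   # planarity
                    l3 / l1)          # sphericity
    return feats
```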
Research on Methods of High Coherent Target Extraction in Urban Area Based on Psinsar Technology
NASA Astrophysics Data System (ADS)
Li, N.; Wu, J.
2018-04-01
PSInSAR technology has been widely applied in ground deformation monitoring. Accurate identification of Persistent Scatterers (PS) is key to the success of PSInSAR data processing. In this paper, the theoretical models and specific algorithms of PS point extraction methods are summarized, and the characteristics and applicable conditions of each method, such as the Coherence Coefficient Threshold method, the Amplitude Threshold method, the Dispersion of Amplitude method and the Dispersion of Intensity method, are analyzed. Based on the merits and demerits of the different methods, an improved method for PS point extraction in urban areas is proposed, which simultaneously uses the backscattering characteristics and the amplitude and phase stability to find PS points among all pixels. Shanghai is chosen as an example area for checking the improvements of the new method. The results show that the PS points extracted by the new method have high quality and high stability and exhibit strong scattering characteristics. Based on these high-quality PS points, the deformation rate along the line of sight (LOS) in the central urban area of Shanghai is obtained using 35 COSMO-SkyMed X-band SAR images acquired from 2008 to 2010; it varies from -14.6 mm/year to 4.9 mm/year. There is a large subsidence funnel at the boundary between the Hongkou and Yangpu districts, with a maximum subsidence rate of more than 14 mm per year. The obtained ground subsidence rates are also compared with the results of spirit leveling and show good consistency. Our new method for PS point extraction is more reasonable and can improve the accuracy of the obtained deformation results.
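A minimal sketch of one of the criteria summarized above, the amplitude dispersion index D_A = sigma_A / mean_A computed over the SAR stack; pixels with low D_A are kept as PS candidates. The 0.25 cutoff is the commonly cited illustrative value, not necessarily the paper's setting.

```python
import numpy as np

def ps_candidates(amp_stack: np.ndarray, d_a_max: float = 0.25) -> np.ndarray:
    """amp_stack: (n_images, rows, cols) calibrated amplitudes.
    Returns a boolean mask of PS candidate pixels."""
    mean_a = amp_stack.mean(axis=0)
    d_a = amp_stack.std(axis=0) / np.maximum(mean_a, 1e-12)
    return d_a < d_a_max
```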
Daniels, Sarah I; Sillé, Fenna C M; Goldbaum, Audrey; Yee, Brenda; Key, Ellen F; Zhang, Luoping; Smith, Martyn T; Thomas, Reuben
2014-12-01
Blood miRNAs are a new promising area of disease research, but variability in miRNA measurements may limit detection of true-positive findings. Here, we measured sources of miRNA variability and determined whether repeated measures can improve power to detect fold-change differences between comparison groups. Blood from healthy volunteers (N = 12) was collected at three time points. The miRNAs were extracted by a method predetermined to give the highest miRNA yield. Nine different miRNAs were quantified using different qPCR assays and analyzed using mixed models to identify sources of variability. A larger number of miRNAs from a publicly available blood miRNA microarray dataset with repeated measures were used for a bootstrapping procedure to investigate the effects of repeated measures on the power to detect fold changes in miRNA expression for a theoretical case-control study. Technical variability in qPCR replicates was identified as a significant source of variability (P < 0.05) for all nine miRNAs tested. Variability was larger in the TaqMan qPCR assays (SD = 0.15-0.61) versus the qScript qPCR assays (SD = 0.08-0.14). Inter- and intraindividual and extraction variability also contributed significantly for two miRNAs. The bootstrapping procedure demonstrated that repeated measures (20%-50% of N) increased detection of a 2-fold change for approximately 10% to 45% more miRNAs. Statistical power to detect small fold changes in blood miRNAs can be improved by accounting for sources of variability using repeated measures and choosing appropriate methods to minimize variability in miRNA quantification. This study demonstrates the importance of including repeated measures in experimental designs for blood miRNA research. See all the articles in this CEBP Focus section, "Biomarkers, Biospecimens, and New Technologies in Molecular Epidemiology." ©2014 American Association for Cancer Research.
NASA Astrophysics Data System (ADS)
Zhu, Lei; Song, JinXi; Liu, WanQing
2017-12-01
Huaxian Section is the last hydrological and water-quality monitoring section of the Weihe River Watershed. The Weihe River Watershed above Huaxian Section is taken as the research object in this paper, and COD is chosen as the water-quality parameter. According to the discharge characteristics of point-source and non-point-source pollution, a new method to estimate pollution loads, the characteristic section load (CSLD) method, is suggested, and the point-source and non-point-source pollution loads of the Weihe River Watershed above Huaxian Section are calculated for the rainy, normal and dry seasons of 2007. The results show that the monthly point-source pollution loads of the Weihe River Watershed above Huaxian Section are discharged stably, whereas the monthly non-point-source pollution loads change greatly, and the non-point-source proportions of the total COD pollution load decrease in the normal, rainy and wet periods in turn.
Calculating NH3-N pollution load of wei river watershed above Huaxian section using CSLD method
NASA Astrophysics Data System (ADS)
Zhu, Lei; Song, JinXi; Liu, WanQing
2018-02-01
Huaxian Section is the last hydrological and water-quality monitoring section of the Weihe River Watershed, so it is taken as the research object in this paper, and NH3-N is chosen as the water-quality parameter. According to the discharge characteristics of point-source and non-point-source pollution, a new method to estimate pollution loads, the characteristic section load (CSLD) method, is suggested, and the point-source and non-point-source pollution loads of the Weihe River Watershed above Huaxian Section are calculated for the rainy, normal and dry seasons of 2007. The results show that the monthly point-source pollution loads of the Weihe River Watershed above Huaxian Section are discharged stably, whereas the monthly non-point-source pollution loads change greatly. The non-point-source proportions of the total NH3-N pollution load decrease in the normal, rainy and wet periods in turn.
NASA Astrophysics Data System (ADS)
Huang, Ching-Sheng; Yeh, Hund-Der
2016-11-01
This study introduces an analytical approach to estimate the drawdown induced by well extraction in a heterogeneous confined aquifer with an irregular outer boundary. The aquifer domain is divided into a number of zones according to the zonation method for representing the spatial distribution of a hydraulic parameter field. The lateral boundary of the aquifer can be subject to the Dirichlet, Neumann or Robin condition on different parts of the boundary. Flow across the interface between two zones satisfies the continuity of drawdown and flux. Source points, each of which has an unknown volumetric rate representing the boundary effect on the drawdown, are allocated around the boundary of each zone. The solution for the drawdown in each zone is expressed as a series in terms of the Theis equation with the unknown volumetric rates of the source points. The rates are then determined from the aquifer boundary conditions and the continuity requirements. The aquifer drawdown estimated by the present approach agrees well with a finite element solution developed with the Mathematica function NDSolve. Compared with existing numerical approaches, the present approach has the merit of directly computing the drawdown at any given location and time and therefore takes much less computing time to obtain the required results in engineering applications.
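A small sketch of the building block used above: drawdown from pumping (or source) points by superposition of Theis solutions, s = Q/(4πT)·W(u) with u = r²S/(4Tt) and W(u) given by the exponential integral E1. Homogeneous parameters here are an illustrative simplification of the zoned aquifer treated in the paper.

```python
import numpy as np
from scipy.special import exp1

def theis_drawdown(x, y, t, wells, T, S):
    """wells: list of (xw, yw, Q) tuples; T transmissivity, S storativity (SI units).
    Returns the superposed drawdown at (x, y) and time t."""
    s = 0.0
    for xw, yw, Q in wells:
        r2 = (x - xw) ** 2 + (y - yw) ** 2
        u = r2 * S / (4.0 * T * t)
        s += Q / (4.0 * np.pi * T) * exp1(u)   # W(u) = E1(u)
    return s
```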
The virtual library: Coming of age
NASA Technical Reports Server (NTRS)
Hunter, Judy F.; Cotter, Gladys A.
1994-01-01
With the high-speed networking capabilities, multiple media options, and massive amounts of information that exist in electronic format today, the concept of a 'virtual' library or 'library without walls' is becoming viable. In a virtual library environment, the information processed goes beyond the traditional definition of documents to include the results of scientific and technical research and development (reports, software, data) recorded in any format or medium: electronic, audio, video, or scanned images. Network access to information must include tools to help locate information sources and navigate the networks to connect to the sources, as well as methods to extract the relevant information. Graphical User Interfaces (GUIs) that are intuitive, and navigational tools such as Intelligent Gateway Processors (IGPs), will provide users with seamless and transparent use of high-speed networks to access, organize, and manage information. Traditional libraries will become points of electronic access to information on multiple media. The emphasis will be toward unique collections of information at each library rather than entire collections at every library. It is no longer a question of whether there is enough information available; it is more a question of how to manage the vast volumes of information. The future equation will involve being able to organize knowledge, manage information, and provide access at the point of origin.
NASA Astrophysics Data System (ADS)
Saeedimoghaddam, M.; Kim, C.
2017-10-01
Understanding individual travel behavior is vital for travel demand management as well as for urban and transportation planning. New data sources, including mobile phone data and location-based social media (LBSM) data, allow us to understand mobility behavior at an unprecedented level of detail. Recent studies of trip purpose prediction tend to use machine learning (ML) methods, since they generally produce high levels of predictive accuracy. Few studies have used LBSM as a large data source to extend its potential for predicting individual travel destinations using ML techniques. In the presented research, we created a spatio-temporal probabilistic model based on an ensemble ML framework named "Random Forests", utilizing the trips extracted from geotagged Tweets in 419 census tracts of the Greater Cincinnati area, for predicting the tract ID of an individual's travel destination at any time using information about its origin. We evaluated the model accuracy using the trips extracted from the Tweets themselves as well as the trips from a household travel survey. Tweet- and survey-based trips that start from the same tract in the southwestern part of the study area are more likely to share the same destination than trips starting in other parts. Both Tweet- and survey-based trips were also affected by the attraction points in downtown Cincinnati and the tracts in the northeastern part of the area. Finally, both evaluations show that the model predictions are acceptable, but the model cannot predict destinations using inputs from other data sources as precisely as with the Tweet-based data.
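A bare-bones sketch of the prediction setup described above: a Random Forest mapping an origin tract and a departure hour to a destination tract. The feature choice, the integer encoding of tract IDs, and the split are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def fit_destination_model(origin_tract, hour, dest_tract):
    """origin_tract, hour, dest_tract: 1-D arrays of integer-encoded values."""
    X = np.column_stack([origin_tract, hour])
    X_tr, X_te, y_tr, y_te = train_test_split(X, dest_tract,
                                              test_size=0.2, random_state=0)
    model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    return model, model.score(X_te, y_te)   # holdout accuracy
```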
Imaging a Fault Boundary System Using Controlled-Source Data Recorded on a Large-N Seismic Array
NASA Astrophysics Data System (ADS)
Paschall, O. C.; Chen, T.; Snelson, C. M.; Ralston, M. D.; Rowe, C. A.
2016-12-01
The Source Physics Experiment (SPE) is a series of chemical explosions conducted in southern Nevada with an objective of improving nuclear explosion monitoring. Five chemical explosions have occurred thus far in granite, the most recent being SPE-5 on April 26, 2016. The SPE series will improve our understanding of seismic wave propagation (primarily S-waves) due to explosions, and allow better discrimination of background seismicity such as earthquakes and explosions. The Large-N portion of the project consists of 996 receiver stations. Half of the stations were vertical component and the other half were three-component geophones. All receivers were deployed for 30 days and recorded the SPE-5 shot, earthquakes, noise, and an additional controlled-source: a large weight-drop, which is a 13,000 kg modified industrial pile driver. In this study, we undertake reflection processing of waveforms from the weight-drop, as recorded by a line of sensors extracted from the Large-N array. The profile is 1.2 km in length with 25 m station spacing and 100 m shot point spacing. This profile crosses the Boundary Fault that separates granite body and an alluvium basin, a strong acoustic impedance boundary that scatters seismic energy into S-waves and coda. The data were processed with traditional seismic reflection processing methods that include filtering, deconvolution, and stacking. The stack will be used to extract the location of the splays of the Boundary Fault and provide geologic constraints to the modeling and simulation teams within the SPE project.
Shanks, Orin C.; White, Karen; Kelty, Catherine A.; Hayes, Sam; Sivaganesan, Mano; Jenkins, Michael; Varma, Manju; Haugland, Richard A.
2010-01-01
There are numerous PCR-based assays available to characterize bovine fecal pollution in ambient waters. The determination of which approaches are most suitable for field applications can be difficult because each assay targets a different gene, in many cases from different microorganisms, leading to variation in assay performance. We describe a performance evaluation of seven end-point PCR and real-time quantitative PCR (qPCR) assays reported to be associated with either ruminant or bovine feces. Each assay was tested against a reference collection of DNA extracts from 247 individual bovine fecal samples representing 11 different populations and 175 fecal DNA extracts from 24 different animal species. Bovine-associated genetic markers were broadly distributed among individual bovine samples ranging from 39 to 93%. Specificity levels of the assays spanned 47.4% to 100%. End-point PCR sensitivity also varied between assays and among different bovine populations. For qPCR assays, the abundance of each host-associated genetic marker was measured within each bovine population and compared to results of a qPCR assay targeting 16S rRNA gene sequences from Bacteroidales. Experiments indicate large discrepancies in the performance of bovine-associated assays across different bovine populations. Variability in assay performance between host populations suggests that the use of bovine microbial source-tracking applications will require a priori characterization at each watershed of interest. PMID:20061457
Studies of the beam extraction system of the GTS-LHC electron cyclotron resonance ion source at CERN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Toivanen, V., E-mail: ville.aleksi.toivanen@cern.ch; Küchler, D.
2016-02-15
The 14.5 GHz GTS-LHC Electron Cyclotron Resonance Ion Source (ECRIS) provides multiply charged heavy ion beams for the CERN experimental program. The GTS-LHC beam formation has been studied extensively with lead, argon, and xenon beams with varied beam extraction conditions using the ion optical code IBSimu. The simulation model predicts self-consistently the formation of triangular and hollow beam structures which are often associated with ECRIS ion beams, as well as beam loss patterns which match the observed beam induced markings in the extraction region. These studies provide a better understanding of the properties of the extracted beams and a way to diagnose the extraction system performance and limitations, which is otherwise challenging due to the lack of direct diagnostics in this region and the limited availability of the ion source for development work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sidorov, A.; Dorf, M.; Zorin, V.
2008-02-15
The electron cyclotron resonance ion source with a quasi-gas-dynamic regime of plasma confinement (ReGIS), constructed at the Institute of Applied Physics, Russia, provides opportunities for extracting intense and high-brightness multicharged ion beams. Despite the short plasma lifetime in the magnetic trap of a ReGIS, the degree of multiple ionization may be significantly enhanced by increasing the power and frequency of the applied microwave radiation. The present work is focused on studying the quality of the intense beam from this source by the pepper-pot method. A single-beamlet emittance measured by the pepper-pot method was found to be ≈70 π mm mrad, and the total extracted beam current obtained at 14 kV extraction voltage was ≈25 mA. The results of numerical simulations of the ion beam extraction are found to be in good agreement with the experimental data.
Moderate pressure plasma source of nonthermal electrons
NASA Astrophysics Data System (ADS)
Gershman, S.; Raitses, Y.
2018-06-01
Plasma sources of electrons offer control of gas and surface chemistry without the need for complex vacuum systems. The plasma electron source presented here is based on a cold-cathode glow discharge (GD) operating in a dc steady-state mode in a moderate pressure range of 2–10 torr. Ion-induced secondary electron emission is the source of electrons, which are accelerated to high energies in the cathode sheath potential. The source geometry is key to the availability and extraction of the nonthermal portion of the electron population. The source consists of a flat and a cylindrical electrode, 1 mm apart. Our estimates show that the length of the cathode sheath in the plasma source is commensurate (~0.5–1 mm) with the inter-electrode distance, so the GD operates in an obstructed regime without a positive column. Estimates of the electron energy relaxation confirm the non-local nature of this GD; hence the nonthermal portion of the electron population is available for extraction outside the source. The use of a cylindrical anode is a simple and promising method of extracting the high-energy portion of the electron population. Langmuir probe measurements and optical emission spectroscopy confirm the presence of electrons with energies of ~15 eV outside the source. These electrons become available for surface modification and radical production outside the source. The extraction of electrons of specific energies by varying the anode geometry opens exciting opportunities for future exploration.
Speciation and Determination of Low Concentration of Iron in Beer Samples by Cloud Point Extraction
ERIC Educational Resources Information Center
Khalafi, Lida; Doolittle, Pamela; Wright, John
2018-01-01
A laboratory experiment is described in which students determine the concentration and speciation of iron in beer samples using cloud point extraction and absorbance spectroscopy. The basis of determination is the complexation between iron and 2-(5-bromo-2- pyridylazo)-5-diethylaminophenol (5-Br-PADAP) as a colorimetric reagent in an aqueous…
Varying Conditions for Hexanoic Acid Degradation with BioTiger™
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foreman, Koji; Milliken, Charles; Brigmon, Robin
BioTiger™ (BT) is a consortium of 12 bacteria designed for petroleum waste biodegradation. BT is currently being studied and could be considered for bioremediation of the Athabasca oil sands refineries in Canada and elsewhere. The run-off ponds from the petroleum extraction processes, called tailings ponds, are a mixture of polycyclic aromatic hydrocarbons, naphthenic acids, hydrocarbons, toxic chemicals like heavy metals, water, and sand. Due to environmental regulations, the oil industry would like to separate and degrade the hazardous chemical species from the tailings ponds while recycling the water. It has been shown that BT at 30 °C is able to completely degrade 10 mM hexanoic acid (HA) co-metabolically with 0.2% yeast extract (w/v) in 48 hours when starting at an OD600 of 0.4. After establishing this stable degradation capability, variations were tested to explore the wider parameters of BT activity in temperature, pH, intermediate degradation, co-metabolic dependence, and transfer stability. Because of the vast differences in temperature at various points in the refineries, a wide range of temperatures was assessed. The results indicate that BT retains the ability to degrade HA, a model surrogate for tailings pond contaminants, at temperatures ranging from 15 °C to 35 °C. Hexanamide (HAM) was shown to be an intermediate generated during the degradation of HA in an earlier work, and HAM is completely degraded after 48 hours, indicating that HAM is not the final product of HA degradation. Various replacements for yeast extract were attempted: glucose, a carbon source; casein amino acids, a protein source; additional ammonia, mimicking known media; and additional phosphate with Wolffe's vitamins and minerals. None showed significant degradation of HA compared to the control. Decreasing the yeast extract concentration (0.05%) resulted in limited but significant degradation. Finally, serial inoculations of BT were performed to determine the stability of degradation over several generations. Overall, BT has been shown to be moderately flexible for HA co-metabolic biodegradation.
Sources of fine particle composition in the northeastern US
NASA Astrophysics Data System (ADS)
Song, Xin-Hua; Polissar, Alexandr V.; Hopke, Philip K.
Fine particle composition data obtained at three sampling sites in the northeastern US were studied using a relatively new type of factor analysis, positive matrix factorization (PMF). The three sites are Washington, DC, Brigantine, NJ and Underhill, VT. The PMF method uses estimates of the error in the data to provide optimal point-by-point weighting and permits efficient treatment of missing and below-detection-limit values. It also imposes a non-negativity constraint on the factors. Eight, nine and 11 sources were resolved from the Washington, Brigantine and Underhill data, respectively. The factors were normalized using the aerosol fine mass concentration data through multiple linear regression so that quantitative source contributions for each resolved factor were obtained. Among the sources resolved at the three sites, six are common. These six sources exhibit not only similar chemical compositions, but also similar seasonal variations at all three sites. They are secondary sulfate with a high concentration of S and a strong seasonal variation peaking in summer; coal combustion with the presence of S and Se and a seasonal variation peaking in winter; oil combustion characterized by Ni and V; soil represented by Al, Ca, Fe, K, Si and Ti; incinerator with the presence of Pb and Zn; and sea salt with high concentrations of Na and S. Among the other sources, nitrate (dominated by NO3−) and motor vehicle (with high concentrations of organic carbon (OC) and elemental carbon (EC), and with the presence of some soil dust components) were obtained for the Washington data, while the three additional sources for the Brigantine data were nitrate, motor vehicle and wood smoke (OC, EC, K). At the Underhill site, five other sources were resolved: wood smoke, Canadian Mn, Canadian Cu smelter, Canadian Ni smelter, and another salt source with high concentrations of Cl and Na. A nitrate source similar to that found at the other sites could not be obtained at Underhill since NO3− was not measured at this site. Generally, most of the sources at the three sites showed similar chemical composition profiles and seasonal variation patterns. The study indicates that PMF is a powerful factor analysis method for extracting sources from ambient aerosol concentration data.
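A simplified sketch of the factor-analytic step, using non-negative matrix factorization as a stand-in for PMF (true PMF additionally weights every data point by its measurement uncertainty, which plain NMF does not); the species matrix and the number of sources are placeholders.

```python
import numpy as np
from sklearn.decomposition import NMF

def resolve_sources(conc: np.ndarray, n_sources: int = 8):
    """conc: (n_samples, n_species) non-negative concentration matrix.
    Returns sample-by-source contributions G and source-by-species profiles F."""
    model = NMF(n_components=n_sources, init="nndsvda",
                max_iter=2000, random_state=0)
    contributions = model.fit_transform(conc)   # G
    profiles = model.components_                # F
    return contributions, profiles
```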
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mazurek, M.A.; Hildemann, L.M.; Cass, G.R.
1990-04-01
Extractable organic compounds having between 6 and 40 carbon atoms comprise an important mass fraction of the fine particulate matter samples from major urban emission sources. Depending on the emission source type, this solvent-soluble fraction accounts for <20% to 100% of the total organic aerosol mass, as measured by quantitative high-resolution gas chromatography (HRGC) with flame ionization detection. In addition to total extract quantitation, HRGC can be applied to further analyses of the mass distributions of elutable organics present in the complex aerosol extract mixtures, thus generating profiles that serve as "fingerprints" for the sources of interest. This HRGC analytical method is applied to emission source samples that contain between 7 and 12,000 μg/filter of organic carbon. It is shown to be a sensitive technique for the analysis of carbonaceous aerosol extract mixtures having diverse mass loadings and species distributions. This study describes the analytical chemical methods that have been applied to the construction of chemical mass balances based on the mass of fine organic aerosol emitted by major urban sources of particulate carbon, and to the generation of discrete emission source chemical profiles derived from the chromatographic characteristics of the organic aerosol components. 21 refs., 1 fig., 2 tabs.
Meijun Li; Ellis, Geoffrey S.
2015-01-01
Dibenzofuran (DBF), its alkylated homologues, and benzo[b]naphthofurans (BNFs) are common oxygen-heterocyclic aromatic compounds in crude oils and source rock extracts. A series of positional isomers of alkyldibenzofuran and benzo[b]naphthofuran were identified in mass chromatograms by comparison with internal standards and standard retention indices. The response factors of dibenzofuran in relation to internal standards were obtained by gas chromatography-mass spectrometry analyses of a set of mixed solutions with different concentration ratios. Perdeuterated dibenzofuran and dibenzothiophene are optimal internal standards for quantitative analyses of furan compounds in crude oils and source rock extracts. The average concentration of the total DBFs in oils derived from siliciclastic lacustrine rock extracts from the Beibuwan Basin, South China Sea, was 518 μg/g, which is about 5 times that observed in the oils from carbonate source rocks in the Tarim Basin, Northwest China. The BNFs occur ubiquitously in source rock extracts and related oils of various origins. The results of this work suggest that the relative abundance of benzo[b]naphthofuran isomers, that is, the benzo[b]naphtho[2,1-d]furan/{benzo[b]naphtho[2,1-d]furan + benzo[b]naphtho[1,2-d]furan} ratio, may be a potential molecular geochemical parameter to indicate oil migration pathways and distances.
Modeling of negative ion transport in a plasma source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Riz, David; Département de Recherches sur la Fusion Contrôlée, CE Cadarache, 13108 St Paul lez Durance; Paméla, Jérôme
1998-08-20
A code called NIETZSCHE has been developed to simulate negative ion transport in a plasma source, from the birth place of the ions to the extraction holes. The ion trajectory is calculated by numerically solving the 3-D equation of motion, while the atomic processes of destruction, elastic collision H−/H+ and charge exchange H−/H0 are handled at each time step by a Monte Carlo procedure. This code can be used to calculate the extraction probability of a negative ion produced at any location inside the source. Calculations performed with NIETZSCHE have made it possible to explain, either quantitatively or qualitatively, several phenomena observed in negative ion sources, such as the isotopic H−/D− effect and the influence of the plasma grid bias or of the magnetic filter on negative ion extraction. The code has also shown that, in the type of sources contemplated for ITER, which operate at large arc power densities (>1 W cm−3), negative ions can reach the extraction region provided they are produced at a distance of less than 2 cm from the plasma grid in the case of 'volume production' (dissociative attachment processes), or if they are produced at the plasma grid surface, in the vicinity of the extraction holes.
Modeling of negative ion transport in a plasma source (invited)
NASA Astrophysics Data System (ADS)
Riz, David; Paméla, Jérôme
1998-02-01
A code called NIETZSCHE has been developed to simulate negative ion transport in a plasma source, from the birth place of the ions to the extraction holes. The H-/D- trajectory is calculated by numerically solving the 3D equation of motion, while the atomic processes of destruction, elastic collision with H+/D+ and charge exchange with H0/D0 are handled at each time step by a Monte Carlo procedure. This code can be used to calculate the extraction probability of a negative ion produced at any location inside the source. Calculations performed with NIETZSCHE have made it possible to explain, either quantitatively or qualitatively, several phenomena observed in negative ion sources, such as the isotopic H-/D- effect, and the influence of the plasma grid bias or of the magnetic filter on negative ion extraction. The code has also shown that, in the type of sources contemplated for ITER, which operate at large arc power densities (>1 W cm-3), negative ions can reach the extraction region provided they are produced at a distance of less than 2 cm from the plasma grid in the case of volume production (dissociative attachment processes), or if they are produced at the plasma grid surface, in the vicinity of the extraction holes.
Evaluation of Genotoxic and Mutagenic Activity of Organic Extracts from Drinking Water Sources
Guan, Ying; Wang, Xiaodong; Wong, Minghung; Sun, Guoping; An, Taicheng; Guo, Jun
2017-01-01
An increasing number of industrial, agricultural and commercial chemicals in the aquatic environment leads to various deleterious effects on organisms, which is becoming a serious global health concern. In this study, the Ames test and the SOS/umu test were conducted to investigate the potential genotoxicity and mutagenicity caused by organic extracts from drinking water sources. The organic content of the source water was extracted with an XAD-2 resin column and organic solvents. Four doses of the extract, equivalent to 0.25, 0.5, 1 and 2 L of source water, were tested for toxicity. All the water samples were collected from six different locations in Guangdong province. The results of the Ames test and the SOS/umu test showed that all the organic extracts from the water samples could induce different levels of DNA damage and mutagenic potential at the dose of 2 L in the absence of S9 mix, which demonstrated the existence of genotoxicity and mutagenicity. Additionally, we found that Salmonella typhimurium strain TA98 was more sensitive to the mutagens. Correlation analysis between genotoxicity, organochlorine pesticides (OCPs) and polycyclic aromatic hydrocarbons (PAHs) showed that most individual OCPs were frameshift toxicants in the drinking water sources, and there was no correlation with total OCPs and PAHs. PMID:28125725
ProFound: Source Extraction and Application to Modern Survey Data
NASA Astrophysics Data System (ADS)
Robotham, A. S. G.; Davies, L. J. M.; Driver, S. P.; Koushan, S.; Taranu, D. S.; Casura, S.; Liske, J.
2018-05-01
We introduce PROFOUND, a source finding and image analysis package. PROFOUND provides methods to detect sources in noisy images, generate segmentation maps identifying the pixels belonging to each source, and measure statistics like flux, size, and ellipticity. These inputs are key requirements of PROFIT, our recently released galaxy profiling package, where the design aim is that these two software packages will be used in unison to semi-automatically profile large samples of galaxies. The key novel feature introduced in PROFOUND is that all photometry is executed on dilated segmentation maps that fully contain the identifiable flux, rather than using more traditional circular or ellipse-based photometry. Also, to be less sensitive to pathological segmentation issues, the de-blending is made across saddle points in flux. We apply PROFOUND in a number of simulated and real-world cases, and demonstrate that it behaves reasonably given its stated design goals. In particular, it offers good initial parameter estimation for PROFIT, and also segmentation maps that follow the sometimes complex geometry of resolved sources, whilst capturing nearly all of the flux. A number of bulge-disc decomposition projects are already making use of the PROFOUND and PROFIT pipeline, and adoption is being encouraged by publicly releasing the software for the open source R data analysis platform under an LGPL-3 license on GitHub (github.com/asgr/ProFound).
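A toy sketch of segmentation-map photometry in the spirit described above (PROFOUND itself is an R package with a far more careful background model and de-blending): threshold, label, dilate each segment, and sum the background-subtracted flux inside the dilated segment. The threshold and dilation radius are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def segment_photometry(image: np.ndarray, nsigma: float = 2.5, dilate_iter: int = 3):
    """Return a label map and per-source fluxes measured on dilated segments."""
    sky, rms = np.median(image), np.std(image)          # crude background estimate
    seg, nsrc = ndimage.label(image > sky + nsigma * rms)
    fluxes = []
    for i in range(1, nsrc + 1):
        mask = ndimage.binary_dilation(seg == i, iterations=dilate_iter)
        fluxes.append((image[mask] - sky).sum())        # background-subtracted flux
    return seg, np.array(fluxes)
```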
Ally, Moonis Raza; Munk, Jeffrey D.; Baxter, Van D.; ...
2015-06-26
This twelve-month field study analyzes the performance of a 7.56 kW (2.16-ton) water-to-air ground-source heat pump (WA-GSHP) in satisfying domestic space-conditioning loads in a 253 m2 house in a mixed-humid climate in the United States. The practical feasibility of using the ground as a source of renewable energy is clearly demonstrated. More than 75% of the energy needed for space heating was extracted from the ground. The average monthly electricity consumption for space conditioning was only 40 kWh at summer and winter thermostat set points of 24.4 °C and 21.7 °C, respectively. The WA-GSHP shared the same 94.5 m vertical-bore ground loop with a separate water-to-water ground-source heat pump (WW-GSHP) that meets the domestic hot water needs of the same house. Sources of systemic irreversibility, the main cause of lost work, are identified using exergy and energy analysis. Quantifying the sources of exergy and energy losses is essential for further systemic improvements. The research findings suggest that WA-GSHPs are a practical and viable technology for reducing primary energy consumption and greenhouse gas emissions under the IECC 2012 Standard, as well as the European Union (EU) 2020 targets for using renewable energy resources.
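A one-line energy-balance sketch of the "more than 75% from the ground" statement above: a heat pump delivering heat Q with electrical input W draws Q - W from the ground, so the ground fraction is 1 - 1/COP. The COP value below is illustrative, not a figure reported in the study.

```python
def ground_fraction(cop: float) -> float:
    """Fraction of delivered heating energy supplied by the ground loop."""
    return 1.0 - 1.0 / cop

print(ground_fraction(4.0))   # 0.75 -> a seasonal heating COP of ~4 implies ~75% from the ground
```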
Inferring Small Scale Dynamics from Aircraft Measurements of Tracers
NASA Technical Reports Server (NTRS)
Sparling, L. C.; Einaudi, Franco (Technical Monitor)
2000-01-01
The millions of ER-2 and DC-8 aircraft measurements of long-lived tracers in the Upper Troposphere/Lower Stratosphere (UT/LS) hold enormous potential as a source of statistical information about subgrid scale dynamics. Extracting this information however can be extremely difficult because the measurements are made along a 1-D transect through fields that are highly anisotropic in all three dimensions. Some of the challenges and limitations posed by both the instrumentation and platform are illustrated within the context of the problem of using the data to obtain an estimate of the dissipation scale. This presentation will also include some tutorial remarks about the conditional and two-point statistics used in the analysis.
Lucas, Nicholas; Macaskill, Petra; Irwig, Les; Moran, Robert; Bogduk, Nikolai
2009-01-01
Trigger points are promoted as an important cause of musculoskeletal pain. There is no accepted reference standard for the diagnosis of trigger points, and data on the reliability of physical examination for trigger points are conflicting. To systematically review the literature on the reliability of physical examination for the diagnosis of trigger points. MEDLINE, EMBASE, and other sources were searched for articles reporting the reliability of physical examination for trigger points. Included studies were evaluated for their quality and applicability, and reliability estimates were extracted and reported. Nine studies were eligible for inclusion. None satisfied all quality and applicability criteria. No study specifically reported reliability for the identification of the location of active trigger points in the muscles of symptomatic participants. Reliability estimates varied widely for each diagnostic sign, for each muscle, and across each study. Reliability estimates were generally higher for subjective signs such as tenderness (kappa range, 0.22 to 1.0) and pain reproduction (kappa range, 0.57 to 1.00), and lower for objective signs such as the taut band (kappa range, -0.08 to 0.75) and local twitch response (kappa range, -0.05 to 0.57). No study to date has reported the reliability of trigger point diagnosis according to the currently proposed criteria. On the basis of the limited number of studies available, and significant problems with their design, reporting, statistical integrity, and clinical applicability, physical examination cannot currently be recommended as a reliable test for the diagnosis of trigger points. The reliability of trigger point diagnosis needs to be further investigated with studies of high quality that use current diagnostic criteria in clinically relevant patients.
Bio-based thermosetting copolymers of eugenol and tung oil
NASA Astrophysics Data System (ADS)
Handoko, Harris
There has been an increasing demand for novel synthetic polymers made of components derived from renewable sources to cope with the depletion of petroleum sources. In fact, monomers derived from vegetable oils and plant sources have shown promising results in forming polymers with good properties. The following is a study of two highly viable renewable sources, eugenol and tung oil (TO), copolymerized into fully bio-based thermosets. Polymerization of eugenol required initial methacrylate functionalization through Steglich esterification, and the synthesized methacrylated eugenol (ME) was confirmed by 1H-NMR. Rheological studies showed ideal Newtonian behavior in ME and five other blended ME resins containing 10 -- 50 wt% TO. Free-radical copolymerization using 5 mol% of tert-butyl peroxybenzoate (crosslinking catalyst) and curing at elevated temperatures (90 -- 160 °C) formed a series of soft to rigid highly-crosslinked thermosets. The crosslinked fraction of the thermosets (89 -- 98 %), determined by Soxhlet extraction, decreased with increasing TO content (0 -- 30%). Thermosets containing 0 -- 30 wt% TO possessed ultimate flexural (3-point bending) strength of 32.2 -- 97.2 MPa and flexural moduli of 0.6 -- 3.5 GPa, with 3.2 -- 8.8 % strain-to-failure ratio. Those containing 10 -- 40 wt% TO exhibited ultimate tensile strength of 3.3 -- 45.0 MPa and tensile moduli of 0.02 GPa to 1.12 GPa, with 8.5 -- 76.7 % strain-to-failure ratio. Glass transition temperatures ranged from 52 to 152 °C as determined by DMA in 3-point bending. SEM analysis of fractured tensile test specimens detected a small degree of heterogeneity. All the thermosets are thermally stable up to approximately 300 °C based on 5% weight loss.
Torres-Pérez, Mónica I; Jiménez-Velez, Braulio D; Mansilla-Rivera, Imar; Rodríguez-Sierra, Carlos J
2005-03-01
The effects of three extraction techniques (Soxhlet, ultrasound and microwave-assisted extraction) on the toxicity of organic extracts, as measured by submitochondrial particle (SMP) and Microtox assays, were compared for three sources of airborne particulate matter (APM). The extraction technique influenced the toxicity response of the APM extracts, and this influence depended on the bioassay method and the APM sample source. APM extracts obtained by microwave-assisted extraction (MAE) were as toxic as or more toxic than those obtained by the conventional Soxhlet and ultrasound techniques, thus providing an alternative extraction method. The microwave extraction technique has the advantages of using less solvent volume, requiring less extraction time, and being able to extract twelve samples simultaneously. The ordering of APM toxicity was generally urban dust > diesel dust > PM10 (particles with diameter < 10 µm), reflecting the different chemical compositions of the samples. This study is the first to report the suitability of two standard in-vitro bioassays for the future toxicological characterization of APM collected in Puerto Rico, with the SMP assay generally showing better sensitivity than the well-known Microtox bioassay.
NASA Astrophysics Data System (ADS)
Wapenaar, C. P. A.; Van der Neut, J.; Thorbecke, J.; Broggini, F.; Slob, E. C.; Snieder, R.
2015-12-01
Imagine one could place seismic sources and receivers at any desired position inside the earth. Since the receivers would record the full wave field (direct waves, up- and downward reflections, multiples, etc.), this would give a wealth of information about the local structures, material properties and processes in the earth's interior. Although in reality one cannot place sources and receivers anywhere inside the earth, it appears to be possible to create virtual sources and receivers at any desired position, which accurately mimics the desired situation. The underlying method involves some major steps beyond standard seismic interferometry. With seismic interferometry, virtual sources can be created at the positions of physical receivers, assuming these receivers are illuminated isotropically. Our proposed method does not need physical receivers at the positions of the virtual sources; moreover, it does not require isotropic illumination. To create virtual sources and receivers anywhere inside the earth, it suffices to record the reflection response with physical sources and receivers at the earth's surface. We do not need detailed information about the medium parameters; it suffices to have an estimate of the direct waves between the virtual-source positions and the acquisition surface. With these prerequisites, our method can create virtual sources and receivers, anywhere inside the earth, which record the full wave field. The up- and downward reflections, multiples, etc. in the virtual responses are extracted directly from the reflection response at the surface. The retrieved virtual responses form an ideal starting point for accurate seismic imaging, characterization and monitoring.
NASA Astrophysics Data System (ADS)
Sirmacek, B.; Lindenbergh, R. C.; Menenti, M.
2013-10-01
Fusion of 3D airborne laser (LIDAR) data and terrestrial optical imagery can be applied to 3D urban modeling and model updating. The most challenging aspect of the fusion procedure is registering the terrestrial optical images onto the LIDAR point clouds. In this article, we propose an approach for registering these two data types from different sensor sources. We use iPhone camera images, taken in front of the urban structure of interest by the application user, and high-resolution LIDAR point clouds acquired by an airborne laser sensor. After obtaining the photo capture position and orientation from the iPhone photograph metafile, we automatically select the area of interest in the point cloud and transform it into a range image whose grayscale intensity levels encode the distance from the image acquisition position. We use local features to register the iPhone image to the generated range image; the registration process is based on local feature extraction and graph matching. Finally, the registration result is used for facade texture mapping onto the 3D building surface mesh generated from the LIDAR point cloud. Our experimental results indicate that the proposed algorithm framework can be used for 3D urban map updating and enhancement.
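A minimal sketch of the range-image step described above, assuming a simple pinhole model and an already-estimated capture position and orientation (the function name, focal length and image size are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def point_cloud_to_range_image(points, cam_pos, R, f=1000.0, size=(480, 640)):
    """points: (N,3) world coords; cam_pos: (3,); R: 3x3 world-to-camera rotation."""
    pc = (points - cam_pos) @ R.T                  # transform into the camera frame
    pc = pc[pc[:, 2] > 0]                          # keep points in front of the camera
    u = (f * pc[:, 0] / pc[:, 2] + size[1] / 2).astype(int)
    v = (f * pc[:, 1] / pc[:, 2] + size[0] / 2).astype(int)
    rng = np.linalg.norm(pc, axis=1)
    img = np.full(size, np.inf)
    ok = (u >= 0) & (u < size[1]) & (v >= 0) & (v < size[0])
    np.minimum.at(img, (v[ok], u[ok]), rng[ok])    # z-buffer: keep nearest return per pixel
    img[np.isinf(img)] = 0.0
    return (255 * img / max(img.max(), 1e-9)).astype(np.uint8)   # grayscale range image
```

Local features can then be extracted from this synthetic grayscale image and matched against features from the photograph.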
2011-01-01
Background Acquired immunodeficiency syndrome (AIDS), which is caused by the human immunodeficiency virus (HIV), is an immunosuppressive disease that results in life-threatening opportunistic infections. The general problems in current therapy include the constant emergence of drug-resistant HIV strains, adverse side effects and the unavailability of treatments in developing countries. Natural products from herbs with the ability to inhibit the HIV-1 life cycle at different stages have served as excellent sources of new anti-HIV-1 drugs. In this study, we aimed to investigate the anti-HIV-1 activity of aqueous dandelion extract. Methods A pseudotyped HIV-1 virus was utilized to explore the anti-HIV-1 activity of dandelion; the level of HIV-1 replication was assessed by the percentage of GFP-positive cells. The inhibitory effect of the dandelion extract on reverse transcriptase activity was assessed with a reverse transcriptase assay kit. Results Compared to control values obtained from infected, untreated cells, the levels of HIV-1 replication and reverse transcriptase activity decreased in a dose-dependent manner. The data suggest that dandelion extract has a potent inhibitory activity against HIV-1 replication and reverse transcriptase activity. The identification of HIV-1 antiviral compounds from Taraxacum officinale should be pursued. Conclusions The dandelion extract showed strong activity against HIV-1 RT and inhibited both the HIV-1 vector and the hybrid-MoMuLV/MoMuSV retrovirus replication. These findings provide additional support for the potential therapeutic efficacy of Taraxacum officinale. Extracts from this plant may be regarded as another starting point for the development of an antiretroviral therapy with fewer side effects. PMID:22078030
First results of the ITER-relevant negative ion beam test facility ELISE (invited).
Fantz, U; Franzen, P; Heinemann, B; Wünderlich, D
2014-02-01
An important step in the European R&D roadmap towards the neutral beam heating systems of ITER is the new test facility ELISE (Extraction from a Large Ion Source Experiment) for large-scale extraction from a half-size ITER RF source. The test facility was constructed over the last few years at the Max-Planck-Institut für Plasmaphysik in Garching and is now operational. ELISE is providing early experience of the performance and operation of large RF-driven negative hydrogen ion sources, with plasma illumination of a source area of 1 × 0.9 m² and an extraction area of 0.1 m² using 640 apertures. First results in volume operation, i.e., without caesium seeding, are presented.
A Data Cleaning Method for Big Trace Data Using Movement Consistency
Tang, Luliang; Zhang, Xia; Li, Qingquan
2018-01-01
Given the popularization of GPS technologies, the massive amount of spatiotemporal GPS traces collected by vehicles is becoming a new kind of big data source for urban geographic information extraction. The growing volume of the dataset, however, creates processing and management difficulties, while its low quality generates uncertainties when investigating human activities. Based on the error distribution law and position accuracy of GPS data, we propose in this paper a data cleaning method for this kind of spatial big data using movement consistency. First, a trajectory is partitioned into a set of sub-trajectories using movement characteristic points; GPS points indicating that the motion status of the vehicle has changed from one state to another are regarded as the movement characteristic points. Then, GPS data are cleaned based on the similarities of GPS points and the movement consistency model of the sub-trajectory. The movement consistency model is built using the random sample consensus algorithm, exploiting the high spatial consistency of high-quality GPS data. The proposed method is evaluated in extensive experiments using GPS trajectories generated by a sample of vehicles over a 7-day period in Wuhan, China. The results show the effectiveness and efficiency of the proposed method. PMID:29522456
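The movement-consistency cleaning can be sketched as a RANSAC fit of a local motion model to each sub-trajectory, with points far from the model flagged as low quality. The constant-velocity model and the 10 m residual threshold below are assumptions made for illustration, not the paper's parameters:

```python
import numpy as np

def clean_subtrajectory(t, xy, n_iter=100, tol=10.0, rng=np.random.default_rng(0)):
    """t: (N,) timestamps in s; xy: (N,2) positions in metres. Returns an inlier mask."""
    best_inliers = np.zeros(len(t), dtype=bool)
    A = np.column_stack([t, np.ones_like(t)])       # constant-velocity model: x = v*t + x0
    for _ in range(n_iter):
        idx = rng.choice(len(t), size=2, replace=False)
        coef, *_ = np.linalg.lstsq(A[idx], xy[idx], rcond=None)
        resid = np.linalg.norm(A @ coef - xy, axis=1)
        inliers = resid < tol
        if inliers.sum() > best_inliers.sum():      # keep the largest consensus set
            best_inliers = inliers
    return best_inliers                             # False marks points to discard
```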
In-situ continuous water monitoring system
Thompson, Cyril V.; Wise, Marcus B.
1998-01-01
An in-situ continuous liquid monitoring system for continuously analyzing volatile components contained in a water source comprises: a carrier gas supply, an extraction container and a mass spectrometer. The carrier gas supply continuously supplies the carrier gas to the extraction container and is mixed with a water sample that is continuously drawn into the extraction container by the flow of carrier gas into the liquid directing device. The carrier gas continuously extracts the volatile components out of the water sample. The water sample is returned to the water source after the volatile components are extracted from it. The extracted volatile components and the carrier gas are delivered continuously to the mass spectrometer and the volatile components are continuously analyzed by the mass spectrometer.
In-situ continuous water monitoring system
Thompson, C.V.; Wise, M.B.
1998-03-31
An in-situ continuous liquid monitoring system for continuously analyzing volatile components contained in a water source comprises: a carrier gas supply, an extraction container and a mass spectrometer. The carrier gas supply continuously supplies the carrier gas to the extraction container and is mixed with a water sample that is continuously drawn into the extraction container by the flow of carrier gas into the liquid directing device. The carrier gas continuously extracts the volatile components out of the water sample. The water sample is returned to the water source after the volatile components are extracted from it. The extracted volatile components and the carrier gas are delivered continuously to the mass spectrometer and the volatile components are continuously analyzed by the mass spectrometer. 2 figs.
Liu, Shijie
2010-01-01
The conversion of biomass to chemicals and energy is imperative to sustaining our way of life as we know it today. Fossil chemical and energy sources are traditionally regarded as wastes from a distant past. Petroleum, natural gas, and coal are not being regenerated in a sustainable manner. However, biomass sources such as algae, grasses, bushes and forests are continuously being replenished. Woody biomass represents the most abundant and available biomass source. Woody biomass is a reliably sustainable source of chemicals and energy that could be replenished at a rate consistent with our needs. The biorefinery is a concept describing the collection of processes used to convert biomass to chemicals and energy. Woody biomass presents more challenges than cereal grains for conversion to platform chemicals due to its stereochemical structures. Woody biomass can be thought of as being composed of at least four components: extractives, hemicellulose, lignin and cellulose. Each of these four components has a different degree of resistance to chemical, thermal and biological degradation. The biorefinery concept proposed at ESF (State University of New York - College of Environmental Science and Forestry) aims at incremental sequential deconstruction, fractionation/conversion of woody biomass to achieve efficient separation of major components. The emphasis of this work is on the kinetics of hot-water extraction, filling the gap in the fundamental understanding, linking engineering developments, and completing the first step in the biorefinery processes. This first step removes extractives and hemicellulose fractions from woody biomass. While extractives and hemicellulose are largely removed in the extraction liquor, cellulose and lignin largely remain in the residual woody structure. Xylo-oligomers and acetic acid in the extract are the major components having the greatest potential value for development. Extraction/hydrolysis involves at least 16 general reactions that could be divided into four categories: adsorption of protons onto woody biomass, hydrolysis reactions on the woody biomass surface, dissolution of soluble substances into the extraction liquor, and hydrolysis and dehydration decomposition in the extraction liquor. The extraction/hydrolysis rate expressions are significantly simplified when the reactivity of all intermonomer bonds is regarded as identical within each macromolecule and the overall reactivity is identical for all extractable macromolecules on the surface. A pseudo-first order extraction rate expression has been derived based on concentrations in monomer units. The reaction rate constant is, however, lower at the beginning of the extraction than towards the end. Furthermore, the H-factor and/or severity factor can be applied to lump the effects of temperature and residence time on the extraction process, at least for short times. This provides a means to control and optimize the performance of the extraction process effectively. Copyright 2010 Elsevier Inc. All rights reserved.
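The pseudo-first-order rate expression and the severity-factor lumping mentioned above take the following standard textbook forms (written here generically; the exact constants and expressions used in the original work may differ):

```latex
% Pseudo-first-order removal of extractable monomer units, with an Arrhenius
% rate constant, and the time-temperature severity factor (T in degrees C, t in min).
\[
  \frac{dC}{dt} = -k\,C, \qquad C(t) = C_0\, e^{-kt}, \qquad
  k = A\,\exp\!\left(-\frac{E_a}{RT}\right)
\]
\[
  \log R_0 \;=\; \log\!\left[\int_0^{t} \exp\!\left(\frac{T(\tau) - 100}{14.75}\right)\, d\tau\right]
\]
```

Lumping time and temperature into a single severity (or H-factor) variable is what allows extraction runs at different temperature schedules to be compared and optimized on a common scale.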
Geometric correction and digital elevation extraction using multiple MTI datasets
Mercier, Jeffrey A.; Schowengerdt, Robert A.; Storey, James C.; Smith, Jody L.
2007-01-01
Digital Elevation Models (DEMs) are traditionally acquired from a stereo pair of aerial photographs sequentially captured by an airborne metric camera. Standard DEM extraction techniques can be naturally extended to satellite imagery, but the particular characteristics of satellite imaging can cause difficulties. The spacecraft ephemeris with respect to the ground site during image collects is the most important factor in the elevation extraction process. When the angle of separation between the stereo images is small, the extraction process typically produces measurements with low accuracy, while a large angle of separation can cause an excessive number of erroneous points in the DEM from occlusion of ground areas. The use of three or more images registered to the same ground area can potentially reduce these problems and improve the accuracy of the extracted DEM. The pointing capability of some sensors, such as the Multispectral Thermal Imager (MTI), allows for multiple collects of the same area from different perspectives. This functionality of MTI makes it a good candidate for the implementation of a DEM extraction algorithm using multiple images for improved accuracy. Evaluation of this capability and development of algorithms to geometrically model the MTI sensor and extract DEMs from multi-look MTI imagery are described in this paper. An RMS elevation error of 6.3-meters is achieved using 11 ground test points, while the MTI band has a 5-meter ground sample distance.
Vanishing Point Extraction and Refinement for Robust Camera Calibration
Tsai, Fuan
2017-01-01
This paper describes a flexible camera calibration method using refined vanishing points without prior information. Vanishing points are estimated from human-made features like parallel lines and repeated patterns. With the vanishing points extracted from the three mutually orthogonal directions, the interior and exterior orientation parameters can be further calculated using collinearity condition equations. A vanishing point refinement process is proposed to reduce the uncertainty caused by vanishing point localization errors. The fine-tuning algorithm is based on the divergence of grouped feature points projected onto the reference plane, minimizing the standard deviation of each of the grouped collinear points with an O(1) computational complexity. This paper also presents an automated vanishing point estimation approach based on the cascade Hough transform. The experiment results indicate that the vanishing point refinement process can significantly improve camera calibration parameters and the root mean square error (RMSE) of the constructed 3D model can be reduced by about 30%. PMID:29280966
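As a concrete illustration of how orthogonal vanishing points constrain the interior orientation, the focal length can be recovered from two of them under the common simplifying assumptions of square pixels and a principal point at the image centre (a sketch only, not the paper's full collinearity solution):

```python
import numpy as np

def focal_from_orthogonal_vps(vp1, vp2, principal_point):
    """vp1, vp2: image coordinates of vanishing points of two orthogonal directions."""
    d1 = np.asarray(vp1, float) - np.asarray(principal_point, float)
    d2 = np.asarray(vp2, float) - np.asarray(principal_point, float)
    # orthogonal 3-D directions imply (vp1 - p).(vp2 - p) + f^2 = 0
    val = -np.dot(d1, d2)
    if val <= 0:
        raise ValueError("vanishing points inconsistent with orthogonal directions")
    return float(np.sqrt(val))

# e.g. focal_from_orthogonal_vps((2400, 510), (-800, 495), (960, 540)) ~ 1592 pixels
```

This is why the refinement of vanishing point locations propagates directly into better calibration parameters: small shifts in the vanishing points change the recovered focal length and orientation.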
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeRosier, R.
1984-09-01
This volume is a compendium of detailed emission and test data from field tests of a firetube industrial boiler burning a coal/oil/water (COW) mixture. The boiler was tested while burning COW fuel, and COW with soda ash added (COW+SA) to serve as an SO2 sorbent. The test data include: preliminary equipment calibration data, boiler operating data for both tests, fuel analysis results, and complete flue gas emission measurement and laboratory analysis results. Flue gas emission measurements included: continuous monitoring for criteria gas pollutants; gas chromatography (GC) of gas grab samples for volatile organics (C1-C6); EPA Method 5 for particulate; controlled condensation system for SO2 emissions; and source assessment sampling system (SASS) for total organics in two boiling point ranges (100 to 300 °C and > 300 °C), organic compound category information using infrared spectrometry (IR) and low resolution mass spectrometry (LRMS), specific quantitation of the semivolatile organic priority pollutants using gas chromatography/mass spectrometry (GC/MS), liquid chromatography (LC) separation of organic extracts into seven polarity fractions with total organic and IR analyses of eluted fractions, flue gas concentrations of trace elements by spark source mass spectrometry (SSMS) and atomic absorption spectroscopy (AAS), and biological assays of organic extracts.
Fahradpour, Mohsen; Keov, Peter; Tognola, Carlotta; Perez-Santamarina, Estela; McCormick, Peter J.; Ghassempour, Alireza; Gruber, Christian W.
2017-01-01
Cyclotides are plant derived, cystine-knot stabilized peptides characterized by their natural abundance, sequence variability and structural plasticity. They are abundantly expressed in Rubiaceae, Psychotrieae in particular. Previously the cyclotide kalata B7 was identified to modulate the human oxytocin and vasopressin G protein-coupled receptors (GPCRs), providing molecular validation of the plants’ uterotonic properties and further establishing cyclotides as a valuable source for GPCR ligand design. In this study we screened a cyclotide extract derived from the root powder of the South American medicinal plant ipecac (Carapichea ipecacuanha) for its GPCR modulating activity at the corticotropin-releasing factor type 1 receptor (CRF1R). We identified and characterized seven novel cyclotides. One cyclotide, caripe 8, isolated from the most active fraction, was further analyzed and found to antagonize the CRF1R. A nanomolar concentration of this cyclotide (260 nM) reduced CRF potency by ∼4.5-fold. In contrast, caripe 8 did not inhibit forskolin- or vasopressin-stimulated cAMP responses at the vasopressin V2 receptor, suggesting a CRF1R-specific mode-of-action. These results, in conjunction with our previous findings, establish cyclotides as modulators of both class A and class B GPCRs. Given the diversity of cyclotides, our data point to other cyclotide-GPCR interactions as potentially important sources of drug-like molecules. PMID:29033832
Sensory qualities of pastry products enriched with dietary fiber and polyphenolic substances.
Komolka, Patrycja; Górecka, Danuta; Szymandera-Buszka, Krystyna; Jędrusek-Golińska, Anna; Dziedzic, Krzysztof; Waszkowiak, Katarzyna
2016-01-01
Growing consumer demand for products with pro-health properties is forcing food manufacturers to introduce new food items onto the market that not only possess such health-enhancing properties but also compete on sensory attributes such as taste, flavour and texture. The aim was to evaluate these sensory attributes in pastry products enhanced with biologically active compounds, such as inulin, buckwheat hull and buckwheat flour. To decrease the energy value of the products tested (crispy cookies, muesli cookies, waffles and pancakes), some ingredients were replaced: vegetable butter or oil by inulin, and wheat flour by roasted buckwheat flour and thermally processed buckwheat hull. The substances mentioned are rich sources of soluble and insoluble buckwheat fiber as well as polyphenolic substances. Dry chokeberry and mulberry leaf extract were added as rich sources of flavonoids and 1-deoxynojirimycin, respectively. These substances are recommended for people with obesity. The processing was carried out at 175°C for 15 minutes using a convection oven (Rational Combi-Steamer CCC). Pastry products with buckwheat flour, buckwheat hulls, mulberry extract, chokeberry and inulin had a lower food energy, a higher dietary fiber content and scored high on customer desirability. Pastry products containing ingredients that carry biologically active substances are not only attractive from the sensory point of view but also low in calories, and can thus be recommended for people with obesity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, T.; Yang, Z.; Dong, P.
The cold-cathode Penning ion gauge (PIG) type ion source has been used to generate negative hydrogen (H⁻) ions as the internal ion source of a compact cyclotron. A novel method, called electrical shielding box dc beam measurement, is described in this paper, in which the beam intensity is measured under dc extraction inside an electrical shielding box. The results of the trajectory simulation and the dc H⁻ beam extraction measurement are presented. The effects of gas flow rate, magnetic field strength, arc current, and extraction voltage are also discussed. In conclusion, a dc H⁻ beam current of about 4 mA from the PIG ion source, at a puller voltage of 40 kV and an arc current of 1.31 A, was extrapolated from measurements at low dc extraction voltages.
Second Iteration of Photogrammetric Pipeline to Enhance the Accuracy of Image Pose Estimation
NASA Astrophysics Data System (ADS)
Nguyen, T. G.; Pierrot-Deseilligny, M.; Muller, J.-M.; Thom, C.
2017-05-01
In the classical photogrammetric processing pipeline, automatic tie point extraction plays a key role in the quality of the achieved results. Image tie points are crucial to pose estimation and have a significant influence on the precision of the calculated orientation parameters; therefore, both the relative and absolute orientations of the 3D model can be affected. By improving the precision of image tie point measurement, one can enhance the quality of image orientation. The quality of image tie points depends on several factors, such as their multiplicity, measurement precision and distribution in the 2D images as well as in the 3D scene. In complex acquisition scenarios such as indoor applications and oblique aerial images, tie point extraction is limited when only image information can be exploited. Hence, we propose a method which improves the precision of pose estimation in complex scenarios by adding a second iteration to the classical processing pipeline. The result of the first iteration is used as a priori information to guide the extraction of new tie points with better quality. Evaluated on multiple case studies, the proposed method demonstrates its validity and its high potential for precision improvement.
An open-source framework for stress-testing non-invasive foetal ECG extraction algorithms.
Andreotti, Fernando; Behar, Joachim; Zaunseder, Sebastian; Oster, Julien; Clifford, Gari D
2016-05-01
Over the past decades, many studies have been published on the extraction of non-invasive foetal electrocardiogram (NI-FECG) from abdominal recordings. Most of these contributions claim to obtain excellent results in detecting foetal QRS (FQRS) complexes in terms of location. A small subset of authors have investigated the extraction of morphological features from the NI-FECG. However, due to the shortage of available public databases, the large variety of performance measures employed and the lack of open-source reference algorithms, most contributions cannot be meaningfully assessed. This article attempts to address these issues by presenting a standardised methodology for stress testing NI-FECG algorithms, including absolute data, as well as extraction and evaluation routines. To that end, a large database of realistic artificial signals was created, totaling 145.8 h of multichannel data and over one million FQRS complexes. An important characteristic of this dataset is the inclusion of several non-stationary events (e.g. foetal movements, uterine contractions and heart rate fluctuations) that are critical for evaluating extraction routines. To demonstrate our testing methodology, three classes of NI-FECG extraction algorithms were evaluated: blind source separation (BSS), template subtraction (TS) and adaptive methods (AM). Experiments were conducted to benchmark the performance of eight NI-FECG extraction algorithms on the artificial database focusing on: FQRS detection and morphological analysis (foetal QT and T/QRS ratio). The overall median FQRS detection accuracies (i.e. considering all non-stationary events) for the best performing methods in each group were 99.9% for BSS, 97.9% for AM and 96.0% for TS. Both FQRS detections and morphological parameters were shown to heavily depend on the extraction techniques and signal-to-noise ratio. Particularly, it is shown that their evaluation in the source domain, obtained after using a BSS technique, should be avoided. Data, extraction algorithms and evaluation routines were released as part of the fecgsyn toolbox on Physionet under a GNU GPL open-source license. This contribution provides a standard framework for benchmarking and regulatory testing of NI-FECG extraction algorithms.
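A scoring routine in the spirit of the FQRS benchmarking described above might look as follows; the ±50 ms matching window and the particular statistics computed are assumptions for illustration and not necessarily those of the fecgsyn toolbox:

```python
import numpy as np

def fqrs_detection_stats(ref, det, window=0.05):
    """ref, det: beat times in seconds (reference and detected); window: +/- tolerance in s."""
    ref, det = np.sort(np.asarray(ref, float)), np.sort(np.asarray(det, float))
    used = np.zeros(len(det), dtype=bool)
    tp = 0
    for r in ref:
        j = np.searchsorted(det, r)
        for k in (j - 1, j):                      # nearest detections on either side of r
            if 0 <= k < len(det) and not used[k] and abs(det[k] - r) <= window:
                used[k] = True
                tp += 1
                break
    fn, fp = len(ref) - tp, len(det) - tp
    se = tp / (tp + fn) if len(ref) else 0.0      # sensitivity
    ppv = tp / (tp + fp) if len(det) else 0.0     # positive predictive value
    f1 = 2 * se * ppv / (se + ppv) if (se + ppv) else 0.0
    return {"Se": se, "PPV": ppv, "F1": f1}
```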
NASA Astrophysics Data System (ADS)
Li, Jia; Shen, Hua; Zhu, Rihong; Gao, Jinming; Sun, Yue; Wang, Jinsong; Li, Bo
2018-06-01
The precision of measurements of aspheric and freeform surfaces remains the primary factor restricting their manufacture and application. One effective means of measuring such surfaces involves using reference or probe beams with angle modulation, as in the tilted-wave interferometer (TWI). It is necessary to improve the measurement efficiency by obtaining the optimum point source array for each test piece before TWI measurements. For the purpose of forming a point source array based on the gradients of the different surfaces under test, we established a mathematical model describing the relationship between the point source array and the test surface. However, the optimal point sources are irregularly distributed. In order to achieve a flexible point source array matched to the gradient of the test surface, a novel interference setup using a fiber array is proposed in which every point source can be independently switched on and off. Simulations and actual measurement examples of two different surfaces are given in this paper to verify the mathematical model. Finally, we performed an experiment testing an off-axis ellipsoidal surface, which confirmed the validity of the proposed interference system.
Di Ciaccio, L S; Spotorno, V G; Córdoba Estévez, M M; Ríos, D J L; Fortunato, R H; Salvat, A E
2018-03-01
Fungi are cosmopolitan organisms that grow in and adapt to a vast number of substrates and environments, and that can cause diseases in humans and animals, as well as in crops. The vast area and diverse geographical characteristics of Argentina, with the consequent climatic diversity, make the country an important source of biological resources suitable for the search for new compounds. The aim of the present study was to describe the antifungal activity of extracts of Parastrephia quadrangularis, a species from northern Argentina, against Fusarium verticillioides M7075. Bio-guided fractionation and MS/MS studies were conducted to elucidate the chemical structure of the active compounds. The extracts exhibited minimum inhibitory concentrations between 118.74 and 250 μg ml⁻¹, and the differences in colony growth between the treatments and the inoculum control were 12.5-16.5 mm. Moreover, hyphae treated with the extracts stained blue with Evans blue, showing alterations in plasma membrane permeability. HPLC-MS analysis of the active fractions revealed the presence of p-coumaroyloxitremetone, and a derivative structure is proposed for another compound. In Argentina, Fusarium verticillioides causes 'ear rot', a disease that produces important yield and nutritional quality losses in the maize-producing region. This study suggests that Parastrephia quadrangularis extracts have potential for growth inhibition of F. verticillioides M7075, a bioactivity reported here for the first time. The results obtained will provide a starting point for discovering new antimycotic candidates in natural products. © 2018 The Society for Applied Microbiology.
Long-term stability of CMV DNA in human breast milk.
Sam, Soya S; Ingersoll, Jessica; Racsa, Lori D; Caliendo, Angela M; Racsa, Patrick N; Igwe, Doris; Abdul-Ali, Deborah; Josephson, Cassandra; Kraft, Colleen S
2018-05-01
Human cytomegalovirus (CMV) is the leading cause of intrauterine and perinatal viral infection. The most common route of CMV transmission in newborns is through breast milk, and this can lead to infant morbidity and mortality. Breast milk that has been frozen for an extended period may need to be tested for CMV DNA to determine the source of infection. It has been a challenge for clinical laboratories to ensure the stability of CMV DNA in frozen breast milk for accurate viral load measurement. The aim was to evaluate the stability of CMV DNA in breast milk by testing quantitative viral loads over a 28-day period for breast milk stored at 4 °C and a 90-day period for breast milk stored at -20 °C. Baseline viral loads were determined on day 0, and the samples stored at 4 °C underwent extraction and amplification at four time points, up to 28 days. The samples stored at -20 °C underwent extraction and amplification at five time points, up to 90 days. Log10 values were calculated, and the t-test, Pearson's correlation coefficient, and the concordance correlation coefficient were computed. There was no statistically significant difference between the time points by t-test, and correlation coefficients showed greater than 90% concordance for days 0 and 28 as well as days 0 and 90 at both storage temperatures tested. The concentration of CMV DNA in breast milk was stable for 28 days at 4 °C and 90 days at -20 °C, as the concentrations did not differ significantly from the baseline viral loads. Copyright © 2018 Elsevier B.V. All rights reserved.
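For reference, the concordance correlation coefficient reported above (Lin's CCC) can be computed directly from paired log10 viral loads; this is the standard formula, not the authors' code:

```python
import numpy as np

def concordance_ccc(x, y):
    """Lin's concordance correlation coefficient between two paired measurement series."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    # penalizes both poor correlation and systematic shifts between the series
    return 2 * cov / (vx + vy + (mx - my) ** 2)
```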
Changing Regulations of COD Pollution Load of Weihe River Watershed above TongGuan Section, China
NASA Astrophysics Data System (ADS)
Zhu, Lei; Liu, WanQing
2018-02-01
The TongGuan Section of the Weihe River Watershed is a provincial section between Shaanxi Province and Henan Province, China. The Weihe River Watershed above the TongGuan Section is taken as the research object in this paper, and COD is chosen as the water quality parameter. According to the discharge characteristics of point source and non-point source pollution, a characteristic section load (CSLD) method is suggested, and the point and non-point source pollution loads of the Weihe River Watershed above the TongGuan Section are calculated for the rainy, normal and dry seasons of 2013. The results show that the monthly point source pollution loads discharge stably, whereas the monthly non-point source pollution loads change greatly, and the proportion of the total COD pollution load contributed by non-point sources decreases in turn across the rainy, wet and normal periods.
Fernández-Ramos, C; Ballesteros, O; Zafra-Gómez, A; Camino-Sánchez, F J; Blanc, R; Navalón, A; Pérez-Trujillo, J P; Vílchez, J L
2014-02-15
Alcohol sulfates (AS) and alcohol ethoxysulfates (AES) are high production volume, 'down-the-drain' chemicals used globally in detergent and personal care products, with low levels ultimately released to the environment via wastewater treatment plant effluents. They have a strong affinity for sorption to sediments. Almost 50% of Tenerife Island's surface area is environmentally protected; therefore, determining the concentration levels of AS/AES in marine sediments near wastewater discharge points along the coast of the island is of interest. These data were obtained by pressurized liquid extraction followed by liquid chromatography-tandem mass spectrometry analysis. Short chains dominated the homologue distribution of AES and especially of AS. Principal component analysis was used; the results showed that the sources of AS and AES were the same and that both compounds exhibit similar behavior. Three different patterns in the distribution of homologues and ethoxymers were found. Copyright © 2013 Elsevier Ltd. All rights reserved.
Distribution of trace elements in the coastal sea sediments of Maslinica Bay, Croatia
NASA Astrophysics Data System (ADS)
Mikulic, Nenad; Orescanin, Visnja; Elez, Loris; Pavicic, Ljiljana; Pezelj, Durdica; Lovrencic, Ivanka; Lulic, Stipe
2008-02-01
Spatial distributions of trace elements in the coastal sea sediments and water of Maslinica Bay (Southern Adriatic), Croatia and possible changes in marine flora and foraminifera communities due to pollution were investigated. Macro, micro and trace elements’ distributions in five granulometric fractions were determined for each sediment sample. Bulk sediment samples were also subjected to leaching tests. Elemental concentrations in sediments, sediment extracts and seawater were measured by source excited energy dispersive X-ray fluorescence (EDXRF). Concentrations of the elements Cr, Cu, Zn, and Pb in bulk sediment samples taken in the Maslinica Bay were from 2.1 to over six times enriched when compared with the background level determined for coarse grained carbonate sediments. A low degree of trace elements leaching determined for bulk sediments pointed to strong bonding of trace elements to sediment mineral phases. The analyses of marine flora pointed to higher eutrophication, which disturbs the balance between communities and natural habitats.
Spatial statistical analysis of tree deaths using airborne digital imagery
NASA Astrophysics Data System (ADS)
Chang, Ya-Mei; Baddeley, Adrian; Wallace, Jeremy; Canci, Michael
2013-04-01
High resolution digital airborne imagery offers unprecedented opportunities for observation and monitoring of vegetation, providing the potential to identify, locate and track individual vegetation objects over time. Analytical tools are required to quantify relevant information. In this paper, locations of trees over a large area of native woodland vegetation were identified using morphological image analysis techniques. Methods of spatial point process statistics were then applied to estimate the spatially-varying tree death risk, and to show that it is significantly non-uniform. [Tree deaths over the area were detected in our previous work (Wallace et al., 2008).] The study area is a major source of ground water for the city of Perth, and the work was motivated by the need to understand and quantify vegetation changes in the context of water extraction and drying climate. The influence of hydrological variables on tree death risk was investigated using spatial statistics (graphical exploratory methods, spatial point pattern modelling and diagnostics).
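One common point-process formulation of a spatially varying risk surface is the ratio of kernel density estimates of dead-tree and all-tree locations; the sketch below illustrates that idea and is not necessarily the exact estimator used in the study:

```python
import numpy as np
from scipy.stats import gaussian_kde

def relative_risk_surface(all_xy, dead_xy, grid_x, grid_y, eps=1e-12):
    """all_xy, dead_xy: (2, N) coordinate arrays; grid_x, grid_y: 1-D grid axes."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    pts = np.vstack([gx.ravel(), gy.ravel()])
    f_dead = gaussian_kde(dead_xy)(pts)            # density of dead-tree locations
    f_all = gaussian_kde(all_xy)(pts)              # density of all tree locations
    # local fraction of trees that are dead; compare against the overall death rate
    risk = (f_dead / (f_all + eps)) * (dead_xy.shape[1] / all_xy.shape[1])
    return risk.reshape(gx.shape)
```

Covariates such as depth to groundwater can then be tested by modelling how this risk surface varies with the hydrological variables.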
GARLIC, A SHIELDING PROGRAM FOR GAMMA RADIATION FROM LINE- AND CYLINDER- SOURCES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roos, M.
1959-06-01
GARLIC is a program for computing the gamma ray flux or dose rate at a shielded isotropic point detector due to a line source or the line equivalent of a cylindrical source. The source strength distribution along the line must be either uniform or an arbitrary part of the positive half-cycle of a cosine function. The line source can be oriented arbitrarily with respect to the main shield and the detector, except that the detector must not be located on the line source or on its extension. The main shield is a homogeneous plane slab in which scattered radiation is accounted for by multiplying each point element of the line source by a point source build-up factor inside the integral over the point elements. Between the main shield and the line source additional shields can be introduced, which are either plane slabs, parallel to the main shield, or cylindrical rings, coaxial with the line source. Scattered radiation in the additional shields can only be accounted for by constant build-up factors outside the integral. GARLIC-xyz is an extended version particularly suited to the frequently met problem of shielding a room containing a large number of line sources in different positions. The program computes the angles and linear dimensions of a problem for GARLIC when the positions of the detector point and the end points of the line source are given as points in an arbitrary rectangular coordinate system. As an example, the isodose curves in water are presented for a monoenergetic cosine-distributed line source at several source energies and for an operating fuel element of the Swedish reactor R3.
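The quantity GARLIC integrates is the standard point-kernel expression for a shielded line source, written here in generic form (the program's treatment of build-up in the additional shields, applied as constant factors outside the integral, differs as described above):

```latex
% Flux at detector point P from a line source of strength S_L(l) behind a slab shield.
% r(l): distance from the line element to the detector, mu: attenuation coefficient
% of the shield material along that path, B: point-source build-up factor.
\[
  \phi(P) \;=\; \int_{L} \frac{S_L(l)\; B\!\big(\mu\, r(l)\big)\; e^{-\mu\, r(l)}}{4\pi\, r(l)^{2}}\; dl
\]
```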
NASA Astrophysics Data System (ADS)
Cong, Chao; Liu, Dingsheng; Zhao, Lingjun
2008-12-01
This paper discusses a new method for the automatic matching of ground control points (GCPs) between satellite remote sensing images and digital raster graphics (DRGs) in urban areas. The key of this method is to automatically extract tie point pairs from such heterogeneous images according to their geographic characteristics. Since these heterogeneous images differ greatly with respect to texture and corner features, a more detailed analysis is performed to find similarities and differences between high-resolution remote sensing images and DRGs. Furthermore, a new algorithm based on the fuzzy c-means (FCM) method is proposed to extract linear features in the remote sensing image; crossings and corners extracted from these linear features are chosen as GCPs. A similar method is used to find the same features in the DRGs. Finally, the Hausdorff distance is adopted to pick matching GCPs from the two GCP groups. Experiments show that the method can extract GCPs from such images with a reasonable RMS error.
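The final matching step can be illustrated with the symmetric Hausdorff distance as implemented in SciPy; candidate GCP groupings with the smallest distance would be retained as matches (a sketch, not the authors' code):

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def symmetric_hausdorff(points_a, points_b):
    """points_a, points_b: (N,2) and (M,2) arrays of candidate GCP coordinates."""
    d_ab = directed_hausdorff(points_a, points_b)[0]
    d_ba = directed_hausdorff(points_b, points_a)[0]
    return max(d_ab, d_ba)          # symmetric Hausdorff distance between the two sets
```

The measure is attractive here because it compares whole point sets without requiring an initial one-to-one correspondence between the image and DRG candidates.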
Microbial source tracking and transfer hydrodynamics in rural catchments.
NASA Astrophysics Data System (ADS)
Murphy, Sinead; Bhreathnach, Niamh; O'Flaherty, Vincent; Jordan, Philip; Wuertz, Stefan
2013-04-01
In Ireland, bacterial pathogens from continual point source pollution and intermittent pollution from diffuse sources can impact both drinking water supplies and recreational waters. This poses a serious public health threat. Observing and establishing the source of faecal pollution is imperative for the protection of water quality and human health. Traditional culture methods to detect such pollution via faecal indicator bacteria have been widely utilised but do not decipher the source of pollution. To combat this, microbial source tracking, an important emerging molecular tool, is applied to detect host-specific markers in faecally contaminated waters. The aim of this study is to target ruminant and human-specific faecal Bacteroidales and Bacteroides 16S rRNA genes within rural river catchments in Ireland and investigate hydrological transfer dependencies. During storm events and non-storm periods, 1 L untreated water samples, taken every 2 hours over a 48-hour time period at the spring (Cregduff) or outlet (Dunleer), and large (5-20 L) untreated water samples were collected from two catchment sites. Cregduff is a spring emergence under a grassland karst landscape in Co. Mayo (west coast of Ireland) and Dunleer is a mixed land-use catchment over till soils in Co. Louth (east coast). From a risk assessment point of view, the catchments are very different. Samples were filtered through 0.2 µm nitrocellulose filters to concentrate bacterial cells, which then underwent chemical extraction of total nucleic acids. Animal and human stool samples were also collected from the catchments to determine assay sensitivity and specificity following nucleic acid extraction. Aquifer response to seasonal events was assessed by monitoring coliforms and E. coli occurrence using the IDEXX Colisure® Quanti Tray®/2000 system in conjunction with chemical and hydrological parameters. Autoanalysers deployed at each catchment monitor multiple water parameters every 10 min, such as phosphorus, nitrogen (nitrate), turbidity, conductivity and flow rate. InStat V 3.06 was used to determine correlations between chemical and microbial parameters (P < 0.05 considered significant). There was a positive correlation between E. coli and phosphorus in Cregduff during rain events (p=0.040) and a significant correlation during non-rain periods (p<0.001). There was a positive correlation between E. coli and turbidity in Dunleer during rain events (p=0.0008) and in Cregduff during non-rain periods (p=0.0241). The water samples from Dunleer have a higher concentration of phosphorus than those from Cregduff. Host-specific primers BacCow-UCD, BacHum-UCD, BacUni-UCD and BoBac were then assayed against both faecal and water extracts and quantified using PCR. BacUni-UCD, BacCow-UCD and BoBac detected faecal contamination in three of the four sample sites in Dunleer, and BacHum-UCD detected faecal contamination in one of the sites. The concentrations of the BacUni-UCD qPCR assay were higher in the water samples taken from the Dunleer outlet than in those taken from the Cregduff spring. BacCow-UCD and BacHum-UCD qPCR detected low and very low concentrations, respectively, in water from the Dunleer outlet. The concentrations can be seen changing over the hydrograph event. None of the host-specific assays detected pollution in Cregduff. From the results, it can be seen that Dunleer is more subject to contamination than Cregduff.
Lorenson, T.D.; Hostettler, Frances D.; Rosenbauer, Robert J.; Peters, Kenneth E.; Dougherty, Jennifer A.; Kvenvolden, Keith A.; Gutmacher, Christina E.; Wong, Florence L.; Normark, William R.
2009-01-01
Oil spillage from natural sources is very common in the waters of southern California. Active oil extraction and shipping are occurring concurrently within the region, and it is of great interest to resource managers to be able to distinguish between natural seepage and anthropogenic oil spillage. The major goal of this study was to establish the geologic setting, sources, and ultimate dispersal of natural oil seeps in the offshore southern Santa Maria Basin and Santa Barbara Basins. Our surveys focused on likely areas of hydrocarbon seepage that are known to occur between Point Arguello and Ventura, California. Our approach was to 1) document the locations and geochemically fingerprint natural seep oils or tar; 2) geochemically fingerprint coastal tar residues and potential tar sources in this region, both onshore and offshore; 3) establish chemical correlations between offshore active seeps and coastal residues thus linking seep sources to oil residues; 4) measure the rate of natural seepage of individual seeps and attempt to assess regional natural oil and gas seepage rates; and 5) interpret the petroleum system history for the natural seeps. To document the location of sub-sea oil seeps, we first looked into previous studies within and near our survey area. We measured the concentration of methane gas in the water column in areas of reported seepage and found numerous gas plumes and high concentrations of methane in the water column. The result of this work showed that the seeps were widely distributed between Point Conception east to the vicinity of Coal Oil Point, and that they by and large occur within the 3-mile limit of California State waters. Subsequent cruises used sidescan and high-resolution seismic to map the seafloor, from just south of Point Arguello east to near Gaviota, California. The results of the methane survey guided the exploration of the area west of Point Conception east to Gaviota using a combination of seismic instruments. The seafloor was mapped by sidescan sonar, and numerous lines of high-resolution seismic surveys were conducted over areas of interest. Biomarker and stable carbon isotope ratios were used to infer the age, lithology, organic matter input, and depositional environment of the source rocks for 388 samples of produced crude oil, seep oil, and tarballs mainly from coastal California. These samples were used to construct a chemometric fingerprint (multivariate statistics) decision tree to classify 288 additional samples, including tarballs of unknown origin collected from Monterey and San Mateo County beaches after a storm in early 2007. A subset of 9 of 23 active offshore platform oils and one inactive platform oil representing a few oil reservoirs from the western Santa Barbara Channel were used in this analysis, and thus this model is not comprehensive and the findings are not conclusive. The platform oils included in this study are, from west to east: Irene, Hidalgo, Harvest, Hermosa, Heritage, Harmony, Hondo, Holly, Platform A, and Hilda (now removed). The results identify three 'tribes' of 13C-rich oil samples inferred to originate from thermally mature equivalents of the clayey-siliceous, carbonaceous marl, and lower calcareous-siliceous members of the Monterey Formation. Tribe 1 contains four oil families having geochemical traits of clay-rich marine shale source rock deposited under suboxic conditions with substantial higher-plant input.
Tribe 2 contains four oil families with intermediate traits, except for abundant 28,30-bisnorhopane, indicating suboxic to anoxic marine marl source rock with hemipelagic input. Tribe 3 contains five oil families with traits of distal marine carbonate source rock deposited under anoxic conditions with pelagic but little or no higher-plant input. Tribes 1 and 2 occur mainly south of Point Conception in paleogeographic settings where deep burial of the Monterey Formation source rock favored generation from all thre
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
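Step 3, the graph analysis from hazard sources to vulnerable entities, can be illustrated with a generic graph library; the node names below are hypothetical and are not taken from the Orion models:

```python
import networkx as nx

# hypothetical architecture fragment: edges point from a component to the
# components or functions it can affect
g = nx.DiGraph()
g.add_edges_from([
    ("thruster_valve", "prop_controller"),
    ("prop_controller", "flight_software"),
    ("flight_software", "crew_display"),
])

# enumerate candidate propagation paths from a hazard source to a vulnerable function
for path in nx.all_simple_paths(g, source="thruster_valve", target="crew_display"):
    print(" -> ".join(path))
```

Each enumerated path is then a candidate scenario for software integration testing, as described above.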
Cloud point extraction of Δ9-tetrahydrocannabinol from cannabis resin.
Ameur, S; Haddou, B; Derriche, Z; Canselier, J P; Gourdon, C
2013-04-01
A cloud point extraction method coupled with high performance liquid chromatography (HPLC/UV) was developed for the determination of Δ9-tetrahydrocannabinol (THC) in the micellar phase. The nonionic surfactant "Dowfax 20B102" was used to extract and pre-concentrate THC from cannabis resin prior to its determination with an HPLC-UV system (diode array detector) with isocratic elution. The parameters and variables affecting the extraction were investigated. Under optimum conditions (1 wt.% Dowfax 20B102, 1 wt.% Na2SO4, T = 318 K, t = 30 min), the method yielded a satisfactory recovery rate (~81%). The limit of detection was 0.04 μg mL⁻¹, and the relative standard deviation was less than 2%. Compared with conventional solid-liquid extraction, this new method avoids the use of volatile organic solvents and is therefore environmentally safer.
Automatic extraction of pavement markings on streets from point cloud data of mobile LiDAR
NASA Astrophysics Data System (ADS)
Gao, Yang; Zhong, Ruofei; Tang, Tao; Wang, Liuzhao; Liu, Xianlin
2017-08-01
Pavement markings provide an important foundation for keeping road users safe. Accurate and comprehensive information about pavement markings assists road regulators and is useful for developing driverless technology. Mobile light detection and ranging (LiDAR) systems offer new opportunities to collect and process accurate pavement marking information. Mobile LiDAR systems can directly obtain the three-dimensional (3D) coordinates and intensity of an object, providing spatial data on 3D objects in a fast and efficient way. The RGB attribute information of data points can be obtained from the panoramic camera in the system. In this paper, we present a novel method to automatically extract pavement markings using multiple attributes of the laser scanning point cloud from mobile LiDAR data. The method utilizes the differential grayscale of RGB color, the laser pulse reflection intensity, and the differential intensity to identify and extract pavement markings. We use point cloud density to remove noise and morphological operations to eliminate errors. In the application, we tested our method on different sections of roads in Beijing, China, and Buffalo, NY, USA. The results indicated that both correctness (p) and completeness (r) were higher than 90%. The method can be applied to extract pavement markings from the huge point cloud data produced by mobile LiDAR.
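A simplified sketch of the attribute-based extraction described above, combining an intensity threshold with a density filter (the thresholds and cell size are illustrative assumptions, and the RGB and differential-intensity attributes are omitted for brevity):

```python
import numpy as np

def extract_marking_points(xyz, intensity, cell=0.2, k_sigma=2.0, min_pts=5):
    """xyz: (N,3) road-surface points; intensity: (N,) laser pulse reflectance."""
    # 1. intensity threshold: markings reflect more strongly than asphalt
    thr = intensity.mean() + k_sigma * intensity.std()
    cand = intensity > thr
    # 2. density filter on a 2-D grid to drop isolated noisy returns
    ij = np.floor(xyz[cand, :2] / cell).astype(int)
    keys, inverse, counts = np.unique(ij, axis=0, return_inverse=True, return_counts=True)
    dense = counts[inverse] >= min_pts
    out = np.zeros(len(xyz), dtype=bool)
    out[np.flatnonzero(cand)[dense]] = True
    return out          # boolean mask of probable pavement-marking points
```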
Sinhal, Tapati Manohar; Shah, Ruchi Rani Purvesh; Jais, Pratik Subhas; Shah, Nimisha Chinmay; Hadwani, Krupali Dhirubhai; Rothe, Tushar; Sinhal, Neha Nilesh
2018-01-01
The aim of this study is to compare and to evaluate sealing ability of newly introduced C-point system, cold lateral condensation, and thermoplasticized gutta-percha obturating technique using a dye extraction method. Sixty extracted maxillary central incisors were decoronated below the cementoenamel junction. Working length was established, and biomechanical preparation was done using K3 rotary files with standard irrigation protocol. Teeth were divided into three groups according to the obturation protocol; Group I-Cold lateral condensation, Group II-Thermoplasticized gutta-percha, and Group III-C-Point obturating system. After obturation all samples were subjected to microleakage assessment using dye extraction method. Obtained scores will be statistical analyzed using ANOVA test and post hoc Tukey's test. One-way analysis of variance revealed that there is significant difference among the three groups with P value (0.000 < 0.05). Tukey's HSD post hoc tests for multiple comparisons test shows that the Group II and III perform significantly better than Group I. Group III performs better than Group II with no significant difference. All the obturating technique showed some degree of microleakage. Root canals filled with C-point system showed least microleakage followed by thermoplasticized obturating technique with no significant difference among them. C-point obturation system could be an alternative to the cold lateral condensation technique.
Karakida, Ryo; Okada, Masato; Amari, Shun-Ichi
2016-07-01
The restricted Boltzmann machine (RBM) is an essential constituent of deep learning, but it is hard to train by using maximum likelihood (ML) learning, which minimizes the Kullback-Leibler (KL) divergence. Instead, contrastive divergence (CD) learning has been developed as an approximation of ML learning and is widely used in practice. To clarify the performance of CD learning, in this paper, we analytically derive the fixed points where the ML and CDn learning rules converge in two types of RBMs: one with Gaussian visible and Gaussian hidden units and the other with Gaussian visible and Bernoulli hidden units. In addition, we analyze the stability of the fixed points. As a result, we find that the stable points of the CDn learning rule coincide with those of the ML learning rule in a Gaussian-Gaussian RBM. We also reveal that larger principal components of the input data are extracted at the stable points. Moreover, in a Gaussian-Bernoulli RBM, we find that both ML and CDn learning can extract independent components at one of the stable points. Our analysis demonstrates that the same feature components as those extracted by ML learning are extracted simply by performing CD1 learning. Expanding this study should elucidate the specific solutions obtained by CD learning in other types of RBMs or in deep networks. Copyright © 2016 Elsevier Ltd. All rights reserved.
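For readers unfamiliar with the learning rule being analyzed, the following is a minimal numpy sketch of a single CD1 update for an RBM with Gaussian visible units (unit variance assumed) and Bernoulli hidden units; it illustrates the generic rule only, not the authors' analytical treatment of its fixed points.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=1e-3):
    """One CD1 update for an RBM with Gaussian visible (unit variance)
    and Bernoulli hidden units. v0: (batch, n_vis) data batch."""
    # Positive phase: hidden activation probabilities given the data.
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)

    # Negative phase: one Gibbs step back to the visible layer and up again.
    v1 = b + h0 @ W.T + rng.standard_normal(v0.shape)   # sample from N(b + W h, I)
    ph1 = sigmoid(v1 @ W + c)

    # Parameter updates from the difference of data and reconstruction statistics.
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c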
A Voxel-Based Filtering Algorithm for Mobile LiDAR Data
NASA Astrophysics Data System (ADS)
Qin, H.; Guan, G.; Yu, Y.; Zhong, L.
2018-04-01
This paper presents a stepwise voxel-based filtering algorithm for mobile LiDAR data. In the first step, to improve computational efficiency, the mobile LiDAR points are partitioned in the xy-plane into a set of two-dimensional (2-D) blocks of a given block size, within each of which all laser points are further organized into an octree structure of three-dimensional (3-D) voxels. A voxel-based upward-growing process is then performed to roughly separate terrain from non-terrain points using global and local terrain thresholds. In the second step, the extracted terrain points are refined by computing voxel curvatures. The voxel-based filtering algorithm is comprehensively discussed through analyses of parameter sensitivity and overall performance. An experimental study on multiple point cloud samples, collected by different commercial mobile LiDAR systems, showed that the proposed algorithm provides a promising solution to terrain point extraction from mobile point clouds.
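A minimal sketch of the first partitioning step might look as follows; it groups points into 2-D blocks and then into 3-D voxels keyed by integer indices, standing in for the octree structure, with block and voxel sizes chosen only for illustration.

import numpy as np
from collections import defaultdict

def partition(points, block_size=20.0, voxel_size=0.5):
    """Group LiDAR points (N, 3) into 2D xy-blocks and, inside each block,
    into 3D voxels keyed by integer indices (a flat stand-in for the octree)."""
    bx = np.floor(points[:, 0] / block_size).astype(int)
    by = np.floor(points[:, 1] / block_size).astype(int)
    vx = np.floor(points[:, 0] / voxel_size).astype(int)
    vy = np.floor(points[:, 1] / voxel_size).astype(int)
    vz = np.floor(points[:, 2] / voxel_size).astype(int)

    blocks = defaultdict(lambda: defaultdict(list))
    for i in range(points.shape[0]):
        blocks[(bx[i], by[i])][(vx[i], vy[i], vz[i])].append(i)
    return blocks  # {block key: {voxel key: [point indices]}}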
Development of a compact filament-discharge multi-cusp H- ion source.
Jia, XianLu; Zhang, TianJue; Zheng, Xia; Qin, JiuChang
2012-02-01
A 14 MeV medical cyclotron with an external ion source has been designed and is being constructed at the China Institute of Atomic Energy. H(-) ions will be accelerated by this machine, and the proton beam will be extracted by carbon strippers in two opposite directions. A compact multi-cusp H(-) ion source has been developed for the cyclotron. The 79.5 mm long, 48 mm diameter ion source consists of a specially shaped filament, ten columns of permanent magnets providing a multi-cusp field, and a three-electrode extraction system. So far, a 3 mA/25 keV H(-) beam with an emittance of 0.3 π mm mrad has been obtained from the ion source. The paper gives the design details and the beam test results. Further experimental study is under way, and an extracted beam of 5 mA is expected.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowder, Jeff; Cornish, Neil J.; Reddinger, J. Lucas
This work presents the first application of the method of genetic algorithms (GAs) to data analysis for the Laser Interferometer Space Antenna (LISA). In the low frequency regime of the LISA band there are expected to be tens of thousands of galactic binary systems that will be emitting gravitational waves detectable by LISA. The challenge of parameter extraction of such a large number of sources in the LISA data stream requires a search method that can efficiently explore the large parameter spaces involved. As signals of many of these sources will overlap, a global search method is desired. GAs represent such a global search method for parameter extraction of multiple overlapping sources in the LISA data stream. We find that GAs are able to correctly extract source parameters for overlapping sources. Several optimizations of a basic GA are presented with results derived from applications of the GA searches to simulated LISA data.
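The abstract does not specify the GA operators used, so the following is a generic real-coded GA sketch (tournament selection, uniform crossover, Gaussian mutation) of the kind that could drive such a parameter search; the fitness function, e.g. minus the residual between a template waveform and the data stream, is left as a user-supplied callable and is an assumption of this sketch.

import numpy as np

rng = np.random.default_rng(1)

def run_ga(fitness, bounds, pop_size=100, n_gen=200, mut_sigma=0.05, p_mut=0.1):
    """Minimal real-coded genetic algorithm.

    fitness: callable mapping a parameter vector to a score (higher is better).
    bounds:  (n_params, 2) array of lower/upper limits for each parameter.
    """
    lo, hi = bounds[:, 0], bounds[:, 1]
    pop = lo + rng.random((pop_size, len(lo))) * (hi - lo)

    for _ in range(n_gen):
        scores = np.array([fitness(ind) for ind in pop])

        # Tournament selection: each parent is the better of two random picks.
        a, b = rng.integers(0, pop_size, (2, pop_size))
        parents = pop[np.where(scores[a] > scores[b], a, b)]

        # Uniform crossover between consecutive parent pairs.
        mask = rng.random(pop.shape) < 0.5
        children = np.where(mask, parents, np.roll(parents, 1, axis=0))

        # Gaussian mutation, clipped back into the allowed parameter ranges.
        mutate = rng.random(pop.shape) < p_mut
        children += mutate * rng.normal(0, mut_sigma, pop.shape) * (hi - lo)
        pop = np.clip(children, lo, hi)

    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(scores)]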
A method to improve data transmission efficiency of non-cabled seismographs
NASA Astrophysics Data System (ADS)
Zheng, F.; Lin, J.; Huaizhu, Z.; Yang, H.
2012-12-01
The non-cable self-locating seismograph developed by the College of Instrumentation and Electrical Engineering, Jilin University, integrates a built-in battery, storage, WiFi, GPS, and precision data acquisition. It is suitable for complex terrain that is typically not well addressed by cabled telemetric seismic instruments, such as mountains, swamps, and rivers, and it provides strong support for core functions such as long-term observation, wired and wireless data transmission, self-positioning, and precise clock synchronization. The instrument supports both time-window and continuous data acquisition. When the sampling time is long and the sampling rate is high, a huge amount of raw seismic data is stored in the seismograph. As a result, it usually takes a long time, sometimes too long to be acceptable, to recover data in quasi real time over a wireless link in resource exploration, especially in complex terrain. Furthermore, a large part of the recovered data is useless noise and only a small percentage is useful. For example, during an exploration experiment at a Chinese mine on July 12 and 14, 2012, we used 20 non-cable seismographs, each with 4 channels. With a total of 80 channels, 36 GB of data were collected over two acquisition sessions. Eighty shot points were laid out, each recorded for 4 seconds, so the volume of valid data was only about 100 MB, i.e., roughly 0.3% of the total. At a wired data recovery rate of 200 Mbps, 0.4 hours were needed to transmit all the data; reviewing data on the spot over a 10 Mbps wireless link takes even longer. A storage-type non-cable seismograph stores the collected data in several data files, and if the source trigger time and vibration duration are known, data can be collected much faster, improving transmission efficiency. To this end, a triggering station was developed. It is a non-cable seismograph that, in addition to the regular functions of collecting, storing, and transmitting data, can acquire, record, and transmit the source trigger time. GPS is built into every non-cable seismograph to ensure accurate clock synchronization across all working units. The triggering station obtains the source trigger time accurately, stores it in a file, and sends it to the server or a portable terminal over the wireless link. The management system on the server checks the clock synchronization information of each seismograph against the trigger time, determines the exact sample corresponding to the trigger time, and extracts the corresponding data according to a predetermined trigger length. It then sequences the data along the survey line and integrates it into a seismic data file in the appropriate format, completing the extraction of the single-shot data. For off-site data recovery, all trigger times can be extracted from the triggering station and the data recovered with the same procedure after the experiment. This method rapidly extracts valid data from the recovered data. Many field experiments have shown that it effectively improves the data transmission efficiency of non-cabled seismographs and saves data storage space on the servers.
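The core of the single-shot extraction reduces to cutting a time window out of a GPS-timestamped continuous record at the source trigger time, as in the following sketch; the function and variable names and the 4 s window are illustrative assumptions.

import numpy as np

def cut_shot_window(trace, record_start, fs, trigger_time, window_s=4.0):
    """Extract the samples of one shot from a continuous record.

    trace:        1-D array of one continuously recorded channel.
    record_start: GPS time (s) of the first sample.
    fs:           sampling rate (Hz).
    trigger_time: GPS time (s) of the source trigger.
    window_s:     record length to keep after the trigger (s).
    """
    first = int(round((trigger_time - record_start) * fs))
    last = first + int(round(window_s * fs))
    if first < 0 or last > trace.size:
        raise ValueError("trigger window falls outside the recorded data")
    return trace[first:last]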
Affonso, Regina Celis Lopes; Voytena, Ana Paula Lorenzen; Fanan, Simone; Pitz, Heloísa; Coelho, Daniela Sousa; Horstmann, Ana Luiza; Pereira, Aline; Uarrota, Virgílio Gavicho; Hillmann, Maria Clara; Varela, Lucas Andre Calbusch; Ribeiro-do-Valle, Rosa Maria; Maraschin, Marcelo
2016-01-01
World coffee consumption has been growing because of its appreciated taste and its beneficial effects on health. The residual biomass of coffee, generated in the food industry after oil extraction from coffee beans and known as coffee bean residual press cake, has attracted interest as a source of compounds with antioxidant activity. This study investigated the chemical composition of aqueous extracts of coffee bean residual press cake (AE), their antioxidant activity, and the effect of topical application, in an animal model, of hydrogels containing the AE, chlorogenic acid (CGA), allantoin (positive control), and carbopol (negative control) on skin wound healing. The treatments' performance was compared by measuring the reduction of the wound area, with a superior result (p < 0.05) for the green coffee AE (78.20%) with respect to roasted coffee AE (53.71%), allantoin (70.83%), and carbopol (23.56%). CGA hydrogels significantly reduced the wound area during the inflammatory phase, which may be associated with the well-known antioxidant and anti-inflammatory actions of that compound. The topical use of the coffee AE studied improved skin wound healing and points to an interesting biotechnological application of the coffee bean residual press cake.
Wang, Dongjie; Williams, Barbara A; Ferruzzi, Mario G; D'Arcy, Bruce R
2013-01-01
Grape seed extract (GSE) phenolics have potential health-promoting properties, either from compounds present within the extract or from metabolites resulting from gastrointestinal tract (GIT) fermentation of these compounds. This study describes how GSE affected the kinetics and end-products of starch fermentation in vitro using pig intestinal and fecal inocula. Six GSE concentrations (0, 60, 125, 250, 500, and 750 µg ml⁻¹) were fermented in vitro by porcine ileal and fecal microbiota using starch as the energy source. Cumulative gas production and end-point short-chain fatty acids (SCFA) and ammonia were measured. GSE phenolics altered the pattern (gas kinetics and end-products such as SCFA and NH₄⁺) of starch fermentation by both inocula at concentrations above 250 µg ml⁻¹. Below this level, neither inoculum showed any significant (P > 0.05) effect of the GSE. The results show that GSE phenolics at a concentration over 250 µg ml⁻¹ can have measurable effects on microbial activity in an in vitro fermentation system, as evidenced by the changes in kinetics and end-products of starch fermentation. This suggests that fermentation patterns could conceivably be shifted in the actual GIT, though further evidence will be required from in vivo studies. Copyright © 2012 Society of Chemical Industry.
Vázquez, Cristina; Maier, Marta S; Parera, Sara D; Yacobaccio, Hugo; Solá, Patricia
2008-06-01
Archaeological samples are complex in composition since they generally comprise a mixture of materials submitted to deterioration factors largely dependent on the environmental conditions. Therefore, the integration of analytical tools such as TXRF, FT-IR and GC-MS can maximize the amount of information provided by the sample. Recently, two black rock art samples of camelid figures at Alero Hornillos 2, an archaeological site located near the town of Susques (Jujuy Province, Argentina), were investigated. TXRF, selected for inorganic information, showed the presence of manganese and iron among other elements, consistent with an iron and manganese oxide as the black pigment. Aiming at the detection of any residual organic compounds, the samples were extracted with a chloroform-methanol mixture and the extracts were analyzed by FT-IR, showing the presence of bands attributable to lipids. Analysis by GC-MS of the carboxylic acid methyl esters prepared from the sample extracts, indicated that the main organic constituents were saturated (C(16:0) and C(18:0)) fatty acids in relative abundance characteristic of degraded animal fat. The presence of minor C(15:0) and C(17:0) fatty acids and branched-chain iso-C(16:0) pointed to a ruminant animal source.
Lacroix, Frederic; Guillot, Mathieu; McEwen, Malcolm; Gingras, Luc; Beaulieu, Luc
2011-10-01
This work presents the experimental extraction of the perturbation factor in megavoltage electron beams for three models of silicon diodes (the IBA Dosimetry EFD and SFD, and the unshielded PTW 60012) using a plastic scintillation detector (PSD). The authors used a single scanning PSD mounted on a high-precision scanning tank to measure depth-dose curves in 6-, 12-, and 18-MeV clinical electron beams. They also measured depth-dose curves using the IBA Dosimetry EFD and SFD and the PTW 60012 diodes. The authors used the depth-dose curves measured with the PSD as a perturbation-free reference to extract the perturbation factors of the diodes. The authors found that the perturbation factors for the diodes increased substantially with depth, especially for low-energy electron beams. The experimental results show the same trend as published Monte Carlo simulation results for the EFD diode; however, the perturbations measured experimentally were greater. They found that using an effective point of measurement (EPOM) placed slightly away from the source reduced the variation of the perturbation factors with depth and that the optimal EPOM appears to be energy dependent. The manufacturer-recommended EPOM appears to be incorrect at low electron energy (6 MeV). In addition, the perturbation factors for diodes may be greater than predicted by Monte Carlo simulations.
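A sketch of how such a depth-dependent perturbation factor could be obtained from two measured depth-dose curves is shown below, treating the PSD curve as the perturbation-free reference and applying an EPOM shift to the diode depths; the normalization choice, the common grid, and the sign convention of the shift are assumptions, not the authors' exact procedure.

import numpy as np

def perturbation_factor(depth_psd, dose_psd, depth_diode, dose_diode, epom_shift=0.0):
    """Ratio of the PSD (taken as perturbation-free) to the diode depth-dose
    curve on a common depth grid, after shifting the diode depths by an
    effective point of measurement (EPOM) offset in cm.
    Assumes both depth arrays are sorted in increasing order."""
    grid = np.linspace(max(depth_psd.min(), depth_diode.min() + epom_shift),
                       min(depth_psd.max(), depth_diode.max() + epom_shift), 200)
    psd = np.interp(grid, depth_psd, dose_psd)
    diode = np.interp(grid, depth_diode + epom_shift, dose_diode)
    # Normalize both curves to their maxima before taking the ratio.
    return grid, (psd / psd.max()) / (diode / diode.max())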
An occupational respiratory allergy caused by Sinapis alba pollen in olive farmers.
Anguita, J L; Palacios, L; Ruiz-Valenzuela, L; Bartolomé, B; López-Urbano, M J; Sáenz de San Pedro, B; Cano, E; Quiralte, J
2007-04-01
Sinapis alba (white mustard) is an entomophilic species of the Brassicaceae family. To date it has not been related to allergic sensitization or clinical respiratory disease. Twelve olive orchard workers had a history of rhinitis and/or bronchial asthma that occurred during weed control management and/or harvest, from January to March. They underwent skin prick tests (SPT) with S. alba pollen extract and a standard battery of aeroallergens. Sinapis alba pollen extract was prepared for quantitative skin tests, enzyme allergosorbent tests, and nasal challenge tests (NCT). A portable monitoring station and an urban volumetric Hirst-type spore trap were used for the aerobiological study. Eleven patients suffered from rhinitis and bronchial asthma, and one had only rhinitis. All patients were sensitized to S. alba pollen extract, and all showed a positive NCT response. At the urban aerobiological monitoring station the amount of S. alba pollen only exceptionally reached peaks of 21 grains/m(3), whereas in the work environment peaks of 1801 grains/m(3) were detected between 15 February and 7 April. We demonstrate the existence of a new occupational allergen for olive farmers: S. alba pollen. We point out the importance of performing aerobiological sampling within the occupational environment for the detection and quantification of the allergenic source.
Temporal Analysis and Automatic Calibration of the Velodyne HDL-32E LiDAR System
NASA Astrophysics Data System (ADS)
Chan, T. O.; Lichti, D. D.; Belton, D.
2013-10-01
At the end of the first quarter of 2012, more than 600 Velodyne LiDAR systems had been sold worldwide for various robotic and high-accuracy survey applications. The ultra-compact Velodyne HDL-32E LiDAR has become a predominant sensor for many applications that require lower sensor size/weight and cost. For high accuracy applications, cost-effective calibration methods with minimal manual intervention are always desired by users. However, calibration is complicated by the Velodyne LiDAR's narrow vertical field of view and the highly time-variant nature of its measurements. In this paper, the temporal stability of the HDL-32E is first analysed as the motivation for developing a new, automated calibration method. This is followed by a detailed description of the calibration method, which is driven by a novel segmentation method for extracting vertical cylindrical features from the Velodyne point clouds. The proposed segmentation method exploits the slice-like nature of the Velodyne point cloud and first decomposes the cloud into 2D layers. The layers are then treated as 2D images and processed with the Generalized Hough Transform, which extracts the points distributed in circular patterns within each layer. Subsequently, the vertical cylindrical features can be readily extracted from the whole point cloud based on the previously extracted points. The points are passed to the calibration, which estimates the cylinder parameters and the LiDAR's additional parameters simultaneously by constraining the segmented points to fit the cylindrical geometric model such that the weighted sum of the adjustment residuals is minimized. The proposed calibration is highly automated, allowing end users to obtain the time-variant additional parameters instantly and frequently whenever vertical cylindrical features are present in the scene. The methods were verified with two different real datasets, and the results suggest that an accuracy improvement of up to 78.43% can be achieved for the HDL-32E using the proposed calibration method.
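As a simplified stand-in for the layer-wise processing described above, the sketch below slices the cloud into horizontal layers and fits a circle to each layer by algebraic least squares; the authors use the Generalized Hough Transform for this step, so this fit is only a compact illustration of recovering circular cross-sections of vertical cylinders, with an illustrative layer thickness.

import numpy as np

def fit_circle(xy):
    """Algebraic (Kasa) least-squares circle fit to 2-D points."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    (cx, cy, c), *_ = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)
    return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)

def circles_per_layer(points, layer=0.1):
    """Slice a point cloud (N, 3) into horizontal layers and fit a circle
    to each layer that has enough points (a crude cylinder estimate)."""
    z_idx = np.floor(points[:, 2] / layer).astype(int)
    results = {}
    for k in np.unique(z_idx):
        sl = points[z_idx == k, :2]
        if sl.shape[0] >= 10:
            results[k] = fit_circle(sl)
    return results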
Ehlers, Kenneth W.; Leung, Ka-Ngo
1988-01-01
A high concentration of positive molecular ions of hydrogen or deuterium gas is extracted from a positive ion source having a short path length of extracted ions, relative to the mean free path of the gas molecules, to minimize the production of other ion species by collision between the positive ions and gas molecules. The ion source has arrays of permanent magnets to produce a multi-cusp magnetic field in regions remote from the plasma grid and the electron emitters, for largely confining the plasma to the space therebetween. The ion source has a chamber which is short in length, relative to its transverse dimensions, and the electron emitters are at an even shorter distance from the plasma grid, which contains one or more extraction apertures.
NASA Astrophysics Data System (ADS)
Duarte, João; Gonçalves, Gil; Duarte, Diogo; Figueiredo, Fernando; Mira, Maria
2015-04-01
Photogrammetric Unmanned Aerial Vehicles (UAVs) and Terrestrial Laser Scanners (TLS) are two emerging technologies that allow the production of dense 3D point clouds of the sensed topographic surfaces. Although image-based stereo-photogrammetric point clouds cannot, in general, compete with TLS point clouds on geometric quality, fully automated mapping solutions based on ultra-light UAVs (or drones) have recently become commercially available at very reasonable accuracy and cost for engineering and geological applications. The purpose of this paper is to compare the point clouds generated by these two technologies in order to automate the manual tasks commonly used to detect and represent the attitude of discontinuities (stereographic projection: Schmidt net, equal area). This fundamental step in all geological/geotechnical studies applied to the extractive industry and engineering works has to be replaced by a more expeditious and reliable methodology that avoids the difficulties of access and guarantees safe data-survey conditions. Such a methodology will provide a clearer and more direct answer to the needs of rock mass evaluation, by mapping the structures present, and will considerably reduce the associated risks (investment, structure dimensioning, safety, etc.). A case study of a dolerite outcrop located in the center of Portugal (in the volcanic complex of Serra de Todo-o-Mundo, Casais Gaiola, intruded in Jurassic sandstones) is used to assess this methodology. The results show that the 3D point cloud produced by the photogrammetric UAV platform has the appropriate geometric quality for extracting the parameters that define the discontinuities of the dolerite outcrop. Although these parameters are comparable to the manually extracted ones, their quality is inferior to the parameters extracted from the TLS point cloud.
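The stereonet step being automated ultimately reduces to converting a plane fitted to the points of one discontinuity into dip and dip direction, e.g. as in the following sketch; an East-North-Up coordinate frame and an already-segmented patch of points belonging to a single discontinuity are assumed.

import numpy as np

def plane_attitude(points):
    """Dip and dip direction (degrees) of a plane fitted to an (N, 3) patch of
    points given in an (East, North, Up) frame, via SVD plane fitting."""
    centered = points - points.mean(axis=0)
    # The singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]
    if n[2] < 0:          # force the normal to point upward
        n = -n
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
    dip_direction = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip, dip_direction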
Continuously Deformation Monitoring of Subway Tunnel Based on Terrestrial Point Clouds
NASA Astrophysics Data System (ADS)
Kang, Z.; Tuo, L.; Zlatanova, S.
2012-07-01
Deformation monitoring of subway tunnels is of extraordinary necessity. Therefore, a method for deformation monitoring based on terrestrial point clouds is proposed in this paper. First, the traditional adjacent-station registration is replaced by section-controlled registration, so that common control points can be used by each station and error accumulation within a section is avoided. Afterwards, the central axis of the subway tunnel is determined through the RANSAC (Random Sample Consensus) algorithm and curve fitting. Although the resolution is very high, laser points are still discrete; therefore each vertical section, defined by the intersection curve obtained by rotating about the central axis of the tunnel within a vertical plane, is computed via quadric fitting of the local vicinity of interest instead of fitting the whole tunnel model. The extraction of the vertical section is then optimized using RANSAC to filter out noise. Based on the extracted vertical sections, the volume of tunnel deformation is estimated by comparing vertical sections extracted at the same position from different epochs of point clouds. Furthermore, the continuously extracted vertical sections are used to evaluate the convergent tendency of the tunnel. The proposed algorithms are verified using real datasets in terms of accuracy and computational efficiency. The fitting accuracy analysis shows that the maximum deviation between interpolated and real points is 1.5 mm and the minimum is 0.1 mm; the convergent tendency of the tunnel was detected by comparing adjacent fitting radii, with a maximum error of 6 mm and a minimum of 1 mm. The computation cost of vertical section extraction is within 3 seconds per section, which demonstrates high efficiency.
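A hedged sketch of the noise-robust cross-section fitting is given below, using RANSAC around an algebraic circle fit and comparing the fitted radii of two epochs; the inlier tolerance and the circular (rather than quadric) section model are simplifying assumptions made only for illustration.

import numpy as np

rng = np.random.default_rng(2)

def kasa_circle(xy):
    """Algebraic least-squares circle fit; returns (center, radius)."""
    x, y = xy[:, 0], xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    (cx, cy, c), *_ = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)
    return np.array([cx, cy]), np.sqrt(max(c + cx ** 2 + cy ** 2, 0.0))

def ransac_circle(xy, n_iter=500, tol=0.005):
    """RANSAC circle fit to one tunnel cross-section (points in metres);
    tol is the allowed radial deviation (m) for a point to count as an inlier."""
    best = np.zeros(len(xy), dtype=bool)
    for _ in range(n_iter):
        center, r = kasa_circle(xy[rng.choice(len(xy), 3, replace=False)])
        if not np.isfinite(r) or r == 0.0:
            continue
        inliers = np.abs(np.linalg.norm(xy - center, axis=1) - r) < tol
        if inliers.sum() > best.sum():
            best = inliers
    return kasa_circle(xy[best])  # final fit on all inliers

# Convergence between epochs at the same chainage, e.g.:
#   _, r0 = ransac_circle(section_epoch0)
#   _, r1 = ransac_circle(section_epoch1)
#   radial_change = r1 - r0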
NASA Astrophysics Data System (ADS)
Wang, Yongzhi; Ma, Yuqing; Zhu, A.-xing; Zhao, Hui; Liao, Lixia
2018-05-01
Facade features represent segmentations of building surfaces and can serve as a building framework. Extracting facade features from three-dimensional (3D) point cloud data (3D PCD) is an efficient method for 3D building modeling. Combining the advantages of 3D PCD and two-dimensional optical images, this study describes a highly accurate method for extracting building facade features from 3D PCD with a focus on structural information. The new extraction method involves three major steps: image feature extraction, exploration of the mapping method between the image features and the 3D PCD, and optimization of the initial 3D PCD facade features considering structural information. Results show that the new method can extract the 3D PCD facade features of buildings more accurately and continuously. The new method is validated using a case study. In addition, the effectiveness of the new method is demonstrated by comparing it with the range-image extraction method and the optical-image extraction method in the absence of structural information. The 3D PCD facade features extracted by the new method can be applied in many fields, such as 3D building modeling and building information modeling.
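One plausible way to realize the mapping between image features and the 3D PCD is to project each point into the image with a pinhole model and keep the points that land on detected feature pixels, as in the sketch below; the known intrinsics/extrinsics and the boolean feature mask are assumptions, since the abstract does not spell out the authors' mapping.

import numpy as np

def map_image_features_to_points(points, K, R, t, feature_mask):
    """Return indices of 3-D points whose pinhole projection falls on a
    feature pixel (e.g. an edge map of the facade image).

    points:       (N, 3) coordinates in the world frame.
    K:            (3, 3) camera intrinsic matrix.
    R, t:         world-to-camera rotation (3, 3) and translation (3,).
    feature_mask: (H, W) boolean image of detected facade features.
    """
    cam = points @ R.T + t                # world -> camera frame
    z = cam[:, 2]
    in_front = z > 1e-6
    uv = (cam @ K.T)[:, :2]
    u = np.full(len(points), -1)
    v = np.full(len(points), -1)
    u[in_front] = np.round(uv[in_front, 0] / z[in_front]).astype(int)
    v[in_front] = np.round(uv[in_front, 1] / z[in_front]).astype(int)
    h, w = feature_mask.shape
    valid = in_front & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    hits = np.zeros(len(points), dtype=bool)
    hits[valid] = feature_mask[v[valid], u[valid]]
    return np.nonzero(hits)[0]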
Automatic facial animation parameters extraction in MPEG-4 visual communication
NASA Astrophysics Data System (ADS)
Yang, Chenggen; Gong, Wanwei; Yu, Lu
2002-01-01
Facial Animation Parameters (FAPs) are defined in MPEG-4 to animate a facial object. The algorithm proposed in this paper to extract these FAPs is applied to very low bit-rate video communication, in which the scene is composed of a head-and-shoulder object with a complex background. The paper presents an algorithm that automatically extracts all FAPs needed to animate a generic facial model and estimates the 3D head motion from feature points. The proposed algorithm extracts the human facial region by color segmentation and intra-frame and inter-frame edge detection. Facial structure and the edge distribution of facial features, such as vertical and horizontal gradient histograms, are used to locate the facial feature regions. Parabolic and circular deformable templates are employed to fit the facial features and extract a subset of the FAPs. A special data structure is proposed to describe the deformable templates and reduce the time required to compute the energy functions. The remaining FAPs, the 3D rigid head motion vectors, are estimated by a corresponding-points method. A 3D head wire-frame model provides facial semantic information for the selection of proper corresponding points, which helps to increase the accuracy of the 3D rigid motion estimation.
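The corresponding-points estimation of rigid head motion can be implemented with the standard SVD (Kabsch) solution sketched below; this is a common formulation and not necessarily the authors' exact one.

import numpy as np

def rigid_motion(p, q):
    """Least-squares rotation R and translation t with q ≈ p @ R.T + t,
    given corresponding 3-D point sets p, q of shape (N, 3)."""
    pc, qc = p.mean(axis=0), q.mean(axis=0)
    H = (p - pc).T @ (q - qc)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = qc - pc @ R.T
    return R, t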
Chen, Meilian; Kim, Sunghwan; Park, Jae-Eun; Kim, Hyun Sik; Hur, Jin
2016-07-01
Noting the source-dependent properties of dissolved organic matter (DOM), this study explored the compounds recoverable by solid phase extraction (SPE) on two common sorbents (C18 and PPL) eluted with methanol for contrasting DOM sources, via fluorescence excitation-emission matrix coupled with parallel factor analysis (EEM-PARAFAC) and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR-MS). Fresh algal and leaf litter extract DOM, one riverine DOM, and one upstream lacustrine DOM were selected for the comparison. The C18 sorbent was generally found to extract more diverse molecular formulas, relatively higher molecular weights, and more heteroatomic DOM compounds within the studied mass range than the PPL sorbent, except for the leaf litter extract. Even with the same sorbent, the main molecular features of the two end-member DOM samples were distributed on different sides of the axes of a multivariate ordination, indicating the source-dependent characteristics of the compounds recovered by the sorbents. In addition, further examination of the molecular formulas uniquely present in the two end members and the upstream lake DOM suggested that proteinaceous, tannin-like, and heteroatomic DOM constituents might be potential compound groups that are labile and easily degraded during their mobilization into the downstream watershed. This study provides new insights into the sorbent selectivity of DOM from diverse sources and the potential lability of various compound groups.
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
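As a toy illustration of the path-analysis step, the snippet below enumerates simple paths from hazard sources to vulnerable entities in a directed connection graph; the example graph fragment is invented for illustration and is not drawn from the Orion models.

def hazard_paths(graph, sources, targets):
    """Enumerate simple paths from hazard sources to vulnerable entities in a
    directed graph given as {node: [successor, ...]}."""
    found = []

    def walk(node, path):
        if node in targets:
            found.append(path)
            return
        for nxt in graph.get(node, []):
            if nxt not in path:            # keep paths simple (no cycles)
                walk(nxt, path + [nxt])

    for s in sources:
        walk(s, [s])
    return found

# Illustrative (invented) model fragment:
example = {
    "thruster_leak": ["prop_controller"],
    "prop_controller": ["flight_software", "valve_driver"],
    "flight_software": ["crew_display"],
    "valve_driver": ["crew_display"],
}
print(hazard_paths(example, ["thruster_leak"], {"crew_display"}))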
Stanislawski, Larry V.; Survila, Kornelijus; Wendel, Jeffrey; Liu, Yan; Buttenfield, Barbara P.
2018-01-01
This paper describes a workflow for automating the extraction of elevation-derived stream lines using open source tools with parallel computing support and testing the effectiveness of procedures in various terrain conditions within the conterminous United States. Drainage networks are extracted from the US Geological Survey 1/3 arc-second 3D Elevation Program elevation data having a nominal cell size of 10 m. This research demonstrates the utility of open source tools with parallel computing support for extracting connected drainage network patterns and handling depressions in 30 subbasins distributed across humid, dry, and transitional climate regions and in terrain conditions exhibiting a range of slopes. Special attention is given to low-slope terrain, where network connectivity is preserved by generating synthetic stream channels through lake and waterbody polygons. Conflation analysis compares the extracted streams with a 1:24,000-scale National Hydrography Dataset flowline network and shows that similarities are greatest for second- and higher-order tributaries.
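For orientation, drainage extraction of this kind starts from a flow-direction grid; the sketch below computes a plain D8 steepest-descent direction for each interior DEM cell, leaving depression handling, the synthetic channels through waterbodies, and parallelization (the parts this workflow actually contributes) out of scope.

import numpy as np

# D8 neighbour offsets (row, col) and their grid distances in cell units.
OFFSETS = [(-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]
DIST = np.array([1, np.sqrt(2), 1, np.sqrt(2), 1, np.sqrt(2), 1, np.sqrt(2)])

def d8_flow_direction(dem, cell=10.0):
    """Index (0-7) of the steepest-descent neighbour for each interior cell of a
    DEM array; -1 marks cells with no lower neighbour (pits or flats)."""
    rows, cols = dem.shape
    fdir = np.full((rows, cols), -1, dtype=int)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            drops = [(dem[r, c] - dem[r + dr, c + dc]) / (DIST[k] * cell)
                     for k, (dr, dc) in enumerate(OFFSETS)]
            k_best = int(np.argmax(drops))
            if drops[k_best] > 0:
                fdir[r, c] = k_best
    return fdir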
Instantaneous Coastline Extraction from LIDAR Point Cloud and High Resolution Remote Sensing Imagery
NASA Astrophysics Data System (ADS)
Li, Y.; Zhoing, L.; Lai, Z.; Gan, Z.
2018-04-01
A new method is proposed in this paper for instantaneous waterline extraction, which combines point cloud geometry features and image spectral characteristics of the coastal zone. The proposed method consists of the following steps: the Mean Shift algorithm is used to segment the coastal zone of high-resolution remote sensing images into small regions containing semantic information; region features are extracted by integrating the LiDAR data and the image surface area; initial waterlines are extracted by the α-shape algorithm; a region-growing algorithm is then applied to refine the coastline, with a growth rule integrating the intensity and topography of the LiDAR data; finally, the coastline is smoothed. Experiments are conducted to demonstrate the efficiency of the proposed method.
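The α-shape step can be sketched as follows: keep Delaunay triangles whose circumradius is below a chosen α and return the edges used by exactly one kept triangle as the candidate waterline; the α value and the 2-D projection of the points are assumptions made for illustration.

import numpy as np
from scipy.spatial import Delaunay

def alpha_shape_edges(points, alpha):
    """Boundary edges of the 2-D alpha shape of (N, 2) points: keep Delaunay
    triangles whose circumradius is below alpha, then return the edges that
    belong to exactly one kept triangle (the shape outline, e.g. a waterline)."""
    tri = Delaunay(points)
    edge_count = {}
    for ia, ib, ic in tri.simplices:
        pa, pb, pc = points[ia], points[ib], points[ic]
        a = np.linalg.norm(pb - pc)
        b = np.linalg.norm(pa - pc)
        c = np.linalg.norm(pa - pb)
        area = 0.5 * abs((pb[0] - pa[0]) * (pc[1] - pa[1])
                         - (pb[1] - pa[1]) * (pc[0] - pa[0]))
        if area == 0:
            continue
        circumradius = a * b * c / (4.0 * area)
        if circumradius < alpha:
            for e in ((ia, ib), (ib, ic), (ia, ic)):
                e = tuple(sorted(e))
                edge_count[e] = edge_count.get(e, 0) + 1
    return [e for e, n in edge_count.items() if n == 1]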