NASA Astrophysics Data System (ADS)
Zhang, Yang; Liu, Wei; Li, Xiaodong; Yang, Fan; Gao, Peng; Jia, Zhenyuan
2015-10-01
Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of the laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of the laser stripe center extraction based on image evaluation of Gaussian fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray distribution of the laser stripe, evaluation of the Gaussian fitting structural similarity is estimated to provide a threshold value for center compensation. Then using the relationships between the gray distribution of the laser stripe and the multiple source factors, a compensation method of center extraction is presented. Finally, measurement experiments for a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.
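The Gaussian-fitting step described above can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: it fits a Gaussian to a synthetic, noisy stripe cross-section to recover a sub-pixel center, and uses a goodness-of-fit score (here R², standing in for the paper's Gaussian fitting structural similarity) as the threshold that decides whether center compensation is needed.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

# Synthetic cross-section of a laser stripe: Gaussian gray-level profile plus noise.
x = np.arange(50, dtype=float)
rng = np.random.default_rng(0)
profile = gaussian(x, 200.0, 23.4, 3.0) + rng.normal(0.0, 2.0, x.size)

# Fit a Gaussian to recover the sub-pixel stripe center (mu).
p0 = [profile.max(), float(np.argmax(profile)), 2.0]
(a, mu, sigma), _ = curve_fit(gaussian, x, profile, p0=p0)

# A goodness-of-fit score (R^2 here) can serve as the threshold that
# triggers center compensation when the profile deviates from Gaussian.
residuals = profile - gaussian(x, a, mu, sigma)
r2 = 1.0 - np.sum(residuals ** 2) / np.sum((profile - profile.mean()) ** 2)
print(round(mu, 2), round(r2, 3))
```

A low R² would indicate that reflectivity or transmission effects have distorted the profile, flagging the point for compensation.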
Discovery of Newer Therapeutic Leads for Prostate Cancer
2009-06-01
promising plant extracts and then prepare large-scale quantities of the plant extracts using supercritical fluid extraction techniques and use this... Large scale plant collections were conducted for 14 of the top 20...material for bioassay-guided fractionation of the biologically active constituents using modern chromatography techniques. The chemical structures of
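Supercritical fluid extraction conditions are usually screened over a small grid of pressures and temperatures before scale-up. As a generic illustration only (the report above does not give its screening grid), a minimal parameter sweep might look like:

```python
from itertools import product

# Hypothetical screening grid for a supercritical CO2 extraction campaign.
# Pressures in MPa and temperatures in K are illustrative values, not
# conditions taken from the report above.
pressures_mpa = [20, 30, 40]
temperatures_k = [313, 333]

conditions = list(product(pressures_mpa, temperatures_k))
print(len(conditions), conditions[0])
```

Each (pressure, temperature) pair would then be run against a fixed batch of plant material and ranked by extract yield or bioactivity.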
Extracting Useful Semantic Information from Large Scale Corpora of Text
ERIC Educational Resources Information Center
Mendoza, Ray Padilla, Jr.
2012-01-01
Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…
López-Padilla, Alexis; Ruiz-Rodriguez, Alejandro; Restrepo Flórez, Claudia Estela; Rivero Barrios, Diana Marsela; Reglero, Guillermo; Fornari, Tiziana
2016-06-25
Vaccinium meridionale Swartz (Mortiño or Colombian blueberry) is one of the Vaccinium species abundantly found across the Colombian mountains, which are characterized by high contents of polyphenolic compounds (anthocyanins and flavonoids). The supercritical fluid extraction (SFE) of Vaccinium species has mainly focused on the study of V. myrtillus L. (blueberry). In this work, the SFE of Mortiño fruit from Colombia was studied in a small-scale extraction cell (273 cm³) and different extraction pressures (20 and 30 MPa) and temperatures (313 and 343 K) were investigated. Then, process scaling-up to a larger extraction cell (1350 cm³) was analyzed using well-known semi-empirical engineering approaches. The Broken and Intact Cell (BIC) model was adjusted to represent the kinetic behavior of the low-scale extraction and to simulate the large-scale conditions. Extraction yields obtained were in the range 0.1%-3.2%. Most of the Mortiño solutes are readily accessible and, thus, 92% of the extractable material was recovered in around 30 min. The constant CO₂ residence time criterion produced excellent results regarding the small-scale kinetic curve according to the BIC model, and this conclusion was experimentally validated in large-scale kinetic experiments. PMID:28773640
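The constant CO₂ residence time criterion used for the scale-up above reduces to simple arithmetic: with residence time τ = V/Q (cell volume over volumetric flow), keeping τ constant fixes the large-scale flow rate. The cell volumes below come from the abstract; the small-scale flow rate Q1 is an assumed placeholder, since the abstract does not report it.

```python
# Scale-up by constant CO2 residence time, tau = V / Q.
V1, V2 = 273.0, 1350.0   # extraction cell volumes, cm^3 (from the abstract)
Q1 = 60.0                # small-scale CO2 flow, cm^3/min (assumed for illustration)

tau = V1 / Q1            # residence time to preserve, min
Q2 = V2 / tau            # large-scale flow giving the same residence time
print(round(tau, 2), round(Q2, 1))
```

Any other scale-up criterion (constant superficial velocity, constant solvent-to-feed ratio) would fix Q2 differently; the abstract reports that the constant residence time choice best reproduced the small-scale kinetic curve.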
A Review of Feature Extraction Software for Microarray Gene Expression Data
Tan, Ching Siang; Ting, Wai Soon; Mohamad, Mohd Saberi; Chan, Weng Howe; Deris, Safaai; Ali Shah, Zuraini
2014-01-01
When gene expression data are too large to be processed, they are transformed into a reduced representation set of genes. Transforming large-scale gene expression data into a set of genes is called feature extraction. If the genes extracted are carefully chosen, this gene set can extract the relevant information from the large-scale gene expression data, allowing further analysis by using this reduced representation instead of the full size data. In this paper, we review numerous software applications that can be used for feature extraction. The software reviewed is mainly for Principal Component Analysis (PCA), Independent Component Analysis (ICA), Partial Least Squares (PLS), and Local Linear Embedding (LLE). A summary and sources of the software are provided in the last section for each feature extraction method. PMID:25250315
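Of the methods reviewed, PCA is the simplest to sketch. The toy example below (synthetic data, not from the paper) reduces a samples-by-genes matrix to two components via SVD, the standard route to PCA:

```python
import numpy as np

# Toy "gene expression" matrix: 6 samples x 100 genes.
rng = np.random.default_rng(1)
X = rng.normal(size=(6, 100))

# PCA via SVD: center each gene, then project onto the top-k right
# singular vectors to get a reduced representation of the samples.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2
scores = Xc @ Vt[:k].T   # samples in the reduced 2-D feature space
print(scores.shape)
```

Downstream analysis (clustering, classification) then operates on `scores` instead of the full 100-gene representation.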
Efficient feature extraction from wide-area motion imagery by MapReduce in Hadoop
NASA Astrophysics Data System (ADS)
Cheng, Erkang; Ma, Liya; Blaisse, Adam; Blasch, Erik; Sheaff, Carolyn; Chen, Genshe; Wu, Jie; Ling, Haibin
2014-06-01
Wide-Area Motion Imagery (WAMI) feature extraction is important for applications such as target tracking, traffic management and accident discovery. With the increasing amount of WAMI collections and feature extraction from the data, a scalable framework is needed to handle the large amount of information. Cloud computing is one of the approaches recently applied to large-scale and big-data problems. In this paper, MapReduce in Hadoop is investigated for large-scale feature extraction tasks for WAMI. Specifically, a large dataset of WAMI images is divided into several splits, each holding a small subset of the images. The feature extraction for the WAMI images in each split is distributed to slave nodes in the Hadoop system, and feature extraction for each image is performed individually on the assigned slave node. Finally, the feature extraction results are sent to the Hadoop Distributed File System (HDFS) to aggregate the feature information over the collected imagery. Experiments on feature extraction with and without MapReduce are conducted to illustrate the effectiveness of our proposed Cloud-Enabled WAMI Exploitation (CAWE) approach.
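The map/reduce split described above can be mimicked in plain Python. This is only a structural sketch: the "images" are byte strings, the "feature" is a trivial placeholder, and in a real Hadoop deployment the map stage would run on distributed slave nodes rather than a local `map`.

```python
# Illustrative stand-in for per-image WAMI feature extraction.
def extract_features(image_bytes):
    # Placeholder feature: (image size, byte checksum). A real extractor
    # would compute descriptors such as corners or texture statistics.
    return (len(image_bytes), sum(image_bytes) % 256)

# A "split" of tiny fake images.
images = [bytes([i, i + 1, i + 2]) for i in range(8)]

# "Map" stage: extract features per image, independently.
features = list(map(extract_features, images))

# "Reduce" stage: aggregate feature information over the collection,
# as HDFS-side aggregation does in the paper's pipeline.
total_bytes = sum(f[0] for f in features)
print(features[0], total_bytes)
```

Because each image is processed independently, the map stage parallelizes trivially, which is exactly what makes the workload a good fit for MapReduce.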
Optimization and Scale-up of Inulin Extraction from Taraxacum kok-saghyz roots.
Hahn, Thomas; Klemm, Andrea; Ziesse, Patrick; Harms, Karsten; Wach, Wolfgang; Rupp, Steffen; Hirth, Thomas; Zibek, Susanne
2016-05-01
The optimization and scale-up of inulin extraction from Taraxacum kok-saghyz Rodin was successfully performed. Based on solubility investigations, the extraction temperature was fixed at 85 °C. Inulin stability against degradation or hydrolysis was confirmed by extraction in the presence of model inulin. Having confirmed stability at the given conditions, the isolation procedure was transferred from a 1 L reactor to a 1 m³ reactor. The Reynolds number was selected as the relevant dimensionless number that has to remain constant at both scales. Based on a 300 rpm stirrer speed at the 1 L scale and the relevant physical and process-engineering parameters, the stirrer speed at the large scale was adjusted to 3.25 rpm. These assumptions were confirmed by approximately homologous extraction kinetics at both scales. Since T. kok-saghyz is a focus of research due to its rubber content, the isolation of side-products from the residual biomass is of great economic interest. Inulin is one of these additional side-products; it can be isolated at large scale in high quantity (~35% of dry mass), with a high average degree of polymerization (15.5) and a purity of 77%.
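The constant-Reynolds-number criterion above can be checked by hand. For a stirred vessel, Re = N·D²/ν, so for geometrically similar reactors with the same fluid, keeping Re constant gives N₂ = N₁·(D₁/D₂)². Scaling 1 L to 1 m³ multiplies linear dimensions by 10, which predicts a stirrer speed close to the 3.25 rpm reported (the small difference reflects the real, not perfectly similar, geometry):

```python
# Constant stirrer Reynolds number: Re = N * D^2 / nu.
# 1 m^3 / 1 L = 1000x in volume => ~10x in linear scale D.
scale = (1000.0 / 1.0) ** (1.0 / 3.0)   # linear scale factor, exactly 10
N1 = 300.0                               # rpm at the 1 L scale (from the abstract)
N2 = N1 / scale ** 2                     # rpm at the 1 m^3 scale
print(round(N2, 2))
```

The idealized prediction is 3 rpm, in good agreement with the 3.25 rpm used at the 1 m³ scale.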
2013-01-01
Background A large-scale, highly accurate, machine-understandable drug-disease treatment relationship knowledge base is important for computational approaches to drug repurposing. The large body of published biomedical research articles and clinical case reports available on MEDLINE is a rich source of FDA-approved drug-disease indications as well as drug-repurposing knowledge that is crucial for applying FDA-approved drugs to new diseases. However, much of this information is buried in free text and not captured in any existing databases. The goal of this study is to extract a large number of accurate drug-disease treatment pairs from the published literature. Results In this study, we developed a simple but highly accurate pattern-learning approach to extract treatment-specific drug-disease pairs from 20 million biomedical abstracts available on MEDLINE. We extracted a total of 34,305 unique drug-disease treatment pairs, the majority of which are not included in existing structured databases. Our algorithm achieved a precision of 0.904 and a recall of 0.131 in extracting all pairs, and a precision of 0.904 and a recall of 0.842 in extracting frequent pairs. In addition, we have shown that the extracted pairs strongly correlate with both drug target genes and therapeutic classes, and therefore may have high potential in drug discovery. Conclusions We demonstrated that our simple pattern-learning relationship extraction algorithm is able to accurately extract many drug-disease pairs from the free text of the biomedical literature that are not captured in structured databases. The large-scale, accurate, machine-understandable drug-disease treatment knowledge base resulting from our study, in combination with pairs from structured databases, will have high potential in computational drug-repurposing tasks. PMID:23742147
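The core of a pattern-based extractor is a set of lexical patterns applied to free text. The single hand-written pattern below is only illustrative; the study learns many patterns, and the sentences are invented examples, not MEDLINE text.

```python
import re

# One lexical pattern of the form "DRUG is/was used in the treatment of DISEASE".
pattern = re.compile(
    r"(?P<drug>[A-Z][a-z]+) (?:is|was) used in the treatment of "
    r"(?P<disease>[a-z0-9 ]+)"
)

text = ("Metformin is used in the treatment of type 2 diabetes. "
        "Imatinib was used in the treatment of chronic myeloid leukemia.")

pairs = [(m.group("drug"), m.group("disease").strip())
         for m in pattern.finditer(text)]
print(pairs)
```

Precision comes from the specificity of such treatment patterns; recall grows as more pattern variants are learned from the corpus.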
Line segment extraction for large scale unorganized point clouds
NASA Astrophysics Data System (ADS)
Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan
2015-04-01
Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.
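The geometric core of plane-intersection line extraction is standard: two non-parallel planes n·x = d meet in a line whose direction is n₁ × n₂, and a point on the line satisfies both plane equations. A minimal sketch (not the paper's LSHP fitting, which additionally constrains the segment with its 3D line-support region):

```python
import numpy as np

# Two planes in Hessian form n . x = d.
n1, d1 = np.array([1.0, 0.0, 0.0]), 2.0   # plane x = 2
n2, d2 = np.array([0.0, 1.0, 0.0]), 3.0   # plane y = 3

# Direction of the intersection line.
direction = np.cross(n1, n2)

# A point on the line: solve n1.p = d1, n2.p = d2, direction.p = 0.
A = np.vstack([n1, n2, direction])
p0 = np.linalg.solve(A, np.array([d1, d2, 0.0]))
print(direction, p0)
```

In the full pipeline, the planes themselves are fitted to point-cloud segments, and the infinite line is clipped to a segment by the extent of the supporting points.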
NASA Astrophysics Data System (ADS)
de Jong, Maarten F.; Baptist, Martin J.; van Hal, Ralf; de Boois, Ingeborg J.; Lindeboom, Han J.; Hoekstra, Piet
2014-06-01
For the seaward harbour extension of the Port of Rotterdam in the Netherlands, approximately 220 million m³ of sand was extracted between 2009 and 2013. In order to decrease the surface area of direct impact, the authorities permitted deep sand extraction, down to 20 m below the seabed. The biological and physical impacts of large-scale and deep sand extraction are still being investigated and remain largely unknown. For this reason, we investigated the colonization of demersal fish in a deep sand extraction site. Two sandbars were artificially created by selective dredging, copying naturally occurring meso-scale bedforms, to increase habitat heterogeneity and post-dredging benthic and demersal fish species richness and biomass. Significant differences in demersal fish species assemblages in the sand extraction site were associated with variables such as water depth, median grain size, fraction of very fine sand, biomass of white furrow shell (Abra alba) and time after the cessation of sand extraction. Large quantities of undigested crushed white furrow shell fragments were found in all stomachs and intestines of plaice (Pleuronectes platessa), indicating that it is an important prey item. One and two years after cessation, a significant 20-fold increase in demersal fish biomass was observed in deep parts of the extraction site. In the troughs of a landscaped sandbar, however, a significant drop in biomass down to reference levels and a significant change in species assemblage were observed two years after cessation. The fish assemblage at the crests of the sandbars differed significantly from that in the troughs, with tub gurnard (Chelidonichthys lucerna) being a Dufrêne-Legendre indicator species of the crests. This is a first indication of the applicability of landscaping techniques to induce heterogeneity of the seabed, although it remains difficult to draw a strong conclusion due to the lack of replication in the experiment.
A new ecological equilibrium has not been reached after 2 years, since biotic and abiotic variables are still adapting. To understand the final impact of deep and large-scale sand extraction on demersal fish, we recommend monitoring for a longer period, of at least six years.
2012-01-01
Isolation of polyhydroxyalkanoates (PHAs) from bacterial cell matter is a critical step in achieving profitable production of the polymer. Therefore, an extraction method must lead to a high recovery of a pure product at low cost. This study presents a simplified method for large-scale poly(3-hydroxybutyrate), poly(3HB), extraction using sodium hypochlorite. Poly(3HB) was extracted from cells of Ralstonia eutropha H16 at almost 96% purity. Across different extraction volumes, a maximum recovery rate of 91.32% was obtained. At the largest extraction volume of 50 L, poly(3HB) with an average purity of 93.32% ± 4.62% was extracted with a maximum recovery of 87.03% of the initial poly(3HB) content. This process is easy to handle and requires less effort than previously described processes. PMID:23164136
Spatial confinement of active microtubule networks induces large-scale rotational cytoplasmic flow
Suzuki, Kazuya; Miyazaki, Makito; Takagi, Jun; Itabashi, Takeshi; Ishiwata, Shin’ichi
2017-01-01
Collective behaviors of motile units through hydrodynamic interactions induce directed fluid flow on a larger length scale than individual units. In cells, active cytoskeletal systems composed of polar filaments and molecular motors drive fluid flow, a process known as cytoplasmic streaming. The motor-driven elongation of microtubule bundles generates turbulent-like flow in purified systems; however, it remains unclear whether and how microtubule bundles induce large-scale directed flow like the cytoplasmic streaming observed in cells. Here, we adopted Xenopus egg extracts as a model system of the cytoplasm and found that microtubule bundle elongation induces directed flow for which the length scale and timescale depend on the existence of geometrical constraints. At the lower activity of dynein, kinesins bundle and slide microtubules, organizing extensile microtubule bundles. In bulk extracts, the extensile bundles connected with each other and formed a random network, and vortex flows with a length scale comparable to the bundle length continually emerged and persisted for 1 min at multiple places. When the extracts were encapsulated in droplets, the extensile bundles pushed the droplet boundary. This pushing force initiated symmetry breaking of the randomly oriented bundle network, leading to bundles aligning into a rotating vortex structure. This vortex induced rotational cytoplasmic flows on the length scale and timescale that were 10- to 100-fold longer than the vortex flows emerging in bulk extracts. Our results suggest that microtubule systems use not only hydrodynamic interactions but also mechanical interactions to induce large-scale temporally stable cytoplasmic flow. PMID:28265076
Joint classification and contour extraction of large 3D point clouds
NASA Astrophysics Data System (ADS)
Hackel, Timo; Wegner, Jan D.; Schindler, Konrad
2017-08-01
We present an effective and efficient method for point-wise semantic classification and extraction of object contours from large-scale 3D point clouds. What makes point cloud interpretation challenging is the sheer size of several million points per scan and the non-grid, sparse, and uneven distribution of points. Standard image processing tools like texture filters, for example, cannot handle such data efficiently, which calls for dedicated point cloud labeling methods. It turns out that one of the major drivers for efficient computation and handling of strong variations in point density is a careful formulation of per-point neighborhoods at multiple scales. This allows both to define an expressive feature set and to extract topologically meaningful object contours. Semantic classification and contour extraction are interlaced problems: point-wise semantic classification enables extracting a meaningful candidate set of contour points, while contours help generate a rich feature representation that benefits point-wise classification. These methods are tailored to have fast run time and a small memory footprint for processing large-scale, unstructured, and inhomogeneous point clouds, while still achieving high classification accuracy. We evaluate our methods on the semantic3d.net benchmark for terrestrial laser scans with more than 10⁹ points.
DEXTER: Disease-Expression Relation Extraction from Text.
Gupta, Samir; Dingerdissen, Hayley; Ross, Karen E; Hu, Yu; Wu, Cathy H; Mazumder, Raja; Vijay-Shanker, K
2018-01-01
Gene expression levels affect biological processes and play a key role in many diseases. Characterizing expression profiles is useful for clinical research and for the diagnostics and prognostics of diseases. There are currently several high-quality databases that capture gene expression information, obtained mostly from large-scale studies, such as microarray and next-generation sequencing technologies, in the context of disease. The scientific literature is another rich source of information on gene expression-disease relationships, captured not only from large-scale studies but also from thousands of small-scale studies. Expression information obtained from the literature through manual curation can extend expression databases. While many of the existing databases include information from the literature, they are limited by the time-consuming nature of manual curation and have difficulty keeping up with the explosion of publications in the biomedical field. In this work, we describe an automated text-mining tool, Disease-Expression Relation Extraction from Text (DEXTER), to extract information from the literature on gene and microRNA expression in the context of disease. One of the motivations in developing DEXTER was to extend the BioXpress database, a cancer-focused gene expression database that includes data derived from large-scale experiments and manual curation of publications. The literature-based portion of BioXpress lags significantly behind the expression information obtained from large-scale studies and can benefit from our text-mined results. We conducted two different evaluations to measure the accuracy of our text-mining tool and achieved average F-scores of 88.51% and 81.81%, respectively.
Also, to demonstrate the ability to extract rich expression information in different disease-related scenarios, we used DEXTER to extract differential expression information for 2024 genes in lung cancer, 115 glycosyltransferases in 62 cancers, and 826 microRNAs in 171 cancers. All extractions using DEXTER are integrated into the literature-based portion of BioXpress. Database URL: http://biotm.cis.udel.edu/DEXTER.
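The F-score used to evaluate tools like DEXTER is the harmonic mean of precision and recall. The numbers below are generic examples, not the tool's reported components:

```python
# F-score: harmonic mean of precision and recall.
def f_score(precision, recall):
    return 2 * precision * recall / (precision + recall)

# Example values (illustrative only).
print(round(f_score(0.90, 0.84), 3))
```

The harmonic mean penalizes imbalance: a tool with high precision but poor recall (or vice versa) scores much lower than the arithmetic mean of the two would suggest.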
NASA Astrophysics Data System (ADS)
van der Molen, Johan
2015-04-01
Tidal power generation through submerged turbine-type devices is in an advanced stage of testing, and large-scale applications are being planned in areas with high tidal current speeds. The potential impact of such large-scale applications on the hydrography can be investigated using hydrodynamical models. In addition, aspects of the potential impact on the marine ecosystem can be studied using biogeochemical models. In this study, the coupled hydrodynamics-biogeochemistry model GETM-ERSEM is used in a shelf-wide application to investigate the potential impact of large-scale tidal power generation in the Pentland Firth. A scenario representing the currently licensed power extraction suggested i) an average reduction in M2 tidal current velocities of several cm/s within the Pentland Firth, ii) changes in the residual circulation of several mm/s in the vicinity of the Pentland Firth, iii) an increase in M2 tidal amplitude of up to 1 cm to the west of the Pentland Firth, and iv) a reduction of several mm in M2 tidal amplitude along the east coast of the UK. A second scenario representing 10 times the currently licensed power extraction resulted in changes that were approximately 10 times as large. Simulations including the biogeochemistry model for these scenarios are currently in preparation, and first results, focusing on impacts on primary production and benthic production, will be presented at the conference.
Sun, Mingmei; Xu, Xiao; Zhang, Qiuqin; Rui, Xin; Wu, Junjun; Dong, Mingsheng
2018-02-01
Ultrasound-assisted aqueous extraction (UAAE) was used to extract oil from Clanis bilineata (CB), a traditional edible insect that can be reared on a large scale in China, and the physicochemical properties and antioxidant capacity of the UAAE-derived oil (UAAEO) were investigated for the first time. The UAAE conditions for CB oil were optimized using response surface methodology (RSM), and the highest oil yield (19.47%) was obtained under the optimal conditions of 400 W ultrasonic power, 40 °C extraction temperature, 50 min extraction time, and 2 s ultrasonic interval time. Compared with Soxhlet extraction-derived oil (SEO), UAAEO had lower acid (AV), peroxide (PV) and p-anisidine values (PAV), as well as higher polyunsaturated fatty acid contents and thermal stability. Furthermore, UAAEO showed stronger antioxidant activities than SEO, according to DPPH radical scavenging and β-carotene bleaching tests. Therefore, UAAE is a promising process for the large-scale production of CB oil, and CB has potential for development as a functional oil resource.
Development of Solvent Extraction Approach to Recycle Enriched Molybdenum Material
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tkac, Peter; Brown, M. Alex; Sen, Sujat
2016-06-01
Argonne National Laboratory, in cooperation with Oak Ridge National Laboratory and NorthStar Medical Technologies, LLC, is developing a recycling process for a solution containing valuable Mo-100 or Mo-98 enriched material. Previously, Argonne had developed a recycle process using a precipitation technique. However, this process is labor intensive and can lead to the production of large volumes of highly corrosive waste. This report discusses an alternative process to recover enriched Mo in the form of ammonium heptamolybdate by using solvent extraction. Small-scale experiments determined the optimal conditions for effective extraction of high Mo concentrations. Methods were developed for removal of ammonium chloride from the molybdenum product of the solvent extraction process. In large-scale experiments, very good purification from potassium and other elements was observed, with very high recovery yields (~98%).
An Eulerian time filtering technique to study large-scale transient flow phenomena
NASA Astrophysics Data System (ADS)
Vanierschot, Maarten; Persoons, Tim; van den Bulck, Eric
2009-10-01
Unsteady fluctuating velocity fields can contain large-scale periodic motions with frequencies well separated from those of turbulence. Examples are the wake behind a cylinder or the precessing vortex core in a swirling jet. These turbulent flow fields contain large-scale, low-frequency oscillations which are obscured by turbulence, making them hard to identify directly. In this paper, we present an Eulerian time filtering (ETF) technique to extract the large-scale motions from unsteady, statistically non-stationary velocity fields, or from flow fields with multiple phenomena that have sufficiently separated spectral content. The ETF method is based on non-causal time filtering of the velocity records at each point of the flow field. It is shown that the ETF technique gives good results, similar to those obtained by the phase-averaging method. In this paper, not only the influence of the temporal filter is checked, but also parameters such as the cut-off frequency and sampling frequency of the data are investigated. The technique is validated on a selected set of time-resolved stereoscopic particle image velocimetry measurements, such as the initial region of an annular jet and the transition between flow patterns in an annular jet. The major advantage of the ETF method in the extraction of large scales is that it is computationally less expensive and requires less measurement time compared to other extraction methods. Therefore, the technique is suitable in the start-up phase of an experiment or in a measurement campaign where several experiments are needed, such as parametric studies.
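The per-point, non-causal time filtering at the heart of the ETF method can be sketched on a single synthetic velocity record: a slow large-scale oscillation buried in broadband "turbulence" noise, recovered with a zero-phase low-pass filter. The signal, frequencies, and filter order here are all invented for illustration, not taken from the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Synthetic velocity record at one Eulerian point: a 1 Hz large-scale
# oscillation plus white noise standing in for turbulence.
fs = 1000.0                                  # sampling frequency, Hz (assumed)
t = np.arange(0.0, 2.0, 1.0 / fs)
rng = np.random.default_rng(2)
u = np.sin(2 * np.pi * 1.0 * t) + 0.5 * rng.normal(size=t.size)

# Non-causal (zero-phase) low-pass filtering: filtfilt runs the filter
# forward and backward, which is why the method needs the full record.
b, a = butter(4, 10.0 / (fs / 2))            # 4th-order Butterworth, 10 Hz cut-off
u_large_scale = filtfilt(b, a, u)

# The filtered record should track the underlying large-scale motion.
err = np.sqrt(np.mean((u_large_scale - np.sin(2 * np.pi * t)) ** 2))
print(round(err, 3))
```

Zero-phase filtering avoids the time shift a causal filter would introduce, which is essential when the filtered fields are compared snapshot-by-snapshot against the original ones.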
Extraction of drainage networks from large terrain datasets using high throughput computing
NASA Astrophysics Data System (ADS)
Gong, Jianya; Xie, Jibo
2009-02-01
Advanced digital photogrammetry and remote sensing technology produces large terrain datasets (LTD). How to process and use these LTD has become a big challenge for GIS users. Extracting drainage networks, which are basic for hydrological applications, from LTD is one of the typical applications of digital terrain analysis (DTA) in geographical information applications. Existing serial drainage algorithms cannot deal with large data volumes in a timely fashion, and few GIS platforms can process LTD beyond the GB size. High throughput computing (HTC), a distributed parallel computing mode, is proposed to improve the efficiency of drainage networks extraction from LTD. Drainage network extraction using HTC involves two key issues: (1) how to decompose the large DEM datasets into independent computing units and (2) how to merge the separate outputs into a final result. A new decomposition method is presented in which the large datasets are partitioned into independent computing units using natural watershed boundaries instead of using regular 1-dimensional (strip-wise) and 2-dimensional (block-wise) decomposition. Because the distribution of drainage networks is strongly related to watershed boundaries, the new decomposition method is more effective and natural. The method to extract natural watershed boundaries was improved by using multi-scale DEMs instead of single-scale DEMs. A HTC environment is employed to test the proposed methods with real datasets.
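The elementary operation behind drainage network extraction, run cell-by-cell inside each computing unit, is the D8 flow-direction step: every DEM cell drains toward its steepest-descent neighbor. A minimal sketch on a toy 3×3 grid (not the paper's HTC implementation):

```python
import numpy as np

# Tiny digital elevation model (DEM); values are elevations.
dem = np.array([[5.0, 4.0, 3.0],
                [6.0, 5.0, 2.0],
                [7.0, 6.0, 1.0]])

def d8_direction(dem, r, c):
    """Return the (row, col) of the steepest-descent neighbor, or None."""
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < dem.shape[0] and 0 <= cc < dem.shape[1]:
                dist = (dr * dr + dc * dc) ** 0.5      # diagonal neighbors are farther
                drop = (dem[r, c] - dem[rr, cc]) / dist
                if drop > best_drop:
                    best, best_drop = (rr, cc), drop
    return best

print(d8_direction(dem, 1, 1))
```

Following these directions downstream from every cell yields flow paths; accumulating them gives the drainage network. Decomposing along watershed boundaries works precisely because no flow path crosses a watershed divide, so each unit can be processed independently.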
NASA Astrophysics Data System (ADS)
Ray, R. K.; Syed, T. H.; Saha, Dipankar; Sarkar, B. C.; Patre, A. K.
2017-12-01
Extracted groundwater, 90% of which is used for irrigated agriculture, is central to the socio-economic development of India. A lack of regulation, or of implementation of regulations, alongside unrecorded extraction, often leads to overexploitation of large-scale common-pool resources like groundwater. Inevitably, management of groundwater extraction (draft) for irrigation is critical for the sustainability of aquifers and of society at large. However, existing assessments of groundwater draft, which are mostly available at large spatial scales, are inadequate for managing groundwater resources that are primarily exploited by stakeholders at much finer scales. This study presents an estimate, projection and analysis of fine-scale groundwater draft in the Seonath-Kharun interfluve of central India. Using field surveys of instantaneous discharge from irrigation wells and boreholes, the annual groundwater draft for irrigation in this area is estimated to be 212 × 10⁶ m³, most of which (89%) is withdrawn during the non-monsoon season. However, the density of wells/boreholes, and the consequent extraction of groundwater, is controlled by the existing hydrogeological conditions. Based on trends in the number of abstraction structures (1982-2011), the groundwater draft for the year 2020 is projected to be approximately 307 × 10⁶ m³; hence, groundwater draft for irrigation in the study area is predicted to increase by ~44% within a span of 8 years. Central to the work presented here is the approach for estimation and prediction of groundwater draft at finer scales, which can be extended to critical groundwater zones of the country.
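The projected increase quoted above is simple arithmetic on the two draft figures (212 and 307, in units of 10⁶ m³); checking it:

```python
# Groundwater draft for irrigation, in units of 10^6 m^3 (from the abstract).
draft_current, draft_2020 = 212.0, 307.0

# Relative increase over the ~8-year projection horizon.
growth = (draft_2020 - draft_current) / draft_current
print(round(100 * growth, 1))
```

The exact ratio is about 44.8%, which the abstract rounds to ~44%.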
Nonlinear modulation of the HI power spectrum on ultra-large scales. I
DOE Office of Scientific and Technical Information (OSTI.GOV)
Umeh, Obinna; Maartens, Roy; Santos, Mario, E-mail: umeobinna@gmail.com, E-mail: roy.maartens@gmail.com, E-mail: mgrsantos@uwc.ac.za
2016-03-01
Intensity mapping of the neutral hydrogen brightness temperature promises to provide a three-dimensional view of the universe on very large scales. Nonlinear effects are typically thought to alter only the small-scale power, but we show how they may bias the extraction of cosmological information contained in the power spectrum on ultra-large scales. For linear perturbations to remain valid on large scales, we need to renormalize perturbations at higher order. In the case of intensity mapping, the second-order contribution to clustering from weak lensing dominates the nonlinear contribution at high redshift. Renormalization modifies the mean brightness temperature and therefore the evolution bias. It also introduces a term that mimics white noise. These effects may influence forecasting analysis on ultra-large scales.
Marrone, Babetta L.; Lacey, Ronald E.; Anderson, Daniel B.; ...
2017-08-07
Energy-efficient and scalable harvesting and lipid extraction processes must be developed in order for the algal biofuels and bioproducts industry to thrive. The major challenge for harvesting is the handling of large volumes of cultivation water to concentrate low amounts of biomass. For lipid extraction, the major energy and cost drivers are associated with disrupting the algae cell wall and drying the biomass before solvent extraction of the lipids. Here we review the research and development conducted by the Harvesting and Extraction Team during the 3-year National Alliance for Advanced Biofuels and Bioproducts (NAABB) algal consortium project. The Harvesting and Extraction Team investigated five harvesting and three wet extraction technologies at lab bench scale for effectiveness, and conducted a technoeconomic study to evaluate their costs and energy efficiency compared to available baseline technologies. Based on this study, three harvesting technologies were selected for further study at larger scale. We evaluated the selected harvesting technologies (electrocoagulation, membrane filtration, and ultrasonic harvesting) in a field study at a minimum scale of 100 L/h. None of the extraction technologies were determined to be ready for scale-up; therefore, an emerging extraction technology (wet solvent extraction) was selected from industry to provide scale-up data and capabilities to produce lipid and lipid-extracted materials for the NAABB program. One specialized extraction/adsorption technology was developed that showed promise for recovering high-value co-products from lipid extracts. Overall, the NAABB Harvesting and Extraction Team improved the readiness level of several innovative, energy-efficient technologies that integrate with algae production processes, and captured valuable lessons learned about scale-up challenges.
An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys
ERIC Educational Resources Information Center
Wetzel, Eunike; Xu, Xueli; von Davier, Matthias
2015-01-01
In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…
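The conventional operational step the abstract refers to can be sketched as follows with scikit-learn; the data, dimensions, and number of components are illustrative stand-ins, not values from any actual survey.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
background = rng.random((500, 40))   # 500 examinees, 40 background variables

# Conventional operational method: reduce the background questionnaire
# data to its leading principal components, which then serve as the
# covariates in the latent regression model.
pca = PCA(n_components=10)
covariates = pca.fit_transform(background)
```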
Extracting Communities from Complex Networks by the k-Dense Method
NASA Astrophysics Data System (ADS)
Saito, Kazumi; Yamada, Takeshi; Kazama, Kazuhiro
To understand the structural and functional properties of large-scale complex networks, it is crucial to efficiently extract a set of cohesive subnetworks as communities. Several such community extraction methods have been proposed in the literature, including the classical k-core decomposition method and, more recently, the k-clique-based community extraction method. The k-core method, although computationally efficient, is often not powerful enough to uncover a detailed community structure, producing only coarse-grained and loosely connected communities. The k-clique method, on the other hand, can extract fine-grained and tightly connected communities but requires a substantial computational load for large-scale complex networks. In this paper, we present a new notion of a subnetwork called k-dense, and propose an efficient algorithm for extracting k-dense communities. We applied our method to three different types of networks assembled from real data, namely, blog trackbacks, word associations, and Wikipedia references, and demonstrated that the k-dense method extracts communities almost as efficiently as the k-core method, while the quality of the extracted communities is comparable to that obtained by the k-clique method.
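A minimal sketch of the k-dense idea in Python with networkx: repeatedly prune edges whose endpoints share fewer than k − 2 neighbours, so every surviving edge lies in at least k − 2 triangles. This naive edge-pruning loop illustrates the definition only; the paper's own algorithm is more efficient.

```python
import networkx as nx

def k_dense(G, k):
    """Return the k-dense subgraph: every remaining edge's endpoints
    share at least k - 2 common neighbours inside the subgraph."""
    H = G.copy()
    changed = True
    while changed:
        changed = False
        for u, v in list(H.edges()):
            if len(list(nx.common_neighbors(H, u, v))) < k - 2:
                H.remove_edge(u, v)
                changed = True
    H.remove_nodes_from([n for n in H if H.degree(n) == 0])
    return H

G = nx.karate_club_graph()
core = nx.k_core(G, 3)    # classical k-core baseline
dense = k_dense(G, 3)     # 3-dense: every edge must sit in a triangle
```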
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Taiping; Khangaonkar, Tarang; Long, Wen
2014-02-07
In recent years, with the rapid growth of global energy demand, interest in extracting uranium from seawater for nuclear energy has been renewed. While extracting seawater uranium is not yet commercially viable, it serves as a "backstop" to conventional uranium resources and provides an essentially unlimited supply of uranium. With recent advances in seawater uranium extraction technology, extracting uranium from seawater could be economically feasible when the extraction devices are deployed at a large scale (e.g., several hundred km²). There is concern, however, that the large-scale deployment of adsorbent farms could result in potential impacts to the hydrodynamic flow field in an oceanic setting. In this study, a kelp-type structure module was incorporated into a coastal ocean model to simulate the blockage effect of uranium extraction devices on the flow field. The module was quantitatively validated against laboratory flume experiments for both velocity and turbulence profiles. The model-data comparison showed an overall good agreement and validated the approach of applying the model to assess the potential hydrodynamic impact of uranium extraction devices or other underwater structures in coastal oceans.
Mathieson, William; Guljar, Nafia; Sanchez, Ignacio; Sroya, Manveer; Thomas, Gerry A
2018-05-03
DNA extracted from formalin-fixed, paraffin-embedded (FFPE) tissue blocks is amenable to analytical techniques, including sequencing. DNA extraction protocols are typically long and complex, often involving an overnight proteinase K digest. Automated platforms that shorten and simplify the process are therefore an attractive proposition for users wanting a faster turn-around or to process large numbers of biospecimens. It is, however, unclear whether automated extraction systems return poorer DNA yields or quality than manual extractions performed by experienced technicians. We extracted DNA from 42 FFPE clinical tissue biospecimens using the QiaCube (Qiagen) and ExScale (ExScale Biospecimen Solutions) automated platforms, comparing DNA yields and integrities with those from manual extractions. The QIAamp DNA FFPE Spin Column Kit was used for manual and QiaCube DNA extractions and the ExScale extractions were performed using two of the manufacturer's magnetic bead kits: one extracting DNA only and the other simultaneously extracting DNA and RNA. In all automated extraction methods, DNA yields and integrities (assayed using DNA Integrity Numbers from a 4200 TapeStation and the qPCR-based Illumina FFPE QC Assay) were poorer than in the manual method, with the QiaCube system performing better than the ExScale system. However, ExScale was fastest, offered the highest reproducibility when extracting DNA only, and required the least intervention or technician experience. Thus, the extraction methods have different strengths and weaknesses, would appeal to different users with different requirements, and therefore, we cannot recommend one method over another.
Energy extraction from a large-scale microbial fuel cell system treating municipal wastewater
NASA Astrophysics Data System (ADS)
Ge, Zheng; Wu, Liao; Zhang, Fei; He, Zhen
2015-11-01
Development of microbial fuel cell (MFC) technology must address the challenges associated with energy extraction from large-scale MFC systems consisting of multiple modules. Herein, energy extraction is investigated with a 200-L MFC system (effective volume of 100 L for this study) treating actual municipal wastewater. A commercially available energy harvesting device (BQ 25504) is used successfully to convert 0.8-2.4 V from the MFCs to 5 V for charging ultracapacitors and running a DC motor. Four different types of serial connection containing different numbers of MFC modules are examined for energy extraction and conversion efficiency. The connection containing three rows of the MFCs has exhibited the best performance with the highest power output of ∼114 mW and the conversion efficiency of ∼80%. The weak performance of one-row MFCs negatively affects the overall performance of the connected MFCs in terms of both energy production and conversion. Those results indicate that an MFC system with balanced performance among individual modules will be critical to energy extraction. Future work will focus on application of the extracted energy to support MFC operation.
NASA Astrophysics Data System (ADS)
van der Molen, Johan; Ruardij, Piet; Greenwood, Naomi
2016-04-01
Final results are presented of a model study to assess the potential wider-area effects of large-scale tidal energy extraction in the Pentland Firth on biogeochemistry. The coupled hydrodynamics-biogeochemistry model GETM-ERSEM-BFM was used in a shelf-wide application with a parameterisation of the effects of power extraction by tidal turbines on fluid momentum. Three scenario runs were carried out: a reference run without turbines, an 800 MW extraction run corresponding to current licenses, and an academic 8 GW extraction run. The changes simulated with the 800 MW extraction were negligible. The academic 8 GW extraction resulted in measurable reductions (several cm) in tidal elevations along the east coast of the UK, and associated reductions in bed-shear stresses. These resulted in reduced SPM concentrations, increased primary production, and increased biomass of zooplankton and benthic fauna. The effects were most pronounced in the shallow seas surrounding The Wash, with changes of up to 10%. These results indicate that, should tidal power generation substantially beyond the currently licensed amount be planned, either concentrated in one location or spread over multiple locations along the coast, further investigations are advisable.
A rapid extraction of landslide disaster information research based on GF-1 image
NASA Astrophysics Data System (ADS)
Wang, Sai; Xu, Suning; Peng, Ling; Wang, Zhiyi; Wang, Na
2015-08-01
In recent years, landslide disasters have occurred frequently because of seismic activity, bringing great harm to people's lives and attracting close attention from the state and wide concern from society. In the field of geological disasters, landslide information extraction based on remote sensing has been controversial, but high-resolution remote sensing imagery can effectively improve the accuracy of information extraction with its rich texture and geometry information. It is therefore feasible to extract information on earthquake-triggered landslides, which cause serious surface damage at large scale. Taking Wenchuan County as the study area, this paper uses a multi-scale segmentation method to extract landslide image objects from domestic GF-1 images and DEM data, using the estimation of scale parameter tool to determine the optimal segmentation scale. After comprehensively analyzing the characteristics of landslides in high-resolution imagery and selecting spectral, textural, geometric, and landform features of the image, extraction rules are established to extract landslide disaster information. The extraction results show 20 landslides with a total area of 521279.31. Compared with visual interpretation results, the extraction accuracy is 72.22%. This study indicates that it is efficient and feasible to extract earthquake landslide disaster information based on high-resolution remote sensing, providing important technical support for post-disaster emergency investigation and disaster assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wardle, Kent E.; Frey, Kurt; Pereira, Candido
2014-02-02
This task is aimed at predictive modeling of solvent extraction processes in typical extraction equipment through multiple simulation methods at various scales of resolution. We have conducted detailed continuum fluid dynamics simulations at the process unit level as well as simulations of the molecular-level physical interactions which govern extraction chemistry. Through combination of information gained through simulations at each of these two tiers, along with advanced techniques such as the Lattice Boltzmann Method (LBM) which can bridge these two scales, we can develop the tools to work towards predictive simulation for solvent extraction on the equipment scale (Figure 1). The goal of such a tool, along with enabling optimized design and operation of extraction units, would be to allow prediction of stage extraction efficiency under specified conditions. Simulation efforts on each of the two scales will be described below. As the initial application of FELBM in the work performed during FY10 has been on annular mixing, it will be discussed in the context of the continuum scale. In the future, however, it is anticipated that the real value of FELBM will be in its use as a tool for sub-grid model development through highly refined DNS-like multiphase simulations, facilitating exploration and development of droplet models, including breakup and coalescence, which will be needed for the large-scale simulations where droplet-level physics cannot be resolved. In this area, it can have a significant advantage over traditional CFD methods, as its high computational efficiency allows exploration of significantly greater physical detail, especially as computational resources increase in the future.
Increasing Scalability of Researcher Network Extraction from the Web
NASA Astrophysics Data System (ADS)
Asada, Yohei; Matsuo, Yutaka; Ishizuka, Mitsuru
Social networks, which describe relations among people or organizations as a network, have recently attracted attention. With the help of a social network, we can analyze the structure of a community and thereby promote efficient communications within it. We investigate the problem of extracting a network of researchers from the Web, to assist efficient cooperation among researchers. Our method uses a search engine to get the co-occurrences of the names of two researchers and calculates the strength of the relation between them. Then we label the relation by analyzing the Web pages in which these two names co-occur. Research on social network extraction using search engines, such as ours, is attracting attention in Japan as well as abroad. However, former approaches issue too many queries to search engines to extract a large-scale network. In this paper, we propose a method that filters superfluous queries and facilitates the extraction of large-scale networks. With this method we are able to extract a network of around 3000 nodes. Our experimental results show that the proposed method reduces the number of queries significantly while preserving the quality of the network as compared to former methods.
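The hit-count arithmetic behind such methods can be sketched as below; the Jaccard-style coefficient and the query-skipping bound are common illustrative choices, not necessarily the exact formulas of the paper.

```python
def relation_strength(hits_a, hits_b, hits_ab):
    """Co-occurrence strength of two names from search-engine hit
    counts, scored with the Jaccard coefficient |A∩B| / |A∪B|."""
    union = hits_a + hits_b - hits_ab
    return hits_ab / union if union > 0 else 0.0

def worth_querying(hits_a, hits_b, threshold):
    """Query filter: the pairwise co-occurrence query can be skipped
    when even the best case (hits_ab = min(hits_a, hits_b), giving a
    Jaccard upper bound of min/max) stays below the threshold."""
    big = max(hits_a, hits_b)
    upper_bound = min(hits_a, hits_b) / big if big > 0 else 0.0
    return upper_bound >= threshold

strength = relation_strength(1200, 400, 150)  # two researchers' hit counts
skip = not worth_querying(1200, 40, 0.2)      # bound 40/1200 < 0.2: skip query
```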
Atkinson, Jonathan A; Lobet, Guillaume; Noll, Manuel; Meyer, Patrick E; Griffiths, Marcus; Wells, Darren M
2017-10-01
Genetic analyses of plant root systems require large datasets of extracted architectural traits. To quantify such traits from images of root systems, researchers often have to choose between automated tools (that are prone to error and extract only a limited number of architectural traits) or semi-automated ones (that are highly time consuming). We trained a Random Forest algorithm to infer architectural traits from automatically extracted image descriptors. The training was performed on a subset of the dataset, then applied to its entirety. This strategy allowed us to (i) decrease the image analysis time by 73% and (ii) extract meaningful architectural traits based on image descriptors. We also show that these traits are sufficient to identify the quantitative trait loci that had previously been discovered using a semi-automated method. We have shown that combining semi-automated image analysis with machine learning algorithms has the power to increase the throughput of large-scale root studies. We expect that such an approach will enable the quantification of more complex root systems for genetic studies. We also believe that our approach could be extended to other areas of plant phenotyping. © The Authors 2017. Published by Oxford University Press.
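The train-on-a-subset strategy described above can be sketched with scikit-learn; the descriptors, trait, and split below are synthetic stand-ins, not the paper's data or its exact pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
descriptors = rng.random((300, 16))   # automatically extracted image descriptors
trait = 3 * descriptors[:, 0] + descriptors[:, 1] ** 2  # stand-in architectural trait

# Train on a manually annotated subset, then infer the trait for the
# remainder of the dataset, avoiding full semi-automated analysis.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(descriptors[:75], trait[:75])
predicted = model.predict(descriptors[75:])
```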
Evaluation of nucleus segmentation in digital pathology images through large scale image synthesis
NASA Astrophysics Data System (ADS)
Zhou, Naiyun; Yu, Xiaxia; Zhao, Tianhao; Wen, Si; Wang, Fusheng; Zhu, Wei; Kurc, Tahsin; Tannenbaum, Allen; Saltz, Joel; Gao, Yi
2017-03-01
Digital histopathology images with more than 1 gigapixel are drawing more and more attention in the clinical, biomedical research, and computer vision fields. Among the multiple observable features spanning multiple scales in pathology images, nuclear morphology is one of the central criteria for diagnosis and grading. As a result, it is also the most studied target in image computing. A large number of research papers have been devoted to the problem of extracting nuclei from digital pathology images, which is the foundation of any further correlation study. However, the validation and evaluation of nucleus extraction have not yet been formulated rigorously and systematically. Some studies report human-verified segmentations with thousands of nuclei, whereas a single whole-slide image may contain up to a million. The main obstacle lies in the difficulty of obtaining such a large number of validated nuclei, which is essentially an impossible task for a pathologist. We propose a systematic validation and evaluation approach based on large-scale image synthesis. This could facilitate a more quantitatively validated study for the current and future histopathology image analysis field.
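Against synthetic ground truth, segmentation quality is commonly scored with overlap metrics such as the Dice coefficient; a minimal NumPy sketch follows (the masks here are toy squares, and the paper may use additional metrics).

```python
import numpy as np

def dice(pred, truth):
    """Dice coefficient 2|P∩T| / (|P|+|T|) for binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    total = pred.sum() + truth.sum()
    return 2.0 * inter / total if total else 1.0

truth = np.zeros((64, 64), bool); truth[20:40, 20:40] = True  # synthetic nucleus
pred = np.zeros((64, 64), bool); pred[22:42, 20:40] = True    # shifted segmentation
score = dice(pred, truth)  # overlap of two offset 20x20 squares -> 0.9
```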
The role of large scale motions on passive scalar transport
NASA Astrophysics Data System (ADS)
Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano
2014-11-01
We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large-scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large-scale features of the turbulent channel flow is obtained using two-point correlations of the velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of the large-scale features of turbulence and the temperature field.
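The POD step can be sketched via the SVD of a mean-subtracted snapshot matrix (the standard snapshot method); the synthetic field below stands in for the DNS data, which this sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 64)[:, None]   # 64 snapshots in time
x = np.linspace(0, 1, 128)[None, :]          # 128 spatial points
# two coherent "large-scale" structures plus small-amplitude noise
snapshots = (np.sin(t) * np.sin(np.pi * x)
             + 0.3 * np.cos(2 * t) * np.sin(2 * np.pi * x)
             + 0.01 * rng.standard_normal((64, 128)))

mean = snapshots.mean(axis=0)
U, s, Vt = np.linalg.svd(snapshots - mean, full_matrices=False)
energy = s**2 / np.sum(s**2)   # fraction of fluctuation energy per mode
modes = Vt                     # rows are spatial POD modes
```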
NASA Astrophysics Data System (ADS)
De Dominicis, Michela; O'Hara Murray, Rory; Wolf, Judith
2017-04-01
A comprehensive assessment of the tidal energy resource realistically available for electricity generation and the study of the potential environmental impacts associated with its extraction in the Pentland Firth (Scottish Waters, UK) are presented. In order to examine both local (< 100 km) and region-wide (>100 km) spatial scales, the Scottish Shelf Model (SSM), an unstructured grid three-dimensional FVCOM (Finite Volume Community Ocean Model) model implementation has been used, since it covers the entire NW European Shelf, with a high resolution where the tidal stream energy is extracted. A large theoretical array of tidal stream turbines has been designed and implemented in the model using the momentum sink approach, in which a momentum sink term represents the loss of momentum due to tidal energy extraction. The estimate of the maximum available power for electricity generation from the Pentland Firth is 1.64 GW, which requires thousands of turbines to be deployed. This estimate takes into account the tidal stream energy extraction feedbacks on the flow and considers, for the first time, the realistic operation of a generic tidal stream turbine, which is limited to operate in a range of flow velocities due to technological constraints. The ocean response to the extraction of 1.64 GW of energy has been examined by comparing a typical annual cycle of the NW European Shelf hydrodynamics reproduced by the SSM with the same period perturbed by tidal stream energy extraction. The changes were analysed at the temporal scale of a spring-neap tidal cycle and, for the first time, on longer term seasonal timescales. Tidal elevation mainly increases in the vicinity of the tidal farm, while far-field effects show a decrease in the mean spring tidal range of the order of 2 cm along the whole east coast of the UK, possibly counteracting some part of the predicted sea level rise due to climate change. Marine currents, both tidal and residual flows, are also affected. 
They can slow down due to the turbines' action or speed up due to flow diversion processes, on both a local and regional scale. The strongest signal in tidal velocities is an overall reduction, which can in turn decrease the energy of tidal mixing and perturb the seasonal stratification on the NW European Shelf. Although the strength of summer water stratification has been found to increase slightly, the extent of the stratified region does not greatly change, suggesting that enhanced biological and pelagic biodiversity hotspots, e.g. tidal mixing front locations, are not displaced. Such large-scale tidal stream energy extraction is unlikely to occur in the near future, but such potential changes should be considered when planning future tidal energy exploitation. It is likely that large-scale developments around the NW European shelf will interact and could, for example, intensify or weaken the changes predicted here, or even be used as mitigation measures (e.g. coastal defence) for other changes (e.g. climate change).
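The momentum sink approach amounts to adding a quadratic drag term for the turbines to the momentum equation; a toy explicit update for a single grid cell is sketched below (all coefficients are illustrative, not the SSM configuration).

```python
def momentum_sink_step(u, dt=10.0, cd=0.6, area_density=0.01):
    """One explicit time step of du/dt = -0.5 * cd * a * |u| * u,
    where cd is a turbine thrust coefficient and a (1/m) the swept
    area per unit volume. Values here are illustrative only."""
    return u - dt * 0.5 * cd * area_density * abs(u) * u

u = 2.0                        # undisturbed tidal current, m/s
for _ in range(20):
    u = momentum_sink_step(u)  # flow decelerates through the turbine farm
```

Because the sink is quadratic in the speed, the deceleration is strongest at peak flow, which is why extraction reshapes the tidal signal rather than simply scaling it down.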
Watchueng, Jean; Kamnaing, Pierre; Gao, Jin-Ming; Kiyota, Taira; Yeboah, Faustinus; Konishi, Yasuo
2011-05-20
Paclitaxel was purified using the high-performance displacement chromatography (HPDC) technique, but not by the mechanism of HPDC. On a small scale, paclitaxel was extracted with methanol from dry needles of Taxus canadensis and was enriched by extracting with chloroform after removing water-soluble hydrophilic components and hexane-soluble hydrophobic components. Then, 93-99% purity of paclitaxel was obtained using the HPDC technique. On a large scale, taxanes were enriched by solvent partitioning between acetic acid/MeOH/H2O and hexane and extracted with CH2Cl2. Taxanes other than paclitaxel were further removed by extracting with methanol-water-trifluoroacetic acid (1.0:98.9:0.1, v/v/v). Applying the HPDC technique to water-insoluble substances is problematic, as this method requires a highly aqueous solvent system. In order to overcome this incompatibility, a system was set up where paclitaxel, although in low concentration, was extracted by methanol-water-trifluoroacetic acid (10.0:89.9:0.1, v/v/v). Recycling the extracting solvent to ensure minimal volume, the extracted paclitaxel was adsorbed on a C18 trap column. A C18 column of 4.6 mm internal diameter was then connected to the trap column. The HPDC technique was then carried out using an isocratic acetonitrile-water-trifluoroacetic acid (30.0:69.9:0.1, v/v/v) mobile phase containing the displacer cetylpyridinium trifluoroacetate (3 mg/mL). Paclitaxel was co-eluted with the displacer and spontaneously crystallized. The crystals (114 mg) showed 99.4% purity, and only 10% of the paclitaxel in the starting crude extract was lost during the enrichment/purification processes. This large-scale purification method was successfully applied to purify paclitaxel from Chinese yew at small scale, suggesting general applicability of the method. This is the first report of purifying a water-insoluble natural product using the HPDC technique. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
Fish scale terrace GaInN/GaN light-emitting diodes with enhanced light extraction
NASA Astrophysics Data System (ADS)
Stark, Christoph J. M.; Detchprohm, Theeradetch; Zhao, Liang; Paskova, Tanya; Preble, Edward A.; Wetzel, Christian
2012-12-01
Non-planar GaInN/GaN light-emitting diodes were epitaxially grown to exhibit steps for enhanced light emission. By means of a large off-cut of the epitaxial growth plane from the c-plane (0.06° to 2.24°), surface morphologies of steps and inclined terraces that resemble fish scale patterns could controllably be achieved. These patterns penetrate the active region without deteriorating the electrical device performance. We find conditions leading to a large increase in light-output power over the virtually on-axis device and over planar sapphire references. The process is found suitable to enhance light extraction even without post-growth processing.
ERIC Educational Resources Information Center
Nelson, Jason M.; Canivez, Gary L.; Lindstrom, Will; Hatt, Clifford V.
2007-01-01
The factor structure of the Reynolds Intellectual Assessment Scales (RIAS; [Reynolds, C.R., & Kamphaus, R.W. (2003). "Reynolds Intellectual Assessment Scales". Lutz, FL: Psychological Assessment Resources, Inc.]) was investigated with a large (N=1163) independent sample of referred students (ages 6-18). More rigorous factor extraction criteria…
Mehryary, Farrokh; Kaewphan, Suwisa; Hakala, Kai; Ginter, Filip
2016-01-01
Biomedical event extraction is one of the key tasks in biomedical text mining, supporting various applications such as database curation and hypothesis generation. Several systems, some of which have been applied at a large scale, have been introduced to solve this task. Past studies have shown that the identification of the phrases describing biological processes, also known as trigger detection, is a crucial part of event extraction, and notable overall performance gains can be obtained by solely focusing on this sub-task. In this paper we propose a novel approach for filtering falsely identified triggers from large-scale event databases, thus improving the quality of knowledge extraction. Our method relies on state-of-the-art word embeddings, event statistics gathered from the whole biomedical literature, and both supervised and unsupervised machine learning techniques. We focus on EVEX, an event database covering the whole PubMed and PubMed Central Open Access literature, containing more than 40 million extracted events. The most frequent EVEX trigger words are hierarchically clustered, and the resulting cluster tree is pruned to identify words that can never act as triggers regardless of their context. For rarely occurring trigger words we introduce a supervised approach trained on the combination of the trigger word classification produced by the unsupervised clustering method and manual annotation. The method is evaluated on the official test set of the BioNLP Shared Task on Event Extraction. The evaluation shows that the method can be used to improve the performance of state-of-the-art event extraction systems. This successful effort also translates into removing 1,338,075 potentially incorrect events from EVEX, thus greatly improving the quality of the data. The method is not bound solely to the EVEX resource and can thus be used to improve the quality of any event extraction system or database.
The data and source code for this work are available at: http://bionlp-www.utu.fi/trigger-clustering/.
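The pruning idea can be illustrated with SciPy's hierarchical clustering: group candidate trigger words by embedding similarity, then discard whole clusters whose members never behave as triggers. The words and the 2-D "embeddings" below are toy stand-ins (real work would use word2vec-style vectors), not the EVEX data or code.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

words = ["expression", "binds", "phosphorylation",   # trigger-like
         "patient", "hospital", "figure"]            # never triggers
vecs = np.array([[0.90, 0.10], [0.80, 0.20], [0.85, 0.15],
                 [0.10, 0.90], [0.15, 0.85], [0.20, 0.80]])

# Average-linkage clustering under cosine distance; cutting the tree
# at a small distance separates the two groups of words.
Z = linkage(vecs, method="average", metric="cosine")
labels = fcluster(Z, t=0.2, criterion="distance")
clusters = {}
for w, c in zip(words, labels):
    clusters.setdefault(c, []).append(w)
# clusters whose members never act as triggers can be pruned wholesale
```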
SureChEMBL: a large-scale, chemically annotated patent document database.
Papadatos, George; Davies, Mark; Dedman, Nathan; Chambers, Jon; Gaulton, Anna; Siddle, James; Koks, Richard; Irvine, Sean A; Pettersson, Joe; Goncharoff, Nicko; Hersey, Anne; Overington, John P
2016-01-04
SureChEMBL is a publicly available large-scale resource containing compounds extracted from the full text, images and attachments of patent documents. The data are extracted from the patent literature according to an automated text and image-mining pipeline on a daily basis. SureChEMBL provides access to a previously unavailable, open and timely set of annotated compound-patent associations, complemented with sophisticated combined structure and keyword-based search capabilities against the compound repository and patent document corpus; given the wealth of knowledge hidden in patent documents, analysis of SureChEMBL data has immediate applications in drug discovery, medicinal chemistry and other commercial areas of chemical science. Currently, the database contains 17 million compounds extracted from 14 million patent documents. Access is available through a dedicated web-based interface and data downloads at: https://www.surechembl.org/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Meadows, J.C.; Tillitt, D.E.; Schwartz, T.R.; Schroeder, D.J.; Echols, K.R.; Gale, R.W.; Powell, D.C.; Bursian, S.J.
1996-01-01
A 41.3-kg sample of double-crested cormorant (Phalacrocorax auritus) egg contents was extracted, yielding over 2 L of egg lipid. The double-crested cormorant (DCC) egg extract, after clean-up and concentration, was intended for use in egg injection studies to determine the embryotoxicity of the organic contaminants found within the eggs. Large-scale dialysis was used as a preliminary treatment to separate the extracted contaminants from the co-extracted sample lipids. The lipid was dialyzed in 80×5 cm semi-permeable membrane devices (SPMDs) in 50-ml aliquants. After the removal of 87 g of cholesterol by freeze-fractionation, the remaining lipid carryover (56 g) was removed by 100 routine gel permeation chromatography (GPC) operations. A 41,293-g sample was thus extracted and purified to the extent that it could easily be concentrated to a volume of 5 ml, the volume calculated to be necessary for the egg injection study. Analyses were performed comparing contaminant concentrations in the final purified extract to those present in the original egg material, in the extract after dialysis and cholesterol removal, and in the excluded materials. Recoveries of organochlorine pesticides through dialysis and cholesterol removal ranged from 96% to 135%. Total polychlorinated biphenyls in the final extract were 96% of those measured in the original egg material. Analysis of the excluded lipid and cholesterol indicated that 92% of the polychlorinated dibenzodioxins and -furans were separated into the final extract.
SPECTRAL LINE DE-CONFUSION IN AN INTENSITY MAPPING SURVEY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Yun-Ting; Bock, James; Bradford, C. Matt
2016-12-01
Spectral line intensity mapping (LIM) has been proposed as a promising tool to efficiently probe the cosmic reionization and the large-scale structure. Without detecting individual sources, LIM makes use of all available photons and measures the integrated light in the source confusion limit to efficiently map the three-dimensional matter distribution on large scales as traced by a given emission line. One particular challenge is the separation of desired signals from astrophysical continuum foregrounds and line interlopers. Here we present a technique to extract large-scale structure information traced by emission lines from different redshifts, embedded in a three-dimensional intensity mapping data cube. The line redshifts are distinguished by the anisotropic shape of the power spectra when projected onto a common coordinate frame. We consider the case where high-redshift [C ii] lines are confused with multiple low-redshift CO rotational lines. We present a semi-analytic model for [C ii] and CO line estimates based on the cosmic infrared background measurements, and show that with a modest instrumental noise level and survey geometry, the large-scale [C ii] and CO power spectrum amplitudes can be successfully extracted from a confusion-limited data set, without external information. We discuss the implications and limits of this technique for possible LIM experiments.
FIELD-SCALE STUDIES: HOW DOES SOIL SAMPLE PRETREATMENT AFFECT REPRESENTATIVENESS? (ABSTRACT)
Samples from field-scale studies are very heterogeneous and can contain large soil and rock particles. Oversize materials are often removed before chemical analysis of the soil samples because it is not practical to include these materials. Is the extracted sample representativ...
First results of the ITER-relevant negative ion beam test facility ELISE (invited).
Fantz, U; Franzen, P; Heinemann, B; Wünderlich, D
2014-02-01
An important step in the European R&D roadmap towards the neutral beam heating systems of ITER is the new test facility ELISE (Extraction from a Large Ion Source Experiment) for large-scale extraction from a half-size ITER RF source. The test facility was constructed over the past several years at Max-Planck-Institut für Plasmaphysik, Garching, and is now operational. ELISE provides early experience with the performance and operation of large RF-driven negative hydrogen ion sources, with plasma illumination of a source area of 1 × 0.9 m² and an extraction area of 0.1 m² using 640 apertures. First results in volume operation, i.e., without caesium seeding, are presented.
Recovering the full velocity and density fields from large-scale redshift-distance samples
NASA Technical Reports Server (NTRS)
Bertschinger, Edmund; Dekel, Avishai
1989-01-01
A new method for extracting the large-scale three-dimensional velocity and mass density fields from measurements of the radial peculiar velocities is presented. Galaxies are assumed to trace the velocity field rather than the mass. The key assumption made is that the Lagrangian velocity field has negligible vorticity, as might be expected from perturbations that grew by gravitational instability. By applying the method to cosmological N-body simulations, it is demonstrated that it accurately reconstructs the velocity field. This technique promises a direct determination of the mass density field and the initial conditions for the formation of large-scale structure from galaxy peculiar velocity surveys.
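The irrotationality assumption is what makes the reconstruction tractable: if v = ∇Φ, the potential can be recovered by integrating the observed radial peculiar velocities outward along each line of sight. A minimal numerical sketch of that radial integration, on a single synthetic line of sight with a made-up velocity profile (illustrative only, not the authors' code):

```python
import numpy as np

# For an irrotational flow v = grad(Phi), the radial velocity along a line
# of sight from the observer satisfies v_r(r) = dPhi/dr, so the potential
# follows by radial integration: Phi(r) = integral_0^r v_r(r') dr'.

def potential_from_radial_velocity(r, v_r):
    """Cumulative trapezoidal integration of v_r along one line of sight."""
    phi = np.zeros_like(r)
    dr = np.diff(r)
    phi[1:] = np.cumsum(0.5 * (v_r[1:] + v_r[:-1]) * dr)
    return phi

# Synthetic check: choose Phi(r) = r^2, so v_r = 2r.
r = np.linspace(0.0, 10.0, 1001)
phi = potential_from_radial_velocity(r, 2.0 * r)
# Transverse velocity components then follow from angular derivatives of Phi.
```

In the full method the potential recovered on many lines of sight is smoothed and differentiated to give the three-dimensional velocity and, via gravitational instability theory, the mass density field.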
Large scale preparation and crystallization of neuron-specific enolase.
Ishioka, N; Isobe, T; Kadoya, T; Okuyama, T; Nakajima, T
1984-03-01
A simple method has been developed for the large scale purification of neuron-specific enolase [EC 4.2.1.11]. The method consists of ammonium sulfate fractionation of brain extract, and two subsequent column chromatography steps on DEAE Sephadex A-50. The chromatography was performed on a short (25 cm height) and thick (8.5 cm inside diameter) column unit that was specially devised for the large scale preparation. The purified enolase was crystallized in 0.05 M imidazole-HCl buffer containing 1.6 M ammonium sulfate (pH 6.39), with a yield of 0.9 g/kg of bovine brain tissue.
Plianwong, Samarwadee; Sripattanaporn, Areerut; Waewsa-nga, Kwanrutai; Buacheen, Parin; Opanasopit, Praneet; Ngawhirunpat, Tanasait; Rojanarata, Theerasak
2012-08-30
A fast, facile, and economical assay for basic nitrogenous drugs has been developed, based on mini-scale extraction of the drug-dye ion pair complex combined with a safer, eco-friendlier organic extractant and drop-based micro-spectrophotometry. Instead of using large-volume devices, the extraction was simply carried out in typical 1.5 mL microcentrifuge tubes, along with the use of micropipettes for accurate transfer of liquids, a vortex mixer for efficient partitioning of solutes, and a benchtop centrifuge for rapid phase separation. In the last step, back-extraction was performed using a microvolume of acidic solution in order to concentrate the colored species into a confined aqueous microdrop and to keep the analyst away from unwanted contact with, and inhalation of, organic solvents during the quantitation step, which was achieved by cuvetteless UV-vis micro-spectrophotometry without any prior dilutions. Using chlorpheniramine maleate as a representative analyte and n-butyl acetate as a less toxic and non-ozone-depleting extractant, the miniaturized method was less laborious and much faster. It was accurate, precise, and insensitive to interference from common excipients. Notably, it gave assay results for the drug in tablets and oral solution comparable to the large-scale pharmacopeial method, while the consumption of organic solvents and the release of wastes were lowered 200- to 400-fold.
Evaluation of Sampling Methods for Bacillus Spore ...
Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.
The influence of large-scale wind power on global climate.
Keith, David W; Decarolis, Joseph F; Denkenberger, David C; Lenschow, Donald H; Malyshev, Sergey L; Pacala, Stephen; Rasch, Philip J
2004-11-16
Large-scale use of wind power can alter local and global climate by extracting kinetic energy and altering turbulent transport in the atmospheric boundary layer. We report climate-model simulations that address the possible climatic impacts of wind power at regional to global scales by using two general circulation models and several parameterizations of the interaction of wind turbines with the boundary layer. We find that very large amounts of wind power can produce nonnegligible climatic change at continental scales. Although large-scale effects are observed, wind power has a negligible effect on global-mean surface temperature, and it would deliver enormous global benefits by reducing emissions of CO₂ and air pollutants. Our results may enable a comparison between the climate impacts due to wind power and the reduction in climatic impacts achieved by the substitution of wind for fossil fuels.
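For scale, the kinetic energy flux a turbine taps is the standard ½ρAv³, and the Betz limit caps the extractable fraction at 16/27. The numbers below are illustrative back-of-envelope values, not taken from the paper:

```python
# Back-of-envelope sketch (hypothetical turbine, illustrative numbers):
# kinetic energy flux through a rotor of area A in wind of speed v is
# P = 0.5 * rho * A * v**3; the Betz limit caps extraction at 16/27.
import math

rho = 1.225          # air density at sea level, kg/m^3
radius = 50.0        # rotor radius, m
v = 8.0              # wind speed, m/s
area = math.pi * radius**2

flux_w = 0.5 * rho * area * v**3      # kinetic energy flux through rotor, W
betz_w = (16.0 / 27.0) * flux_w       # Betz-limited extractable power, W
flux_mw = flux_w / 1e6                # ~2.46 MW for these illustrative numbers
```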
Bremer, Peer-Timo; Weber, Gunther; Tierny, Julien; Pascucci, Valerio; Day, Marcus S; Bell, John B
2011-09-01
Large-scale simulations are increasingly being used to study complex scientific and engineering phenomena. As a result, advanced visualization and data analysis are also becoming an integral part of the scientific process. Often, a key step in extracting insight from these large simulations involves the definition, extraction, and evaluation of features in the space and time coordinates of the solution. However, in many applications, these features involve a range of parameters and decisions that will affect the quality and direction of the analysis. Examples include particular level sets of a specific scalar field, or local inequalities between derived quantities. A critical step in the analysis is to understand how these arbitrary parameters/decisions impact the statistical properties of the features, since such a characterization will help to evaluate the conclusions of the analysis as a whole. We present a new topological framework that in a single pass extracts and encodes entire families of possible feature definitions as well as their statistical properties. For each time step we construct a hierarchical merge tree, a highly compact yet flexible feature representation. While this data structure is more than two orders of magnitude smaller than the raw simulation data, it allows us to extract a set of features for any given parameter selection in a postprocessing step. Furthermore, we augment the trees with additional attributes, making it possible to gather a large number of useful global, local, and conditional statistics that would otherwise be extremely difficult to compile. We also use this representation to create tracking graphs that describe the temporal evolution of the features over time. Our system provides a linked-view interface to explore the time-evolution of the graph interactively alongside the segmentation, thus making it possible to perform extensive data analysis in a very efficient manner.
We demonstrate our framework by extracting and analyzing burning cells from a large-scale turbulent combustion simulation. In particular, we show how the statistical analysis enabled by our techniques provides new insight into the combustion process.
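One way to see what the merge tree encodes: for any single threshold t, features of the burning-cell kind are the connected components of the superlevel set {x : f(x) ≥ t}. The toy union-find sketch below answers that per-threshold query on a synthetic 2D field (a merge tree answers it for all thresholds at once, which is the point of the paper's data structure):

```python
import numpy as np

# Features at threshold t = connected components of {x : f(x) >= t},
# found with union-find on a 2D grid (4-connectivity).

def superlevel_components(field, t):
    h, w = field.shape
    parent = {}

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    for i in range(h):
        for j in range(w):
            if field[i, j] >= t:
                parent[(i, j)] = (i, j)
    for (i, j) in list(parent):
        for nb in ((i + 1, j), (i, j + 1)):  # right and down neighbours
            if nb in parent:
                union((i, j), nb)
    return len({find(p) for p in parent})

# Two bright blobs on a dark background -> two features at t = 0.5.
f = np.zeros((8, 8))
f[1:3, 1:3] = 1.0
f[5:7, 5:7] = 1.0
n = superlevel_components(f, 0.5)
```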
An improved active contour model for glacial lake extraction
NASA Astrophysics Data System (ADS)
Zhao, H.; Chen, F.; Zhang, M.
2017-12-01
The active contour model is a widely used method in visual tracking and image segmentation. Driven by an objective function, the initial curve defined in an active contour model evolves to a stable condition - the desired result in a given image. As a typical region-based active contour model, the C-V model performs well on weak boundaries and is robust to noise, which shows great potential for glacial lake extraction. Glacial lakes are a sensitive indicator of global climate change, so accurately delineating glacial lake boundaries is essential for evaluating the hydrologic and living environment. However, the current methods for glacial lake extraction, mainly water-index methods and recognition/classification methods, are difficult to apply directly to large-scale glacial lake extraction because of the diversity of glacial lakes and the many confounding factors in the imagery, such as image noise, shadows, snow, and ice. Given the abovementioned advantages of the C-V model and the difficulties of glacial lake extraction, we introduce the signed pressure force function to improve the C-V model and adapt it to glacial lake extraction. To assess the extraction results, three typical glacial lake development sites were selected, in the Altai mountains, the central Himalayas, and south-eastern Tibet; Landsat8 OLI imagery served as the experimental data source, with Google Earth imagery as reference data for verifying the results. The experimental results suggest that the improved active contour model we propose can effectively discriminate glacial lakes from complex backgrounds with a high Kappa coefficient (0.895), especially for small glacial lakes, which constitute weak information in the image. Our findings provide a new approach with improved accuracy where small glacial lakes make up a large proportion of the total, and the possibility of automated glacial lake mapping over large areas.
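The region-competition step shared by the C-V model and its signed-pressure-force variants can be sketched compactly: alternate between computing the mean intensities c1, c2 inside and outside the current contour, and reassigning pixels by the sign of I − (c1 + c2)/2 (the core of the SPF function). The following is an illustrative simplification on a synthetic image, not the authors' implementation:

```python
import numpy as np

# Simplified two-phase region competition in the spirit of C-V / SPF
# models: (a) compute region means c1, c2 for the current mask, then
# (b) move the "contour" to the zero-crossing of spf = I - (c1 + c2)/2.

def two_phase_segment(image, iters=20):
    mask = image > image.mean()            # crude initial contour
    for _ in range(iters):
        c1 = image[mask].mean() if mask.any() else 0.0
        c2 = image[~mask].mean() if (~mask).any() else 0.0
        spf = image - 0.5 * (c1 + c2)      # signed pressure force (up to scale)
        new_mask = spf > 0
        if np.array_equal(new_mask, mask):
            break                          # converged
        mask = new_mask
    return mask

# Synthetic "lake": bright square on a noisy dark background.
rng = np.random.default_rng(0)
img = rng.normal(0.1, 0.05, (32, 32))
img[8:24, 8:24] += 0.8
seg = two_phase_segment(img)
```

A real level-set implementation adds curve regularization (curvature or Gaussian smoothing of the level-set function), which is what handles weak boundaries gracefully.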
El Bali, Latifa; Diman, Aurélie; Bernard, Alfred; Roosens, Nancy H. C.; De Keersmaecker, Sigrid C. J.
2014-01-01
Human genomic DNA extracted from urine could be an interesting tool for large-scale public health studies involving characterization of genetic variations or DNA biomarkers, thanks to the simple and noninvasive collection method. These studies, involving many samples, require a rapid, easy, and standardized extraction protocol. Moreover, for practicability, it is necessary to collect urine at a moment other than the first void and to store it appropriately until analysis. The present study compared seven commercial kits to select the most appropriate urinary human DNA extraction procedure for epidemiological studies. DNA yield was determined using different quantification methods: two classical ones, i.e., NanoDrop and PicoGreen, and two species-specific real-time quantitative (q)PCR assays, as DNA extracted from urine contains not only human but also microbial DNA, which contributes largely to the total DNA yield. In addition, the kits giving a good yield were also tested for the presence of PCR inhibitors. Further comparisons were performed regarding the sampling time and the storage conditions. Finally, as a proof of concept, an important gene related to smoking was genotyped using the developed tools. We could select one well-performing kit for human DNA extraction from urine suitable for molecular diagnostic real-time qPCR-based assays targeting genetic variations, applicable to large-scale studies. In addition, successful genotyping was possible using DNA extracted from urine stored at −20°C for several months, and an acceptable yield could also be obtained from urine collected at different moments during the day, which is particularly important for public health studies. PMID:25365790
ERIC Educational Resources Information Center
DOLBY, J.L.; AND OTHERS
THE STUDY IS CONCERNED WITH THE LINGUISTIC PROBLEM INVOLVED IN TEXT COMPRESSION--EXTRACTING, INDEXING, AND THE AUTOMATIC CREATION OF SPECIAL-PURPOSE CITATION DICTIONARIES. IN SPITE OF EARLY SUCCESS IN USING LARGE-SCALE COMPUTERS TO AUTOMATE CERTAIN HUMAN TASKS, THESE PROBLEMS REMAIN AMONG THE MOST DIFFICULT TO SOLVE. ESSENTIALLY, THE PROBLEM IS TO…
On the linearity of tracer bias around voids
NASA Astrophysics Data System (ADS)
Pollina, Giorgia; Hamaus, Nico; Dolag, Klaus; Weller, Jochen; Baldi, Marco; Moscardini, Lauro
2017-07-01
The large-scale structure of the Universe can be observed only via luminous tracers of the dark matter. However, the clustering statistics of tracers are biased and depend on various properties, such as their host-halo mass and assembly history. On very large scales, this tracer bias results in a constant offset in the clustering amplitude, known as linear bias. Towards smaller non-linear scales, this is no longer the case and tracer bias becomes a complicated function of scale and time. We focus on tracer bias centred on cosmic voids, i.e. depressions of the density field that spatially dominate the Universe. We consider three types of tracers: galaxies, galaxy clusters and active galactic nuclei, extracted from the hydrodynamical simulation Magneticum Pathfinder. In contrast to common clustering statistics that focus on auto-correlations of tracers, we find that void-tracer cross-correlations are successfully described by a linear bias relation. The tracer-density profile of voids can thus be related to their matter-density profile by a single number. We show that it coincides with the linear tracer bias extracted from the large-scale auto-correlation function and expectations from theory, if sufficiently large voids are considered. For smaller voids we observe a shift towards higher values. This has important consequences for cosmological parameter inference, as the problem of unknown tracer bias is alleviated up to a constant number. The smallest scales in existing data sets become accessible to simpler models, providing numerous modes of the density field that have been disregarded so far, but may help to further reduce statistical errors in constraining cosmology.
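If the void-tracer relation is truly linear, the bias reduces to a single number obtainable by least squares from the two profiles, b = Σδtδm / Σδm². A sketch with made-up toy profiles (the functional form below is purely illustrative):

```python
import numpy as np

# Least-squares fit of a single constant bias b in the linear relation
# delta_tracer(r) = b * delta_matter(r) around voids.

def fit_linear_bias(delta_tracer, delta_matter):
    dm = np.asarray(delta_matter)
    dt = np.asarray(delta_tracer)
    return float(np.dot(dt, dm) / np.dot(dm, dm))

r = np.linspace(0.5, 3.0, 30)
delta_m = -np.exp(-r)                               # toy void density profile
b_true = 1.8
delta_t = b_true * delta_m + 0.01 * np.sin(5 * r)   # small fake "noise"
b_fit = fit_linear_bias(delta_t, delta_m)
```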
Spatial resolution requirements for automated cartographic road extraction
Benjamin, S.; Gaydos, L.
1990-01-01
Ground resolution requirements for detection and extraction of road locations in a digitized large-scale photographic database were investigated. A color infrared photograph of Sunnyvale, California was scanned, registered to a map grid, and spatially degraded to 1- to 5-metre resolution pixels. Road locations in each data set were extracted using a combination of image processing and CAD programs. These locations were compared to a photointerpretation of road locations to determine a preferred pixel size for the extraction method. Based on road pixel omission error computations, a 3-metre pixel resolution appears to be the best choice for this extraction method. -Authors
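The road-pixel omission error driving the comparison can be computed as the fraction of reference (photointerpreted) road pixels that the automated extraction failed to label as road. A minimal sketch with a hypothetical reference map:

```python
import numpy as np

# Omission error = (reference road pixels missed by the extraction)
#                / (total reference road pixels).

def omission_error(extracted, reference):
    ref = np.asarray(reference, dtype=bool)
    ext = np.asarray(extracted, dtype=bool)
    missed = np.logical_and(ref, ~ext).sum()
    return missed / ref.sum()

reference = np.zeros((10, 10), dtype=bool)
reference[4, :] = True                  # a horizontal road, 10 pixels
extracted = reference.copy()
extracted[4, 7:] = False                # extraction misses 3 of them
err = omission_error(extracted, reference)
```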
Modelling the large-scale redshift-space 3-point correlation function of galaxies
NASA Astrophysics Data System (ADS)
Slepian, Zachary; Eisenstein, Daniel J.
2017-08-01
We present a configuration-space model of the large-scale galaxy 3-point correlation function (3PCF) based on leading-order perturbation theory and including redshift-space distortions (RSD). This model should be useful in extracting distance-scale information from the 3PCF via the baryon acoustic oscillation method. We include the first redshift-space treatment of biasing by the baryon-dark matter relative velocity. Overall, on large scales the effect of RSD is primarily a renormalization of the 3PCF that is roughly independent of both physical scale and triangle opening angle; for our adopted Ωm and bias values, the rescaling is a factor of ~1.8. We also present an efficient scheme for computing 3PCF predictions from our model, important for allowing fast exploration of the space of cosmological parameters in future analyses.
Groups of galaxies in the Center for Astrophysics redshift survey
NASA Technical Reports Server (NTRS)
Ramella, Massimo; Geller, Margaret J.; Huchra, John P.
1989-01-01
By applying the Huchra and Geller (1982) objective group identification algorithm to the Center for Astrophysics' redshift survey, a catalog of 128 groups with three or more members is extracted, and 92 of these are used as a statistical sample. A comparison of the distribution of group centers with the distribution of all galaxies in the survey indicates qualitatively that groups trace the large-scale structure of the region. The physical properties of groups may be related to the details of large-scale structure, and it is concluded that differences among group catalogs may be due to the properties of large-scale structures and their location relative to the survey limits.
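The Huchra & Geller (1982) procedure is a friends-of-friends linking: galaxies closer than a linking length join the same group. The real algorithm uses separate projected and line-of-sight linking lengths scaled with survey depth; the toy sketch below uses a single fixed 3D length and made-up positions:

```python
import numpy as np

# Toy friends-of-friends group finder: union-find over all pairs closer
# than a fixed linking length (the published algorithm scales the
# linking lengths with the sampling of the luminosity function).

def friends_of_friends(positions, link_length):
    n = len(positions)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(positions[i] - positions[j]) < link_length:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

pts = np.array([[0, 0, 0], [0.4, 0, 0], [0.8, 0, 0],   # a chain: one group
                [5, 5, 5], [5.3, 5, 5],                # a pair
                [9, 9, 9]], dtype=float)               # an isolated galaxy
groups = friends_of_friends(pts, link_length=0.5)
```

A catalog like the one in the abstract would then keep only groups with three or more members.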
Supercritical fluid extraction of plant flavors and fragrances.
Capuzzo, Andrea; Maffei, Massimo E; Occhipinti, Andrea
2013-06-19
Supercritical fluid extraction (SFE) of plant material with solvents like CO₂, propane, butane, or ethylene is a topic of growing interest. SFE allows the processing of plant material at low temperatures, hence limiting thermal degradation, and avoids the use of toxic solvents. Although today SFE is mainly used for decaffeination of coffee and tea as well as production of hop extracts on a large scale, there is also a growing interest in this extraction method for other industrial applications operating at different scales. In this review we update the literature data on SFE technology, with particular reference to flavors and fragrances, by comparing traditional extraction techniques for some industrial medicinal and aromatic crops with SFE. Moreover, we describe the biological activity of SFE extracts by describing their insecticidal, acaricidal, antimycotic, antimicrobial, cytotoxic and antioxidant properties. Finally, we discuss process modelling, mass-transfer mechanisms, kinetic parameters and thermodynamics, giving an overview of the potential of SFE in the flavors and fragrances arena.
A Large-Scale Analysis of Variance in Written Language
ERIC Educational Resources Information Center
Johns, Brendan T.; Jamieson, Randall K.
2018-01-01
The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers,…
Meullemiestre, A; Petitcolas, E; Maache-Rezzoug, Z; Chemat, F; Rezzoug, S A
2016-01-01
Maritime pine sawdust, a by-product of the wood transformation industry, has been investigated as a potential source of polyphenols, which were extracted by ultrasound-assisted maceration (UAM). UAM was optimized to enhance the extraction efficiency of polyphenols and reduce processing time. First, a preliminary study was carried out to optimize the solid/liquid ratio (6 g of dry material per mL) and the particle size (0.26 cm²) by conventional maceration (CVM). Under these conditions, the optimum conditions for polyphenol extraction by UAM, obtained by response surface methodology, were 0.67 W/cm² for the ultrasonic intensity (UI), 40°C for the processing temperature (T) and 43 min for the sonication time (t). UAM was compared with CVM; the results showed that the quantity of polyphenols was improved by 40% (342.4 and 233.5 mg of catechin equivalent per 100 g of dry basis, respectively, for UAM and CVM). A multistage cross-current extraction procedure made it possible to evaluate the real impact of UAM on the enhancement of solid-liquid extraction. The potential industrialization of this procedure was explored through a transition from a lab-scale sonicated reactor (3 L) to a large-scale 30 L one.
Katsura, Kazushige; Matsuda, Takayoshi; Tomabechi, Yuri; Yonemochi, Mayumi; Hanada, Kazuharu; Ohsawa, Noboru; Sakamoto, Kensaku; Takemoto, Chie; Shirouzu, Mikako
2017-11-01
Cell-free protein synthesis is a useful method for preparing proteins for functional or structural analyses. However, batch-to-batch variability in protein synthesis activity remains a problem for large-scale production of cell extract in the laboratory. To address this issue, we have developed a novel procedure for large-scale preparation of bacterial cell extract with high protein synthesis activity. The developed procedure comprises cell cultivation using a fermentor, harvesting and washing of cells by tangential flow filtration, cell disruption with a high-pressure homogenizer, and continuous diafiltration. By optimizing and combining these methods, ∼100 ml of cell extract was prepared from 150 g of Escherichia coli cells. The protein synthesis activities, defined as the yield of protein per unit of absorbance at 260 nm of the cell extract, were shown to be reproducible, and the average activity of several batches was twice that obtained using a previously reported method. In addition, combinatorial use of the high-pressure homogenizer and diafiltration increased the scalability, allowing the cell concentration at disruption to vary from 0.04 to 1 g/ml. Furthermore, addition of Gam protein and examination of the N-terminal sequence rendered the extract prepared here useful for rapid screening with linear DNA templates.
Hu, Shun-Wei; Chen, Shushi
2017-01-01
The large-scale simultaneous extraction and concentration of aqueous solutions of triazine analogs, and aflatoxins, through a hydrocarbon-based membrane (e.g., polyethylene, polyethylene/polypropylene copolymer) under ambient temperature and atmospheric pressure is reported. The subsequent adsorption of analyte in the extraction chamber over the lignin-modified silica gel facilitates the process by reducing the operating time. The maximum adsorption capacity values for triazine analogs and aflatoxins are mainly adsorption mechanism-dependent and were calculated to be 0.432 and 0.297 mg/10 mg, respectively. The permeation, and therefore the percentage of analyte extracted, ranges from 1% to almost 100%, and varies among the solvents examined. It is considered to be vapor pressure- and chemical polarity-dependent, and is thus highly affected by the nature and thickness of the membrane, the discrepancy in the solubility values of the analyte between the two liquid phases, and the amount of adsorbent used in the process. A dependence on the size of the analyte was observed in the adsorption capacity measurement, but not in the extraction process. The theoretical interaction simulation and FTIR data show that the planar aflatoxin molecule releases much more energy when facing toward the membrane molecule when approaching it, and the mechanism leading to the adsorption. PMID:28398252
Athavale, Prashant; Xu, Robert; Radau, Perry; Nachman, Adrian; Wright, Graham A
2015-07-01
Images consist of structures of varying scales: large scale structures such as flat regions, and small scale structures such as noise, textures, and rapidly oscillatory patterns. In the hierarchical (BV, L²) image decomposition, Tadmor et al. (2004) start with extracting coarse scale structures from a given image, and successively extract finer structures from the residuals in each step of the iterative decomposition. We propose to begin instead by extracting the finest structures from the given image and then proceed to extract increasingly coarser structures. In most images, noise can be considered a fine scale structure. Thus, starting the image decomposition with finer scales, rather than large scales, leads to fast denoising. We note that our approach turns out to be equivalent to the nonstationary regularization in Scherzer and Weickert (2000). The continuous limit of this procedure leads to a time-scaled version of total variation flow. Motivated by specific clinical applications, we introduce an image-dependent weight in the regularization functional, and study the corresponding weighted TV flow. We show that the edge-preserving property of the multiscale representation of an input image obtained with the weighted TV flow can be enhanced and localized by appropriate choice of the weight. We use this in developing an efficient and edge-preserving denoising algorithm with control on speed and localization properties. We examine analytical properties of the weighted TV flow that give precise information about the denoising speed and the rate of change of energy of the images. An additional contribution of the paper is to use the images obtained at different scales for robust multiscale registration. We show that the inherently multiscale nature of the weighted TV flow improved performance for registration of noisy cardiac MRI images, compared to other methods such as bilateral or Gaussian filtering.
A clinical application of the multiscale registration algorithm is also demonstrated for aligning viability assessment magnetic resonance (MR) images from 8 patients with previous myocardial infarctions. Copyright © 2015. Published by Elsevier B.V.
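The fine-to-coarse idea can be illustrated independently of the specific TV flow: repeatedly smooth the current residual and peel off the difference as the finest remaining layer, so that the layers plus the final residual reconstruct the input exactly. The sketch below substitutes a plain 3 × 3 box filter for the weighted TV flow, purely for illustration:

```python
import numpy as np

# Fine-to-coarse multiscale decomposition: at each step the finest
# remaining structure is (residual - smoother version of residual).
# By construction, sum(layers) + final residual == original image.

def box_smooth(u):
    """3x3 mean filter with edge replication (stand-in for a TV-flow step)."""
    p = np.pad(u, 1, mode="edge")
    return sum(p[i:i + u.shape[0], j:j + u.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def fine_to_coarse(image, n_layers=4):
    layers, residual = [], image.astype(float)
    for _ in range(n_layers):
        smoother = box_smooth(residual)
        layers.append(residual - smoother)   # finest remaining detail
        residual = smoother
    return layers, residual

rng = np.random.default_rng(1)
img = rng.normal(0.0, 1.0, (16, 16))
layers, coarse = fine_to_coarse(img)
recon = coarse + sum(layers)                 # exact reconstruction
```

For denoising, one simply discards the first (finest) layers; the TV flow of the paper additionally preserves edges while smoothing.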
Cardona, Jorge A; Lee, Joon-Hee; Talcott, Stephen T
2009-09-23
The muscadine grape ( Vitis rotundifolia ) industry of the southern United States is largely devoid of value-added processes that capture the phytochemical content of wine and juice byproducts. Methods to recover and stabilize polyphenolics from muscadine grape pomace following juice manufacture were evaluated in laboratory-scale and pilot-scale trials. In laboratory-scale trials using osmotic equilibration, water-based extracts from juice pomace initially extracted 31-42% of total polyphenolics, 26-32% of total ellagic acid, and 36-62% of total anthocyanins. When adsorbed onto Amberlite XAD-4 resin to concentrate polyphenolics, these extracts lost 10.5% of their total ellagic acid from inefficient adsorption to the solid phase support. Subsequent pilot-scale trials were evaluated using hot water extracts from grape juice pomace followed by aerobic yeast fermentation to remove sugars and comparison to reversed-phase C18 and Amberlite XAD-4. Extracts were also concentrated using spray-drying and vacuum evaporation. Fermentation had a minor impact on the retention of most polyphenolic compounds evaluated, yet resulted in a 16.3% decrease in antioxidant capacity. Spray-drying resulted in a 30.3% loss in total anthocyanins, a 21.5% loss in total phenolics, and a 23.3% decrease in antioxidant activity, whereas vacuum evaporation had no deleterious impact on these parameters. The physiology of the muscadine grape and its unique phytochemical composition has limited utilization of pomace from wine and juice manufacture. However, these studies demonstrated the potential to extract and concentrate polyphenolic-rich extracts for use in value-added applications.
Porous extraction paddle: a solid phase extraction technique for studying the urine metabolome
Shao, Gang; MacNeil, Michael; Yao, Yuanyuan; Giese, Roger W.
2016-01-01
RATIONALE A method was needed to accomplish solid phase extraction of a large urine volume in a convenient way where resources are limited, towards a goal of metabolome and xenobiotic exposome analysis at another, distant location. METHODS A porous extraction paddle (PEP) was set up, comprising a porous nylon bag containing extraction particles that is flattened and immobilized between two stainless steel meshes. Stirring the PEP after attachment to a shaft of a motor mounted on the lid of the jar containing the urine accomplishes extraction. The bag contained a mixture of nonpolar and partly nonpolar particles to extract a diversity of corresponding compounds. RESULTS Elution of a urine-exposed, water-washed PEP with aqueous methanol containing triethylammonium acetate (conditions intended to give a complete elution), followed by MALDI-TOF/TOF-MS, demonstrated that a diversity of compounds had been extracted ranging from uric acid to peptides. CONCLUSION The PEP allows the user to extract a large liquid sample in a jar simply by turning on a motor. The technique will be helpful in conducting metabolomics and xenobiotic exposome studies of urine, encouraging the extraction of large volumes to set up a convenient repository sample (e.g. 2 g of exposed adsorbent in a cryovial) for shipment and re-analysis in various ways in the future, including scaled-up isolation of unknown chemicals for identification. PMID:27624170
Multi scales based sparse matrix spectral clustering image segmentation
NASA Astrophysics Data System (ADS)
Liu, Zhongmin; Chen, Zhicai; Li, Zhanming; Hu, Wenjin
2018-04-01
In image segmentation, spectral clustering algorithms must adopt an appropriate scaling parameter to calculate the similarity matrix between pixels, which can have a great impact on the clustering result. Moreover, when the number of data instances is large, the computational complexity and memory use of the algorithm increase greatly. To solve these two problems, we propose a new spectral clustering image segmentation algorithm based on multiple scales and a sparse matrix. We first devise a new feature extraction method and extract image features at different scales, then use the feature information to construct a sparse similarity matrix, which improves operating efficiency. Compared with the traditional spectral clustering algorithm, image segmentation experiments show that our algorithm has better accuracy and robustness.
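The sparse-similarity idea in the abstract above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' algorithm: it builds a Gaussian affinity matrix, zeroes all but each point's k largest entries (so the matrix would be stored sparsely at scale), and splits the data with the sign of the graph Laplacian's second eigenvector; the paper's multi-scale feature extraction step is omitted, and the data, k, and kernel width are assumed values.

```python
import numpy as np

def knn_sparse_affinity(X, k, sigma=1.0):
    """Gaussian affinity, keeping only each point's k largest entries
    (at scale this pattern would be held in a sparse matrix)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)
    drop = np.argsort(W, axis=1)[:, :-k]   # indices of the smallest entries per row
    for i in range(len(W)):
        W[i, drop[i]] = 0.0
    return np.maximum(W, W.T)              # symmetrize

def fiedler_labels(W):
    """Two-way spectral split: sign of the Laplacian's second eigenvector."""
    L = np.diag(W.sum(1)) - W              # unnormalized graph Laplacian
    _, vecs = np.linalg.eigh(L)            # eigenvalues in ascending order
    return (vecs[:, 1] > 0).astype(int)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)),    # cluster around (0, 0)
               rng.normal(3.0, 0.1, (20, 2))])   # cluster around (3, 3)
labels = fiedler_labels(knn_sparse_affinity(X, k=25))
```

With two well-separated point clouds, the second eigenvector is nearly constant on each cluster, so its sign recovers the two groups.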
Application of LANDSAT data to delimitation of avalanche hazards in Montane Colorado
NASA Technical Reports Server (NTRS)
Knepper, D. H. (Principal Investigator); Ives, J. D.; Summer, R.
1975-01-01
The author has identified the following significant results. Interpretation of small-scale LANDSAT imagery provides a means for determining the general location and distribution of avalanche paths. The accuracy and completeness of small-scale mapping are less than those obtained from the interpretation of large-scale color infrared photos. Interpretation of enlargement prints (18X) of LANDSAT imagery is superior to small-scale imagery, because more detailed information can be extracted and annotated.
1987-09-01
77) Large scale purification of the acetylcholine receptor protein in its membrane-bound and detergent extracted forms from Torpedo marmorata...maintenance of the postsynaptic apparatus in the adult. Our studies have also led to the identification of agrin, a protein that is extracted from the synapse...in extracts of muscle, and monoclonal antibodies directed against agrin recognize molecules highly concentrated in the synaptic basal lamina at the
Complexity-aware simple modeling.
Gómez-Schiavon, Mariana; El-Samad, Hana
2018-02-26
Mathematical models continue to be essential for deepening our understanding of biology. On one extreme, simple or small-scale models help delineate general biological principles. However, the parsimony of detail in these models, as well as their assumption of modularity and insulation, makes them inaccurate for describing quantitative features. On the other extreme, large-scale and detailed models can quantitatively recapitulate a phenotype of interest, but have to rely on many unknown parameters, making them often difficult to parse mechanistically and to use for extracting general principles. We discuss some examples of a new approach, complexity-aware simple modeling, that can bridge the gap between the small-scale and large-scale approaches. Copyright © 2018 Elsevier Ltd. All rights reserved.
Avalanches and scaling collapse in the large-N Kuramoto model
NASA Astrophysics Data System (ADS)
Coleman, J. Patrick; Dahmen, Karin A.; Weaver, Richard L.
2018-04-01
We study avalanches in the Kuramoto model, defined as excursions of the order parameter due to ephemeral episodes of synchronization. We present scaling collapses of the avalanche sizes, durations, heights, and temporal profiles, extracting scaling exponents, exponent relations, and scaling functions that are shown to be consistent with the scaling behavior of the power spectrum, a quantity independent of our particular definition of an avalanche. A comprehensive scaling picture of the noise in the subcritical finite-N Kuramoto model is developed, linking this undriven system to a larger class of driven avalanching systems.
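The Kuramoto model underlying the avalanche study above is simple to simulate. The sketch below (not the authors' code; N, K, the Gaussian frequency distribution, and the Euler step are assumed choices) integrates the mean-field form of the dynamics and returns the time-averaged order parameter r, whose excursions are what the paper analyzes as avalanches.

```python
import numpy as np

def kuramoto_order_parameter(K, N=200, T=2000, dt=0.05, seed=1):
    """Euler-integrate dtheta_i/dt = omega_i + K*r*sin(psi - theta_i),
    the mean-field form of the Kuramoto coupling, and return the
    time-averaged order parameter r = |mean(exp(i*theta))|."""
    rng = np.random.default_rng(seed)
    omega = rng.standard_normal(N)           # natural frequencies
    theta = rng.uniform(0.0, 2 * np.pi, N)   # random initial phases
    rs = []
    for t in range(T):
        z = np.exp(1j * theta).mean()        # complex order parameter
        r, psi = np.abs(z), np.angle(z)
        theta = theta + dt * (omega + K * r * np.sin(psi - theta))
        if t > T // 2:                       # discard the transient
            rs.append(r)
    return float(np.mean(rs))

r_low = kuramoto_order_parameter(K=0.5)   # below the synchronization threshold
r_high = kuramoto_order_parameter(K=4.0)  # well above it
```

Below threshold r stays at the O(N^-1/2) noise floor; above it the oscillators lock and r approaches 1, which is the baseline on top of which the finite-N excursions ride.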
Sophia: An Expedient UMLS Concept Extraction Annotator.
Divita, Guy; Zeng, Qing T; Gundlapalli, Adi V; Duvall, Scott; Nebeker, Jonathan; Samore, Matthew H
2014-01-01
An opportunity exists for meaningful concept extraction and indexing from large corpora of clinical notes in the Veterans Affairs (VA) electronic medical record. Currently available tools such as MetaMap, cTAKES and HITEx do not scale up to address this big-data need. Sophia, a rapid UMLS concept extraction annotator, was developed to fulfill a mandate and address extraction where high throughput is needed while preserving performance. We report on the development, testing and benchmarking of Sophia against MetaMap and cTAKES. Sophia demonstrated improved performance on recall as compared to cTAKES and MetaMap (0.71 vs 0.66 and 0.38). The overall f-score was similar to cTAKES and an improvement over MetaMap (0.53 vs 0.57 and 0.43). With regard to speed of processing records, we noted Sophia to be severalfold faster than cTAKES and the scaled-out MetaMap service. Sophia offers a viable alternative for high-throughput information extraction tasks.
Kang, Dong Young; Kim, Won-Suk; Heo, In Sook; Park, Young Hun; Lee, Seungho
2010-11-01
Hyaluronic acid (HA) was extracted on a relatively large scale from rooster comb using a method similar to that reported previously. The extraction method was modified to simplify it and to reduce time and cost in order to accommodate large-scale extraction. Five hundred grams of frozen rooster combs yielded about 500 mg of dried HA. Extracted HA was characterized using asymmetrical flow field-flow fractionation (AsFlFFF) coupled online to a multiangle light scattering detector and a refractive index detector to determine the molecular size, molecular weight (MW) distribution, and molecular conformation of HA. For characterization of HA, AsFlFFF was operated by a simplified two-step procedure, instead of the conventional three-step procedure, in which the first two steps (sample loading and focusing) were combined into one to avoid the adsorption of viscous HA onto the channel membrane. The simplified two-step AsFlFFF yielded reasonably good separations of HA molecules based on their MWs. The weight-average MW (M(w)) and the average root-mean-square (RMS) radius of HA extracted from rooster comb were 1.20×10(6) and 94.7 nm, respectively. When the sample solution was filtered through a 0.45 μm disposable syringe filter, these were reduced to 3.8×10(5) and 50.1 nm, respectively. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Pelka, David G.; And Others
1978-01-01
The large-scale generation of electrical power by wind turbine fields is discussed. It is shown that the maximum power that can be extracted by a wind turbine is 16/27 of the power available in the wind. (BB)
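The 16/27 figure is the Betz limit, and it is easy to check numerically. A minimal sketch follows; the air density, rotor radius, and wind speed are illustrative assumptions, not values from the article.

```python
import math

rho = 1.225      # air density, kg/m^3 (sea-level standard; assumed)
radius = 40.0    # rotor radius, m (illustrative)
v = 10.0         # wind speed, m/s (illustrative)

area = math.pi * radius**2
p_wind = 0.5 * rho * area * v**3   # kinetic power carried by the wind, W
betz = 16 / 27                     # maximum extractable fraction, ~0.593
p_max = betz * p_wind              # Betz limit on turbine output, W
```

No turbine can exceed p_max because fully stopping the air would also stop the flow through the rotor.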
Recent patents on the extraction of carotenoids.
Riggi, Ezio
2010-01-01
This article reviews the patents that have been presented during the last decade related to the extraction of carotenoids from various forms of organic matter (fruit, vegetables, animals), with an emphasis on the methods and mechanisms exploited by these technologies, and on technical solutions for the practical problems related to these technologies. I present and classify 29 methods related to the extraction processes (physical, mechanical, chemical, and enzymatic). The large number of processes for extraction by means of supercritical fluids and the growing number of large-scale industrial plants suggest a positive trend towards using this technique that is currently slowed by its cost. This trend should be reinforced by growing restrictions imposed on the use of most organic solvents for extraction of food products and by increasingly strict waste management regulations that are indirectly promoting the use of extraction processes that leave the residual (post-extraction) matrix substantially free from solvents and compounds that must subsequently be removed or treated. None of the reviewed approaches is the best answer for every extractable compound and source, so each should be considered as one of several alternatives, including the use of a combination of extraction approaches.
A fast learning method for large scale and multi-class samples of SVM
NASA Astrophysics Data System (ADS)
Fan, Yu; Guo, Huiming
2017-06-01
A fast learning method for multi-class SVM (Support Vector Machine) classification based on a binary tree is presented, addressing the low learning efficiency of SVMs when processing large-scale multi-class samples. A bottom-up method is adopted to build the binary-tree hierarchy, and according to the achieved hierarchy, a sub-classifier at each node learns from the corresponding samples. During learning, several class clusters are generated by a first clustering of the training samples. Central points are first extracted from those clusters that contain only one class of samples. For clusters that contain two classes, cluster numbers for the positive and negative samples are set according to their degree of mixture, a secondary clustering is undertaken, and central points are then extracted from the resulting sub-class clusters. Sub-classifiers are obtained by learning from the reduced sample set formed by the extracted central points. Simulation experiments show that this fast learning method, based on multi-level clustering, guarantees high classification accuracy while greatly reducing the number of samples and effectively improving learning efficiency.
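The clustering-based sample reduction at the heart of the method can be sketched with plain NumPy. This is a simplified stand-in, not the paper's algorithm: each class is replaced by a few k-means centres and classification is done by nearest centre, whereas the paper trains SVM sub-classifiers arranged in a binary tree over the reduced samples; the toy data and cluster counts are assumptions.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain Lloyd's algorithm returning k cluster centres."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), k, replace=False)].copy()
    for _ in range(iters):
        assign = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (assign == j).any():
                C[j] = X[assign == j].mean(0)
    return C

rng = np.random.default_rng(42)
means = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 6.0]])
X = np.vstack([m + rng.normal(0.0, 0.5, (200, 2)) for m in means])
y = np.repeat([0, 1, 2], 200)

# reduce each class of 200 samples to 3 representative central points
reduced_X, reduced_y = [], []
for cls in range(3):
    reduced_X.append(kmeans(X[y == cls], k=3, seed=cls))
    reduced_y += [cls] * 3
reduced_X, reduced_y = np.vstack(reduced_X), np.array(reduced_y)

# classify every original sample against the 9-point reduced set
pred = reduced_y[((X[:, None, :] - reduced_X[None, :, :]) ** 2).sum(-1).argmin(1)]
accuracy = float((pred == y).mean())
```

On well-separated classes the 600-sample problem collapses to 9 representatives with essentially no loss of accuracy, which is the efficiency argument the abstract makes.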
Ortega, Humberto E; Teixeira, Eliane de Morais; Rabello, Ana; Higginbotham, Sarah; Cubilla-Ríos, Luis
2014-01-01
Palmarumycin CP18, isolated from an extract of the fermentation broth and mycelium of the Panamanian endophytic fungus Edenia sp., was previously reported to have strong and specific activity against Leishmania donovani. Here we report that when the same strain was cultured on different solid media (Harrold Agar, Leonian Agar, Potato Dextrose Agar (PDA), Corn Meal Agar, Honey Peptone Agar, and V8 (eight-vegetable) Agar) in order to determine the optimal conditions for isolation of palmarumycin CP18, no signal for this compound was observed in any of the 1H NMR spectra of fractions obtained from these extracts. However, one extract, prepared from the fungal culture on PDA, contained significant amounts of CJ-12,372, a possible biosynthetic precursor of palmarumycin CP18. Edenia sp. was cultivated on a large scale on PDA, and CJ-12,372 was converted to palmarumycin CP18 by oxidation of its p-hydroquinone moiety with DDQ in dioxane. Palmarumycin CP18 showed anti-leishmanial activity against L. donovani in a macrophage/amastigote model, with an IC50 value of 23.5 microM.
Computational methods to extract meaning from text and advance theories of human cognition.
McNamara, Danielle S
2011-01-01
Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA. Copyright © 2010 Cognitive Science Society, Inc.
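The core of LSA described above is a truncated SVD of a term-document matrix: terms that co-occur in similar documents end up close in the latent space even if they never co-occur directly. A minimal sketch on a hand-made toy corpus (the matrix and vocabulary are invented for illustration):

```python
import numpy as np

# tiny term-document count matrix: rows are terms, columns are documents;
# d1, d2 are "medical" documents, d3, d4 are "music" documents
terms = ["doctor", "nurse", "hospital", "guitar", "piano", "music"]
docs = np.array([
    [2, 1, 0, 0],   # doctor
    [1, 2, 0, 0],   # nurse
    [1, 1, 0, 0],   # hospital
    [0, 0, 3, 1],   # guitar
    [0, 0, 1, 3],   # piano
    [0, 0, 1, 1],   # music
], dtype=float)

U, s, Vt = np.linalg.svd(docs, full_matrices=False)
k = 2                                 # latent dimensionality
term_vecs = U[:, :k] * s[:k]          # rank-k latent term vectors

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

sim_medical = cosine(term_vecs[0], term_vecs[1])   # doctor vs nurse
sim_cross = cosine(term_vecs[0], term_vecs[3])     # doctor vs guitar
```

Terms from the same topic block project onto the same latent direction (cosine near 1), while cross-topic pairs are nearly orthogonal; real LSA does the same on corpora of millions of documents.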
Large-Scale Pumping Test Recommendations for the 200-ZP-1 Operable Unit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spane, Frank A.
2010-09-08
CH2M Hill Plateau Remediation Company (CHPRC) is currently assessing aquifer characterization needs to optimize pump-and-treat remedial strategies (e.g., extraction well pumping rates, pumping schedule/design) in the 200-ZP-1 operable unit (OU), and in particular for the immediate area of the 241 TX-TY Tank Farm. Specifically, CHPRC is focusing on hydrologic characterization opportunities that may be available for newly constructed and planned ZP-1 extraction wells. These new extraction wells will be used to further refine the 3-dimensional subsurface contaminant distribution within this area and will be used in concert with other existing pump-and-treat wells to remediate the existing carbon tetrachloride contaminant plume. Currently, 14 extraction wells are actively used in the Interim Record of Decision ZP-1 pump-and-treat system for the purpose of remediating the existing carbon tetrachloride contamination in groundwater within this general area. As many as 20 new extraction wells and 17 injection wells may be installed to support final pump-and-treat operations within the OU area. It should be noted that although the report specifically refers to the 200-ZP-1 OU, the large-scale test recommendations are also applicable to the adjacent 200-UP-1 OU area. This is because of the similar hydrogeologic conditions exhibited within these two adjoining OU locations.
Use of tandem circulation wells to measure hydraulic conductivity without groundwater extraction
NASA Astrophysics Data System (ADS)
Goltz, Mark N.; Huang, Junqi; Close, Murray E.; Flintoft, Mark J.; Pang, Liping
2008-09-01
Conventional methods to measure the hydraulic conductivity of an aquifer on a relatively large scale (10-100 m) require extraction of significant quantities of groundwater. This can be expensive, and otherwise problematic, when investigating a contaminated aquifer. In this study, innovative approaches that make use of tandem circulation wells to measure hydraulic conductivity are proposed. These approaches measure conductivity on a relatively large scale, but do not require extraction of groundwater. Two basic approaches for using circulation wells to measure hydraulic conductivity are presented; one approach is based upon the dipole-flow test method, while the other approach relies on a tracer test to measure the flow of water between two recirculating wells. The approaches are tested in a relatively homogeneous and isotropic artificial aquifer, where the conductivities measured by both approaches are compared to each other and to the previously measured hydraulic conductivity of the aquifer. It was shown that both approaches have the potential to accurately measure horizontal and vertical hydraulic conductivity for a relatively large subsurface volume without the need to pump groundwater to the surface. Future work is recommended to evaluate the ability of these tandem circulation wells to accurately measure hydraulic conductivity when anisotropy and heterogeneity are greater than in the artificial aquifer used for these studies.
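The tracer-test approach above reduces to Darcy's law once a breakthrough time is observed: the seepage velocity between the wells is v = K·i/n, so a travel time t over distance L gives K = L·n/(t·i). A minimal worked example; every number below is an assumed illustration, not data from the study.

```python
# Darcy's law: average linear (seepage) velocity v = K * i / n, so a
# tracer travel time t over distance L gives K = L * n / (t * i).
L = 10.0     # m, spacing between the circulation wells (assumed)
n = 0.3      # effective porosity (assumed)
i = 0.01     # hydraulic gradient imposed between the wells (assumed)
t = 5.0e5    # s, observed tracer breakthrough time (~5.8 days; assumed)

v = L / t          # seepage velocity, m/s
K = v * n / i      # hydraulic conductivity, m/s
```

With these values K comes out to 6e-4 m/s, and the measurement integrates over the whole inter-well volume rather than the immediate vicinity of a single pumped well.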
2012-01-01
Computational approaches to generate hypotheses from biomedical literature have been studied intensively in recent years. Nevertheless, it still remains a challenge to automatically discover novel, cross-silo biomedical hypotheses from large-scale literature repositories. In order to address this challenge, we first model a biomedical literature repository as a comprehensive network of biomedical concepts and formulate hypothesis generation as a process of link discovery on the concept network. We extract the relevant information from the biomedical literature corpus and generate a concept network and concept-author map on a cluster using the Map-Reduce framework. We extract a set of heterogeneous features such as random-walk-based features, neighborhood features and common-author features. The potential number of links to consider for the possibility of link discovery is large in our concept network, and to address this scalability problem, the features from the concept network are extracted using a cluster with the Map-Reduce framework. We further model link discovery as a classification problem carried out on a training data set automatically extracted from two network snapshots taken in two consecutive time durations. A set of heterogeneous features, which cover both topological and semantic features derived from the concept network, have been studied with respect to their impact on the accuracy of the proposed supervised link discovery process. A case study of hypothesis generation based on the proposed method is presented in the paper. PMID:22759614
Xu, Rong; Li, Li; Wang, QuanQiu
2013-01-01
Motivation: Systems approaches to studying phenotypic relationships among diseases are emerging as an active area of research for both novel disease gene discovery and drug repurposing. Currently, systematic study of disease phenotypic relationships on a phenome-wide scale is limited because large-scale machine-understandable disease–phenotype relationship knowledge bases are often unavailable. Here, we present an automatic approach to extract disease–manifestation (D-M) pairs (one specific type of disease–phenotype relationship) from the wide body of published biomedical literature. Data and Methods: Our method leverages external knowledge and limits the amount of human effort required. For the text corpus, we used 119 085 682 MEDLINE sentences (21 354 075 citations). First, we used D-M pairs from existing biomedical ontologies as prior knowledge to automatically discover D-M–specific syntactic patterns. We then extracted additional pairs from MEDLINE using the learned patterns. Finally, we analysed correlations between disease manifestations and disease-associated genes and drugs to demonstrate the potential of this newly created knowledge base in disease gene discovery and drug repurposing. Results: In total, we extracted 121 359 unique D-M pairs with a high precision of 0.924. Among the extracted pairs, 120 419 (99.2%) have not been captured in existing structured knowledge sources. We have shown that disease manifestations correlate positively with both disease-associated genes and drug treatments. Conclusions: The main contribution of our study is the creation of a large-scale and accurate D-M phenotype relationship knowledge base. This unique knowledge base, when combined with existing phenotypic, genetic and proteomic datasets, can have profound implications in our deeper understanding of disease etiology and in rapid drug repurposing. Availability: http://nlp.case.edu/public/data/DMPatternUMLS/ Contact: rxx@case.edu PMID:23828786
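The pattern-based extraction described above can be illustrated with one hand-written pattern of the kind the system learns automatically from ontology-seeded examples. This is a toy sketch, not the authors' pipeline: the corpus, the single regular-expression pattern, and the sentence forms are all invented for illustration.

```python
import re

# toy "corpus"; the real system scans ~119 million MEDLINE sentences
sentences = [
    "Fever is a common manifestation of influenza.",
    "Tremor is a common manifestation of Parkinson disease.",
    "The weather was pleasant in Cleveland.",
]

# one hand-written syntactic pattern standing in for the learned ones
pattern = re.compile(
    r"^(?P<m>[A-Z][a-z]+) is a common manifestation of (?P<d>[A-Za-z ]+)\.$"
)

pairs = []  # extracted (disease, manifestation) pairs
for s in sentences:
    match = pattern.match(s)
    if match:
        pairs.append((match.group("d"), match.group("m")))
```

Scaling this idea means learning many such patterns automatically from known D-M pairs and applying them corpus-wide, which is what yields the 121 359 extracted pairs reported.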
NASA Astrophysics Data System (ADS)
Bai, Rui; Tiejian, Li; Huang, Yuefei; Jiaye, Li; Wang, Guangqian; Yin, Dongqin
2015-12-01
The increasing resolution of Digital Elevation Models (DEMs) and the development of drainage network extraction algorithms make it possible to develop high-resolution drainage networks for large river basins. These vector networks contain massive numbers of river reaches with associated geographical features, including topological connections and topographical parameters. These features create challenges for efficient map display and data management. Of particular interest are the requirements of data management for multi-scale hydrological simulations using multi-resolution river networks. In this paper, a hierarchical pyramid method is proposed, which generates coarsened vector drainage networks from the originals iteratively. The method is based on the Horton-Strahler's (H-S) order schema. At each coarsening step, the river reaches with the lowest H-S order are pruned, and their related sub-basins are merged. At the same time, the topological connections and topographical parameters of each coarsened drainage network are inherited from the former level using formulas that are presented in this study. The method was applied to the original drainage networks of a watershed in the Huangfuchuan River basin extracted from a 1-m-resolution airborne LiDAR DEM and applied to the full Yangtze River basin in China, which was extracted from a 30-m-resolution ASTER GDEM. In addition, a map-display and parameter-query web service was published for the Mississippi River basin, and its data were extracted from the 30-m-resolution ASTER GDEM. The results presented in this study indicate that the developed method can effectively manage and display massive amounts of drainage network data and can facilitate multi-scale hydrological simulations.
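The H-S-order-based pyramid step described above can be sketched on a toy tree. This is a simplified sketch, not the paper's implementation: the network is a child-list tree keyed by reach name, and one coarsening step drops the exterior (order-1) reaches; the inheritance of topographical parameters is omitted.

```python
def strahler(children, node):
    """Horton-Strahler order: a leaf reach is order 1; a junction where the
    maximal child order m occurs at least twice gets m + 1, otherwise m."""
    kids = children[node]
    if not kids:
        return 1
    orders = [strahler(children, k) for k in kids]
    m = max(orders)
    return m + 1 if orders.count(m) >= 2 else m

def prune_lowest(children):
    """One pyramid step: drop the exterior (order-1) reaches, keeping only
    reaches that had upstream tributaries."""
    return {n: [k for k in kids if children[k]]
            for n, kids in children.items() if kids}

# toy network: r is the outlet; c, d, e are order-1 headwater reaches
net = {"r": ["a", "b"], "a": ["c", "d"], "b": ["e"],
       "c": [], "d": [], "e": []}
coarse = prune_lowest(net)   # 6 reaches -> 3 reaches
```

Iterating prune_lowest yields the pyramid of progressively coarser vector networks that the paper uses for multi-scale display and simulation.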
NASA Technical Reports Server (NTRS)
Liu, J. T. C.
1986-01-01
Advances in the mechanics of boundary layer flow are reported. The physical problem of large-scale coherent structures in real, developing free turbulent shear flows is addressed from the nonlinear aspects of hydrodynamic stability. The problem lacks a small parameter, whether fine-grained turbulence is present or absent. It is presented on the basis of conservation principles, which form the dynamics of the problem, directed towards extracting the most physical information; however, it is emphasized that approximations must also be involved.
Large-area photogrammetry based testing of wind turbine blades
NASA Astrophysics Data System (ADS)
Poozesh, Peyman; Baqersad, Javad; Niezrecki, Christopher; Avitabile, Peter; Harvey, Eric; Yarala, Rahul
2017-03-01
An optically based sensing system that can measure the displacement and strain over essentially the entire area of a utility-scale blade leads to a measurement system that can significantly reduce the time and cost associated with traditional instrumentation. This paper evaluates the performance of conventional three dimensional digital image correlation (3D DIC) and three dimensional point tracking (3DPT) approaches over the surface of wind turbine blades and proposes a multi-camera measurement system using dynamic spatial data stitching. The potential advantages for the proposed approach include: (1) full-field measurement distributed over a very large area, (2) the elimination of time-consuming wiring and expensive sensors, and (3) the need for large-channel data acquisition systems. There are several challenges associated with extending the capability of a standard 3D DIC system to measure entire surface of utility scale blades to extract distributed strain, deflection, and modal parameters. This paper only tries to address some of the difficulties including: (1) assessing the accuracy of the 3D DIC system to measure full-field distributed strain and displacement over the large area, (2) understanding the geometrical constraints associated with a wind turbine testing facility (e.g. lighting, working distance, and speckle pattern size), (3) evaluating the performance of the dynamic stitching method to combine two different fields of view by extracting modal parameters from aligned point clouds, and (4) determining the feasibility of employing an output-only system identification to estimate modal parameters of a utility scale wind turbine blade from optically measured data. Within the current work, the results of an optical measurement (one stereo-vision system) performed on a large area over a 50-m utility-scale blade subjected to quasi-static and cyclic loading are presented. 
The blade certification and testing is typically performed using International Electro-Technical Commission standard (IEC 61400-23). For static tests, the blade is pulled in either flap-wise or edge-wise directions to measure deflection or distributed strain at a few limited locations of a large-sized blade. Additionally, the paper explores the error associated with using a multi-camera system (two stereo-vision systems) in measuring 3D displacement and extracting structural dynamic parameters on a mock set up emulating a utility-scale wind turbine blade. The results obtained in this paper reveal that the multi-camera measurement system has the potential to identify the dynamic characteristics of a very large structure.
Ranius, Thomas; Hämäläinen, Aino; Egnell, Gustaf; Olsson, Bengt; Eklöf, Karin; Stendahl, Johan; Rudolphi, Jörgen; Sténs, Anna; Felton, Adam
2018-03-01
We review the consequences for biodiversity and ecosystem services of the industrial-scale extraction of logging residues (tops, branches and stumps from harvested trees, and small-diameter trees from thinnings) in managed forests. Logging residue extraction can replace fossil fuels, and thus contribute to climate change mitigation. The additional biomass and nutrients removed, and the soils and other structures disturbed, have several potential environmental impacts. To evaluate potential impacts on ecosystem services and biodiversity we reviewed 279 scientific papers that compared logging residue extraction with non-extraction, the majority of which were conducted in Northern Europe and North America. The weight of available evidence indicates that logging residue extraction can have significant negative effects on biodiversity, especially for species naturally adapted to sun-exposed conditions and to the large amounts of dead wood created by large-scale forest disturbances. Slash extraction may also pose risks for future biomass production itself, due to the associated loss of nutrients. For water quality, reindeer herding, mammalian game species, berries, and natural heritage the results were complicated by primarily negative but some positive effects, while for recreation and pest control positive effects were more consistent. Further, there are initial negative effects on carbon storage, but these effects are transient and carbon stocks are mostly restored over decadal time perspectives. We summarize ways of decreasing some of the negative effects of logging residue extraction on specific ecosystem services, by changing the categories of residue extracted and the site or forest type targeted for extraction. However, we found that suggested pathways for minimizing adverse outcomes were often in conflict among the ecosystem services assessed. Compensatory measures for logging residue extraction may also be used (e.g. ash recycling, liming, fertilization), though these may also be associated with adverse environmental impacts. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Self-sustaining processes at all scales in wall-bounded turbulent shear flows
NASA Astrophysics Data System (ADS)
Cossu, Carlo; Hwang, Yongyun
2017-03-01
We collect and discuss the results of our recent studies which show evidence of the existence of a whole family of self-sustaining motions in wall-bounded turbulent shear flows with scales ranging from those of buffer-layer streaks to those of large-scale and very-large-scale motions in the outer layer. The statistical and dynamical features of this family of self-sustaining motions, which are associated with streaks and quasi-streamwise vortices, are consistent with those of Townsend's attached eddies. Motions at each relevant scale are able to sustain themselves in the absence of forcing from larger- or smaller-scale motions by extracting energy from the mean flow via a coherent lift-up effect. The coherent self-sustaining process is embedded in a set of invariant solutions of the filtered Navier-Stokes equations which take into full account the Reynolds stresses associated with the residual smaller-scale motions.
NASA Astrophysics Data System (ADS)
Neklyudov, A. A.; Savenkov, V. N.; Sergeyez, A. G.
1984-06-01
Memories are improved by increasing speed or the memory volume on a single chip. The most effective means for increasing speed in bipolar memories are current control circuits with the lowest extraction times for a specific power consumption (1/4 pJ/bit). The control current circuitry involves multistage current switches and circuits accelerating transient processes in storage elements and links. Circuit principles for the design of bipolar memories with maximum speed for an assigned minimum of circuit topology are analyzed. Two main classes of storage with current control are considered: the ECL type and super-integrated injection type storage, with data capacities of N = 1/4 and N = 4/16, respectively. The circuits reduce logic voltage differentials and the volumes of lexical and discharge buses and control circuit buses. The limiting speed is determined by the anti-interference requirements of the memory in storage and extraction modes.
NASA Astrophysics Data System (ADS)
Shi, Wenzhong; Deng, Susu; Xu, Wenbing
2018-02-01
For automatic landslide detection, landslide morphological features should be quantitatively expressed and extracted. High-resolution Digital Elevation Models (DEMs) derived from airborne Light Detection and Ranging (LiDAR) data allow fine-scale morphological features to be extracted, but noise in DEMs influences morphological feature extraction, and the multi-scale nature of landslide features should be considered. This paper proposes a method to extract landslide morphological features characterized by homogeneous spatial patterns. Both profile and tangential curvature are utilized to quantify land surface morphology, and a local Gi* statistic is calculated for each cell to identify significant patterns of clustering of similar morphometric values. The method was tested on both synthetic surfaces simulating natural terrain and airborne LiDAR data acquired over an area dominated by shallow debris slides and flows. The test results of the synthetic data indicate that the concave and convex morphologies of the simulated terrain features at different scales and distinctness could be recognized using the proposed method, even when random noise was added to the synthetic data. In the test area, cells with large local Gi* values were extracted at a specified significance level from the profile and the tangential curvature image generated from the LiDAR-derived 1-m DEM. The morphologies of landslide main scarps, source areas and trails were clearly indicated, and the morphological features were represented by clusters of extracted cells. A comparison with the morphological feature extraction method based on curvature thresholds proved the proposed method's robustness to DEM noise. When verified against a landslide inventory, the morphological features of almost all recent (< 5 years) landslides and approximately 35% of historical (> 10 years) landslides were extracted. 
This finding indicates that the proposed method can facilitate landslide detection, although the cell clusters extracted from curvature images should be filtered using a filtering strategy based on supplementary information provided by expert knowledge or other data sources.
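As an aside on the mechanics, the cell-level hot-spot test described above can be sketched as a local Getis-Ord Gi* computation over a curvature raster. The following is an illustrative re-implementation with a square binary-weight window; the window size, significance threshold, and synthetic data are assumptions, not the paper's parameters:

```python
import numpy as np

def window_sum(x, size):
    """Sum of values in a size-by-size window around each cell (edge-padded)."""
    pad = size // 2
    xp = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x, dtype=float)
    for di in range(size):
        for dj in range(size):
            out += xp[di:di + x.shape[0], dj:dj + x.shape[1]]
    return out

def local_gi_star(x, size=3):
    """Getis-Ord Gi* for each cell of a 2-D grid, binary weights w_ij = 1
    inside the moving window (so sum of weights = sum of squared weights)."""
    n = x.size
    xbar = x.mean()
    s = np.sqrt((x ** 2).mean() - xbar ** 2)
    w = float(size * size)
    num = window_sum(x, size) - xbar * w
    den = s * np.sqrt((n * w - w ** 2) / (n - 1))
    return num / den

rng = np.random.default_rng(0)
curv = rng.normal(0.0, 0.1, (50, 50))      # noisy curvature raster
curv[20:25, 20:25] += 1.0                  # a scarp-like cluster of high curvature
gi = local_gi_star(curv)
hotspots = gi > 2.58                       # roughly 99% one-sided significance
```

Cells exceeding the threshold form clusters of similar curvature values, which is the kind of homogeneous spatial pattern from which morphological features would be assembled.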
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masada, Youhei; Sano, Takayoshi, E-mail: ymasada@harbor.kobe-u.ac.jp, E-mail: sano@ile.osaka-u.ac.jp
2014-10-10
The mechanism of large-scale dynamos in rigidly rotating stratified convection is explored by direct numerical simulations (DNS) in Cartesian geometry. A mean-field dynamo model is also constructed using turbulent velocity profiles consistently extracted from the corresponding DNS results. By quantitative comparison between the DNS and our mean-field model, it is demonstrated that the oscillatory α² dynamo wave, excited and sustained in the convection zone, is responsible for large-scale magnetic activities such as cyclic polarity reversal and spatiotemporal migration. The results provide strong evidence that a nonuniformity of the α-effect, which is a natural outcome of rotating stratified convection, can be an important prerequisite for large-scale stellar dynamos, even without the Ω-effect.
The Use of Weighted Graphs for Large-Scale Genome Analysis
Zhou, Fang; Toivonen, Hannu; King, Ross D.
2014-01-01
There is an acute need for better tools to extract knowledge from the growing flood of sequence data. For example, thousands of complete genomes have been sequenced, and their metabolic networks inferred. Such data should enable a better understanding of evolution. However, most existing network analysis methods are based on pair-wise comparisons, and these do not scale to thousands of genomes. Here we propose the use of weighted graphs as a data structure to enable large-scale phylogenetic analysis of networks. We have developed three types of weighted graph for enzymes: taxonomic (these summarize phylogenetic importance), isoenzymatic (these summarize enzymatic variety/redundancy), and sequence-similarity (these summarize sequence conservation); and we applied these types of weighted graph to survey prokaryotic metabolism. To demonstrate the utility of this approach we have compared and contrasted the large-scale evolution of metabolism in Archaea and Eubacteria. Our results provide evidence for limits to the contingency of evolution. PMID:24619061
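The taxonomic weighted-graph idea can be illustrated with a toy example in which nodes are enzymes and an edge weight summarizes how often two enzymes co-occur across genomes, replacing pair-wise genome comparisons with a single summary structure. The EC numbers, genomes, and the exact weighting below are invented for illustration and simplify the paper's definitions:

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical presence/absence of enzymes (EC numbers) in three genomes.
genomes = {
    "G1": {"1.1.1.1", "2.7.1.1", "5.3.1.9"},
    "G2": {"1.1.1.1", "2.7.1.1"},
    "G3": {"2.7.1.1", "5.3.1.9"},
}

# Taxonomic-style weighted graph: the weight on edge (a, b) is the fraction
# of genomes whose metabolic network contains both enzymes.
weight = defaultdict(float)
enzymes = set().union(*genomes.values())
for a, b in combinations(sorted(enzymes), 2):
    shared = sum(1 for ecs in genomes.values() if a in ecs and b in ecs)
    weight[(a, b)] = shared / len(genomes)
```

Once built, such a graph scales with the number of enzymes rather than the number of genome pairs, which is what makes analyses over thousands of genomes tractable.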
NASA Astrophysics Data System (ADS)
Jayakumarai, G.; Gokulpriya, C.; Sudhapriya, R.; Sharmila, G.; Muthukumaran, C.
2015-12-01
A simple, effective and rapid approach for the green synthesis of copper oxide nanoparticles (CONPs) using Albizia lebbeck leaf extract was investigated in this study. Various instrumental techniques were adopted to characterize the synthesized CONPs, viz. UV-Vis spectroscopy, SEM, TEM, EDS and XRD. The synthesized CONPs were found to be spherical in shape, with sizes less than 100 nm. It can be concluded that A. lebbeck leaf extract can be used as a cheap and effective reducing agent for large-scale CONP production.
Sapphire Energy - Integrated Algal Biorefinery
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Rebecca L.; Tyler, Mike
2015-07-22
Sapphire Energy, Inc. (SEI) is a leader in large-scale photosynthetic algal biomass production, with a strongly cohesive research, development, and operations program. SEI takes a multidiscipline approach to integrate lab-based strain selection, cultivation and harvest at production scale, and extraction for the production of Green Crude oil, a drop-in replacement for traditional crude oil. SEI's technical accomplishments since 2007 have produced a multifunctional platform that can address needs for fuel, feed, and other higher-value products. Figure 1 outlines SEI's commercialization process, including Green Crude production and refinement into drop-in fuel replacements. The large-scale algal biomass production facility, the SEI Integrated Algal Biorefinery (IABR), was built in Luna County near Columbus, New Mexico (see fig 2). The extraction unit was located at the existing SEI facility in Las Cruces, New Mexico, approximately 95 miles from the IABR. The IABR facility was constructed on time and on budget, and the extraction unit expansion to accommodate the biomass output from the IABR was completed in October 2012. The IABR facility uses open-pond cultivation with a proprietary harvesting method to produce algal biomass; this biomass is then shipped to the extraction facility for conversion to Green Crude. The operation of the IABR and the extraction facilities has demonstrated the critical integration of traditional agricultural techniques with algae cultivation knowledge for algal biomass production, and the successful conversion of the biomass to Green Crude. All primary unit operations are de-risked, and at a scale suitable for process demonstration. The results are stable, reliable, long-term cultivation of strains for year-round algal biomass production. From June 2012 to November 2014, the IABR and extraction facilities produced 524 metric tons (MT) of biomass (on a dry weight basis) and 2,587 gallons of Green Crude.
Additionally, the IABR demonstrated significant year-over-year yield improvements (2013 to 2014) and reductions in the cost of biomass production. The IABR therefore fulfills a number of critical functions in SEI's integrated development pipeline, functions that are critical in general for the commercialization of algal biomass production and the production of biofuels from algal biomass.
SLIDE - a web-based tool for interactive visualization of large-scale -omics data.
Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon
2018-06-28
Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow real-time customization of graphics based on user interaction. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data at multiple resolutions on a single screen. SLIDE is publicly available under the BSD license, both as an online version and as a stand-alone version, at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.
Random access in large-scale DNA data storage.
Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin
2018-03-01
Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data on a large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data) in more than 13 million DNA oligonucleotides, and show that we can recover each file individually, with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.
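The random access scheme can be caricatured in a few lines: each file's oligonucleotides carry a file-specific primer tag, so one file can be pulled from the mixed pool by selecting on that tag, mimicking PCR amplification. The sequences, tags, and chunking below are invented for illustration and ignore error correction and oligo ordering entirely:

```python
# Toy model: files are split into fixed-size payload chunks, each prefixed
# with a file-specific primer tag before entering the shared pool.
files = {"photo": "ACGTACGT", "text": "TTGGCCAA"}   # invented payloads
primers = {"photo": "AAAC", "text": "GGGT"}          # invented tags

def synthesize_pool(files, primers, chunk=4):
    """Build the mixed oligo pool: primer tag + payload chunk."""
    pool = []
    for name, data in files.items():
        for i in range(0, len(data), chunk):
            pool.append(primers[name] + data[i:i + chunk])
    return pool

def random_access(pool, primer):
    """Recover one file by selecting only oligos carrying its primer tag."""
    payloads = [o[len(primer):] for o in pool if o.startswith(primer)]
    return "".join(payloads)

pool = synthesize_pool(files, primers)
recovered = random_access(pool, primers["photo"])
```

The point of the sketch is that selection happens on the pool, not after full sequencing: only the tagged subset is ever read back.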
Self-sustaining processes at all scales in wall-bounded turbulent shear flows
Hwang, Yongyun
2017-01-01
We collect and discuss the results of our recent studies which show evidence of the existence of a whole family of self-sustaining motions in wall-bounded turbulent shear flows with scales ranging from those of buffer-layer streaks to those of large-scale and very-large-scale motions in the outer layer. The statistical and dynamical features of this family of self-sustaining motions, which are associated with streaks and quasi-streamwise vortices, are consistent with those of Townsend’s attached eddies. Motions at each relevant scale are able to sustain themselves in the absence of forcing from larger- or smaller-scale motions by extracting energy from the mean flow via a coherent lift-up effect. The coherent self-sustaining process is embedded in a set of invariant solutions of the filtered Navier–Stokes equations which take into full account the Reynolds stresses associated with the residual smaller-scale motions. This article is part of the themed issue ‘Toward the development of high-fidelity models of wall turbulence at large Reynolds number’. PMID:28167581
Self-sustaining processes at all scales in wall-bounded turbulent shear flows.
Cossu, Carlo; Hwang, Yongyun
2017-03-13
We collect and discuss the results of our recent studies which show evidence of the existence of a whole family of self-sustaining motions in wall-bounded turbulent shear flows with scales ranging from those of buffer-layer streaks to those of large-scale and very-large-scale motions in the outer layer. The statistical and dynamical features of this family of self-sustaining motions, which are associated with streaks and quasi-streamwise vortices, are consistent with those of Townsend's attached eddies. Motions at each relevant scale are able to sustain themselves in the absence of forcing from larger- or smaller-scale motions by extracting energy from the mean flow via a coherent lift-up effect. The coherent self-sustaining process is embedded in a set of invariant solutions of the filtered Navier-Stokes equations which take into full account the Reynolds stresses associated with the residual smaller-scale motions. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).
Large-scale machine learning and evaluation platform for real-time traffic surveillance
NASA Astrophysics Data System (ADS)
Eichel, Justin A.; Mishra, Akshaya; Miller, Nicholas; Jankovic, Nicholas; Thomas, Mohan A.; Abbott, Tyler; Swanson, Douglas; Keller, Joel
2016-09-01
In traffic engineering, vehicle detectors are trained on limited datasets, resulting in poor accuracy when deployed in real-world surveillance applications. Annotating large-scale, high-quality datasets is challenging; typically, these datasets have limited diversity and do not reflect the real-world operating environment. There is a need for a large-scale, cloud-based positive and negative mining process and a large-scale learning and evaluation system for automatic traffic measurement and classification. The proposed positive and negative mining process addresses the quality of crowd-sourced ground-truth data through machine-learning review and human feedback mechanisms. The proposed learning and evaluation system uses a distributed cloud-computing framework to handle the data-scaling issues associated with large numbers of samples and a high-dimensional feature space. The system is trained using AdaBoost on 1,000,000 Haar-like features extracted from 70,000 annotated video frames. The trained real-time vehicle detector achieves an accuracy of at least 95% for 1/2 and about 78% for 19/20 of the time when tested on ~7,500,000 video frames. At the end of 2016, the dataset is expected to have over 1 billion annotated video frames.
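Haar-like features of the kind used by the detector are cheap to evaluate because any rectangle sum costs four lookups in an integral image. A minimal sketch follows; the feature geometry and the test image are illustrative, not the system's actual feature set:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with an extra zero row/column for easy window sums."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    """Sum of img[r:r+h, c:c+w] in four lookups."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def haar_two_rect_vertical(ii, r, c, h, w):
    """Two-rectangle Haar-like feature: top half minus bottom half (h even)."""
    top = rect_sum(ii, r, c, h // 2, w)
    bottom = rect_sum(ii, r + h // 2, c, h // 2, w)
    return top - bottom

img = np.zeros((8, 8), dtype=np.int64)
img[:4, :] = 10                        # bright upper half, dark lower half
ii = integral_image(img)
feature = haar_two_rect_vertical(ii, 0, 0, 8, 8)
```

AdaBoost then selects and weights thousands of such features; the constant per-feature cost is what makes evaluating a million candidates over 70,000 frames feasible.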
Large-scale deep learning for robotically gathered imagery for science
NASA Astrophysics Data System (ADS)
Skinner, K.; Johnson-Roberson, M.; Li, J.; Iscar, E.
2016-12-01
With the explosion of computing power, the intelligence and capability of mobile robotics has dramatically increased over the last two decades. Today, we can deploy autonomous robots to achieve observations in a variety of environments ripe for scientific exploration. These platforms are capable of gathering a volume of data previously unimaginable. Additionally, optical cameras, driven by mobile phones and consumer photography, have rapidly improved in size, power consumption, and quality, making their deployment cheaper and easier. Finally, in parallel we have seen the rise of large-scale machine learning approaches, particularly deep neural networks (DNNs), increasing the quality of the semantic understanding that can be automatically extracted from optical imagery. In concert, these advances enable new science using a combination of machine learning and robotics. This work will discuss the application of new low-cost, high-performance computing approaches and the associated software frameworks to enable scientists to rapidly extract useful science data from millions of robotically gathered images. The automated analysis of imagery on this scale opens up new avenues of inquiry unavailable using more traditional manual or semi-automated approaches. We will use a large archive of millions of benthic images gathered with an autonomous underwater vehicle to demonstrate how these tools enable new scientific questions to be posed.
NASA Astrophysics Data System (ADS)
Guo, Jie; Zhu, Chang'an
2016-01-01
The development of optics and computer technologies enables the application of vision-based techniques that use digital cameras to the displacement measurement of large-scale structures. Compared with traditional contact measurements, the vision-based technique allows remote measurement, is non-intrusive, and does not introduce additional mass onto the structure. In this study, a high-speed camera system is developed to complete displacement measurement in real time. The system consists of a high-speed camera and a notebook computer. The high-speed camera can capture images at a speed of hundreds of frames per second. To process the captured images on the computer, the Lucas-Kanade template tracking algorithm from the field of computer vision is introduced. Additionally, a modified inverse compositional algorithm is proposed to reduce the computing time of the original algorithm and further improve its efficiency. The modified algorithm can accomplish one displacement extraction within 1 ms without having to install any pre-designed target panel onto the structure in advance. The accuracy and efficiency of the system in the remote measurement of dynamic displacement are demonstrated in experiments on a motion platform and on a sound barrier on a suspension viaduct. Experimental results show that the proposed algorithm can extract accurate displacement signals and accomplish vibration measurement of large-scale structures.
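The inverse compositional idea can be sketched for the simplest case, a translation-only warp: because the Jacobian and Hessian are built once from the template, each iteration costs a single warp plus small matrix products, which is what makes millisecond-scale extraction plausible. This is an illustrative sketch under those assumptions, not the authors' modified algorithm:

```python
import numpy as np

def shift_bilinear(image, p, shape):
    """Sample `image` at (x + p[0], y + p[1]) over a `shape` window, bilinearly."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    xs, ys = xs + p[0], ys + p[1]
    x0 = np.clip(np.floor(xs).astype(int), 0, image.shape[1] - 2)
    y0 = np.clip(np.floor(ys).astype(int), 0, image.shape[0] - 2)
    fx, fy = xs - x0, ys - y0
    return ((1 - fx) * (1 - fy) * image[y0, x0]
            + fx * (1 - fy) * image[y0, x0 + 1]
            + (1 - fx) * fy * image[y0 + 1, x0]
            + fx * fy * image[y0 + 1, x0 + 1])

def track_translation(template, image, max_iter=100):
    """Inverse compositional Gauss-Newton, translation-only warp: the
    Jacobian and Hessian come from the template and are computed once."""
    gy, gx = np.gradient(template.astype(float))
    J = np.stack([gx.ravel(), gy.ravel()], axis=1)   # constant Jacobian
    H_inv = np.linalg.inv(J.T @ J)                   # precomputed Hessian
    p = np.zeros(2)                                  # (dx, dy)
    for _ in range(max_iter):
        err = (shift_bilinear(image, p, template.shape) - template).ravel()
        dp = H_inv @ (J.T @ err)
        p -= dp                                      # inverse composition
        if np.abs(dp).max() < 1e-6:
            break
    return p

# synthetic smooth scene; cut the template out at a known subpixel shift
yy, xx = np.mgrid[0:40, 0:40].astype(float)
scene = np.exp(-((xx - 20) ** 2 + (yy - 20) ** 2) / (2 * 6.0 ** 2))
template = shift_bilinear(scene, np.array([1.3, 0.6]), (24, 24))
p_est = track_translation(template, scene)
```

The estimated offset converges to the true subpixel shift; in the paper's setting the same loop would run per frame on the natural texture of the structure, with no target panel required.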
NASA Astrophysics Data System (ADS)
Sun, P.; Jokipii, J. R.; Giacalone, J.
2016-12-01
Anisotropy in astrophysical turbulence has long been proposed and observed. Recent observations adopting multi-scale analysis techniques have provided a detailed description of the scale-dependent power spectrum of the magnetic field parallel and perpendicular to the scale-dependent magnetic field line at different scales in the solar wind. In previous work, we proposed a multi-scale method to synthesize non-isotropic turbulent magnetic fields with pre-determined power spectra of the fluctuating magnetic field as a function of scale. We present the effect of the resulting field on test-particle transport using a two-scale algorithm. We find that scale-dependent turbulence anisotropy affects charged-particle transport significantly differently than isotropy or global anisotropy does. It is important to apply this field-synthesis method to the solar wind magnetic field based on spacecraft data; however, this relies on how we extract the power spectra of the turbulent magnetic field across different scales. In this study, we propose a power-spectrum synthesis method based on Fourier analysis to extract the large- and small-scale power spectra from a single spacecraft observation with a sufficiently long period and a high sampling frequency. We apply the method to solar wind measurements by the magnetometer onboard the ACE spacecraft and regenerate the large-scale isotropic 2D spectrum and the small-scale anisotropic 2D spectrum. We run test-particle simulations in the magnetic field generated in this way to estimate the transport coefficients and to compare with the isotropic turbulence model.
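The basic spectral-extraction step, estimating a power spectrum from a single-spacecraft magnetometer time series by Fourier analysis, can be sketched as follows; the normalization convention and the synthetic signal are illustrative assumptions, not the study's pipeline:

```python
import numpy as np

def power_spectrum(b, dt):
    """One-sided power spectral density of a real time series via FFT."""
    n = len(b)
    bf = np.fft.rfft(b - b.mean())
    freq = np.fft.rfftfreq(n, d=dt)
    psd = 2.0 * dt / n * np.abs(bf) ** 2
    return freq[1:], psd[1:]           # drop the zero-frequency bin

# Synthetic "magnetometer" signal: one spectral line plus weak noise.
dt, n = 1.0, 4096
t = np.arange(n) * dt
rng = np.random.default_rng(1)
b = np.sin(2 * np.pi * 0.05 * t) + 0.01 * rng.standard_normal(n)
freq, psd = power_spectrum(b, dt)
peak_freq = freq[int(np.argmax(psd))]
```

In practice one would average over windowed segments to reduce variance, and a long record with high sampling frequency is what lets both the large-scale and small-scale parts of the spectrum be resolved from one spacecraft.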
ERIC Educational Resources Information Center
John, Lindsay Herbert
2004-01-01
The validity of a scale from the Ontario Health Survey measuring the subjective sense of well-being, for a large multicultural population in Metropolitan Toronto, is examined through principal components analysis with oblique rotation. Four factors are extracted. Factor 1 is a stress and strain factor and consists of health worries, feeling…
Investigation of relationships between parameters of solar nano-flares and solar activity
NASA Astrophysics Data System (ADS)
Safari, Hossein; Javaherian, Mohsen; Kaki, Bardia
2016-07-01
Solar flares are important coronal events that originate in solar magnetic activity. They release large amounts of energy into the surrounding medium right after the trigger. Flare prediction can play a main role in avoiding eventual damage on the Earth. Here, to interpret solar large-scale events (e.g., flares), we investigate relationships between small-scale events (nano-flares) and large-scale events. In our method, the intensity time series of nano-flares are simulated using a Monte Carlo method. Then, solar full-disk images taken at 171 angstroms recorded by SDO/AIA are employed. Parts of the solar disk (quiet Sun (QS), coronal holes (CHs), and active regions (ARs)) are cropped and the time series of these regions are extracted. To compare the simulated intensity time series of nano-flares with the intensity time series of real data extracted from different parts of the Sun, artificial neural networks are employed. We are therefore able to extract physical parameters of nano-flares, such as the kick and decay-rate lifetimes and the power of their power-law distributions. The variation of the power value of the power-law distributions within the QS and CHs is similar to that within ARs. Thus, by observing a small part of the Sun, we can follow the course of solar activity.
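A minimal Monte Carlo model of a nano-flare intensity time series, in the spirit described above, lets flares arrive as a Poisson process with power-law distributed energies and decay exponentially. The rates, energy bounds, and power-law index below are invented parameters, not those fitted in the study:

```python
import numpy as np

def simulate_nanoflares(n_steps, kick_rate, decay_rate, alpha, seed=0):
    """Monte Carlo intensity time series: Poisson flare arrivals, bounded
    power-law energies P(E) ~ E**-alpha on [1, 100], exponential decay."""
    rng = np.random.default_rng(seed)
    intensity = np.zeros(n_steps)
    level = 0.0
    for i in range(n_steps):
        level *= np.exp(-decay_rate)          # exponential decay per step
        n_kicks = rng.poisson(kick_rate)
        if n_kicks:
            # inverse-transform sampling of the bounded power law
            u = rng.random(n_kicks)
            e = (1 + u * (100 ** (1 - alpha) - 1)) ** (1 / (1 - alpha))
            level += e.sum()
        intensity[i] = level
    return intensity

ts = simulate_nanoflares(10_000, kick_rate=0.1, decay_rate=0.05, alpha=2.0)
```

Series generated over a grid of kick rates, decay rates, and indices would then serve as training inputs for the neural network that matches observed regional light curves to nano-flare parameters.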
Shirshekanb, Mahsa; Rezadoost, Hassan; Javanbakht, Mehran; Ghassempour, Ali Reza
2017-01-01
There is no naturally occurring defense agent against cancer with a stronger effect than paclitaxel, commonly known under the brand name Taxol®. The major drawback to more widespread use of paclitaxel and its precious precursor, 10-deacetylbaccatin III (10-DAB III), is that they require large-scale extraction from different parts of yew trees (Taxus species), cell cultures, taxane-producing endophytic fungi, and Corylus species. In our previous work, a novel online two-dimensional heart-cut liquid chromatography process using hydrophilic interaction/reversed-phase chromatography was used to introduce a semi-preparative treatment for the separation of polar (10-deacetylbaccatin III) and non-polar (paclitaxel) taxanes from Taxus baccata L. In this work, a combination of an absorbent (Diaion® HP-20) and silica-based solid-phase extraction is utilized as a new, efficient, and cost-effective method for large-scale production of taxanes. This process avoids the technical problems of two-dimensional preparative liquid chromatography. The first stage of the process involves discarding co-extractive polar compounds, including chlorophylls and pigments, using a non-polar synthetic hydrophobic absorbent, Diaion® HP-20. The extract was then loaded onto a silica-based hydrophilic interaction solid-phase extraction column (silica, 40-60 micron). Taxanes were eluted using a mixture of water and methanol at the optimized ratio of 70:30. Finally, the fraction containing taxanes was applied to semi-preparative reversed-phase HPLC. The results revealed that using this procedure, paclitaxel and 10-DAB III could be obtained in yields 8 and 3 times higher, respectively, than by the traditional method of extraction.
1987-11-01
Design guidelines have been developed that can be used by circuit engineers to extract the maximum performance from the devices on various board technologies, including multilayer ceramic. Topics covered include attenuation and dispersion effects and the skin effect.
Dahling, Daniel R
2002-01-01
Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.
[Lithology feature extraction of CASI hyperspectral data based on fractal signal algorithm].
Tang, Chao; Chen, Jian-Ping; Cui, Jing; Wen, Bo-Tao
2014-05-01
Hyperspectral data are characterized by the combination of image and spectrum, and with large data volumes, dimension reduction is the main research direction. Band selection and feature extraction are the primary methods used for this objective. In the present article, the authors tested methods for lithology feature extraction from hyperspectral data. Based on the self-similarity of hyperspectral data, the authors explored the application of a fractal algorithm to lithology feature extraction from CASI hyperspectral data. The "carpet method" was corrected and then applied to calculate the fractal value of every pixel in the hyperspectral data. The results show that fractal information highlights the exposed bedrock lithology better than the original hyperspectral data. The fractal signal and characteristic scale are influenced by the spectral curve shape, the initial scale selection, and the iteration step. At present, research on the fractal signal of spectral curves is rare, implying the necessity of further quantitative analysis and investigation of its physical implications.
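The "carpet" (blanket) method for the fractal signal of a curve can be sketched in one dimension: grow an upper and a lower blanket around the spectral curve, measure length versus scale, and read the fractal dimension off the log-log slope. This is an illustrative re-implementation of the classic method, not the authors' corrected variant:

```python
import numpy as np

def blanket_dimension(signal, max_eps=8):
    """Fractal dimension of a 1-D curve by the blanket ('carpet') method:
    L(eps) ~ eps**(1 - D), so D = 1 - slope of log L vs. log eps."""
    u = signal.astype(float).copy()    # upper blanket
    b = signal.astype(float).copy()    # lower blanket
    lengths = []
    for eps in range(1, max_eps + 1):
        # dilate the upper blanket and erode the lower one by one unit
        u = np.maximum(u + 1, np.maximum(np.roll(u, 1), np.roll(u, -1)))
        b = np.minimum(b - 1, np.minimum(np.roll(b, 1), np.roll(b, -1)))
        lengths.append((u - b).sum() / (2 * eps))
    scales = np.arange(1, max_eps + 1)
    slope = np.polyfit(np.log(scales), np.log(lengths), 1)[0]
    return 1.0 - slope

rng = np.random.default_rng(2)
smooth = 10 * np.sin(np.linspace(0, 4 * np.pi, 512))   # smooth spectral curve
rough = rng.standard_normal(512).cumsum()              # rough, walk-like curve
d_smooth = blanket_dimension(smooth)
d_rough = blanket_dimension(rough)
```

A smooth curve yields a dimension near 1 while a rough one yields a larger value; it is this contrast, computed per pixel from the spectral curve, that highlights bedrock lithology.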
MODFLOW-LGR: Practical application to a large regional dataset
NASA Astrophysics Data System (ADS)
Barnes, D.; Coulibaly, K. M.
2011-12-01
In many areas of the US, including southwest Florida, large regional-scale groundwater models have been developed to aid in decision making and water resources management. These models are subsequently used as a basis for site-specific investigations. Because the large scale of these regional models is not appropriate for local application, refinement is necessary to analyze the local effects of pumping wells and groundwater-related projects at specific sites. The most commonly used approach to date is Telescopic Mesh Refinement, or TMR. It allows the extraction of a subset of the large regional model with boundary conditions derived from the regional model results. The extracted model is then updated and refined for local use using a variable-sized grid focused on the area of interest. MODFLOW-LGR, local grid refinement, is an alternative approach which allows model discretization at a finer resolution in areas of interest and provides coupling between the larger "parent" model and the locally refined "child" model. In the present work, these two approaches are tested on a mining impact assessment case in southwest Florida using a large regional dataset (the Lower West Coast Surficial Aquifer System Model). Various metrics for performance are considered, including computation time, water balance (as compared to the variable-sized grid), calibration, implementation effort, and application advantages and limitations. The results indicate that MODFLOW-LGR is a useful tool for improving the local resolution of regional-scale models. While performance metrics such as computation time are case-dependent (model size, refinement level, stresses involved), implementation effort, particularly when regional models of suitable scale are available, can be minimized. The creation of multiple child models within a larger-scale parent model makes it possible to reuse the same calibrated regional dataset with minimal modification.
In cases similar to the Lower West Coast model, where a model is larger than optimal for direct application as a parent grid, a combination of TMR and LGR approaches should be used to develop a suitable parent grid.
DOT National Transportation Integrated Search
2013-04-01
There are three tasks for this research: 1. Methodology to extract road usage patterns from phone data: We combined the most complete record of daily mobility, based on large-scale mobile phone data, with detailed Geographic Information System (...
Wang, Yaping; Nie, Jingxin; Yap, Pew-Thian; Li, Gang; Shi, Feng; Geng, Xiujuan; Guo, Lei; Shen, Dinggang
2014-01-01
Accurate and robust brain extraction is a critical step in most neuroimaging analysis pipelines. In particular, for large-scale multi-site neuroimaging studies involving a significant number of subjects with diverse age and diagnostic groups, accurate and robust extraction of the brain automatically and consistently is highly desirable. In this paper, we introduce population-specific probability maps to guide the brain extraction of diverse subject groups, including both healthy and diseased adult human populations, both developing and aging human populations, as well as non-human primates. Specifically, the proposed method combines an atlas-based approach, for coarse skull-stripping, with a deformable-surface-based approach that is guided by local intensity information and population-specific prior information learned from a set of real brain images for more localized refinement. Comprehensive quantitative evaluations were performed on the diverse large-scale populations of the ADNI dataset with over 800 subjects (55∼90 years of age, multi-site, various diagnosis groups), the OASIS dataset with over 400 subjects (18∼96 years of age, wide age range, various diagnosis groups), and the NIH pediatrics dataset with 150 subjects (5∼18 years of age, multi-site, wide age range as a complementary age group to the adult datasets). The results demonstrate that our method consistently yields the best overall results across almost the entire human life span, with only a single set of parameters. To demonstrate its capability to work on non-human primates, the proposed method is further evaluated using a rhesus macaque dataset with 20 subjects. Quantitative comparisons with popular state-of-the-art methods, including BET, Two-pass BET, BET-B, BSE, HWA, ROBEX and AFNI, demonstrate that the proposed method performs favorably, with superior performance on all testing datasets, indicating its robustness and effectiveness. PMID:24489639
NASA Astrophysics Data System (ADS)
Guevara Hidalgo, Esteban; Nemoto, Takahiro; Lecomte, Vivien
Rare trajectories of stochastic systems are important to understand because of their potential impact. However, their properties are by definition difficult to sample directly. Population dynamics provides a numerical tool allowing their study by simulating a large number of copies of the system, which are subjected to a selection rule that favors the rare trajectories of interest. However, such algorithms are plagued by finite-simulation-time and finite-population-size effects that can render their use delicate. Using the continuous-time cloning algorithm, we analyze the finite-time and finite-size scalings of estimators of the large deviation functions associated with the distribution of the rare trajectories. We use these scalings to propose a numerical approach that allows extraction of the infinite-time and infinite-size limits of these estimators.
Ultrafast carrier dynamics in the large-magnetoresistance material WTe2
Dai, Y. M.; Bowlan, J.; Li, H.; ...
2015-10-07
In this study, ultrafast optical pump-probe spectroscopy is used to track carrier dynamics in the large-magnetoresistance material WTe2. Our experiments reveal a fast relaxation process occurring on a subpicosecond time scale that is caused by electron-phonon thermalization, allowing us to extract the electron-phonon coupling constant. An additional slower relaxation process, occurring on a time scale of ~5-15 ps, is attributed to phonon-assisted electron-hole recombination. As the temperature decreases from 300 K, the time scale governing this process increases due to the reduction of the phonon population. However, below ~50 K, an unusual decrease of the recombination time sets in, most likely due to a change in the electronic structure that has been linked to the large magnetoresistance observed in this material.
Clipping the cosmos: the bias and bispectrum of large scale structure.
Simpson, Fergus; James, J Berian; Heavens, Alan F; Heymans, Catherine
2011-12-30
A large fraction of the information collected by cosmological surveys is simply discarded to avoid length scales which are difficult to model theoretically. We introduce a new technique which enables the extraction of useful information from the bispectrum of galaxies well beyond the conventional limits of perturbation theory. Our results strongly suggest that this method increases the range of scales where the relation between the bispectrum and power spectrum in tree-level perturbation theory may be applied, from k_max ~ 0.1 to ~0.7 h Mpc⁻¹. This leads to correspondingly large improvements in the determination of galaxy bias. Since the clipped matter power spectrum closely follows the linear power spectrum, there is the potential to use this technique to probe the growth rate of linear perturbations and confront theories of modified gravity with observation.
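The clipping operation itself is simple: cap the overdensity field at a threshold before measuring the spectrum, which removes the power carried by rare high-density peaks. The following is a 1-D illustrative analogue; the field, threshold, and normalization are assumptions, not the paper's 3-D pipeline:

```python
import numpy as np

def spectrum(field, box=1.0):
    """1-D power spectrum of an overdensity field in a periodic box."""
    n = len(field)
    f = field - field.mean()
    dk = np.fft.rfft(f)
    k = 2 * np.pi * np.fft.rfftfreq(n, d=box / n)
    return k[1:], (box / n ** 2) * np.abs(dk[1:]) ** 2

rng = np.random.default_rng(3)
# a skewed, lognormal-like field standing in for the nonlinear density
delta = rng.lognormal(0.0, 1.0, 1024) - np.exp(0.5)
k, pk_raw = spectrum(delta)
k, pk_clip = spectrum(np.minimum(delta, 1.0))   # clip the high peaks
```

Because the rare peaks carry a disproportionate share of the variance, the clipped spectrum has less total power and, in the cosmological setting, tracks the linear prediction to smaller scales.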
Shao, Qingsong; Huang, Yuqiu; Zhou, Aicun; Guo, Haipeng; Zhang, Ailian; Wang, Yong
2014-05-01
Crocus sativus has long been used as a traditional Chinese medicine. The volatile compounds of C. sativus appear biologically active and may act as antioxidants as well as anticonvulsants, antidepressants and antitumour agents. In order to obtain the highest possible yield of essential oils from C. sativus, response surface methodology was employed to optimise the conditions of supercritical fluid carbon dioxide extraction of the volatile compounds from C. sativus. Four factors were investigated: temperature, pressure, extraction time and carbon dioxide flow rate. Furthermore, the chemical compositions of the volatile compounds extracted by supercritical fluid extraction were compared with those obtained by hydro-distillation and Soxhlet extraction. The optimum extraction conditions were found to be: temperature 44.9°C, pressure 34.9 MPa, extraction time 150.2 min and CO₂ flow rate 10.1 L h⁻¹. Under these conditions, the mean extraction yield was 10.94 g kg⁻¹. The volatile compounds extracted by supercritical fluid extraction and Soxhlet extraction contained a large amount of unsaturated fatty acids. Response surface methodology was successfully applied to optimise supercritical fluid CO₂ extraction of the volatile compounds from C. sativus. The study showed that pressure and CO₂ flow rate had a significant effect on the volatile compound yield produced by supercritical fluid extraction. This study is beneficial for further research operating on a large scale. © 2013 Society of Chemical Industry.
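The response-surface step amounts to fitting a full quadratic model to the measured yields and solving for the stationary point of the fitted surface. The design points and yield values below are invented for illustration with two coded factors; only the procedure mirrors the method:

```python
import numpy as np

# Coded levels of two factors (say pressure and CO2 flow rate) in a
# central-composite-style design, with invented yield responses.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1],
              [0, 0], [0, 0], [-1, 0], [1, 0], [0, -1], [0, 1]])
y = np.array([7.1, 8.0, 8.4, 9.6, 10.9, 10.8, 9.0, 10.2, 9.3, 10.1])

x1, x2 = X[:, 0], X[:, 1]
# full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# stationary point of the fitted surface: solve grad f = 0
grad_lin = beta[1:3]
H = np.array([[2 * beta[4], beta[3]],
              [beta[3], 2 * beta[5]]])
x_star = np.linalg.solve(H, -grad_lin)
x_vec = np.array([1.0, x_star[0], x_star[1], x_star[0] * x_star[1],
                  x_star[0] ** 2, x_star[1] ** 2])
y_star = float(x_vec @ beta)       # predicted yield at the optimum
```

With negative quadratic coefficients the Hessian is negative definite, so the stationary point is a maximum; decoding `x_star` back to physical units would give the optimal temperature, pressure, time, and flow-rate settings.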
Large-Scale Demonstration of Liquid Hydrogen Storage with Zero Boiloff for In-Space Applications
NASA Technical Reports Server (NTRS)
Hastings, L. J.; Bryant, C. B.; Flachbart, R. H.; Holt, K. A.; Johnson, E.; Hedayat, A.; Hipp, B.; Plachta, D. W.
2010-01-01
Cryocooler and passive insulation technology advances have substantially improved prospects for zero-boiloff cryogenic storage. Therefore, a cooperative effort by NASA's Ames Research Center, Glenn Research Center, and Marshall Space Flight Center (MSFC) was implemented to develop zero-boiloff concepts for in-space cryogenic storage. Described herein is one program element: a large-scale, zero-boiloff demonstration using the MSFC multipurpose hydrogen test bed (MHTB). A commercial cryocooler was interfaced with an existing MHTB spray bar mixer and insulation system in a manner that enabled a balance between incoming and extracted thermal energy.
Remote sensing of the biological dynamics of large-scale salt evaporation ponds
NASA Technical Reports Server (NTRS)
Richardson, Laurie L.; Bachoon, Dave; Ingram-Willey, Vebbra; Chow, Colin C.; Weinstock, Kenneth
1992-01-01
Optical properties of salt evaporation ponds associated with Exportadora de Sal, a salt production company in Baja California Sur, Mexico, were analyzed using a combination of spectroradiometer and extracted pigment data, and Landsat-5 Thematic Mapper imagery. The optical characteristics of each pond are determined by the biota, which consists of dense populations of algae and photosynthetic bacteria containing a wide variety of photosynthetic and photoprotective pigments. Analysis has shown that spectral and image data can differentiate between taxonomic groups of the microbiota, detect changes in population distributions, and reveal large-scale seasonal dynamics.
Isosurface Extraction in Time-Varying Fields Using a Temporal Hierarchical Index Tree
NASA Technical Reports Server (NTRS)
Shen, Han-Wei; Gerald-Yamasaki, Michael (Technical Monitor)
1998-01-01
Many high-performance isosurface extraction algorithms have been proposed in the past several years as a result of intensive research efforts. When applying these algorithms to large-scale time-varying fields, the storage overhead incurred from storing the search index often becomes overwhelming. This paper proposes an algorithm for locating isosurface cells in time-varying fields. We devise a new data structure, called the Temporal Hierarchical Index Tree, which utilizes the temporal coherence that exists in a time-varying field and adaptively coalesces the cells' extreme values over time; the resulting extreme values are then used to create the isosurface cell search index. For a typical time-varying scalar data set, not only does this temporal hierarchical index tree require much less storage space, but the amount of I/O required to access the indices from disk at different time steps is also substantially reduced. We illustrate the utility and speed of our algorithm with data from several large-scale time-varying CFD simulations. Our algorithm can achieve more than 80% disk-space savings when compared with existing techniques, while the isosurface extraction time is nearly optimal.
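The adaptive temporal coalescing can be illustrated with a small sketch. Here each cell's (min, max) range is merged over a span of time steps when the merged interval stays within a tolerance of the per-step intervals, and split otherwise; the paper's exact merge criterion may differ, so treat this as an assumption.

```python
import numpy as np

def build_temporal_index(cell_ranges, tol):
    """cell_ranges: array (T, C, 2) of per-time-step (min, max) values for C
    cells.  A span of time steps is kept as one node when every cell's
    coalesced [lo, hi] interval is within `tol` of its widest per-step
    interval; otherwise the span is split in half."""
    T = cell_ranges.shape[0]

    def node(t0, t1):
        lo = cell_ranges[t0:t1, :, 0].min(axis=0)
        hi = cell_ranges[t0:t1, :, 1].max(axis=0)
        widest = (cell_ranges[t0:t1, :, 1] - cell_ranges[t0:t1, :, 0]).max(axis=0)
        if t1 - t0 == 1 or np.all((hi - lo) - widest <= tol):
            return {"span": (t0, t1), "lo": lo, "hi": hi, "children": None}
        mid = (t0 + t1) // 2
        return {"span": (t0, t1), "lo": lo, "hi": hi,
                "children": (node(t0, mid), node(mid, t1))}

    return node(0, T)

def active_cells(tree, t, isovalue):
    """Candidate isosurface cells at time step t: walk to the leaf whose span
    contains t, then test the coalesced intervals against the isovalue."""
    n = tree
    while n["children"] is not None:
        left, right = n["children"]
        n = left if t < right["span"][0] else right
    return np.nonzero((n["lo"] <= isovalue) & (isovalue <= n["hi"]))[0]

# Toy data: 4 time steps, 3 cells; cell 1 drifts in value over time.
ranges = np.array([
    [[0.0, 1.0], [5.0, 6.0], [2.0, 3.0]],
    [[0.0, 1.0], [5.5, 6.5], [2.0, 3.0]],
    [[0.0, 1.0], [6.0, 7.0], [2.1, 3.1]],
    [[0.0, 1.0], [6.5, 7.5], [2.1, 3.1]],
])
tree = build_temporal_index(ranges, tol=0.6)
```

Cells whose values barely change are stored once for a whole span of time steps, which is where the storage and I/O savings reported above come from.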
Wang, Yongqiang; Gao, Yujie; Ding, Hui; Liu, Shejiang; Han, Xu; Gui, Jianzhou; Liu, Dan
2017-03-01
A large-scale process to extract flavonoids from Moringa oleifera leaf by subcritical ethanol was developed, and HPLC-MS analysis was conducted to qualitatively identify the compounds in the extracts. To optimize the effects of process parameters on the yield of flavonoids, a Box-Behnken design combined with response surface methodology was used in the present work. The results indicated that the highest extraction yield of flavonoids by subcritical ethanol extraction could reach 2.60% using 70% ethanol at 126.6°C for a 2.05 h extraction. Under the optimized conditions, the flavonoid yield was substantially improved, by 26.7%, compared with the traditional ethanol reflux method, while the extraction time was only about 2 h, and a clear energy saving was observed. FRAP and DPPH assays showed that the extracts had strong antioxidant and free radical scavenging activities. Copyright © 2016 Elsevier Ltd. All rights reserved.
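A Box-Behnken design with a quadratic response surface, as used in the two extraction studies above, can be sketched in a few lines. The snippet builds the coded 15-run design for three factors and fits the full second-order model by least squares; the factor names and response coefficients are invented for illustration and are not the papers' fitted values.

```python
import numpy as np

def box_behnken_3():
    """Coded Box-Behnken design for 3 factors: 12 edge midpoints + 3 center runs."""
    pts = []
    for i in range(3):
        for j in range(i + 1, 3):
            for a in (-1.0, 1.0):
                for b in (-1.0, 1.0):
                    p = [0.0, 0.0, 0.0]
                    p[i], p[j] = a, b
                    pts.append(p)
    return np.array(pts + [[0.0, 0.0, 0.0]] * 3)

def fit_response_surface(X, y):
    """Ordinary least squares for the full quadratic model
    y = b0 + sum bi*xi + sum bii*xi^2 + sum bij*xi*xj."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]                               # linear
    cols += [X[:, i] ** 2 for i in range(k)]                          # quadratic
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]  # interactions
    A = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

X = box_behnken_3()  # columns: e.g. coded temperature, time, ethanol fraction
# Hypothetical noiseless response, used only to exercise the fit:
y = 2.0 + 0.3 * X[:, 0] - 0.2 * X[:, 1] ** 2 + 0.1 * X[:, 0] * X[:, 2]
beta = fit_response_surface(X, y)
```

The fitted surface is then maximized (analytically or on a grid) to locate optimum conditions such as the temperature and extraction time reported above.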
Mapping the integrated Sachs-Wolfe effect
NASA Astrophysics Data System (ADS)
Manzotti, A.; Dodelson, S.
2014-12-01
On large scales, the anisotropies in the cosmic microwave background (CMB) reflect not only the primordial density field but also the energy gained when photons traverse the decaying gravitational potentials of large-scale structure: the integrated Sachs-Wolfe (ISW) effect. Decomposing the anisotropy signal into a primordial piece and an ISW component, the main secondary effect on large scales, is more urgent than ever as cosmologists strive to understand the Universe on those scales. We present a likelihood technique for extracting the ISW signal by combining measurements of the CMB, the distribution of galaxies, and maps of gravitational lensing. We test this technique with simulated data, showing that we can successfully reconstruct the ISW map using all the data sets together. Then we present the ISW map obtained from a combination of real data: the NRAO VLA Sky Survey (NVSS) galaxy survey, and temperature anisotropies and lensing maps made by the Planck satellite. This map shows that, with the data sets used and assuming linear physics, there is no evidence from the reconstructed ISW signal in the Cold Spot region for an entirely ISW origin of this large-scale anomaly in the CMB. However, a large-scale structure origin from low-redshift voids outside the NVSS redshift range is still possible. Finally, we show that future surveys, thanks to better large-scale lensing reconstruction, will be able to improve the reconstruction signal-to-noise, which currently comes mainly from galaxy surveys.
Road Damage Extraction from Post-Earthquake UAV Images Assisted by Vector Data
NASA Astrophysics Data System (ADS)
Chen, Z.; Dou, A.
2018-04-01
Extraction of road damage information after an earthquake is an urgent mission. To collect information about stricken areas, unmanned aerial vehicles can be used to obtain images rapidly. This paper puts forward a novel method to detect road damage and proposes a coefficient to assess road accessibility. With the assistance of vector road data, image data from the Jiuzhaigou Ms 7.0 earthquake are tested. First, the image is clipped according to the vector buffer. Then a large-scale segmentation is applied to remove irrelevant objects. Thirdly, statistics of road features are analysed and damage information is extracted. Combined with the on-field investigation, the extraction result is shown to be effective.
Parallel Visualization Co-Processing of Overnight CFD Propulsion Applications
NASA Technical Reports Server (NTRS)
Edwards, David E.; Haimes, Robert
1999-01-01
An interactive visualization system pV3 is being developed for the investigation of advanced computational methodologies employing visualization and parallel processing for the extraction of information contained in large-scale transient engineering simulations. Visual techniques for extracting information from the data in terms of cutting planes, iso-surfaces, particle tracing and vector fields are included in this system. This paper discusses improvements to the pV3 system developed under NASA's Affordable High Performance Computing project.
The 2-MEV model: Constancy of adolescent environmental values within an 8-year time frame
NASA Astrophysics Data System (ADS)
Bogner, F. X.; Johnson, B.; Buxner, S.; Felix, L.
2015-08-01
The 2-MEV model is a widely used tool for monitoring children's environmental perception by scoring individual values. Although the scale's validity has been confirmed repeatedly and independently, and the scale is in use in more than two dozen languages all over the world, its longitudinal properties still need clarification. The purpose of the present study therefore was to validate the 2-MEV scale on a large data set of 10,676 children collected over an eight-year period. Cohorts from three different US states contributed to the sample by responding to a paper-and-pencil questionnaire within their pre-test initiatives in the context of field center programs. Since we used only the pre-program 2-MEV scale results (that is, before participation in education programs), the data were clearly unspoiled by any follow-up interventions. The purpose of the analysis was fourfold: first, to test and confirm the hypothesized factorized structure for the large data set and for the subsample of each of the three states; second, to analyze the scoring pattern across the eight-year time range for both preservation and utilitarian preferences; third, to investigate any age effects in the extracted factors; and finally, to extract suitable recommendations for educational implementation efforts.
The XMM Large Scale Structure Survey
NASA Astrophysics Data System (ADS)
Pierre, Marguerite
2005-10-01
We propose to complete, by an additional 5 deg², the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg². The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.
A quality score for coronary artery tree extraction results
NASA Astrophysics Data System (ADS)
Cao, Qing; Broersen, Alexander; Kitslaar, Pieter H.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke
2018-02-01
Coronary artery trees (CATs) are often extracted to aid the fully automatic analysis of coronary artery disease on coronary computed tomography angiography (CCTA) images. Automatically extracted CATs often miss some arteries or include wrong extractions which require manual corrections before performing successive steps. For analyzing a large number of datasets, a manual quality check of the extraction results is time-consuming. This paper presents a method to automatically calculate quality scores for extracted CATs in terms of clinical significance of the extracted arteries and the completeness of the extracted CAT. Both right dominant (RD) and left dominant (LD) anatomical statistical models are generated and exploited in developing the quality score. To automatically determine which model should be used, a dominance type detection method is also designed. Experiments are performed on the automatically extracted and manually refined CATs from 42 datasets to evaluate the proposed quality score. In 39 (92.9%) cases, the proposed method is able to measure the quality of the manually refined CATs with higher scores than the automatically extracted CATs. In a 100-point scale system, the average scores for automatically and manually refined CATs are 82.0 (+/-15.8) and 88.9 (+/-5.4) respectively. The proposed quality score will assist the automatic processing of the CAT extractions for large cohorts which contain both RD and LD cases. To the best of our knowledge, this is the first time that a general quality score for an extracted CAT is presented.
US National Large-scale City Orthoimage Standard Initiative
Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.
2003-01-01
The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920s), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced using the early procedures have disclosed many shortcomings, e.g., ghost images, occlusions, and shadows. Thus, providing the technical base (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the near-future national deployment of large-scale digital orthophotos and the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing; (2) spatial object/feature extraction using surface material information and high-accuracy 3D DSM data; (3) 3D city model development; (4) algorithm development for generation of DTM-based and DBM-based orthophotos; (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos; and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.
Static analysis techniques for semiautomatic synthesis of message passing software skeletons
Sottile, Matthew; Dagit, Jason; Zhang, Deli; ...
2015-06-29
The design of high-performance computing architectures demands performance analysis of large-scale parallel applications to derive various parameters concerning hardware design and software development. The process of performance analysis and benchmarking an application can be done in several ways with varying degrees of fidelity. One of the most cost-effective ways is to do a coarse-grained study of large-scale parallel applications through the use of program skeletons. The concept of a “program skeleton” that we discuss in this article is an abstracted program that is derived from a larger program where source code that is determined to be irrelevant is removed for the purposes of the skeleton. In this work, we develop a semiautomatic approach for extracting program skeletons based on compiler program analysis. Finally, we demonstrate correctness of our skeleton extraction process by comparing details from communication traces, as well as show the performance speedup of using skeletons by running simulations in the SST/macro simulator.
Williams, Alex H; Kim, Tony Hyun; Wang, Forea; Vyas, Saurabh; Ryu, Stephen I; Shenoy, Krishna V; Schnitzer, Mark; Kolda, Tamara G; Ganguli, Surya
2018-06-27
Perceptions, thoughts, and actions unfold over millisecond timescales, while learned behaviors can require many days to mature. While recent experimental advances enable large-scale and long-term neural recordings with high temporal fidelity, it remains a formidable challenge to extract unbiased and interpretable descriptions of how rapid single-trial circuit dynamics change slowly over many trials to mediate learning. We demonstrate a simple tensor component analysis (TCA) can meet this challenge by extracting three interconnected, low-dimensional descriptions of neural data: neuron factors, reflecting cell assemblies; temporal factors, reflecting rapid circuit dynamics mediating perceptions, thoughts, and actions within each trial; and trial factors, describing both long-term learning and trial-to-trial changes in cognitive state. We demonstrate the broad applicability of TCA by revealing insights into diverse datasets derived from artificial neural networks, large-scale calcium imaging of rodent prefrontal cortex during maze navigation, and multielectrode recordings of macaque motor cortex during brain machine interface learning. Copyright © 2018 Elsevier Inc. All rights reserved.
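The trial-structured factorization that TCA performs is a canonical polyadic (CP) decomposition of a neurons × time × trials array. A minimal alternating-least-squares version, written from scratch (the paper's actual implementation and fitting details differ), looks like this:

```python
import numpy as np

def cp_als(X, rank, n_iter=200, seed=0):
    """Minimal alternating-least-squares CP decomposition of a 3-way array
    X (neurons x time x trials) into `rank` components.  Returns factor
    matrices A (I x R), B (J x R), C (K x R) with
    X[i, j, k] ~= sum_r A[i, r] * B[j, r] * C[k, r]."""
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    A = rng.standard_normal((I, rank))
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    X0 = X.reshape(I, J * K)                     # mode-0 unfolding
    X1 = np.moveaxis(X, 1, 0).reshape(J, I * K)  # mode-1 unfolding
    X2 = np.moveaxis(X, 2, 0).reshape(K, I * J)  # mode-2 unfolding
    kr = lambda U, V: (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])
    for _ in range(n_iter):
        A = X0 @ np.linalg.pinv(kr(B, C).T)  # solve for neuron factors
        B = X1 @ np.linalg.pinv(kr(A, C).T)  # solve for temporal factors
        C = X2 @ np.linalg.pinv(kr(A, B).T)  # solve for trial factors
    return A, B, C

# Recover a known rank-2 structure from a synthetic "recording".
rng = np.random.default_rng(1)
At = rng.standard_normal((8, 2))
Bt = rng.standard_normal((10, 2))
Ct = rng.standard_normal((6, 2))
X = np.einsum('ir,jr,kr->ijk', At, Bt, Ct)
A, B, C = cp_als(X, rank=2)
Xhat = np.einsum('ir,jr,kr->ijk', A, B, C)
```

Unlike PCA applied to flattened data, the trial factors C give a per-trial weighting of each component, which is what exposes slow, learning-related drift across trials.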
Gyulassy, Attila; Knoll, Aaron; Lau, Kah Chun; Wang, Bei; Bremer, Peer-Timo; Papka, Michael E; Curtiss, Larry A; Pascucci, Valerio
2016-01-01
Large-scale molecular dynamics (MD) simulations are commonly used for simulating the synthesis and ion diffusion of battery materials. A good battery anode material is determined by its capacity to store ion or other diffusers. However, modeling of ion diffusion dynamics and transport properties at large length and long time scales would be impossible with current MD codes. To analyze the fundamental properties of these materials, therefore, we turn to geometric and topological analysis of their structure. In this paper, we apply a novel technique inspired by discrete Morse theory to the Delaunay triangulation of the simulated geometry of a thermally annealed carbon nanosphere. We utilize our computed structures to drive further geometric analysis to extract the interstitial diffusion structure as a single mesh. Our results provide a new approach to analyze the geometry of the simulated carbon nanosphere, and new insights into the role of carbon defect size and distribution in determining the charge capacity and charge dynamics of these carbon based battery materials.
NASA Astrophysics Data System (ADS)
Wang, Hongyan; Li, Qiangzi; Du, Xin; Zhao, Longcai
2017-12-01
In the karst regions of southwest China, rocky desertification is one of the most serious problems in land degradation. The bedrock exposure rate is an important index to assess the degree of rocky desertification in karst regions. Because of its inherent merits of macro-scale coverage, frequency, efficiency, and synthesis, remote sensing is a promising method to monitor and assess karst rocky desertification on a large scale. However, actual measurement of the bedrock exposure rate is difficult, and existing remote-sensing methods cannot directly be exploited to extract the bedrock exposure rate owing to the high complexity and heterogeneity of karst environments. Therefore, using unmanned aerial vehicle (UAV) and Landsat-8 Operational Land Imager (OLI) data for Xingren County, Guizhou Province, quantitative extraction of the bedrock exposure rate based on multi-scale remote-sensing data was developed. Firstly, we used an object-oriented method to carry out accurate classification of UAV images. From the results of rock extraction, the bedrock exposure rate was calculated at the 30 m grid scale. Part of the calculated samples were used as training data; the other data were used for model validation. Secondly, in each grid the band reflectivity of Landsat-8 OLI data was extracted and a variety of rock and vegetation indexes (e.g., NDVI and SAVI) were calculated. Finally, a network model was established to extract the bedrock exposure rate. The correlation coefficient of the network model was 0.855, that of the validation model was 0.677, and the root mean square error of the validation model was 0.073. This method is valuable for wide-scale estimation of the bedrock exposure rate in karst environments. Using the quantitative inversion model, a distribution map of the bedrock exposure rate in Xingren County was obtained.
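The rock and vegetation indexes mentioned are simple band ratios of OLI surface reflectance (band 4 = red, band 5 = NIR). A sketch, with made-up reflectance values for a vegetated and a bare-bedrock pixel:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def savi(nir, red, L=0.5):
    """Soil-Adjusted Vegetation Index; L = 0.5 is the usual soil-brightness factor."""
    return (nir - red) * (1.0 + L) / (nir + red + L)

veg_pixel = ndvi(nir=0.40, red=0.05)    # dense vegetation: high NIR, low red
rock_pixel = ndvi(nir=0.25, red=0.22)   # exposed bedrock: spectrally flat
```

These per-grid index values, together with the band reflectances, are the inputs from which the network model regresses the UAV-derived bedrock exposure rate; the regression model itself is not reproduced here.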
Vibration-based structural health monitoring of the aircraft large component
NASA Astrophysics Data System (ADS)
Pavelko, V.; Kuznetsov, S.; Nevsky, A.; Marinbah, M.
2017-10-01
This paper investigates the basic problems of a local structural health monitoring (SHM) system for a large-scale aircraft component. Vibration-based damage detection is accepted as the basic approach, and the main attention is focused on a low-cost solution that would be attractive in practice. The conditions for small-damage detection in a full-scale structural component under low-frequency excitation were defined in an analytical study and modal FEA. In the experimental study, a dynamic test of the helicopter Mi-8 tail beam was performed under harmonic excitation with a frequency close to the first natural frequency of the beam. The correlation coefficient deviation (CCD) index was used for extraction of the features due to an embedded pseudo-damage. It is shown that the problem of vibration-based detection of small damage in a large-scale structure under low-frequency excitation can be solved successfully.
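One plausible reading of the CCD index is one minus the correlation coefficient between a baseline response and a later measurement; the paper's exact definition may differ, so the sketch below is an assumption.

```python
import numpy as np

def ccd_index(baseline, current):
    """Correlation coefficient deviation: 0 for identical response shapes,
    increasing as the measured response departs from the baseline."""
    r = np.corrcoef(baseline, current)[0, 1]
    return 1.0 - r

t = np.linspace(0.0, 1.0, 500)
healthy = np.sin(2 * np.pi * 5 * t)          # response near the first natural frequency
# A small local stiffness change is mocked as a weak added harmonic.
damaged = healthy + 0.05 * np.sin(2 * np.pi * 15 * t)
```

A detection threshold on the CCD would then be set from the scatter of repeated healthy-state measurements.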
Large-scale structure after COBE: Peculiar velocities and correlations of cold dark matter halos
NASA Technical Reports Server (NTRS)
Zurek, Wojciech H.; Quinn, Peter J.; Salmon, John K.; Warren, Michael S.
1994-01-01
Large N-body simulations on parallel supercomputers allow one to simultaneously investigate large-scale structure and the formation of galactic halos with unprecedented resolution. Our study shows that the masses as well as the spatial distribution of halos on scales of tens of megaparsecs in a cold dark matter (CDM) universe, with the spectrum normalized to the anisotropies detected by the Cosmic Background Explorer (COBE), are compatible with the observations. We also show that the average value of the relative pairwise velocity dispersion σ_v (used as a principal argument against COBE-normalized CDM models) is significantly lower for halos than for individual particles. When the observational methods of extracting σ_v are applied to the redshift catalogs obtained from the numerical experiments, estimates differ significantly between different observation-sized samples and overlap observational estimates obtained following the same procedure.
Mohr, Stephan; Dawson, William; Wagner, Michael; Caliste, Damien; Nakajima, Takahito; Genovese, Luigi
2017-10-10
We present CheSS, the "Chebyshev Sparse Solvers" library, which has been designed to solve typical problems arising in large-scale electronic structure calculations using localized basis sets. The library is based on a flexible and efficient expansion in terms of Chebyshev polynomials and presently features the calculation of the density matrix, the calculation of matrix powers for arbitrary powers, and the extraction of eigenvalues in a selected interval. CheSS is able to exploit the sparsity of the matrices and scales linearly with respect to the number of nonzero entries, making it well-suited for large-scale calculations. The approach is particularly adapted for setups leading to small spectral widths of the involved matrices and outperforms alternative methods in this regime. By coupling CheSS to the DFT code BigDFT, we show that such a favorable setup is indeed possible in practice. In addition, the approach based on Chebyshev polynomials can be massively parallelized, and CheSS exhibits excellent scaling up to thousands of cores even for relatively small matrix sizes.
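The core idea, expanding a matrix function in Chebyshev polynomials evaluated by the three-term recurrence, can be sketched densely in a few lines (CheSS itself works on sparse matrices and computes the expansion far more efficiently; the function and variable names here are our own):

```python
import numpy as np

def cheb_coeffs(f, n):
    """First n Chebyshev coefficients of f on [-1, 1] (Gauss-Chebyshev nodes)."""
    k = np.arange(n)
    x = np.cos(np.pi * (k + 0.5) / n)
    fx = f(x)
    c = np.array([2.0 / n * np.sum(fx * np.cos(np.pi * j * (k + 0.5) / n))
                  for j in range(n)])
    c[0] /= 2.0
    return c

def chebyshev_matrix_function(H, f, degree):
    """Approximate f(H) for symmetric H by a degree-`degree` Chebyshev
    expansion, using the T_{j+1} = 2*Hs*T_j - T_{j-1} recurrence."""
    lo, hi = np.linalg.eigvalsh(H)[[0, -1]]    # CheSS would estimate bounds cheaply
    center, half = (hi + lo) / 2.0, (hi - lo) / 2.0 + 1e-9
    Hs = (H - center * np.eye(len(H))) / half  # spectrum mapped into [-1, 1]
    c = cheb_coeffs(lambda x: f(x * half + center), degree + 1)
    T_prev, T_curr = np.eye(len(H)), Hs
    out = c[0] * T_prev + c[1] * T_curr
    for j in range(2, degree + 1):
        T_prev, T_curr = T_curr, 2.0 * Hs @ T_curr - T_prev
        out += c[j] * T_curr
    return out

rng = np.random.default_rng(2)
M = rng.standard_normal((6, 6))
H = (M + M.T) / 2.0                 # mock symmetric Hamiltonian
approx = chebyshev_matrix_function(H, np.exp, degree=40)
```

For the density matrix, f would be a smooth approximation of the Fermi function; a small spectral width means few polynomial terms suffice, which is the favorable regime described above.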
Extracting Primordial Non-Gaussianity from Large Scale Structure in the Post-Planck Era
NASA Astrophysics Data System (ADS)
Dore, Olivier
Astronomical observations have become a unique tool to probe fundamental physics. Cosmology, in particular, has emerged as a data-driven science whose phenomenological modeling has achieved great success: in the post-Planck era, key cosmological parameters are measured to percent precision. A single model reproduces a wealth of astronomical observations involving very distinct physical processes at different times. This success leads to fundamental physical questions. One of the most salient is the origin of the primordial perturbations that grew to form the large-scale structures we now observe. More and more cosmological observables point to inflationary physics as the origin of the structure observed in the universe. Inflationary physics predicts the statistical properties of the primordial perturbations, which are thought to be slightly non-Gaussian. The detection of this small deviation from Gaussianity represents the next frontier in early-Universe physics. Measuring it would provide direct, unique, and quantitative insights into the physics at play when the Universe was only a fraction of a second old, thus probing energies untouchable otherwise. On par with the well-known relic gravitational wave radiation (the famous "B-modes"), it is one of the few probes of inflation. This departure from Gaussianity leads to a very specific signature in the large-scale clustering of galaxies. By observing large-scale structure, we can thus establish a direct connection with fundamental theories of the early universe. In the post-Planck era, large-scale structures are our most promising pathway to measuring this primordial signal. Current estimates suggest that the next generation of space- or ground-based large-scale structure surveys (e.g., the ESA EUCLID or NASA WFIRST missions) might enable a detection of this signal. This potentially huge payoff requires us to solidify the theoretical predictions supporting these measurements.
Even if the exact signal we are looking for is of unknown amplitude, it is obvious that we must measure it as well as these groundbreaking data sets will permit. We propose to develop the supporting theoretical work to the point where the complete non-Gaussian signature can be extracted from these data sets. We will do so by developing three complementary directions: we will develop the appropriate formalism to measure and model galaxy clustering on the largest scales; we will study the impact of non-Gaussianity on higher-order statistics, the most promising statistics for our purpose; and we will make explicit the connection between these observables and the microphysics of a large class of inflation models, while also identifying fundamental limitations to this interpretation.
NASA Astrophysics Data System (ADS)
Li, Jianping; Xia, Xiangsheng
2015-09-01
In order to improve the understanding of the hot deformation and dynamic recrystallization (DRX) behaviors of large-scaled AZ80 magnesium alloy fabricated by semi-continuous casting, compression tests were carried out in the temperature range from 250 to 400 °C and the strain rate range from 0.001 to 0.1 s⁻¹ on a Gleeble 1500 thermo-mechanical machine. The effects of temperature and strain rate on the hot deformation behavior have been expressed by means of the conventional hyperbolic sine equation, and the influence of strain has been incorporated in the equation by considering its effect on the different material constants for large-scaled AZ80 magnesium alloy. In addition, the DRX behavior has been discussed. The results show that the deformation temperature and strain rate exert remarkable influences on the flow stress. A constitutive equation of hyperbolic sine form was established for hot deformation of large-scaled AZ80 magnesium alloy at the steady-state stage (ε = 0.5). The true stress-true strain curves predicted by the extracted model were in good agreement with the experimental results, thereby confirming the validity of the developed constitutive relation. The DRX kinetic model of large-scaled AZ80 magnesium alloy was established as X_d = 1 − exp[−0.95((ε − ε_c)/ε*)^2.4904]. The rate of DRX increases with increasing deformation temperature, and high temperature is beneficial for achieving complete DRX in the large-scaled AZ80 magnesium alloy.
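The DRX kinetic model quoted above is an Avrami-type equation and is straightforward to evaluate; the critical strain ε_c and characteristic strain ε* are material parameters not given in this abstract, so the values below are purely illustrative.

```python
import numpy as np

def drx_fraction(strain, eps_c, eps_star):
    """DRX volume fraction from the abstract's kinetic model:
    X_d = 1 - exp[-0.95 * ((eps - eps_c) / eps*)^2.4904] for eps >= eps_c."""
    x = np.clip((np.asarray(strain, dtype=float) - eps_c) / eps_star, 0.0, None)
    return 1.0 - np.exp(-0.95 * x ** 2.4904)

eps_c, eps_star = 0.05, 0.30        # hypothetical critical/characteristic strains
frac = drx_fraction([0.05, 0.2, 0.5, 0.8], eps_c, eps_star)
```

X_d rises sigmoidally from 0 at the critical strain toward 1, reproducing the form of the model above.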
Determinations of pesticides in food are often complicated by the presence of fats and require multiple cleanup steps before analysis. Cost-effective analytical methods are needed for conducting large-scale exposure studies. We examined two extraction methods, supercritical flu...
Three-dimensional time dependent computation of turbulent flow
NASA Technical Reports Server (NTRS)
Kwak, D.; Reynolds, W. C.; Ferziger, J. H.
1975-01-01
The three-dimensional, primitive equations of motion are solved numerically for the case of isotropic box turbulence and the distortion of homogeneous turbulence by irrotational plane strain at large Reynolds numbers. A Gaussian filter is applied to governing equations to define the large scale field. This gives rise to additional second order computed scale stresses (Leonard stresses). The residual stresses are simulated through an eddy viscosity. Uniform grids are used, with a fourth order differencing scheme in space and a second order Adams-Bashforth predictor for explicit time stepping. The results are compared to the experiments and statistical information extracted from the computer generated data.
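The scale-separation step, Gaussian filtering to define the large-scale field, can be sketched spectrally in one dimension. The transfer function exp(-k²Δ²/24) is a common LES convention; the 1975 report's exact filter definition may differ.

```python
import numpy as np

def gaussian_filter_field(u, delta, dx):
    """Large-scale (resolved) field: convolve u with a Gaussian filter of
    width `delta`, applied in Fourier space on a periodic domain."""
    k = 2.0 * np.pi * np.fft.fftfreq(u.shape[0], d=dx)
    G = np.exp(-(k ** 2) * delta ** 2 / 24.0)   # Gaussian filter transfer function
    return np.real(np.fft.ifft(G * np.fft.fft(u)))

n, dx = 256, 1.0 / 256
x = np.arange(n) * dx
u = np.sin(2 * np.pi * x) + 0.5 * np.sin(2 * np.pi * 40 * x)  # large + small scale
ubar = gaussian_filter_field(u, delta=8 * dx, dx=dx)
```

Subtracting the filtered field from the unfiltered one gives the residual (subgrid) part, which is what the eddy-viscosity model in the abstract stands in for.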
Geophysical potential for wind energy over the open oceans
2017-01-01
Wind turbines continuously remove kinetic energy from the lower troposphere, thereby reducing the wind speed near hub height. The rate of electricity generation in large wind farms containing multiple wind arrays is, therefore, constrained by the rate of kinetic energy replenishment from the atmosphere above. In recent years, a growing body of research argues that the rate of generated power is limited to around 1.5 W m⁻² within large wind farms. However, in this study, we show that considerably higher power generation rates may be sustainable over some open ocean areas. In particular, the North Atlantic is identified as a region where the downward transport of kinetic energy may sustain extraction rates of 6 W m⁻² and above over large areas in the annual mean. Furthermore, our results indicate that the surface heat flux from the oceans to the atmosphere may play an important role in creating regions where sustained high rates of downward transport of kinetic energy, and thus high rates of kinetic energy extraction, may be geophysically possible. While no commercial-scale deep water wind farms yet exist, our results suggest that such technologies, if they became technically and economically feasible, could potentially provide civilization-scale power. PMID:29073053
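The closing claim is easy to sanity-check numerically, assuming (our assumption, not a figure from the abstract) roughly 18 TW of global primary power demand:

```python
extraction_rate = 6.0        # W per m^2, annual-mean rate quoted in the abstract
civilization_power = 18e12   # W; rough global primary power demand (assumption)

area_m2 = civilization_power / extraction_rate
area_million_km2 = area_m2 / 1e6 / 1e6   # m^2 -> km^2 -> million km^2
print(area_million_km2)  # prints 3.0
```

Three million square kilometres is a small fraction of the North Atlantic, which is consistent with the "civilization-scale power" phrasing, under the stated assumptions.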
Geophysical potential for wind energy over the open oceans.
Possner, Anna; Caldeira, Ken
2017-10-24
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Wei, E-mail: wguo2@ncsu.edu; Kirste, Ronny; Bryan, Zachary
Enhanced light extraction efficiency was demonstrated on nanostructure-patterned GaN and AlGaN/AlN Multiple-Quantum-Well (MQW) structures using mass production techniques, including natural lithography and interference lithography, with feature sizes as small as 100 nm. Periodic nanostructures showed higher light extraction efficiency and a modified emission profile compared to non-periodic structures, based on integral reflection and angular-resolved transmission measurements. The light extraction mechanism of macroscopic and microscopic nanopatterning is discussed, and the advantage of using periodic nanostructure patterning is presented. An enhanced photoluminescence emission intensity was observed on nanostructure-patterned AlGaN/AlN MQWs compared to the as-grown structure, demonstrating a large-scale and mass-producible pathway to higher light extraction efficiency in deep-ultraviolet light-emitting diodes.
Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction
NASA Technical Reports Server (NTRS)
Li, Zhijin; Chao, Yi; Li, P. Peggy
2012-01-01
A multi-scale three-dimensional variational data assimilation system (MS-3DVAR) has been formulated and the associated software system has been developed for improving high-resolution coastal ocean prediction. This system helps improve coastal ocean prediction skill and has been used in support of operational coastal ocean forecasting systems and field experiments. The system was developed to improve the capability of data assimilation to assimilate, simultaneously and effectively, sparse vertical profiles and high-resolution remote sensing surface measurements into coastal ocean models, as well as to constrain model biases. In this system, the cost function is decomposed into two separate units for the large- and small-scale components, respectively. As such, data assimilation is implemented sequentially from large to small scales, the background error covariance is constructed to be scale-dependent, and a scale-dependent dynamic balance is incorporated. This scheme allows large scales and model bias to be constrained effectively by assimilating sparse vertical profiles, and small scales by assimilating high-resolution surface measurements. MS-3DVAR enhances the capability of the traditional 3DVAR to assimilate highly heterogeneously distributed observations, such as along-track satellite altimetry data, and particularly maximizes the extraction of information from limited numbers of vertical profile observations.
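The sequential large-to-small-scale idea can be caricatured in one dimension (a schematic sketch only, not the operational MS-3DVAR system; the moving-average filter stands in for the scale-dependent background error covariance):

```python
import numpy as np

def smooth(x, width):
    # Simple moving-average low-pass filter standing in for the
    # large-scale background error covariance.
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def multiscale_analysis(background, obs, width=11):
    """Sequential large-to-small-scale correction (schematic only).

    Step 1: a smoothed (large-scale) increment constrains broad biases.
    Step 2: the remaining misfit is absorbed by a small-scale increment.
    """
    innovation = obs - background            # observation-minus-background
    large = smooth(innovation, width)        # large-scale increment
    small = innovation - large               # small-scale increment
    return background + large, background + large + small

# Toy example: a background biased at large scales, with the truth
# carrying additional fine-scale structure.
x = np.linspace(0, 2 * np.pi, 200)
truth = np.sin(x) + 0.2 * np.sin(15 * x)
background = np.sin(x) - 0.5
analysis_large, analysis_full = multiscale_analysis(background, truth)
```

In this caricature the large-scale step alone already removes most of the bias, and the small-scale step then recovers the fine structure, mirroring the sequential design described in the abstract.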
Wu, Junjun; Du, Guocheng; Zhou, Jingwen; Chen, Jian
2014-10-20
Flavonoids possess pharmaceutical potential due to their health-promoting activities. The complex structures of these products make extraction from plants difficult, and chemical synthesis is limited because of the use of many toxic solvents. Microbial production offers an alternative way to produce these compounds on an industrial scale in a more economical and environment-friendly manner. However, at present, microbial production has been achieved only on a laboratory scale, and improvements and scale-up of these processes remain challenging. Naringenin and pinocembrin, which are flavonoid scaffolds and precursors for most of the flavonoids, are the model molecules that are key to solving the current issues restricting industrial production of these chemicals. The emergence of systems metabolic engineering, which combines systems biology with synthetic biology and evolutionary engineering at the systems level, offers new perspectives on strain and process optimization. In this review, current challenges in large-scale fermentation processes involving flavonoid scaffolds and the strategies and tools of systems metabolic engineering used to overcome these challenges are summarized. This will offer insights into overcoming the limitations and challenges of large-scale microbial production of these important pharmaceutical compounds.
Efficient extraction strategies of tea (Camellia sinensis) biomolecules.
Banerjee, Satarupa; Chatterjee, Jyotirmoy
2015-06-01
Tea is a popular daily beverage worldwide. Modulation and modification of its basic components, such as catechins, alkaloids, proteins and carbohydrates, during fermentation or extraction changes the organoleptic, gustatory and medicinal properties of tea. Through these processes, increases or decreases in the yield of desired components are evident. Considering the varied impacts of parameters in tea production, storage and processing that affect the yield, extraction of tea biomolecules under optimized conditions is challenging. Implementation of technological advancements in green chemistry approaches can minimize the deviation while retaining maximum qualitative properties in an environment-friendly way. Existing extraction processes for tea, with their optimization parameters, are discussed in this paper, including their prospects and limitations. This exhaustive review of various extraction parameters, the decaffeination process of tea, and large-scale, cost-effective isolation of tea components with the aid of modern technology can assist in choosing tea extraction conditions according to need.
NASA Astrophysics Data System (ADS)
van der Molen, Johan; Ruardij, Piet; Greenwood, Naomi
2016-05-01
A model study was carried out of the potential large-scale (> 100 km) effects of marine renewable tidal energy generation in the Pentland Firth, using the 3-D hydrodynamics-biogeochemistry model GETM-ERSEM-BFM. A realistic 800 MW scenario and a high-impact scenario with a massive expansion of tidal energy extraction to 8 GW were considered. The realistic 800 MW scenario suggested minor effects on the tides and undetectable effects on the biogeochemistry. The massive-expansion 8 GW scenario suggested effects would be observed hundreds of kilometres away, with changes of up to 10 % in tidal and ecosystem variables, in particular in a broad area in the vicinity of the Wash. There, waters became less turbid, and primary production increased, with associated increases in faunal ecosystem variables. Moreover, a one-off increase in carbon storage in the sea bed was detected. Although these first results suggest positive environmental effects, further investigation is recommended of (i) the residual circulation in the vicinity of the Pentland Firth and effects on larval dispersal, using a higher-resolution model, and (ii) ecosystem effects with (future) state-of-the-art models if energy extraction substantially beyond 1 GW is planned.
Patterson, Olga V; Freiberg, Matthew S; Skanderson, Melissa; J Fodeh, Samah; Brandt, Cynthia A; DuVall, Scott L
2017-06-12
In order to investigate the mechanisms of cardiovascular disease in HIV infected and uninfected patients, an analysis of echocardiogram reports is required for a large longitudinal multi-center study. A natural language processing system using a dictionary lookup, rules, and patterns was developed to extract heart function measurements that are typically recorded in echocardiogram reports as measurement-value pairs. Curated semantic bootstrapping was used to create a custom dictionary that extends existing terminologies based on terms that actually appear in the medical record. A novel disambiguation method based on semantic constraints was created to identify and discard erroneous alternative definitions of the measurement terms. The system was built utilizing a scalable framework, making it available for processing large datasets. The system was developed for and validated on notes from three sources: general clinic notes, echocardiogram reports, and radiology reports. The system achieved F-scores of 0.872, 0.844, and 0.877 with precision of 0.936, 0.982, and 0.969 for each dataset respectively, averaged across all extracted values. Left ventricular ejection fraction (LVEF) is the most frequently extracted measurement. The precision of extraction of the LVEF measure ranged from 0.968 to 1.0 across different document types. This system illustrates the feasibility and effectiveness of large-scale information extraction on clinical data. New clinical questions can be addressed in the domain of heart failure using retrospective clinical data analysis because key heart function measurements can be successfully extracted using natural language processing.
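The measurement-value pair pattern described above can be illustrated with a small rule-based extractor (a hedged sketch: the dictionary, regular expression, and normalization below are illustrative assumptions, not the authors' system):

```python
import re

# Hypothetical synonym dictionary mapping surface forms to a
# canonical measurement name.
MEASUREMENTS = {"lvef": "LVEF", "ef": "LVEF", "ejection fraction": "LVEF"}

# Pattern: a known measurement term, an optional linking word, then a
# percentage value or range (e.g. "LVEF 55%" or "ejection fraction 55-60%").
PAIR_RE = re.compile(
    r"(?P<name>LVEF|EF|ejection fraction)\s*(?:is|of|=|:)?\s*"
    r"(?P<value>\d{1,2}(?:\.\d+)?)\s*(?:-\s*(?P<hi>\d{1,2}(?:\.\d+)?))?\s*%",
    re.IGNORECASE,
)

def extract_measurements(text):
    """Return normalized (measurement, low, high) triples from a report."""
    results = []
    for m in PAIR_RE.finditer(text):
        name = MEASUREMENTS.get(m.group("name").lower(), m.group("name"))
        low = float(m.group("value"))
        high = float(m.group("hi")) if m.group("hi") else low
        results.append((name, low, high))
    return results

report = "Normal LV size. Ejection fraction 55-60%. No effusion."
pairs = extract_measurements(report)
```

A real system layers disambiguation and semantic constraints on top of such patterns; this sketch shows only the core pairing of a measurement term with its numeric value.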
USDA-ARS?s Scientific Manuscript database
Oligosaccharide accumulation occurs during high solid loading enzymatic hydrolysis of corn stover (CS) irrespective of using different pretreated corn stover (dilute acid: DA, ionic liquids: IL, ammonia fiber expansion: AFEX and extractive ammonia: EA). The methodology for large-scale separation of ...
33 CFR 164.33 - Charts and publications.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Ocean Service, U.S. Army Corps of Engineers, or a river authority that— (i) Are of a large enough scale...) For the area to be transited, the current edition of, or applicable current extract from: (i) Tide tables published by private entities using data provided by the National Ocean Service. (ii) Tidal...
NASA Astrophysics Data System (ADS)
Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua
2016-12-01
Simple, accurate and high-throughput pretreatment methods would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work aims to establish theoretically based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking, hydrogen bonding, electrostatic interaction) for the first time. Moreover, application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient pretreatment, under theoretically based experimental design, of trace analytes from environmental, biological and clinical samples.
Pinxterhuis, Erik B.; Gualtierotti, Jean-Baptiste; Heeres, Hero J.
2017-01-01
Access to enantiopure compounds on a large scale in an environmentally friendly and cost-efficient manner remains one of the greatest challenges in chemistry. Resolution of racemates using enantioselective liquid–liquid extraction (ELLE) has great potential to meet that challenge. However, a relatively feeble understanding of the chemical principles and physical properties behind this technique has hampered the development of hosts possessing sufficient resolving power for application to large-scale processes. Herein we present, employing the previously untested SPINOL-based phosphoric acid host family, an in-depth study of the parameters affecting the efficiency of the resolution of amino-alcohols, with the aim of further understanding the core principles behind ELLE. We have systematically investigated the dependence of the enantioselection on parameters such as the choice of solvent, the temperature, and the pH, and bring to light many previously unsuspected and highly intriguing interactions. Furthermore, utilizing these new insights to our advantage, we developed novel, highly efficient extraction and resolution protocols which provide remarkable levels of enantioselectivity. It was shown that the extraction is catalytic in host by demonstrating transport in a U-tube, and finally it was demonstrated how the solvent dependency could be exploited in an unprecedented triphasic resolution system. PMID:28989671
NASA Astrophysics Data System (ADS)
Schneider, Christian
2017-04-01
The study analyzes the impact of different farming systems on soil quality and soil degradation in European loess landscapes. The analyses are based on geo-chemical soil properties, landscape metrics and geomorphological indicators. The German Middle Saxonian Loess Region represents loess landscapes whose ecological functions were shaped by land consolidation measures resulting in large-scale, high-input farming systems. The Polish Proszowice Plateau is still characterized by traditional small-scale peasant agriculture. The research areas were analyzed on different scale levels combining GIS, field, and laboratory methods. A digital terrain classification was used to identify representative catchment basins for detailed pedological studies, which were focused on soil properties that respond to soil management within several years, such as pH-value, total carbon (TC), total nitrogen (TN), inorganic carbon (IC), soil organic carbon (TOC=TC-IC), hot-water extractable carbon (HWC), hot-water extractable nitrogen (HWN), total phosphorus, plant-available phosphorus (P), plant-available potassium (K) and the potential cation exchange capacity (CEC). The study has shown that significant differences in major soil properties can be observed because of different fertilizer inputs and partly because of different cultivation techniques. The traditional system also increases soil heterogeneity. Contrary to expectations, the study has shown that the small-scale peasant farming system resulted in mean soil organic carbon and phosphorus contents similar to those of the industrialized high-input farming system. A further study could include investigations of the effects of soil amendments such as herbicides and pesticides on soil degradation.
Automatic extraction of property norm-like data from large text corpora.
Kelly, Colin; Devereux, Barry; Korhonen, Anna
2014-01-01
Traditional methods for deriving property-based representations of concepts from text have focused on either extracting only a subset of possible relation types, such as hyponymy/hypernymy (e.g., car is-a vehicle) or meronymy/metonymy (e.g., car has wheels), or unspecified relations (e.g., car--petrol). We propose a system for the challenging task of automatic, large-scale acquisition of unconstrained, human-like property norms from large text corpora, and discuss the theoretical implications of such a system. We employ syntactic, semantic, and encyclopedic information to guide our extraction, yielding concept-relation-feature triples (e.g., car be fast, car require petrol, car cause pollution), which approximate property-based conceptual representations. Our novel method extracts candidate triples from parsed corpora (Wikipedia and the British National Corpus) using syntactically and grammatically motivated rules, then reweights triples with a linear combination of their frequency and four statistical metrics. We assess our system output in three ways: lexical comparison with norms derived from human-generated property norm data, direct evaluation by four human judges, and a semantic distance comparison with both WordNet similarity data and human-judged concept similarity ratings. Our system offers a viable and performant method of plausible triple extraction: Our lexical comparison shows comparable performance to the current state-of-the-art, while subsequent evaluations exhibit the human-like character of our generated properties.
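The reweighting step described above can be sketched in a few lines (a hedged illustration: the weights, the log-frequency transform, and a single association score standing in for the paper's four statistical metrics are all assumptions):

```python
import math

def rerank(triples, weights=(0.5, 0.5)):
    """Score candidate triples by a linear combination of log corpus
    frequency and an association statistic, then sort descending.

    triples: list of (triple, frequency, association_score).
    """
    w_freq, w_assoc = weights
    scored = [
        (w_freq * math.log(1 + freq) + w_assoc * assoc, triple)
        for triple, freq, assoc in triples
    ]
    return [t for _, t in sorted(scored, reverse=True)]

# Illustrative candidate concept-relation-feature triples with made-up
# corpus frequencies and association scores.
candidates = [
    (("car", "require", "petrol"), 120, 2.1),
    (("car", "be", "fast"), 300, 1.4),
    (("car", "cause", "pollution"), 45, 2.6),
]
ranking = rerank(candidates)
```

The real system combines four metrics and tunes the weights against human-generated norms; the sketch only shows how frequency and association evidence are blended into one ranking score.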
Nguyen, Trung T; Barber, Andrew R; Corbin, Kendall; Zhang, Wei
2017-01-01
The worldwide annual production of lobster was 165,367 tons valued at over $3.32 billion in 2004, but this figure rose to 304,000 tons in 2012. Over half the volume of the worldwide lobster production has been processed to meet the rising global demand for diversified lobster products. Lobster processing generates a large amount of by-products (heads, shells, livers, and eggs) which account for 50-70% of the starting material. Continued production of these lobster processing by-products (LPBs) without corresponding process development for efficient utilization has led to disposal issues associated with costs and pollution. This review presents promising opportunities to maximize the utilization of LPBs by economic recovery of their valuable components to produce high value-added products. More than 50,000 tons of LPBs are generated globally, which costs lobster processing companies upward of $7.5 million per year for disposal. This not only presents financial and environmental burdens to the lobster processors but also wastes a valuable bioresource. LPBs are rich in a range of high-value compounds such as proteins, chitin, lipids, minerals, and pigments. Extracts recovered from LPBs have been demonstrated to possess several functionalities and bioactivities, which are useful for numerous applications in water treatment, agriculture, food, nutraceutical, pharmaceutical products, and biomedicine. Although LPBs have been studied for recovery of valuable components, utilization of these materials for large-scale production is still very limited. Microwave, ultrasonic, and supercritical fluid extraction of lobster components were found to be promising techniques that could be used for large-scale production. LPBs are rich in high-value compounds that are currently being underutilized. These compounds can be extracted for use as functional ingredients, nutraceuticals, and pharmaceuticals in a wide range of commercial applications.
The efficient utilization of LPBs would not only generate significant economic benefits but also reduce the problems of waste management associated with the lobster industry. This comprehensive review highlights the availability of global LPBs, the key components in LPBs and their current applications, the limitations of the extraction techniques used, and the suggested emerging techniques which may be promising on an industrial scale for the maximized utilization of LPBs. Graphical abstract: Lobster processing by-products as a bioresource of several functional and bioactive compounds used in various value-added products.
2015-01-01
Background: Modern methods for mining biomolecular interactions from literature typically make predictions based solely on the immediate textual context, in effect a single sentence. No prior work has been published on extending this context to the information automatically gathered from the whole biomedical literature. Thus, our motivation for this study is to explore whether mutually supporting evidence, aggregated across several documents, can be utilized to improve the performance of state-of-the-art event extraction systems. In this paper, we describe our participation in the latest BioNLP Shared Task using the large-scale text mining resource EVEX. We participated in the Genia Event Extraction (GE) and Gene Regulation Network (GRN) tasks with two separate systems. In the GE task, we implemented a re-ranking approach to improve the precision of an existing event extraction system, incorporating features from the EVEX resource. In the GRN task, our system relied solely on the EVEX resource and utilized a rule-based conversion algorithm between the EVEX and GRN formats. Results: In the GE task, our re-ranking approach led to a modest performance increase and resulted in the first rank of the official Shared Task results with a 50.97% F-score. Additionally, in this paper we explore and evaluate the usage of distributed vector representations for this challenge. In the GRN task, we ranked fifth in the official results with strict/relaxed SER scores of 0.92/0.81, respectively. To try to improve upon these results, we implemented a novel machine learning based conversion system and benchmarked its performance against the original rule-based system. Conclusions: For the GRN task, we were able to produce a gene regulatory network from the EVEX data, warranting the use of such generic large-scale text mining data in network biology settings. A detailed performance and error analysis provides more insight into the relatively low recall rates.
In the GE task we demonstrate that both the re-ranking approach and the word vectors can provide slight performance improvement. A manual evaluation of the re-ranking results pinpoints some of the challenges faced in applying large-scale text mining knowledge to event extraction. PMID:26551766
Chen, Ruifeng; Zhu, Lijun; Lv, Lihuo; Yao, Su; Li, Bin; Qian, Junqing
2017-06-01
Optimization of compatible solute (ectoine) extraction and purification from Halomonas elongata cell fermentation was investigated in laboratory tests for a large-scale commercial production project. After culturing H. elongata cells in the developed medium at 28 °C for 23-30 h, we obtained an average ectoine yield and biomass of 15.9 g/L and 92.9 (OD 600 ), respectively. Cell lysis was performed with acid treatment at moderately high temperature (60-70 °C). The downstream processing operations were designed as follows: filtration, desalination, cation exchange, extraction of crude product, and three rounds of refining. Of these, cation exchange and crude-product extraction achieved high average recovery rates of 95 and 96%, whereas large loss rates of 19 and 15% were observed during filtration and desalination, respectively. Combined with the recovery of ectoine from the mother liquor of the three rounds of refining, the average overall yield (relative to the amount of ectoine synthesized in cells) and the purity of the final product were 43% and over 98%, respectively. However, the key factor affecting production efficiency was not yield but the time used in the extraction of crude product, involving the crystallization step from water, which took 24-72 h depending on the production scale. Although, with regard to productivity and simplicity on the laboratory scale, the method described here cannot compete with other investigations, in this study we acquired a higher purity of ectoine and provided downstream processes that are capable of operating on an industrial scale.
v3NLP Framework: Tools to Build Applications for Extracting Concepts from Clinical Text
Divita, Guy; Carter, Marjorie E.; Tran, Le-Thuy; Redd, Doug; Zeng, Qing T; Duvall, Scott; Samore, Matthew H.; Gundlapalli, Adi V.
2016-01-01
Introduction: Substantial amounts of clinically significant information are contained only within the narrative of the clinical notes in electronic medical records. The v3NLP Framework is a set of “best-of-breed” functionalities developed to transform this information into structured data for use in quality improvement, research, population health surveillance, and decision support. Background: MetaMap, cTAKES and similar well-known natural language processing (NLP) tools do not have sufficient scalability out of the box. The v3NLP Framework evolved out of the necessity to scale these tools up and provide a framework to customize and tune techniques that fit a variety of tasks, including document classification, tuned concept extraction for specific conditions, patient classification, and information retrieval. Innovation: Beyond scalability, several v3NLP Framework-developed projects have been efficacy tested and benchmarked. While the v3NLP Framework includes annotators, pipelines and applications, its functionalities enable developers to create novel annotators and to place annotators into pipelines and scaled applications. Discussion: The v3NLP Framework has been successfully utilized in many projects including general concept extraction, risk factors for homelessness among veterans, and identification of mentions of the presence of an indwelling urinary catheter. Projects as diverse as predicting colonization with methicillin-resistant Staphylococcus aureus and extracting references to military sexual trauma are being built using v3NLP Framework components. Conclusion: The v3NLP Framework is a set of functionalities and components that provide Java developers with the ability to create novel annotators and to place those annotators into pipelines and applications to extract concepts from clinical text. There are scale-up and scale-out functionalities to process large numbers of records. PMID:27683667
Magnetostructural coupling and magnetocaloric effect in Ni-Mn-Ga-Cu microwires
NASA Astrophysics Data System (ADS)
Zhang, Xuexi; Qian, Mingfang; Zhang, Zhe; Wei, Longsha; Geng, Lin; Sun, Jianfei
2016-02-01
Ni-Mn-Ga-X microwires were produced on a large scale by the melt-extraction technique. Their shape memory effect, superelasticity, and damping capacity have been demonstrated. Here, an excellent magnetocaloric effect was revealed in Ni-Mn-Ga-Cu microwires produced by melt extraction and subsequent annealing. The overlap of the martensitic and magnetic transformations, i.e., magnetostructural coupling, was achieved in the annealed microwires. The magnetostructural coupling and wide martensitic transformation temperature range contribute to a large magnetic entropy change of -8.3 J/kg K with a wide working temperature interval of ˜13 K under a magnetic field of 50 kOe. Accordingly, a high refrigeration capacity of ˜78 J/kg was produced in the annealed microwires.
Scalable clustering algorithms for continuous environmental flow cytometry.
Hyrkas, Jeremy; Clayton, Sophie; Ribalet, Francois; Halperin, Daniel; Armbrust, E Virginia; Howe, Bill
2016-02-01
Recent technological innovations in flow cytometry now allow oceanographers to collect high-frequency flow cytometry data from particles in aquatic environments on a scale far surpassing conventional flow cytometers. The SeaFlow cytometer continuously profiles microbial phytoplankton populations across thousands of kilometers of the surface ocean. The data streams produced by instruments such as SeaFlow challenge the traditional sample-by-sample approach in cytometric analysis and highlight the need for scalable clustering algorithms to extract population information from these large-scale, high-frequency flow cytometers. We explore how available algorithms commonly used for medical applications perform at classifying such large-scale environmental flow cytometry data. We apply large-scale Gaussian mixture models to massive datasets using Hadoop. This approach outperforms current state-of-the-art cytometry classification algorithms in accuracy and can be coupled with manual or automatic partitioning of data into homogeneous sections for further classification gains. We propose the Gaussian mixture model with partitioning approach for classification of large-scale, high-frequency flow cytometry data. Source code is available for download at https://github.com/jhyrkas/seaflow_cluster, implemented in Java for use with Hadoop. Contact: hyrkas@cs.washington.edu. Supplementary data are available at Bioinformatics online.
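The core technique can be shown in miniature (a self-contained sketch of expectation-maximization for a two-component one-dimensional Gaussian mixture, not the distributed Hadoop implementation used in the paper):

```python
import numpy as np

def fit_gmm_1d(x, n_iter=200):
    """EM for a two-component 1-D Gaussian mixture (illustrative only)."""
    mu = np.array([x.min(), x.max()], dtype=float)   # spread-out init
    var = np.array([x.var(), x.var()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        dens = pi / np.sqrt(2 * np.pi * var) * np.exp(
            -0.5 * (x[:, None] - mu) ** 2 / var
        )
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update mixture weights, means, and variances.
        nk = resp.sum(axis=0)
        pi = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-6
    return mu, var, pi

# Synthetic "two-population" data standing in for cytometry signals.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 0.5, 500), rng.normal(5.0, 0.5, 500)])
mu, var, pi = fit_gmm_1d(x)
```

Partitioning the stream into homogeneous sections, as the abstract describes, simply means running such a fit per section so that each mixture only has to describe locally coherent populations.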
Liu, Yuqiong; Du, Qingyun; Wang, Qi; Yu, Huanyun; Liu, Jianfeng; Tian, Yu; Chang, Chunying; Lei, Jing
2017-07-01
The causation between the bioavailability of heavy metals and environmental factors is generally obtained from field experiments at local scales and lacks sufficient evidence from large scales. Inferring causation between the bioavailability of heavy metals and environmental factors across large-scale regions is challenging, because the conventional correlation-based approaches used for causation assessment across large-scale regions can, at the expense of actual causation, result in spurious insights. In this study, a general approach framework, Intervention calculus when the Directed Acyclic Graph (DAG) is Absent (IDA) combined with the backdoor criterion (BC), was introduced to identify causation between the bioavailability of heavy metals and potential environmental factors across large-scale regions. We take the Pearl River Delta (PRD) in China as a case study. The causal structures and effects were identified based on the concentrations of heavy metals (Zn, As, Cu, Hg, Pb, Cr, Ni and Cd) in soil (0-20 cm depth) and vegetable (lettuce) and 40 environmental factors (soil properties, extractable heavy metals and weathering indices) in 94 samples across the PRD. Results show that the bioavailability of heavy metals (Cd, Zn, Cr, Ni and As) was causally influenced by soil properties and soil weathering factors, whereas no causal factor impacted the bioavailability of Cu, Hg and Pb. No latent factor was found between the bioavailability of heavy metals and environmental factors. The causation between the bioavailability of heavy metals and environmental factors in field experiments is consistent with that on a large scale. The IDA combined with the BC provides a powerful tool to identify causation between the bioavailability of heavy metals and environmental factors across large-scale regions. Causal inference in a large system with dynamic changes has great implications for system-based risk management.
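The role of the backdoor criterion can be sketched with a synthetic linear example (variable names and coefficients are illustrative assumptions, not data from the study, and this is not the IDA algorithm itself): adjusting for an observed confounder recovers a causal effect that a naive correlation-based regression overstates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20_000

# Synthetic ground truth: soil pH (confounder) drives both the
# extractable-metal level (treatment) and plant uptake (outcome).
ph = rng.normal(size=n)
metal = 0.8 * ph + rng.normal(size=n)
uptake = 0.5 * metal + 1.0 * ph + rng.normal(size=n)  # true effect: 0.5

# Naive regression of uptake on metal is biased by the open
# backdoor path metal <- ph -> uptake.
naive = np.polyfit(metal, uptake, 1)[0]

# Backdoor adjustment: include the confounder as a regressor.
X = np.column_stack([metal, ph, np.ones(n)])
adjusted = np.linalg.lstsq(X, uptake, rcond=None)[0][0]
```

Here `adjusted` lands near the true coefficient of 0.5 while `naive` is inflated toward 1, which is exactly the kind of spurious insight the abstract warns correlation-based approaches can produce.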
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hung, Cheng-Hung
The main objective of this project was to develop a low-cost integrated substrate for rigid OLED solid-state lighting produced at a manufacturing scale. The integrated substrates could include combinations of a soda lime glass substrate, a light extraction layer, and an anode layer (i.e., a Transparent Conductive Oxide, TCO). Over the 3+ year course of the project, the scope of work was revised to focus on the development of glass substrates with an internal light extraction (IEL) layer. A manufacturing-scale float glass on-line particle embedding process capable of producing an IEL glass substrate having a thickness of less than 1.7 mm and an area larger than 500 mm x 400 mm was demonstrated. Substrates measuring 470 mm x 370 mm were used in the OLED manufacturing process for fabricating OLED lighting panels in single-pixel devices as large as 120.5 mm x 120.5 mm. The measured light extraction efficiency (calculated as external quantum efficiency, EQE) for on-line produced IEL samples (>50%) met the project’s initial goal.
Canivez, Gary L; Watkins, Marley W
2010-12-01
The present study examined the factor structure of the Wechsler Adult Intelligence Scale--Fourth Edition (WAIS-IV; D. Wechsler, 2008a) standardization sample using exploratory factor analysis, multiple factor extraction criteria, and higher order exploratory factor analysis (J. Schmid & J. M. Leiman, 1957) not included in the WAIS-IV Technical and Interpretation Manual (D. Wechsler, 2008b). Results indicated that the WAIS-IV subtests were properly associated with the theoretically proposed first-order factors, but all but one factor-extraction criterion recommended extraction of one or two factors. Hierarchical exploratory analyses with the Schmid and Leiman procedure found that the second-order g factor accounted for large portions of total and common variance, whereas the four first-order factors accounted for small portions of total and common variance. It was concluded that the WAIS-IV provides strong measurement of general intelligence, and clinical interpretation should be primarily at that level.
Guevara Hidalgo, Esteban; Nemoto, Takahiro; Lecomte, Vivien
2017-06-01
Rare trajectories of stochastic systems are important to understand because of their potential impact. However, their properties are by definition difficult to sample directly. Population dynamics provides a numerical tool allowing their study, by means of simulating a large number of copies of the system, which are subjected to selection rules that favor the rare trajectories of interest. Such algorithms are plagued by finite simulation time and finite population size, effects that can render their use delicate. In this paper, we present a numerical approach which uses the finite-time and finite-size scalings of estimators of the large deviation functions associated with the distribution of rare trajectories. The method we propose allows one to extract the infinite-time and infinite-size limit of these estimators, which, as shown on the contact process, provides a significant improvement of the large deviation function estimators compared to the standard one.
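The extrapolation idea can be sketched as follows (assuming, purely for illustration, an estimator whose finite-population bias decays as 1/N; the true scaling forms are what the paper establishes):

```python
import numpy as np

def estimator(n_clones, psi_infinity=-1.0, bias=3.0):
    """Stand-in for a large deviation function estimator with a
    finite-size bias that vanishes as the population N grows."""
    return psi_infinity + bias / n_clones

# Measure the estimator at several population sizes.
sizes = np.array([100, 200, 400, 800, 1600])
values = np.array([estimator(n) for n in sizes])

# Fit psi(N) = psi_inf + b / N in the variable 1/N; the intercept of
# the fit is the extrapolated infinite-size limit.
slope, psi_inf = np.polyfit(1.0 / sizes, values, 1)
```

In practice the same fit is done jointly in 1/N and 1/t (simulation time), but the principle is the one shown: the intercept of the scaling fit, not any single finite-size measurement, is the quantity of interest.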
Tan, Zhi-Jian; Yang, Zi-Zhen; Yi, Yong-Jian; Wang, Hong-Ying; Zhou, Wan-Lai; Li, Fen-Fang; Wang, Chao-Yun
2016-08-01
In this study, enzyme-assisted three-phase partitioning (EATPP) was used to extract oil from flaxseed. The whole procedure is composed of two parts: the enzymolysis procedure, in which the flaxseed was hydrolyzed using an enzyme solution (influencing parameters such as the type and concentration of enzyme, temperature, and pH were optimized), and three-phase partitioning (TPP), which was conducted by adding salt and t-butanol to the crude flaxseed slurry, resulting in the extraction of flaxseed oil into the alcohol-rich upper phase. The concentration of t-butanol, the concentration of salt, and the temperature were optimized to maximize the extraction yield. Under the optimized conditions of a 49.29 % t-butanol concentration, 30.43 % ammonium sulfate concentration, and 35 °C extraction temperature, a maximum extraction yield of 71.68 % was obtained. This simple and effective EATPP method can be used to achieve high extraction yields and oil quality, and thus it holds potential for large-scale oil production.
Refining of metallurgical-grade silicon
NASA Technical Reports Server (NTRS)
Dietl, J.
1986-01-01
A basic requirement of large-scale solar cell fabrication is to provide low-cost base material. Unconventional refining of metallurgical-grade silicon represents one of the most promising ways of silicon meltstock processing. The refining concept is based on an optimized combination of metallurgical treatments. Commercially available crude silicon, in this sequence, requires a first pyrometallurgical step by slagging or, alternatively, solvent extraction by aluminum. After grinding and leaching, high purity is gained as an advanced stage of refinement. To reach solar-grade quality, a final pyrometallurgical step is needed: liquid-gas extraction.
Remote visual analysis of large turbulence databases at multiple scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulido, Jesus; Livescu, Daniel; Kanov, Kalin
2018-06-15
The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
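As a hypothetical sketch of the kind of wavelet decomposition such a toolset exposes, one level of a 1D Haar transform splits a signal into a coarse (averaged) part, usable for low-resolution remote previews, and a detail part needed only for full-accuracy analysis; the actual framework operates on 3D turbulence fields with production wavelet bases.

```python
# One level of a 1D Haar analysis/synthesis pair (illustration only; the
# framework in the paper uses wavelet compression on large 3D DNS fields).

def haar_step(signal):
    """One Haar analysis step: returns (approximation, detail) halves."""
    assert len(signal) % 2 == 0
    approx = [(signal[2*i] + signal[2*i+1]) / 2.0 for i in range(len(signal)//2)]
    detail = [(signal[2*i] - signal[2*i+1]) / 2.0 for i in range(len(signal)//2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact reconstruction from one analysis step."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([a + d, a - d])
    return out

sig = [4.0, 2.0, 5.0, 7.0]
a, d = haar_step(sig)
print(a, d)                       # [3.0, 6.0] [1.0, -1.0]
print(haar_inverse(a, d) == sig)  # True
```

Transmitting only the approximation halves at each level gives the multi-resolution previews; the details are fetched on demand for exact reconstruction.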
Collaboration in the Humanities, Arts and Social Sciences in Australia
ERIC Educational Resources Information Center
Haddow, Gaby; Xia, Jianhong; Willson, Michele
2017-01-01
This paper reports on the first large-scale quantitative investigation into collaboration, demonstrated in co-authorship, by Australian humanities, arts and social sciences (HASS) researchers. Web of Science data were extracted for Australian HASS publications, with a focus on the softer social sciences, over the period 2004-2013. The findings…
Stephanie J. Wessell-Kelly; Deanna H. Olson
2013-01-01
Increasing global demands on forest resources are driving large-scale shifts toward plantation forestry. Simultaneously balancing resource extraction and ecological sustainability objectives in plantation forests requires the incorporation of innovative silvicultural strategies such as leave islands (green-tree retention clusters). Our primary research goal was to...
Physics and biochemical engineering: 3
NASA Astrophysics Data System (ADS)
Fairbrother, Robert; Riddle, Wendy; Fairbrother, Neil
2006-09-01
Once an antibiotic has been produced on a large scale, as described in our preceding articles, it has to be extracted and purified. Filtration and centrifugation are the two main ways of doing this, and the design of industrial processing systems is governed by simple physics involving factors such as pressure, viscosity and rotational motion.
USDA-ARS?s Scientific Manuscript database
Both sugarcane (Saccharum officinarum) and sweet sorghum (Sorghum bicolor) crops are members of the grass (Poaceae) family, and consist of stalks rich in soluble sugars. The extracted juice from both of these crops contains insoluble starch, with much greater quantities occurring in sweet sorghum. ...
Evolving from bioinformatics in-the-small to bioinformatics in-the-large.
Parker, D Stott; Gorlick, Michael M; Lee, Christopher J
2003-01-01
We argue the significance of a fundamental shift in bioinformatics, from in-the-small to in-the-large. Adopting a large-scale perspective is a way to manage the problems endemic to the world of the small: constellations of incompatible tools for which the effort required to assemble an integrated system exceeds the perceived benefit of the integration. Where bioinformatics in-the-small is about data and tools, bioinformatics in-the-large is about metadata and dependencies. Dependencies represent the complexities of large-scale integration, including the requirements and assumptions governing the composition of tools. The popular make utility is a very effective system for defining and maintaining simple dependencies, and it offers a number of insights about the essence of bioinformatics in-the-large. Keeping an in-the-large perspective has been very useful to us in large bioinformatics projects. We give two fairly different examples and extract lessons from them showing how it has helped. These examples both suggest the benefit of explicitly defining and managing knowledge flows and knowledge maps (which represent metadata regarding types, flows, and dependencies), and also suggest approaches for developing bioinformatics database systems. Generally, we argue that large-scale engineering principles can be successfully adapted from disciplines such as software engineering and data management, and that having an in-the-large perspective will be a key advantage in the next phase of bioinformatics development.
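The make-style dependency idea the authors highlight can be sketched as a target-to-prerequisites mapping processed in topological order; the target names below are invented for illustration, not taken from the paper.

```python
# A minimal make-like dependency resolver: each target is visited only after
# all of its prerequisites, mirroring how make schedules rebuilds.

def build_order(deps):
    """Topologically sort a target -> prerequisites mapping (assumes no cycles)."""
    order, seen = [], set()

    def visit(target):
        if target in seen:
            return
        seen.add(target)
        for prereq in deps.get(target, []):
            visit(prereq)
        order.append(target)

    for target in deps:
        visit(target)
    return order

# Hypothetical pipeline: an alignment is built from raw sequences; a report
# needs both the alignment and statistics computed from it.
deps = {
    "report": ["alignment", "stats"],
    "alignment": ["sequences"],
    "stats": ["alignment"],
}
print(build_order(deps))  # ['sequences', 'alignment', 'stats', 'report']
```

In the in-the-large view, the interesting content is exactly this metadata: which artifacts exist and what each one assumes about the others.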
Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines
Mikut, Ralf
2017-01-01
Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprised of seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. 
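The core idea of encoding prior knowledge as fuzzy membership can be sketched as follows (the membership shape and the numeric values are assumptions for illustration, not the paper's parameters): each extracted object receives a degree of plausibility instead of a hard accept/reject decision.

```python
# Fuzzy evaluation of extracted content against prior knowledge: a
# trapezoidal membership function maps a measured property to [0, 1].

def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 outside (a, d), 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Hypothetical prior: plausible nucleus diameters in micrometers.
def membership(diameter):
    return trapezoid(diameter, 2.0, 5.0, 12.0, 20.0)

detections = [1.5, 6.0, 15.0]
print([round(membership(x), 3) for x in detections])  # [0.0, 1.0, 0.625]
```

Downstream operators can then weight or filter detections by this degree of uncertainty rather than discarding borderline cases outright.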
PMID:29095927
Remote Sensing Extraction of Stopes and Tailings Ponds in AN Ultra-Low Iron Mining Area
NASA Astrophysics Data System (ADS)
Ma, B.; Chen, Y.; Li, X.; Wu, L.
2018-04-01
With the development of economy, global demand for steel has accelerated since 2000, and thus mining activities of iron ore have become intensive accordingly. An ultra-low-grade iron has been extracted by open-pit mining and processed massively since 2001 in Kuancheng County, Hebei Province. There are large-scale stopes and tailings ponds in this area. It is important to extract their spatial distribution information for environmental protection and disaster prevention. A remote sensing method of extracting stopes and tailings ponds is studied based on spectral characteristics by use of Landsat 8 OLI imagery and ground spectral data. The overall accuracy of extraction is 95.06 %. In addition, tailings ponds are distinguished from stopes based on thermal characteristics by use of temperature image. The results could provide decision support for environmental protection, disaster prevention, and ecological restoration in the ultra-low-grade iron ore mining area.
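An overall extraction accuracy such as the reported 95.06 % is conventionally computed from a confusion matrix as the fraction of correctly classified samples; the sketch below illustrates the computation with invented counts, not the study's validation data.

```python
# Overall accuracy from a confusion matrix: trace divided by grand total.

def overall_accuracy(confusion):
    """confusion[i][j] = number of samples of true class i labelled class j."""
    correct = sum(confusion[i][i] for i in range(len(confusion)))
    total = sum(sum(row) for row in confusion)
    return correct / total

# Hypothetical classes: stope, tailings pond, background.
cm = [[180,   5,   3],
      [  4, 150,   6],
      [  2,   5, 320]]
print(round(overall_accuracy(cm) * 100, 2))  # 96.3
```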
Eyben, Florian; Weninger, Felix; Lehment, Nicolas; Schuller, Björn; Rigoll, Gerhard
2013-01-01
Without doubt general video and sound, as found in large multimedia archives, carry emotional information. Thus, audio and video retrieval by certain emotional categories or dimensions could play a central role for tomorrow's intelligent systems, enabling search for movies with a particular mood, computer aided scene and sound design in order to elicit certain emotions in the audience, etc. Yet, the lion's share of research in affective computing is exclusively focusing on signals conveyed by humans, such as affective speech. Uniting the fields of multimedia retrieval and affective computing is believed to lend to a multiplicity of interesting retrieval applications, and at the same time to benefit affective computing research, by moving its methodology "out of the lab" to real-world, diverse data. In this contribution, we address the problem of finding "disturbing" scenes in movies, a scenario that is highly relevant for computer-aided parental guidance. We apply large-scale segmental feature extraction combined with audio-visual classification to the particular task of detecting violence. Our system performs fully data-driven analysis including automatic segmentation. We evaluate the system in terms of mean average precision (MAP) on the official data set of the MediaEval 2012 evaluation campaign's Affect Task, which consists of 18 original Hollywood movies, achieving up to .398 MAP on unseen test data in full realism. An in-depth analysis of the worth of individual features with respect to the target class and the system errors is carried out and reveals the importance of peak-related audio feature extraction and low-level histogram-based video analysis.
PMID:24391704
NASA Astrophysics Data System (ADS)
D'Archivio, Angelo Antonio; Maggi, Maria Anna; Odoardi, Antonella; Santucci, Sandro; Passacantando, Maurizio
2018-02-01
Multi-walled carbon nanotubes (MWCNTs), because of their small size and large available surface area, are potentially efficient sorbents for the extraction of water solutes. Dispersing MWCNTs in an aqueous medium is suitable for adsorbing organic contaminants from small sample volumes, but the recovery of the suspended sorbent for subsequent re-use represents a critical step, which makes this method inapplicable in large-scale water-treatment technologies. To overcome this problem, we propose here MWCNTs grown on silicon supports and investigate, on a small-volume scale, their adsorption properties towards triazine herbicides dissolved in water. The adsorption efficiency of the supported MWCNTs has been tested on seven triazine herbicides, which are emerging water contaminants in Europe and the USA because of their massive use, persistence in soils, and potential risks for aquatic organisms and human health. The investigated compounds, in spite of their common molecular skeleton, cover a relatively large property range in terms of both solubility in water and hydrophilicity/hydrophobicity. The functionalisation of the MWCNTs by acidic oxidation, apart from increasing the wettability of the material, results in better adsorption performance. Increasing the functionalisation time from 17 to 60 h progressively increases the extraction of all seven pesticides and produces a moderate increase in selectivity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Seyedhosseini, Mojtaba; Kumar, Ritwik; Jurrus, Elizabeth R.
2011-10-01
Automated neural circuit reconstruction through electron microscopy (EM) images is a challenging problem. In this paper, we present a novel method that exploits multi-scale contextual information together with Radon-like features (RLF) to learn a series of discriminative models. The main idea is to build a framework which is capable of extracting information about cell membranes from a large contextual area of an EM image in a computationally efficient way. Toward this goal, we extract RLF that can be computed efficiently from the input image and generate a scale-space representation of the context images that are obtained at the output of each discriminative model in the series. Compared to a single-scale model, the use of a multi-scale representation of the context image gives the subsequent classifiers access to a larger contextual area in an effective way. Our strategy is general and independent of the classifier and has the potential to be used in any context based framework. We demonstrate that our method outperforms the state-of-the-art algorithms in detection of neuron membranes in EM images.
Optimal Information Extraction of Laser Scanning Dataset by Scale-Adaptive Reduction
NASA Astrophysics Data System (ADS)
Zang, Y.; Yang, B.
2018-04-01
3D laser scanning technology is widely used to collect the surface information of objects. For various applications, we need to extract a point cloud of good perceptual quality from the scanned points. To solve this problem, most existing methods extract important points at a fixed scale. However, the geometric features of a 3D object arise at various geometric scales. We propose a multi-scale construction method based on radial basis functions. For each scale, important points are extracted from the point cloud based on their importance. We apply the perception metric Just-Noticeable-Difference to measure the degradation of each geometric scale. Finally, scale-adaptive optimal information extraction is realized. Experiments are undertaken to evaluate the effectiveness of the proposed method, suggesting a reliable solution for the optimal information extraction of objects.
Medical image classification based on multi-scale non-negative sparse coding.
Zhang, Ruijie; Shen, Jian; Wei, Fushan; Li, Xiong; Sangaiah, Arun Kumar
2017-11-01
With the rapid development of modern medical imaging technology, medical image classification has become more and more important in medical diagnosis and clinical practice. Conventional medical image classification algorithms usually neglect the semantic gap between low-level features and high-level image semantics, which largely degrades classification performance. To solve this problem, we propose a multi-scale non-negative sparse coding based medical image classification algorithm. First, medical images are decomposed into multiple scale layers, so that diverse visual details can be extracted from the different scale layers. Second, for each scale layer, a non-negative sparse coding model with Fisher discriminative analysis is constructed to obtain a discriminative sparse representation of the medical images. Then, the obtained multi-scale non-negative sparse coding features are combined to form a multi-scale feature histogram as the final representation of a medical image. Finally, an SVM classifier is used to conduct the medical image classification. The experimental results demonstrate that our proposed algorithm can effectively utilize the multi-scale and contextual spatial information of medical images, reduce the semantic gap to a large degree, and improve medical image classification performance.
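The multi-scale feature-histogram construction can be sketched in miniature as follows. This is a simplification: the paper's per-scale features come from non-negative sparse coding with Fisher discriminative analysis, whereas plain intensity histograms are substituted here to keep the sketch self-contained.

```python
# Decompose an image into scale layers by repeated 2x2 averaging, compute a
# normalised histogram per scale, and concatenate into one descriptor.

def downsample(img):
    """Halve each dimension by 2x2 block averaging."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*i][2*j] + img[2*i][2*j+1] +
              img[2*i+1][2*j] + img[2*i+1][2*j+1]) / 4.0
             for j in range(w)] for i in range(h)]

def multiscale_histogram(img, levels=2, bins=4):
    """Concatenate per-scale intensity histograms (values assumed in [0, 1))."""
    feature = []
    for _ in range(levels):
        counts = [0] * bins
        for row in img:
            for v in row:
                counts[min(int(v * bins), bins - 1)] += 1
        total = sum(counts)
        feature.extend(c / total for c in counts)  # normalise each scale
        img = downsample(img)
    return feature

img = [[0.1, 0.9, 0.2, 0.8],
       [0.3, 0.7, 0.4, 0.6],
       [0.2, 0.2, 0.9, 0.9],
       [0.1, 0.1, 0.8, 0.8]]
desc = multiscale_histogram(img, levels=2)
print(len(desc))  # 8: 4 bins at each of 2 scales
```

The concatenated descriptor would then be fed to an SVM, as in the final step the abstract describes.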
Elberry, Ahmed A.; Al-Maghrabi, Jaudah; Abdel Sattar, Essam; Ghareib, Salah A.; Mosli, Hisham A.; Gabr, Salah A.
2014-01-01
Red onion scales (ROS) contain large amounts of flavonoids that are responsible for the reported antioxidant activity, immune enhancement, and anticancer property. Atypical prostatic hyperplasia (APH) was induced in adult castrated Wistar rats both by s.c. injection of testosterone (0.5 mg/rat/day) and by smearing citral on shaved skin once every 3 days for 30 days. Saw palmetto (100 mg/kg) as a positive control and ROS suspension at doses of 75, 150, and 300 mg/kg/day were given orally every day for 30 days. All medications were started 7 days after castration, along with testosterone and citral. The HPLC profile of the ROS methanolic extract displayed two major peaks identified as quercetin and quercetin-4′-β-O-D-glucoside. Histopathological examination of APH-induced prostatic rats revealed evidence of hyperplasia and inflammation with cellular proliferation and reduced apoptosis. Immunohistochemistry showed increased tissue expressions of IL-6, IL-8, TNF-α, IGF-1, and clusterin, while TGF-β1 was decreased, which correlates with the presence of inflammation. Both saw palmetto and ROS treatment ameliorated these changes. These ameliorative effects were more evident in the ROS groups and were dose dependent. In conclusion, the methanolic extract of ROS showed a protective effect in APH-induced rats that may be attributed to potential anti-inflammatory and immunomodulatory effects. PMID:24829522
Hussain, Khaja Amjad; Tarakji, Bassel; Kandy, Binu Purushothaman Panar; John, Jacob; Mathews, Jacob; Ramphul, Vandana; Divakar, Darshan Devang
2015-01-01
Use of plant extracts and phytochemicals with known antimicrobial properties may have great significance in therapeutic treatments. The aim was to assess the in vitro antimicrobial potential and determine the minimum inhibitory concentration (MIC) of Citrus sinensis peel extracts, with a view to finding a novel extract as a remedy for periodontal pathogens. Aqueous and ethanol (cold and hot) extracts prepared from the peel of Citrus sinensis were screened for in vitro antimicrobial activity against Aggregatibacter actinomycetemcomitans, Porphyromonas gingivalis and Prevotella intermedia, using the agar well diffusion method. The lowest inhibitory concentration of every extract, taken as the MIC, was determined for the test organisms. Confidence level and level of significance were set at 95% and 5%, respectively. Prevotella intermedia and Porphyromonas gingivalis were resistant to the aqueous extracts, while Aggregatibacter actinomycetemcomitans was inhibited only at very high concentrations. Hot ethanolic extracts showed a significantly larger zone of inhibition than cold ethanolic extracts. The MIC of hot and cold ethanolic extracts of Citrus sinensis peel ranged between 12-15 mg/ml against all three periodontal pathogens. Both ethanolic extracts were found effective and contain compounds with therapeutic potential. Nevertheless, clinical trials on the effect of these plants are essential before advocating large-scale therapy.
Energy transfer, pressure tensor, and heating of kinetic plasma
NASA Astrophysics Data System (ADS)
Yang, Yan; Matthaeus, William H.; Parashar, Tulasi N.; Haggerty, Colby C.; Roytershteyn, Vadim; Daughton, William; Wan, Minping; Shi, Yipeng; Chen, Shiyi
2017-07-01
Kinetic plasma turbulence cascade spans multiple scales, ranging from macroscopic fluid flow to sub-electron scales. Mechanisms that dissipate large-scale energy, terminate the inertial-range cascade, and convert kinetic energy into heat are hotly debated. Here, we revisit these puzzles using fully kinetic simulation. By performing scale-dependent spatial filtering on the Vlasov equation, we extract information at prescribed scales and introduce several energy transfer functions. This approach allows the highly inhomogeneous energy cascade to be quantified as it proceeds down to kinetic scales. The pressure work, −(P·∇)·u, can trigger a channel of energy conversion between fluid flow and random motions, which contains a collision-free generalization of the viscous dissipation in collisional fluids. Both the energy transfer and the pressure work are strongly correlated with velocity gradients.
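The scale-dependent spatial filtering step can be illustrated with a deliberately simplified sketch: a 1D periodic moving-average (box) filter splits a field u into a large-scale part and a sub-scale residual. The kernel choice and the 1D setting are assumptions for illustration; the paper filters full 3D kinetic fields.

```python
# Scale decomposition by low-pass filtering: u = u_bar + (u - u_bar),
# where u_bar retains only variations wider than the filter window.

def box_filter(u, width):
    """Periodic moving average over `width` neighbouring samples (odd width)."""
    n = len(u)
    half = width // 2
    return [sum(u[(i + k) % n] for k in range(-half, half + 1)) / (2 * half + 1)
            for i in range(n)]

u = [0.0, 1.0, 0.0, -1.0] * 4          # purely small-scale oscillation
u_bar = box_filter(u, 5)               # filtered (large-scale) field
residual = [a - b for a, b in zip(u, u_bar)]

# A filter wider than the oscillation period suppresses it almost entirely,
# so the energy ends up in the sub-scale residual.
print(max(abs(v) for v in u_bar) < max(abs(v) for v in residual))  # True
```

Scale-dependent transfer functions are then built from products of such filtered fields at different window widths.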
Plant extract: a promising biomatrix for ecofriendly, controlled synthesis of silver nanoparticles.
Borase, Hemant P; Salunke, Bipinchandra K; Salunkhe, Rahul B; Patil, Chandrashekhar D; Hallsworth, John E; Kim, Beom S; Patil, Satish V
2014-05-01
Use of plant extracts is found to be more advantageous than chemical, physical and microbial (bacterial, fungal, algal) methods for silver nanoparticle (AgNP) synthesis. In phytonanosynthesis, the biochemical diversity of plant extracts, non-pathogenicity, low cost and flexibility in reaction parameters account for a high rate of AgNP production with different shapes, sizes and applications. At the same time, care has to be taken to select a suitable phytofactory for AgNP synthesis based on certain parameters such as easy availability, large-scale nanosynthesis potential and the non-toxic nature of the plant extract. This review focuses on the synthesis of AgNPs with particular emphasis on biological synthesis using plant extracts. Some points are given on the selection of plant extracts for AgNP synthesis, along with case studies on AgNP synthesis using different plant extracts. Reaction parameters contributing to a higher yield of nanoparticles are presented. Synthesis mechanisms and an overview of present and future applications of plant-extract-synthesized AgNPs are also discussed. Limitations associated with the use of AgNPs are summarised in the present review.
Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua
2016-01-01
A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the power to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work tries to establish theoretically based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated the roles of the different forces (hydrophobic interaction, π-π stacking interaction, hydrogen-bonding interaction, electrostatic interaction) for the first time. Moreover, application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study render it promising for predictable and efficient pretreatment, under theoretically based experimental design, of trace analytes from environmental, biological and clinical samples. PMID:27924944
Research of seafloor topographic analyses for a staged mineral exploration
NASA Astrophysics Data System (ADS)
Ikeda, M.; Kadoshima, K.; Koizumi, Y.; Yamakawa, T.; Asakawa, E.; Sumi, T.; Kose, M.
2016-12-01
J-MARES (Research and Development Partnership for Next Generation Technology of Marine Resources Survey, JAPAN) has been designing a low-cost, high-efficiency exploration system for seafloor hydrothermal massive sulfide (SMS) deposits within the "Cross-ministerial Strategic Innovation Promotion Program (SIP)" granted by the Cabinet Office, Government of Japan, since 2014. We proposed a multi-stage approach, which proceeds from the regional-scale to the detailed-scale survey stage through a semi-detailed scale, focusing on prospective areas identified by seafloor topographic analyses. We applied this method to an area of more than 100 km x 100 km around the Okinawa Trough, including some well-known mineralized deposits. In the regional-scale survey, we assume survey areas of more than 100 km x 100 km, so the spatial resolution of the topography data should be coarser than 100 m. The 500 m resolution data, interpolated to 250 m resolution, were used for extracting depressions and performing principal component analysis (PCA) on the wavelengths obtained from frequency analysis. As a result, we successfully extracted areas having topographic features quite similar to those of well-known mineralized deposits. In the semi-detailed survey stage, we use topography data obtained by bathymetric survey with a multi-narrow-beam echo sounder. The 30 m resolution data were used for extracting depressions, relatively large mounds, hills, and lineaments interpreted as faults, and also for performing frequency analysis. As a result, the wavelengths of the principal components constituting the target area were extracted by PCA of the wavelengths obtained from frequency analysis. A color image was then composited using the second principal component (PC2) through the fourth principal component (PC4), in which the continuity of specific wavelengths was observed, consistent with the extracted lineaments.
In addition, well-known mineralized deposits were discriminated into the same clusters by clustering PC2 through PC4. We applied the results described above to a new area and successfully extracted a quite similar area in the vicinity of one of the well-known mineralized deposits. We are therefore going to verify the extracted areas using geophysical methods, such as vertical cable seismic and time-domain EM surveys, developed in this SIP project.
NASA Technical Reports Server (NTRS)
Anderson, F. S.; Drake, J. S.; Hamilton, V. E.
2005-01-01
We have developed a means of equalizing the atmospheric signature in Mars Odyssey Thermal Emission Imaging System (THEMIS) infrared data over regions with large topography such as the Valles Marineris (VM). This equalization allows for the analysis of compositional variations in regions that previously have been difficult to study because of the large differences in atmospheric path length that result from large changes in surface elevation. Specifically, our motivation for this study is to examine deposits that are small at the scales observable by the Thermal Emission Spectrometer (TES) onboard Mars Global Surveyor, but which are more readily resolved with THEMIS.
AutoBD: Automated Bi-Level Description for Scalable Fine-Grained Visual Categorization.
Yao, Hantao; Zhang, Shiliang; Yan, Chenggang; Zhang, Yongdong; Li, Jintao; Tian, Qi
Compared with traditional image classification, fine-grained visual categorization is a more challenging task, because it aims to classify objects belonging to the same species, e.g., to distinguish hundreds of birds or cars. In the past several years, researchers have made many achievements on this topic. However, most of them depend heavily on artificial annotations, e.g., bounding boxes, part annotations, and so on. The requirement of artificial annotations largely hinders scalability and application. Motivated to remove such dependence, this paper proposes a robust and discriminative visual description named Automated Bi-level Description (AutoBD). "Bi-level" denotes two complementary part-level and object-level visual descriptions, respectively. AutoBD is "automated" because it only requires the image-level labels of training images and does not need any annotations for testing images. Compared with part annotations labeled by humans, image-level labels can be easily acquired, which makes AutoBD suitable for large-scale visual categorization. Specifically, the part-level description is extracted by identifying the local region that saliently represents the visual distinctiveness. The object-level description is extracted from object bounding boxes generated with a co-localization algorithm. Although it uses only image-level labels, AutoBD outperforms recent studies on two public benchmarks, i.e., classification accuracy reaches 81.6% on CUB-200-2011 and 88.9% on Car-196, respectively. On the large-scale Birdsnap data set, AutoBD achieves an accuracy of 68%, which is currently the best performance to the best of our knowledge.
On Feature Extraction from Large Scale Linear LiDAR Data
NASA Astrophysics Data System (ADS)
Acharjee, Partha Pratim
Airborne light detection and ranging (LiDAR) can generate co-registered elevation and intensity maps over large terrains. The co-registered 3D map and intensity information can be used efficiently for different feature extraction applications. In this dissertation, we developed two feature extraction algorithms and demonstrated their use in practical applications. One algorithm maps still and flowing waterbodies, and the other extracts building features and estimates solar potential on rooftops and facades. Remote sensing capabilities, the distinguishing characteristics of laser returns from water surfaces, and specific data collection procedures give LiDAR data an edge in this application domain. Furthermore, water surface mapping solutions must work on extremely large datasets, from a thousand square miles to hundreds of thousands of square miles. National and state-wide map generation and updating, and hydro-flattening of LiDAR data for many other applications, are two leading needs of water surface mapping. These call for as much automation as possible. Researchers have developed many semi-automated algorithms using multiple semi-automated tools and human intervention. This work describes a consolidated algorithm and toolbox developed for large-scale, automated water surface mapping. Geometric features such as the flatness of the water surface and high elevation change at the water-land interface, and optical properties such as dropouts caused by specular reflection and bimodal intensity distributions, were some of the linear LiDAR features exploited for water surface mapping. Large-scale data handling capabilities are incorporated through automated and intelligent windowing, by resolving boundary issues and integrating all results into a single output. The whole algorithm is implemented as an ArcGIS toolbox using Python libraries.
Testing and validation were performed on large datasets to determine the effectiveness of the toolbox, and results are presented. Significant power demand is located in urban areas, where, theoretically, a large amount of building surface area is also available for solar panel installation. Therefore, property owners and power generation companies can benefit from a citywide solar potential map, which can provide the estimated available annual solar energy at a given location. An efficient solar potential measurement is a prerequisite for an effective solar energy system in an urban area. In addition, calculating solar potential from rooftops and building facades could open up a wide variety of options for solar panel installation. However, complex urban scenes make it hard to estimate solar potential, partly because of shadows cast by the buildings. LiDAR-based 3D city models could be the right technology for solar potential mapping. However, most current LiDAR-based local solar potential assessment algorithms address only rooftop calculations, whereas building facades can contribute a significant amount of viable surface area for solar panel installation. In this paper, we introduce a new algorithm to calculate the solar potential of both rooftops and building facades. Solar potential received by the rooftops and facades over the year is also investigated in the test area.
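The automated windowing idea from the water-mapping toolbox above can be sketched in a few lines: split the full extent into overlapping tiles so each fits in memory, then process tiles independently and reconcile the overlaps. The tile size, overlap, and bounds convention below are illustrative assumptions, not the toolbox's actual parameters.

```python
def make_windows(width, height, tile=1000, overlap=50):
    """Yield (x0, y0, x1, y1) pixel bounds of overlapping windows covering the extent."""
    step = tile - overlap
    for y0 in range(0, height, step):
        for x0 in range(0, width, step):
            yield (x0, y0, min(x0 + tile, width), min(y0 + tile, height))

# e.g. a 2500 x 1200 pixel raster
windows = list(make_windows(2500, 1200))
```

Each window would then be classified independently, with the overlap letting features that cross window edges be stitched into a single output without seams.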
Ultrasound-assisted extraction of flaxseed oil using immobilized enzymes.
Long, Jing-jing; Fu, Yu-jie; Zu, Yuan-gang; Li, Ji; Wang, Wei; Gu, Cheng-bo; Luo, Meng
2011-11-01
An aqueous enzymatic process assisted by ultrasound extraction (AEP-UE) was applied to the extraction of oil from flaxseed (Linum usitatissimum L.). The highest oil recovery of 68.1% was obtained when ground flaxseed was incubated with 130 U/g of cellulase, pectinase, and hemicellulase for 12 h at 45°C and pH 5.0. The IC(50) values of oil obtained by AEP-UE and organic solvent extraction (OSE), as measured by the DPPH radical scavenging assay, were 2.27 mg/mL and 3.31 mg/mL, respectively. The AEP-UE-derived oil had a 1.5% higher content of unsaturated fatty acids than the OSE-derived oil. AEP-UE is therefore a promising environmentally friendly method for large-scale preparation of flaxseed oil. Copyright © 2011 Elsevier Ltd. All rights reserved.
Rey-Villamizar, Nicolas; Somasundar, Vinay; Megjhani, Murad; Xu, Yan; Lu, Yanbin; Padmanabhan, Raghav; Trett, Kristen; Shain, William; Roysam, Badri
2014-01-01
In this article, we describe the use of Python for large-scale automated server-based bio-image analysis in FARSIGHT, a free and open-source toolkit of image analysis methods for quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes, including confocal, multi-spectral, multi-photon, and time-lapse systems. The core FARSIGHT modules for image segmentation, feature extraction, tracking, and machine learning are written in C++, leveraging widely used libraries including ITK, VTK, Boost, and Qt. For solving complex image analysis tasks, these modules must be combined into scripts using Python. As a concrete example, we consider the problem of analyzing 3-D multi-spectral images of brain tissue surrounding implanted neuroprosthetic devices, acquired using high-throughput multi-spectral spinning disk step-and-repeat confocal microscopy. The resulting images typically contain 5 fluorescent channels. Each channel consists of 6000 × 10,000 × 500 voxels with 16 bits/voxel, implying image sizes exceeding 250 GB. These images must be mosaicked, pre-processed to overcome imaging artifacts, and segmented to enable cellular-scale feature extraction. The features are used to identify cell types, and perform large-scale analysis for identifying spatial distributions of specific cell types relative to the device. Python was used to build a server-based script (Dell 910 PowerEdge servers with 4 sockets/server with 10 cores each, 2 threads per core and 1TB of RAM running on Red Hat Enterprise Linux linked to a RAID 5 SAN) capable of routinely handling image datasets at this scale and performing all these processing steps in a collaborative multi-user multi-platform environment. Our Python script enables efficient data storage and movement between computers and storage servers, logs all the processing steps, and performs full multi-threaded execution of all codes, including open and closed-source third party libraries.
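A minimal sketch of the server-based batch pattern described above, with hypothetical step functions standing in for FARSIGHT's C++ modules: each channel is processed in its own thread and every step is logged, mirroring the multi-threaded, fully logged execution the article reports.

```python
import logging
from concurrent.futures import ThreadPoolExecutor

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def process_channel(channel):
    # stand-in for the real steps: mosaicking -> artifact correction ->
    # segmentation -> feature extraction (FARSIGHT's C++ modules)
    log.info("processing channel %s", channel)
    return {"channel": channel, "status": "done"}

channels = ["ch%d" % i for i in range(5)]  # the datasets have 5 fluorescent channels
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_channel, channels))
```

In the real system each worker would shell out to a compiled module and stream the 250 GB datasets from the storage server; the orchestration, logging, and threading layer is what Python provides.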
Global Distribution of Aerosols Over the Open Ocean as Derived from the Coastal Zone Color Scanner
NASA Technical Reports Server (NTRS)
Stegmann, P. M.; Tindale, N. W.
1999-01-01
Climatological maps of monthly mean aerosol radiance levels derived from the coastal zone color scanner (CZCS) were constructed for the world's ocean basins. This is the first study to use the 7.5-year CZCS data set to examine the distribution and seasonality of aerosols over the open ocean on a global scale. Examination of our satellite images found the most prominent large-scale patch of elevated aerosol radiances in each month off the coast of northwest Africa. The well-known, large-scale plumes of elevated aerosol levels in the Arabian Sea, the northwest Pacific, and off the east coast of North America were also successfully captured. Radiance data were extracted from 13 major open-ocean zones, ranging from the subpolar to equatorial regions. Results from these extractions revealed the aerosol load in both subpolar and subtropical zones to be higher in the Northern Hemisphere than in the Southern Hemisphere. Aerosol radiances in the subtropics of both hemispheres were about 2 times higher in summer than in winter. In subpolar regions, aerosol radiances in late spring/early summer were almost 3 times those observed in winter. In general, the aerosol signal was higher during the warmer months and lower during the cooler months, irrespective of location. A comparison of our mean monthly aerosol radiance maps with mean monthly chlorophyll maps (also from CZCS) showed similar seasonality between aerosol and chlorophyll levels in the subpolar zones of both hemispheres, i.e., high levels in summer, low levels in winter. In the subtropics of both hemispheres, however, chlorophyll levels were higher in winter months, which coincided with a depressed aerosol signal.
Our results indicate that the near-IR channel on ocean color sensors can be used to successfully capture well-known, large-scale aerosol plumes on a global scale and that future ocean color sensors may provide a platform for long-term synoptic studies of combined aerosol-phytoplankton productivity interactions.
An Open-Source Galaxy Redshift Survey Simulator for next-generation Large Scale Structure Surveys
NASA Astrophysics Data System (ADS)
Seljak, Uros
Galaxy redshift surveys produce three-dimensional maps of the galaxy distribution. On large scales these maps trace the underlying matter fluctuations in a relatively simple manner, so that the properties of the primordial fluctuations along with the overall expansion history and growth of perturbations can be extracted. The BAO standard ruler method to measure the expansion history of the universe using galaxy redshift surveys is thought to be robust to observational artifacts and understood theoretically with high precision. These same surveys can offer a host of additional information, including a measurement of the growth rate of large scale structure through redshift space distortions, the possibility of measuring the sum of neutrino masses, tighter constraints on the expansion history through the Alcock-Paczynski effect, and constraints on the scale-dependence and non-Gaussianity of the primordial fluctuations. Extracting this broadband clustering information hinges on both our ability to minimize and subtract observational systematics to the observed galaxy power spectrum, and our ability to model the broadband behavior of the observed galaxy power spectrum with exquisite precision. Rapid development on both fronts is required to capitalize on WFIRST's data set. We propose to develop an open-source computational toolbox that will propel development in both areas by connecting large scale structure modeling and instrument and survey modeling with the statistical inference process. We will use the proposed simulator to both tailor perturbation theory and fully non-linear models of the broadband clustering of WFIRST galaxies and discover novel observables in the non-linear regime that are robust to observational systematics and able to distinguish between a wide range of spatial and dynamic biasing models for the WFIRST galaxy redshift survey sources. 
We have demonstrated the utility of this approach in a pilot study of the SDSS-III BOSS galaxies, in which we improved the redshift space distortion growth rate measurement precision by a factor of 2.5 using customized clustering statistics in the non-linear regime that were immunized against observational systematics. We look forward to addressing the unique challenges of modeling and empirically characterizing the WFIRST galaxies and observational systematics.
Developing Data Systems To Support the Analysis and Development of Large-Scale, On-Line Assessment.
ERIC Educational Resources Information Center
Yu, Chong Ho
Today many data warehousing systems are data rich, but information poor. Extracting useful information from an ocean of data to support administrative, policy, and instructional decisions becomes a major challenge to both database designers and measurement specialists. This paper focuses on the development of a data processing system that…
A Circuit Extraction System and Graphical Display for VLSI (Very Large Scale Integrated) Design.
1989-12-01
understandable as a net-list. The file contains information on the different physical layers of a polysilicon chip, not how these layers combine to form…
Applications of artificial intelligence systems in the analysis of epidemiological data.
Flouris, Andreas D; Duffy, Jack
2006-01-01
A brief review of the germane literature suggests that the use of artificial intelligence (AI) statistical algorithms in epidemiology has been limited. We discuss the advantages and disadvantages of using AI systems in large-scale sets of epidemiological data to extract inherent, formerly unidentified, and potentially valuable patterns that human-driven deductive models may miss.
Multiresolution persistent homology for excessively large biomolecular datasets
NASA Astrophysics Data System (ADS)
Xia, Kelin; Zhao, Zhixiong; Wei, Guo-Wei
2015-10-01
Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large-scale datasets with appropriate resolution. We utilize the flexibility-rigidity index to assess the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed, which would otherwise be inaccessible to the normal point cloud method and unreliable by using coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to protein domain classification, which is, to our knowledge, the first time persistent homology has been used for practical protein domain analysis. The proposed multiresolution topological method has potential applications to arbitrary data sets, such as social networks, biological networks, and graphs.
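The rigidity-density construction can be illustrated with a toy sketch, assuming the common FRI-style Gaussian kernel; the kernel width eta is the resolution knob the abstract refers to, and the atom coordinates below are made up.

```python
import math

def rigidity_density(point, atoms, eta=1.0, kappa=2.0):
    """FRI-style density: sum of kernels exp(-(d/eta)^kappa) over atom positions."""
    return sum(math.exp(-((math.dist(point, a) / eta) ** kappa)) for a in atoms)

atoms = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)]          # two toy "atoms"
midpoint = (1.5, 0.0, 0.0)
fine = rigidity_density(midpoint, atoms, eta=0.5)    # fine resolution: atoms stay resolved
coarse = rigidity_density(midpoint, atoms, eta=3.0)  # coarse resolution: atoms merge into one blob
```

At small eta the density between the atoms is nearly zero (two separate components in the filtration); at large eta the gap fills in and the pair appears as a single large-scale feature, which is exactly the resolution matching described above.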
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Linfeng
A literature survey has been conducted to collect information on international R&D activities in the extraction of uranium from seawater for the period from the 1960s to 2010. The reported activities, on both laboratory-scale bench experiments and large-scale marine experiments, are summarized by country/region in this report. Among all countries where such activities have been reported, Japan has carried out the most advanced large-scale marine experiments with the amidoxime-based system, and achieved a collection efficiency (1.5 g-U/kg-adsorbent for 30 days of soaking in the ocean) that could justify the development of industrial-scale marine systems to produce uranium from seawater at a price competitive with conventional uranium resources. R&D opportunities are discussed for improving the system performance (selectivity for uranium, loading capacity, chemical stability and mechanical durability in the sorption-elution cycle, and sorption kinetics) and making the collection of uranium from seawater more economically competitive.
Weak Lensing by Large-Scale Structure: A Dark Matter Halo Approach.
Cooray; Hu; Miralda-Escudé
2000-05-20
Weak gravitational lensing observations probe the spectrum and evolution of density fluctuations and the cosmological parameters that govern them, but they are currently limited to small fields and subject to selection biases. We show how the expected signal from large-scale structure arises from the contributions from and correlations between individual halos. We determine the convergence power spectrum as a function of the maximum halo mass and so provide the means to interpret results from surveys that lack high-mass halos either through selection criteria or small fields. Since shot noise from rare massive halos is mainly responsible for the sample variance below 10′, our method should aid our ability to extract cosmological information from small fields.
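In schematic halo-model notation (standard in the literature, not quoted from the paper), the decomposition into one-halo and two-halo contributions and the dependence on the maximum halo mass read:

```latex
P_\kappa(\ell) = P^{1h}_\kappa(\ell) + P^{2h}_\kappa(\ell),
\qquad
P^{1h}_\kappa(\ell) = \int dz\,\frac{dV}{dz}
\int_{0}^{M_{\max}} dM\,\frac{dn}{dM}\,
\bigl|\kappa_\ell(M, z)\bigr|^{2},
```

where dn/dM is the halo mass function, kappa_ell(M, z) is the convergence profile of a halo of mass M at redshift z, and truncating the mass integral at M_max models a survey that lacks high-mass halos through selection criteria or small fields.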
Batalle, Dafnis; Muñoz-Moreno, Emma; Figueras, Francesc; Bargallo, Nuria; Eixarch, Elisenda; Gratacos, Eduard
2013-12-01
Obtaining individual biomarkers for the prediction of altered neurological outcome is a challenge of modern medicine and neuroscience. Connectomics based on magnetic resonance imaging (MRI) is a good candidate for exhaustively extracting information from MRI by integrating it into a few network features that can be used as individual biomarkers of neurological outcome. However, this approach typically requires diffusion and/or functional MRI to extract individual brain networks, which involve long acquisition times and extreme sensitivity to motion artifacts, critical problems when scanning fetuses and infants. Extraction of individual networks based on the morphological similarity of gray matter is a new approach that benefits from the power of graph theory analysis to describe gray matter morphology as a large-scale morphological network from a typical clinical anatomic acquisition such as T1-weighted MRI. In the present paper we propose a methodology to normalize these large-scale morphological networks to a brain network of standardized size based on a parcellation scheme. The proposed methodology was applied to reconstruct individual brain networks of 63 one-year-old infants, 41 infants with intrauterine growth restriction (IUGR) and 22 controls, showing altered network features in the IUGR group and their association with neurodevelopmental outcome at two years of age, assessed with the Bayley Scales of Infant and Toddler Development, third edition, by means of ordinal regression analysis of the network features. Although it must be more widely assessed, this methodology is a good candidate for developing biomarkers of altered neurodevelopment in the pediatric population. © 2013 Elsevier Inc. All rights reserved.
Mathieu, Renaud; Aryal, Jagannath; Chong, Albert K
2007-11-20
Effective assessment of biodiversity in cities requires detailed vegetation maps. To date, most remote sensing of urban vegetation has focused on thematically coarse land cover products. Detailed habitat maps are created by manual interpretation of aerial photographs, but this is time consuming and costly at large scale. To address this issue, we tested the effectiveness of object-based classifications that use automated image segmentation to extract meaningful ground features from imagery. We applied these techniques to very high resolution multispectral Ikonos images to produce vegetation community maps in Dunedin City, New Zealand. An Ikonos image was orthorectified and a multi-scale segmentation algorithm used to produce a hierarchical network of image objects. The upper level included four coarse strata: industrial/commercial (commercial buildings), residential (houses and backyard private gardens), vegetation (vegetation patches larger than 0.8/1 ha), and water. We focused on the vegetation stratum, which was segmented at a more detailed level to extract and classify fifteen classes of vegetation communities. The first classification yielded a moderate overall classification accuracy (64%, κ = 0.52), which led us to consider a simplified classification with ten vegetation classes. The overall classification accuracy of the simplified classification was 77%, with a κ value close to the excellent range (κ = 0.74). These results compared favourably with similar studies in other environments. We conclude that this approach does not provide maps as detailed as those produced by manually interpreting aerial photographs, but it can still extract ecologically significant classes. It is an efficient way to generate accurate and detailed maps in significantly shorter time. The final map accuracy could be improved by integrating segmentation, automated and manual classification in the mapping process, especially when considering important vegetation classes with limited spectral contrast.
ACME, a GIS tool for Automated Cirque Metric Extraction
NASA Astrophysics Data System (ADS)
Spagnolo, Matteo; Pellitero, Ramon; Barr, Iestyn D.; Ely, Jeremy C.; Pellicer, Xavier M.; Rea, Brice R.
2017-02-01
Regional scale studies of glacial cirque metrics provide key insights on the (palaeo) environment related to the formation of these erosional landforms. The growing availability of high resolution terrain models means that more glacial cirques can be identified and mapped in the future. However, the extraction of their metrics still largely relies on time-consuming manual techniques or on combinations of more or less obsolete GIS tools. In this paper, a newly coded toolbox is provided for the automated, and comparatively quick, extraction of 16 key glacial cirque metrics, including length, width, circularity, planar and 3D area, elevation, slope, aspect, plan closure and hypsometry. The set of tools, named ACME (Automated Cirque Metric Extraction), is coded in Python, runs in one of the most commonly used GIS packages (ArcGIS) and has a user-friendly interface. A polygon layer of mapped cirques is required for all metrics, while a Digital Terrain Model and a point layer of cirque threshold midpoints are needed to run some of the tools. Results from ACME are comparable to those from other techniques and can be obtained rapidly, allowing large cirque datasets to be analysed and potentially important regional trends highlighted.
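As an illustration of the kind of planform metric ACME extracts from a cirque polygon, here is a minimal pure-Python circularity computation using one common definition, 4πA/P² (ACME's exact formulation may differ); the polygon is a toy outline, not a mapped cirque.

```python
import math

def shoelace_area(pts):
    """Planar polygon area via the shoelace formula."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1] - pts[(i + 1) % n][0] * pts[i][1]
            for i in range(n))
    return abs(s) / 2.0

def perimeter(pts):
    n = len(pts)
    return sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

def circularity(pts):
    """4*pi*A / P^2: equals 1 for a circle, smaller for elongated outlines."""
    p = perimeter(pts)
    return 4.0 * math.pi * shoelace_area(pts) / (p * p)

square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```

For the unit square this gives π/4 ≈ 0.785; an elongated rectangle of the same height scores lower, which is the property that makes circularity useful for separating compact cirques from elongated troughs.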
Jentzer, Jean-Baptiste; Alignan, Marion; Vaca-Garcia, Carlos; Rigal, Luc; Vilarem, Gérard
2015-01-01
Following the approval of steviol glycosides as a food additive in Europe in December 2011, large-scale stevia cultivation will have to be developed within the EU. Thus there is a need to increase the efficiency of stevia evaluation through germplasm enhancement and agronomic improvement programs. To address the need for faster and reproducible sample throughput, conditions for automated extraction of dried stevia leaves using Accelerated Solvent Extraction were optimised. A response surface methodology was used to investigate the influence of three factors: extraction temperature, static time and cycle number on the stevioside and rebaudioside A extraction yields. The model showed that all the factors had an individual influence on the yield. Optimum extraction conditions were set at 100 °C, 4 min and 1 cycle, which yielded 91.8% ± 3.4% of total extractable steviol glycosides analysed. An additional optimisation was achieved by reducing the grind size of the leaves giving a final yield of 100.8% ± 3.3%. Copyright © 2014 Elsevier Ltd. All rights reserved.
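The response-surface step can be sketched as follows, assuming NumPy is available: fit a full second-order polynomial in coded factor levels by least squares, as in a small RSM/Box-Behnken analysis. The design points and yields below are synthetic, not the paper's data.

```python
import numpy as np

def quad_design(X):
    """Full second-order model in two coded factors: 1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

# 3x3 factorial in coded levels (-1, 0, +1), as in a small response-surface design
levels = [-1.0, 0.0, 1.0]
X = np.array([[a, b] for a in levels for b in levels])

# synthetic yields generated from a known surface, so the fit can be checked exactly
true_beta = np.array([90.0, 2.5, 1.0, -0.8, -3.0, -1.5])
y = quad_design(X) @ true_beta

beta, *_ = np.linalg.lstsq(quad_design(X), y, rcond=None)
optimum = X[np.argmax(y)]   # best design point under this surface
```

With a third factor (cycle number in the paper) the design matrix simply gains the corresponding linear, quadratic, and interaction columns; the fitted surface is then used to locate the optimum extraction conditions.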
Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development
NASA Technical Reports Server (NTRS)
Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dunner, R.; Essinger-Hileman, T.; Eimer, J.;
2015-01-01
The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe approximately 70% of the sky. A variable-delay polarization modulator provides modulation of the polarization at approximately 10 Hz to suppress the 1/f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that spans both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.
Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development
NASA Astrophysics Data System (ADS)
Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dünner, R.; Essinger-Hileman, T.; Eimer, J.; Fluxa, P.; Gothe, D.; Halpern, M.; Harrington, K.; Hilton, G.; Hinshaw, G.; Hubmayr, J.; Iuliano, J.; Marriage, T. A.; Miller, N.; Moseley, S. H.; Mumby, G.; Petroff, M.; Reintsema, C.; Rostem, K.; U-Yen, K.; Watts, D.; Wagner, E.; Wollack, E. J.; Xu, Z.; Zeng, L.
2016-08-01
The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe ~70% of the sky. A variable-delay polarization modulator provides modulation of the polarization at ~10 Hz to suppress the 1/f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that spans both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.
Xu, Wei; Yan, Xiuhua; Shao, Rong; Chen, Ligen; Ke, Zengguang
Castor cake is the residue of castor oil production; it contains many active components, the major one being ricinine. In this study, the extraction of ricinine from castor cake using ultrasonic-microwave synergistic extraction (UMSE) was optimized for high yield and purity using a Box-Behnken design (BBD) response surface methodology. The optimal extraction conditions were: ultrasound power 342 W, extraction time 5 min, microwave power 395 W, and a non-significant liquid/solid ratio of 1:10. The crude extract was recrystallized from ethanol. As a result, the maximum yield of ricinine was approximately 67.52%. The purity of ricinine was 99.39%, as determined by high performance liquid chromatography (HPLC). Additionally, the structure of the purified ricinine was confirmed by Fourier transform infrared (FTIR) spectroscopy and liquid chromatography-mass spectrometry (LC-MS). Scanning electron microscopy (SEM) was used to characterize the prismatic crystal morphology of ricinine. The results demonstrate that the present method combines the advantages of ultrasonic and microwave extraction, saving time while giving a high extraction yield. Our results offer a suitable method for large-scale isolation of ricinine.
Experimental quantification of nonlinear time scales in inertial wave rotating turbulence
NASA Astrophysics Data System (ADS)
Yarom, Ehud; Salhov, Alon; Sharon, Eran
2017-12-01
We study nonlinearities of inertial waves in rotating turbulence. At small Rossby numbers the kinetic energy in the system is contained in helical inertial waves with time-dependent amplitudes. In this regime the time scales of amplitude variation are slow compared to wave periods, and the spectrum is concentrated along the dispersion relation of the waves. A nonlinear time scale was extracted from the width of the spectrum, which reflects the intensity of nonlinear wave interactions. This nonlinear time scale is found to be proportional to (Uk)^-1, where k is the wave vector magnitude and U is the root-mean-square horizontal velocity, which is dominated by large scales. This correlation, which indicates the existence of turbulence in which inertial waves undergo weak nonlinear interactions, persists only for small Rossby numbers.
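In schematic form (notation assumed, not the paper's), the measurement reads the nonlinear time scale off the spectral width at each wave vector and finds the sweeping-type scaling reported above:

```latex
\tau_{NL}(\mathbf{k}) \sim \frac{1}{\Delta\omega(\mathbf{k})},
\qquad
\tau_{NL} \propto (U k)^{-1},
```

where Delta-omega is the width of the energy spectrum around the inertial-wave dispersion relation, k is the wave vector magnitude, and U is the rms horizontal velocity. Weak nonlinearity corresponds to tau_NL being long compared to the wave period.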
Life cycle environmental impacts of wastewater-based algal biofuels.
Mu, Dongyan; Min, Min; Krohn, Brian; Mullins, Kimberley A; Ruan, Roger; Hill, Jason
2014-10-07
Recent research has proposed integrating wastewater treatment with algae cultivation as a way of producing algal biofuels at a commercial scale more sustainably. This study evaluates the environmental performance of wastewater-based algal biofuels with a well-to-wheel life cycle assessment (LCA). Production pathways examined include different nutrient sources (municipal wastewater influent to the activated sludge process, centrate from the sludge drying process, swine manure, and freshwater with synthetic fertilizers) combined with emerging biomass conversion technologies (microwave pyrolysis, combustion, wet lipid extraction, and hydrothermal liquefaction). Results show that the environmental performance of wastewater-based algal biofuels is generally better than freshwater-based algal biofuels, but depends on the characteristics of the wastewater and the conversion technologies. Of 16 pathways compared, only the centrate cultivation with wet lipid extraction pathway and the centrate cultivation with combustion pathway have lower impacts than petroleum diesel in all environmental categories examined (fossil fuel use, greenhouse gas emissions, eutrophication potential, and consumptive water use). The potential for large-scale implementation of centrate-based algal biofuel, however, is limited by availability of centrate. Thus, it is unlikely that algal biofuels can provide a large-scale and environmentally preferable alternative to petroleum transportation fuels without considerable improvement in current production technologies. Additionally, the cobenefit of wastewater-based algal biofuel production as an alternate means of treating various wastewaters should be further explored.
Large Scale EOF Analysis of Climate Data
NASA Astrophysics Data System (ADS)
Prabhat, M.; Gittens, A.; Kashinath, K.; Cavanaugh, N. R.; Mahoney, M.
2016-12-01
We present a distributed approach to extracting EOFs from 3D climate data. We implement the method in Apache Spark and process multi-TB datasets on O(1000-10,000) cores. We apply this method to latitude-weighted ocean temperature data from CFSR, a 2.2-terabyte data set comprising ocean and subsurface reanalysis measurements collected at 41 levels in the ocean, at 6-hour intervals, over 31 years. We extract the first 100 EOFs of this full data set and compare them to the EOFs computed on the surface temperature field alone. Our analyses provide evidence of Kelvin and Rossby waves and of components of large-scale modes of oscillation, including ENSO and the PDO, that are not visible in the usual SST EOFs. Further, they provide information on the most influential parts of the ocean, such as the thermocline, that lie below the surface. Work is ongoing to understand the factors determining the depth-varying spatial patterns observed in the EOFs. We will experiment with weighting schemes to appropriately account for the differing depths of the observations. We also plan to apply the same distributed approach to the analysis of 3D atmospheric climate data sets, including multiple variables. Because the atmosphere changes on a shorter time scale than the ocean, we expect the results to demonstrate an even greater advantage of computing 3D EOFs in lieu of 2D EOFs.
Robust, non-invasive methods for metering groundwater well extraction in remote environments
NASA Astrophysics Data System (ADS)
Bulovic, Nevenka; Keir, Greg; McIntyre, Neil
2017-04-01
Quantifying the rate of extraction from groundwater wells can be essential for regional-scale groundwater management and impact assessment. This is especially the case in regions heavily dependent on groundwater, such as the semi-arid Surat and Bowen Basins in Queensland, Australia. Of the 30 000+ groundwater wells in this area, most of which are used for stock watering and domestic purposes, almost none have flow metering devices installed. As part of a research project to estimate regional groundwater extraction, we have undertaken a small-scale flow metering program on a selected set of wells. Conventional in-line flow meters were unsuitable for our project, as both non-invasiveness and adaptability to a variety of discharge pipe characteristics were critical. We describe the use of two metering technologies not widely used in groundwater applications, non-invasive clamp-on ultrasonic transit-time flow meters and tipping-bucket flow meters, as semi-permanent installations on the discharge pipes of various artesian and sub-artesian groundwater wells. We present examples of detailed extraction rate time series, which are of particular value in developing predictive models of water well extraction in data-limited areas where water use dynamics and drivers are poorly understood. We conclude by discussing future project trajectories, which include expansion of the monitoring network through the development of novel metering techniques and telemetry across large areas of poor connectivity.
Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette
2013-06-01
High-throughput screening approaches are used for the toxicity assessment of large numbers of chemical compounds. In such large-scale in vitro toxicity studies, several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results than data analysis performed with menu-driven statistical software. Automated statistical analysis requires that concentration-response data be available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper, two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started, whereas the second can be used when data files already exist. Successful implementation of the two approaches in the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
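The standardisation step the authors describe, collecting per-compound files with inconsistent layouts into one uniform long-format table that analysis scripts can consume, can be illustrated with a minimal sketch. The file contents, header variants, and column names below are invented for illustration (the paper works with Excel files; CSV is used here for simplicity):

```python
import csv
import io

# Hypothetical per-compound raw files: same kind of data, inconsistent headers.
raw_files = {
    "compound_A.csv": "Conc (uM),Response\n0.1,98\n1,80\n10,35\n",
    "compound_B.csv": "concentration,viability%\n0.1,95\n1,70\n10,20\n",
}

# Map each header variant onto one standard schema.
HEADER_MAP = {
    "conc (um)": "concentration_uM",
    "concentration": "concentration_uM",
    "response": "response_pct",
    "viability%": "response_pct",
}

def standardise(name, text):
    """Yield rows in the standard long format: compound, concentration, response."""
    reader = csv.DictReader(io.StringIO(text))
    for row in reader:
        out = {"compound": name.split(".")[0]}
        for key, value in row.items():
            out[HEADER_MAP[key.strip().lower()]] = float(value)
        yield out

table = [row for name, text in raw_files.items()
         for row in standardise(name, text)]

assert len(table) == 6
assert all(set(r) == {"compound", "concentration_uM", "response_pct"} for r in table)
```

Once every experiment lands in this single schema, the downstream concentration-response fitting can be fully scripted.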
NASA Astrophysics Data System (ADS)
van der Molen, J.; Ruardij, P.; Greenwood, N.
2015-12-01
A model study was carried out of the potential large-scale (> 100 km) effects of marine renewable tidal energy generation in the Pentland Firth, using the 3-D hydrodynamics-biogeochemistry model GETM-ERSEM-BFM. A realistic 800 MW scenario and an exaggerated academic 8 GW scenario were considered. The realistic 800 MW scenario suggested minor effects on the tides and undetectable effects on the biogeochemistry. The academic 8 GW scenario suggested that effects would be observed hundreds of kilometres away, with changes of up to 10% in tidal and ecosystem variables, in particular in a broad area in the vicinity of The Wash. There, waters became less turbid, and primary production increased, with associated increases in faunal ecosystem variables. Moreover, a one-off increase in carbon storage in the sea bed was detected. Although these first results suggest positive environmental effects, further investigation is recommended of: (i) the residual circulation in the vicinity of the Pentland Firth and effects on larval dispersal, using a higher-resolution model, and (ii) ecosystem effects with (future) state-of-the-art models if energy extraction substantially beyond 1 GW is planned.
Huang, Wen-Can; Park, Chan Woo; Kim, Jong-Duk
2017-02-01
Although microalgae are considered promising renewable sources of biodiesel, the high cost of the downstream process is a significant obstacle to large-scale biodiesel production. In this study, a novel approach for microalgal biodiesel production was developed that uses the biodiesel itself as an extractant. First, wet microalgae with 70% water content were incubated with a biodiesel/methanol mixture, and penetration of the mixture through the cell membrane and swelling of the lipids contained in the microalgae were confirmed. Significant increases in lipid droplets were observed by confocal microscopy. Second, the swelled lipid droplets were squeezed out of the microalgae using mechanical stress across the cell membrane and washed with methanol. The lipid extraction efficiency reached 68%. This process does not require drying of the microalgae or solvent recovery, which is the most energy-intensive step in solvent-based biodiesel production. Copyright © 2016 Elsevier Ltd. All rights reserved.
[Modified polyurethane foam as a local hemostatic agent after dental extractions].
Selten, M H A; Broekema, F I; Zuidema, J; van Oeveren, W; Bos, R R M
2013-01-01
In this split-mouth experiment, the feasibility of polyurethane foam as a local hemostatic agent after dental extractions was studied. Ten healthy patients underwent 2 extractions of a dental element in 1 treatment session. The 10 patients were subsequently randomly divided into a gelatin group and a collagen group. In the gelatin group, a polyurethane (PU) foam was applied in 1 extraction socket, while in the other socket a commercially available gelatin foam was applied. In the collagen group, a PU foam was applied in 1 socket and a collagen wadding in the other. All hemostats were removed after 2 minutes, after which the degree of coagulation was measured using a thrombin/antithrombin test and a fibrinogen test. This study suggests that polyurethane foam has hemostatic capacity. Large-scale clinical research is needed to confirm this finding and should indicate whether this hemostatic capacity is clinically relevant.
Struniawski, R; Szpechcinski, A; Poplawska, B; Skronski, M; Chorostowska-Wynimko, J
2013-01-01
Dried blood spot (DBS) specimens have been successfully employed for the large-scale diagnostics of α1-antitrypsin (AAT) deficiency as an easy-to-collect and easy-to-transport alternative to plasma/serum. In the present study, we propose a fast, efficient, and cost-effective protocol for DNA extraction from DBS samples that provides sufficient quantity and quality of DNA and effectively eliminates any natural PCR inhibitors, allowing successful AAT genotyping by real-time PCR and direct sequencing. DNA extracted from 84 DBS samples from chronic obstructive pulmonary disease patients was genotyped for AAT deficiency variants by real-time PCR. The results of DBS AAT genotyping were validated by serum IEF phenotyping and AAT concentration measurement. The proposed protocol allowed successful DNA extraction from all analyzed DBS samples. Both the quantity and quality of DNA were sufficient for further real-time PCR and, if necessary, for genetic sequence analysis. A 100% concordance between DBS AAT genotypes and serum phenotypes in the positive detection of the two major deficiency alleles, S and Z, was achieved. Both assays, DBS AAT genotyping by real-time PCR and serum AAT phenotyping by IEF, positively identified the PI*S and PI*Z alleles in 8 of the 84 (9.5%) and 16 of the 84 (19.0%) patients, respectively. In conclusion, the proposed protocol noticeably reduces the cost and hands-on time of DBS sample preparation, providing genomic DNA of sufficient quantity and quality for further real-time PCR or genetic sequence analysis. Consequently, it is ideally suited for large-scale AAT deficiency screening programs and should be the method of choice.
Automated control of robotic camera tacheometers for measurements of industrial large scale objects
NASA Astrophysics Data System (ADS)
Heimonen, Teuvo; Leinonen, Jukka; Sipola, Jani
2013-04-01
Modern robotic tacheometers equipped with digital cameras (also called imaging total stations) and capable of reflectorless measurement offer new possibilities for gathering 3D data. In this paper, an automated approach is proposed for the tacheometer measurements needed in the dimensional control of industrial large-scale objects. The approach makes two new contributions: the automated extraction of the vital points (i.e., the points to be measured) and the automated fine aiming of the tacheometer. The proposed approach proceeds through the following steps. First, the coordinates of the vital points are automatically extracted from the computer-aided design (CAD) data. The extracted design coordinates are then used to aim the tacheometer at the designed location of the points, one after another. However, due to deviations between the designed and actual locations of the points, the aiming needs to be adjusted. An automated, dynamic, image-based look-and-move servoing architecture is proposed for this task. After successful fine aiming, the actual coordinates of the point in question can be measured automatically using the measuring functionalities of the tacheometer. The approach was validated experimentally and found to be feasible. On average, 97% of the points actually measured in four different shipbuilding measurement cases were indeed proposed as vital points by the automated extraction algorithm. The accuracy of the results obtained with the automatic control method of the tacheometer was comparable to that obtained with manual control, and the reliability of the image processing step was found to be high in laboratory experiments.
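The fine-aiming step can be sketched as a simple image-based look-and-move loop: measure the target's pixel offset from the image center, convert it to an angular correction via a pinhole camera model, re-aim, and repeat until the offset falls below a tolerance. The focal length, gain, and simulated instrument below are illustrative assumptions, not the authors' actual controller:

```python
import math

FOCAL_PX = 2000.0   # assumed camera focal length in pixels (pinhole model)
TOL_PX = 1.0        # stop when the target is within 1 pixel of the crosshair
GAIN = 0.8          # damped correction to avoid overshoot

def pixel_offset_to_angles(dx_px, dy_px):
    """Convert an image-plane offset to horizontal/vertical angle corrections (rad)."""
    return math.atan2(dx_px, FOCAL_PX), math.atan2(dy_px, FOCAL_PX)

def fine_aim(measure_offset, aim):
    """Look-and-move: measure offset, apply damped correction, repeat."""
    for _ in range(20):
        dx, dy = measure_offset()
        if math.hypot(dx, dy) < TOL_PX:
            return True
        d_hz, d_v = pixel_offset_to_angles(GAIN * dx, GAIN * dy)
        aim(d_hz, d_v)
    return False

# Simulated instrument: the true target sits 40 px right and 25 px above center.
state = {"hz": 0.0, "v": 0.0}
target = pixel_offset_to_angles(40.0, -25.0)

def measure_offset():
    return (math.tan(target[0] - state["hz"]) * FOCAL_PX,
            math.tan(target[1] - state["v"]) * FOCAL_PX)

def aim(d_hz, d_v):
    state["hz"] += d_hz
    state["v"] += d_v

assert fine_aim(measure_offset, aim)
```

Each iteration shrinks the residual offset by roughly the factor (1 - GAIN), so a few look-and-move cycles suffice before the tacheometer's reflectorless measurement is triggered.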
Antihepatotoxic Effect and Metabolite Profiling of Panicum turgidum Extract via UPLC-qTOF-MS.
Farag, Mohamed A; El Fishawy, Ahlam M; El-Toumy, Sayed A; Amer, Khadiga F; Mansour, Ahmed M; Taha, Hala E
2016-07-01
No detailed phytochemical or biological study of Panicum turgidum, a desert grass, has yet been reported. To establish the P. turgidum secondary metabolite profile and to assess its antihepatotoxic effect, ultra-performance liquid chromatography (UPLC) coupled to quadrupole high-resolution time-of-flight mass spectrometry (qTOF-MS) was used for large-scale secondary metabolite profiling of P. turgidum extract, alongside assessment of the median lethal dose (LD50) and the hepatoprotective effect against carbon tetrachloride (CCl4) intoxication. A total of 39 metabolites were identified, with flavonoids as the major class, present as O/C-glycosides of luteolin, apigenin, isorhamnetin and naringenin, most of which are reported for the first time in Panicum sp. The antihepatotoxic effect of P. turgidum crude extract was revealed via improvement of several biochemical marker levels and mitigation of oxidative stress in serum and liver tissues, compared with the CCl4-intoxicated group, and further confirmed by histopathological examination. This study reveals that P. turgidum, enriched in C-flavonoids, presents a novel source of safe antihepatotoxic agents and further demonstrates the efficacy of UPLC-MS metabolomics in the field of natural products drug discovery. Summary: UPLC coupled to qTOF-MS was used for large-scale secondary metabolite profiling of P. turgidum. A total of 39 metabolites were identified, with flavonoids as the major metabolite class. The antihepatotoxic effect of P. turgidum extract was revealed via several biochemical markers and histopathological examination. This study reveals that P. turgidum, enriched in C-flavonoids, presents a novel source of antihepatotoxic agents. Abbreviations used: UPLC: ultra-performance liquid chromatography; LD50: median lethal dose; MDA: malondialdehyde; GSH: glutathione; CAT: catalase; SOD: superoxide dismutase; ALT: alanine aminotransferase; AST: aspartate aminotransferase; ALP: alkaline phosphatase; TG: triglycerides.
Antihepatotoxic Effect and Metabolite Profiling of Panicum turgidum Extract via UPLC-qTOF-MS
Farag, Mohamed A.; El Fishawy, Ahlam M.; El-Toumy, Sayed A.; Amer, Khadiga F.; Mansour, Ahmed M.; Taha, Hala E.
2016-01-01
Background: No detailed phytochemical or biological study of Panicum turgidum, a desert grass, has yet been reported. Objective: To establish the P. turgidum secondary metabolite profile and to assess its antihepatotoxic effect. Materials and Methods: Ultra-performance liquid chromatography (UPLC) coupled to quadrupole high-resolution time-of-flight mass spectrometry (qTOF-MS) was used for large-scale secondary metabolite profiling of P. turgidum extract, alongside assessment of the median lethal dose (LD50) and the hepatoprotective effect against carbon tetrachloride (CCl4) intoxication. Results: A total of 39 metabolites were identified, with flavonoids as the major class, present as O/C-glycosides of luteolin, apigenin, isorhamnetin and naringenin, most of which are reported for the first time in Panicum sp. The antihepatotoxic effect of P. turgidum crude extract was revealed via improvement of several biochemical marker levels and mitigation of oxidative stress in serum and liver tissues, compared with the CCl4-intoxicated group, and further confirmed by histopathological examination. Conclusion: This study reveals that P. turgidum, enriched in C-flavonoids, presents a novel source of safe antihepatotoxic agents and further demonstrates the efficacy of UPLC-MS metabolomics in the field of natural products drug discovery. SUMMARY: UPLC coupled to qTOF-MS was used for large-scale secondary metabolite profiling of P. turgidum. A total of 39 metabolites were identified, with flavonoids as the major metabolite class. The antihepatotoxic effect of P. turgidum extract was revealed via several biochemical markers and histopathological examination. This study reveals that P. turgidum, enriched in C-flavonoids, presents a novel source of antihepatotoxic agents.
Abbreviations used: UPLC: ultra-performance liquid chromatography; LD50: median lethal dose; MDA: malondialdehyde; GSH: glutathione; CAT: catalase; SOD: superoxide dismutase; ALT: alanine aminotransferase; AST: aspartate aminotransferase; ALP: alkaline phosphatase; TG: triglycerides. PMID:27761073
Transition from lognormal to χ2-superstatistics for financial time series
NASA Astrophysics Data System (ADS)
Xu, Dan; Beck, Christian
2016-07-01
Share price returns on different time scales can be well modelled by a superstatistical dynamics. Here we investigate which type of superstatistics is most suitable to describe share price dynamics on various time scales. It is shown that while χ2-superstatistics works well on a time scale of days, on the much smaller time scale of minutes the price changes are better described by lognormal superstatistics. The system dynamics thus exhibits a transition from lognormal to χ2-superstatistics as a function of time scale. We discuss a more general model interpolating between both statistics that fits the observed data very well. We also present results on correlation functions of the extracted superstatistical volatility parameter, which exhibit exponential decay for returns on large time scales, whereas for returns on small time scales there are long-range correlations and power-law decay.
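The extraction of the superstatistical parameter can be illustrated schematically: slice the return series into windows longer than the local relaxation time, estimate β = 1/σ² in each window, and then inspect the distribution of the β values (a χ²/Gamma shape versus a lognormal shape). The synthetic series below has lognormally distributed volatility by construction, so it only demonstrates the extraction step, not the behaviour of real market data:

```python
import math
import random
import statistics

random.seed(1)

# Synthetic returns with slowly varying volatility: sigma is redrawn from a
# lognormal for each block of T steps (lognormal superstatistics by construction).
T = 250                       # superstatistical window length
returns = []
for _ in range(400):          # 400 windows
    sigma = random.lognormvariate(0.0, 0.5)
    returns += [random.gauss(0.0, sigma) for _ in range(T)]

# Extract the superstatistical parameter beta = 1/variance in each window.
betas = []
for i in range(0, len(returns), T):
    window = returns[i:i + T]
    betas.append(1.0 / statistics.pvariance(window))

# For lognormal superstatistics, log(beta) should look Gaussian,
# i.e. roughly symmetric (near-zero skewness) on a log scale.
log_betas = [math.log(b) for b in betas]
mu = statistics.mean(log_betas)
sd = statistics.stdev(log_betas)
skew = sum((x - mu) ** 3 for x in log_betas) / (len(log_betas) * sd ** 3)
assert abs(skew) < 0.6
```

Distinguishing χ² from lognormal superstatistics in real data then amounts to comparing goodness-of-fit of the two candidate distributions to the extracted β histogram at each time scale.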
Olatunji, Bunmi O; Kim, Se-Kang; Wall, David
2015-09-01
The present study employs Profile Analysis via Multidimensional Scaling (PAMS), a procedure for extracting dimensions, to identify core eating disorder symptoms in a clinical sample. A large sample of patients with eating disorders (N=5193) presenting for treatment completed the Eating Disorders Inventory-2 (EDI-2; Garner, 1991), and PAMS was then employed to estimate individual profile weights that reflect the degree to which an individual's observed symptom profile approximates the pattern of the dimensions. The findings revealed three symptom dimensions: Body Thinness, Body Perfectionism, and Body Awareness. Subsequent analyses using individual-level data illustrate that the PAMS profiles properly operate as prototypical profiles that encapsulate all individuals' response patterns. The implications of these dimensional findings for the assessment and diagnosis of eating disorders are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.
Extracting and Comparing Places Using Geo-Social Media
NASA Astrophysics Data System (ADS)
Ostermann, F. O.; Huang, H.; Andrienko, G.; Andrienko, N.; Capineri, C.; Farkas, K.; Purves, R. S.
2015-08-01
The increasing availability of Geo-Social Media (e.g. Facebook, Foursquare and Flickr) has led to the accumulation of large volumes of social media data. These data, especially geotagged ones, contain information about perceptions of and experiences in various environments. Harnessing these data can provide a better understanding of the semantics of places. We are interested in the similarities and differences between different Geo-Social Media in their descriptions of places. This extended abstract presents the results of a first step towards a more in-depth study of the semantic similarity of places. In particular, we took places extracted through spatio-temporal clustering from one data source (Twitter) and examined whether their structure is reflected semantically in another data set (Flickr). Based on that, we analyse how the semantic similarity between places varies over space and scale, and how Tobler's first law of geography holds with regard to scale and places.
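One simple way to operationalize this cross-platform comparison is to build, for each place, a term-frequency vector from each platform's texts and measure the cosine similarity between the two vectors. The places and term counts below are invented for illustration; they only show the mechanics, not the paper's actual data:

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical term frequencies for the same two places on two platforms.
twitter = {
    "station": Counter(train=5, delay=3, commute=4),
    "park":    Counter(sunset=4, picnic=2, dogs=3),
}
flickr = {
    "station": Counter(train=6, architecture=2, commute=1),
    "park":    Counter(sunset=7, flowers=3, picnic=1),
}

similarity = {p: cosine(twitter[p], flickr[p]) for p in twitter}

# Each place should be semantically closer to itself across platforms
# than to the other place.
assert similarity["park"] > cosine(twitter["park"], flickr["station"])
assert similarity["station"] > cosine(twitter["station"], flickr["park"])
```

Aggregating such per-place similarities over distance bands and spatial scales is then what allows testing Tobler's first law against the semantic signal.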
Ma, Xiaolei; Dai, Zhuang; He, Zhengbing; Ma, Jihui; Wang, Yong; Wang, Yunpeng
2017-04-10
This paper proposes a convolutional neural network (CNN)-based method that learns traffic as images and predicts large-scale, network-wide traffic speed with a high accuracy. Spatiotemporal traffic dynamics are converted to images describing the time and space relations of traffic flow via a two-dimensional time-space matrix. A CNN is applied to the image following two consecutive steps: abstract traffic feature extraction and network-wide traffic speed prediction. The effectiveness of the proposed method is evaluated by taking two real-world transportation networks, the second ring road and north-east transportation network in Beijing, as examples, and comparing the method with four prevailing algorithms, namely, ordinary least squares, k-nearest neighbors, artificial neural network, and random forest, and three deep learning architectures, namely, stacked autoencoder, recurrent neural network, and long-short-term memory network. The results show that the proposed method outperforms other algorithms by an average accuracy improvement of 42.91% within an acceptable execution time. The CNN can train the model in a reasonable time and, thus, is suitable for large-scale transportation networks.
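The first step of the method, converting spatiotemporal speeds into an image, can be sketched directly: rows index time steps, columns index road segments, and pixel intensity encodes speed. The toy records and min-max normalization below are illustrative; the CNN would then consume these matrices as single-channel images.

```python
# Toy speed records: (time_step, segment_id, speed_km_h).
records = [
    (0, 0, 60.0), (0, 1, 55.0), (0, 2, 20.0),
    (1, 0, 58.0), (1, 1, 30.0), (1, 2, 18.0),
    (2, 0, 50.0), (2, 1, 25.0), (2, 2, 35.0),
]

n_times = 1 + max(t for t, _, _ in records)
n_segments = 1 + max(s for _, s, _ in records)

# Time-space matrix: rows = time steps, columns = road segments.
matrix = [[0.0] * n_segments for _ in range(n_times)]
for t, s, speed in records:
    matrix[t][s] = speed

# Normalize to [0, 1] so the matrix can be treated as a grayscale image.
v_max = max(max(row) for row in matrix)
image = [[v / v_max for v in row] for row in matrix]

assert len(image) == 3 and len(image[0]) == 3
assert image[0][0] == 1.0            # fastest observation maps to full intensity
assert 0.0 <= min(min(r) for r in image)
```

Treating time and space as the two image axes is what lets ordinary 2D convolutions extract joint spatiotemporal traffic features.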
Ma, Xiaolei; Dai, Zhuang; He, Zhengbing; Ma, Jihui; Wang, Yong; Wang, Yunpeng
2017-01-01
This paper proposes a convolutional neural network (CNN)-based method that learns traffic as images and predicts large-scale, network-wide traffic speed with a high accuracy. Spatiotemporal traffic dynamics are converted to images describing the time and space relations of traffic flow via a two-dimensional time-space matrix. A CNN is applied to the image following two consecutive steps: abstract traffic feature extraction and network-wide traffic speed prediction. The effectiveness of the proposed method is evaluated by taking two real-world transportation networks, the second ring road and north-east transportation network in Beijing, as examples, and comparing the method with four prevailing algorithms, namely, ordinary least squares, k-nearest neighbors, artificial neural network, and random forest, and three deep learning architectures, namely, stacked autoencoder, recurrent neural network, and long-short-term memory network. The results show that the proposed method outperforms other algorithms by an average accuracy improvement of 42.91% within an acceptable execution time. The CNN can train the model in a reasonable time and, thus, is suitable for large-scale transportation networks. PMID:28394270
When micro meets macro: microbial lipid analysis and ecosystem ecology
NASA Astrophysics Data System (ADS)
Balser, T.; Gutknecht, J.
2008-12-01
There is growing interest in linking soil microbial community composition and activity with large-scale field studies of nutrient cycling or plant community response to disturbance. While the analysis of microbial communities has moved rapidly in the past decade from culture-based to non-culture-based techniques, it must still be asked what we have gained from the move. How well does the necessarily micro scale of microbial analysis allow us to address questions of interest at the macro scale? Several challenges exist in bridging the scales, foremost of which is the question of methodological feasibility. Past microbiological methodologies have not been readily adaptable to the large sample sizes necessary for ecosystem-scale research. As a result, it has been difficult to generate compatible microbial and ecosystem data sets. We describe the use of a modified lipid extraction method to generate microbial community data sets that allow us to match landscape-scale or long-term ecological studies with microbial community data. We briefly discuss the challenges and advantages associated with lipid analysis as an approach to ecosystem ecological studies, and provide examples from our research in ecosystem restoration and recovery following disturbance and climate change.
Regional climate model sensitivity to domain size
NASA Astrophysics Data System (ADS)
Leduc, Martin; Laprise, René
2009-05-01
Regional climate models are increasingly used to add small-scale features that are not present in their lateral boundary conditions (LBC). It is well known that the limited area over which a model is integrated must be large enough to allow the full development of small-scale features. On the other hand, integrations on very large domains have shown important departures from the driving data unless large-scale nudging is applied. The issue of domain size is studied here using the “perfect model” approach. This method consists first of generating a high-resolution climatic simulation, nicknamed the big brother (BB), over a large domain of integration. The next step is to degrade this dataset with a low-pass filter emulating the usual coarse-resolution LBC. The filtered nesting data (FBB) are then used to drive a set of four simulations (LBs, for little brothers) with the same model but on progressively smaller domains. The LB statistics for a climate sample of four winter months are compared with BB over a common region. The time-average (stationary) and transient-eddy standard deviation patterns of the LB atmospheric fields generally improve in terms of spatial correlation with the reference (BB) as the domain gets smaller. The extraction of the small-scale features with a spectral filter allows the detection of important underestimations of the transient-eddy variability in the vicinity of the inflow boundary, which can penalize the use of small domains (less than 100 × 100 grid points). The permanent “spatial spin-up” corresponds to the characteristic distance that the large-scale flow needs to travel before developing small-scale features. The spin-up distance tends to grow at higher levels in the atmosphere.
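The degradation step of the big-brother protocol, removing small scales from the high-resolution fields to emulate coarse-resolution LBC, can be sketched as a spectral low-pass filter. The 1-D illustration below assumes a sharp wavenumber cutoff for simplicity (filters used in practice taper smoothly):

```python
import numpy as np

n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

# High-resolution "big brother" field: a large-scale wave plus small-scale detail.
field = np.sin(3 * x) + 0.4 * np.sin(40 * x)

# Sharp spectral low-pass: keep wavenumbers |k| <= k_cut, zero the rest.
k_cut = 10
spectrum = np.fft.fft(field)
k = np.fft.fftfreq(n, d=1.0 / n)        # integer wavenumbers
spectrum[np.abs(k) > k_cut] = 0.0
filtered = np.fft.ifft(spectrum).real

# The small-scale component (wavenumber 40) is removed; the large scale is kept.
residual = field - filtered
assert np.allclose(filtered, np.sin(3 * x), atol=1e-8)
assert np.allclose(residual, 0.4 * np.sin(40 * x), atol=1e-8)
```

The same decomposition into filtered (FBB-like) and residual (small-scale) parts is what allows comparing how well each little brother regenerates the scales that were removed from its driving data.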
Image Harvest: an open-source platform for high-throughput plant image processing and analysis
Knecht, Avi C.; Campbell, Malachy T.; Caprez, Adam; Swanson, David R.; Walia, Harkamal
2016-01-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable to processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. PMID:27141917
Robust Long-Range Coordination of Spontaneous Neural Activity in Waking, Sleep and Anesthesia.
Liu, Xiao; Yanagawa, Toru; Leopold, David A; Fujii, Naotaka; Duyn, Jeff H
2015-09-01
Although the emerging field of functional connectomics relies increasingly on the analysis of spontaneous fMRI signal covariation to infer the spatial fingerprint of the brain's large-scale functional networks, the nature of the underlying neuro-electrical activity remains incompletely understood. In part, this lack in understanding owes to the invasiveness of electrophysiological acquisition, the difficulty in their simultaneous recording over large cortical areas, and the absence of fully established methods for unbiased extraction of network information from these data. Here, we demonstrate a novel, data-driven approach to analyze spontaneous signal variations in electrocorticographic (ECoG) recordings from nearly entire hemispheres of macaque monkeys. Based on both broadband analysis and analysis of specific frequency bands, the ECoG signals were found to co-vary in patterns that resembled the fMRI networks reported in previous studies. The extracted patterns were robust against changes in consciousness associated with sleep and anesthesia, despite profound changes in intrinsic characteristics of the raw signals, including their spectral signatures. These results suggest that the spatial organization of large-scale brain networks results from neural activity with a broadband spectral feature and is a core aspect of the brain's physiology that does not depend on the state of consciousness. Published by Oxford University Press 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.
Sweeten, Sara E.; Ford, W. Mark
2016-01-01
Large-scale coal mining practices, particularly surface coal extraction and associated valley fills, as well as residential wastewater discharge, are of ecological concern for aquatic systems in central Appalachia. Identifying and quantifying alterations to ecosystems along a gradient of spatial scales is a necessary first step to aid in mitigating negative consequences for aquatic biota. In central Appalachian headwater streams, apart from fish, salamanders are the most abundant vertebrate predators, providing a significant intermediate trophic link between aquatic and terrestrial food webs. Stream salamander species are considered sensitive to aquatic stressors and environmental alterations, as past research has shown linkages among microhabitat parameters, large-scale land uses such as urbanization and logging, and salamander abundances. However, there is little information examining these relationships between environmental conditions and salamander occupancy in the coalfields of central Appalachia. In the summer of 2013, 70 sites (sampled two to three times each) in the southwest Virginia coalfields were visited to collect salamanders and quantify stream and riparian microhabitat parameters. Using an information-theoretic framework, the effects of microhabitat and large-scale land use on stream salamander occupancy were compared. The findings indicate that Desmognathus spp. occupancy rates are more strongly correlated with microhabitat parameters such as canopy cover than with large-scale land uses. However, Eurycea spp. occupancy rates had a strong association with large-scale land uses, particularly recent mining and forest cover within the watershed. These findings suggest that protection of riparian habitats is an important consideration for maintaining aquatic systems in central Appalachia. If this is not possible, restoration of riparian areas should follow guidelines using quick-growing tree species that are native to Appalachian riparian areas.
These types of trees would rapidly establish a canopy cover, stabilize the soil, and impede invasive plant species which would, in turn, provide high-quality refuges for stream salamanders.
Bioethanol production from raffinate phase of supercritical CO2 extracted Stevia rebaudiana leaves.
Coban, Isik; Sargin, Sayit; Celiktas, Melih Soner; Yesil-Celiktas, Ozlem
2012-09-01
The extracts of Stevia rebaudiana are marketed as dietary supplements and used as a natural sweetening agent in food products. Extraction on an industrial scale produces large quantities of solid waste. The aim of this study was to investigate the bioconversion efficiency of supercritical CO2-extracted S. rebaudiana residues. Leaves were extracted with a supercritical CO2 and ethanol mixture to obtain glycosides; the raffinate phase was then hydrolyzed both by dilute acid and by various concentrations of a cellulase and β-glucosidase cocktail. The maximum yield of reducing sugars reached 25.67 g/L under the optimal enzyme pretreatment conditions, whereas 32.00 g/L was reached by consecutive enzymatic and acid hydrolyses. The bioethanol yield (20 g/L, 2.0% inoculum, 2 days) based on the sugar consumed was 45.55%, corresponding to a productivity of 0.19 kg/m³h, which illustrates the challenges of utilizing this residue as a potential feedstock for bioethanol production. Copyright © 2012 Elsevier Ltd. All rights reserved.
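The reported productivity can be checked from the abstract's own numbers. Reading the 20 g/L as the sugar consumed, a 45.55% ethanol yield over a 2-day fermentation gives the stated 0.19 kg/m³h (since 1 g/L ≡ 1 kg/m³):

```python
sugar_consumed_g_per_L = 20.0     # fermentable sugar consumed (reading of "20 g/L")
yield_fraction = 0.4555           # ethanol yield on sugar consumed (45.55%)
hours = 2 * 24                    # 2-day fermentation

ethanol_g_per_L = sugar_consumed_g_per_L * yield_fraction   # about 9.11 g/L ethanol
productivity_kg_m3_h = ethanol_g_per_L / hours              # 1 g/L == 1 kg/m3

assert abs(productivity_kg_m3_h - 0.19) < 0.01
```

The internal consistency of these three figures supports that reading of the parenthetical.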
Innovative Alternative Technologies to Extract Carotenoids from Microalgae and Seaweeds
Poojary, Mahesha M.; Barba, Francisco J.; Aliakbarian, Bahar; Donsì, Francesco; Pataro, Gianpiero; Dias, Daniel A.; Juliano, Pablo
2016-01-01
Marine microalgae and seaweeds (macroalgae) represent a sustainable source of various bioactive natural carotenoids, including β-carotene, lutein, astaxanthin, zeaxanthin, violaxanthin and fucoxanthin. Recently, the large-scale production of carotenoids from algal sources has gained significant interest for commercial and industrial applications in health, nutrition, and cosmetics. Although conventional processing technologies, based on solvent extraction, offer a simple approach to isolating carotenoids, they suffer from several inherent limitations, including low efficiency (extraction yield) and selectivity (purity), high solvent consumption, and long treatment times, which have led to advancements in the search for innovative extraction technologies. This comprehensive review summarizes the recent trends in the extraction of carotenoids from microalgae and seaweeds through the assistance of different innovative techniques, such as pulsed electric fields, liquid pressurization, supercritical fluids, subcritical fluids, microwaves, ultrasound, and high-pressure homogenization. In particular, the review critically analyzes the technologies, characteristics, advantages, and shortcomings of the different innovative processes, highlighting the differences in terms of yield, selectivity, and economic and environmental sustainability. PMID:27879659
Tian, Ruijun; Jin, Jing; Taylor, Lorne; Larsen, Brett; Quaggin, Susan E; Pawson, Tony
2013-04-01
Gangliosides are ubiquitous components of cell membranes. Their interactions with bacterial toxins and membrane-associated proteins (e.g. receptor tyrosine kinases) have important roles in the regulation of multiple cellular functions. Currently, an effective approach for measuring ganglioside-protein interactions, especially in a large-scale fashion, is largely missing. To this end, we report a facile MS-based approach to explore gangliosides extracted from cells and to globally measure their interactions with a protein of interest. We optimized a two-step protocol for extracting total gangliosides from cells within 2 h. Easy-to-use magnetic beads conjugated with a protein of interest were used to capture interacting gangliosides. To measure ganglioside-protein interactions on a global scale, we applied a highly sensitive LC-MS system combining hydrophilic interaction LC separation with multiple reaction monitoring-based MS for ganglioside detection. Sensitivity for ganglioside GM1 is below 100 pg, and the whole analysis can be done in 20 min with isocratic elution. To measure ganglioside interactions with soluble vascular endothelial growth factor receptor 1 (sFlt1), we extracted and readily detected 36 species of gangliosides across eight different classes from perivascular retinal pigment epithelium cells. Twenty-three ganglioside species showed significant interactions with sFlt1 as compared with the IgG control, based on a p value cutoff of <0.05. These results show that the described method provides a rapid and highly sensitive approach for systematically measuring ganglioside-protein interactions. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willcox, Karen; Marzouk, Youssef
2013-11-12
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghattas, Omar
2013-10-15
The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
Colorado State Capitol Geothermal project
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shepherd, Lance
Colorado State Capitol Geothermal Project - The final report is redacted due to space constraints. This was an innovative large-scale ground-source heat pump (GSHP) project at the Colorado State Capitol in Denver, Colorado. The project employed two large wells on the property: one for drawing water from the aquifer and another for returning the water to the aquifer after performing the heat exchange. The two wells can operate in either direction. Heat extracted from or added to the water via a heat exchanger is used to perform space conditioning in the building.
Serag, Ahmed; Blesa, Manuel; Moore, Emma J; Pataky, Rozalia; Sparrow, Sarah A; Wilkinson, A G; Macnaught, Gillian; Semple, Scott I; Boardman, James P
2016-03-24
Accurate whole-brain segmentation, or brain extraction, of magnetic resonance imaging (MRI) is a critical first step in most neuroimage analysis pipelines. The majority of brain extraction algorithms have been developed and evaluated for adult data and their validity for neonatal brain extraction, which presents age-specific challenges for this task, has not been established. We developed a novel method for brain extraction of multi-modal neonatal brain MR images, named ALFA (Accurate Learning with Few Atlases). The method uses a new sparsity-based atlas selection strategy that requires a very limited number of atlases 'uniformly' distributed in the low-dimensional data space, combined with a machine learning based label fusion technique. The performance of the method for brain extraction from multi-modal data of 50 newborns is evaluated and compared with results obtained using eleven publicly available brain extraction methods. ALFA outperformed the eleven compared methods providing robust and accurate brain extraction results across different modalities. As ALFA can learn from partially labelled datasets, it can be used to segment large-scale datasets efficiently. ALFA could also be applied to other imaging modalities and other stages across the life course.
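Multi-atlas approaches of the kind ALFA builds on fuse the label proposals of several atlases per voxel. The following toy sketch uses plain majority voting, a deliberately simplified stand-in for ALFA's machine-learning-based label fusion; the arrays are hypothetical.

```python
import numpy as np

# Toy majority-vote label fusion: each selected atlas proposes a
# brain (1) / non-brain (0) label for each of 4 voxels, and the
# votes are fused into a single mask.
atlas_labels = np.array([
    [1, 1, 0, 0],   # atlas 1
    [1, 0, 0, 1],   # atlas 2
    [1, 1, 0, 0],   # atlas 3
])

# A voxel is labelled brain if at least half of the atlases agree.
fused = (atlas_labels.mean(axis=0) >= 0.5).astype(int)
print(fused.tolist())  # [1, 1, 0, 0]
```

ALFA replaces this uniform vote with a learned, locally weighted fusion, which is what lets it succeed with very few atlases.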
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Zhaoqing; Wang, Taiping; Copping, Andrea E.
Understanding and providing proactive information on the potential for tidal energy projects to cause changes to the physical system and to key water quality constituents in tidal waters is a necessary and cost-effective means to avoid costly regulatory involvement and late-stage surprises in the permitting process. This paper presents a modeling study for evaluating tidal energy extraction and its potential impacts on the marine environment at a real-world site - Tacoma Narrows of Puget Sound, Washington State, USA. An unstructured-grid coastal ocean model, fitted with a module that simulates tidal energy devices, was applied to simulate the tidal energy extracted by different turbine array configurations and the potential effects of the extraction at local and system-wide scales in Tacoma Narrows and South Puget Sound. Model results demonstrated the advantage of an unstructured-grid model for simulating the far-field effects of tidal energy extraction in a large model domain, as well as assessing the near-field effect using a fine grid resolution near the tidal turbines. The outcome shows that a realistic near-term deployment scenario extracts a very small fraction of the total tidal energy in the system and that system-wide environmental effects are not likely; however, near-field effects on the flow field and bed shear stress in the area of the tidal turbine farm are more likely. Model results also indicate that, from a practical standpoint, hydrodynamic or water quality effects are not likely to be the limiting factor for development of large commercial-scale tidal farms. Results indicate that very high numbers of turbines are required to significantly alter the tidal system; limitations on marine space or other environmental concerns are likely to be reached before reaching these deployment levels.
These findings show that important information obtained from numerical modeling can be used to inform regulatory and policy processes for tidal energy development.
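For a rough sense of the per-turbine scales involved, the extracted kinetic power can be estimated from the standard power-flux formula P = ½ρAv³Cp. All numbers below are hypothetical illustrations, not values from the Tacoma Narrows model.

```python
# Back-of-the-envelope power extracted by one tidal turbine
# (illustrative numbers only).
rho = 1025.0    # seawater density, kg/m^3
area = 100.0    # swept area of the turbine, m^2
speed = 2.5     # tidal current speed, m/s
cp = 0.4        # assumed power coefficient of the device

power_w = 0.5 * rho * area * speed**3 * cp
print(round(power_w / 1000))  # extracted power in kW
```

Because power scales with the cube of current speed, siting in a constricted channel such as a narrows dominates the energy yield.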
A Multiscale Survival Process for Modeling Human Activity Patterns.
Zhang, Tianyang; Cui, Peng; Song, Chaoming; Zhu, Wenwu; Yang, Shiqiang
2016-01-01
Human activity plays a central role in understanding large-scale social dynamics. It is well documented that individual activity patterns follow bursty dynamics characterized by heavy-tailed interevent time distributions. Here we study a large-scale online chatting dataset consisting of 5,549,570 users, finding that individual activity patterns vary with timescale, whereas existing models only approximate empirical observations within a limited timescale. We propose a novel approach that models the intensity rate of an individual triggering an activity. We demonstrate that the model precisely captures the corresponding human dynamics across multiple timescales spanning five orders of magnitude. Our model also allows extracting the population heterogeneity of activity patterns, characterized by a set of individual-specific ingredients. Integrating our approach with social interactions leads to a wide range of implications.
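The burstiness described here is commonly quantified with the coefficient B = (σ - μ)/(σ + μ) of the interevent time distribution, which is near 0 for Poisson-like activity and approaches 1 for very bursty activity. A small sketch contrasting the two regimes on synthetic data (not the chat dataset):

```python
import random
import statistics

random.seed(0)

def burstiness(intervals):
    """B = (sigma - mu) / (sigma + mu) of the interevent times:
    ~0 for Poisson-like activity, approaching 1 for bursty activity."""
    mu = statistics.mean(intervals)
    sigma = statistics.pstdev(intervals)
    return (sigma - mu) / (sigma + mu)

# Poisson-like activity: exponentially distributed interevent times.
poisson_like = [random.expovariate(1.0) for _ in range(20000)]

# Bursty activity: heavy-tailed (Pareto) interevent times.
bursty = [random.paretovariate(1.5) for _ in range(20000)]

b_poisson = burstiness(poisson_like)
b_bursty = burstiness(bursty)
print(b_poisson, b_bursty)  # near zero vs. clearly positive
```

A single B value, however, is timescale-blind, which is precisely the limitation of existing models that the multiscale survival process addresses.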
Discovery of a large-scale clumpy structure of the Lynx supercluster at z ~ 1.27
NASA Astrophysics Data System (ADS)
Nakata, Fumiaki; Kodama, Tadayuki; Shimasaku, Kazuhiro; Doi, Mamoru; Furusawa, Hisanori; Hamabe, Masaru; Kimura, Masahiko; Komiyama, Yutaka; Miyazaki, Satoshi; Okamura, Sadanori; Ouchi, Masami; Sekiguchi, Maki; Yagi, Masafumi; Yasuda, Naoki
2004-07-01
We report the discovery of a probable large-scale structure composed of many galaxy clumps around the known twin clusters at z=1.26 and z=1.27 in the Lynx region. Our analysis is based on deep, panoramic, and multi-colour imaging with the Suprime-Cam on the 8.2 m Subaru telescope. We apply a photometric redshift technique to extract plausible cluster members at z˜1.27 down to ˜M*+2.5. From the 2-D distribution of these photometrically selected galaxies, we identify seven new candidate galaxy groups or clusters where the surface density of red galaxies is significantly high (>5σ), in addition to the two known clusters, together comprising the largest, most distant supercluster ever identified.
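The >5σ surface-density criterion can be illustrated with a minimal Poisson-significance sketch. The counts are hypothetical, and the paper's actual density estimator may differ; this only shows the kind of test such a cut implies.

```python
import math

def overdensity_sigma(count, background_mean):
    """Significance of a galaxy surface-density peak in a cell,
    assuming Poisson fluctuations around the background mean."""
    return (count - background_mean) / math.sqrt(background_mean)

# Hypothetical cell: 19 photometrically selected red galaxies over an
# expected background of 4 per cell.
sigma = overdensity_sigma(19, 4.0)
print(sigma)  # 7.5 -> would pass a >5 sigma cut
```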
Topological structure dynamics revealing collective evolution in active nematics
Shi, Xia-qing; Ma, Yu-qiang
2013-01-01
Topological defects frequently emerge in active matter such as bacterial colonies, cytoskeletal extracts on substrates, and self-propelled granular or colloidal layers, but their dynamical properties and their relation to large-scale organization and fluctuations in these active systems have seldom been explored. Here we reveal, through a simple model for active nematics using self-driven hard elliptic rods, that the excitation, annihilation and transportation of topological defects differ markedly from those in non-active media. These dynamical processes exhibit strong irreversibility in active nematics in the absence of detailed balance. Moreover, topological defects are the key factors in organizing large-scale dynamic structures and collective flows, resulting in multi-spatiotemporal effects. These findings allow us to control the self-organization of active matter through topological structures. PMID:24346733
Strategy for large-scale isolation of enantiomers in drug discovery.
Leek, Hanna; Thunberg, Linda; Jonson, Anna C; Öhlén, Kristina; Klarqvist, Magnus
2017-01-01
A strategy for large-scale chiral resolution is illustrated by the isolation of a pure enantiomer from a 5 kg batch. Results from supercritical fluid chromatography are presented and compared with normal-phase liquid chromatography. Solubility of the compound in the supercritical mobile phase was shown to be the limiting factor. To circumvent this, extraction injection was used but was shown not to be efficient for this compound. Finally, a method for chiral resolution by crystallization was developed and applied to give a diastereomeric salt with an enantiomeric excess of 99% at a 91% yield. Direct access to a diverse separation toolbox is shown to be essential for solving separation problems in the most cost- and time-efficient way. Copyright © 2016 Elsevier Ltd. All rights reserved.
Guo, C; Gynn, M; Chang, T M S
2015-06-01
We report a novel method to simultaneously extract superoxide dismutase (SOD), catalase (CAT), and carbonic anhydrase (CA) from the same sample of red blood cells (RBCs). This avoids the need to use expensive commercial enzymes, thus enabling a cost-effective process for large-scale production of a nanobiotechnological polyHb-SOD-CAT-CA complex, with enhancement of all three red blood cell functions. An optimal concentration of phosphate buffer for ethanol-chloroform treatment results in good recovery of CAT, SOD, and CA after extraction. Different concentrations of the enzymes can be used to enhance the activity of polyHb-SOD-CAT-CA to 2, 4, or 6 times that of RBC.
A facile and green preparation of reduced graphene oxide using Eucalyptus leaf extract
NASA Astrophysics Data System (ADS)
Li, Chengyang; Zhuang, Zechao; Jin, Xiaoying; Chen, Zuliang
2017-11-01
In this paper, a green and facile synthesis of reduced graphene oxide from graphene oxide (GO) using Eucalyptus leaf extract (EL-RGO) was investigated; the product was characterized with ultraviolet-visible spectroscopy (UV), Raman spectroscopy, X-ray diffraction (XRD), scanning electron microscopy (SEM), atomic force microscopy (AFM), X-ray photoelectron spectroscopy (XPS) and thermal gravimetric analysis (TG). The Eucalyptus leaf extract acted as both reducing agent and capping/stabilizing agent, and the prepared EL-RGO showed good stability and electrochemical properties. This approach could provide an alternative method to prepare EL-RGO in large-scale production, and its good electrochemical properties and biocompatibility enable various applications. In addition, a merit of this study is that both the oxidized products and the reducing agents are environmentally friendly owing to the green reduction.
Structure of high and low shear-stress events in a turbulent boundary layer
NASA Astrophysics Data System (ADS)
Gomit, G.; de Kat, R.; Ganapathisubramani, B.
2018-01-01
Simultaneous particle image velocimetry (PIV) and wall-shear-stress sensor measurements were performed to study structures associated with shear-stress events in a flat-plate turbulent boundary layer at a Reynolds number Reτ≈4000. The PIV field of view covers 8δ (where δ is the boundary layer thickness) along the streamwise direction and captures the entire boundary layer in the wall-normal direction. Simultaneously, wall-shear-stress measurements that capture the large-scale fluctuations were taken using a spanwise array of hot-film skin-friction sensors (spanning 2δ). Based on this combination of measurements, the organization of the conditional wall-normal and streamwise velocity fluctuations (u and v) and of the Reynolds shear stress (-uv) can be extracted. Conditional averages of the velocity field are computed by dividing the histogram of the large-scale wall-shear-stress fluctuations into four quartiles, each containing 25% of the occurrences. The conditional events corresponding to the extreme quartiles of the histogram (positive and negative) predominantly contribute to a change of velocity profile associated with the large structures and to the modulation of the small scales. A detailed examination of the Reynolds shear-stress contribution related to each of the four quartiles shows that the flow above a low wall-shear-stress event carries a larger amount of Reynolds shear stress than the other quartiles. The contribution of the small and large scales to this observation is discussed based on a scale decomposition of the velocity field.
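The quartile-conditioning procedure can be sketched on synthetic signals. The series below are invented stand-ins for the hot-film wall-shear-stress signal and a co-sampled velocity fluctuation; the imposed correlation is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: a wall-shear-stress fluctuation series and a
# velocity-fluctuation signal correlated with it (correlation is invented).
tau = rng.normal(size=10000)
u = 0.5 * tau + rng.normal(size=10000)

# Divide the tau histogram into four quartiles, each holding 25% of the
# occurrences, and conditionally average u within each quartile
# (the same logic applied to the PIV fields in the study).
edges = np.quantile(tau, [0.25, 0.5, 0.75])
quartile = np.digitize(tau, edges)   # quartile index 0..3 per sample
cond_means = [u[quartile == q].mean() for q in range(4)]
print(cond_means)
```

For positively correlated signals the conditional means increase monotonically from the lowest to the highest shear-stress quartile, which is the kind of large-scale organization the conditional averaging is designed to expose.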
Downstream Processing of Synechocystis for Biofuel Production
NASA Astrophysics Data System (ADS)
Sheng, Jie
Lipids and free fatty acids (FFA) from the cyanobacterium Synechocystis can be used for biofuel (e.g. biodiesel or renewable diesel) production. In order to utilize and scale up this technique, downstream processes including culturing and harvest, cell disruption, and extraction were studied. Several solvents/solvent systems were screened for lipid extraction from Synechocystis. The chloroform + methanol-based Folch and Bligh & Dyer methods proved to be the "gold standard" for small-scale analysis due to their highest lipid recoveries, which were attributed to their penetration of the cell membranes, higher polarity, and stronger interaction with hydrogen bonds. Less toxic solvents, such as methanol and MTBE, or direct transesterification of biomass (without a preextraction step) gave only slightly lower lipid-extraction yields and can be considered for large-scale application. Sustained exposure to high and low temperature extremes severely lowered the biomass and lipid productivity. Temperature stress also triggered changes in lipid quality, such as the degree of unsaturation; thus, it affected the productivity and quality of Synechocystis-derived biofuel. Pulsed electric field (PEF) treatment was evaluated for cell disruption prior to lipid extraction. A treatment intensity > 35 kWh/m³ caused significant damage to the plasma membrane, cell wall, and thylakoid membrane, and even led to complete disruption of some cells into fragments. Treatment by PEF enhanced the potential for the low-toxicity solvent isopropanol to access lipid molecules during subsequent solvent extraction, leading to lower usage of isopropanol for the same extraction efficiency. Other cell-disruption methods also were tested. Distinct disruption effects on the cell envelope, plasma membrane, and thylakoid membranes were observed that were related to extraction efficiency. Microwave and ultrasound treatments significantly enhanced lipid extraction.
Autoclaving, ultrasound, and French press caused significant release of lipid into the medium, which may increase solvent usage and make medium recycling difficult. Production of excreted FFA by mutant Synechocystis has the potential of reducing the complexity of downstream processing. Major problems, such as FFA precipitation and biodegradation by scavengers, account for FFA loss in operation. Even a low concentration of FFA scavengers could consume FFA at a high rate that outpaced FFA production rate. Potential strategies to overcome FFA loss include high pH, adsorptive resin, and sterilization techniques.
Prendergast, Martina M.; Kosunen, Timo U.; Moran, Anthony P.
2001-01-01
Mimicry of peripheral nerve gangliosides by Campylobacter jejuni lipopolysaccharides (LPSs) has been proposed to induce cross-reacting antiganglioside antibodies in Guillain-Barré syndrome (GBS). Because current methods for LPS characterization are labor-intensive and inhibit the screening of large numbers of strains, a rapid GM1 epitope screening assay was developed. Biomass from two agar plates of confluent growth yielded sufficient LPS using a novel phenol-water and ether extraction procedure. Extracts of LPS were reacted with cholera toxin (GM1 ligand), peanut agglutinin (Galβ1→3GalNAc ligand), and anti-GM1 antibodies. After the assay was validated, 12 of 59 (20%) C. jejuni serostrains, including four serotypes that have not previously been associated with GBS, reacted with two or more anti-GM1 ganglioside reagents. Subsequently, LPS extracts from 5 of 7 (71%) C. jejuni isolates and 2 of 3 (67%) C. jejuni culture collection strains bore GM1 structures. Overall, the assay system was reliable, efficient, and reproducible and may be adapted for large-scale epidemiological studies. PMID:11283076
Enzyme assisted extraction of biomolecules as an approach to novel extraction technology: A review.
Nadar, Shamraja S; Rao, Priyanka; Rathod, Virendra K
2018-06-01
Interest in the development of extraction techniques for biomolecules from various natural sources has increased in recent years due to their potential applications, particularly for food and nutraceutical purposes. The presence of polysaccharides such as hemicelluloses, starch, and pectin inside the cell wall reduces the efficiency of conventional extraction techniques. Conventional techniques also suffer from low extraction yields, time inefficiency, and inferior extract quality due to traces of organic solvents present in the extracts. Hence, there is a need for green and novel extraction methods to recover biomolecules. The present review provides a holistic insight into various aspects related to enzyme-aided extraction. Applications of enzymes in the recovery of various biomolecules such as polyphenols, oils, polysaccharides, flavours and colorants have been highlighted. Additionally, hyphenated extraction technologies can overcome some of the major drawbacks of enzyme-based extraction, such as longer extraction times and immoderate use of solvents. This review also covers hyphenated intensification techniques that couple conventional methods with ultrasound, microwave, high pressure and supercritical carbon dioxide. The last section gives an insight into the application of enzyme immobilization as a strategy for large-scale extraction. Immobilization of enzymes on magnetic nanoparticles can be employed to enhance the operational performance of the system by enabling multiple uses of expensive enzymes, making them industrially and economically feasible. Copyright © 2018 Elsevier Ltd. All rights reserved.
A simple purification and activity assay of the coagulant protein from Moringa oleifera seed.
Ghebremichael, Kebreab A; Gunaratna, K R; Henriksson, Hongbin; Brumer, Harry; Dalhammar, Gunnel
2005-06-01
Use of extracts from Moringa oleifera (MO) is of great interest for low-cost water treatment. This paper discusses water and salt extraction of a coagulant protein from the seed, its purification using ion exchange, its chemical characteristics, and its coagulation and antimicrobial properties. The coagulant from both extracts is a cationic protein with a pI greater than 9.6 and a molecular mass less than 6.5 kDa. Mass spectrometric analysis of the purified water extract indicated that it contained at least four homologous proteins, based on MS/MS peptide sequence data. The protein is thermoresistant and remained active after 5 h of heat treatment at 95 °C. The coagulant protein showed both flocculating and antibacterial effects, with a 1.1-4 log reduction in bacterial counts. With samples of high turbidity, the MO extract showed coagulation activity similar to that of alum. Cecropin A and the MO extract were found to have similar flocculation effects for clay and microorganisms. Simple methods for both the purification and assay of MO coagulating proteins are presented, which are necessary for large-scale water treatment applications.
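The 1.1-4 log reduction can be read as a base-10 logarithm of the ratio of viable counts before and after treatment; a minimal sketch with hypothetical counts:

```python
import math

def log_reduction(before, after):
    """Log10 reduction in viable counts (e.g. CFU/mL) after treatment."""
    return math.log10(before / after)

# Hypothetical counts giving a 4-log (99.99%) reduction, the upper end of
# the 1.1-4 log antibacterial effect reported for the MO coagulant protein.
print(log_reduction(1e6, 1e2))  # 4.0
```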
Fast Reduction Method in Dominance-Based Information Systems
NASA Astrophysics Data System (ADS)
Li, Yan; Zhou, Qinghua; Wen, Yongchuan
2018-01-01
In real-world applications, there are often data with continuous values or preference-ordered values. Rough sets based on dominance relations can effectively deal with these kinds of data. Attribute reduction can be done in the framework of the dominance-relation based approach to better extract decision rules. However, the computational cost of the dominance classes greatly affects the efficiency of attribute reduction and rule extraction. This paper presents an efficient method of computing dominance classes, and further compares it with the traditional method as the numbers of attributes and samples increase. Experiments on UCI data sets show that the proposed algorithm clearly improves on the efficiency of the traditional method, especially for large-scale data.
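A dominance class in this setting collects the objects that weakly dominate a given object on every (preference-ordered) criterion. The naive computation below, quadratic in the number of objects, is a toy baseline only; the paper's contribution is a faster method, not shown here. The decision table is hypothetical.

```python
def dominating_set(data, obj, attrs):
    """Objects whose values are >= those of `obj` on every
    preference-ordered attribute (the dominating class of `obj`)."""
    return {j for j, row in enumerate(data)
            if all(row[a] >= data[obj][a] for a in attrs)}

# Toy decision table: rows are objects, columns are ordered criteria.
table = [
    (1, 2),
    (2, 2),
    (2, 3),
    (1, 1),
]
attrs = (0, 1)

# Dominance class of object 0, i.e. everything dominating (1, 2).
print(sorted(dominating_set(table, 0, attrs)))  # [0, 1, 2]
```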
Quantum algorithms for topological and geometric analysis of data
Lloyd, Seth; Garnerone, Silvano; Zanardi, Paolo
2016-01-01
Extracting useful information from large data sets can be a daunting task. Topological methods for analysing data sets provide a powerful technique for extracting such information. Persistent homology is a sophisticated tool for identifying topological features and for determining how such features persist as the data is viewed at different scales. Here we present quantum machine learning algorithms for calculating Betti numbers—the numbers of connected components, holes and voids—in persistent homology, and for finding eigenvectors and eigenvalues of the combinatorial Laplacian. The algorithms provide an exponential speed-up over the best currently known classical algorithms for topological data analysis. PMID:26806491
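The simplest of the Betti numbers, β0, counts connected components. The sketch below computes it classically with union-find on a graph (the 1-skeleton of a complex); note the paper's contribution is a quantum algorithm with exponential speed-up, not this classical baseline.

```python
def betti_0(n_vertices, edges):
    """Betti-0 (number of connected components) of a graph,
    computed classically with union-find."""
    parent = list(range(n_vertices))

    def find(x):
        # Find the root representative, with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    components = n_vertices
    for a, b in edges:
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb
            components -= 1
    return components

# Two triangles plus one isolated vertex -> 3 components.
print(betti_0(7, [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]))  # 3
```

In persistent homology the same count is tracked while the complex grows with the scale parameter, which is where the higher Betti numbers (holes, voids) and the quantum speed-up become relevant.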
Yang, Zhi; Wu, Youqian; Wu, Shihua
2016-01-29
Despite substantial developments in extraction and separation techniques, isolation of natural products from natural resources is still a challenging task. In this work, an efficient strategy for the extraction and isolation of multi-component natural products has been successfully developed by combining systematic two-phase liquid-liquid extraction with ¹³C NMR pattern recognition and subsequent conical counter-current chromatography separation. A small-scale crude sample was first distributed into 9 systematic hexane-ethyl acetate-methanol-water (HEMWat) two-phase solvent systems for determination of the optimum extraction solvents and the partition coefficients of the prominent components. Then, the optimized solvent systems were used in succession to enrich the hydrophilic and lipophilic components from the large-scale crude sample. Finally, the enriched component samples were further purified by a new conical counter-current chromatography (CCC). Owing to the use of ¹³C NMR pattern recognition, the kinds and structures of the major components in the solvent extracts could be predicted. Therefore, the method could simultaneously provide the partition coefficients and the structural information of components in the selected two-phase solvents. As an example, a cytotoxic extract of podophyllotoxins and flavonoids from Dysosma versipellis (Hance) was selected. After the systematic HEMWat solvent extraction and ¹³C NMR pattern recognition analyses, the crude extract of D. versipellis was first degreased with the upper phase of the HEMWat system (9:1:9:1, v/v), and then distributed in the two phases of the HEMWat system (2:8:2:8, v/v) to obtain the hydrophilic lower-phase extract and the lipophilic upper-phase extract, respectively. These extracts were further separated by conical CCC with the HEMWat systems (1:9:1:9 and 4:6:4:6, v/v). As a result, a total of 17 cytotoxic compounds were isolated and identified. Overall, the results suggest that the strategy is very efficient for the systematic extraction and isolation of biologically active components from complex biomaterials. Copyright © 2016 Elsevier B.V. All rights reserved.
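The partition coefficient that drives the solvent-system selection is simply the ratio of a component's concentration (or peak response) in the two phases; a minimal sketch with hypothetical values:

```python
def partition_coefficient(c_upper, c_lower):
    """K = concentration of a component in the upper phase divided by
    its concentration in the lower phase of a two-phase solvent system."""
    return c_upper / c_lower

# Hypothetical peak responses of one component measured in both phases of
# a HEMWat system; K near 1 is generally preferred for counter-current
# chromatography separations.
k = partition_coefficient(0.9, 1.1)
print(round(k, 2))
```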
NASA Astrophysics Data System (ADS)
Frikha, Mayssa; Fendri, Emna; Hammami, Mohamed
2017-09-01
Using semantic attributes such as gender, clothes, and accessories to describe people's appearance is an appealing modeling method for video surveillance applications. We propose a midlevel appearance signature based on extracting a list of nameable semantic attributes describing the body under uncontrolled acquisition conditions. Conventional approaches extract the same set of low-level features to learn all semantic classifiers uniformly. Their critical limitation is the inability to capture the dominant visual characteristics of each trait separately. The proposed approach extracts low-level features in an attribute-adaptive way by automatically selecting the most relevant features for each attribute separately. Furthermore, relying on a small training dataset would easily lead to poor performance due to the large intraclass and interclass variations. We annotated large-scale people images collected from different person reidentification benchmarks, covering a large attribute sample and reflecting the challenges of uncontrolled acquisition conditions. These annotations were gathered into an appearance semantic attribute dataset that contains 3590 images annotated with 14 attributes. Various experiments prove that features carefully designed to learn the visual characteristics of an attribute improve the correct-classification accuracy and reduce both spatial and temporal complexity relative to state-of-the-art approaches.
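The attribute-adaptive step can be illustrated by scoring each low-level feature against one attribute's labels and keeping only the top-ranked features. The correlation-based scoring below is a generic stand-in, not the authors' actual selection criterion:

```python
import numpy as np

def select_features_per_attribute(X, y, k=10):
    """Rank features by |correlation| with one binary attribute label
    and return the indices of the top-k most relevant features."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum())
    denom[denom == 0] = np.inf  # constant features get a zero score
    scores = np.abs(Xc.T @ yc / denom)
    return np.argsort(scores)[::-1][:k]
```

Running this once per attribute (gender, clothing type, accessories, ...) yields a different feature subset for each, which is the essence of attribute-adaptive extraction.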
2014-01-01
Background Independent data sources can be used to augment post-marketing drug safety signal detection. The vast amount of publicly available biomedical literature contains rich side effect information for drugs at all clinical stages. In this study, we present a large-scale signal boosting approach that combines over 4 million records in the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) and over 21 million biomedical articles. Results The datasets comprise 4,285,097 records from FAERS and 21,354,075 MEDLINE articles. We first extracted all drug-side effect (SE) pairs from FAERS. Our study implemented a total of seven signal ranking algorithms. We then compared these different ranking algorithms before and after they were boosted with signals from MEDLINE sentences or abstracts. Finally, we manually curated all drug-cardiovascular (CV) pairs that appeared in both data sources and investigated whether our approach can detect many true signals that have not been included in FDA drug labels. We extracted a total of 2,787,797 drug-SE pairs from FAERS with a low initial precision of 0.025. The ranking algorithm combined signals from both FAERS and MEDLINE, significantly improving the precision from 0.025 to 0.371 for top-ranked pairs, representing a 13.8-fold elevation in precision. We showed by manual curation that drug-SE pairs that appeared in both data sources were highly enriched with true signals, many of which have not yet been included in FDA drug labels. Conclusions We have developed an efficient and effective drug safety signal ranking and strengthening approach. We demonstrate that combining large-scale information from FAERS and the biomedical literature can significantly contribute to drug safety surveillance. PMID:24428898
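A schematic of the boosting idea: FAERS-derived signal scores are re-ranked so that pairs with independent literature support rise to the top. The multiplicative boost below is purely illustrative; the study itself evaluates seven distinct ranking algorithms:

```python
def boosted_rank(pairs, faers_score, medline_count):
    """Re-rank drug-side-effect pairs: pairs co-reported in the
    literature are promoted ahead of FAERS-only signals.
    faers_score: pair -> disproportionality score from FAERS.
    medline_count: pair -> number of supporting MEDLINE sentences."""
    def key(pair):
        lit = medline_count.get(pair, 0)
        # literature evidence acts as a multiplicative boost (illustrative)
        return faers_score[pair] * (1 + lit)
    return sorted(pairs, key=key, reverse=True)
```

Here a pair with a modest FAERS score but several literature mentions outranks a slightly stronger FAERS-only signal, mirroring the precision gain reported above.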
Xu, Rong; Wang, QuanQiu
2014-01-15
Independent data sources can be used to augment post-marketing drug safety signal detection. The vast amount of publicly available biomedical literature contains rich side effect information for drugs at all clinical stages. In this study, we present a large-scale signal boosting approach that combines over 4 million records in the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) and over 21 million biomedical articles. The datasets comprise 4,285,097 records from FAERS and 21,354,075 MEDLINE articles. We first extracted all drug-side effect (SE) pairs from FAERS. Our study implemented a total of seven signal ranking algorithms. We then compared these different ranking algorithms before and after they were boosted with signals from MEDLINE sentences or abstracts. Finally, we manually curated all drug-cardiovascular (CV) pairs that appeared in both data sources and investigated whether our approach can detect many true signals that have not been included in FDA drug labels. We extracted a total of 2,787,797 drug-SE pairs from FAERS with a low initial precision of 0.025. The ranking algorithm combined signals from both FAERS and MEDLINE, significantly improving the precision from 0.025 to 0.371 for top-ranked pairs, representing a 13.8-fold elevation in precision. We showed by manual curation that drug-SE pairs that appeared in both data sources were highly enriched with true signals, many of which have not yet been included in FDA drug labels. We have developed an efficient and effective drug safety signal ranking and strengthening approach. We demonstrate that combining large-scale information from FAERS and the biomedical literature can significantly contribute to drug safety surveillance.
Fast Coding of Orientation in Primary Visual Cortex
Shriki, Oren; Kohn, Adam; Shamir, Maoz
2012-01-01
Understanding how populations of neurons encode sensory information is a major goal of systems neuroscience. Attempts to answer this question have focused on responses measured over several hundred milliseconds, a duration much longer than that frequently used by animals to make decisions about the environment. How reliably sensory information is encoded on briefer time scales, and how best to extract this information, is unknown. Although it has been proposed that neuronal response latency provides a major cue for fast decisions in the visual system, this hypothesis has not been tested systematically and in a quantitative manner. Here we use a simple ‘race to threshold’ readout mechanism to quantify the information content of spike time latency of primary visual (V1) cortical cells to stimulus orientation. We find that many V1 cells show pronounced tuning of their spike latency to stimulus orientation and that almost as much information can be extracted from spike latencies as from firing rates measured over much longer durations. To extract this information, stimulus onset must be estimated accurately. We show that the responses of cells with weak tuning of spike latency can provide a reliable onset detector. We find that spike latency information can be pooled from a large neuronal population, provided that the decision threshold is scaled linearly with the population size, yielding a processing time of the order of a few tens of milliseconds. Our results provide a novel mechanism for extracting information from neuronal populations over the very brief time scales in which behavioral judgments must sometimes be made. PMID:22719237
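The "race to threshold" readout can be sketched directly: each candidate orientation accumulates spikes from its population, and the first to reach a fixed count wins. A minimal version, assuming spike latencies are already pooled per orientation (the study further scales the threshold linearly with population size):

```python
def race_to_threshold(spike_times, threshold):
    """spike_times: dict mapping candidate orientation -> list of spike
    latencies pooled over that orientation's neuronal population.
    The first orientation whose cumulative spike count reaches
    `threshold` determines the decision; returns (winner, decision time)."""
    best_label, best_time = None, float("inf")
    for label, times in spike_times.items():
        times = sorted(times)
        if len(times) >= threshold and times[threshold - 1] < best_time:
            best_label, best_time = label, times[threshold - 1]
    return best_label, best_time
```

Because the decision is made at the threshold-crossing spike, the readout uses only the earliest spikes, which is why latency tuning alone can carry nearly as much information as full firing rates.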
Large-Scale Event Extraction from Literature with Multi-Level Gene Normalization
Wei, Chih-Hsuan; Hakala, Kai; Pyysalo, Sampo; Ananiadou, Sophia; Kao, Hung-Yu; Lu, Zhiyong; Salakoski, Tapio; Van de Peer, Yves; Ginter, Filip
2013-01-01
Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/).
Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from http://evexdb.org/download/, under the Creative Commons – Attribution – Share Alike (CC BY-SA) license. PMID:23613707
Apparatus for the production of boron nitride nanotubes
Smith, Michael W; Jordan, Kevin
2014-06-17
An apparatus for the large-scale production of boron nitride nanotubes comprising: a pressure chamber containing a continuously fed boron-containing target; a source of thermal energy, preferably a focused laser beam; a cooled condenser; a source of pressurized nitrogen gas; and a mechanism for extracting from the pressure chamber the boron nitride nanotubes that condense on or in the area of the cooled condenser.
Russian Political, Economic, and Security Issues and U.S. Interests
2007-01-18
polonium 210 from Moscow, through Germany, to London, apparently carried by one of the Russians Litvinenko met November 1. Russian authorities deny...radio under tight state control and virtually eliminated effective political opposition. Federal forces have suppressed large-scale military resistance...Russia’s needs — food and food processing, oil and gas extraction technology, computers, communications, transportation, and investment capital — are
Measurement of Device Parameters Using Image Recovery Techniques in Large-Scale IC Devices
NASA Technical Reports Server (NTRS)
Scheick, Leif; Edmonds, Larry
2004-01-01
Devices that respond to radiation on a cell level will produce histograms showing the relative frequency of cell damage as a function of damage. The measured distribution is the convolution of distributions from radiation responses, measurement noise, and manufacturing parameters. A method of extracting device characteristics and parameters from measured distributions via mathematical and image subtraction techniques is described.
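Since the measured distribution is a convolution of device response, noise, and manufacturing distributions, the simplest recovery step is subtracting a separately measured noise-only histogram. A sketch of just that subtraction step (the full method described above uses image-recovery techniques beyond this):

```python
import numpy as np

def subtract_noise_histogram(measured, noise):
    """Remove a separately measured noise-only histogram from the
    measured damage histogram, clipping negative bins to zero
    (bin counts cannot be negative)."""
    measured = np.asarray(measured, dtype=float)
    noise = np.asarray(noise, dtype=float)
    return np.clip(measured - noise, 0.0, None)
```

True deconvolution (rather than bin-wise subtraction) would be needed when the noise broadens, rather than offsets, the measured distribution.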
Salt-gradient Solar Ponds: Summary of US Department of Energy Sponsored Research
NASA Technical Reports Server (NTRS)
French, R. L.; Johnson, D. H.; Jones, G. F.; Zangrando, F.
1984-01-01
The solar pond research program conducted by the United States Department of Energy was discontinued after 1983. This document summarizes the results of the program, reviews the state of the art, and identifies the remaining outstanding issues. "Solar pond" is a generic term, but in the context of this report it refers specifically to the salt-gradient solar pond. Several small research solar ponds have been built and successfully tested. Procedures for filling the pond, maintaining the gradient, adjusting the zone boundaries, and extracting heat were developed. Theories and models were developed and verified. The major remaining unknowns or issues involve the physical behavior of large ponds, i.e., wind mixing of the surface, the lateral range or reach of horizontally injected fluids, ground thermal losses, and gradient-zone boundary erosion caused by pumping fluid for heat extraction. These issues cannot be scaled and must be studied in a large outdoor solar pond.
Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L.; Costanzo, Michael; Andrews, Brenda; Boone, Charles
2017-01-01
Providing access to quantitative genomic data is key to ensure large-scale data validation and promote new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. PMID:28325812
Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L; Costanzo, Michael; Andrews, Brenda; Boone, Charles
2017-05-05
Providing access to quantitative genomic data is key to ensure large-scale data validation and promote new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. Copyright © 2017 Usaj et al.
[Between anxiety and depression. The status of assertiveness disorders and social phobias].
Granger, B; Azais, F; Albercque, C; Debray, Q
1995-05-01
The authors address the question of the nosological status of social phobias and assertiveness difficulties, which are usually included in the large group of anxiety disorders. Correlations between the Rathus Rating Scale, the Hamilton Depression Rating Scale (HDRS), and HDRS sub-scores were studied in two populations: the first consisted of anxious and/or depressed patients; the second, extracted from the first, of anxious patients only. The results show that lack of assertiveness probably has both affective and anxious components. These results are important from a nosological and therapeutic point of view.
Assessing the importance of internal tide scattering in the deep ocean
NASA Astrophysics Data System (ADS)
Haji, Maha; Peacock, Thomas; Carter, Glenn; Johnston, T. M. Shaun
2014-11-01
Tides are one of the main sources of energy input to the deep ocean, and the pathways of energy transfer from barotropic tides to turbulent mixing scales via internal tides are not well understood. Large-scale (low-mode) internal tides account for the bulk of energy extracted from barotropic tides and have been observed to propagate over 1000 km from their generation sites. We seek to examine the fate of these large-scale internal tides and the processes by which their energy is transferred, or ``scattered,'' to small-scale (high-mode) internal tides, which dissipate locally and are responsible for internal tide driven mixing. The EXperiment on Internal Tide Scattering (EXITS) field study conducted in 2010-2011 sought to examine the role of topographic scattering at the Line Islands Ridge. The scattering process was examined via data from three moorings equipped with moored profilers, spanning total depths of 3000--5000 m. The results of our field data analysis are rationalized via comparison to data from two- and three-dimensional numerical models and a two-dimensional analytical model based on Green function theory.
Camphor-Enabled Transfer and Mechanical Testing of Centimeter-Scale Ultrathin Films.
Wang, Bin; Luo, Da; Li, Zhancheng; Kwon, Youngwoo; Wang, Meihui; Goo, Min; Jin, Sunghwan; Huang, Ming; Shen, Yongtao; Shi, Haofei; Ding, Feng; Ruoff, Rodney S
2018-05-21
Camphor is used to transfer centimeter-scale ultrathin films onto custom-designed substrates for mechanical (tensile) testing. Compared to traditional transfer methods using dissolving/peeling to remove the support-layers, camphor is sublimed away in air at low temperature, thereby avoiding additional stress on the as-transferred films. Large-area ultrathin films can be transferred onto hollow substrates without damage by this method. Tensile measurements are made on centimeter-scale 300 nm-thick graphene oxide film specimens, much thinner than the ≈2 μm minimum thickness of macroscale graphene-oxide films previously reported. Tensile tests were also done on two different types of large-area samples of adlayer free CVD-grown single-layer graphene supported by a ≈100 nm thick polycarbonate film; graphene stiffens this sample significantly, thus the intrinsic mechanical response of the graphene can be extracted. This is the first tensile measurement of centimeter-scale monolayer graphene films. The Young's modulus of polycrystalline graphene ranges from 637 to 793 GPa, while for near single-crystal graphene, it ranges from 728 to 908 GPa (folds parallel to the tensile loading direction) and from 683 to 775 GPa (folds orthogonal to the tensile loading direction), demonstrating the mechanical performance of large-area graphene in a size scale relevant to many applications. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
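Extracting the graphene modulus from the composite measurement can be illustrated with an isostrain rule of mixtures, E_c(t_g + t_p) = E_g·t_g + E_p·t_p, solved for E_g. This is a generic stiffness-partition sketch, not the paper's exact analysis, and the numbers in the test are illustrative:

```python
def graphene_modulus(E_composite, E_polymer, t_graphene, t_polymer):
    """Back out the thin-film modulus from a parallel (isostrain)
    rule of mixtures: E_c * (t_g + t_p) = E_g * t_g + E_p * t_p.
    Moduli in GPa, thicknesses in any consistent unit."""
    t_total = t_graphene + t_polymer
    return (E_composite * t_total - E_polymer * t_polymer) / t_graphene
```

Because the graphene layer is ~300x thinner than the polycarbonate support, even a small stiffening of the composite implies a very large film modulus, which is why the extraction is sensitive to accurate thickness values.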
How extractive industries affect health: Political economy underpinnings and pathways.
Schrecker, Ted; Birn, Anne-Emanuelle; Aguilera, Mariajosé
2018-06-07
A systematic and theoretically informed analysis of how extractive industries affect health outcomes and health inequities is overdue. Informed by the work of Saskia Sassen on "logics of extraction," we adopt an expansive definition of extractive industries to include (for example) large-scale foreign acquisitions of agricultural land for export production. To ground our analysis in concrete place-based evidence, we begin with a brief review of four case examples of major extractive activities. We then analyze the political economy of extractivism, focusing on the societal structures, processes, and relationships of power that drive and enable extraction. Next, we examine how this global order shapes and interacts with politics, institutions, and policies at the state/national level contextualizing extractive activity. Having provided necessary context, we posit a set of pathways that link the global political economy and national politics and institutional practices surrounding extraction to health outcomes and their distribution. These pathways involve both direct health effects, such as toxic work and environmental exposures and assassination of activists, and indirect effects, including sustained impoverishment, water insecurity, and stress-related ailments. We conclude with some reflections on the need for future research on the health and health equity implications of the global extractive order. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.
Bank gully extraction from DEMs utilizing the geomorphologic features of a loess hilly area in China
NASA Astrophysics Data System (ADS)
Yang, Xin; Na, Jiaming; Tang, Guoan; Wang, Tingting; Zhu, Axing
2018-04-01
As one of the most active gully types in the Chinese Loess Plateau, bank gullies generally indicate soil loss and land degradation. This study addressed the lack of detailed, large-scale monitoring of bank gullies and proposed a semi-automatic method for extracting bank gullies, given typical topographic features based on 5 m resolution DEMs. First, channel networks, including bank gullies, are extracted through an iterative channel burn-in algorithm. Second, gully heads are correctly positioned based on the spatial relationship between gully heads and their corresponding gully shoulder lines. Third, bank gullies are distinguished from other gullies using the newly proposed topographic measurement of "relative gully depth (RGD)." The experimental results from the loess hilly area of the Linjiajian watershed in the Chinese Loess Plateau show that the producer accuracy reaches 87.5%. The accuracy is affected by the DEM resolution and RGD parameters, as well as the accuracy of the gully shoulder line. The application in the Madigou watershed with a high DEM resolution validated the replicability of this method in other areas. The overall performance shows that bank gullies can be extracted with acceptable accuracy over a large area, which provides essential information for research on soil erosion, geomorphology, and environmental ecology.
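The "relative gully depth" criterion might be sketched as incision depth normalized by local relief. Note that this definition and the 0.5 threshold are hypothetical stand-ins for illustration, since the abstract does not give the exact RGD formula or parameter values:

```python
def relative_gully_depth(shoulder_elev, bottom_elev, local_relief):
    """Hypothetical RGD: gully incision depth (shoulder minus bottom
    elevation) normalized by the local hillslope relief."""
    depth = shoulder_elev - bottom_elev
    return depth / local_relief

def is_bank_gully(rgd, threshold=0.5):
    """Bank gullies incise deeply relative to the slope they cut into;
    the threshold here is illustrative, not the paper's calibrated value."""
    return rgd >= threshold
```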
A research of road centerline extraction algorithm from high resolution remote sensing images
NASA Astrophysics Data System (ADS)
Zhang, Yushan; Xu, Tingfa
2017-09-01
Satellite remote sensing technology has become one of the most effective methods for land-surface monitoring in recent years, owing to advantages such as short revisit periods, large coverage, and rich information. Road extraction is an important application of high-resolution remote sensing images: an intelligent, automatic road-extraction algorithm with high precision has great significance for transportation, road network updating, and urban planning. Fuzzy c-means (FCM) clustering segmentation algorithms have been used for road extraction, but the traditional algorithms do not consider spatial information. An improved fuzzy c-means clustering algorithm combined with spatial information (SFCM) is proposed in this paper and is shown to be effective for noisy image segmentation. First, the image is segmented using the SFCM. Second, the segmentation result is processed by mathematical morphology to remove the joined non-road regions. Third, the road centerlines are extracted by morphological thinning and burr trimming. The average completeness of the centerline extraction algorithm is 97.98%, the average correctness is 95.36%, and the average quality is 93.59%. Experimental results show that the proposed method is effective for road centerline extraction.
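The core SFCM idea, standard fuzzy c-means whose memberships are regularized by their 3x3 neighborhood, can be sketched in NumPy. This is a generic illustration rather than the paper's exact formulation; the blending weight `alpha` and the plain neighborhood averaging are assumptions:

```python
import numpy as np

def sfcm_segment(img, n_clusters=2, m=2.0, n_iter=30, alpha=0.5, seed=0):
    """Fuzzy c-means on pixel intensities with a spatial term: after each
    FCM membership update, memberships are blended with the mean
    membership of the 3x3 neighbourhood (weight alpha is illustrative)."""
    rng = np.random.default_rng(seed)
    h, w = img.shape
    x = img.reshape(-1, 1).astype(float)
    centers = rng.uniform(x.min(), x.max(), size=(n_clusters, 1))
    for _ in range(n_iter):
        d = np.abs(x - centers.T) + 1e-12            # (N, C) distances
        u = (1.0 / d) ** (2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)            # fuzzy memberships
        # spatial regularization: blend with 3x3 neighbourhood mean
        u_img = u.reshape(h, w, n_clusters)
        padded = np.pad(u_img, ((1, 1), (1, 1), (0, 0)), mode="edge")
        neigh = sum(padded[i:i+h, j:j+w]
                    for i in range(3) for j in range(3)) / 9.0
        u = ((1 - alpha) * u_img + alpha * neigh).reshape(-1, n_clusters)
        u /= u.sum(axis=1, keepdims=True)
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]
    return u.argmax(axis=1).reshape(h, w)
```

The spatial blend is what suppresses isolated noisy pixels that plain FCM would misclassify, which matters for speckled satellite imagery.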
Progress on lipid extraction from wet algal biomass for biodiesel production.
Ghasemi Naghdi, Forough; González González, Lina M; Chan, William; Schenk, Peer M
2016-11-01
Lipid recovery and purification from microalgal cells continues to be a significant bottleneck in biodiesel production due to high costs involved and a high energy demand. Therefore, there is a considerable necessity to develop an extraction method which meets the essential requirements of being safe, cost-effective, robust, efficient, selective, environmentally friendly, feasible for large-scale production and free of product contamination. The use of wet concentrated algal biomass as a feedstock for oil extraction is especially desirable as it would avoid the requirement for further concentration and/or drying. This would save considerable costs and circumvent at least two lengthy processes during algae-based oil production. This article provides an overview on recent progress that has been made on the extraction of lipids from wet algal biomass. The biggest contributing factors appear to be the composition of algal cell walls, pre-treatments of biomass and the use of solvents (e.g. a solvent mixture or solvent-free lipid extraction). We compare recently developed wet extraction processes for oleaginous microalgae and make recommendations towards future research to improve lipid extraction from wet algal biomass. © 2016 The Authors. Microbial Biotechnology published by John Wiley & Sons Ltd and Society for Applied Microbiology.
Podolak, Charles J.
2013-01-01
An ensemble of rule-based models was constructed to assess possible future braided river planform configurations for the Toklat River in Denali National Park and Preserve, Alaska. This approach combined an analysis of large-scale influences on stability with several reduced-complexity models to produce the predictions at a practical level for managers concerned about the persistence of bank erosion while acknowledging the great uncertainty in any landscape prediction. First, a model of confluence angles reproduced observed angles of a major confluence, but showed limited susceptibility to a major rearrangement of the channel planform downstream. Second, a probabilistic map of channel locations was created with a two-parameter channel avulsion model. The predicted channel belt location was concentrated in the same area as the current channel belt. Finally, a suite of valley-scale channel and braid plain characteristics were extracted from a light detection and ranging (LiDAR)-derived surface. The characteristics demonstrated large-scale stabilizing topographic influences on channel planform. The combination of independent analyses increased confidence in the conclusion that the Toklat River braided planform is a dynamically stable system due to large and persistent valley-scale influences, and that a range of avulsive perturbations are likely to result in a relatively unchanged planform configuration in the short term.
Statistical analysis of kinetic energy entrainment in a model wind turbine array boundary layer
NASA Astrophysics Data System (ADS)
Cal, Raul Bayoan; Hamilton, Nicholas; Kang, Hyung-Suk; Meneveau, Charles
2012-11-01
For large wind farms, kinetic energy must be entrained from the flow above the wind turbines to replenish wakes and enable power extraction in the array. Various statistical features of turbulence causing vertical entrainment of mean-flow kinetic energy are studied using hot-wire velocimetry data taken in a model wind farm in a scaled wind tunnel experiment. Conditional statistics and spectral decompositions are employed to characterize the most relevant turbulent flow structures and determine their length-scales. Sweep and ejection events are shown to be the largest contributors to the vertical kinetic energy flux, although their relative contribution depends upon the location in the wake. Sweeps are shown to be dominant in the region above the wind turbine array. A spectral analysis of the data shows that large scales of the flow, about the size of the rotor diameter in length or larger, dominate the vertical entrainment. The flow is more incoherent below the array, causing decreased vertical fluxes there. The results show that improving the rate of vertical kinetic energy entrainment into wind turbine arrays is a standing challenge and would require modifying the large-scale structures of the flow. This work was funded in part by the National Science Foundation (CBET-0730922, CBET-1133800 and CBET-0953053).
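The sweep/ejection decomposition described above is a standard quadrant analysis of the fluctuating velocities. A sketch of how each quadrant's contribution to the vertical momentum flux can be computed from hot-wire time series:

```python
import numpy as np

def quadrant_contributions(u, v):
    """Split the Reynolds shear stress -<u'v'> into quadrant events.
    Q2 (u' < 0, v' > 0) are ejections; Q4 (u' > 0, v' < 0) are sweeps."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    up = u - u.mean()
    vp = v - v.mean()
    prod = -up * vp
    n = len(u)
    return {
        "Q1":        prod[(up > 0) & (vp > 0)].sum() / n,
        "ejections": prod[(up < 0) & (vp > 0)].sum() / n,  # Q2
        "Q3":        prod[(up < 0) & (vp < 0)].sum() / n,
        "sweeps":    prod[(up > 0) & (vp < 0)].sum() / n,  # Q4
    }
```

The four contributions sum to -⟨u'v'⟩, so comparing the "sweeps" and "ejections" entries above versus below hub height reproduces the kind of conditional statistics used in the study.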
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xia, Kelin; Zhao, Zhixiong; Wei, Guo-Wei, E-mail: wei@math.msu.edu
Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large-scale datasets with appropriate resolution. We utilize a flexibility-rigidity index to assess the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273,780 atoms is successfully analyzed, which would otherwise be inaccessible to the normal point cloud method and unreliable by using coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to protein domain classification, which is, to our knowledge, the first time that persistent homology has been used for practical protein domain analysis. The proposed multiresolution topological method has potential applications to arbitrary data sets, such as social networks, biological networks, and graphs.
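The 0-dimensional layer of a persistence computation, on which such filtration analyses rest, can be sketched with a union-find sweep. Below is a toy 1D sublevel-set version; the paper's rigidity-density filtration is three-dimensional and multiresolution, so this only illustrates the birth/death bookkeeping of the elder rule:

```python
def persistence_0d(values):
    """0-dimensional sublevel-set persistence of a 1D function via the
    elder rule: each local minimum births a component, which dies when
    it merges into an older (lower-birth) component."""
    idx = sorted(range(len(values)), key=lambda i: values[i])
    comp, birth, pairs = {}, {}, []

    def find(i):
        while comp[i] != i:
            comp[i] = comp[comp[i]]  # path compression
            i = comp[i]
        return i

    for i in idx:                    # add vertices in filtration order
        comp[i] = i
        birth[i] = values[i]
        roots = {find(j) for j in (i - 1, i + 1) if j in comp}
        if roots:
            roots = sorted(roots, key=lambda r: birth[r])
            for r in roots[1:]:      # younger components die at this level
                pairs.append((birth[r], values[i]))
                comp[r] = roots[0]
            comp[i] = roots[0]
    pairs.append((min(values), float("inf")))  # the oldest never dies
    return sorted(pairs)
```

For the profile [0, 3, 1, 4] the minimum born at height 1 dies when it merges over the saddle at height 3, while the global minimum persists forever.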
NASA Astrophysics Data System (ADS)
Kashid, Satishkumar S.; Maity, Rajib
2012-08-01
Prediction of Indian Summer Monsoon Rainfall (ISMR) is of vital importance for the Indian economy, and it has remained a great challenge for hydro-meteorologists due to inherent complexities in the climatic system. Large-scale atmospheric circulation patterns from the tropical Pacific Ocean (ENSO) and the tropical Indian Ocean (EQUINOO) are established influences on the Indian Summer Monsoon Rainfall. The information in these two large-scale atmospheric circulation patterns, expressed through their indices, is used to model the complex relationship between Indian Summer Monsoon Rainfall and the ENSO and EQUINOO indices. However, extracting the signal from such large-scale indices for modeling such complex systems is significantly difficult. Rainfall predictions have been made for 'All India' as one unit, as well as for five 'homogeneous monsoon regions of India' defined by the Indian Institute of Tropical Meteorology. The recent artificial intelligence tool genetic programming (GP) has been employed for modeling this problem. The genetic programming approach is found to capture the complex relationship between monthly Indian Summer Monsoon Rainfall and the large-scale atmospheric circulation pattern indices ENSO and EQUINOO. The findings of this study indicate that GP-derived monthly rainfall forecasting models that use large-scale atmospheric circulation information are successful in predicting All India Summer Monsoon Rainfall, with a correlation coefficient as high as 0.866, which is attractive for such a complex system. A separate analysis was carried out for All India Summer Monsoon Rainfall, for India as one unit and for the five homogeneous monsoon regions, based on the ENSO and EQUINOO indices of March, April, and May only, performed at the end of May. In this case, All India Summer Monsoon Rainfall could be predicted with a correlation coefficient of 0.70, with somewhat lower correlation coefficient (C.C.)
values for the different 'homogeneous monsoon regions'.
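The skill scores quoted above (0.866 and 0.70) are Pearson correlation coefficients between forecast and observed rainfall. For reference, a minimal implementation of that score:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series,
    e.g. GP-forecast rainfall versus observed rainfall."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```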
Ahmad, Riaz; Naz, Saeeda; Afzal, Muhammad Zeshan; Amin, Sayed Hassan; Breuel, Thomas
2015-01-01
The presence of a large number of unique shapes called ligatures in cursive languages, along with variations due to scaling, orientation, and location, provides one of the most challenging pattern recognition problems. Recognition of the large number of ligatures is often a complicated task in oriental languages such as Pashto, Urdu, Persian, and Arabic. Research on cursive script recognition often ignores the fact that scaling, orientation, location, and font variations are common in printed cursive text; therefore, these variations are not included in image databases and in experimental evaluations. This research uncovers challenges faced by Arabic cursive script recognition in a holistic framework by considering Pashto as a test case, because Pashto has a larger alphabet than Arabic, Persian, and Urdu. A database containing 8000 images of 1000 unique ligatures with scaling, orientation, and location variations is introduced. In this article, a feature space based on the scale-invariant feature transform (SIFT), along with a segmentation framework, is proposed for overcoming the above-mentioned challenges. The experimental results show a significantly improved performance of the proposed scheme over traditional feature extraction techniques such as principal component analysis (PCA). PMID:26368566
Geometric quantification of features in large flow fields.
Kendall, Wesley; Huang, Jian; Peterka, Tom
2012-01-01
Interactive exploration of flow features in large-scale 3D unsteady-flow data is one of the most challenging visualization problems today. To comprehensively explore the complex feature spaces in these datasets, a proposed system employs a scalable framework for investigating a multitude of characteristics from traced field lines. This capability supports the examination of various neighborhood-based geometric attributes in concert with other scalar quantities. Such an analysis wasn't previously possible because of the large computational overhead and I/O requirements. The system integrates visual analytics methods by letting users procedurally and interactively describe and extract high-level flow features. An exploration of various phenomena in a large global ocean-modeling simulation demonstrates the approach's generality and expressiveness as well as its efficacy.
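Tracing field lines through the flow, the basis for the neighborhood-based geometric attributes described above, can be sketched with a classical fourth-order Runge-Kutta integrator over a steady velocity field:

```python
import numpy as np

def trace_field_line(velocity, seed, dt=0.01, n_steps=1000):
    """Integrate a particle path through a steady velocity field with
    classical RK4. `velocity(p)` returns the flow vector at point p."""
    p = np.asarray(seed, dtype=float)
    path = [p.copy()]
    for _ in range(n_steps):
        k1 = velocity(p)
        k2 = velocity(p + 0.5 * dt * k1)
        k3 = velocity(p + 0.5 * dt * k2)
        k4 = velocity(p + dt * k3)
        p = p + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        path.append(p.copy())
    return np.array(path)
```

Geometric attributes (curvature, torsion, winding) are then computed per traced line; in an at-scale system like the one described, the tracing itself is distributed because the field data exceeds single-node memory.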
Bravo, Àlex; Piñero, Janet; Queralt-Rosinach, Núria; Rautschka, Michael; Furlong, Laura I
2015-02-21
Current biomedical research needs to leverage and exploit the large amount of information reported in scientific publications. Automated text mining approaches, in particular those aimed at finding relationships between entities, are key for identification of actionable knowledge from free text repositories. We present the BeFree system aimed at identifying relationships between biomedical entities with a special focus on genes and their associated diseases. By exploiting morpho-syntactic information of the text, BeFree is able to identify gene-disease, drug-disease and drug-target associations with state-of-the-art performance. The application of BeFree to real-case scenarios shows its effectiveness in extracting information relevant for translational research. We show the value of the gene-disease associations extracted by BeFree through a number of analyses and integration with other data sources. BeFree succeeds in identifying genes associated with a major cause of morbidity worldwide, depression, which are not present in other public resources. Moreover, large-scale extraction and analysis of gene-disease associations, and integration with current biomedical knowledge, provided interesting insights on the kind of information that can be found in the literature, and raised challenges regarding data prioritization and curation. We found that only a small proportion of the gene-disease associations discovered by using BeFree is collected in expert-curated databases. Thus, there is a pressing need to find alternative strategies to manual curation, in order to review, prioritize and curate text-mining data and incorporate it into domain-specific databases. We present our strategy for data prioritization and discuss its implications for supporting biomedical research and applications. BeFree is a novel text mining system that performs competitively for the identification of gene-disease, drug-disease and drug-target associations.
Our analyses show that mining only a small fraction of MEDLINE results in a large dataset of gene-disease associations, and only a small proportion of this dataset is actually recorded in curated resources (2%), raising several issues on data prioritization and curation. We propose that joint analysis of text-mined data with data curated by experts is a suitable approach to both assess data quality and highlight novel and interesting information.
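BeFree itself relies on morpho-syntactic analysis of sentences, but the kind of gene-disease pairing it produces can be illustrated with a far cruder sentence-level co-occurrence baseline. The gene and disease dictionaries below are toy stand-ins for the curated lexicons a real pipeline would load, so this is a minimal sketch of the idea, not BeFree's method:

```python
import itertools
import re

# Toy dictionaries -- illustrative stand-ins; a real system loads
# curated gene and disease lexicons.
GENES = {"BDNF", "SLC6A4", "COMT"}
DISEASES = {"depression", "anxiety"}

def cooccurrences(text):
    """Sentence-level gene-disease co-mentions: a crude co-occurrence
    baseline, far simpler than BeFree's morpho-syntactic analysis."""
    pairs = set()
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        genes = {g for g in GENES if re.search(rf"\b{g}\b", sentence)}
        diseases = {d for d in DISEASES
                    if re.search(rf"\b{d}\b", sentence, re.IGNORECASE)}
        pairs |= set(itertools.product(genes, diseases))
    return pairs

abstract = ("BDNF expression is reduced in depression. "
            "COMT variants were not associated with outcome.")
found = cooccurrences(abstract)
```

Such a baseline over-generates (any co-mention counts), which is exactly why relation-extraction systems add syntactic filtering and why curation of the resulting associations remains necessary.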
Additional Results of Glaze Icing Scaling in SLD Conditions
NASA Technical Reports Server (NTRS)
Tsao, Jen-Ching
2016-01-01
New guidance on acceptable means of compliance for super-cooled large drop (SLD) conditions has been issued by the U.S. Department of Transportation's Federal Aviation Administration (FAA) in its Advisory Circular AC 25-28 in November 2014. Part 25, Appendix O was developed to define a representative icing environment for super-cooled large drops. Super-cooled large drops, which include freezing drizzle and freezing rain conditions, are not included in Appendix C. This paper reports results from recent glaze icing scaling tests conducted in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the scaling methods recommended for Appendix C conditions might apply to SLD conditions. The models were straight NACA 0012 wing sections. The reference model had a chord of 72 inches and the scale model had a chord of 21 inches. Reference tests were run with airspeeds of 100 and 130.3 knots and with MVDs of 85 and 170 microns. Two scaling methods were considered. One was based on the modified Ruff method with scale velocity found by matching the Weber number We_L. The other was proposed and developed by Feo specifically for strong glaze icing conditions, in which the scale liquid water content and velocity were found by matching reference and scale values of the non-dimensional water-film thickness expression and the film Weber number We_f. All tests were conducted at 0 deg angle of attack (AOA). Results will be presented for stagnation freezing fractions of 0.2 and 0.3. For non-dimensional reference and scale ice shape comparison, a new post-scanning ice shape digitization procedure was developed for extracting 2-dimensional ice shape profiles at any selected span-wise location from the high-fidelity 3-dimensional scanned ice shapes obtained in the IRT.
NASA Astrophysics Data System (ADS)
Fernández, Ariel; Ferrari, José A.
2017-05-01
Pattern recognition and feature extraction are image processing applications of great interest in defect inspection and robot vision, among others. In comparison to purely digital methods, the attractiveness of optical processors for pattern recognition lies in their highly parallel operation and real-time processing capability. This work presents an optical implementation of the generalized Hough transform (GHT), a well-established technique for recognition of geometrical features in binary images. Detection of a geometric feature under the GHT is accomplished by mapping the original image to an accumulator space; the large computational requirements for this mapping make the optical implementation an attractive alternative to digital-only methods. We explore an optical setup where the transformation is obtained, and the size and orientation parameters can be controlled, allowing for dynamic scale- and orientation-variant pattern recognition. A compact system for the above purposes results from the use of an electrically tunable lens for scale control and a pupil mask implemented on a high-contrast spatial light modulator for orientation/shape variation of the template. Real-time operation can also be achieved. In addition, by thresholding of the GHT and optically inverse transforming, the previously detected features of interest can be extracted.
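The accumulator mapping that makes the GHT computationally heavy, and hence attractive to implement optically, can be sketched digitally for the translation-only case: each feature pixel votes for candidate reference-point locations using displacement vectors learned from the template. The cross-shaped template and image below are illustrative; handling rotation and scale would add dimensions to the accumulator:

```python
import numpy as np

def r_table(template):
    """Displacements from each template feature point to the reference
    point (here, the template centroid)."""
    ys, xs = np.nonzero(template)
    ref_y, ref_x = ys.mean(), xs.mean()
    return [(ref_y - y, ref_x - x) for y, x in zip(ys, xs)]

def ght_vote(image, displacements):
    """Each feature pixel casts one vote per displacement; the accumulator
    peaks where the template's reference point sits in the image."""
    acc = np.zeros(image.shape)
    h, w = image.shape
    for y, x in zip(*np.nonzero(image)):
        for dy, dx in displacements:
            ry, rx = int(round(y + dy)), int(round(x + dx))
            if 0 <= ry < h and 0 <= rx < w:
                acc[ry, rx] += 1
    return acc

# A cross-shaped template placed at a known offset in a larger image.
template = np.zeros((7, 7), dtype=int)
template[3, :] = 1
template[:, 3] = 1
image = np.zeros((40, 40), dtype=int)
image[17:24, 27:34] = template
peak = np.unravel_index(ght_vote(image, r_table(template)).argmax(),
                        image.shape)
```

The double loop over feature pixels and displacements is the cost the optical setup parallelizes; thresholding `acc` and inverse-transforming corresponds to the feature-extraction step mentioned in the abstract.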
Reverse engineering and analysis of large genome-scale gene networks
Aluru, Maneesha; Zola, Jaroslaw; Nettleton, Dan; Aluru, Srinivas
2013-01-01
Reverse engineering the whole-genome networks of complex multicellular organisms continues to remain a challenge. While simpler models easily scale to large numbers of genes and gene expression datasets, more accurate models are compute intensive, limiting their scale of applicability. To enable fast and accurate reconstruction of large networks, we developed Tool for Inferring Network of Genes (TINGe), a parallel mutual information (MI)-based program. The novel features of our approach include: (i) a B-spline-based formulation for linear-time computation of MI, (ii) a novel algorithm for direct permutation testing and (iii) development of parallel algorithms to reduce run-time and facilitate construction of large networks. We assess the quality of our method by comparison with ARACNe (Algorithm for the Reconstruction of Accurate Cellular Networks) and GeneNet and demonstrate its unique capability by reverse engineering the whole-genome network of Arabidopsis thaliana from 3137 Affymetrix ATH1 GeneChips in just 9 min on a 1024-core cluster. We further report on the development of a new software tool, Gene Network Analyzer (GeNA), for extracting context-specific subnetworks from a given set of seed genes. Using TINGe and GeNA, we performed analysis of 241 Arabidopsis AraCyc 8.0 pathways, and the results are made available through the web. PMID:23042249
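The two core ingredients of TINGe, an MI estimate between expression profiles and a direct permutation test, can be sketched as follows. This uses a plain histogram estimator rather than the paper's B-spline formulation (which spreads each sample over neighbouring bins), and synthetic profiles in place of GeneChip data:

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Histogram-based MI estimate in nats -- a stand-in for TINGe's
    B-spline estimator."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def permutation_pvalue(x, y, n_perm=200, seed=1):
    """Direct permutation test: shuffle one profile to break any
    dependence, re-estimate MI, and compare with the observed value."""
    rng = np.random.default_rng(seed)
    observed = mutual_info(x, y)
    null = [mutual_info(rng.permutation(x), y) for _ in range(n_perm)]
    return (1 + sum(m >= observed for m in null)) / (1 + n_perm)

rng = np.random.default_rng(0)
x = rng.normal(size=500)
y = 0.8 * x + 0.6 * rng.normal(size=500)   # co-expressed pair
z = rng.normal(size=500)                   # unrelated profile
p_pair, p_null = permutation_pvalue(x, y), permutation_pvalue(x, z)
```

TINGe's contribution is doing this at whole-genome scale: linear-time MI and parallel permutation testing are what make 3137 chips by ~23k genes tractable.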
Jang, Min Jee; Nam, Yoonkey
2015-01-01
Optical recording facilitates monitoring the activity of a large neural network at the cellular scale, but the analysis and interpretation of the collected data remain challenging. Here, we present a MATLAB-based toolbox, named NeuroCa, for the automated processing and quantitative analysis of large-scale calcium imaging data. Our tool includes several computational algorithms to extract the calcium spike trains of individual neurons from the calcium imaging data in an automatic fashion. Two algorithms were developed to decompose the imaging data into the activity of individual cells and subsequently detect calcium spikes from each neuronal signal. Applying our method to dense networks in dissociated cultures, we were able to obtain the calcium spike trains of ∼1000 neurons in a few minutes. Further analyses using these data permitted the quantification of neuronal responses to chemical stimuli as well as functional mapping of spatiotemporal patterns in neuronal firing within the spontaneous, synchronous activity of a large network. These results demonstrate that our method not only automates time-consuming, labor-intensive tasks in the analysis of neural data obtained using optical recording techniques but also provides a systematic way to visualize and quantify the collective dynamics of a network in terms of its cellular elements. PMID:26229973
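The spike-detection step can be illustrated with a minimal robust-threshold detector on a synthetic fluorescence trace (NeuroCa is a MATLAB toolbox and its algorithms, like its cell-segmentation stage, are more elaborate; the threshold rule and synthetic transients below are assumptions for illustration):

```python
import numpy as np

def detect_onsets(trace, k=5.0):
    """Flag rising edges where the signal exceeds a robust
    median + k*MAD threshold -- a minimal stand-in for a calcium
    spike detector."""
    base = np.median(trace)
    mad = np.median(np.abs(trace - base))
    above = trace > base + k * 1.4826 * mad   # 1.4826*MAD ~ robust sigma
    return np.flatnonzero(np.diff(above.astype(int)) == 1) + 1

rng = np.random.default_rng(0)
trace = rng.normal(0.0, 0.02, size=500)       # noisy baseline (dF/F)
t = np.arange(120)
for onset in (100, 300):                      # two calcium transients
    trace[onset:onset + 120] += 1.0 * np.exp(-t / 20)
onsets = detect_onsets(trace)
```

Run per neuron after the decomposition step, a detector of this kind turns ~1000 fluorescence traces into spike trains suitable for the response-quantification and functional-mapping analyses the abstract describes.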
Relating drug–protein interaction network with drug side effects
Mizutani, Sayaka; Pauwels, Edouard; Stoven, Véronique; Goto, Susumu; Yamanishi, Yoshihiro
2012-01-01
Motivation: Identifying the emergence and underlying mechanisms of drug side effects is a challenging task in the drug development process. This underscores the importance of system-wide approaches for linking different scales of drug actions, namely drug-protein interactions (molecular scale) and side effects (phenotypic scale), toward side-effect prediction for uncharacterized drugs. Results: We performed a large-scale analysis to extract correlated sets of targeted proteins and side effects, based on the co-occurrence of drugs in protein-binding profiles and side effect profiles, using sparse canonical correlation analysis. The analysis of 658 drugs with the two profiles for 1368 proteins and 1339 side effects led to the extraction of 80 correlated sets. Enrichment analyses using KEGG and Gene Ontology showed that most of the correlated sets were significantly enriched with proteins that are involved in the same biological pathways, even if their molecular functions are different. This allowed for a biologically relevant interpretation regarding the relationship between drug-targeted proteins and side effects. The extracted side effects can be regarded as possible phenotypic outcomes of drugs targeting the proteins that appear in the same correlated set. The proposed method is expected to be useful for predicting potential side effects of new drug candidate compounds based on their protein-binding profiles. Supplementary information: Datasets and all results are available at http://web.kuicr.kyoto-u.ac.jp/supp/smizutan/target-effect/. Availability: Software is available at the above supplementary website. Contact: yamanishi@bioreg.kyushu-u.ac.jp, or goto@kuicr.kyoto-u.ac.jp PMID:22962476
NASA Technical Reports Server (NTRS)
Rowan, L. C.; Abrams, M. J. (Principal Investigator)
1979-01-01
The author has identified the following significant results. Positive findings of earlier evaluations of the color-ratio compositing technique for mapping limonitic altered rocks in south-central Nevada are confirmed, but important limitations in the approach used are pointed out. These limitations arise from environmental, geologic, and image processing factors. The greater vegetation density in the East Tintic Mountains required several modifications in procedures to improve the overall mapping accuracy of the CRC approach. Large-format ratio images provide better internal registration of the diazo films and avoid the problems associated with the magnifications required in the original procedure. Use of the Linoscan 204 color recognition scanner permits accurate, consistent extraction of the green pixels representing limonitic bedrock, yielding maps that can be used at large scales as well as for small-scale reconnaissance.
Large Scale Helium Liquefaction and Considerations for Site Services for a Plant Located in Algeria
NASA Astrophysics Data System (ADS)
Froehlich, P.; Clausen, J. J.
2008-03-01
The large-scale liquefaction of helium extracted from natural gas is described. Based on a block diagram, the process chain, starting with the pipeline downstream of the natural-gas plant to the final storage of liquid helium, is explained. Information is provided about recent experiences during installation and start-up of a bulk helium liquefaction plant located in Skikda, Algeria, including part-load operation based on a reduced feed gas supply. The local working and ambient conditions are described, including challenging logistic problems such as shipping and receiving of parts, qualified and semi-qualified subcontractors, basic provisions and tools on site, and precautions against sea water and ambient conditions. Finally, the differences in commissioning relative to European locations and standards, both technical and in the evaluation of time and work packages, are discussed.
Gyrodampers for large space structures
NASA Technical Reports Server (NTRS)
Aubrun, J. N.; Margulies, G.
1979-01-01
The problem of controlling the vibrations of large space structures by the use of actively augmented damping devices distributed throughout the structure is addressed. The gyrodamper, which consists of a set of single-gimbal control moment gyros that are actively controlled to extract the structural vibratory energy through the local rotational deformations of the structure, is described and analyzed. Various linear and nonlinear dynamic simulations of gyrodamped beams are shown, including results on self-induced vibrations due to sensor noise and rotor imbalance. The complete nonlinear dynamic equations are included. The problem of designing and sizing a system of gyrodampers for a given structure, or extrapolating results from one gyrodamped structure to another, is solved in terms of scaling laws. Novel scaling laws for gyro systems are derived based upon fundamental physical principles, and various examples are given.
Mapping Dark Matter in Simulated Galaxy Clusters
NASA Astrophysics Data System (ADS)
Bowyer, Rachel
2018-01-01
Galaxy clusters are the most massive bound objects in the Universe, with most of their mass being dark matter. Cosmological simulations of structure formation show that clusters are embedded in a cosmic web of dark matter filaments and large-scale structure. It is thought that these filaments are found preferentially close to the long axes of clusters. We extract galaxy clusters from the simulations "cosmo-OWLS" in order to study their properties directly and also to infer their properties from weak gravitational lensing signatures. We investigate various stacking procedures to enhance the signal of the filaments and large-scale structure surrounding the clusters to better understand how the filaments of the cosmic web connect with galaxy clusters. This project was supported in part by the NSF REU grant AST-1358980 and by the Nantucket Maria Mitchell Association.
Zheng, Shuai; Jabbour, Salma K; O'Reilly, Shannon E; Lu, James J; Dong, Lihua; Ding, Lijuan; Xiao, Ying; Yue, Ning; Wang, Fusheng; Zou, Wei
2018-02-01
In outcome studies of oncology patients undergoing radiation, researchers extract valuable information from medical records generated before, during, and after radiotherapy visits, such as survival data, toxicities, and complications. Clinical studies rely heavily on these data to correlate the treatment regimen with the prognosis to develop evidence-based radiation therapy paradigms. These data are available mainly in forms of narrative texts or table formats with heterogeneous vocabularies. Manual extraction of the related information from these data can be time-consuming and labor-intensive, which is not ideal for large studies. The objective of this study was to adapt the interactive information extraction platform Information and Data Extraction using Adaptive Learning (IDEAL-X) to extract treatment and prognosis data for patients with locally advanced or inoperable non-small cell lung cancer (NSCLC). We transformed patient treatment and prognosis documents into normalized structured forms using the IDEAL-X system for easy data navigation. The adaptive learning and user-customized controlled toxicity vocabularies were applied to extract categorized treatment and prognosis data, so as to generate structured output. In total, we extracted data from 261 treatment and prognosis documents relating to 50 patients, with overall precision and recall of more than 93% and 83%, respectively. For toxicity information extractions, which are important to study patient posttreatment side effects and quality of life, the precision and recall achieved 95.7% and 94.5%, respectively. The IDEAL-X system is capable of extracting study data regarding NSCLC chemoradiation patients with significant accuracy and effectiveness, and therefore can be used in large-scale radiotherapy clinical data studies. ©Shuai Zheng, Salma K Jabbour, Shannon E O'Reilly, James J Lu, Lihua Dong, Lijuan Ding, Ying Xiao, Ning Yue, Fusheng Wang, Wei Zou.
Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 01.02.2018.
NASA Astrophysics Data System (ADS)
Zhang, Peng; Zhang, Lifu; Wu, Taixia; Zhang, Hongming; Sun, Xuejian
2017-01-01
Due to weathering and external forces, solar panels are subject to fouling and defects after a certain amount of time in service. These fouling and defects have direct adverse consequences such as low power efficiency. Because solar power plants usually have large-scale photovoltaic (PV) panels, fast detection and location of fouling and defects across large PV areas are imperative. A drone-mounted infrared thermography system was designed and developed, and its ability to rapidly detect fouling on large-scale PV panel systems was investigated. The infrared images were preprocessed using the K neighbor mean filter, and the single PV module on each image was recognized and extracted. Combining local and global detection methods, suspicious sites were located precisely. The results showed the flexible drone-mounted infrared thermography system to have a strong ability to detect the presence and determine the position of PV fouling. Drone-mounted infrared thermography also has good technical feasibility and practical value in the detection of PV fouling.
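The global part of such a detection scheme reduces to smoothing each thermal frame and flagging pixels that sit far above the panel-wide temperature statistics. The plain box filter and 3-sigma rule below are assumptions standing in for the paper's K neighbor mean filter and combined local/global criterion:

```python
import numpy as np

def mean_filter(img, k=3):
    """k x k neighbourhood mean via shifted sums (edge-padded)."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    out = np.zeros(img.shape)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def hot_spots(img, n_sigma=3.0):
    """Global detection: pixels far above the frame-wide mean after
    smoothing suppresses single-pixel sensor noise."""
    smooth = mean_filter(img)
    return np.argwhere(smooth > smooth.mean() + n_sigma * smooth.std())

rng = np.random.default_rng(0)
panel = 30.0 + rng.normal(0.0, 0.5, size=(50, 50))  # degrees C
panel[20:23, 35:38] += 8.0                          # simulated fouling hot spot
spots = hot_spots(panel)
```

A local criterion (comparing each pixel to its own neighbourhood rather than the frame) would be layered on top to catch faint defects that a global threshold misses.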
Image Harvest: an open-source platform for high-throughput plant image processing and analysis.
Knecht, Avi C; Campbell, Malachy T; Caprez, Adam; Swanson, David R; Walia, Harkamal
2016-05-01
High-throughput plant phenotyping is an effective approach to bridge the genotype-to-phenotype gap in crops. Phenomics experiments typically result in large-scale image datasets, which are not amenable for processing on desktop computers, thus creating a bottleneck in the image-analysis pipeline. Here, we present an open-source, flexible image-analysis framework, called Image Harvest (IH), for processing images originating from high-throughput plant phenotyping platforms. Image Harvest is developed to perform parallel processing on computing grids and provides an integrated feature for metadata extraction from large-scale file organization. Moreover, the integration of IH with the Open Science Grid provides academic researchers with the computational resources required for processing large image datasets at no cost. Image Harvest also offers functionalities to extract digital traits from images to interpret plant architecture-related characteristics. To demonstrate the applications of these digital traits, a rice (Oryza sativa) diversity panel was phenotyped and genome-wide association mapping was performed using digital traits that are used to describe different plant ideotypes. Three major quantitative trait loci were identified on rice chromosomes 4 and 6, which co-localize with quantitative trait loci known to regulate agronomically important traits in rice. Image Harvest is an open-source software for high-throughput image processing that requires a minimal learning curve for plant biologists to analyze phenomics datasets. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.
Fast Algorithms for Mining Co-evolving Time Series
2011-09-01
Keogh et al., 2001, 2004] and (b) forecasting, like an autoregressive integrated moving average model (ARIMA) and related methods [Box et al., 1994...computing hardware? We develop models to mine time series with missing values, to extract compact representations from time sequences, to segment the...sequences, and to do forecasting. For large scale data, we propose algorithms for learning time series models, in particular, including Linear Dynamical
NASA Astrophysics Data System (ADS)
Ruiz Simo, I.; Martinez-Consentino, V. L.; Amaro, J. E.; Ruiz Arriola, E.
2018-06-01
We use a recent scaling analysis of the quasielastic electron scattering data from
Extra-Tropical Cyclones at Climate Scales: Comparing Models to Observations
NASA Astrophysics Data System (ADS)
Tselioudis, G.; Bauer, M.; Rossow, W.
2009-04-01
Climate is often defined as the accumulation of weather, and weather is not the concern of climate models. Justification for this latter sentiment has long been hidden behind coarse model resolutions and blunt validation tools based on climatological maps. The spatial-temporal resolutions of today's climate models and observations are converging onto meteorological scales, however, which means that with the correct tools we can test the largely unproven assumption that climate model weather is correct enough that its accumulation results in a robust climate simulation. Towards this effort we introduce a new tool for extracting detailed cyclone statistics from observations and climate model output. These include the usual cyclone characteristics (centers, tracks), but also adaptive cyclone-centric composites. We have created a novel dataset, the MAP Climatology of Mid-latitude Storminess (MCMS), which provides a detailed 6-hourly assessment of the areas under the influence of mid-latitude cyclones, using a search algorithm that delimits the boundaries of each system from the outermost closed sea-level pressure (SLP) contour. Using this we then extract composites of cloud, radiation, and precipitation properties from sources such as ISCCP and GPCP to create a large comparative dataset for climate model validation. A demonstration of the potential usefulness of these tools in process-based climate model evaluation studies will be shown.
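The first step of such a cyclone-statistics tool, finding candidate centers as local SLP minima before tracing the outermost closed contour around each, can be sketched on an idealized pressure field (the contour-delimiting search itself is considerably more involved):

```python
import numpy as np

def cyclone_centers(slp):
    """Interior grid points strictly lower than all eight neighbours --
    the usual seed points before delimiting each system's outermost
    closed contour."""
    c = slp[1:-1, 1:-1]
    is_min = np.ones(c.shape, dtype=bool)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy or dx:
                shifted = slp[1 + dy:slp.shape[0] - 1 + dy,
                              1 + dx:slp.shape[1] - 1 + dx]
                is_min &= c < shifted
    return np.argwhere(is_min) + 1   # +1: back to full-grid coordinates

# Idealised SLP field (hPa) with two lows of different depths.
yy, xx = np.mgrid[0:60, 0:60]
slp = (1012.0
       - 20.0 * np.exp(-((yy - 15) ** 2 + (xx - 20) ** 2) / 50.0)
       - 15.0 * np.exp(-((yy - 40) ** 2 + (xx - 45) ** 2) / 50.0))
centers = cyclone_centers(slp)
```

Real SLP fields need extra filtering (minimum depth, smoothing, handling of heat lows and orography), and the composite extraction then samples cloud and precipitation fields relative to each detected center.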
Chan, Chung-Hung; See, Tiam-You; Yusoff, Rozita; Ngoh, Gek-Cheng; Kow, Kien-Woh
2017-04-15
This work demonstrated the optimization and scale-up of microwave-assisted extraction (MAE) and ultrasonic-assisted extraction (UAE) of bioactive compounds from Orthosiphon stamineus using energy-based parameters, namely absorbed power density and absorbed energy density (APD-AED), and response surface methodology (RSM). The intensive optimum conditions of MAE, obtained at 80% EtOH, 50 mL/g, APD of 0.35 W/mL, and AED of 250 J/mL, can be used to determine the optimum conditions of the scale-dependent parameters, i.e., microwave power and treatment time, at various extraction scales (100-300 mL solvent loading). The yields of the scaled-up conditions were consistent, with less than 8% discrepancy, and they were about 91-98% of the Soxhlet extraction yield. By adapting the APD-AED method in the case of UAE, the intensive optimum conditions of the extraction, i.e., 70% EtOH, 30 mL/g, APD of 0.22 W/mL, and AED of 450 J/mL, are able to achieve similar scale-up results. Copyright © 2016 Elsevier Ltd. All rights reserved.
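The appeal of intensive APD-AED parameters is that they convert directly into scale-dependent equipment settings. A small sketch using the MAE optimum quoted above, under the simplifying assumption that all delivered microwave power is absorbed by the load:

```python
def scaled_settings(apd_w_per_ml, aed_j_per_ml, volume_ml):
    """Translate an intensive APD/AED optimum into the scale-dependent
    settings (microwave power in W, treatment time in s) for a given
    solvent volume. Assumes delivered power equals absorbed power."""
    power_w = apd_w_per_ml * volume_ml
    time_s = aed_j_per_ml / apd_w_per_ml   # (AED*V)/(APD*V): volume drops out
    return power_w, time_s

# MAE optimum reported in the abstract: APD = 0.35 W/mL, AED = 250 J/mL
settings = {v: scaled_settings(0.35, 250.0, v) for v in (100, 200, 300)}
```

One consequence worth noting: under this model the treatment time, AED/APD (about 714 s here), is the same at every solvent loading; only the power setting scales with volume.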
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sockalingam, K., E-mail: gd130106@siswa.uthm.edu.my; Abdullah, H. Z., E-mail: hasan@uthm.edu.my
2015-07-22
Black tilapia (Oreochromis mossambicus) fish wastes (scales) were evaluated for their suitability as a source of gelatin. Scales were subjected to acid treatment for demineralization before undergoing a thermal extraction process. The raw scales were characterized via Scanning Electron Microscopy (SEM), which demarcated the cycloid pattern of the scales. SEM images also reveal the presence of collagen fiber in the fish scale. The black tilapia fish scales yield 11.88% gelatin, indicating the possibility of this fish species as a source of gelatin. Further characterizations were done on both the raw scale and the extracted gelatin through Fourier Transform Infrared Spectroscopy (FTIR) and proximate analysis. The scale gelatin shows high protein content (86.9%) with low moisture (8.2%) and ash (1.4%). This further proves the effectiveness of the demineralization and extraction method used. The black tilapia fish scale is found to be a prospective source of gelatin with good chemical and functional properties.
Kotamäki, Niina; Thessler, Sirpa; Koskiaho, Jari; Hannukkala, Asko O.; Huitu, Hanna; Huttula, Timo; Havento, Jukka; Järvenpää, Markku
2009-01-01
Sensor networks are increasingly being implemented for environmental monitoring and agriculture to provide spatially accurate and continuous environmental information and (near) real-time applications. These networks provide a large amount of data, which poses challenges for ensuring data quality and extracting relevant information. In the present paper we describe a river basin scale wireless sensor network for agriculture and water monitoring. The network, called SoilWeather, is unique and the first of this type in Finland. The performance of the network is assessed from the user and maintainer perspectives, concentrating on data quality, network maintenance and applications. The results showed that the SoilWeather network has been functioning in a relatively reliable way, but also that the maintenance and data quality assurance by automatic algorithms and calibration samples requires a lot of effort, especially in continuous water monitoring over large areas. We see great benefits in sensor networks enabling continuous, real-time monitoring, while data quality control and maintenance efforts highlight the need for tight collaboration between sensor and sensor network owners to decrease costs and increase the quality of the sensor data in large scale applications. PMID:22574050
NASA Technical Reports Server (NTRS)
Globus, Al; Biegel, Bryan A.; Traugott, Steve
2004-01-01
AsterAnts is a concept calling for a fleet of solar sail powered spacecraft to retrieve large numbers of small (1/2-1 meter diameter) Near Earth Objects (NEOs) for orbital processing. AsterAnts could use the International Space Station (ISS) for NEO processing, solar sail construction, and to test NEO capture hardware. Solar sails constructed on orbit are expected to have substantially better performance than their ground-built counterparts [Wright 1992]. Furthermore, solar sails may be used to hold geosynchronous communication satellites out-of-plane [Forward 1981], increasing the total number of slots by at least a factor of three, potentially generating $2 billion worth of orbital real estate over North America alone. NEOs are believed to contain large quantities of water, carbon, other life-support materials, and metals. Thus, with proper processing, NEO materials could in principle be used to resupply the ISS, produce rocket propellant, manufacture tools, and build additional ISS working space. Unlike proposals that require massive facilities, such as lunar bases, before returning any extraterrestrial material, AsterAnts requires nothing larger than a typical inter-planetary mission. Furthermore, AsterAnts could be scaled up to deliver large amounts of material by building many copies of the same spacecraft, thereby achieving manufacturing economies of scale. Because AsterAnts would capture NEOs whole, NEO composition details, which are generally poorly characterized, are relatively unimportant and no complex extraction equipment is necessary. In combination with a materials processing facility at the ISS, AsterAnts might inaugurate an era of large-scale orbital construction using extraterrestrial materials.
NASA Astrophysics Data System (ADS)
Ren, B.; Wen, Q.; Zhou, H.; Guan, F.; Li, L.; Yu, H.; Wang, Z.
2018-04-01
The purpose of this paper is to provide decision support for the adjustment and optimization of the crop planting structure in Jingxian County. An object-oriented information extraction method is used to extract corn and cotton in Jingxian County of Hengshui City in Hebei Province, based on multi-period GF-1 16-meter images. The best time for data extraction was selected by analyzing the spectral characteristics of corn and cotton at different growth stages based on multi-period GF-1 16-meter images, phenological data, and field survey data. The results showed that the total classification accuracy of corn and cotton was up to 95.7%, the producer accuracies were 96% and 94% respectively, and the user accuracies were 95.05% and 95.9% respectively, which satisfies the demands of crop monitoring applications. Therefore, combining multi-period high-resolution images with object-oriented classification can effectively extract the large-scale distribution of crops, providing a convenient and effective technical means for crop monitoring.
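Producer and user accuracies of the kind reported here are just the row- and column-normalized diagonals of a confusion matrix. The counts below are illustrative, chosen to give similar accuracy levels to those quoted, and are not the study's actual tallies:

```python
import numpy as np

# Hypothetical 2-class confusion matrix:
# rows = reference (ground truth), columns = mapped class.
cm = np.array([[96, 4],    # reference corn plots
               [5, 95]])   # reference cotton plots

overall = np.trace(cm) / cm.sum()          # total classification accuracy
producer = np.diag(cm) / cm.sum(axis=1)    # 1 - omission error, per class
user = np.diag(cm) / cm.sum(axis=0)        # 1 - commission error, per class
```

Producer's accuracy asks "of the reference corn plots, how many were mapped as corn?", while user's accuracy asks "of the pixels mapped as corn, how many really are corn?"; both matter when judging whether a crop map meets a monitoring requirement.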
Olmstead, Ian L D; Kentish, Sandra E; Scales, Peter J; Martin, Gregory J O
2013-11-01
An industrially relevant method for disrupting microalgal cells and preferentially extracting neutral lipids for large-scale biodiesel production was demonstrated on pastes (20-25% solids) of Nannochloropsis sp. The highly resistant Nannochloropsis sp. cells were disrupted by incubation for 15 h at 37°C followed by high-pressure homogenization at 1200 ± 100 bar. Lipid extraction was performed by twice contacting concentrated algal paste with minimal hexane (solvent:biomass ratios (w/w) of <2:1 and <1.3:1) in a stirred vessel at 35°C. Cell disruption prior to extraction increased lipid recovery 100-fold, with yields of 30-50% w/w obtained in the first hexane contact and a further 6.5-20% in the second contact. The hexane preferentially extracted neutral lipids over glyco- and phospholipids, with up to 86% w/w of the neutral lipids recovered. The process was effective on wet concentrated paste, required minimal solvent and moderate temperature, and did not require difficult-to-recover polar solvents. Copyright © 2013 Elsevier Ltd. All rights reserved.
Pasupuleti, Visweswara Rao; Prasad, TNVKV; Shiekh, Rayees Ahmad; Balam, Satheesh Krishna; Narasimhulu, Ganapathi; Reddy, Cirandur Suresh; Rahman, Ismail Ab; Gan, Siew Hua
2013-01-01
Nanotechnology is gaining momentum due to its ability to transform metals into nanoparticles. The synthesis, characterization, and applications of biologically synthesized nanomaterials have become an important branch of nanotechnology. Plant extracts are a cost-effective, ecologically friendly, and efficient alternative for the large-scale synthesis of nanoparticles. In this study, silver nanoparticles (AgNps) were synthesized using Rhinacanthus nasutus leaf extract. After exposing the silver ions to the leaf extract, the rapid reduction of silver ions led to the formation of AgNps in solution. The synthesis was confirmed by ultraviolet-visible spectroscopy, Fourier transform infrared spectroscopy, and transmission electron microscopy. The in vitro antimicrobial activity of the AgNps synthesized using R. nasutus leaf extract was investigated against Bacillus subtilis, Staphylococcus aureus, Pseudomonas aeruginosa, Klebsiella pneumoniae, Escherichia coli, Aspergillus niger, and Aspergillus flavus using a disc diffusion method. The AgNps showed potential activity against all of the bacterial strains and fungal colonies, indicating that R. nasutus has the potential to be used in the development of value-added products in the biomedical and nanotechnology-based industries. PMID:24039419
A time series of urban extent in China using DMSP/OLS nighttime light data
Chen, Dongsheng; Chen, Le; Wang, Huan; Guan, Qingfeng
2018-01-01
Urban extent data play an important role in urban management and urban studies, such as monitoring the process of urbanization and changes in the spatial configuration of urban areas. Traditional methods of extracting urban-extent information are primarily based on manual investigation and classification of remote sensing images, and they suffer from high labor and time costs and low precision. This study proposes an improved, simplified and flexible method for extracting urban extents over multiple scales and constructing spatiotemporal models using DMSP/OLS nighttime light (NTL) data in practical situations. This method eliminates the regional temporal and spatial inconsistency of thresholding NTL in large-scale, multi-temporal scenes. Using this method, we extracted the urban extents and calculated the corresponding areas at the county, municipal and provincial scales in China from 2000 to 2012. In addition, validation against reference data shows that the overall accuracy (OA), Kappa and F1 score were 0.996, 0.793, and 0.782, respectively. We increased the spatial resolution of the urban extent to 500 m (approximately four times finer than the results of previous studies). Based on the urban extent dataset proposed above, we analyzed changes in urban extents over time and observed that urban extent has grown in all of the counties of China. We also identified three patterns of urban sprawl: Early Urban Growth, Constant Urban Growth and Recent Urban Growth. These trends of urban sprawl are consistent with the western, eastern and central cities of China, respectively, in terms of their spatial distribution, socioeconomic characteristics and historical background. Additionally, the urban extents display the spatial configurations of urban areas intuitively.
The proposed urban extent dataset is available for download and can provide reference data and support for future studies of urbanization and urban planning. PMID:29795685
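The regional-inconsistency problem described above (a single global NTL threshold over- or under-extracts depending on the region) can be illustrated with a per-region thresholding sketch. The quantile rule, function name, and inputs below are illustrative assumptions, not the paper's actual threshold-selection criterion:

```python
import numpy as np

def urban_mask_per_region(ntl, regions, quantile=0.9):
    """Threshold nighttime-light (NTL) values region by region.

    A separate threshold per region avoids the spatial inconsistency of a
    single global cutoff. The quantile rule here is a hypothetical stand-in
    for the paper's unspecified threshold-selection criterion.
    """
    mask = np.zeros_like(ntl, dtype=bool)
    for r in np.unique(regions):
        in_region = regions == r
        lit = ntl[in_region]
        lit = lit[lit > 0]          # ignore unlit pixels when setting the cutoff
        if lit.size == 0:
            continue
        thresh = np.quantile(lit, quantile)
        mask |= in_region & (ntl >= thresh)
    return mask
```

Applying the same quantile to each region yields region-adapted absolute thresholds, so a dim interior county and a bright coastal city are treated consistently.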
Han, Y J; Li, L H; Grier, A; Chen, L; Valavanis, A; Zhu, J; Freeman, J R; Isac, N; Colombelli, R; Dean, P; Davies, A G; Linfield, E H
2016-12-12
We report an extraction-controlled terahertz (THz)-frequency quantum cascade laser design in which a diagonal LO-phonon scattering process is used to achieve efficient current injection into the upper laser level of each period and simultaneously extract electrons from the adjacent period. The effects of the diagonality of the radiative transition are investigated, and a design with a scaled oscillator strength of 0.45 is shown experimentally to provide the highest temperature performance. A 3.3 THz device processed into a double-metal waveguide configuration operated up to 123 K in pulsed mode, with a threshold current density of 1.3 kA/cm2 at 10 K. The QCL structures are modeled using an extended density matrix approach, and the large threshold current is attributed to parasitic current paths associated with the upper laser levels. The simplicity of this design makes it an ideal platform to investigate the scattering injection process.
NASA Astrophysics Data System (ADS)
Guo, Wei; Li, Junmei; Sheikhi, Moheb; Jiang, Jie’an; Yang, Zhenhai; Li, Hongwei; Guo, Shiping; Sheng, Jiang; Sun, Jie; Bo, Baoxue; Ye, Jichun
2018-06-01
Light extraction and current injection are two important considerations in the development of high-efficiency light-emitting diodes (LEDs), but usually cannot be satisfied simultaneously in nanostructure-patterned devices. In this work, we investigated near-UV LEDs with nanopillar and nanohole patterns to improve the light extraction efficiency. Photoluminescence (PL) intensities were enhanced by 8.0 and 4.1 times for the nanopillar and nanohole LEDs, respectively, compared to that of the planar LED. The nanopillar LED exhibits higher PL emission than the nanohole LED, which we attribute to a convex sidewall shape that scatters light outward more effectively and to a reduced quantum-confined Stark effect owing to strain relaxation. However, the nanopillar LED exhibits lower electroluminescence intensity than the nanohole sample, which calls for further optimization of the carrier distributions. The experimental results were further supported by near-field electric field simulations. This work demonstrates the difference in optical and electrical behaviors between the nanopillar and nanohole LEDs, paving the way for a detailed understanding of the light extraction mechanisms of nanostructure-patterned UV emitters.
NASA Astrophysics Data System (ADS)
Xin Hui, Yau; Yi Peng, Teoh; Wei Wen, Liu; Zhong Xian, Ooi; Peck Loo, Kiew
2016-11-01
Iron oxide nanoparticles were prepared from the reaction between Zingiber officinale (ginger) root extract and ferric chloride solution at 50°C for 2 h under mild stirring. The synthesized nanoparticle powders were further characterized by UV-Vis spectroscopy and X-ray diffraction (XRD). UV-Vis analysis shows that the absorption peak of the iron oxide nanoparticles appears at 370 nm. Crystallite-size calculation from the XRD pattern showed that the average particle size of the iron oxide nanoparticles was 68.43 nm. This eco-friendly technique therefore offers a low-cost route to large-scale nanoparticle synthesis to meet the demands of various applications.
Improving the large scale purification of the HIV microbicide, griffithsin.
Fuqua, Joshua L; Wanga, Valentine; Palmer, Kenneth E
2015-02-22
Griffithsin is a broad-spectrum antiviral lectin that inhibits viral entry and maturation processes by binding clusters of oligomannose glycans on viral envelope glycoproteins. An efficient, scalable manufacturing process for griffithsin active pharmaceutical ingredient (API) is essential for particularly cost-sensitive products such as griffithsin-based topical microbicides for HIV-1 prevention in resource-poor settings. Our previously published purification method used ceramic filtration followed by two chromatography steps, resulting in a protein recovery of 30%. Our objective was to develop a scalable purification method for griffithsin expressed in Nicotiana benthamiana plants that would increase yield, reduce production costs, and simplify manufacturing techniques. Considering the future need to transfer griffithsin manufacturing technology to resource-poor areas, we chose to focus on modifying the purification process, paying particular attention to introducing simple, low-cost, and scalable procedures such as the use of temperature, pH, ion concentration, and filtration to enhance product recovery. We achieved >99% pure griffithsin API by generating the initial green juice extract in pH 4 buffer, heating the extract to 55°C, incubating overnight with a bentonite-MgCl2 mixture, and performing final purification with Capto™ multimodal chromatography. Griffithsin extracted with this protocol maintains activity comparable to griffithsin purified by the previously published method, and we were able to recover a substantially higher yield: 88 ± 5% of griffithsin from the initial extract. The method was scaled to produce gram quantities of griffithsin while maintaining high yields, low endotoxin levels, and low purification costs. The methodology developed to purify griffithsin introduces multiple tools for the purification of recombinant proteins from plants at an industrial scale. These tools allow for robust, cost-effective production and purification of griffithsin.
The methodology can be readily scaled to the bench top or industry and process components can be used for purification of additional proteins based on biophysical characteristics.
Objective grading of facial paralysis using Local Binary Patterns in video processing.
He, Shu; Soraghan, John J; O'Reilly, Brian F
2008-01-01
This paper presents a novel framework for the objective measurement of facial paralysis in biomedical videos. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on Local Binary Patterns (LBP) in the temporal-spatial domain of each facial region. These features are temporally and spatially enhanced by the application of block schemes. A multi-resolution extension of uniform LBP is proposed to efficiently combine micro-patterns and large-scale patterns into a feature vector, which increases the algorithmic robustness and reduces noise effects while retaining computational simplicity. The symmetry of facial movements is measured by the Resistor-Average Distance (RAD) between LBP features extracted from the two sides of the face. A Support Vector Machine (SVM) is applied to provide a quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrate its accuracy and efficiency.
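The uniform-LBP building block that this framework relies on can be sketched as follows. The helper names are my own; the code shows the basic 8-neighbour LBP code and the standard "at most two circular 0/1 transitions" uniformity test, not the authors' multi-resolution extension:

```python
import numpy as np

def lbp_code(patch):
    """8-neighbour LBP code of the centre pixel of a 3x3 patch."""
    # clockwise neighbour order starting at the top-left corner
    offsets = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    c = patch[1, 1]
    bits = [1 if patch[i, j] >= c else 0 for i, j in offsets]
    return sum(b << k for k, b in enumerate(bits))

def is_uniform(code, bits=8):
    """A pattern is 'uniform' if it has at most two 0/1 transitions
    when the bit string is traversed circularly."""
    s = [(code >> k) & 1 for k in range(bits)]
    transitions = sum(s[k] != s[(k + 1) % bits] for k in range(bits))
    return transitions <= 2
```

Uniform patterns capture edges and corners; grouping all non-uniform codes into one histogram bin is what keeps the descriptor compact and noise-robust.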
Construction of Green Tide Monitoring System and Research on its Key Techniques
NASA Astrophysics Data System (ADS)
Xing, B.; Li, J.; Zhu, H.; Wei, P.; Zhao, Y.
2018-04-01
Green tide, a type of marine natural disaster, has appeared every year along the Qingdao coast since the large-scale bloom in 2008, bringing great losses to the region. It is therefore of great value to obtain real-time dynamic information about the green tide distribution. In this study, both optical and microwave remote sensing are employed for green tide monitoring. A specific remote sensing data processing flow and a green tide information extraction algorithm are designed according to the different characteristics of the optical and microwave data. For the extraction of green tide spatial distribution information, an automatic extraction algorithm for green tide distribution boundaries is designed based on the principle of mathematical morphology dilation/erosion, and key issues in the information extraction, including the division of green tide regions, the derivation of basic distributions, the delimitation of distribution boundaries, and the elimination of islands, have been solved. The automatic generation of green tide distribution boundaries from the results of remote sensing information extraction is thus realized. Finally, a green tide monitoring system is built based on IDL/GIS secondary development in an integrated RS and GIS environment, achieving the integration of RS monitoring and information extraction.
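The dilation/erosion boundary principle, together with the island-elimination step, can be sketched with standard morphology operations. The function names and the minimum island size are illustrative assumptions, not the system's actual parameters:

```python
import numpy as np
from scipy import ndimage

def patch_boundary(mask, iterations=1):
    """Boundary of a binary green-tide patch via dilation/erosion.

    The boundary is the set difference between the dilated and the eroded
    patch: the same morphology principle the abstract describes, reduced
    to its simplest form.
    """
    dil = ndimage.binary_dilation(mask, iterations=iterations)
    ero = ndimage.binary_erosion(mask, iterations=iterations)
    return dil & ~ero

def drop_small_islands(mask, min_pixels=4):
    """Remove connected components smaller than min_pixels (island elimination)."""
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    keep = np.zeros_like(mask, dtype=bool)
    for i, s in enumerate(sizes, start=1):
        if s >= min_pixels:
            keep |= labels == i
    return keep
```

Running `drop_small_islands` before `patch_boundary` suppresses speckle so that only boundaries of genuine green-tide patches are vectorized.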
Nagarajappa, Ramesh; Batra, Mehak; Sharda, Archana J; Asawa, Kailash; Sanadhya, Sudhanshu; Daryani, Hemasha; Ramesh, Gayathri
2015-01-01
To assess and compare the antimicrobial potential and determine the minimum inhibitory concentration (MIC) of Jasminum grandiflorum and Hibiscus rosa-sinensis extracts as potential anti-pathogenic agents in dental caries. Aqueous and ethanol (cold and hot) extracts prepared from leaves of Jasminum grandiflorum and Hibiscus rosa-sinensis were screened for in vitro antimicrobial activity against Streptococcus mutans and Lactobacillus acidophilus using the agar well diffusion method. The lowest concentration of each extract that inhibited growth, taken as the minimum inhibitory concentration (MIC), was determined for both test organisms. Statistical analysis was performed with one-way analysis of variance (ANOVA). At lower concentrations, hot ethanol Jasminum grandiflorum (10 μg/ml) and Hibiscus rosa-sinensis (25 μg/ml) extracts were found to have statistically significant (P≤0.05) antimicrobial activity against S. mutans and L. acidophilus, with MIC values of 6.25 μg/ml and 25 μg/ml, respectively. A proportional increase in their antimicrobial activity (zone of inhibition) was observed. Both extracts were found to be antimicrobially active and to contain compounds with therapeutic potential. Nevertheless, clinical trials on the effects of these plants are essential before advocating large-scale therapy.
McArt, Scott H; Spalinger, Donald E; Kennish, John M; Collins, William B
2006-06-01
The protein precipitation assay of Robbins et al. (1987, Ecology 68:98-107) has been shown to successfully predict the reduction in protein availability to some ruminants due to tannins. The procedure, however, is expensive and laborious, which limits its utility, especially for quantitative ecological or nutritional applications where large numbers of assays may be required. We have modified the method to decrease its cost and increase laboratory efficiency by: (1) automating the extraction using Accelerated Solvent Extraction (ASE); and (2) scaling and automating the precipitation reaction, chromatography, and spectrometry with microplate gel filtration and an automated UV-VIS microplate spectrometer. ASE extraction is shown to be as effective at extracting tannins as the hot methanol technique. Additionally, the microplate assay is sensitive and precise. We show that the results from the new technique correspond in a nearly 1:1 relationship to those of the previous technique. Hence, this method can reliably replace the older method with no loss in relevance to herbivore protein digestion. Moreover, the ASE extraction technique should be applicable to other tannin-protein precipitation assays and possibly other phenolic assays.
A multiple maximum scatter difference discriminant criterion for facial feature extraction.
Song, Fengxi; Zhang, David; Mei, Dayong; Guo, Zhongwei
2007-12-01
The maximum scatter difference (MSD) discriminant criterion is a recently presented binary discriminant criterion for pattern classification that uses the generalized scatter difference rather than the generalized Rayleigh quotient as the class separability measure, thereby avoiding the singularity problem when addressing small-sample-size problems. MSD classifiers based on this criterion have been quite effective on face-recognition tasks, but as binary classifiers they are not as efficient on large-scale classification tasks. To address this problem, this paper generalizes the classification-oriented binary criterion to its multiple counterpart, the multiple MSD (MMSD) discriminant criterion, for facial feature extraction. The MMSD feature-extraction method, which is based on this novel discriminant criterion, is a new subspace-based feature-extraction method. Unlike most other subspace-based feature-extraction methods, MMSD computes its discriminant vectors from both the range of the between-class scatter matrix and the null space of the within-class scatter matrix. The MMSD is theoretically elegant and easy to calculate. Extensive experimental studies conducted on the benchmark database FERET show that the MMSD outperforms state-of-the-art facial feature-extraction methods such as the null space method, direct linear discriminant analysis (LDA), eigenface, Fisherface, and complete LDA.
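The scatter-difference idea can be stated concretely: instead of maximizing the Rayleigh quotient (w'Sb w)/(w'Sw w), MSD maximizes w'(Sb - c*Sw)w, whose optimizers are simply the leading eigenvectors of Sb - c*Sw, so no inversion of Sw is required. The sketch below illustrates that core criterion under my own naming; it is not the authors' exact MMSD algorithm, which additionally exploits the null space of Sw:

```python
import numpy as np

def mmsd_directions(X, y, c=1.0, k=2):
    """Discriminant directions maximizing the scatter difference w'(Sb - c*Sw)w.

    Unlike the Rayleigh quotient used in classical LDA, the scatter
    *difference* needs no matrix inversion, so it remains defined even when
    Sw is singular (the small-sample-size case).
    """
    X = np.asarray(X, dtype=float)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sb = np.zeros((d, d))
    Sw = np.zeros((d, d))
    for cls in np.unique(y):
        Xc = X[y == cls]
        mc = Xc.mean(axis=0)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
        Sw += (Xc - mc).T @ (Xc - mc)
    # top-k eigenvectors of the symmetric scatter-difference matrix
    vals, vecs = np.linalg.eigh(Sb - c * Sw)
    return vecs[:, np.argsort(vals)[::-1][:k]]
```

The constant c trades off between-class separation against within-class compactness; c = 1 recovers the plain scatter difference.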
Semi-Supervised Geographical Feature Detection
NASA Astrophysics Data System (ADS)
Yu, H.; Yu, L.; Kuo, K. S.
2016-12-01
Extracting and tracking geographical features is a fundamental requirement in many geoscience fields. However, this operation has become an increasingly challenging task for domain scientists when tackling large amounts of geoscience data. Although domain scientists may have a relatively clear definition of a feature, it is difficult to capture its presence in an accurate and efficient fashion. We propose a semi-supervised approach to address large-scale geographical feature detection. Our approach has two main components. First, we represent heterogeneous geoscience data in a unified high-dimensional space, which allows us to evaluate the similarity of data points with respect to geolocation, time, and variable values. We characterize the data with these measures and use a set of hash functions to parameterize the initial knowledge of the data. Second, for any user query, our approach can automatically extract initial results based on the hash functions. To improve the querying accuracy, our approach provides a visualization interface that displays the query results and allows users to interactively explore and refine them. The user feedback is used to enhance our knowledge base in an iterative manner. In our implementation, we use high-performance computing techniques to accelerate the construction of hash functions. Our design facilitates a parallelization scheme for feature detection and extraction, which is a traditionally challenging problem for large-scale data. We evaluate our approach and demonstrate its effectiveness using both synthetic and real-world datasets.
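Random-hyperplane hashing is one common way to realize the hash-function component described above: nearby points in the unified (geolocation, time, variable-value) space tend to share hash bits and therefore collide into the same candidate bucket. The construction below is a generic locality-sensitive-hashing sketch under that assumption, not the paper's specific hash design:

```python
import numpy as np

def make_hash(dim, n_bits, seed=0):
    """Random-hyperplane hash: each bit records which side of a random
    hyperplane (through the origin) a point falls on. Points at a small
    angle from each other agree on most bits."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_bits, dim))
    def h(x):
        return tuple(int(b) for b in (planes @ np.asarray(x, dtype=float) > 0))
    return h
```

At query time, points with equal hash keys form the candidate set that the visualization interface would then let users refine interactively.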
NASA Astrophysics Data System (ADS)
Lohman, R. B.; Scott, C. P.
2014-12-01
Efforts to understand the buildup and release of strain within the Earth's crust often rely on well-characterized observations of ground deformation, over time scales that include interseismic periods, earthquakes, and transient deformation episodes. Constraints on current rates of surface deformation in 1-, 2- or 3-dimensions can be obtained by examining sets of GPS and Interferometric Synthetic Aperture Radar (InSAR) observations, both alone and in combination. Contributions to the observed signal often include motion along faults, seasonal cycles of subsidence and recharge associated with aquifers, anthropogenic extraction of hydrocarbons, and variations in atmospheric water vapor and ionospheric properties. Here we examine methods for extracting time-varying ground deformation signals from combinations of InSAR and GPS data, real and synthetic, applied to Southern California. We show that two methods for combining the data through removal of a GPS-constrained function (a plane, and filtering) from the InSAR result in a clear tradeoff between the contributions of the two data types at different spatial scales. We also show that the contribution of seasonal signals to the secular rates at GPS sites is large enough to be a significant error in this estimation process, and should be accounted for.
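The "removal of a GPS-constrained plane" can be made concrete: fit a plane to the InSAR-minus-GPS misfit at the GPS sites and subtract it from the full InSAR field, so GPS carries the long-wavelength signal and InSAR the short wavelengths. A minimal least-squares sketch with hypothetical argument names:

```python
import numpy as np

def remove_ramp(insar, x, y, gps_x, gps_y, gps_rate, insar_at_gps):
    """Remove a best-fit plane tying InSAR rates to GPS rates.

    Fits d = a + b*x + c*y to the InSAR-minus-GPS misfit at the GPS sites
    (ordinary least squares) and subtracts that plane from the full InSAR
    field. A minimal sketch of the 'plane removal' combination option.
    """
    misfit = insar_at_gps - gps_rate
    G = np.column_stack([np.ones_like(gps_x), gps_x, gps_y])
    a, b, c = np.linalg.lstsq(G, misfit, rcond=None)[0]
    return insar - (a + b * x + c * y)
```

The filtering alternative mentioned in the abstract would replace the single plane with a scale-dependent smooth correction, shifting where the GPS/InSAR crossover occurs.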
Sreekanth, J; Cui, Tao; Pickett, Trevor; Rassam, David; Gilfedder, Mat; Barrett, Damian
2018-09-01
Large scale development of coal seam gas (CSG) is occurring in many sedimentary basins around the world, including Australia, where commercial production of CSG has started in the Surat and Bowen basins. CSG development often involves extraction of large volumes of water, which depressurises the aquifers that overlie and/or underlie the coal seams, thus perturbing their flow regimes. This can potentially impact regional aquifer systems that are used for many purposes, such as irrigation and stock and domestic water. In this study, we adopt a probabilistic approach to quantify the depressurisation of the Gunnedah coal seams and how this impacts fluxes to, and from, the overlying Great Artesian Basin (GAB) Pilliga Sandstone aquifer. The proposed method is suitable when the effects of a new resource development activity on the regional groundwater balance need to be assessed while accounting for large-scale uncertainties in the groundwater flow system and the proposed activity. The results indicated that the extraction of water and gas from the coal seams could potentially induce additional fluxes from the Pilliga Sandstone to the deeper formations due to the lowering of pressure heads in the coal seams. The median value of the rise in the maximum flux from the Pilliga Sandstone to the deeper formations is estimated to be 85 ML/year, which is considered insignificant as it forms only about 0.29% of the Long Term Annual Average Extraction Limit of 30 GL/year for the groundwater management area. The probabilistic simulation of the water balance components indicates that only small changes are induced by CSG development in the interactions of the Pilliga Sandstone with the overlying and underlying formations and with the surface water courses. These analyses, which quantify the potential maximum impacts of resource development and how they influence the regional water balance, will greatly underpin future management decisions.
Environmental DNA sequencing primers for eutardigrades and bdelloid rotifers
2009-01-01
Background The time it takes to isolate individuals from environmental samples and then extract DNA from each individual is one of the problems with generating molecular data from meiofauna such as eutardigrades and bdelloid rotifers. The lack of consistent morphological information and the extreme abundance of these classes make morphological identification of rare, or even common, cryptic taxa a large and unwieldy task. This limits the ability to perform large-scale surveys of the diversity of these organisms. Here we demonstrate a culture-independent molecular survey approach that enables the generation of large amounts of eutardigrade and bdelloid rotifer sequence data directly from soil. Our PCR primers, specific to the 18S small-subunit rRNA gene, were developed for both eutardigrades and bdelloid rotifers. Results The developed primers successfully amplified DNA of their target organisms from various soil DNA extracts. This was confirmed by both BLAST similarity searches and phylogenetic analyses. Tardigrades showed much better phylogenetic resolution than bdelloids. Both groups of organisms exhibited varying levels of endemism. Conclusion The development of clade-specific primers for characterizing eutardigrades and bdelloid rotifers from environmental samples should greatly increase our ability to characterize the composition of these taxa in environmental samples. Environmental sequencing as shown here differs from other molecular survey methods in that there is no need to pre-isolate the organisms of interest from soil in order to amplify their DNA. The DNA sequences obtained from methods that do not require culturing can be identified post hoc and placed phylogenetically as additional closely related sequences are obtained from morphologically identified conspecifics.
Our non-cultured environmental sequence based approach will be able to provide a rapid and large-scale screening of the presence, absence and diversity of Bdelloidea and Eutardigrada in a variety of soils. PMID:20003362
Birkhofer, Klaus; Schöning, Ingo; Alt, Fabian; Herold, Nadine; Klarner, Bernhard; Maraun, Mark; Marhan, Sven; Oelmann, Yvonne; Wubet, Tesfaye; Yurkov, Andrey; Begerow, Dominik; Berner, Doreen; Buscot, François; Daniel, Rolf; Diekötter, Tim; Ehnes, Roswitha B.; Erdmann, Georgia; Fischer, Christiane; Foesel, Bärbel; Groh, Janine; Gutknecht, Jessica; Kandeler, Ellen; Lang, Christa; Lohaus, Gertrud; Meyer, Annabel; Nacke, Heiko; Näther, Astrid; Overmann, Jörg; Polle, Andrea; Pollierer, Melanie M.; Scheu, Stefan; Schloter, Michael; Schulze, Ernst-Detlef; Schulze, Waltraud; Weinert, Jan; Weisser, Wolfgang W.; Wolters, Volkmar; Schrumpf, Marion
2012-01-01
Very few principles have been unraveled that explain the relationship between soil properties and soil biota across large spatial scales and different land-use types. Here, we seek these general relationships using data from 52 differently managed grassland and forest soils in three study regions spanning a latitudinal gradient in Germany. We hypothesize that, after extraction of variation that is explained by location and land-use type, soil properties still explain significant proportions of variation in the abundance and diversity of soil biota. If the relationships between predictors and soil organisms were analyzed individually for each predictor group, soil properties explained the highest amount of variation in soil biota abundance and diversity, followed by land-use type and sampling location. After extraction of variation that originated from location or land-use, abiotic soil properties explained significant amounts of variation in fungal, meso- and macrofauna, but not in yeast or bacterial biomass or diversity. Nitrate or nitrogen concentration and fungal biomass were positively related, but nitrate concentration was negatively related to the abundances of Collembola and mites and to the myriapod species richness across a range of forest and grassland soils. The species richness of earthworms was positively correlated with clay content of soils independent of sample location and land-use type. Our study indicates that after accounting for heterogeneity resulting from large scale differences among sampling locations and land-use types, soil properties still explain significant proportions of variation in fungal and soil fauna abundance or diversity. However, soil biota was also related to processes that act at larger spatial scales and bacteria or soil yeasts only showed weak relationships to soil properties. 
We therefore argue that more general relationships between soil properties and soil biota can only be derived from future studies that consider larger spatial scales and different land-use types. PMID:22937029
Background derivation and image flattening: getimages
NASA Astrophysics Data System (ADS)
Men'shchikov, A.
2017-11-01
Modern high-resolution images obtained with space observatories display extremely strong intensity variations across the image on all spatial scales. Source extraction in such images with methods based on global thresholding may bring unacceptably large numbers of spurious sources in bright areas while failing to detect sources in low-background or low-noise areas. It would be highly beneficial to subtract the background and equalize the levels of small-scale fluctuations in the images before extracting sources or filaments. This paper describes getimages, a new method of background derivation and image flattening. It is based on median filtering with sliding windows that correspond to a range of spatial scales from the observational beam size up to a maximum structure width X_λ. The latter is the single free parameter of getimages and can be evaluated manually from the observed image I_λ. The median filtering algorithm provides a background image B̃_λ for structures of all widths below X_λ. The same median filtering procedure applied to an image of standard deviations D_λ, derived from the background-subtracted image S̃_λ, results in a flattening image F̃_λ. Finally, a flattened detection image I_λD = S̃_λ / F̃_λ is computed, whose standard deviations are uniform outside sources and filaments. Detecting sources in such greatly simplified images results in much cleaner extractions that are more complete and reliable. As a bonus, getimages reduces various observational and map-making artifacts and equalizes noise levels between independent tiles of mosaicked images.
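The multi-scale median-filtering pipeline lends itself to a compact sketch: successively wider median filters yield the background, and a median-filtered map of local standard deviations flattens the residual. This is a schematic reading of the abstract, not the published algorithm in full; the function name, scale spacing, and window choices are my own:

```python
import numpy as np
from scipy import ndimage

def flatten_image(img, beam_px, max_width_px, n_scales=4):
    """Background subtraction and flattening in the spirit of getimages.

    Median-filter the image with windows growing from the beam size up to
    the maximum structure width, take the widest median as the background,
    then flatten the background-subtracted image by a median-filtered map
    of its local standard deviation.
    """
    bg = img.astype(float).copy()
    for size in np.linspace(beam_px, max_width_px, n_scales).astype(int):
        bg = ndimage.median_filter(bg, size=max(int(size), 1))
    s = img - bg                                    # background-subtracted image
    local_sd = ndimage.generic_filter(s, np.std, size=max_width_px)
    f = ndimage.median_filter(local_sd, size=max_width_px)
    return s / np.maximum(f, 1e-12)                 # flattened detection image
```

On the flattened image, a single global detection threshold behaves consistently in both bright and faint areas, which is the point of the method.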
Extracting Models in Single Molecule Experiments
NASA Astrophysics Data System (ADS)
Presse, Steve
2013-03-01
Single molecule experiments can now monitor the journey of a protein from its assembly near a ribosome to its proteolytic demise. Ideally, all single molecule data should be self-explanatory. However, data originating from single molecule experiments are particularly challenging to interpret on account of fluctuations and noise at such small scales. Realistically, basic understanding comes from models carefully extracted from the noisy data. Statistical mechanics, and maximum entropy in particular, provide a powerful framework for accomplishing this task in a principled fashion. Here I will discuss our work on extracting conformational memory from single molecule force spectroscopy experiments on large biomolecules. One clear advantage of this method is that we let the data tend towards the correct model; we do not fit the data. I will show that the dynamical model of single molecule dynamics that emerges from this analysis is often more textured and complex than could otherwise come from fitting the data to a preconceived model.
Using Web-Based Knowledge Extraction Techniques to Support Cultural Modeling
NASA Astrophysics Data System (ADS)
Smart, Paul R.; Sieck, Winston R.; Shadbolt, Nigel R.
The World Wide Web is a potentially valuable source of information about the cognitive characteristics of cultural groups. However, attempts to use the Web in the context of cultural modeling activities are hampered by the large-scale nature of the Web and the current dominance of natural language formats. In this paper, we outline an approach to support the exploitation of the Web for cultural modeling activities. The approach begins with the development of qualitative cultural models (which describe the beliefs, concepts and values of cultural groups), and these models are subsequently used to develop an ontology-based information extraction capability. Our approach represents an attempt to combine conventional approaches to information extraction with epidemiological perspectives of culture and network-based approaches to cultural analysis. The approach can be used, we suggest, to support the development of models providing a better understanding of the cognitive characteristics of particular cultural groups.
NASA Astrophysics Data System (ADS)
Poiata, Natalia; Vilotte, Jean-Pierre; Bernard, Pascal; Satriano, Claudio; Obara, Kazushige
2018-02-01
In this study, we demonstrate the capability of an automatic network-based detection and location method to extract and analyse different components of tectonic tremor activity by analysing a 9-day energetic tectonic tremor sequence occurring at the down-dip extension of the subducting slab in southwestern Japan. The applied method exploits the coherency of multi-scale, frequency-selective characteristics of non-stationary signals recorded across the seismic network. Using different characteristic functions in the signal-processing step of the method makes it possible to extract and locate the sources of short-duration impulsive signal transients associated with low-frequency earthquakes and of longer-duration energy transients during the tectonic tremor sequence. Frequency-dependent characteristic functions, based on higher-order statistical properties of the seismic signals, are used for the detection and location of low-frequency earthquakes. This yields a more complete (~6.5 times more events) and time-resolved catalogue of low-frequency earthquakes than the routine catalogue provided by the Japan Meteorological Agency. As such, this catalogue resolves the space-time evolution of low-frequency earthquake activity in great detail, unravelling spatial and temporal clustering, modulation in response to tides, and different scales of space-time migration patterns. In the second part of the study, the detection and source location of longer-duration signal energy transients within the tectonic tremor sequence is performed using characteristic functions built from smoothed frequency-dependent energy envelopes. This leads to a catalogue of longer-duration energy sources during the tectonic tremor sequence, characterized by their durations and 3-D spatial likelihood maps of the energy-release source regions.
The summary 3-D likelihood map for the 9-day tectonic tremor sequence, built from this catalogue, exhibits an along-strike spatial segmentation of the long-duration energy-release regions, matching the large-scale clustering features evidenced by the low-frequency earthquake activity analysis. Further examination of the two catalogues showed that the extracted short-duration low-frequency earthquake activity coincides in space, within about 10-15 km, with the longer-duration energy sources during the tectonic tremor sequence. This observation provides a potential constraint on the size of the longer-duration energy-radiating source region in relation to the clustering of low-frequency earthquake activity during the analysed tectonic tremor sequence. We show that advanced statistical network-based methods offer new capabilities for automatic high-resolution detection, location and monitoring of different scale-components of tectonic tremor activity, enriching existing slow-earthquake catalogues. Systematic application of such methods to large continuous data sets will allow imaging the slow transient seismic energy-release activity at higher resolution and, therefore, provide new insights into the underlying multi-scale mechanisms of slow-earthquake generation.
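The higher-order-statistics characteristic functions used above for low-frequency-earthquake detection can be sketched with a running excess kurtosis, which spikes on impulsive transients while staying near zero on stationary noise (a toy illustration; the window length, synthetic trace, and peak picking are assumptions, not the study's actual processing):

```python
import numpy as np

def running_kurtosis(x, win):
    """Sliding-window excess kurtosis: a simple higher-order-statistics
    characteristic function that peaks on impulsive transients."""
    out = np.zeros(len(x))
    for i in range(win, len(x)):
        w = x[i - win:i]
        m = w.mean()
        s = w.std()
        if s > 0:
            out[i] = np.mean((w - m) ** 4) / s ** 4 - 3.0
    return out

# Synthetic trace: Gaussian noise with one short impulsive "event"
rng = np.random.default_rng(0)
trace = rng.normal(size=2000)
trace[1200:1205] += 8.0
cf = running_kurtosis(trace, win=200)
print(int(np.argmax(cf)))   # peak index lands at/just after the transient
```

In the study's multi-station setting, such single-trace functions are evaluated across the network and their coherency is stacked to detect and locate sources; the sketch shows only why a kurtosis-based function isolates impulsive low-frequency-earthquake-like signals from background noise.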
Large Scale Data Analysis and Knowledge Extraction in Communication Data
2017-03-31
this purpose, we developed a novel method, the "Correlation Density Rank", which finds the probability density distribution of related frequent events on all...which is called "Correlation Density Rank", is developed to derive the community tree from the network. As in the real world, where a network is...Community Structure in Dynamic Social Networks using the Correlation Density Rank," 2014 ASE BigData/SocialCom/Cybersecurity Conference, Stanford
A Study of Rapid Biodegradation of Oily Wastes through Composting.
1979-10-01
effective method for large-scale composting of organic wastes. This research project was based on the principles of the forced aeration technique. The...carbon results in heat loss and subsequent reduction in effectiveness of pathogen destruction. It is therefore desirable to maintain the C/N ratio at a...investigated the effect of composting on the degradation of hydrocarbons in sewage sludge. Sludge extracts were fractionated into classes of compounds and a
Competitive code-based fast palmprint identification using a set of cover trees
NASA Astrophysics Data System (ADS)
Yue, Feng; Zuo, Wangmeng; Zhang, David; Wang, Kuanquan
2009-06-01
A palmprint identification system recognizes a query palmprint image by searching for its nearest neighbor from among all the templates in a database. When applied on a large-scale identification system, it is often necessary to speed up the nearest-neighbor searching process. We use competitive code, which has very fast feature extraction and matching speed, for palmprint identification. To speed up the identification process, we extend the cover tree method and propose to use a set of cover trees to facilitate the fast and accurate nearest-neighbor searching. We can use the cover tree method because, as we show, the angular distance used in competitive code can be decomposed into a set of metrics. Using the Hong Kong PolyU palmprint database (version 2) and a large-scale palmprint database, our experimental results show that the proposed method searches for nearest neighbors faster than brute force searching.
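The decomposition property the method relies on can be illustrated concretely (a hedged sketch: the 3-bit code, template count, and brute-force search are illustrative stand-ins; the actual system builds a cover tree over each component metric rather than scanning linearly):

```python
import numpy as np

def angular_distance(p, q, planes=3):
    """Angular distance between two competitive-code maps, written as a
    sum of per-bit-plane Hamming distances.  Each term is itself a
    metric, which is what permits metric-tree indexing (cover trees).
    p, q: integer code maps with values in [0, 2**planes)."""
    d = 0
    for b in range(planes):
        d += int(np.count_nonzero(((p >> b) ^ (q >> b)) & 1))
    return d

rng = np.random.default_rng(1)
db = rng.integers(0, 8, size=(100, 64))   # 100 templates, 64 code sites
query = db[42].copy()
query[:3] ^= 1                            # slightly perturbed copy of #42
# Brute-force nearest neighbour under the decomposed distance
best = min(range(len(db)), key=lambda i: angular_distance(db[i], query))
print(best)   # → 42
```

Because each per-plane term obeys the triangle inequality, a cover tree per component can prune candidates with distance lower bounds instead of comparing the query against every template, which is the speed-up the paper exploits.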
Dudek, Jozef J.; Edwards, Robert G.
2012-03-21
We present the first comprehensive study of hybrid baryons using lattice QCD methods. Using a large basis of composite QCD interpolating fields, we extract an extensive spectrum of baryon states and isolate those of hybrid character using their relatively large overlap onto operators which sample gluonic excitations. We consider the spectrum of Nucleon and Delta states at several quark masses, finding a set of positive-parity hybrid baryons with quantum numbers $N_{1/2^+}$, $N_{1/2^+}$, $N_{3/2^+}$, $N_{3/2^+}$, $N_{5/2^+}$, and $\Delta_{1/2^+}$, $\Delta_{3/2^+}$ at an energy scale above the first band of `conventional' excited positive-parity baryons. This pattern of states is compatible with a color-octet gluonic excitation having $J^{P}=1^{+}$, as previously reported in the hybrid meson sector, and with a comparable energy scale for the excitation, suggesting a common bound-state construction for hybrid mesons and baryons.
Universal dimer–dimer scattering in lattice effective field theory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elhatisari, Serdar; Katterjohn, Kris; Lee, Dean
We consider two-component fermions with short-range interactions and large scattering length. This system has universal properties that are realized in several different fields of physics. In the limit of large fermion–fermion scattering length a_ff and zero-range interaction, all properties of the system scale proportionally with a_ff. For the case with shallow bound dimers, we calculate the dimer–dimer scattering phase shifts using lattice effective field theory. We extract the universal dimer–dimer scattering length a_dd/a_ff = 0.618(30) and effective range r_dd/a_ff = -0.431(48). This result for the effective range is the first calculation with quantified and controlled systematic errors. We also benchmark our methods by computing the fermion–dimer scattering parameters and testing some predictions of conformal scaling of irrelevant operators near the unitarity limit.
Large Scale Synthesis and Light Emitting Fibers of Tailor-Made Graphene Quantum Dots
Park, Hun; Hyun Noh, Sung; Hye Lee, Ji; Jun Lee, Won; Yun Jaung, Jae; Geol Lee, Seung; Hee Han, Tae
2015-01-01
Graphene oxide (GO), which is an oxidized form of graphene, has a mixed structure consisting of graphitic crystallites of sp2 hybridized carbon and amorphous regions. In this work, we present a straightforward route for preparing graphene-based quantum dots (GQDs) by extraction of the crystallites from the amorphous matrix of the GO sheets. GQDs with controlled functionality are readily prepared by varying the reaction temperature, which results in precise tunability of their optical properties. Here, it was concluded that the tunable optical properties of GQDs result from the different fractions of chemical functionalities present. The synthesis approach presented in this paper provides an efficient strategy for achieving large-scale production and long-term optical stability of the GQDs, and the hybrid assembly of GQDs and polymer has potential applications as photoluminescent fibers or films. PMID:26383257
Wang, Hua; Chu, Yixuan; Fang, Chengran
2017-04-01
The occurrence and distribution of five sulfonamides and three tetracyclines in swine manure sampled from large-scale feedlots in different areas of Zhejiang Province, China were detected using solid-phase extraction and high-performance liquid chromatography. All eight test antibiotics were detected in most of the manure samples. The dominant antibiotics in swine manure were sulfadiazine, sulfamerazine, sulfadimidine, tetracycline, and chlortetracycline. The maximum concentration of residual antibiotic reached up to 57.95 mg/kg (chlortetracycline). The concentrations and distribution of both types of antibiotics in swine manure of different areas varied greatly. Relatively higher concentrations of sulfonamides were found in swine manure from the Zhejiang area in this experiment compared with previous studies. The results revealed that antibiotics were extensively used in feedlots in this district and that animal manure might act as a non-specific source of antibiotic residues in farmlands and aquatic environments.
Listening to the Deep: live monitoring of ocean noise and cetacean acoustic signals.
André, M; van der Schaar, M; Zaugg, S; Houégnigan, L; Sánchez, A M; Castell, J V
2011-01-01
The development and broad use of passive acoustic monitoring techniques have the potential to help assess the large-scale influence of artificial noise on marine organisms and ecosystems. Deep-sea observatories can play a key role in understanding these recent acoustic changes. LIDO (Listening to the Deep Ocean Environment) is an international project that is allowing the real-time, long-term monitoring of marine ambient noise as well as marine mammal sounds at cabled and standalone observatories. Here, we present the overall development of the project and the use of passive acoustic monitoring (PAM) techniques to provide the scientific community with real-time data at large spatial and temporal scales. Special attention is given to the extraction and identification of high-frequency cetacean echolocation signals, given the relevance of detecting target species, e.g. beaked whales, in mitigation processes, e.g. during military exercises. Copyright © 2011. Published by Elsevier Ltd.
Haberl-Meglič, Saša; Levičnik, Eva; Luengo, Elisa; Raso, Javier; Miklavčič, Damijan
2016-12-01
Different chemical and physical methods are used for extraction of proteins from bacteria, which are used in a variety of fields. But on a large scale, many methods have severe drawbacks. Recently, extraction by means of electroporation showed great potential to quickly obtain proteins from bacteria. Since many parameters affect the yield of extracted proteins, our aim was to investigate the effect of temperature and bacterial growth phase on the yield of extracted proteins. At the same time, bacterial viability was tested. Our results showed that temperature has a great effect on protein extraction, the best post-treatment temperature being 4°C. No effect on bacterial viability was observed for any of the temperatures tested. Bacterial growth phase also did not affect the yield of extracted proteins or bacterial viability. Nevertheless, further experiments may need to be performed to confirm this observation, since only one incubation temperature (4°C) and one incubation time before and after electroporation (0.5 and 1 h) were tested for bacterial growth phase. Based on our results, we conclude that temperature is a key element for the bacterial membrane to stay in a permeabilized state, so that more proteins flow out of the bacteria into the surrounding media. Copyright © 2016 Elsevier B.V. All rights reserved.
Zarei, Omid; Dastmalchi, Siavoush; Hamzeh-Mivehroud, Maryam
2016-01-01
Yeasts, especially Saccharomyces cerevisiae, are one of the oldest organisms with a broad spectrum of applications, owing to their unique genetics and physiology. Yeast extract, i.e. the product of yeast cells, is extensively used as a nutritional resource in bacterial culture media. The aim of this study was to develop a simple, rapid and cost-effective process to produce yeast extract. In this procedure, mechanical methods such as high temperature and pressure were utilized to produce the yeast extract. The growth of bacteria fed with the produced yeast extract was monitored in order to assess the quality of the product. The results showed that the quality of the produced yeast extract was very promising, as concluded from the growth pattern of bacterial cells in media prepared from this product, and was comparable with that of three commercial yeast extracts in terms of bacterial growth properties. One of the main advantages of the current method is that no chemicals or enzymes were used, leading to reduced production cost. The method is very simple and cost-effective, and can be performed in a reasonable time, making it suitable for adoption by research laboratories. Furthermore, it can be scaled up to produce large quantities for industrial applications. PMID:28243289
Automatic Extraction of Destinations, Origins and Route Parts from Human Generated Route Directions
NASA Astrophysics Data System (ADS)
Zhang, Xiao; Mitra, Prasenjit; Klippel, Alexander; Maceachren, Alan
Researchers from the cognitive and spatial sciences are studying text descriptions of movement patterns in order to examine how humans communicate and understand spatial information. In particular, route directions offer a rich source of information on how cognitive systems conceptualize movement patterns by segmenting them into meaningful parts. Route directions are composed using a plethora of cognitive spatial organization principles: changing levels of granularity, hierarchical organization, incorporation of cognitively and perceptually salient elements, and so forth. Identifying such information in text documents automatically is crucial for enabling machine-understanding of human spatial language. The benefits are: a) creating opportunities for large-scale studies of human linguistic behavior; b) extracting and georeferencing salient entities (landmarks) that are used by human route direction providers; c) developing methods to translate route directions to sketches and maps; and d) enabling queries on large corpora of crawled/analyzed movement data. In this paper, we introduce our approach and implementations that bring us closer to the goal of automatically processing linguistic route directions. We report on research directed at one part of the larger problem, that is, extracting the three most critical parts of route directions and movement patterns in general: origin, destination, and route parts. We use machine-learning based algorithms to extract these parts of routes, including, for example, destination names and types. We prove the effectiveness of our approach in several experiments using hand-tagged corpora.
Ko, K Y; Nam, K C; Jo, C; Lee, E J; Ahn, D U
2011-05-01
The objective of this study was to develop a new protocol that could be used for large-scale separation of phosvitin from egg yolk using ethanol and salts. Yolk granules, which contain phosvitin, were precipitated after diluting egg yolk with 9 volumes of distilled water. The pH of the yolk solution was adjusted to pH 4.0 to 8.0 using 6 N HCl or NaOH, and then the yolk granules containing phosvitin were separated by centrifugation at 3,220 × g for 30 min. Lipids and phospholipids were removed from the insoluble yolk granules using 85% ethanol. The optimal volume and concentration of ethanol for removing lipids from the precipitants were determined. After centrifugation, the lipid-free precipitants were homogenized with 9 volumes of ammonium sulfate [(NH4)2SO4] or NaCl to extract phosvitin. The optimal pH and concentration of (NH4)2SO4 or NaCl for the highest recovery rate and purity of phosvitin in the final solution were determined. At pH 6.0, all the phosvitin in the diluted egg yolk solution was precipitated. Among the (NH4)2SO4 and NaCl conditions tested, 10% (NH4)2SO4 or 10% NaCl at pH 4.0 yielded the greatest phosvitin extraction from the lipid-free precipitants. The recovery rates of phosvitin using (NH4)2SO4 and NaCl were 72 and 97%, respectively, and their purity was approximately 85%. Salt was removed from the extract using ultrafiltration. The salt-free phosvitin solution was concentrated using ultrafiltration, the impurities were removed by centrifugation, and the resulting solution was freeze-dried. The partially purified phosvitin was suitable for human use because ethanol was the only solvent used to remove lipids, (NH4)2SO4 or NaCl was used to extract phosvitin, and ultrafiltration was used to remove salt and concentrate the extract. The developed method was simple and suitable for large-scale preparation of partially purified phosvitin.
Multi-scale virtual view on the precessing jet SS433
NASA Astrophysics Data System (ADS)
Monceau-Baroux, R.; Porth, O.; Meliani, Z.; Keppens, R.
2014-07-01
Observations of SS433 reveal how an X-ray binary gives rise to a corkscrew-patterned relativistic jet. The XRB SS433 is well known over a large range of scales, for which we carry out 3D simulations and radio mappings. For our study we use relativistic hydrodynamics in special relativity with a relativistic effective polytropic index. We use parameters extracted from observations to impose the thermodynamic conditions of the ISM and jet. We follow the kinetic and thermal energy content of the various ISM and jet regions. Our simulation simultaneously follows the evolution of the population of electrons accelerated by the jet. The evolving spectrum of these electrons, together with an assumed equipartition between dynamic and magnetic pressure, gives input for estimating the radio emission from our simulation. Ray tracing along a line of sight then produces radio mappings of our data. Single snapshots are realized for comparison with VLA observations as in Roberts et al. 2008. A radio movie is produced for comparison with the 41-day movie made with the VLBA instrument. Finally, a larger-scale simulation explores the discrepancy in opening angle, between 10 and 20 degrees, between the large-scale observations of SS433 and its close-in observations.
Industrial Applications of High Power Ultrasonics
NASA Astrophysics Data System (ADS)
Patist, Alex; Bates, Darren
Since the change of the millennium, high-power ultrasound has become an alternative food processing technology applicable to large-scale commercial applications such as emulsification, homogenization, extraction, crystallization, dewatering, low-temperature pasteurization, degassing, defoaming, activation and inactivation of enzymes, particle size reduction, extrusion, and viscosity alteration. This new focus can be attributed to significant improvements in equipment design and efficiency during the late 1990s. Like most innovative food processing technologies, high-power ultrasonics is not an off-the-shelf technology and thus requires careful development and scale-up for each and every application. The objective of this chapter is to present examples of ultrasonic applications that have been successful at the commercialization stage, their advantages and limitations, as well as key lessons learned from scaling up an innovative food technology in general.
NASA Astrophysics Data System (ADS)
Forootan, Ehsan; Kusche, Jürgen
2016-04-01
Geodetic/geophysical observations, such as time series of global terrestrial water storage change or of sea level and temperature change, represent samples of physical processes and therefore contain information about complex physical interactions with many inherent time scales. Extracting relevant information from these samples, for example quantifying the seasonality of a physical process or its variability due to large-scale ocean-atmosphere interactions, is not possible with simple time series approaches. In recent decades, decomposition techniques have found increasing interest for extracting patterns from geophysical observations. Traditionally, principal component analysis (PCA) and, more recently, independent component analysis (ICA) are common techniques to extract statistically orthogonal (uncorrelated) and independent modes that represent the maximum variance of observations, respectively. PCA and ICA can be classified as stationary signal decomposition techniques, since they are based on decomposing the auto-covariance matrix or diagonalizing higher (than two)-order statistical tensors from centered time series. However, the stationarity assumption is obviously not justifiable for many geophysical and climate variables, even after removing cyclic components, e.g., the seasonal cycles. In this paper, we present a new decomposition method, complex independent component analysis (CICA, Forootan, PhD-2014), which can be applied to extract non-stationary (changing in space and time) patterns from geophysical time series. Here, CICA is derived as an extension of real-valued ICA (Forootan and Kusche, JoG-2012), where we (i) define a new complex data set using a Hilbert transformation; the complex time series contain the observed values in their real part and the temporal rate of variability in their imaginary part. (ii) An ICA algorithm based on diagonalization of fourth-order cumulants is then applied to decompose the new complex data set in (i).
(iii) Dominant non-stationary patterns are recognized as independent complex patterns that can be used to represent the space and time amplitude and phase propagations. We present the results of CICA on simulated and real cases, e.g., for quantifying the impact of large-scale ocean-atmosphere interaction on global mass changes. Forootan (PhD-2014) Statistical signal decomposition techniques for analyzing time-variable satellite gravimetry data, PhD Thesis, University of Bonn, http://hss.ulb.uni-bonn.de/2014/3766/3766.htm Forootan and Kusche (JoG-2012) Separation of global time-variable gravity signals into maximally independent components, Journal of Geodesy 86 (7), 477-497, doi: 10.1007/s00190-011-0532-5
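Step (i) of the procedure can be sketched in a few lines (an illustrative toy, not the authors' code: two synthetic "station" series sharing one phase-shifted oscillation, with a complex PCA standing in for the fourth-order-cumulant ICA step of (ii)):

```python
import numpy as np
from scipy.signal import hilbert

# Step (i): turn real time series into analytic (complex) signals whose
# real part is the observation and whose imaginary part is its Hilbert
# transform, carrying the temporal rate of variability.
rng = np.random.default_rng(2)
t = np.linspace(0, 10, 1000)
# Two series observing one propagating oscillation with a phase lag
x1 = np.sin(2 * np.pi * 0.5 * t) + 0.1 * rng.normal(size=t.size)
x2 = np.sin(2 * np.pi * 0.5 * t + 0.8) + 0.1 * rng.normal(size=t.size)
Z = np.vstack([hilbert(x1), hilbert(x2)])      # complex data matrix

# Complex PCA (a stand-in for the cumulant-based ICA): eigenvectors of
# the Hermitian covariance capture joint amplitude-and-phase modes.
C = (Z @ Z.conj().T) / Z.shape[1]
vals, vecs = np.linalg.eigh(C)
print(vals[-1] / vals.sum())   # leading complex mode dominates
```

The key behaviour the sketch demonstrates: a phase-lagged (i.e. propagating, non-stationary) oscillation collapses into a single complex mode, whereas a real-valued PCA would need two modes to represent the same signal.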
Karmee, Sanjib Kumar
2018-02-01
Spent coffee grounds are composed of lipids, carbohydrates, carbonaceous compounds, and nitrogen-containing compounds, among others. Using n-hexane and an n-hexane/isopropanol mixture, the highest oil yields were achieved during Soxhlet extraction of oil from spent coffee grounds. Alternatively, supercritical carbon dioxide can be employed as a green solvent for the extraction of oil. Using advanced chemical and biotechnological methods, spent coffee grounds are converted to various biofuels such as biodiesel, renewable diesel, bioethanol, bioethers, bio-oil, biochar, and biogas. The in-situ transesterification of spent coffee grounds was carried out at a large scale (4 kg), which led to 80-83% biodiesel yield. In addition, a large number of value-added and diversified products, viz. polyhydroxyalkanoates, biosorbents, activated carbon, polyols, polyurethane foam, carotenoids, phenolic antioxidants, and green composites, are obtained from spent coffee grounds. The principles of the circular economy are applied to develop a sustainable biorefinery based on valorisation of spent coffee grounds. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Swanson, C.; Jandovitz, P.; Cohen, S. A.
2017-10-01
Knowledge of the full x-ray energy distribution function (XEDF) emitted from a plasma over a large dynamic range of energies can yield valuable insights into the electron energy distribution function (EEDF) of that plasma and the dynamic processes that create it. X-ray pulse-height detectors such as Amptek's X-123 Fast SDD with silicon nitride window can detect x-rays in the range of 200 eV to 100s of keV. However, extracting the EEDF from this measurement requires precise knowledge of the detector's response function. This response function, including the energy scale calibration, the window transmission function, and the resolution function, can be measured directly. We describe measurements of this function using x-rays from a mono-energetic electron beam in a purpose-built gas-target x-ray tube. Large-Z effects such as line radiation, nuclear charge screening, and polarizational Bremsstrahlung are discussed.
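The role of the response function can be sketched with a minimal forward model (illustrative only: the FWHM, window cutoff, and exponential transmission form are toy assumptions, not the X-123's calibrated values): a mono-energetic line is attenuated by the window transmission and broadened by a Gaussian resolution kernel, and unfolding the EEDF amounts to inverting this kind of map.

```python
import numpy as np

def detector_response(true_e_kev, energies, fwhm_kev=0.15, window_cutoff=0.3):
    """Toy pulse-height response: low-energy window transmission times a
    Gaussian resolution kernel centred on the true line energy."""
    sigma = fwhm_kev / 2.355                   # FWHM -> standard deviation
    gauss = np.exp(-0.5 * ((energies - true_e_kev) / sigma) ** 2)
    gauss /= gauss.sum()                       # unit-area line shape
    transmission = 1.0 - np.exp(-energies / window_cutoff)  # toy window
    return transmission * gauss

e = np.linspace(0.1, 10.0, 2000)               # keV grid
spectrum = detector_response(5.9, e)           # a Mn K-alpha-like line
print(round(float(e[np.argmax(spectrum)]), 2))   # → 5.9
```

At high line energies the window transmission is nearly flat and the measured peak sits at the true energy; near the 200 eV low-energy limit the transmission term skews and suppresses the recorded line, which is why the transmission function must be measured rather than assumed.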
Yan, Rongwei; Zhao, Leilei; Tao, Junfei; Zou, Yong; Xu, Xinjun
2018-05-01
Supercritical fluid extraction with CO2 (SFE-CO2) was utilized for extraction of capsaicin (CA) and dihydrocapsaicin (DHCA) from Capsici Fructus, and then a two-step enrichment method for separating capsaicinoids from the SFE-CO2 extracts was developed. The process involved extraction with aqueous methanol and crystallization by alkali extraction and acid precipitation. Finally, a consecutive high-speed countercurrent chromatography (HSCCC) separation method was successfully applied in the purification of CA and DHCA from the capsaicinoid crystal. The extraction pressure, extraction temperature and volume of co-solvent were optimized at 33 MPa, 41 °C and 75 mL, respectively, using response surface methodology; the extraction rates of CA and DHCA were about 93.18% and 93.49%, respectively. A total of 407.43 mg of capsaicinoid crystal was isolated from the SFE-CO2 extracts obtained from 100 g of capsicum powder by the two-step enrichment method. About 506 mg of CA and 184 mg of DHCA, with purities up to 98.31% and 96.68%, respectively, were obtained from 1 g of capsaicinoid crystal in one HSCCC run of three consecutive sample loadings, without exchanging any solvent system. This method comprising SFE-CO2, a two-step enrichment and HSCCC was efficient, powerful and practical for the large-scale preparation of CA and DHCA from Capsici Fructus with high purity and high yield. © 2017 Society of Chemical Industry.
Extensive screening for herbal extracts with potent antioxidant properties
Niwano, Yoshimi; Saito, Keita; Yoshizaki, Fumihiko; Kohno, Masahiro; Ozawa, Toshihiko
2011-01-01
This paper summarizes our research for herbal extracts with potent antioxidant activity obtained from a large scale screening based on superoxide radical (O2•−) scavenging activity followed by characterization of antioxidant properties. Firstly, scavenging activity against O2•− was extensively screened from ethanol extracts of approximately 1000 kinds of herbs by applying an electron spin resonance (ESR)-spin trapping method, and we chose four edible herbal extracts with prominently potent ability to scavenge O2•−. They are the extracts from Punica granatum (Peel), Syzygium aromaticum (Bud), Mangifera indica (Kernel), and Phyllanthus emblica (Fruit). These extracts were further examined to determine if they also scavenge hydroxyl radical (•OH), by applying the ESR spin-trapping method, and if they have heat resistance as a desirable characteristic feature. Experiments with the Fenton reaction and photolysis of H2O2 induced by UV irradiation demonstrated that all four extracts have potent ability to directly scavenge •OH. Furthermore, the scavenging activities against O2•− and •OH of the extracts of P. granatum (peel), M. indica (kernel) and P. emblica (fruit) proved to be heat-resistant. The results of the review might give useful information when choosing a potent antioxidant as a foodstuff. For instance, the four herbal extracts chosen from extensive screening possess desirable antioxidant properties. In particular, the extracts of the aforementioned three herbs are expected to be suitable for food processing in which thermal devices are used, because of their heat resistance. PMID:21297917
Shetty, Sapna B.; Mahin-Syed-Ismail, Prabu; Varghese, Shaji; Thomas-George, Bibin; Kandathil- Thajuraj, Pathinettam; Baby, Deepak; Haleem, Shaista; Sreedhar, Sreeja
2016-01-01
Background: Ethnomedicine has been gaining admiration for years, but there is still abundant medicinal flora that remains unexplored through research. The study was conducted to assess the in vitro antimicrobial potential and to determine the minimum inhibitory concentration (MIC) of Citrus sinensis peel extracts, with a view to finding a novel extract as a remedy against dental caries pathogens. Material and Methods: Aqueous and ethanol (cold and hot) extracts prepared from the peel of Citrus sinensis were screened for in vitro antimicrobial activity against Streptococcus mutans and Lactobacillus acidophilus using the agar well diffusion method. The lowest inhibitory concentration of each extract, taken as the minimum inhibitory concentration (MIC), was determined for both test organisms. One-way ANOVA with post hoc Bonferroni test was applied for statistical analysis. Confidence level and level of significance were set at 95% and 5%, respectively. Results: Dental caries pathogens were inhibited most by the hot ethanolic extract of Citrus sinensis peel, followed by the cold ethanolic extract. Aqueous extracts were effective only at very high concentrations. The minimum inhibitory concentrations of hot and cold ethanolic extracts of Citrus sinensis peel ranged between 12-15 mg/ml against both dental caries pathogens. Conclusions: Citrus sinensis peel extract was found to be effective against dental caries pathogens and to contain compounds with therapeutic potential. Nevertheless, clinical trials on the effect of these plants are essential before advocating large-scale therapy. Key words: Agar well diffusion, antimicrobial activity, dental caries, Streptococcus mutans, Lactobacillus acidophilus. PMID:26855710
Extraterrestrial resource utilization for economy in space missions
NASA Technical Reports Server (NTRS)
Lewis, J. S.; Ramohalli, K.; Triffet, T.
1990-01-01
The NASA/University of Arizona Space Engineering Research Center is dedicated to research on the discovery, characterization, mapping, beneficiation, extraction, processing, and fabrication of useful products from extraterrestrial material. Schemes for the automated production of low-technology products that are likely to be desired in large quantities in the early stages of any large-scale space activity are identified and developed. This paper summarizes the research program, concentrating upon the production of (1) propellants, both cryogenic and storable; (2) volatiles such as water, nitrogen, and carbon dioxide for use in life-support systems; (3) structural metals; and (4) refractories for use in aerobrakes and furnace linings.
Tkachenko, S.; Baillie, N.; Kuhn, S. E.; ...
2014-04-24
Much less is known about the structure of the neutron than about that of the proton, owing to the absence of free neutron targets. Neutron information is usually extracted from data on nuclear targets such as deuterium, requiring corrections for nuclear binding and nucleon off-shell effects. These corrections are model dependent and have significant uncertainties, especially for large values of the Bjorken scaling variable x. As a consequence, the same data can lead to different conclusions, for example, about the behavior of the d quark distribution in the proton at large x.
Di Maria, Francesco; Micale, Caterina; Sordi, Alessio; Cirulli, Giuseppe; Marionni, Moreno
2013-12-01
The mechanically sorted dry fraction (MSDF) and the fines (<20 mm) arising from the mechanical biological treatment of residual municipal solid waste (RMSW) each contain about 11% w/w of recyclable and recoverable materials. Processing a large sample of MSDF in an existing full-scale mechanical sorting facility equipped with near-infrared and 2-3 dimensional selectors led to the extraction of about 6% w/w of recyclables with respect to the RMSW weight. Maximum selection efficiency was achieved for metals, about 98% w/w, whereas it was lower for Waste Electrical and Electronic Equipment (WEEE), about 2% w/w. After a simulated lab-scale soil washing treatment it was possible to extract about 2% w/w of inert exploitable substances recoverable as construction materials, with respect to the amount of RMSW. The passing curve showed that the inert materials were mainly sand with a particle size ranging from 0.063 to 2 mm. Leaching tests showed quite low heavy metal concentrations, with the exception of the particles retained by the 0.5 mm sieve. Pollutant concentrations were lowest in the leachate from the 10 and 20 mm particle size fractions. Copyright © 2013 Elsevier Ltd. All rights reserved.
Large-eddy simulations of a forced homogeneous isotropic turbulence with polymer additives
NASA Astrophysics Data System (ADS)
Wang, Lu; Cai, Wei-Hua; Li, Feng-Chen
2014-03-01
Large-eddy simulations (LES) based on the temporal approximate deconvolution model were performed for a forced homogeneous isotropic turbulence (FHIT) with polymer additives at moderate Taylor Reynolds number. Finitely extensible nonlinear elastic in the Peterlin approximation model was adopted as the constitutive equation for the filtered conformation tensor of the polymer molecules. The LES results were verified through comparisons with the direct numerical simulation results. Using the LES database of the FHIT in the Newtonian fluid and the polymer solution flows, the polymer effects on some important parameters such as strain, vorticity, drag reduction, and so forth were studied. By extracting the vortex structures and exploring the flatness factor through a high-order correlation function of velocity derivative and wavelet analysis, it can be found that the small-scale vortex structures and small-scale intermittency in the FHIT are all inhibited due to the existence of the polymers. The extended self-similarity scaling law in the polymer solution flow shows no apparent difference from that in the Newtonian fluid flow at the currently simulated ranges of Reynolds and Weissenberg numbers.
Jun, Goo; Wing, Mary Kate; Abecasis, Gonçalo R; Kang, Hyun Min
2015-06-01
The analysis of next-generation sequencing data is computationally and statistically challenging because of the massive volume of data and imperfect data quality. We present GotCloud, a pipeline for efficiently detecting and genotyping high-quality variants from large-scale sequencing data. GotCloud automates sequence alignment, sample-level quality control, variant calling, filtering of likely artifacts using machine-learning techniques, and genotype refinement using haplotype information. The pipeline can process thousands of samples in parallel and requires less computational resources than current alternatives. Experiments with whole-genome and exome-targeted sequence data generated by the 1000 Genomes Project show that the pipeline provides effective filtering against false positive variants and high power to detect true variants. Our pipeline has already contributed to variant detection and genotyping in several large-scale sequencing projects, including the 1000 Genomes Project and the NHLBI Exome Sequencing Project. We hope it will now prove useful to many medical sequencing studies. © 2015 Jun et al.; Published by Cold Spring Harbor Laboratory Press.
NASA Astrophysics Data System (ADS)
Yulaeva, E.; Fan, Y.; Moosdorf, N.; Richard, S. M.; Bristol, S.; Peters, S. E.; Zaslavsky, I.; Ingebritsen, S.
2015-12-01
The Digital Crust EarthCube building block creates a framework for integrating disparate 3D/4D information from multiple sources into a comprehensive model of the structure and composition of the Earth's upper crust, and demonstrates the utility of this model in several research scenarios. One such scenario is the estimation of various crustal properties related to fluid dynamics (e.g., permeability and porosity) at each node of an arbitrary unstructured 3D grid to support continental-scale numerical models of fluid flow and transport. Starting from Macrostrat, an existing 4D database of 33,903 chronostratigraphic units, and employing GeoDeepDive, a software system for extracting structured information from unstructured documents, we construct 3D gridded fields of sediment/rock porosity, permeability, and geochemistry for large sedimentary basins of North America, which will be used to improve our understanding of large-scale fluid flow, chemical weathering rates, and geochemical fluxes into the ocean. In this talk, we discuss the methods, data gaps (particularly in geologically complex terrain), and various physical and geological constraints on interpolation and uncertainty estimation.
Wang, Cong; Li, Jing; Wu, Shanlong; Xia, Chuanfu
2017-01-01
Remote-sensing phenology detection can compensate for deficiencies in field observations and has the advantage of capturing the continuous expression of phenology on a large scale. However, there is some variability in the results of remote-sensing phenology detection derived from different vegetation parameters in satellite time-series data. Since the enhanced vegetation index (EVI) and the leaf area index (LAI) are the most widely used vegetation parameters for remote-sensing phenology extraction, this paper aims to assess the differences in phenological information extracted from EVI and LAI time series and to explore whether either index performs well for all vegetation types on a large scale. To this end, a GLASS (Global Land Surface Satellite Product)-LAI-based phenology product (GLP) was generated using the same algorithm as the MODIS (Moderate Resolution Imaging Spectroradiometer)-EVI phenology product (MLCD) over China from 2001 to 2012. The two phenology products were compared in China for different vegetation types and evaluated using ground observations. The results show that the proportion of missing data is 8.3% for the GLP, less than the 22.8% for the MLCD. The differences between the GLP and the MLCD become stronger as the latitude decreases and also vary among vegetation types. The start of the growing season (SOS) of the GLP is earlier than that of the MLCD for most vegetation types, and the end of the growing season (EOS) of the GLP is generally later than that of the MLCD. Based on ground observations, it can be suggested that the GLP performs better than the MLCD in evergreen needleleaf forests and croplands, while the MLCD performs better than the GLP in shrublands and grasslands. PMID:28867773
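The SOS/EOS dates compared above are extracted from vegetation-index time series. The sketch below shows one generic approach, a dynamic-threshold crossing; it is not the actual algorithm behind the MLCD or GLP products, and the monthly values are hypothetical:

```python
def growing_season(vi, threshold_ratio=0.5):
    """Estimate start/end of growing season (SOS/EOS) indices from an
    annual vegetation-index series as the first/last point at or above
    a dynamic threshold between the seasonal minimum and maximum."""
    lo, hi = min(vi), max(vi)
    thresh = lo + threshold_ratio * (hi - lo)
    above = [i for i, v in enumerate(vi) if v >= thresh]
    return (above[0], above[-1]) if above else (None, None)

# Hypothetical 12-step (monthly) EVI-like series for one pixel
evi = [0.12, 0.13, 0.18, 0.30, 0.45, 0.55, 0.58, 0.52, 0.40, 0.25, 0.15, 0.12]
sos, eos = growing_season(evi)  # (4, 8): roughly May through September
```

Production algorithms additionally smooth the series, fit curves, and handle gaps and snow contamination, which is exactly where missing-data rates like the 8.3% vs 22.8% above come from.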
Wolbachia and DNA barcoding insects: patterns, potential, and problems.
Smith, M Alex; Bertrand, Claudia; Crosby, Kate; Eveleigh, Eldon S; Fernandez-Triana, Jose; Fisher, Brian L; Gibbs, Jason; Hajibabaei, Mehrdad; Hallwachs, Winnie; Hind, Katharine; Hrcek, Jan; Huang, Da-Wei; Janda, Milan; Janzen, Daniel H; Li, Yanwei; Miller, Scott E; Packer, Laurence; Quicke, Donald; Ratnasingham, Sujeevan; Rodriguez, Josephine; Rougerie, Rodolphe; Shaw, Mark R; Sheffield, Cory; Stahlhut, Julie K; Steinke, Dirk; Whitfield, James; Wood, Monty; Zhou, Xin
2012-01-01
Wolbachia is a genus of bacterial endosymbionts that impact the breeding systems of their hosts. Wolbachia can confuse the patterns of mitochondrial variation, including DNA barcodes, because it influences the pathways through which mitochondria are inherited. We examined the extent to which these endosymbionts are detected in routine DNA barcoding, assessed their impact upon insect sequence divergence and identification accuracy, and considered the variation present in Wolbachia COI. Using both standard PCR assays (Wolbachia surface protein, wsp) and bacterial COI fragments, we found evidence of Wolbachia in insect total genomic extracts created for DNA barcode library construction. When >2 million insect COI trace files were examined on the Barcode of Life Data Systems (BOLD), Wolbachia COI was present in 0.16% of the cases. It is possible to generate Wolbachia COI using standard insect primers; however, that amplicon was never confused with the COI of the host. The Wolbachia alleles recovered were predominantly Supergroup A and were broadly distributed geographically and phylogenetically. We conclude that the presence of Wolbachia DNA in total genomic extracts made from insects is unlikely to compromise the accuracy of the DNA barcode library; in fact, the ability to query this DNA library (the database and the extracts) for endosymbionts is one of the ancillary benefits of such a large-scale endeavor, of which we provide several examples. It is our conclusion that regular assays for Wolbachia presence and type can, and should, be adopted by large-scale insect barcoding initiatives. While COI is one of the five multi-locus sequence typing (MLST) genes used for categorizing Wolbachia, there is limited overlap with the eukaryotic DNA barcode region.
Large-scale extraction of brain connectivity from the neuroscientific literature
Richardet, Renaud; Chappelier, Jean-Cédric; Telefont, Martin; Hill, Sean
2015-01-01
Motivation: In neuroscience, as in many other scientific domains, the primary form of knowledge dissemination is through published articles. One challenge for modern neuroinformatics is finding methods to make the knowledge from the tremendous backlog of publications accessible for search, analysis and the integration of such data into computational models. A key example of this is metascale brain connectivity, where results are not reported in a normalized repository. Instead, these experimental results are published in natural language, scattered among individual scientific publications. This lack of normalization and centralization hinders the large-scale integration of brain connectivity results. In this article, we present text-mining models to extract and aggregate brain connectivity results from 13.2 million PubMed abstracts and 630 216 full-text publications related to neuroscience. The brain regions are identified with three different named entity recognizers (NERs) and then normalized against two atlases: the Allen Brain Atlas (ABA) and the atlas from the Brain Architecture Management System (BAMS). We then use three different extractors to assess inter-region connectivity. Results: NERs and connectivity extractors are evaluated against a manually annotated corpus. The complete in litero extraction models are also evaluated against in vivo connectivity data from ABA with an estimated precision of 78%. The resulting database contains over 4 million brain region mentions and over 100 000 (ABA) and 122 000 (BAMS) potential brain region connections. This database drastically accelerates connectivity literature review, by providing a centralized repository of connectivity data to neuroscientists. Availability and implementation: The resulting models are publicly available at github.com/BlueBrain/bluima. Contact: renaud.richardet@epfl.ch Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25609795
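A crude stand-in for the aggregation step of such a pipeline: after NER and atlas normalization, candidate region-region connections can be counted from co-mentions per abstract. The paper's dedicated connectivity extractors are more sophisticated than bare co-mention, and the region names here are illustrative:

```python
from collections import Counter
from itertools import combinations

def aggregate_connections(mentions_per_abstract):
    """Count candidate region-region connections by co-mention.
    Each input item is the set of normalized region names recognized
    in one abstract; pairs are stored in sorted order so (a, b) and
    (b, a) accumulate into the same key."""
    counts = Counter()
    for regions in mentions_per_abstract:
        for a, b in combinations(sorted(regions), 2):
            counts[(a, b)] += 1
    return counts

# Toy corpus: region sets recognized in three abstracts
abstracts = [
    {"hippocampus", "entorhinal cortex"},
    {"hippocampus", "entorhinal cortex", "amygdala"},
    {"thalamus"},
]
counts = aggregate_connections(abstracts)
# counts[("entorhinal cortex", "hippocampus")] == 2
```

Thresholding or weighting such counts (e.g., by extractor confidence) is one way a centralized connectivity repository like the one described can rank candidate connections for review.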
Geophysical Potential for Wind Energy over the Open Oceans
NASA Astrophysics Data System (ADS)
Possner, A.; Caldeira, K.
2017-12-01
Wind turbines continuously remove kinetic energy from the lower troposphere thereby reducing the wind speed near hub height. The rate of electricity generation in large wind farms containing multiple wind arrays is therefore constrained by the rate of kinetic energy replenishment from the atmosphere above. In particular, this study focuses on the maximum sustained transport of kinetic energy through the troposphere to the lowest hundreds of meters above the surface. In recent years, a growing body of research argues that the rate of generated power is limited to around 1.5 Wm-2 within large wind farms. However, in this study we demonstrate that considerably higher power generation rates may be sustainable over some open ocean areas in giant wind farms. We find that in the North Atlantic maximum extraction rates of up to 6.7 Wm-2 may be sustained by the atmosphere in the annual mean over giant wind farm areas approaching the size of Greenland. In contrast, only a third of this rate is sustained on land for areas of equivalent size. Our simulations indicate a fundamental difference in response of the troposphere and its vertical kinetic energy flux to giant near-surface wind farms. We find that the surface heat flux from the oceans to the atmosphere may play an important role in creating regions where large sustained rates of downward transport of kinetic energy and thus rates of kinetic energy extraction may be geophysically possible. While no commercial-scale deep-water wind turbines yet exist, our results suggest that such technologies, if they became technically and economically feasible, could potentially provide civilization-scale power.
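The closing "civilization-scale power" claim can be checked with back-of-envelope arithmetic; the Greenland area figure used below is an assumed round number, not taken from the abstract:

```python
# Sustained extraction rate from the abstract (annual mean, North Atlantic)
extraction_w_per_m2 = 6.7
# Approximate area of Greenland in m^2 (assumed ~2.2 million km^2)
area_m2 = 2.2e12

# Total power in terawatts for a wind-farm area of that size
power_tw = area_m2 * extraction_w_per_m2 / 1e12  # ~14.7 TW
```

For comparison, global primary power consumption is of order 18 TW, so a farm sustaining 6.7 W m-2 over a Greenland-sized area would indeed be of civilization scale.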
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sola, M.; Haakon Nordby, L.; Dailey, D.V.
High resolution 3-D visualization of horizon interpretation and seismic attributes from large 3-D seismic surveys in deepwater Nigeria has greatly enhanced the exploration team's ability to quickly recognize prospective segments of subregional and prospect-specific scale areas. Integrated workstation-generated structure, isopach, and extracted horizon-consistent, interval, and windowed attributes are particularly useful in illustrating the complex structural and stratigraphic prospectivity of deepwater Nigeria. Large 3-D seismic volumes acquired over 750 square kilometers can be manipulated within the visualization system with attribute tracking capability that allows for real-time data interrogation and interpretation. As in classical seismic stratigraphic studies, pattern recognition is fundamental to effective depositional facies interpretation and reservoir model construction. The 3-D perspective enhances the data interpretation through clear representation of relative scale, spatial distribution, and magnitude of attributes. In deepwater Nigeria, many prospective traps rely on an interplay between syndepositional structure and slope turbidite depositional systems. Reservoir systems in many prospects appear to be dominated by unconfined to moderately focused slope feeder channel facies. These units have spatially complex facies architecture, with feeder channel axes separated by extensive interchannel areas. Structural culminations generally have a history of initial compressional folding with later extensional collapse and accommodation faulting. The resulting complex trap configurations often have stacked reservoirs over intervals as thick as 1500 meters. Exploration, appraisal, and development scenarios in these settings can be optimized by taking full advantage of integrating high resolution 3-D visualization and seismic workstation interpretation.
Bacteremia after supragingival scaling and dental extraction: Culture and molecular analyses.
Reis, L C; Rôças, I N; Siqueira, J F; de Uzeda, M; Lacerda, V S; Domingues, Rmcp; Miranda, K R; Saraiva, R M
2018-05-01
To study the incidence and magnitude of bacteremia after dental extraction and supragingival scaling. Blood samples were taken before and 5 and 30 min after dental extraction and supragingival scaling from individuals at high (n = 44) or negligible risk (n = 51) for infective endocarditis. The former received prophylactic antibiotic therapy. Samples were subjected to aerobic and anaerobic culture and quantitative real-time polymerase chain reaction to determine the incidence of bacteremia and total bacterial levels. Patients who did not receive prophylactic antibiotic therapy had a higher incidence of positive blood cultures (30% at 5 min after extraction) than patients who received prophylactic antibiotic therapy (0% at 5 min after extraction; p < .01). Molecular analysis did not reveal significant differences in the incidence or magnitude of bacteremia between the two patient groups either 5 or 30 min after each of the procedures evaluated. Extraction was associated with a higher incidence of bacteremia than supragingival scaling by both blood culture (p = .03) and molecular analysis (p = .05). Molecular methods revealed that dental extraction and supragingival scaling were associated with a similar incidence of bacteremia in the groups receiving and not receiving prophylactic antibiotic therapy. However, blood culture revealed that antibiotic therapy reduced viable cultivable bacteria in the bloodstream in the extraction group. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Su, Xin-Yao; Xue, Jian-Ping; Wang, Cai-Xia
2016-11-01
The functional ingredients in Chinese materia medica are the main active substances of traditional Chinese medicine, and most of them are secondary metabolite derivatives. Until now, the main way to obtain these functional ingredients has been direct extraction from the Chinese materia medica. However, the yield is very low because of high extraction costs and the dwindling supply of medicinal plants. Synthetic biology, as a new microbial approach, can enable large-scale production of functional ingredients and greatly ease the shortage of traditional Chinese medicine ingredients. This review focuses on recent advances in synthetic biology for the production of functional ingredients. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Wang, Min; Cui, Qi; Wang, Jie; Ming, Dongping; Lv, Guonian
2017-01-01
In this paper, we first propose several novel concepts for object-based image analysis, including line-based shape regularity, line density, and scale-based best feature value (SBV), based on the region-line primitive association framework (RLPAF). We then propose a raft cultivation area (RCA) extraction method for high spatial resolution (HSR) remote sensing imagery based on multi-scale feature fusion and spatial rule induction. The proposed method includes the following steps: (1) Multi-scale region primitives (segments) are obtained by the image segmentation method HBC-SEG, and line primitives (straight lines) are obtained by a phase-based line detection method. (2) Association relationships between regions and lines are built based on RLPAF, and then multi-scale RLPAF features are extracted and SBVs are selected. (3) Several spatial rules are designed to extract RCAs within sea waters after land-water separation. Experiments show that the proposed method can successfully extract different-shaped RCAs from HSR images with good performance.
Phillips, Carolyn L.; Guo, Hanqi; Peterka, Tom; ...
2016-02-19
In type-II superconductors, the dynamics of magnetic flux vortices determine their transport properties. In the Ginzburg-Landau theory, vortices correspond to topological defects in the complex order parameter field. Earlier, we introduced a method for extracting vortices from the discretized complex order parameter field generated by a large-scale simulation of vortex matter. With this method, at a fixed time step, each vortex [simplistically, a one-dimensional (1D) curve in 3D space] can be represented as a connected graph extracted from the discretized field. Here we extend this method as a function of time as well. A vortex now corresponds to a 2D space-time sheet embedded in 4D space-time that can be represented as a connected graph extracted from the discretized field over both space and time. Vortices that interact by merging or splitting correspond to the disappearance and appearance of holes in the connected graph in the time direction. This method of tracking vortices, which makes no assumptions about the scale or behavior of the vortices, can track them with a resolution as good as the discretization of the temporally evolving complex scalar field. In addition, even details of the trajectory between time steps can be reconstructed from the connected graph. With this form of vortex tracking, the details of vortex dynamics in a model of a superconducting material can be understood in greater detail than previously possible.
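The connected-graph representation can be illustrated in a toy 1+1D analogue: "defect" cells adjacent in space or time are grouped into one component, a simplified stand-in for the paper's 2D space-time sheets in 4D space-time (this is a sketch of the idea, not the authors' code):

```python
def connected_components(cells):
    """Group 'defect' cells (x, t) on a discretized space-time grid into
    connected components, linking cells adjacent in space or in time."""
    cells = set(cells)
    components, seen = [], set()
    for start in cells:
        if start in seen:
            continue
        stack, component = [start], set()
        while stack:  # depth-first flood fill over 4-neighborhoods
            x, t = stack.pop()
            if (x, t) in seen or (x, t) not in cells:
                continue
            seen.add((x, t))
            component.add((x, t))
            stack += [(x + 1, t), (x - 1, t), (x, t + 1), (x, t - 1)]
        components.append(component)
    return components

# One "world line" spanning two time steps, plus an isolated defect
cells = [(0, 0), (0, 1), (1, 1), (5, 0)]
comps = connected_components(cells)  # two components
```

In the full method the same idea runs over a 3D spatial grid plus time, and a hole appearing in a component along the time direction signals a merge or split event.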
Sánchez-Camargo, Andrea del Pilar; García-Cañas, Virginia; Herrero, Miguel; Cifuentes, Alejandro; Ibáñez, Elena
2016-01-01
In the present work, four green processes have been compared to evaluate their potential to obtain rosemary extracts with in vitro anti-proliferative activity against two colon cancer cell lines (HT-29 and HCT116). The processes, carried out under optimal conditions, were: (1) pressurized liquid extraction (PLE, using a hydroalcoholic mixture as solvent) at lab scale; (2) single-step supercritical fluid extraction (SFE) at pilot scale; (3) intensified two-step sequential SFE at pilot scale; (4) integrated PLE plus supercritical antisolvent fractionation (SAF) at pilot scale. Although higher extraction yields were achieved by using PLE (38.46% dry weight), this extract provided the lowest anti-proliferative activity with no observed cytotoxic effects at the assayed concentrations. On the other hand, extracts obtained using the PLE + SAF process provided the most active rosemary extracts against both colon cancer cell lines, with LC50 ranging from 11.2 to 12.4 µg/mL and from 21.8 to 31.9 µg/mL for HCT116 and HT-29, respectively. In general, active rosemary extracts were characterized by containing carnosic acid (CA) and carnosol (CS) at concentrations above 263.7 and 33.9 mg/g extract, respectively. Some distinct compounds have been identified in the SAF extracts (rosmaridiphenol and safficinolide), suggesting their possible role as additional contributors to the observed strong anti-proliferative activity of CA and CS in SAF extracts. PMID:27941607
NASA Astrophysics Data System (ADS)
Chen, J.; Chen, W.; Dou, A.; Li, W.; Sun, Y.
2018-04-01
A new method for extracting information on damaged buildings, based on an optimal feature space, is put forward building on the traditional object-oriented method. In this new method, the ESP (estimate of scale parameter) tool is used to optimize the segmentation of the image. Then the distance matrix and minimum separation distance of all classes of surface features are calculated through sample selection to find the optimal feature space, which is finally applied to extract damaged buildings from post-earthquake imagery. The overall extraction accuracy reaches 83.1% and the kappa coefficient 0.813. The new method greatly improves extraction accuracy and efficiency compared with the traditional object-oriented method and has good potential for wider application in the information extraction of damaged buildings. In addition, the new method can be applied to post-earthquake images of damaged buildings at different resolutions, seeking the optimal observation scale for damaged buildings through accuracy evaluation. The results suggest that the optimal observation scale for damaged buildings is between 1 m and 1.2 m, which provides a reference for future information extraction of damaged buildings.
The Large Local Hole in the Galaxy Distribution: The 2MASS Galaxy Angular Power Spectrum
NASA Astrophysics Data System (ADS)
Frith, W. J.; Outram, P. J.; Shanks, T.
2005-06-01
We present new evidence for a large deficiency in the local galaxy distribution situated in the ~4000 deg2 APM survey area. We use models guided by the 2dF Galaxy Redshift Survey (2dFGRS) n(z) as a probe of the underlying large-scale structure. We first check the usefulness of this technique by comparing the 2dFGRS n(z) model prediction with the K-band and B-band number counts extracted from the 2MASS and 2dFGRS parent catalogues over the 2dFGRS Northern and Southern declination strips, before turning to a comparison with the APM counts. We find that the APM counts in both the B and K bands indicate a deficiency in the local galaxy distribution of ~30% to z ≈ 0.1 over the entire APM survey area. We examine the implied significance of such a large local hole, considering several possible forms for the real-space correlation function. We find that such a deficiency in the APM survey area indicates an excess of power at large scales over what is expected from the observed 2dFGRS correlation function or predicted from ΛCDM Hubble Volume mock catalogues. In order to check further the clustering at large scales in the 2MASS data, we have calculated the angular power spectrum for 2MASS galaxies. Although in the linear regime (l<30) ΛCDM models can give a good fit to the 2MASS angular power spectrum, over a wider range (l<100) the power spectrum from Hubble Volume mock catalogues suggests that scale-dependent bias may be needed for ΛCDM to fit. However, the modest increase in large-scale power observed in the 2MASS angular power spectrum is still not enough to explain the local hole. If the APM survey area really is 25% deficient in galaxies out to z ≈ 0.1, explanations for the disagreement with observed galaxy clustering statistics include the possibilities that the galaxy clustering is non-Gaussian on large scales or that the 2MASS volume is still too small to represent a `fair sample' of the Universe.
Extending the 2dFGRS redshift survey over the whole APM area would resolve many of the remaining questions about the existence and interpretation of this local hole.
HD-MTL: Hierarchical Deep Multi-Task Learning for Large-Scale Visual Recognition.
Fan, Jianping; Zhao, Tianyi; Kuang, Zhenzhong; Zheng, Yu; Zhang, Ji; Yu, Jun; Peng, Jinye
2017-02-09
In this paper, a hierarchical deep multi-task learning (HD-MTL) algorithm is developed to support large-scale visual recognition (e.g., recognizing thousands or even tens of thousands of atomic object classes automatically). First, multiple sets of multi-level deep features are extracted from different layers of deep convolutional neural networks (deep CNNs), and they are used to achieve more effective accomplishment of the coarse-to-fine tasks for hierarchical visual recognition. A visual tree is then learned by assigning the visually similar atomic object classes with similar learning complexities into the same group, which can provide a good environment for determining the interrelated learning tasks automatically. By leveraging the inter-task relatedness (inter-class similarities) to learn more discriminative group-specific deep representations, our deep multi-task learning algorithm can train more discriminative node classifiers for distinguishing the visually similar atomic object classes effectively. Our HD-MTL algorithm can integrate two discriminative regularization terms to control the inter-level error propagation effectively, and it can provide an end-to-end approach for jointly learning more representative deep CNNs (for image representation) and a more discriminative tree classifier (for large-scale visual recognition) and updating them simultaneously. Our incremental deep learning algorithms can effectively adapt both the deep CNNs and the tree classifier to the new training images and the new object classes. Our experimental results have demonstrated that our HD-MTL algorithm can achieve very competitive results on improving the accuracy rates for large-scale visual recognition.
1987-09-01
can be reduced substantially, compared to using numerical methods to model interconnect parasitics. Although some accuracy might be lost with...conductor widths and spacings listed in Table 21, have been employed for simulation. In the first set of the simulations, planar dielectric inter...model, there are no restrictions on the number of dielectrics and conductors, and the shape of the conductors and the dielectric inter...In the
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramanathan, Arvind; Pullum, Laura L.; Hobson, Tanner C.
2015-08-03
Here, we describe a data-driven unsupervised machine learning approach to extract geo-temporal co-occurrence patterns of asthma and the flu from large-scale electronic healthcare reimbursement claims (eHRC) datasets. Specifically, we examine the eHRC data from the 2009 to 2010 pandemic H1N1 influenza season and analyze whether different geographic regions within the United States (US) showed an increase in co-occurrence patterns of the flu and asthma. Our analyses reveal that the temporal patterns extracted from the eHRC data show a distinct lag time between the peak incidence of asthma and the flu. While the increased occurrence of asthma contributed to increased flu incidence during the pandemic, this co-occurrence is predominant for female patients. The geo-temporal patterns reveal that the co-occurrence of the flu and asthma is typically concentrated within the south-east US. Further, in agreement with previous studies, large urban areas (such as New York, Miami, and Los Angeles) exhibit co-occurrence patterns that suggest a peak incidence of asthma and flu significantly early in the spring and winter seasons. Together, our data-analytic approach, integrated within the Oak Ridge Bio-surveillance Toolkit platform, demonstrates how eHRC data can provide novel insights into co-occurring disease patterns.
Zhang, Hongcai; Yun, Sanyue; Song, Lingling; Zhang, Yiwen; Zhao, Yanyun
2017-03-01
Industrial processing of crustacean shells from crabs and shrimps produces large quantities of by-products, leading to serious environmental pollution and human health problems, yet these by-products can be turned into high-value products such as chitin and chitosan. To prepare them at the large-scale submerged fermentation level, shrimp shell powders (SSPs) were fermented by a successive three-step fermentation of Serratia marcescens B742, Lactobacillus plantarum ATCC 8014 and Rhizopus japonicus M193 to extract chitin and chitosan under previously optimized conditions. Moreover, the key parameters were monitored to follow the changes of the resulting products during the fermentation process. The results showed that the yields of the prepared chitin and chitosan reached 21.35 and 13.11%, with recovery rates of 74.67 and 63.42%, respectively. The degree of deacetylation (DDA) and molecular mass (MM) of the produced chitosan were 81.23% and 512.06 kDa, respectively. The obtained chitin and chitosan were characterized using Fourier transform infrared spectroscopy (FT-IR) and X-ray diffraction (XRD) analysis. The established microbial fermentation method can be applied to the industrial large-scale production of chitin and chitosan while significantly reducing the use of chemical reagents. Copyright © 2016 Elsevier B.V. All rights reserved.
Fan, Jianping; Gao, Yuli; Luo, Hangzai
2008-03-01
In this paper, we have developed a new scheme for achieving multilevel annotations of large-scale images automatically. To achieve more sufficient representation of various visual properties of the images, both the global visual features and the local visual features are extracted for image content representation. To tackle the problem of huge intraconcept visual diversity, multiple types of kernels are integrated to characterize the diverse visual similarity relationships between the images more precisely, and a multiple kernel learning algorithm is developed for SVM image classifier training. To address the problem of huge interconcept visual similarity, a novel multitask learning algorithm is developed to learn the correlated classifiers for the sibling image concepts under the same parent concept and enhance their discrimination and adaptation power significantly. To tackle the problem of huge intraconcept visual diversity for the image concepts at the higher levels of the concept ontology, a novel hierarchical boosting algorithm is developed to learn their ensemble classifiers hierarchically. In order to assist users on selecting more effective hypotheses for image classifier training, we have developed a novel hyperbolic framework for large-scale image visualization and interactive hypotheses assessment. Our experiments on large-scale image collections have also obtained very positive results.
Size scaling of negative hydrogen ion sources for fusion
NASA Astrophysics Data System (ADS)
Fantz, U.; Franzen, P.; Kraus, W.; Schiesko, L.; Wimmer, C.; Wünderlich, D.
2015-04-01
The RF-driven negative hydrogen ion source (H-, D-) for the international fusion experiment ITER has a width of 0.9 m and a height of 1.9 m and is based on a ⅛-scale prototype source that has been in operation at the IPP test facilities BATMAN and MANITU for many years. Among the challenges of meeting the required parameters in a caesiated source at a source pressure of 0.3 Pa or less is the size scaling by a factor of eight. As an intermediate step, a ½-scale ITER source went into operation at the IPP test facility ELISE, with first plasma in February 2013. The experience and results gained so far at ELISE allowed a size scaling study from the prototype source towards the ITER-relevant size, in which operational issues, physical aspects and source performance are addressed, highlighting differences as well as similarities. The most ITER-relevant results are: low-pressure operation down to 0.2 Pa is possible without problems; the magnetic filter field created by a current in the plasma grid is sufficient to reduce the electron temperature below the target value of 1 eV and, together with the bias applied between the differently shaped bias plate and the plasma grid, to reduce the amount of co-extracted electrons. An asymmetry of the co-extracted electron currents in the two grid segments is measured, varying strongly with filter field and bias. Contrary to the prototype source, a pronounced vertical plasma drift is not observed. As in the prototype source, the performance in deuterium is limited by the amount of co-extracted electrons in both short- and long-pulse operation. Caesium conditioning is much harder in deuterium than in hydrogen, for which fast and reproducible conditioning is achieved. First estimates reveal a caesium consumption comparable to that of the prototype source despite the larger size.
Zhang, Dashan; Guo, Jie; Lei, Xiujun; Zhu, Changan
2016-04-22
Developments in image sensors and optics enable the application of vision-based techniques to the non-contact dynamic vibration analysis of large-scale structures. As an emerging technology, a vision-based approach allows for remote measuring and does not add any mass to the measured object, in contrast to traditional contact measurements. In this study, a high-speed vision-based sensor system is developed to extract structure vibration signals in real time. A fast motion extraction algorithm is required for this system because the maximum sampling frequency of the charge-coupled device (CCD) sensor can reach up to 1000 Hz. Two efficient subpixel-level motion extraction algorithms, namely the modified Taylor approximation refinement algorithm and the localization refinement algorithm, are integrated into the proposed vision sensor. Quantitative analysis shows that both modified algorithms are at least five times faster than conventional upsampled cross-correlation approaches and achieve satisfactory error performance. The practicability of the developed sensor is evaluated by an experiment in a laboratory environment and a field test. Experimental results indicate that the developed high-speed vision-based sensor system can extract accurate dynamic structure vibration signals by tracking either artificial targets or natural features.
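The paper's two refinement algorithms are not reproduced here, but the general idea of refining a cross-correlation peak to subpixel precision can be sketched in 1-D with a parabolic fit of the peak and its two neighbours. The Gaussian test signals and all parameter values are illustrative, not the authors' data.

```python
import numpy as np

def subpixel_shift(ref, cur):
    """Estimate the 1-D shift of `cur` relative to `ref` by circular
    cross-correlation, refined by a parabolic fit around the peak."""
    n = len(ref)
    corr = np.fft.ifft(np.fft.fft(cur) * np.conj(np.fft.fft(ref))).real
    k = int(np.argmax(corr))
    # neighbours of the peak (circular indexing)
    y0, y1, y2 = corr[(k - 1) % n], corr[k], corr[(k + 1) % n]
    delta = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)  # vertex of fitted parabola
    shift = k + delta
    if shift > n / 2:          # wrap to signed shifts
        shift -= n
    return shift

t = np.linspace(0, 1, 512, endpoint=False)
ref = np.exp(-((t - 0.50) / 0.05) ** 2)
cur = np.exp(-((t - 0.51) / 0.05) ** 2)   # shifted by ~5.12 samples
print(round(subpixel_shift(ref, cur), 2))
```

Full upsampled cross-correlation achieves similar precision but costs far more computation, which is why fast local refinements matter at 1000 Hz frame rates.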
Chinese character recognition based on Gabor feature extraction and CNN
NASA Astrophysics Data System (ADS)
Xiong, Yudian; Lu, Tongwei; Jiang, Yongyuan
2018-03-01
As an important application in the fields of text line recognition and office automation, Chinese character recognition has become an important subject of pattern recognition. However, owing to the large number of Chinese characters and the complexity of their structure, Chinese character recognition remains difficult. To address this problem, this paper proposes a method for printed Chinese character recognition based on Gabor feature extraction and a Convolutional Neural Network (CNN). The main steps are preprocessing, feature extraction, and training/classification. First, the gray-scale Chinese character image is binarized and normalized to reduce the redundancy of the image data. Second, each image is convolved with Gabor filters at different orientations, and feature maps for eight orientations of the Chinese characters are extracted. Third, the Gabor feature maps and the original image are convolved with learned kernels, and the results of the convolution are the input of the pooling layer. Finally, the feature vector is used for classification and recognition. In addition, the generalization capacity of the network is improved by the Dropout technique. The experimental results show that this method can effectively extract the characteristics of Chinese characters and recognize them.
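The second step, convolving the image with Gabor filters at several orientations, can be sketched with a small numpy/scipy filter bank. The kernel size and the `sigma`/`lam` parameters are illustrative choices, not the paper's settings.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(theta, ksize=9, sigma=2.0, lam=4.0):
    """Real part of a Gabor kernel at orientation theta (radians)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def gabor_features(image, n_orient=8):
    """One feature map per orientation, as in the paper's feature-extraction step."""
    thetas = [k * np.pi / n_orient for k in range(n_orient)]
    return np.stack([convolve(image, gabor_kernel(t)) for t in thetas])

img = np.zeros((32, 32))
img[:, 16] = 1.0                       # a vertical character stroke
maps = gabor_features(img)
print(maps.shape)                      # (8, 32, 32)
```

The stacked orientation maps, together with the original image, would then be fed to the CNN's convolutional and pooling layers.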
Large scale screening of commonly used Iranian traditional medicinal plants against urease activity
2012-01-01
Background and purpose of the study H. pylori infection is an important etiologic factor usually leading to gastric disease, and the urease enzyme plays the most crucial role in protecting the bacteria in the acidic environment of the stomach. Urease inhibitors would therefore increase the sensitivity of the bacteria in an acidic medium. Methods 137 Iranian traditional medicinal plants were examined against Jack bean urease activity by the Berthelot reaction. Each herb was extracted using 50% aqueous methanol. The more effective extracts were further tested and their IC50 values were determined. Results 37 of the 137 crude extracts revealed strong urease inhibitory activity (more than 70% inhibition of urease activity at 10 mg/ml concentration). Nine of the studied crude extracts were found to be the most effective, with IC50 values less than 500 μg/ml: Rheum ribes, Sambucus ebulus, Pistachia lentiscus, Myrtus communis, Areca catechu, Citrus aurantifolia, Myristica fragrans, Cinnamomum zeylanicum and Nicotiana tabacum. Conclusions The most potent urease inhibition was observed for Sambucus ebulus and Rheum ribes extracts, with IC50 values of 57 and 92 μg/ml, respectively. PMID:23351780
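IC50 values of the kind reported here are typically read off a dose-response curve. A minimal sketch, using hypothetical inhibition data rather than the paper's measurements, interpolates the 50%-inhibition concentration on a log-concentration scale:

```python
import numpy as np

def ic50(conc_ug_ml, inhibition_pct):
    """Interpolate the concentration giving 50 % inhibition on a log scale.
    `inhibition_pct` must increase with concentration for np.interp."""
    logc = np.log10(conc_ug_ml)
    return 10 ** np.interp(50.0, inhibition_pct, logc)

# hypothetical dose-response for one extract (not data from the paper)
conc = np.array([10, 50, 100, 500, 1000])   # ug/mL
inhib = np.array([12, 35, 55, 80, 92])      # % inhibition
print(round(ic50(conc, inhib)))
```

In practice a sigmoidal (four-parameter logistic) fit is preferred over linear interpolation, but the log-scale reading is the same idea.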
REMOVAL OF PCBS FROM A CONTAMINATED SOIL USING CF-SYSTEMS SOLVENT EXTRACTION PROCESS
The US EPA's START team in cooperation with EPA's SITE program evaluated a pilot scale solvent extraction process developed by CF-Systems. This process uses liquified propane to extract organic contaminants from soils, sludges, and sediments. A pilot-scale evaluation was conducte...
Naghdi, Mohammad Reza; Smail, Katia; Wang, Joy X; Wade, Fallou; Breaker, Ronald R; Perreault, Jonathan
2017-03-15
The discovery of noncoding RNAs (ncRNAs) and their importance for gene regulation led us to develop bioinformatics tools to pursue the discovery of novel ncRNAs. Finding ncRNAs de novo is challenging, first due to the difficulty of retrieving large numbers of sequences for given gene activities, and second due to exponential demands on calculation needed for comparative genomics on a large scale. Recently, several tools for the prediction of conserved RNA secondary structure were developed, but many of them are not designed to uncover new ncRNAs, or are too slow for conducting analyses on a large scale. Here we present various approaches using the database RiboGap as a primary tool for finding known ncRNAs and for uncovering simple sequence motifs with regulatory roles. This database also can be used to easily extract intergenic sequences of eubacteria and archaea to find conserved RNA structures upstream of given genes. We also show how to extend analysis further to choose the best candidate ncRNAs for experimental validation. Copyright © 2017 Elsevier Inc. All rights reserved.
Digital geomorphological landslide hazard mapping of the Alpago area, Italy
NASA Astrophysics Data System (ADS)
van Westen, Cees J.; Soeters, Rob; Sijmons, Koert
Large-scale geomorphological maps of mountainous areas are traditionally made using complex symbol-based legends. They can serve as excellent "geomorphological databases", from which an experienced geomorphologist can extract a large amount of information for hazard mapping. However, these maps are not designed to be used in combination with a GIS, due to their complex cartographic structure. In this paper, two methods are presented for digital geomorphological mapping at large scales using GIS and digital cartographic software. The methods are applied to an area with a complex geomorphological setting in the Borsoia catchment, located in the Alpago region near Belluno in the Italian Alps. The GIS database set-up is presented with an overview of the data layers that have been generated and how they are interrelated. The GIS database was also converted into a paper map, using a digital cartographic package. The resulting large-scale geomorphological hazard map is attached. The resulting GIS database and cartographic product can be used to analyse the hazard type and hazard degree for each polygon, and to find the reasons for the hazard classification.
Multi-scale signed envelope inversion
NASA Astrophysics Data System (ADS)
Chen, Guo-Xin; Wu, Ru-Shan; Wang, Yu-Qing; Chen, Sheng-Chang
2018-06-01
Envelope inversion based on the modulation signal model was proposed to reconstruct large-scale structures of underground media. To overcome the shortcomings of conventional envelope inversion, multi-scale envelope inversion was proposed, using a new envelope Fréchet derivative and a multi-scale inversion strategy to invert strong-contrast models. In multi-scale envelope inversion, amplitude demodulation is used to extract the low-frequency information from the envelope data. However, using only amplitude demodulation loses the polarity information of the wavefield, increasing the possibility that the inversion yields multiple solutions. In this paper we propose a new demodulation method that retains both the amplitude and the polarity information of the envelope data. We then introduce this demodulation method into multi-scale envelope inversion and propose a new misfit functional: multi-scale signed envelope inversion. In numerical tests, we applied the new inversion method to a salt layer model and the SEG/EAGE 2-D Salt model using a low-cut source (frequency components below 4 Hz were truncated). The results demonstrate the effectiveness of the method.
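The idea of a demodulation that retains polarity can be sketched with the analytic-signal envelope re-signed by the trace polarity. This is an assumed form of the operator for illustration, not necessarily the authors' exact definition, and the Gaussian-windowed test wavelet is synthetic.

```python
import numpy as np
from scipy.signal import hilbert

def signed_envelope(trace):
    """Envelope via the analytic signal, multiplied by the sign of the
    trace so that polarity information is kept (a sketch of the idea)."""
    env = np.abs(hilbert(trace))
    return np.sign(trace) * env

t = np.linspace(0, 1, 1000, endpoint=False)
wavelet = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.5) / 0.05) ** 2)
se = signed_envelope(wavelet)
print(se.min() < 0 < se.max())   # negative lobes survive, unlike |envelope|
```

A plain envelope is non-negative everywhere, so two wavelets of opposite polarity would produce identical data; the signed version distinguishes them, which is what reduces the ambiguity in the inversion.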
High Resolution Imaging of the Sun with CORONAS-1
NASA Technical Reports Server (NTRS)
Karovska, Margarita
1998-01-01
We applied several image restoration and enhancement techniques to CORONAS-I images. We carried out the characterization of the Point Spread Function (PSF) using the unique capability of the Blind Iterative Deconvolution (BID) technique, which recovers the real PSF at a given location and time of observation when only limited a priori information is available on its characteristics. We also applied image enhancement techniques to extract the small-scale structure embedded in bright large-scale structures on the disk and on the limb. The results demonstrate the capability of image post-processing to substantially increase the yield from space observations by improving the resolution and reducing noise in the images.
Very large scale characterization of graphene mechanical devices using a colorimetry technique.
Cartamil-Bueno, Santiago Jose; Centeno, Alba; Zurutuza, Amaia; Steeneken, Peter Gerard; van der Zant, Herre Sjoerd Jan; Houri, Samer
2017-06-08
We use a scalable optical technique to characterize more than 21 000 circular nanomechanical devices made of suspended single- and double-layer graphene on cavities with different diameters (D) and depths (g). To maximize the contrast between suspended and broken membranes we used a model for selecting the optimal color filter. The method enables parallel and automatized image processing for yield statistics. We find the survival probability to be correlated with a structural mechanics scaling parameter given by D⁴/g³. Moreover, we extract a median adhesion energy of Γ = 0.9 J m⁻² between the membrane and the native SiO₂ at the bottom of the cavities.
The cosmic web and microwave background fossilize the first turbulent combustion
NASA Astrophysics Data System (ADS)
Gibson, Carl H.; Keeler, R. Norris
2016-10-01
Collisional fluid mechanics theory predicts a turbulent hot big bang at Planck conditions from large, negative turbulence stresses below the Fortov-Kerr limit (< −10¹¹³ Pa). Big bang turbulence fossilized when quarks formed, extracting the mass energy of the universe by extreme negative viscous stresses of inflation, expanding to length scales larger than the horizon scale ct. Viscous-gravitational structure formation by fragmentation was triggered at big bang fossil vorticity turbulence vortex lines during the plasma epoch, as observed by the Planck space telescope. A cosmic web of protogalaxies, protogalaxy clusters, and protogalaxy superclusters that formed in turbulent boundary layers of the spinning voids is hereby identified as expanding turbulence fossils that falsify CDMHC cosmology.
Xenopus extract approaches to studying microtubule organization and signaling in cytokinesis
Field, Christine M.; Pelletier, James F.; Mitchison, Timothy J.
2017-01-01
We report optimized methods for preparing actin-intact Xenopus egg extract. This extract is minimally perturbed, undiluted egg cytoplasm where the cell cycle can be experimentally controlled. It contains abundant organelles and glycogen, and supports active metabolism and cytoskeletal dynamics that closely mimic egg physiology. The concentration of the most abundant ~11,000 proteins is known from mass spectrometry. Actin-intact egg extract can be used for analysis of actin dynamics and interaction of actin with other cytoplasmic systems, as well as microtubule organization. It can be spread as thin layers, and naturally depletes oxygen through mitochondrial metabolism, which makes it ideal for fluorescence imaging. When combined with artificial lipid bilayers, it allows reconstitution and analysis of the spatially controlled signaling that positions the cleavage furrow during early cytokinesis. Actin-intact extract is generally useful for probing the biochemistry and biophysics of the large Xenopus egg. Protocols are provided for preparation of actin-intact egg extract, control of the cell cycle, fluorescent probes for cytoskeleton and cytoskeleton-dependent signaling, preparation of glass surfaces for imaging experiments, and immunodepletion to probe the role of specific proteins and protein complexes. We also describe methods for adding supported lipid bilayers to mimic the plasma membrane and for confining in microfluidic droplets to explore size scaling issues. PMID:28065319
The extraction of bitumen from western oil sands. Annual report, July 1991--July 1992
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oblad, A.G.; Bunger, J.W.; Dahlstrom, D.A.
1992-08-01
The University of Utah tar sand research and development program is concerned with research and development on Utah's extensive oil sands deposits. The program has been intended to develop the scientific and technological base required for eventual commercial recovery of the heavy oils from oil sands and processing of these oils to produce synthetic crude oil and other products such as asphalt. The overall program is based on mining the oil sand, processing the mined sand to recover the heavy oils, and upgrading them to products. Multiple deposits are being investigated, since it is believed that a large-scale (approximately 20,000 bbl/day) plant would require the use of resources from more than one deposit. The tasks or projects in the program are organized according to the following classification: recovery technologies, which include thermal recovery methods, water extraction methods, and solvent extraction methods; upgrading and processing technologies, which cover hydrotreating, hydrocracking, and hydropyrolysis; solvent extraction; production of specialty products; and environmental aspects of the production and processing technologies. These tasks are covered in this report.
Growth of the extremophilic Deinococcus geothermalis DSM 11302 using co-substrate fed-batch culture.
Bornot, Julie; Molina-Jouve, Carole; Uribelarrea, Jean-Louis; Gorret, Nathalie
2014-02-01
Deinococcus geothermalis metabolism has been scarcely studied to date, although new developments on its utilization for bioremediation have been carried out. So, large-scale production of this strain and a better understanding of its physiology are required. A fed-batch experiment was conducted to achieve a high cell density non-limiting culture of D. geothermalis DSM 11302. A co-substrate nutritional strategy using glucose and yeast extract was carried out in a 20-L bioreactor in order to maintain a non-limited growth at a maximal growth rate of 1 h(-1) at 45 °C. Substrate supplies were adjusted by monitoring online culture parameters and physiological data (dissolved oxygen, gas analyses, respiratory quotient, biomass concentration). The results showed that yeast extract could serve as both carbon and nitrogen sources, although glucose and ammonia were consumed too. Yeast extract carbon-specific uptake rate reached a value 4.5 times higher than glucose carbon-specific uptake rate. Cell concentration of 9.6 g L(-1) dry cell weight corresponding to 99 g of biomass was obtained using glucose and yeast extract as carbon and nitrogen sources.
Lipidomics of tobacco leaf and cigarette smoke.
Dunkle, Melissa N; Yoshimura, Yuta; T Kindt, Ruben; Ortiz, Alexia; Masugi, Eri; Mitsui, Kazuhisa; David, Frank; Sandra, Pat; Sandra, Koen
2016-03-25
Detailed lipidomics experiments were performed on the extracts of cured tobacco leaf and of cigarette smoke condensate (CSC) using high-resolution liquid chromatography coupled to quadrupole time-of-flight mass spectrometry (LC-Q-TOF MS). Following automated solid-phase extraction (SPE) fractionation of the lipid extracts, over 350 lipids could be annotated. From a large-scale study on 22 different leaf samples, it was determined that differentiation based on curing type was possible for both the tobacco leaf and the CSC extracts. Lipids responsible for the classification were identified and the findings were correlated to proteomics data acquired from the same tobacco leaf samples. Prediction models were constructed based on the lipid profiles observed in the 22 leaf samples and successfully allowed for curing type classification of new tobacco leaves. A comparison of the leaf and CSC data provided insight into the lipidome changes that occur during the smoking process. It was determined that lipids which survive the smoking process retain the same curing type trends in both the tobacco leaf and CSC data. Copyright © 2015 Elsevier B.V. All rights reserved.
Bruns-Toepler, Markus; Hardt, Philip
2017-07-01
The aims of the present study were: (i) to evaluate the specificity and sensitivity of the Hb Smart enzyme-linked immunosorbent assay (ELISA) (ScheBo Biotech) against colonoscopy results, and (ii) to assess the stability of a new sample collection device containing a newly formulated buffer for extracting haemoglobin, using buffer and stool samples spiked with defined concentrations of haemoglobin. Stool samples were quantified with the ELISA method. The stability of haemoglobin in the extraction buffer and in native stool samples, respectively, was determined daily by ELISA during storage for 5 days at 4°C and at room temperature after addition of haemoglobin. The haemoglobin ELISA had a sensitivity of 78.4% for the detection of colorectal cancer (CRC), with a specificity of 98%. Haemoglobin extracted in the corresponding extraction buffer remained stable throughout storage for 5 days at 4°C and at room temperature. Hb Smart represents a very promising tool for large-scale CRC screening with regard to sample handling, stability and analysis of haemoglobin in faeces. Copyright © 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.
Extraction of land cover change information from ENVISAT-ASAR data in Chengdu Plain
NASA Astrophysics Data System (ADS)
Xu, Wenbo; Fan, Jinlong; Huang, Jianxi; Tian, Yichen; Zhang, Yong
2006-10-01
Land cover data are essential to most global change research objectives, including the assessment of current environmental conditions and the simulation of future environmental scenarios that ultimately lead to public policy development. The Chinese Academy of Sciences generated a nationwide land cover database in order to quantify and spatially characterize land use/cover changes (LUCC) in the 1990s. To keep the database reliable, it must be updated regularly, but it is difficult to obtain suitable remote sensing data for extracting land cover change information at large scale. In particular, optical remote sensing data are hard to acquire over the Chengdu Plain, so the objective of this research was to evaluate multitemporal ENVISAT advanced synthetic aperture radar (ASAR) data for extracting land cover change information. Based on fieldwork and the nationwide 1:100000 land cover database, the paper assesses several land cover changes in the Chengdu Plain, for example crop to buildings, forest to buildings, and forest to bare land. The results show that ENVISAT ASAR data have great potential for extracting land cover change information.
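The paper's classification workflow is not detailed in the abstract, but a standard baseline for change extraction from two co-registered SAR amplitude images is the log-ratio detector, sketched here on synthetic data. The threshold `k`, the simulated scene, and the gamma speckle model are all illustrative assumptions.

```python
import numpy as np

def log_ratio_change(img1, img2, k=3.0):
    """Classic log-ratio change detector for two co-registered SAR amplitude
    images; flags pixels whose log-ratio deviates > k sigma from the mean."""
    eps = 1e-6
    lr = np.log((img2 + eps) / (img1 + eps))
    z = (lr - lr.mean()) / lr.std()
    return np.abs(z) > k

rng = np.random.default_rng(0)
before = rng.gamma(4.0, 25.0, size=(64, 64))   # speckle-like backscatter
after = before.copy()
after[20:30, 20:30] *= 8.0                      # simulated crop -> buildings
mask = log_ratio_change(before, after)
print(mask[20:30, 20:30].mean(), mask.mean())
```

The ratio (rather than difference) formulation is what makes the detector robust to multiplicative speckle noise, which is why it is a common choice for ASAR-type data.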
NASA Astrophysics Data System (ADS)
Nakamura, Yuki; Ashi, Juichiro; Morita, Sumito
2016-04-01
Clarifying the timing and scale of past submarine landslides is important for understanding their formation processes. The study area is part of the continental slope of the Japan Trench, where a number of large-scale submarine landslide (slump) deposits have been identified in Pliocene and Quaternary formations by analysing METI's 3D seismic data "Sanrikuoki 3D" off the Shimokita Peninsula (Morita et al., 2011). As structural features, swarms of parallel dikes, which are likely dewatering paths formed during the slumping deformation, are observed, and slip directions are basically perpendicular to the parallel dikes; parallel dikes are therefore a good indicator for estimating slip directions. The slip direction of each slide was determined on a one-kilometre grid in the 40 km x 20 km survey area. The dominant slip direction varies from the Pliocene to the Quaternary in the survey area. The parallel dike structure is also useful for distinguishing slump deposits from normal deposits on time-slice images. By tracing the outline of the slump deposits at each depth, we identified the general morphology of the overall slump deposits and calculated their volume so as to estimate the scale of each event. We investigated the temporal and spatial variation of the depositional pattern of the slump deposits. Calculating the generation interval of the slumps, some periodicity is recognized; in particular, large slumps do not occur in succession. Additionally, examining the relationship between cumulative volume and generation interval, a certain correlation is observed in the Pliocene and Quaternary. Key words: submarine landslides, 3D seismic data, Shimokita Peninsula
NASA Astrophysics Data System (ADS)
Sofia, G.; Tarolli, P.; Dalla Fontana, G.
2012-04-01
In floodplains, massive investments in land reclamation have always played an important role in the past for flood protection. In these contexts, human alteration is reflected by artificial features ('Anthropogenic features'), such as banks, levees or road scarps, that constantly increase and change, in response to the rapid growth of human populations. For these areas, various existing and emerging applications require up-to-date, accurate and sufficiently attributed digital data, but such information is usually lacking, especially when dealing with large-scale applications. More recently, National or Local Mapping Agencies, in Europe, are moving towards the generation of digital topographic information that conforms to reality and are highly reliable and up to date. LiDAR Digital Terrain Models (DTMs) covering large areas are readily available for public authorities, and there is a greater and more widespread interest in the application of such information by agencies responsible for land management for the development of automated methods aimed at solving geomorphological and hydrological problems. Automatic feature recognition based upon DTMs can offer, for large-scale applications, a quick and accurate method that can help in improving topographic databases, and that can overcome some of the problems associated with traditional, field-based, geomorphological mapping, such as restrictions on access, and constraints of time or costs. Although anthropogenic features as levees and road scarps are artificial structures that actually do not belong to what is usually defined as the bare ground surface, they are implicitly embedded in digital terrain models (DTMs). Automatic feature recognition based upon DTMs, therefore, can offer a quick and accurate method that does not require additional data, and that can help in improving flood defense asset information, flood modeling or other applications. 
In natural contexts, morphological indicators derived from high-resolution topography have proven reliable for practical applications. The use of statistical operators as thresholds for these geomorphic parameters, furthermore, has shown high reliability for feature extraction in mountainous environments. The goal of this research is to test whether these morphological indicators and objective thresholds are also feasible in floodplains, where features assume different characteristics and other artificial disturbances might be present. In this work, three different geomorphic parameters are tested and applied at different scales to a LiDAR DTM of a typical alluvial plain area in the north-east of Italy. The box-plot is applied to identify the threshold for feature extraction, and a filtering procedure is proposed to improve the quality of the final results. The effectiveness of the different geomorphic parameters is analysed by comparing the automatically derived features with the surveyed ones. The results highlight the capability of high-resolution topography, geomorphic indicators and statistical thresholds for anthropogenic feature extraction and characterization in a floodplain context.
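The box-plot thresholding used here can be sketched as the upper fence Q3 + 1.5·IQR applied to a terrain derivative: DTM cells whose value exceeds the fence are flagged as candidate anthropogenic features. The synthetic "curvature" values below are illustrative, not survey data.

```python
import numpy as np

def boxplot_threshold(values):
    """Upper box-plot fence Q3 + 1.5*IQR, used as an objective, parameter-free
    threshold for flagging anomalous geomorphic-parameter values."""
    q1, q3 = np.percentile(values, [25, 75])
    return q3 + 1.5 * (q3 - q1)

rng = np.random.default_rng(1)
curvature = rng.normal(0.0, 1.0, 10000)   # smooth floodplain terrain
curvature[:50] += 15.0                     # cells on a levee or road scarp
thr = boxplot_threshold(curvature)
feature_mask = curvature > thr
print(feature_mask[:50].all(), feature_mask.mean())
```

Because the fence is derived from the data's own quartiles, no site-specific tuning is needed, which is what makes the approach transferable between mountainous and floodplain settings.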
NASA Astrophysics Data System (ADS)
Hamada, Y.; O'Connor, B. L.
2012-12-01
Development in arid environments often results in the loss and degradation of the ephemeral streams that provide habitat and critical ecosystem functions such as water delivery, sediment transport, and groundwater recharge. Quantification of these ecosystem functions is challenging because of the episodic nature of runoff events in desert landscapes and the large spatial scale of watersheds that potentially can be impacted by large-scale development. Low-impact development guidelines and regulatory protection of ephemeral streams are often lacking due to the difficulty of accurately mapping and quantifying the critical functions of ephemeral streams at scales larger than individual reaches. Renewable energy development in arid regions has the potential to disturb ephemeral streams at the watershed scale, and it is necessary to develop environmental monitoring applications for ephemeral streams to help inform land management and regulatory actions aimed at protecting and mitigating for impacts related to large-scale land disturbances. This study focuses on developing remote sensing methodologies to identify and monitor impacts on ephemeral streams resulting from the land disturbance associated with utility-scale solar energy development in the desert southwest of the United States. Airborne very high resolution (VHR) multispectral imagery is used to produce stereoscopic, three-dimensional landscape models that can be used to (1) identify and map ephemeral stream channel networks, and (2) support analyses and models of hydrologic and sediment transport processes that pertain to the critical functionality of ephemeral streams. Spectral and statistical analyses are being developed to extract information about ephemeral channel location and extent, micro-topography, riparian vegetation, and soil moisture characteristics. 
This presentation will demonstrate initial results and provide a framework for future work associated with this project, for developing the field measurements necessary to verify the remote sensing landscape models, and for generating hydrologic models and analyses.
NASA Astrophysics Data System (ADS)
Huang, H.-P.; Wright, I. P.; Gilmour, I.; Pillinger, C. T.
1994-11-01
Silica aerogel represents an ideal material for use as a cosmic dust capture medium. Its low density enables impacting particles to decelerate and stop within a small quantity of the material, but without any severe heating. Hence the particles, which remain unmelted, can subsequently be removed and studied. Since a large proportion of the prospective cosmic dust is likely to be enriched in elements such as carbon and hydrogen (typically 5 wt% C, 20 wt% H2O), it is imperative that the aerogel used in the capture cell contains minimal quantities of these elements. Unfortunately the lowest density aerogels contain carbon at levels of 5 wt%; water is present in even greater amounts. Thus, techniques need to be identified to remove these contaminants. Herein, an attempt is made to use supercritical fluid extraction to remove carbon (and water). The investigation sought to identify the most suitable parameters (i.e., CO2 density, solvating power using single or multiple extractions, use of a modifier, etc.) necessary for removal of contaminants. A set of conditions was derived that was able to remove 90% of carbon contaminants from an aerogel of 0.12 g/cu cm density. This involved the use of multiple extractions with gradient temperatures (i.e., variable CO2 density), but without the use of a methanol modifier. Unfortunately, the same technique was less efficacious at removing carbon from aerogels with densities less than 0.12 g/cu cm. At present the extraction procedure has only been tried on a laboratory scale, but it could clearly be scaled up in the future.
Optimization of protein electroextraction from microalgae by a flow process.
Coustets, Mathilde; Joubert-Durigneux, Vanessa; Hérault, Josiane; Schoefs, Benoît; Blanckaert, Vincent; Garnier, Jean-Pierre; Teissié, Justin
2015-06-01
Classical methods used for large-scale treatments, such as mechanical or chemical extraction, affect the integrity of extracted cytosolic proteins by releasing proteases contained in vacuoles. Our previous experiments on flow-process electroextraction of yeasts proved that pulsed electric field technology preserves the integrity of released cytosolic proteins by not affecting vacuole membranes. Furthermore, large cell culture volumes are easily treated by the flow technology. Based on this previous knowledge, we developed a new protocol to electro-extract total cytoplasmic proteins from microalgae (Nannochloropsis salina, Chlorella vulgaris and Haematococcus pluvialis). Given that induction of electropermeabilization is under the control of target cell size, and as the mean diameter of N. salina is only 2.5 μm, we used repetitive 2 ms pulses of alternating polarity with stronger field strengths than previously described for yeasts. The electric treatment was followed by a 24 h incubation period in a salty buffer. The amount of total protein released was assessed by a classical Bradford assay. A more accurate evaluation of protein release was obtained by SDS-PAGE. Similar results were obtained with C. vulgaris and H. pluvialis under milder electrical conditions, as expected from their larger size. Copyright © 2014 Elsevier B.V. All rights reserved.
Methods for extracting climate indicator data from social media.
NASA Astrophysics Data System (ADS)
Fuka, M. Z.; Fuka, D. R.
2011-12-01
This paper shows how we've used the R software suite to extract climate indicator data from Twitter. In the course of this research we've collected extensive data sets of unsolicited observations ("tweets") for hundreds of climate-related phenological, biological, epidemiological and meteorological effects. R has proved itself in our work as a useful tool for manipulating those large data sets. Our experience from this effort has yielded a variety of insights on using R to extract geophysics-specific information from publicly accessible social media sources. We illustrate our methodology by mapping tweeted US armadillo sightings to explore the impact of climate variability on the extent of the animal's range. This example usefully demonstrates R's technical capabilities in collecting, time-stamping, geolocating, analyzing, visualizing and otherwise processing climate-related data derived from unsolicited social media postings. We also "mash-up" the data sets with those acquired by more traditional means, for example, temperature and precipitation data across the armadillo's US range. Our data-handling practice is extendable to social sharing services other than Twitter, providing the environmental modeling community an opportunity to access largely untapped resources of non-traditional climate indicator data to better understand the effects of climate change at local, regional and global scales.
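A minimal sketch of the reduction step, from raw posts to indicator records (the study used R against the Twitter API; this Python fragment assumes a hypothetical, already-collected record schema with `text`, `time`, `lat` and `lon` keys):

```python
# Hedged sketch: filter geolocated posts mentioning a keyword and
# reduce them to (timestamp, lat, lon) climate-indicator tuples.
def is_sighting(post, keyword="armadillo"):
    """Keep geolocated posts whose text mentions the keyword."""
    return keyword in post["text"].lower() and post.get("lat") is not None

def to_indicator(posts, keyword="armadillo"):
    """Reduce raw posts to (timestamp, lat, lon) indicator tuples."""
    return [(p["time"], p["lat"], p["lon"])
            for p in posts if is_sighting(p, keyword)]
```

The resulting tuples can then be mapped and mashed up with conventional temperature and precipitation records, as described above.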
Hwang, Wonjun; Wang, Haitao; Kim, Hyunwoo; Kee, Seok-Cheol; Kim, Junmo
2011-04-01
The authors present a robust face recognition system for large-scale data sets taken under uncontrolled illumination variations. The proposed face recognition system consists of a novel illumination-insensitive preprocessing method, a hybrid Fourier-based facial feature extraction, and a score fusion scheme. First, in the preprocessing stage, a face image is transformed into an illumination-insensitive image, called an "integral normalized gradient image," by normalizing and integrating the smoothed gradients of a facial image. Then, for feature extraction of complementary classifiers, multiple face models based upon hybrid Fourier features are applied. The hybrid Fourier features are extracted from different Fourier domains in different frequency bandwidths, and each feature is individually classified by linear discriminant analysis. In addition, multiple face models are generated from plural normalized face images that have different eye distances. Finally, to combine scores from the multiple complementary classifiers, a log likelihood ratio-based score fusion scheme is applied. The proposed system is evaluated using the face recognition grand challenge (FRGC) experimental protocols; FRGC is a large publicly available data set. Experimental results on the FRGC version 2.0 data sets show that the proposed method achieves an average verification rate of 81.49% on 2-D face images under various environmental variations such as illumination changes, expression changes, and time lapses.
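The preprocessing idea, dividing the image gradients by a local (smoothed) illumination estimate and re-integrating them, can be sketched as below. This is a loose NumPy interpretation of the "integral normalized gradient image", not the authors' exact formulation:

```python
import numpy as np

def ingi(img, eps=1e-3):
    """Sketch of an integral normalized gradient image: gradients are
    divided by a locally smoothed gradient magnitude (illumination
    estimate), then re-integrated by cumulative summation."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.sqrt(gx**2 + gy**2)
    # crude local smoothing of the magnitude with a 3x3 box mean
    pad = np.pad(mag, 1, mode='edge')
    smooth = sum(pad[i:i + mag.shape[0], j:j + mag.shape[1]]
                 for i in range(3) for j in range(3)) / 9.0
    nx, ny = gx / (smooth + eps), gy / (smooth + eps)
    # integrate the normalized gradients back into an image
    return np.cumsum(nx, axis=1) + np.cumsum(ny, axis=0)
```

Because both the gradients and the local magnitude estimate scale with illumination, their ratio, and hence the re-integrated image, is largely insensitive to a global brightness change.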
Saliency image of feature building for image quality assessment
NASA Astrophysics Data System (ADS)
Ju, Xinuo; Sun, Jiyin; Wang, Peng
2011-11-01
The purpose and method of image quality assessment are quite different for automatic target recognition (ATR) than for traditional applications. Local invariant feature detectors, mainly including corner detectors, blob detectors and region detectors, are widely applied for ATR. This paper proposes a feature-saliency model to evaluate the feasibility of ATR. The first step consists of computing the first-order derivatives in the horizontal and vertical orientations, and computing DoG maps at different scales. Next, feature-saliency images are built based on the auto-correlation matrix at each scale. Then, the feature-saliency images of the different scales are amalgamated. Experiments were performed on a large test set, including infrared images and optical images, and the results showed that the salient regions computed by this model are consistent with the real feature regions computed by most local invariant feature extraction algorithms.
Molecular Origins of Mesoscale Ordering in a Metalloamphiphile Phase
2015-01-01
Controlling the assembly of soft and deformable molecular aggregates into mesoscale structures is essential for understanding and developing a broad range of processes, including rare earth extraction and the cleaning of water, as well as for developing materials with unique properties. By combining synchrotron small- and wide-angle X-ray scattering with large-scale atomistic molecular dynamics simulations, we analyze here a metalloamphiphile–oil solution that organizes on multiple length scales. The molecules associate into aggregates, and aggregates flocculate into meso-ordered phases. Our study demonstrates that dipolar interactions, centered on the amphiphile headgroup, bridge ionic aggregate cores and drive aggregate flocculation. By identifying specific intermolecular interactions that drive mesoscale ordering in solution, we bridge two different length scales that are classically addressed separately. Our results highlight the importance of individual intermolecular interactions in driving mesoscale ordering. PMID:27163014
Chen, Shasha; Zeng, Zhi; Hu, Na; Bai, Bo; Wang, Honglun; Suo, Yourui
2018-03-01
Lycium ruthenicum Murr. (LR) is a functional food that plays an important role in anti-oxidation due to its high level of phenolic compounds. This study aims to optimize ultrasound-assisted extraction (UAE) of phenolic compounds and the antioxidant activities of the obtained LR extracts using response surface methodology (RSM). A four-factor, three-level Box-Behnken design (BBD) was employed to examine the following extraction parameters: extraction time (X1), ultrasonic power (X2), solvent-to-sample ratio (X3) and solvent concentration (X4). The analysis of variance (ANOVA) results revealed that the solvent-to-sample ratio had a significant influence on all responses, while the extraction time had no statistically significant effect on phenolic compounds. The optimum values for the combination of phenolic compounds and antioxidant activities were obtained at X1 = 30 min, X2 = 100 W, X3 = 40 mL/g, and X4 = 33% (v/v). Five phenolic acids, including chlorogenic acid, caffeic acid, syringic acid, p-coumaric acid and ferulic acid, were analyzed by HPLC. Our results indicate that optimizing the extraction is vital for the quantification of phenolic compounds and antioxidant activity in LR, which may contribute to large-scale industrial applications and future research on pharmacological activities. Copyright © 2017 Elsevier Ltd. All rights reserved.
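The RSM step fits a second-order polynomial to the measured responses and reads off the stationary point. A one-factor slice of that model, assuming a simple least-squares fit (illustrative, not the authors' software):

```python
import numpy as np

def fit_quadratic(x, y):
    """Least-squares fit of y = b0 + b1*x + b2*x^2, a one-factor slice
    of an RSM second-order model."""
    A = np.column_stack([np.ones_like(x), x, x**2])
    b, *_ = np.linalg.lstsq(A, y, rcond=None)
    return b

def stationary_point(b):
    """x where dy/dx = b1 + 2*b2*x = 0 (the model's optimum if b2 < 0)."""
    return -b[1] / (2 * b[2])
```

The full four-factor model adds cross terms (xi * xj) in the same way; the ANOVA then tests which coefficients differ significantly from zero.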
Phillips, Carolyn L.; Peterka, Tom; Karpeyev, Dmitry; ...
2015-02-20
In type II superconductors, the dynamics of superconducting vortices determine their transport properties. In the Ginzburg-Landau theory, vortices correspond to topological defects in the complex order parameter. Extracting their precise positions and motion from discretized numerical simulation data is an important, but challenging, task. In the past, vortices have mostly been detected by analyzing the magnitude of the complex scalar field representing the order parameter and visualized by corresponding contour plots and isosurfaces. However, these methods, primarily used for small-scale simulations, blur the fine details of the vortices, scale poorly to large-scale simulations, and do not easily enable isolating and tracking individual vortices. In this paper, we present a method for exactly finding the vortex core lines from a complex order parameter field. With this method, vortices can be easily described at a resolution even finer than the mesh itself. The precise determination of the vortex cores allows the interplay of the vortices inside a model superconductor to be visualized in higher resolution than has previously been possible. Finally, by representing the field as the set of vortices, this method also massively reduces the data footprint of the simulations and provides the data structures for further analysis and feature tracking.
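A standard way to locate vortex cores exactly in a discretized complex order parameter, consistent with the approach described, is to measure the phase winding around each plaquette of the mesh: a cell whose accumulated (wrapped) phase differences sum to ±2π contains a vortex line. A 2D NumPy sketch (illustrative; the paper works in 3D and refines core positions to sub-mesh resolution):

```python
import numpy as np

def wrap(a):
    """Wrap an angle difference into (-pi, pi]."""
    return (a + np.pi) % (2 * np.pi) - np.pi

def find_vortices(psi):
    """Scan plaquettes of a 2D complex order-parameter field for +-2*pi
    phase winding; returns (i, j, charge) per vortex-carrying cell."""
    ph = np.angle(psi)
    out = []
    for i in range(ph.shape[0] - 1):
        for j in range(ph.shape[1] - 1):
            w = (wrap(ph[i, j + 1] - ph[i, j]) +
                 wrap(ph[i + 1, j + 1] - ph[i, j + 1]) +
                 wrap(ph[i + 1, j] - ph[i + 1, j + 1]) +
                 wrap(ph[i, j] - ph[i + 1, j]))
            q = int(round(w / (2 * np.pi)))
            if q != 0:
                out.append((i, j, q))
    return out
```

Because only the cells carrying nonzero winding are kept, the field is reduced to a sparse set of vortex positions, which is the data-footprint reduction the abstract refers to.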
Polymeric assembly of gluten proteins in an aqueous ethanol solvent.
Dahesh, Mohsen; Banc, Amélie; Duri, Agnès; Morel, Marie-Hélène; Ramos, Laurence
2014-09-25
The supramolecular organization of wheat gluten proteins is largely unknown due to the intrinsic complexity of this family of proteins and their insolubility in water. We fractionate gluten in a water/ethanol mixture (50/50 v/v) and obtain a protein extract which is depleted in gliadin, the monomeric part of wheat gluten proteins, and enriched in glutenin, the polymeric part of wheat gluten proteins. We investigate the structure of the proteins in the solvent used for extraction over a wide range of concentration, by combining X-ray scattering and multiangle static and dynamic light scattering. Our data show that, in the ethanol/water mixture, the proteins display features characteristic of flexible polymer chains in a good solvent. In the dilute regime, the proteins form very loose structures of characteristic size 150 nm, with an internal dynamics which is quantitatively similar to that of branched polymer coils. In more concentrated regimes, data highlight a hierarchical structure with one characteristic length scale of the order of a few nm, which displays the scaling with concentration expected for a semidilute polymer in good solvent, and a fractal arrangement at a much larger length scale. This structure is strikingly similar to that of polymeric gels, thus providing some factual knowledge to rationalize the viscoelastic properties of wheat gluten proteins and their assemblies.
Path Searching Based Crease Detection for Large Scale Scanned Document Images
NASA Astrophysics Data System (ADS)
Zhang, Jifu; Li, Yi; Li, Shutao; Sun, Bin; Sun, Jun
2017-12-01
Since large-size documents are usually folded for preservation, creases occur in the scanned images. In this paper, a crease detection method is proposed to locate crease pixels for further processing. Owing to the imaging process of contactless scanners, the shading on the two sides of a crease usually differs considerably. Based on this observation, a convex-hull-based algorithm is adopted to extract the shading information of the scanned image. Then, the candidate crease path is obtained by applying a vertical filter and morphological operations to the shading image. Finally, the accurate crease is detected via Dijkstra path searching. Experimental results on a dataset of real scanned newspapers demonstrate that the proposed method obtains accurate crease locations in large-size document images.
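The final step, Dijkstra path searching, can be sketched as a minimum-cost top-to-bottom path through a cost image in which low cost marks likely crease pixels (a small Python illustration under that assumption, not the authors' implementation):

```python
import heapq

def crease_path(cost):
    """Minimum-cost top-to-bottom path through a 2D cost grid, moving
    straight or diagonally down; Dijkstra over pixel nodes."""
    rows, cols = len(cost), len(cost[0])
    dist = {(0, j): cost[0][j] for j in range(cols)}
    prev = {}
    pq = [(d, rc) for rc, d in dist.items()]
    heapq.heapify(pq)
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if d > dist.get((r, c), float('inf')):
            continue  # stale queue entry
        if r == rows - 1:
            # reconstruct the crease, one pixel per row
            path = [(r, c)]
            while path[-1] in prev:
                path.append(prev[path[-1]])
            return path[::-1]
        for dc in (-1, 0, 1):
            nr, nc = r + 1, c + dc
            if 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    return []
```

Seeding every top-row pixel as a source lets the search pick both the crease's entry column and its path in one pass.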
Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.
He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan
2009-07-01
Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
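The resistor-average distance (RAD) between the LBP histograms of the two facial sides combines the two directed KL divergences the way resistances combine in parallel. A small NumPy sketch (illustrative, with an epsilon guard for empty histogram bins):

```python
import numpy as np

def kl(p, q, eps=1e-12):
    """Directed Kullback-Leibler divergence between two histograms
    (normalized to probability distributions)."""
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

def rad(p, q):
    """Resistor-average distance: KL(p||q) and KL(q||p) combined like
    parallel resistors, giving a symmetric dissimilarity."""
    a, b = kl(p, q), kl(q, p)
    if a + b == 0:
        return 0.0
    return a * b / (a + b)
```

A RAD near zero indicates symmetric facial movement; larger values feed the SVM as evidence of paralysis on the H-B scale.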
Gram-scale purification of aconitine and identification of lappaconitine in Aconitum karacolicum.
Tarbe, M; de Pomyers, H; Mugnier, L; Bertin, D; Ibragimov, T; Gigmes, D; Mabrouk, K
2017-07-01
Aconitum karacolicum from northern Kyrgyzstan (Alatau area) contains about 0.8-1% aconitine as well as other aconite derivatives that have already been identified. In this paper, we compare several methods for the further purification of an Aconitum karacolicum extract initially containing 80% aconitine. Reverse-phase flash chromatography, reverse-phase semi-preparative HPLC, centrifugal partition chromatography (CPC) and recrystallization techniques were evaluated first for their efficiency in reaching the highest purity of aconitine (over 96%) and second for their applicability in a semi-industrial-scale purification process (in our case, 150 g of plant extract). Even though the CPC technique shows the highest purification yield (63%), recrystallization remains the method of choice for purifying a large amount of aconitine as i) it can easily be carried out under safe conditions and ii) an aprotic solvent is used, avoiding aconitine degradation. Moreover, this study led us to the identification of lappaconitine in Aconitum karacolicum, a well-known alkaloid never before found in this Aconitum species. Copyright © 2017 Elsevier B.V. All rights reserved.
Sweeten, Sara E.; Ford, W. Mark
2015-01-01
Large-scale land uses such as residential wastewater discharge and coal mining practices, particularly surface coal extraction and associated valley fills, are of particular ecological concern in central Appalachia. Identification and quantification of both alterations across scales is a necessary first step toward mitigating negative consequences to biota. In central Appalachian headwater streams absent of fish, salamanders are the dominant and most abundant vertebrate predators, filling a significant intermediate trophic role. Stream salamander species are considered sensitive to aquatic stressors and environmental alterations, and past research has linked microhabitat parameters and large-scale land uses such as urbanization and logging with salamander abundances. However, little is known about these linkages in the coalfields of central Appalachia. In the summer of 2013, we visited 70 sites (sampled three times each) in the southwest Virginia coalfields to survey salamanders and quantify stream and riparian microhabitat parameters. Using an information-theoretic framework, we compared the effects of microhabitat and large-scale land use on salamander abundances. Our findings indicate that dusky salamander (Desmognathus spp.) abundances are more strongly correlated with microhabitat parameters such as canopy cover than with subwatershed land uses. Brook salamander (Eurycea spp.) abundances show strong negative associations with suspended sediments and stream substrate embeddedness. Neither Desmognathus spp. nor Eurycea spp. abundances were influenced by water conductivity. These findings suggest that protecting or restoring riparian habitats and controlling erosion are important conservation components for maintaining stream salamanders in the mined landscapes of central Appalachia.
3D fully convolutional networks for subcortical segmentation in MRI: A large-scale study.
Dolz, Jose; Desrosiers, Christian; Ben Ayed, Ismail
2018-04-15
This study investigates a 3D and fully convolutional neural network (CNN) for subcortical brain structure segmentation in MRI. 3D CNN architectures have been generally avoided due to their computational and memory requirements during inference. We address the problem via small kernels, allowing deeper architectures. We further model both local and global context by embedding intermediate-layer outputs in the final prediction, which encourages consistency between features extracted at different scales and embeds fine-grained information directly in the segmentation process. Our model is efficiently trained end-to-end on a graphics processing unit (GPU), in a single stage, exploiting the dense inference capabilities of fully convolutional networks. We performed comprehensive experiments over two publicly available datasets. First, we demonstrate state-of-the-art performance on the IBSR dataset. Then, we report a large-scale multi-site evaluation over 1112 unregistered subject datasets acquired from 17 different sites (ABIDE dataset), with ages ranging from 7 to 64 years, showing that our method is robust to various acquisition protocols, demographics and clinical factors. Our method yielded segmentations that are highly consistent with a standard atlas-based approach, while running in a fraction of the time needed by atlas-based methods and avoiding registration/normalization steps. This makes it convenient for massive multi-site neuroanatomical imaging studies. To the best of our knowledge, our work is the first to study subcortical structure segmentation on such large-scale and heterogeneous data. Copyright © 2017 Elsevier Inc. All rights reserved.
Sequential estimation and satellite data assimilation in meteorology and oceanography
NASA Technical Reports Server (NTRS)
Ghil, M.
1986-01-01
The role of dynamics in estimating the state of the atmosphere and ocean from incomplete and noisy data is discussed and the classical applications of four-dimensional data assimilation to large-scale atmospheric dynamics are presented. It is concluded that sequential updating of a forecast model with continuously incoming conventional and remote-sensing data is the most natural way of extracting the maximum amount of information from the imperfectly known dynamics, on the one hand, and the inaccurate and incomplete observations, on the other.
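The sequential-updating idea, blending a model forecast with an incoming observation according to their respective uncertainties, is captured by the scalar Kalman filter update (a minimal sketch; operational assimilation uses high-dimensional generalizations of this step):

```python
def kalman_step(x_f, p_f, y, r):
    """One sequential-update step: blend the model forecast x_f
    (error variance p_f) with the observation y (error variance r)."""
    k = p_f / (p_f + r)            # Kalman gain
    x_a = x_f + k * (y - x_f)      # analysis state
    p_a = (1 - k) * p_f            # analysis variance (always reduced)
    return x_a, p_a
```

The gain weighs the two information sources: an accurate observation (small r) pulls the analysis toward the data, while an accurate forecast (small p_f) keeps it near the model, which is precisely the "maximum amount of information" argument made above.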
Calculating the Bending Modulus for Multicomponent Lipid Membranes in Different Thermodynamic Phases
2013-01-01
We establish a computational approach to extract the bending modulus, KC, for lipid membranes from relatively small-scale molecular simulations. Fluctuations in the splay of individual pairs of lipids faithfully inform on KC in multicomponent membranes over a large range of rigidities in different thermodynamic phases. Predictions are validated by experiments even where the standard spectral analysis-based methods fail. The local nature of this method potentially allows its extension to calculations of KC in protein-laden membranes. PMID:24039553
The yeast protein extract (RM8323) developed by National Institute of Standards and Technology (NIST) under the auspices of NCI's CPTC initiative is currently available to the public at https://www-s.nist.gov/srmors/view_detail.cfm?srm=8323. The yeast proteome offers researchers a unique biological reference material. RM8323 is the most extensively characterized complex biological proteome and the only one associated with several large-scale studies to estimate protein abundance across a wide concentration range.
Screening and productivity of penicillin antibiotic from Penicillium sp.
Sivakumari, V; Dhinakaran, J; Rajendran, A
2009-10-01
This paper highlights the antagonistic effect of Penicillium isolates, which were screened against test organisms such as Staphylococcus aureus, E. coli and Penicillium sp. Penicillium notatum and Penicillium chrysogenum isolates were used for penicillin biosynthesis. The antibacterial activity of the fermented crude penicillin extract was assayed by the disc diffusion method. Maximum antibacterial activity was observed against the Gram-positive organism (Staphylococcus aureus) compared with Gram-negative organisms. The isolated Penicillium chrysogenum can be used for large-scale penicillin antibiotic production.
Complex Analysis of Combat in Afghanistan
2014-12-01
analysis we have E(f) ~ f^(-β), where β = 2H - 1 = 1 - γ, with H being the Hurst exponent, related to the correlation exponent γ. Usually, real-world data are...statistical nature. In every instance we found strong power law correlations in the data, and were able to extract accurate scaling exponents. On the... exponents, α. The case α < 0.5 corresponds to long-term anti-correlations, meaning that large values are most likely to be followed by small values and
Of Death Stars and Death Rays: A Glimpse At The Future of Space Warfare
2013-04-01
remains in step. The potential for long-term energy mining from the moon (discussed later in this paper) must also be a consideration as there will be a...spacecraft to the Itokawa asteroid, collected soil samples, and safely returned the mission to Earth. In 2007, they demonstrated their mastery...helium-3 is dispersed across the lunar surface, large-scale mining operations and specialized equipment needed to extract the gas from lunar rocks will
Liu, Xiaonan; Ding, Wentao; Jiang, Huifeng
2017-07-19
Plant natural products (PNPs) are widely used as pharmaceuticals, nutraceuticals, seasonings, pigments, etc., with a huge commercial value on the global market. However, most of these PNPs are still being extracted from plants. A resource-conserving and environment-friendly synthesis route for PNPs that utilizes microbial cell factories has attracted increasing attention since the 1940s. However, at present only a handful of PNPs are produced by microbial cell factories at an industrial scale, and many challenges remain in their large-scale application. One of the challenges is that most biosynthetic pathways of PNPs are still unknown, which largely limits the number of candidate PNPs for heterologous microbial production. Another challenge is that the metabolic fluxes toward the target products in microbial hosts are often hindered by poor precursor supply, low catalytic activity of enzymes and obstructed product transport. Consequently, despite intensive studies on the metabolic engineering of microbial hosts, the fermentation costs of most heterologously produced PNPs are still too high for industrial-scale production. In this paper, we review several aspects of PNP production in microbial cell factories, including important design principles and recent progress in pathway mining and metabolic engineering. In addition, implemented cases of industrial-scale production of PNPs in microbial cell factories are also highlighted.
Large-Scale Image Analytics Using Deep Learning
NASA Astrophysics Data System (ADS)
Ganguly, S.; Nemani, R. R.; Basu, S.; Mukhopadhyay, S.; Michaelis, A.; Votava, P.
2014-12-01
High resolution land cover classification maps are needed to increase the accuracy of current land ecosystem and climate model outputs. Few studies demonstrate the state of the art in deriving very high resolution (VHR) land cover products. In addition, most methods rely heavily on commercial software that is difficult to scale given the region of study (e.g., continents to globe). Complexities in present approaches relate to (a) scalability of the algorithm, (b) large image data processing (compute and memory intensive), (c) computational cost, (d) massively parallel architecture, and (e) machine learning automation. In addition, VHR satellite datasets are of the order of terabytes and features extracted from these datasets are of the order of petabytes. In our present study, we have acquired the National Agricultural Imaging Program (NAIP) dataset for the Continental United States at a spatial resolution of 1 m. This data comes as image tiles (a total of a quarter million image scenes with ~60 million pixels each) and has a total size of ~100 terabytes for a single acquisition. Features extracted from the entire dataset would amount to ~8-10 petabytes. In our proposed approach, we have implemented a novel semi-automated machine learning algorithm rooted in the principles of "deep learning" to delineate the percentage of tree cover. In order to perform image analytics in such a granular system, it is mandatory to devise an intelligent archiving and query system for image retrieval, file structuring, metadata processing and filtering of all available image scenes. Using the Open NASA Earth Exchange (NEX) initiative, which is a partnership with Amazon Web Services (AWS), we have developed an end-to-end architecture for designing the database and the deep belief network (following the DistBelief computing model) to solve the grand challenge of scaling this process across the quarter million NAIP tiles that cover the entire Continental United States.
The AWS core components that we use to solve this problem are DynamoDB along with S3 for database query and storage, ElastiCache shared memory architecture for image segmentation, Elastic Map Reduce (EMR) for image feature extraction, and the memory optimized Elastic Cloud Compute (EC2) for the learning algorithm.
Edge-SIFT: discriminative binary descriptor for scalable partial-duplicate mobile search.
Zhang, Shiliang; Tian, Qi; Lu, Ke; Huang, Qingming; Gao, Wen
2013-07-01
As the basis of large-scale partial duplicate visual search on mobile devices, image local descriptor is expected to be discriminative, efficient, and compact. Our study shows that the popularly used histogram-based descriptors, such as scale invariant feature transform (SIFT) are not optimal for this task. This is mainly because histogram representation is relatively expensive to compute on mobile platforms and loses significant spatial clues, which are important for improving discriminative power and matching near-duplicate image patches. To address these issues, we propose to extract a novel binary local descriptor named Edge-SIFT from the binary edge maps of scale- and orientation-normalized image patches. By preserving both locations and orientations of edges and compressing the sparse binary edge maps with a boosting strategy, the final Edge-SIFT shows strong discriminative power with compact representation. Furthermore, we propose a fast similarity measurement and an indexing framework with flexible online verification. Hence, the Edge-SIFT allows an accurate and efficient image search and is ideal for computation sensitive scenarios such as a mobile image search. Experiments on a large-scale dataset manifest that the Edge-SIFT shows superior retrieval accuracy to Oriented BRIEF (ORB) and is superior to SIFT in the aspects of retrieval precision, efficiency, compactness, and transmission cost.
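Matching compact binary descriptors such as Edge-SIFT reduces to Hamming distance between bit strings, which is what makes them cheap on mobile platforms. A minimal sketch with descriptors packed as Python integers (illustrative; the paper additionally uses an indexing framework with online verification):

```python
def hamming(a, b):
    """Hamming distance between two equal-length binary descriptors
    packed as ints: XOR, then count the differing bits."""
    return bin(a ^ b).count("1")

def nearest(query, db):
    """Index of the database descriptor closest to the query."""
    return min(range(len(db)), key=lambda i: hamming(query, db[i]))
```

A single XOR plus popcount per comparison replaces the floating-point distance computations a histogram descriptor like SIFT would require.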
Warren, Jeffrey M; Hanson, Paul J; Iversen, Colleen M; Kumar, Jitendra; Walker, Anthony P; Wullschleger, Stan D
2015-01-01
There is a wide breadth of root function within ecosystems that should be considered when modeling the terrestrial biosphere. Root structure and function are closely associated with control of plant water and nutrient uptake from the soil, plant carbon (C) assimilation, partitioning and release to the soils, and control of biogeochemical cycles through interactions within the rhizosphere. Root function is extremely dynamic and dependent on internal plant signals, root traits and morphology, and the physical, chemical and biotic soil environment. While plant roots have significant structural and functional plasticity to changing environmental conditions, their dynamics are noticeably absent from the land component of process-based Earth system models used to simulate global biogeochemical cycling. Their dynamic representation in large-scale models should improve model veracity. Here, we describe current root inclusion in models across scales, ranging from mechanistic processes of single roots to parameterized root processes operating at the landscape scale. With this foundation we discuss how existing and future root functional knowledge, new data compilation efforts, and novel modeling platforms can be leveraged to enhance root functionality in large-scale terrestrial biosphere models by improving parameterization within models, and introducing new components such as dynamic root distribution and root functional traits linked to resource extraction. No claim to original US Government works. New Phytologist © 2014 New Phytologist Trust.
Matsuura, Tomoaki; Tanimura, Naoki; Hosoda, Kazufumi; Yomo, Tetsuya; Shimizu, Yoshihiro
2017-01-01
To elucidate the dynamic features of a biologically relevant large-scale reaction network, we constructed a computational model of minimal protein synthesis consisting of 241 components and 968 reactions that synthesize the Met-Gly-Gly (MGG) peptide, based on an Escherichia coli-based reconstituted in vitro protein synthesis system. We performed a simulation using parameters collected primarily from the literature and found that the rate of MGG peptide synthesis becomes nearly constant within minutes, thus achieving a steady state similar to experimental observations. In addition, the concentrations of 70% of the components, including intermediates, reached a plateau in a few minutes. However, the concentration change of each component exhibits several temporal plateaus, or a quasi-stationary state (QSS), before reaching the final plateau. To understand these complex dynamics, we focused on whether the components reached a QSS, mapped the arrangement of components in a QSS in the entire reaction network structure, and investigated time-dependent changes. We found that components in a QSS form clusters that grow over time, but not in a linear fashion, and that this process involves the collapse and regrowth of clusters before the formation of a final large single cluster. These observations might commonly occur in other large-scale biological reaction networks. The developed analysis might be useful for understanding large-scale biological reactions by visualizing complex dynamics, thereby extracting the characteristics of the reaction network, including phase transitions. PMID:28167777
NASA Astrophysics Data System (ADS)
Chan, YinThai
2016-03-01
Colloidal semiconductor nanocrystals are ideal fluorophores for clinical diagnostics, therapeutics, and highly sensitive biochip applications due to their high photostability, size-tunable color of emission and flexible surface chemistry. The relatively recent development of core-seeded semiconductor nanorods showed that the presence of a rod-like shell can confer even more advantageous physicochemical properties than their spherical counterparts, such as large multi-photon absorption cross-sections and facet-specific chemistry that can be exploited to deposit secondary nanoparticles. It may be envisaged that these highly fluorescent nanorods can be integrated with large-scale integrated (LSI) microfluidic systems that allow miniaturization and integration of multiple biochemical processes in a single device at the nanoliter scale, resulting in a highly sensitive and automated detection platform. In this talk, I will describe an LSI microfluidic device that integrates RNA extraction, reverse transcription to cDNA, amplification and target pull-down to detect the histidine decarboxylase (HDC) gene directly from human white blood cell samples. When anisotropic colloidal semiconductor nanorods (NRs) were used as the fluorescent readout, the detection limit was found to be 0.4 ng of total RNA, which was much lower than that obtained using spherical quantum dots (QDs) or organic dyes. This was attributed to the large action cross-section of NRs and their high probability of target capture in a pull-down detection scheme. The combination of large-scale integrated microfluidics with highly fluorescent semiconductor NRs may find widespread utility in point-of-care devices and multi-target diagnostics.
Heidari, Zahra; Roe, Daniel R; Galindo-Murillo, Rodrigo; Ghasemi, Jahan B; Cheatham, Thomas E
2016-07-25
Long time scale molecular dynamics (MD) simulations of biological systems are becoming increasingly commonplace due to the availability of both large-scale computational resources and significant advances in the underlying simulation methodologies. Therefore, it is useful to investigate and develop data mining and analysis techniques to quickly and efficiently extract the biologically relevant information from the incredible amount of generated data. Wavelet analysis (WA) is a technique that can quickly reveal significant motions during an MD simulation. Here, the application of WA on well-converged long time scale (tens of μs) simulations of a DNA helix is described. We show how WA combined with a simple clustering method can be used to identify both the physical and temporal locations of events with significant motion in MD trajectories. We also show that WA can not only distinguish and quantify the locations and time scales of significant motions, but by changing the maximum time scale of WA a more complete characterization of these motions can be obtained. This allows motions of different time scales to be identified or ignored as desired.
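The idea of locating significant motions by their wavelet detail energy can be illustrated with a plain Haar transform. This is a minimal sketch, not the authors' WA implementation: the function name is ours, and a single-level Haar decomposition stands in for the full multi-scale analysis described in the abstract.

```python
import numpy as np

def haar_detail_energy(signal, level):
    """Sum of squared Haar detail coefficients at dyadic scale `level` (level >= 1).

    Large detail energy flags segments of the signal with significant
    fluctuation at that time scale, analogous to how WA highlights
    significant motions in an MD trajectory.
    """
    x = np.asarray(signal, dtype=float)
    for _ in range(level):
        n = len(x) // 2 * 2                       # drop a trailing odd sample
        pairs = x[:n].reshape(-1, 2)
        detail = (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)
        x = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2.0)  # approximation for next level
    return float(np.sum(detail ** 2))
```

A quiescent segment yields zero detail energy, while a rapidly fluctuating segment yields a large value, which is the contrast the clustering step in the paper exploits.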
Methods for the Measurement of a Bacterial Enzyme Activity in Cell Lysates and Extracts
Mendz, George; Hazell, Stuart
1998-01-01
The kinetic characteristics and regulation of aspartate carbamoyltransferase (ACTase) activity were studied in lysates and cell extracts of Helicobacter pylori by three different methods. Nuclear magnetic resonance spectroscopy, radioactive tracer analysis, and spectrophotometry were employed in conjunction to identify the properties of the enzyme activity and to validate the results obtained with each assay. NMR spectroscopy was the most direct method to provide proof of ACTase activity; radioactive tracer analysis was the most sensitive technique; and a microtitre-based colorimetric assay was the most cost- and time-efficient for large-scale analyses. Freeze-thawing was adopted as the preferred method for cell lysis in studying enzyme activity in situ. This study showed the benefits of employing several different complementary methods to investigate bacterial enzyme activity. PMID:12734591
Bladergroen, Marco R.; van der Burgt, Yuri E. M.
2015-01-01
For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS-acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply for both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE) including affinity enrichment strategies have been automated. Obtained peptide and protein fractions can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI) without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071
Petigny, Loïc; Périno, Sandrine; Minuti, Matteo; Visinoni, Francesco; Wajsman, Joël; Chemat, Farid
2014-01-01
Microwave extraction and separation was used to increase the concentration of the extract compared to the conventional method at the same solid/liquid ratio, reducing extraction time while simultaneously separating volatile organic compounds (VOCs) from non-volatile organic compounds (NVOCs) of boldo leaves. As a preliminary study, a response surface method was used to optimize the extraction of soluble material and the separation of VOCs from the plant at laboratory scale. The statistical analysis revealed that the optimized conditions were: microwave power 200 W, extraction time 56 min, and a solid/liquid ratio of 7.5% of plant material in water. The lab-scale optimized microwave method is compared to conventional distillation and requires a power/mass ratio of 0.4 W/g of water engaged. This power/mass ratio is kept in order to upscale from lab to pilot plant. PMID:24776762
Sun, Peng-Cheng; Liu, Ying; Yi, Yue-Tao; Li, Hong-Juan; Fan, Ping; Xia, Chuan-Hai
2015-02-01
In the present study, a simple and efficient method for the preparative separation of 3-CQA (3-caffeoylquinic acid) from the extract of Helianthus tuberosus leaves with macroporous resins was studied. ADS-21 showed much higher adsorption capacity and better adsorption/desorption properties for 3-CQA among the tested resins. The adsorption of 3-CQA on ADS-21 resin at 25°C was fitted best by the Langmuir isotherm model and the pseudo-second-order kinetic model. Dynamic adsorption/desorption experiments were carried out in a glass column packed with ADS-21 to optimise the separation process of 3-CQA from H. tuberosus leaf extract. After one treatment with ADS-21, the content of 3-CQA in the product was increased 5.42-fold, from 12.0% to 65.2%, with a recovery yield of 89.4%. The results demonstrated that the method is suitable for large-scale separation and manufacture of 3-CQA from H. tuberosus leaves. Copyright © 2014 Elsevier Ltd. All rights reserved.
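The Langmuir isotherm fit reported above can be reproduced with a short least-squares sketch. This is illustrative only: the function names are ours and the parameter values in the usage example are invented, not the paper's fitted values.

```python
import numpy as np

def langmuir(C, q_m, K_L):
    """Langmuir isotherm: equilibrium loading q_e for equilibrium concentration C_e."""
    return q_m * K_L * C / (1.0 + K_L * C)

def fit_langmuir(C, q):
    """Estimate (q_m, K_L) from the linearized form C/q = C/q_m + 1/(q_m*K_L).

    A straight-line fit of C/q against C gives slope 1/q_m and
    intercept 1/(q_m*K_L).
    """
    slope, intercept = np.polyfit(C, C / q, 1)
    q_m = 1.0 / slope
    K_L = slope / intercept
    return q_m, K_L
```

For example, data generated with `langmuir(C, 40.0, 0.8)` should be recovered by `fit_langmuir` to within numerical precision, which is a quick sanity check before fitting real column data.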
Extraction and LOD control of colored interval volumes
NASA Astrophysics Data System (ADS)
Miyamura, Hiroko N.; Takeshima, Yuriko; Fujishiro, Issei; Saito, Takafumi
2005-03-01
An interval volume serves as a generalized isosurface and represents a three-dimensional subvolume for which the associated scalar field values lie within a user-specified closed interval. In general, it is not an easy task for novices to specify the scalar field interval corresponding to their ROIs. In order to extract interval volumes from which desirable geometric features can be mined effectively, we propose a suggestive technique which extracts interval volumes automatically based on a global examination of the field contrast structure. Also proposed here is a simplification scheme for decimating the resultant triangle patches to realize efficient transmission and rendering of large-scale interval volumes. Color distributions as well as geometric features are taken into account to select the best edges to be collapsed. In addition, when a user wants to selectively display and analyze the original dataset, the simplified dataset is restored to the original quality. Several simulated and acquired datasets are used to demonstrate the effectiveness of the present methods.
Arunachalam, Kantha D; Annamalai, Sathesh Kumar; Hari, Shanmugasundaram
2013-01-01
In this experiment, green-synthesized silver and gold nanoparticles were produced rapidly by treating silver and gold ions with an extract of Memecylon umbellatum leaf. The reaction process was simple and easy to handle, and was monitored using ultraviolet-visible spectroscopy. The effect of the phytochemicals present in M. umbellatum, including saponins, phenolic compounds, phytosterols, and quinones, on formation of stable silver and gold nanoparticles was investigated by Fourier-transform infrared spectroscopy. The morphology and crystalline phase of the nanoparticles were determined by transmission electron microscopy and energy-dispersive x-ray spectroscopy. The results indicate that the saponins, phytosterols, and phenolic compounds present in the plant extract play a major role in formation of silver and gold nanoparticles from their respective ions in solution. The characteristics of the nanoparticles formed suggest application of silver and gold nanoparticles as chemical sensors in the future. Given the simple and eco-friendly approach for synthesis, these nanoparticles could easily be commercialized for large-scale production. PMID:23569372
Yousuf, Abu; Khan, Maksudur Rahman; Islam, M Amirul; Wahid, Zularisam Ab; Pirozzi, Domenico
2017-01-01
Microbial oils are considered an alternative to vegetable oils or animal fats as biodiesel feedstock. Microalgae and oleaginous yeasts are the main candidate microbial oil producers. However, biodiesel synthesis from these sources is associated with high cost and process complexity. The traditional transesterification method includes several steps such as biomass drying, cell disruption, oil extraction, and solvent recovery. Therefore, direct transesterification, or in situ transesterification, which combines all the steps in a single reactor, has been suggested to make the process cost effective. Nevertheless, the process is not yet applicable to large-scale biodiesel production because of difficulties such as the high water content of the biomass, which slows the reaction rate, and the hurdles of cell disruption, which lower the efficiency of oil extraction. Additionally, it requires high heating energy in the solvent extraction and recovery stage. To resolve these difficulties, this review suggests the application of antimicrobial peptides and high electric fields to foster microbial cell wall disruption.
A novel image retrieval algorithm based on PHOG and LSH
NASA Astrophysics Data System (ADS)
Wu, Hongliang; Wu, Weimin; Peng, Jiajin; Zhang, Junyuan
2017-08-01
PHOG can describe the local shape of an image and its spatial relationships. Using the PHOG algorithm to extract image features has achieved good results in image recognition, retrieval, and related tasks. In recent years, the locality-sensitive hashing (LSH) algorithm has proven superior to traditional algorithms on large-scale data when solving near-nearest-neighbor problems. This paper presents a novel image retrieval algorithm based on PHOG and LSH. First, we use PHOG to extract the feature vector of the image; then we use L different LSH hash tables to reduce the dimension of the PHOG descriptor to index values and map it to different buckets; finally, we retrieve the corresponding images in the matching buckets in a second retrieval stage using the Manhattan distance. This algorithm can scale to massive image retrieval, ensuring high retrieval accuracy while reducing the time complexity of retrieval.
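The two-stage scheme (hash features into buckets with L tables, then re-rank bucket candidates by Manhattan distance) can be sketched as follows. This is a simplified illustration: sign-random-projection hashing stands in for whatever hash family the authors used, the features here are generic vectors rather than PHOG descriptors, and all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_hash(dim, n_bits):
    """One LSH table: signs of random projections form an n_bits binary key."""
    planes = rng.standard_normal((n_bits, dim))
    def h(v):
        return tuple((planes @ v > 0).astype(int))
    return h

def build_index(features, n_tables=4, n_bits=8):
    """Hash every feature vector into a bucket in each of n_tables tables."""
    dim = features.shape[1]
    tables = []
    for _ in range(n_tables):
        h = make_hash(dim, n_bits)
        buckets = {}
        for i, v in enumerate(features):
            buckets.setdefault(h(v), []).append(i)
        tables.append((h, buckets))
    return tables

def query(tables, features, q, top_k=3):
    """Collect candidates from all matching buckets, then re-rank by L1 distance."""
    cand = set()
    for h, buckets in tables:
        cand.update(buckets.get(h(q), []))
    ranked = sorted(cand, key=lambda i: np.abs(features[i] - q).sum())
    return ranked[:top_k]
```

Querying with a vector already in the index should return that vector's own id first, since its Manhattan distance to itself is zero.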
VizieR Online Data Catalog: Horizon MareNostrum cosmological run (Gay+, 2010)
NASA Astrophysics Data System (ADS)
Gay, C.; Pichon, C.; Le Borgne, D.; Teyssier, R.; Sousbie, T.; Devriendt, J.
2010-11-01
The correlation between the large-scale distribution of galaxies and their spectroscopic properties at z=1.5 is investigated using the Horizon MareNostrum cosmological run. We have extracted a large sample of 10^5 galaxies from this large hydrodynamical simulation featuring standard galaxy formation physics. Spectral synthesis is applied to these single stellar populations to generate spectra and colours for all galaxies. We use the skeleton as a tracer of the cosmic web and study how our galaxy catalogue depends on the distance to the skeleton. We show that galaxies closer to the skeleton tend to be redder but that the effect is mostly due to the proximity of large haloes at the nodes of the skeleton, rather than the filaments themselves. The virtual catalogues (spectroscopical properties of the MareNostrum galaxies at various redshifts) are available online at http://www.iap.fr/users/pichon/MareNostrum/catalogues. (7 data files).
Obtaining lutein-rich extract from microalgal biomass at preparative scale.
Fernández-Sevilla, José M; Fernández, F Gabriel Acién; Grima, Emilio Molina
2012-01-01
Lutein extracts are in increasing demand due to their alleged role in the prevention of degenerative disorders such as age-related macular degeneration (AMD). Lutein extracts are currently obtained from plant sources, but microalgae have been demonstrated to be a competitive source likely to become an alternative. The extraction of lutein from microalgae poses specific problems that arise from the different structure and composition of the source biomass. Here we present a method for the recovery of lutein-rich carotenoid extracts from microalgal biomass at the kilogram scale.
Scale dependence of deuteron electrodisintegration
NASA Astrophysics Data System (ADS)
More, S. N.; Bogner, S. K.; Furnstahl, R. J.
2017-11-01
Background: Isolating nuclear structure properties from knock-out reactions in a process-independent manner requires a controlled factorization, which is always to some degree scale and scheme dependent. Understanding this dependence is important for robust extractions from experiment, to correctly use the structure information in other processes, and to understand the impact of approximations for both. Purpose: We seek insight into scale dependence by exploring a model calculation of deuteron electrodisintegration, which provides a simple and clean theoretical laboratory. Methods: By considering various kinematic regions of the longitudinal structure function, we can examine how the components—the initial deuteron wave function, the current operator, and the final-state interactions (FSIs)—combine at different scales. We use the similarity renormalization group to evolve each component. Results: When evolved to different resolutions, the ingredients are all modified, but how they combine depends strongly on the kinematic region. In some regions, for example, the FSIs are largely unaffected by evolution, while elsewhere FSIs are greatly reduced. For certain kinematics, the impulse approximation at a high renormalization group resolution gives an intuitive picture in terms of a one-body current breaking up a short-range correlated neutron-proton pair, although FSIs distort this simple picture. With evolution to low resolution, however, the cross section is unchanged but a very different and arguably simpler intuitive picture emerges, with the evolved current efficiently represented at low momentum through derivative expansions or low-rank singular value decompositions. Conclusions: The underlying physics of deuteron electrodisintegration is scale dependent and not just kinematics dependent. As a result, intuition about physics such as the role of short-range correlations or D -state mixing in particular kinematic regimes can be strongly scale dependent. 
Understanding this dependence is crucial in making use of extracted properties.
Nyakas, Adrien; Han, Jun; Peru, Kerry M; Headley, John V; Borchers, Christoph H
2013-05-07
Oil sands processed water (OSPW) is the main byproduct of the large-scale bitumen extraction activity in the Athabasca oil sands region (Alberta, Canada). We have investigated the acid-extractable fraction (AEF) of OSPW by extraction-only (EO) direct infusion (DI) negative-ion mode electrospray ionization (ESI) on a 12 T Fourier transform ion cyclotron resonance mass spectrometer (FTICR-MS), as well as by offline ultrahigh performance liquid chromatography (UHPLC) followed by DI-FTICR-MS. A preliminary offline UHPLC separation into 8 fractions using a reversed-phase C4 column led to approximately twice as many detected peaks and identified compounds (973 peaks versus 2231 peaks, of which 856 and 1734 peaks, respectively, could be assigned to chemical formulas based on accurate mass measurements). Conversion of these masses to the Kendrick mass scale allowed the straightforward recognition of homologues. Naphthenic (CnH2n+zO2) and oxy-naphthenic (CnH2n+zOx) acids represented the largest group of molecules with assigned formulas (64%), followed by sulfur-containing compounds (23%) and nitrogen-containing compounds (8%). Pooling of corresponding fractions from two consecutive offline UHPLC runs prior to MS analysis resulted in ~50% more assignments than a single injection, yielding a 3-fold increase in identifications compared to EO-DI-FTICR-MS using the same volume of starting material. Liquid-liquid extraction followed by offline UHPLC fractionation thus holds enormous potential for a more comprehensive profiling of OSPW, which may provide a deeper understanding of its chemical nature and environmental impact.
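The Kendrick mass conversion mentioned above is a simple rescaling that makes CH2 homologues line up at the same mass defect. A minimal sketch (the function names are ours, and the example mass in the usage note is invented):

```python
def kendrick_mass(iupac_mass):
    """Rescale the IUPAC mass so the CH2 repeat unit (14.01565 Da) has mass exactly 14."""
    return iupac_mass * 14.0 / 14.01565

def kendrick_mass_defect(iupac_mass):
    """Nominal Kendrick mass minus exact Kendrick mass.

    Members of a homologous series differing only by CH2 units share the
    same Kendrick mass defect, which is what makes homologue recognition
    straightforward on this scale.
    """
    km = kendrick_mass(iupac_mass)
    return round(km) - km
```

For instance, a compound of mass 226.1933 Da and its CH2 homologue at 240.20895 Da have Kendrick masses exactly 14 apart, so their mass defects coincide.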
Arunachalam, Kantha D; Annamalai, Sathesh Kumar
2013-01-01
The exploitation of various plant materials for the biosynthesis of nanoparticles is considered a green technology as it does not involve any harmful chemicals. The aim of this study was to develop a simple biological method for the synthesis of silver and gold nanoparticles using Chrysopogon zizanioides. An aqueous leaf extract of C. zizanioides was used to synthesize silver and gold nanoparticles by the bioreduction of silver nitrate (AgNO3) and chloroauric acid (HAuCl4), respectively. Water-soluble organics present in the plant materials were mainly responsible for reducing silver or gold ions to nanosized Ag or Au particles. The synthesized silver and gold nanoparticles were characterized by ultraviolet (UV)-visible spectroscopy, scanning electron microscopy (SEM), energy dispersive X-ray analysis (EDAX), Fourier transform infrared spectroscopy (FTIR), and X-ray diffraction (XRD) analysis. The kinetics of the reduction of aqueous silver/gold ions by the C. zizanioides crude extract were determined by UV-visible spectroscopy. SEM analysis showed that aqueous gold ions, when exposed to the extract, were reduced, resulting in the biosynthesis of gold nanoparticles in the size range 20–50 nm. This eco-friendly approach for the synthesis of nanoparticles is simple and can be scaled up for large-scale production, and the synthesized silver nanoparticles show powerful bioactivity. The synthesized nanoparticles can have clinical use as antibacterial, antioxidant, as well as cytotoxic agents and can be used for biomedical applications. PMID:23861583
Catalina Eddy as revealed by the historical downscaling of reanalysis
NASA Astrophysics Data System (ADS)
Kanamitsu, Masao; Yulaeva, Elena; Li, Haiqin; Hong, Song-You
2013-08-01
Climatological properties and the dynamical and thermodynamical characteristics of the Catalina Eddy are examined from the 61-year NCEP/NCAR Reanalysis downscaled to hourly 10 km resolution. The eddy is identified as a mesoscale cyclonic circulation confined to the Southern California Bight. Pattern correlation of wind direction against the canonical Catalina Eddy is used to extract cases from the downscaled analysis. Validation against published cases and various observations confirmed that the downscaled analysis accurately reproduces Catalina Eddy events. A composite analysis of the initiation phase of the eddy indicates that no apparent large-scale cyclonic/anti-cyclonic forcing is associated with the eddy formation or decay. The source of the vorticity is located at the coast of the Santa Barbara Channel. It is generated by the convergence of the wind system crossing over the San Rafael Mountains and the large-scale northwesterly flow associated with the subtropical high. This vorticity is advected towards the southeast by the northwesterly flow, which contributes to the formation of the streak of positive vorticity. At 6 hours prior to the mature stage, there is an explosive generation of positive vorticity along the coast, coincident with the phase change of the sea breeze circulation (wind turning from onshore to offshore), resulting in convergence all along the California coast. The generation of vorticity due to convergence along the coast, together with the advection of vorticity from the north, results in the formation of southerly flow along the coast, forming the Catalina Eddy. The importance of diurnal variation and the lack of large-scale forcing are new findings, which are in sharp contrast to prior studies. These differences are due to the inclusion of many short-lived eddy events detected in our study which have not been included in other studies.
NASA Astrophysics Data System (ADS)
Bramhe, V. S.; Ghosh, S. K.; Garg, P. K.
2018-04-01
With rapid globalization, the extent of built-up areas is continuously increasing. The extraction of more robust and abstract features for classifying built-up areas has been a leading research topic for many years. Various studies have used spatial information along with spectral features to enhance classification accuracy, but these feature-extraction techniques require a large number of user-specific parameters and are generally application-specific. Recently introduced deep learning (DL) techniques, in contrast, require fewer parameters to represent more abstract aspects of the data without any manual effort. Because it is difficult to acquire high-resolution datasets for applications that require large-scale monitoring of areas, Sentinel-2 imagery has been used in this study for built-up area extraction. In this work, pre-trained Convolutional Neural Networks (ConvNets), i.e. Inception-v3 and VGGNet, are employed for transfer learning. These networks are trained on generic images of the ImageNet dataset, which have very different characteristics from satellite images; therefore, the network weights are fine-tuned using data derived from Sentinel-2 images. To compare the accuracies with existing shallow networks, two state-of-the-art classifiers, a Gaussian Support Vector Machine (SVM) and a Back-Propagation Neural Network (BP-NN), are also implemented. SVM and BP-NN give 84.31 % and 82.86 % overall accuracy, respectively, whereas the fine-tuned VGGNet gives 89.43 % and the fine-tuned Inception-v3 gives 92.10 %. The results indicate the high accuracy of the proposed fine-tuned ConvNets on a 4-channel Sentinel-2 dataset for built-up area extraction.
Cosmic Microwave Background Anisotropy Measurement from Python V
NASA Astrophysics Data System (ADS)
Coble, K.; Dodelson, S.; Dragovan, M.; Ganga, K.; Knox, L.; Kovac, J.; Ratra, B.; Souradeep, T.
2003-02-01
We analyze observations of the microwave sky made with the Python experiment in its fifth year of operation at the Amundsen-Scott South Pole Station in Antarctica. After modeling the noise and constructing a map, we extract the cosmic signal from the data. We simultaneously estimate the angular power spectrum in eight bands ranging from large (l~40) to small (l~260) angular scales, with power detected in the first six bands. There is a significant rise in the power spectrum from large to smaller (l~200) scales, consistent with that expected from acoustic oscillations in the early universe. We compare this Python V map to a map made from data taken in the third year of Python. Python III observations were made at a frequency of 90 GHz and covered a subset of the region of the sky covered by Python V observations, which were made at 40 GHz. Good agreement is obtained both visually (with a filtered version of the map) and via a likelihood ratio test.
An integration of minimum local feature representation methods to recognize large variation of foods
NASA Astrophysics Data System (ADS)
Razali, Mohd Norhisham bin; Manshor, Noridayu; Halin, Alfian Abdul; Mustapha, Norwati; Yaakob, Razali
2017-10-01
Local invariant features have shown to be successful in describing object appearances for image classification tasks. Such features are robust towards occlusion and clutter and are also invariant against scale and orientation changes. This makes them suitable for classification tasks with little inter-class similarity and large intra-class difference. In this paper, we propose an integrated representation of the Speeded-Up Robust Feature (SURF) and Scale Invariant Feature Transform (SIFT) descriptors, using late fusion strategy. The proposed representation is used for food recognition from a dataset of food images with complex appearance variations. The Bag of Features (BOF) approach is employed to enhance the discriminative ability of the local features. Firstly, the individual local features are extracted to construct two kinds of visual vocabularies, representing SURF and SIFT. The visual vocabularies are then concatenated and fed into a Linear Support Vector Machine (SVM) to classify the respective food categories. Experimental results demonstrate impressive overall recognition at 82.38% classification accuracy based on the challenging UEC-Food100 dataset.
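The concatenated-vocabulary (late fusion) representation described above can be sketched in a few lines. This is a toy illustration under stated assumptions: the 2-D "descriptors" stand in for real SURF/SIFT vectors, the vocabularies are tiny fixed codebooks rather than ones learned by clustering, and all names are ours.

```python
import numpy as np

def bof_histogram(descriptors, vocabulary):
    """Quantize each local descriptor to its nearest visual word;
    return the normalized word-frequency histogram (the BoF vector)."""
    d2 = ((descriptors[:, None, :] - vocabulary[None, :, :]) ** 2).sum(-1)
    words = d2.argmin(axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / hist.sum()

def fused_representation(surf_desc, sift_desc, surf_vocab, sift_vocab):
    """Late fusion: concatenate the two BoF histograms into one feature
    vector, which would then be fed to a linear SVM."""
    return np.concatenate([bof_histogram(surf_desc, surf_vocab),
                           bof_histogram(sift_desc, sift_vocab)])
```

The fused vector's length is the sum of the two vocabulary sizes, so the downstream linear SVM sees both descriptor modalities at once.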
NASA Astrophysics Data System (ADS)
Zhou, Ying; Wang, Youhua; Liu, Runfeng; Xiao, Lin; Zhang, Qin; Huang, YongAn
2018-01-01
Epidermal electronics (e-skin) emerging in recent years offer the opportunity to noninvasively and wearably extract biosignals from human bodies. The conventional processes for e-skin, based on standard microelectronic fabrication processes and a variety of transfer printing methods, nevertheless unquestionably constrain the size of the devices, posing a serious challenge to collecting signals via skin, the largest organ of the human body. Herein we propose a multichannel noninvasive human-machine interface (HMI) using stretchable surface electromyography (sEMG) patches to realize a robot hand mimicking human gestures. Time-efficient processes are first developed to manufacture µm-thick, large-scale stretchable devices. With micron thickness, the stretchable sEMG patches show excellent conformability with human skin and consequently electrical performance comparable to conventional gel electrodes. Combined with their large-scale size, the multichannel noninvasive HMI via stretchable µm-thick sEMG patches successfully manipulates the robot hand with eight different gestures, with precision as high as that of a conventional gel electrode array.
Helium ion microscopy of Lepidoptera scales.
Boden, Stuart A; Asadollahbaik, Asa; Rutt, Harvey N; Bagnall, Darren M
2012-01-01
In this report, helium ion microscopy (HIM) is used to study the micro- and nanostructures responsible for structural color in the wings of two species of Lepidoptera from the Papilionidae family: Papilio ulysses (Blue Mountain Butterfly) and Parides sesostris (Emerald-patched Cattleheart). Electronic charging of uncoated scales from the wings of these butterflies, due to the incident ion beam, is successfully neutralized, leading to images displaying a large depth of field and a high level of surface detail, which would normally be obscured by the traditional coating methods used for scanning electron microscopy (SEM). The images are compared with those from variable-pressure SEM, demonstrating the superiority of HIM at high magnifications. In addition, the large depth-of-field capabilities of HIM are exploited through the creation of stereo pairs that allow exploration of the third dimension. Furthermore, the extraction of quantitative height information, which matches well with cross-sectional transmission electron microscopy measurements from the literature, is demonstrated. © Wiley Periodicals, Inc.
On the impact of approximate computation in an analog DeSTIN architecture.
Young, Steven; Lu, Junjie; Holleman, Jeremy; Arel, Itamar
2014-05-01
Deep machine learning (DML) holds the potential to revolutionize machine learning by automating rich feature extraction, which has become the primary bottleneck of human engineering in pattern recognition systems. However, the heavy computational burden renders DML systems implemented on conventional digital processors impractical for large-scale problems. The highly parallel computations required to implement large-scale deep learning systems are well suited to custom hardware. Analog computation has demonstrated power efficiency advantages of multiple orders of magnitude relative to digital systems while performing nonideal computations. In this paper, we investigate typical error sources introduced by analog computational elements and their impact on system-level performance in DeSTIN--a compositional deep learning architecture. These inaccuracies are evaluated on a pattern classification benchmark, clearly demonstrating the robustness of the underlying algorithm to the errors introduced by analog computational elements. A clear understanding of the impacts of nonideal computations is necessary to fully exploit the efficiency of analog circuits.
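The evaluation idea in this abstract, degrade a classifier's arithmetic with the kind of error an analog element introduces and measure the accuracy impact, can be illustrated in a few lines. This is a toy sketch under our own assumptions (a linear classifier on synthetic Gaussian data, multiplicative Gaussian error on each partial product), not the DeSTIN architecture or the paper's actual error model.

```python
import numpy as np

rng = np.random.default_rng(1)

def noisy_dot(a, b, sigma):
    """Inner product with multiplicative Gaussian error on each partial
    product, a toy model of an imprecise analog multiplier array."""
    return float(np.sum(a * b * (1.0 + sigma * rng.normal(size=a.shape))))

# Two well-separated Gaussian classes in 16-D.
n, d = 200, 16
x0 = rng.normal(loc=-1.0, size=(n, d))
x1 = rng.normal(loc=+1.0, size=(n, d))
X = np.vstack([x0, x1])
y = np.array([0] * n + [1] * n)
w = x1.mean(0) - x0.mean(0)              # separating direction
b = -w @ (x0.mean(0) + x1.mean(0)) / 2   # midpoint bias

def accuracy(sigma):
    pred = np.array([noisy_dot(w, x, sigma) + b > 0 for x in X])
    return (pred == y).mean()

for sigma in (0.0, 0.1, 0.5):
    print(f"sigma={sigma:.1f}  accuracy={accuracy(sigma):.3f}")
```

With well-separated classes, even 50% multiplicative error per product barely moves the decision outcome, mirroring the abstract's point that the algorithm can tolerate nonideal computation.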
NASA Astrophysics Data System (ADS)
Piecuch, C. G.; Huybers, P. J.; Hay, C.; Mitrovica, J. X.; Little, C. M.; Ponte, R. M.; Tingley, M.
2017-12-01
Understanding observed spatial variations in centennial relative sea level trends on the United States east coast has important scientific and societal applications. Past studies based on models and proxies variously suggest roles for crustal displacement, ocean dynamics, and melting of the Greenland ice sheet. Here we perform joint Bayesian inference on regional relative sea level, vertical land motion, and absolute sea level fields based on tide gauge records and GPS data. Posterior solutions show that regional vertical land motion explains most (80% median estimate) of the spatial variance in the large-scale relative sea level trend field on the east coast over 1900-2016. The posterior estimate for coastal absolute sea level rise is remarkably spatially uniform compared to previous studies, with a spatial average of 1.4-2.3 mm/yr (95% credible interval). Results corroborate glacial isostatic adjustment models and reveal that meaningful long-period, large-scale vertical velocity signals can be extracted from short GPS records.
Zhang, Bo; Fu, Yingxue; Huang, Chao; Zheng, Chunli; Wu, Ziyin; Zhang, Wenjuan; Yang, Xiaoyan; Gong, Fukai; Li, Yuerong; Chen, Xiaoyu; Gao, Shuo; Chen, Xuetong; Li, Yan; Lu, Aiping; Wang, Yonghua
2016-02-25
The development of modern omics technology has not significantly improved the efficiency of drug development; precise and targeted drug discovery remains an unsolved problem. Here a large-scale cross-species molecular network association (CSMNA) approach for targeted drug screening from natural sources is presented. The algorithm integrates molecular network omics data from humans and 267 plants and microbes, establishing the biological relationships between them and extracting evolutionarily convergent chemicals. This technique allows the researcher to assess targeted drugs for specific human diseases based on specific plant or microbe pathways. In a prospective validation, connections between the plant Halliwell-Asada (HA) cycle and the human Nrf2-ARE pathway were verified, and the manner by which the HA cycle molecules act on the human Nrf2-ARE pathway as antioxidants was determined. This shows the potential applicability of this approach in drug discovery. The current method integrates disparate evolutionary species into chemico-biologically coherent circuits, suggesting a new cross-species omics analysis strategy for rational drug development.
MAPPING GROWTH AND GRAVITY WITH ROBUST REDSHIFT SPACE DISTORTIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwan, Juliana; Lewis, Geraint F.; Linder, Eric V.
2012-04-01
Redshift space distortions (RSDs) caused by galaxy peculiar velocities provide a window onto the growth rate of large-scale structure and a method for testing general relativity. We investigate, through a comparison of N-body simulations to various extensions of perturbation theory beyond the linear regime, the robustness of cosmological parameter extraction, including the gravitational growth index γ. We find that the Kaiser formula and some perturbation theory approaches bias the growth rate by 1σ or more relative to the fiducial at scales as large as k > 0.07 h Mpc⁻¹. This bias propagates to estimates of the gravitational growth index as well as Ω_m and the equation-of-state parameter and presents a significant challenge to modeling RSDs. We also determine an accurate fitting function for a combination of line-of-sight damping and higher-order angular dependence that allows robust modeling of the redshift space power spectrum to substantially higher k.
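For reference, the linear Kaiser formula discussed above relates the redshift-space galaxy power spectrum to the real-space matter power spectrum through the growth rate, which in the growth-index parameterization is tied to γ (γ ≈ 0.55 in general relativity). The standard statement of both relations is:

```latex
P_s(k,\mu) = \bigl(b + f\mu^2\bigr)^2 P_m(k),
\qquad
f \equiv \frac{\mathrm{d}\ln D}{\mathrm{d}\ln a} \simeq \Omega_m(a)^{\gamma},
```

where μ is the cosine of the angle between the wavevector and the line of sight, b is the linear galaxy bias, and D is the linear growth factor. The abstract's finding is that this linear form biases f at surprisingly large scales.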
NASA Astrophysics Data System (ADS)
Dai, LongGui; Yang, Fan; Yue, Gen; Jiang, Yang; Jia, Haiqiang; Wang, Wenxin; Chen, Hong
2014-11-01
Generally, nano-scale patterned sapphire substrates (NPSS) perform better than micro-scale patterned sapphire substrates (MPSS) in improving the light extraction efficiency of LEDs. Laser interference lithography (LIL) is a powerful fabrication method for periodic nanostructures that requires no photomasks for different designs. However, the Lloyd's mirror LIL system has the disadvantage that fabricated patterns are inevitably distorted, especially for large-area two-dimensional (2D) periodic nanostructures. Herein, we introduce a two-beam LIL system to fabricate consistent large-area NPSS. Quantitative analysis and characterization indicate that high uniformity of the photoresist arrays is achieved. Through a combination of dry and wet etching techniques, well-defined NPSS with a period of 460 nm were prepared on the whole sapphire substrate. The deviation is 4.34% for the bottom width of the triangular truncated pyramid arrays on the whole 2-inch sapphire substrate, which is suitable for application in the industrial production of NPSS.
Feature hashing for fast image retrieval
NASA Astrophysics Data System (ADS)
Yan, Lingyu; Fu, Jiarun; Zhang, Hongxin; Yuan, Lu; Xu, Hui
2018-03-01
Currently, research on content-based image retrieval mainly focuses on robust feature extraction. However, due to the exponential growth of online images, it is necessary to consider searching among large-scale image collections, which is very time-consuming and unscalable; hence, we need to pay close attention to the efficiency of image retrieval. In this paper, we propose a feature hashing method for image retrieval which not only generates compact fingerprints for image representation, but also prevents large semantic loss during the process of hashing. To generate the fingerprint, an objective function of semantic loss is constructed and minimized, which combines the influence of both the neighborhood structure of the feature data and the mapping error. Since the machine-learning-based hashing effectively preserves the neighborhood structure of the data, it yields visual words with strong discriminability. Furthermore, the generated binary codes give image representations of low complexity, making the method efficient and scalable to large-scale databases. Experimental results show the good performance of our approach.
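As a minimal illustration of binary fingerprinting for retrieval, the sketch below uses sign-of-random-projection (LSH-style) hashing and Hamming-distance ranking. This baseline is our own stand-in and is deliberately simpler than the paper's learned hashing, which minimizes a semantic-loss objective; it only shows the retrieval mechanics once compact codes exist.

```python
import numpy as np

rng = np.random.default_rng(2)

def hash_codes(features, projections):
    """Binary fingerprints via the sign of random projections (an
    LSH-style baseline; the paper instead learns the hash function)."""
    return (features @ projections.T > 0).astype(np.uint8)

def hamming(a, b):
    return int(np.count_nonzero(a != b))

d, bits = 128, 64
P = rng.normal(size=(bits, d))               # random projection matrix

db = rng.normal(size=(1000, d))              # stand-in image feature vectors
codes = hash_codes(db, P)                    # 1000 compact 64-bit codes

query = db[42] + 0.05 * rng.normal(size=d)   # near-duplicate of item 42
qcode = hash_codes(query[None, :], P)[0]

# Retrieval = rank the database by Hamming distance to the query code.
best = min(range(len(db)), key=lambda i: hamming(codes[i], qcode))
print(best)  # the near-duplicate, item 42, should come back first
```

Comparing 64-bit codes is far cheaper than comparing 128-D real vectors, which is the scalability argument the abstract makes; the learned hash additionally tries to keep semantically similar images close in Hamming space.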
Nuclear modification factor in an anisotropic quark-gluon plasma
NASA Astrophysics Data System (ADS)
Mandal, Mahatsab; Bhattacharya, Lusaka; Roy, Pradip
2011-10-01
We calculate the nuclear modification factor (RAA) of light hadrons by taking into account the initial state momentum anisotropy of the quark-gluon plasma (QGP) expected to be formed in relativistic heavy ion collisions. Such an anisotropy can result from the initial rapid longitudinal expansion of the matter. A phenomenological model for the space-time evolution of the anisotropic QGP is used to obtain the time dependence of the anisotropy parameter ξ and the hard momentum scale, phard. The result is then compared with the PHENIX experimental data to constrain the isotropization time scale, τiso for fixed initial conditions (FIC). It is shown that the extracted value of τiso lies in the range 0.5⩽τiso⩽1.5. However, using a fixed final multiplicity (FFM) condition does not lead to any firm conclusion about the extraction of the isotropization time. The present calculation is also extended to contrast with the recent measurement of the nuclear modification factor by the ALICE collaboration at √s_NN = 2.76 TeV. It is argued that in the present approach, the extraction of τiso at this energy is uncertain and, therefore, refinement of the model is necessary. The sensitivity of the results on the initial conditions has been discussed. We also present the nuclear modification factor at Large Hadron Collider (LHC) energies with √s_NN = 5.5 TeV.
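For reference, the nuclear modification factor used above is the standard ratio of the particle yield in nucleus-nucleus collisions to the binary-collision-scaled yield in proton-proton collisions:

```latex
R_{AA}(p_T) =
\frac{\mathrm{d}N^{AA}/\mathrm{d}p_T}
     {\langle N_{\mathrm{coll}}\rangle \, \mathrm{d}N^{pp}/\mathrm{d}p_T},
```

where ⟨N_coll⟩ is the average number of binary nucleon-nucleon collisions. R_AA < 1 signals suppression of high-p_T hadrons, commonly attributed to parton energy loss in the QGP, which is why its shape constrains medium properties such as the isotropization time here.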
Hiron, Matthew; Jonsell, Mats; Kubart, Ariana; Thor, Göran; Schroeder, Martin; Dahlberg, Anders; Johansson, Victor; Ranius, Thomas
2017-08-01
Stumps and slash resulting from forest clearcutting are used as a source of low-net-carbon energy, but there are concerns about the consequences of biofuel extraction for biodiversity. Logging residues constitute potentially important habitats, since a large part of forest biodiversity depends on dead wood. Here we used snapshot field data from a managed forest landscape (25 000 ha) to predict landscape-scale population changes of dead-wood-dependent organisms after extraction of stumps and slash following clearcutting. We did this by estimating habitat availability for all observed dead-wood-dependent beetles, macrofungi, and lichens (380 species) in the whole landscape. We found that 53% of species occurred in slash or stumps. For most species, population declines after moderate extraction (≤30%) were small (<10% decline) because they mainly occur on other dead wood types. However, some species were recorded only in slash and stumps. Red-listed species were affected by slash and stump extraction (12 species), but less often than other species. Beetles and fungi were more affected by stump extraction, while lichens were more affected by slash extraction. For beetles and lichens, extraction of a combination of spruce, pine and birch resulted in larger negative effects than extracting only spruce, while for fungi the tree species had little effect. We conclude that extensive extraction decreases the amount of habitat to such an extent that it may have negative consequences for species persistence at the landscape level. The negative consequences can be limited by extracting only slash, or only logging residues from spruce stands. Copyright © 2017 Elsevier Ltd. All rights reserved.
Extraction of Extended Small-Scale Objects in Digital Images
NASA Astrophysics Data System (ADS)
Volkov, V. Y.
2015-05-01
The problem of detecting and localizing extended small-scale objects of different shapes arises in observation systems that use SAR, infrared, lidar and television cameras. An intense non-stationary background is the main difficulty for processing. Another challenge is the low quality of the images: blobs and blurred boundaries; in addition, SAR images suffer from serious intrinsic speckle noise. The statistics of the background are not normal, with evident skewness and heavy tails in the probability density, so the background is hard to identify. The problem of extracting small-scale objects is solved here on the basis of directional filtering, adaptive thresholding and morphological analysis. A new kind of mask is used, open-ended at one side, which makes it possible to extract the ends of line segments of unknown length. An advanced method of dynamic adaptive threshold setting is investigated, based on the extraction of isolated fragments after thresholding. A hierarchy of isolated fragments in the binary image is proposed for the analysis of segmentation results. It includes small-scale objects of different shapes, sizes and orientations. The method extracts isolated fragments in the binary image and counts the points in these fragments. The number of points in the extracted fragments, normalized to the total number of points for a given threshold, is used as the effectiveness of extraction for these fragments. The new method for adaptive threshold setting and control maximizes this effectiveness of extraction. It has optimality properties for object extraction in a normal noise field and shows effective results for real SAR images.
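The threshold-selection idea, extract the isolated fragments at each candidate threshold and keep the threshold that maximizes the fraction of points falling into small fragments, can be sketched as below. This is only our reading of the criterion, with assumed details (8-connectivity, a maximum fragment size of 40 points, a synthetic line-segment scene); the paper's directional filtering and open-ended mask stages are omitted.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(3)

def fragments(mask):
    """8-connected components of a boolean image (plain BFS labelling)."""
    h, w = mask.shape
    seen = np.zeros(mask.shape, dtype=bool)
    comps = []
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                comp, queue = [], deque([(sy, sx)])
                seen[sy, sx] = True
                while queue:
                    y, x = queue.popleft()
                    comp.append((y, x))
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and mask[ny, nx] and not seen[ny, nx]):
                                seen[ny, nx] = True
                                queue.append((ny, nx))
                comps.append(comp)
    return comps

def effectiveness(img, t, max_size=40):
    """Fraction of above-threshold points lying in small isolated
    fragments (our assumed reading of the extraction criterion)."""
    mask = img > t
    total = int(mask.sum())
    if total == 0:
        return 0.0
    small = sum(len(c) for c in fragments(mask) if len(c) <= max_size)
    return small / total

# Synthetic scene: Gaussian background plus one short bright line segment.
img = rng.normal(0.0, 1.0, size=(60, 60))
img[30, 20:35] += 6.0                        # the small-scale object

# Sweep candidate thresholds; keep the one maximizing effectiveness.
best_t = max(np.linspace(1.0, 5.0, 17), key=lambda t: effectiveness(img, t))
obj = fragments(img > best_t)
print(len(max(obj, key=len)))  # size of the largest extracted fragment
```

At low thresholds large background blobs drag the effectiveness down; at the selected threshold the remaining fragments are small and isolated, and the line segment survives as one of them.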
Superstatistical fluctuations in time series: Applications to share-price dynamics and turbulence
NASA Astrophysics Data System (ADS)
van der Straeten, Erik; Beck, Christian
2009-09-01
We report a general technique to study a given experimental time series with superstatistics. Crucial for the applicability of the superstatistics concept is the existence of a parameter β that fluctuates on a large time scale as compared to the other time scales of the complex system under consideration. The proposed method extracts the main superstatistical parameters out of a given data set and examines the validity of the superstatistical model assumptions. We test the method thoroughly with surrogate data sets. Then the applicability of the superstatistical approach is illustrated using real experimental data. We study two examples, velocity time series measured in turbulent Taylor-Couette flows and time series of log returns of the closing prices of some stock market indices.
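The core extraction step, slicing the series into windows that are long compared with the local relaxation time but short compared with the β fluctuations, and estimating β as the inverse variance in each window, can be sketched as follows. The synthetic data, window length, and gamma-distributed β are our own illustrative assumptions, not the paper's datasets.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic superstatistical series: Gaussian noise whose inverse variance
# beta fluctuates on a time scale T much longer than the local dynamics.
T = 500                                                # samples per beta value
betas_true = rng.gamma(shape=4.0, scale=0.5, size=40)  # slowly varying beta(t)
x = np.concatenate([rng.normal(0.0, 1.0 / np.sqrt(b), size=T)
                    for b in betas_true])

def extract_betas(series, window):
    """Estimate beta = 1/variance in non-overlapping windows, the basic
    superstatistical extraction step."""
    n = len(series) // window
    chunks = series[:n * window].reshape(n, window)
    return 1.0 / chunks.var(axis=1)

betas_est = extract_betas(x, T)
# The window estimates should closely track the slowly varying parameter.
print(float(np.corrcoef(betas_true, betas_est)[0, 1]))
```

In a real analysis the window length is not known in advance; a key part of the method is choosing it from the data so that each window is locally Gaussian while β still varies between windows.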
Properties of gelatin film from horse mackerel (Trachurus japonicus) scale.
Le, Thuy; Maki, Hiroki; Takahashi, Kigen; Okazaki, Emiko; Osako, Kazufumi
2015-04-01
Optimal conditions for extracting gelatin and preparing gelatin film from horse mackerel scale, such as the extraction temperature and time and the protein concentration of the film-forming solutions, were investigated. Yields of gelatin extracted at 70 °C, 80 °C, and 90 °C for 15 min to 3 h were 1.08% to 3.45%, depending on the extraction conditions. Among the various extraction times and temperatures, the film from gelatin extracted at 70 °C for 1 h showed the highest tensile strength and elongation at break. Horse mackerel scale gelatin film showed markedly lower water vapor permeability (WVP) than mammalian or fish gelatin films, possibly because it contains a slightly higher level of hydrophobic amino acids (653 residues per 1000 residues in total) than mammalian, cold-water fish and warm-water fish gelatins. Gelatin films from different preparation conditions showed excellent UV barrier properties at a wavelength of 200 nm, although the films were transparent at visible wavelengths. Consequently, it can be suggested that gelatin film from horse mackerel scale extracted at 70 °C for 1 h can be applied as a food packaging material owing to its low WVP and excellent UV barrier properties. © 2015 Institute of Food Technologists®
Development progresses of radio frequency ion source for neutral beam injector in fusion devices.
Chang, D H; Jeong, S H; Kim, T S; Park, M; Lee, K W; In, S R
2014-02-01
A large-area RF (radio frequency)-driven ion source is being developed in Germany for the heating and current drive of ITER. Negative hydrogen ion sources are major components of the neutral beam injection systems in future large-scale fusion experiments such as ITER and DEMO. RF ion sources for the production of positive hydrogen (deuterium) ions have been successfully developed for the neutral beam heating systems at IPP (Max Planck Institute for Plasma Physics) in Germany. The first long-pulse ion source was developed successfully with a magnetic bucket plasma generator, including a filament heating structure, for the first NBI system of the KSTAR tokamak. There is a development plan for an RF ion source at KAERI to extract positive ions, which can be applied to the KSTAR NBI system, and to extract negative ions for future fusion devices such as the Fusion Neutron Source and Korea-DEMO. The characteristics of RF-driven plasmas and the uniformity of the plasma parameters in the test RF ion source were investigated initially using an electrostatic probe.
Fake currency detection using image processing
NASA Astrophysics Data System (ADS)
Agasti, Tushar; Burand, Gajanan; Wade, Pratik; Chitra, P.
2017-11-01
The advancement of color printing technology has increased the rate at which fake currency notes are printed and duplicated on a very large scale. A few years ago, such printing could be done only in a print house, but now anyone can print a currency note with high accuracy using a simple laser printer. As a result, the circulation of fake notes in place of genuine ones has increased greatly. India unfortunately suffers from problems such as corruption and black money, and counterfeit currency notes are a further burden. This motivates the design of a system that detects fake currency notes quickly and efficiently. The proposed system gives an approach to verifying Indian currency notes. Verification of a currency note is done using the concepts of image processing. This article describes the extraction of various features of Indian currency notes. MATLAB software is used to extract the features of the note. The proposed system has the advantages of simplicity and high processing speed. The result predicts whether the currency note is fake or not.
Kataoka, Toshiyuki; Hoshi, Keika; Ando, Tomohiro
2016-01-01
Objective Unexpected post-extraction bleeding is often experienced in clinical practice. Therefore, determining the risk of post-extraction bleeding in patients receiving anticoagulant therapy prior to surgery is beneficial. This study aimed to verify whether the HAS-BLED score was useful in predicting post-extraction bleeding in patients taking warfarin. Design Retrospective cohort study. Setting Department of Oral and Maxillofacial Surgery, Tokyo Women's Medical University. Participants Participants included 258 sequential cases (462 teeth) who had undergone tooth extraction between 1 January 2010 and 31 December 2012 while continuing warfarin therapy. Main outcome measure Post-extraction risk factors for bleeding. The following data were collected as the predicting variables for multivariate logistic analysis: the HAS-BLED score, extraction site, tooth type, stability of teeth, extraction procedure, prothrombin time-international normalised ratio value, platelet count and the use of concomitant antiplatelet agents. Results Post-extraction bleeding was noted in 21 (8.1%) of the 258 cases. Haemostasis was achieved with localised haemostatic procedures in all the cases of post-extraction bleeding. The HAS-BLED score was found to be insufficient in predicting post-extraction bleeding (area under the curve=0.548, p=0.867, multivariate analysis). The risk of post-extraction bleeding was approximately three times greater in patients taking concomitant oral antiplatelet agents (risk ratio=2.881, p=0.035, multivariate analysis). Conclusions The HAS-BLED score alone could not predict post-extraction bleeding. The concomitant use of oral antiplatelet agents was a risk factor for post-extraction bleeding. No episodes of post-extraction bleeding required more than local measures for haemostasis. 
However, because this was a retrospective study conducted at a single institution, large-scale prospective cohort studies, which include cases of outpatient tooth extraction, will be necessary in the future. PMID:26936909
Schott, Benjamin; Traub, Manuel; Schlagenhauf, Cornelia; Takamiya, Masanari; Antritter, Thomas; Bartschat, Andreas; Löffler, Katharina; Blessing, Denis; Otte, Jens C; Kobitski, Andrei Y; Nienhaus, G Ulrich; Strähle, Uwe; Mikut, Ralf; Stegmaier, Johannes
2018-04-01
State-of-the-art light-sheet and confocal microscopes allow recording of entire embryos in 3D and over time (3D+t) for many hours. Fluorescently labeled structures can be segmented and tracked automatically in these terabyte-scale 3D+t images, resulting in thousands of cell migration trajectories that provide detailed insights into large-scale tissue reorganization at the cellular level. Here we present EmbryoMiner, a new interactive open-source framework suitable for in-depth analyses and comparisons of entire embryos, including an extensive set of trajectory features. Starting at the whole-embryo level, the framework can be used to iteratively focus on a region of interest within the embryo, to investigate and test specific trajectory-based hypotheses, and to extract quantitative features from the isolated trajectories. Thus, the new framework provides a valuable new way to quantitatively compare corresponding anatomical regions in different embryos that were manually selected based on biological prior knowledge. As a proof of concept, we analyzed 3D+t light-sheet microscopy images of zebrafish embryos, showcasing potential user applications that can be performed using the new framework.
Urban area thermal monitoring: Liepaja case study using satellite and aerial thermal data
NASA Astrophysics Data System (ADS)
Gulbe, Linda; Caune, Vairis; Korats, Gundars
2017-12-01
The aim of this study is to explore large-scale (60 m/pixel) and small-scale (individual building level) temperature distribution patterns from thermal remote sensing data and to conclude what kind of information can be extracted from thermal remote sensing on a regular basis. The Landsat program provides frequent large-scale thermal images useful for the analysis of city temperature patterns. In this study, the correlations of temperature patterns with vegetation content based on NDVI and with building coverage based on OpenStreetMap data were examined. Landsat-based temperature patterns were independent of the season, negatively correlated with vegetation content and positively correlated with building coverage. The small-scale analysis included spatial and raster descriptor analysis for polygons corresponding to the roofs of individual buildings, for evaluating roof insulation. Remote sensing and spatial descriptors are only weakly related to heat consumption data; however, the median and entropy of thermal aerial data can help to identify poorly insulated roofs. Automated quantitative roof analysis has high potential for acquiring city-wide information about roof insulation, but its quality is limited by reference data quality, and information on building types and roof materials would be crucial for further studies.
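The vegetation index used above is computed per pixel from the red and near-infrared bands. A minimal sketch on synthetic data (the band values, masks and temperatures are invented for illustration, not Liepaja data) also reproduces the reported negative temperature-vegetation correlation:

```python
import numpy as np

rng = np.random.default_rng(5)

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from near-infrared and
    red reflectance; vegetation reflects strongly in NIR."""
    return (nir - red) / (nir + red + 1e-12)

# Toy 100x100 scene: vegetated pixels reflect strongly in NIR,
# built-up pixels do not; temperature rises with building coverage.
veg = rng.uniform(size=(100, 100)) > 0.5            # vegetation mask
nir = np.where(veg, 0.60, 0.25) + rng.uniform(0, 0.05, (100, 100))
red = np.where(veg, 0.10, 0.20) + rng.uniform(0, 0.05, (100, 100))
temp = np.where(veg, 18.0, 24.0) + rng.normal(0, 1.0, (100, 100))

v = ndvi(nir, red).ravel()
r = np.corrcoef(v, temp.ravel())[0, 1]
print(r < -0.5)  # temperature correlates negatively with NDVI
```

The small epsilon in the denominator guards against division by zero over dark pixels; real workflows would also mask clouds and water before computing the correlation.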
A multi-scale network method for two-phase flow in porous media
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khayrat, Karim, E-mail: khayratk@ifd.mavt.ethz.ch; Jenny, Patrick
Pore-network models of porous media are useful in the study of pore-scale flow in porous media. In order to extract macroscopic properties from flow simulations in pore networks, it is crucial that the networks be large enough to be considered representative elementary volumes. However, existing two-phase network flow solvers are limited to relatively small domains. For this purpose, a multi-scale pore-network (MSPN) method, which takes into account flow-rate effects and can simulate larger domains compared to existing methods, was developed. In our solution algorithm, a large pore network is partitioned into several smaller sub-networks. The algorithm to advance the fluid interfaces within each subnetwork consists of three steps. First, a global pressure problem on the network is solved approximately using the multiscale finite volume (MSFV) method. Next, the fluxes across the subnetworks are computed. Lastly, using the fluxes as boundary conditions, a dynamic two-phase flow solver is used to advance the solution in time. Simulation results of drainage scenarios at different capillary numbers and unfavourable viscosity ratios are presented and used to validate the MSPN method against solutions obtained by an existing dynamic network flow solver.
EvoluCode: Evolutionary Barcodes as a Unifying Framework for Multilevel Evolutionary Data.
Linard, Benjamin; Nguyen, Ngoc Hoan; Prosdocimi, Francisco; Poch, Olivier; Thompson, Julie D
2012-01-01
Evolutionary systems biology aims to uncover the general trends and principles governing the evolution of biological networks. An essential part of this process is the reconstruction and analysis of the evolutionary histories of these complex, dynamic networks. Unfortunately, the methodologies for representing and exploiting such complex evolutionary histories in large scale studies are currently limited. Here, we propose a new formalism, called EvoluCode (Evolutionary barCode), which allows the integration of different evolutionary parameters (e.g., sequence conservation, orthology, synteny …) in a unifying format and facilitates the multilevel analysis and visualization of complex evolutionary histories at the genome scale. The advantages of the approach are demonstrated by constructing barcodes representing the evolution of the complete human proteome. Two large-scale studies are then described: (i) the mapping and visualization of the barcodes on the human chromosomes and (ii) automatic clustering of the barcodes to highlight protein subsets sharing similar evolutionary histories and their functional analysis. The methodologies developed here open the way to the efficient application of other data mining and knowledge extraction techniques in evolutionary systems biology studies. A database containing all EvoluCode data is available at: http://lbgi.igbmc.fr/barcodes.
NASA Astrophysics Data System (ADS)
Wei, Hongqiang; Zhou, Guiyun; Zhou, Junjie
2018-04-01
The classification of leaf and wood points is an essential preprocessing step for extracting inventory measurements and canopy characterizations of trees from terrestrial laser scanning (TLS) data. The geometry-based approach is one of the most widely used classification methods. In geometry-based methods, it is common practice to extract salient features at one single scale before the features are used for classification. It remains unclear how the scale(s) used affect the classification accuracy and efficiency. To assess the scale effect on classification accuracy and efficiency, we extracted single-scale and multi-scale salient features from the point clouds of two oak trees of different sizes and classified the points as leaf or wood. Our experimental results show that the balanced accuracy of the multi-scale method is higher than the average balanced accuracy of the single-scale method by about 10% for both trees. The average speed-up ratio of the single-scale classifiers over the multi-scale classifier is higher than 30 for each tree.
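A common form of such geometry-based salient features is the set of eigenvalue ratios (linearity, planarity, scattering) of the local covariance computed in a spherical neighborhood, extended to several radii for the multi-scale case. The specific feature definitions and radii below are illustrative assumptions, not necessarily those used by the authors:

```python
import numpy as np

rng = np.random.default_rng(6)

def salient_features(points, center, radius):
    """Eigenvalue-based features (linearity, planarity, scattering) of
    the neighborhood of `center` at a single scale `radius`."""
    nb = points[np.linalg.norm(points - center, axis=1) < radius]
    if len(nb) < 3:
        return np.zeros(3)
    # Covariance eigenvalues sorted descending: l1 >= l2 >= l3.
    l1, l2, l3 = sorted(np.linalg.eigvalsh(np.cov(nb.T)), reverse=True)
    return np.array([(l1 - l2) / l1, (l2 - l3) / l1, l3 / l1])

def multi_scale_features(points, center, radii):
    """Concatenation of the single-scale features over several radii."""
    return np.concatenate([salient_features(points, center, r) for r in radii])

# A thin branch-like structure: points along a line with slight jitter.
t = rng.uniform(-1.0, 1.0, size=500)
branch = np.c_[t, 0.01 * rng.normal(size=500), 0.01 * rng.normal(size=500)]

feats = multi_scale_features(branch, np.zeros(3), radii=(0.25, 0.5, 1.0))
print(feats.shape)     # (9,): three features at each of three scales
print(feats[0] > 0.9)  # a branch is strongly "linear" at the finest scale
```

Wood points (branches, trunk) score high on linearity or planarity, while leaf clusters score high on scattering; the multi-scale variant simply gives the classifier all radii at once instead of committing to one.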
Adaptive Texture Synthesis for Large Scale City Modeling
NASA Astrophysics Data System (ADS)
Despine, G.; Colleu, T.
2015-02-01
Large-scale city models textured with aerial images are well suited for bird's-eye navigation, but the image resolution generally does not allow pedestrian navigation. One solution to this problem is to use high-resolution terrestrial photos, but this requires a huge amount of manual work to remove occlusions. Another solution is to synthesize generic textures from a set of procedural rules and elementary patterns such as bricks, roof tiles, doors and windows. This solution may give realistic textures but with no correlation to the ground truth. Instead of using purely procedural modelling, we present a method to extract information from aerial images and adapt the texture synthesis to each building. We describe a workflow allowing the user to drive the information extraction and to select the appropriate texture patterns. We also emphasize the importance of organizing the knowledge about elementary patterns in a texture catalogue that allows attaching physical information and semantic attributes and executing selection requests. Roofs are processed according to the detected building material. Façades are first described in terms of principal colours, then opening positions are detected and some window features are computed. These features allow selecting the most appropriate patterns from the texture catalogue. We experimented with this workflow on two samples with 20 cm and 5 cm resolution images. The roof texture synthesis and opening detection were successfully conducted on hundreds of buildings. The window characterization is still sensitive to the distortions inherent in the projection of aerial images onto the façades.
MAGNETOHYDRODYNAMIC SIMULATION-DRIVEN KINEMATIC MEAN FIELD MODEL OF THE SOLAR CYCLE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simard, Corinne; Charbonneau, Paul; Bouchat, Amelie, E-mail: corinne@astro.umontreal.ca, E-mail: paulchar@astro.umontreal.ca, E-mail: amelie.bouchat@mail.mcgill.ca
We construct a series of kinematic axisymmetric mean-field dynamo models operating in the αΩ, α²Ω and α² regimes, all using the full α-tensor extracted from a global magnetohydrodynamical simulation of solar convection producing large-scale magnetic fields undergoing solar-like cyclic polarity reversals. We also include an internal differential rotation profile produced in a purely hydrodynamical parent simulation of solar convection, and a simple meridional flow profile described by a single cell per meridional quadrant. An α²Ω mean-field model, presumably closest to the mode of dynamo action characterizing the MHD simulation, produces a spatiotemporal evolution of magnetic fields that share some striking similarities with the zonally-averaged toroidal component extracted from the simulation. Comparison with α² and αΩ mean-field models operating in the same parameter regimes indicates that much of the complexity observed in the spatiotemporal evolution of the large-scale magnetic field in the simulation can be traced to the turbulent electromotive force. Oscillating α² solutions are readily produced, and show some similarities with the observed solar cycle, including a deep-seated toroidal component concentrated at low latitudes and migrating equatorward in the course of the solar cycle. Various numerical experiments performed using the mean-field models reveal that turbulent pumping plays an important role in setting the global characteristics of the magnetic cycles.
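For orientation, kinematic mean-field models of this type solve the axisymmetric mean induction equation, in which the turbulent electromotive force is closed through the α-tensor (here extracted from the MHD simulation). Schematically, keeping only the α-term of the closure:

```latex
\frac{\partial \langle \mathbf{B} \rangle}{\partial t}
  = \nabla \times \Bigl( \langle \mathbf{U} \rangle \times \langle \mathbf{B} \rangle
      + \boldsymbol{\mathcal{E}} - \eta \, \nabla \times \langle \mathbf{B} \rangle \Bigr),
\qquad
\mathcal{E}_i = \alpha_{ij} \, \langle B \rangle_j ,
```

where angle brackets denote the mean (zonally averaged) fields and η the magnetic diffusivity. The αΩ, α²Ω and α² regimes named in the abstract differ in whether the toroidal field is regenerated mainly by differential rotation (the Ω effect), by the α effect, or by both.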
Event-driven processing for hardware-efficient neural spike sorting
NASA Astrophysics Data System (ADS)
Liu, Yan; Pereira, João L.; Constandinou, Timothy G.
2018-02-01
Objective. The prospect of real-time, on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide an efficient new means for hardware implementation that is completely activity-dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time-domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented on a low-power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or fewer to represent the signals, whilst maintaining signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting can be achieved with comparable or better accuracy than reference methods, whilst also requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
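The level-crossing idea can be illustrated on a uniformly sampled stand-in signal: emit a ±1 event whenever the signal crosses the next quantization level, and reconstruct a piecewise-constant signal from the event stream. This sketch is our own simplification (a discrete-time emulation with a sine wave standing in for a spike waveform), not the paper's hardware implementation.

```python
import numpy as np

def level_crossings(signal, delta):
    """Encode a sampled signal as level-crossing events (index, +1/-1),
    one event per level of size `delta` crossed up or down."""
    events = []
    level = np.round(signal[0] / delta) * delta
    for i, s in enumerate(signal):
        while s >= level + delta:
            level += delta
            events.append((i, +1))
        while s <= level - delta:
            level -= delta
            events.append((i, -1))
    return events

def reconstruct(events, n, start_level, delta):
    """Piecewise-constant reconstruction from the event stream."""
    out = np.full(n, start_level)
    level = start_level
    for i, d in events:
        level += d * delta
        out[i:] = level
    return out

t = np.linspace(0.0, 1.0, 200)
x = np.sin(2 * np.pi * 3 * t)            # stand-in for a neural waveform
delta = 0.1
ev = level_crossings(x, delta)
xr = reconstruct(ev, len(x), np.round(x[0] / delta) * delta, delta)
print(len(ev), float(np.abs(x - xr).max()) < 2 * delta)
```

By construction the reconstruction stays within one level size Δ of the input, while the event count scales with the signal's total variation rather than with elapsed time, which is exactly the activity-dependent property the abstract exploits: a quiescent channel generates no events and hence no work.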
The lignol approach to biorefining of woody biomass to produce ethanol and chemicals.
Arato, Claudio; Pye, E Kendall; Gjennestad, Gordon
2005-01-01
Processes that produce only ethanol from lignocellulosics display poor economics. This is generally overcome by constructing large facilities having satisfactory economies of scale, thus making financing onerous and hindering the development of suitable technologies. Lignol Innovations has developed a biorefining technology that employs an ethanol-based organosolv step to separate lignin, hemicellulose components, and extractives from the cellulosic fraction of woody biomass. The resultant cellulosic fraction is highly susceptible to enzymatic hydrolysis, generating very high yields of glucose (>90% in 12-24 h) with typical enzyme loadings of 10-20 FPU (filter paper units)/g. This glucose is readily converted to ethanol, or possibly other sugar platform chemicals, either by sequential or simultaneous saccharification and fermentation. The liquor from the organosolv step is processed by well-established unit operations to recover lignin, furfural, xylose, acetic acid, and a lipophilic extractives fraction. The process ethanol is recovered and recycled. The resulting recycled process water is of very high quality, with low BOD5, and suitable for overall system process closure. Significant benefits can be attained in greenhouse gas (GHG) emission reductions, as per the Kyoto Protocol. Revenues from the multiple products, particularly the lignin, ethanol and xylose fractions, ensure excellent economics for the process even in plants as small as 100 mtpd (metric tonnes per day) dry woody biomass input, a scale suitable for processing wood residues produced by a single large sawmill.
Shaw, Emily E; Schultz, Aaron P; Sperling, Reisa A; Hedden, Trey
2015-10-01
Intrinsic functional connectivity MRI has become a widely used tool for measuring integrity in large-scale cortical networks. This study examined multiple cortical networks using Template-Based Rotation (TBR), a method that applies a priori network and nuisance component templates defined from an independent dataset to test datasets of interest. A priori templates were applied to a test dataset of 276 older adults (ages 65-90) from the Harvard Aging Brain Study to examine the relationship between multiple large-scale cortical networks and cognition. Factor scores derived from neuropsychological tests represented processing speed, executive function, and episodic memory. Resting-state BOLD data were acquired in two 6-min acquisitions on a 3-Tesla scanner and processed with TBR to extract individual-level metrics of network connectivity in multiple cortical networks. All results controlled for data quality metrics, including motion. Connectivity in multiple large-scale cortical networks was positively related to all cognitive domains, with a composite measure of general connectivity positively associated with general cognitive performance. Controlling for the correlations between networks, the frontoparietal control network (FPCN) and executive function demonstrated the only significant association, suggesting specificity in this relationship. Further analyses found that the FPCN mediated the relationships of the other networks with cognition, suggesting that this network may play a central role in understanding individual variation in cognition during aging.
Asymptotic theory of time varying networks with burstiness and heterogeneous activation patterns
NASA Astrophysics Data System (ADS)
Burioni, Raffaella; Ubaldi, Enrico; Vezzani, Alessandro
2017-05-01
The recent availability of large-scale, time-resolved and high quality digital datasets has allowed for a deeper understanding of the structure and properties of many real-world networks. The empirical evidence of a temporal dimension prompted the switch of paradigm from a static representation of networks to a time varying one. In this work we briefly review the framework of time-varying networks in real world social systems, especially focusing on the activity-driven paradigm. We develop a framework that allows for the encoding of three generative mechanisms that seem to play a central role in the evolution of social networks: the individual's propensity to engage in social interactions, its strategy in allocating these interactions among its alters, and the burstiness of interactions amongst social actors. The functional forms and probability distributions encoding these mechanisms are typically data driven. A natural question is whether different classes of strategies and burstiness distributions, with different local-scale behavior but analogous asymptotics, can lead to the same long-time and large-scale structure of the evolving networks. We consider the problem in its full generality, by investigating and solving the system dynamics in the asymptotic limit, for general classes of tie allocation mechanisms and waiting time probability distributions. We show that the asymptotic network evolution is driven by a few characteristics of these functional forms, which can be extracted from direct measurements on large datasets.
Validation of Arabic and English versions of the ARSMA-II Acculturation Rating Scale.
Jadalla, Ahlam; Lee, Jerry
2015-02-01
To translate and adapt the Acculturation Rating Scale of Mexican-Americans II (ARSMA-II) for Arab Americans. A multistage translation process followed by a pilot and a large study. The translated and adapted versions, Acculturation Rating Scale for Arabic Americans-II Arabic and English (ARSAA-IIA, ARSAA-IIE), were validated in a sample of 297 Arab Americans. Factor analyses with principal axis factoring extractions and direct oblimin rotations were used to identify the underlying structure of ARSAA-II. Factor analysis confirmed the underlying structure of ARSAA-II and produced two interpretable factors, labeled 'Attraction to American Culture' (AAmC) and 'Attraction to Arabic Culture' (AArC). The Cronbach's alphas of AAmC and AArC were .89 and .85, respectively. Findings support the use of the ARSAA-IIA and ARSAA-IIE to assess acculturation among Arab Americans. The emergent factors of ARSAA-II support the theoretical structure of the original ARSMA-II tool and show high internal consistency.
Deep learning with non-medical training used for chest pathology identification
NASA Astrophysics Data System (ADS)
Bar, Yaniv; Diamant, Idit; Wolf, Lior; Greenspan, Hayit
2015-03-01
In this work, we examine the strength of deep learning approaches for pathology detection in chest radiograph data. Convolutional neural network (CNN) deep architecture classification approaches have gained popularity due to their ability to learn mid- and high-level image representations. We explore the ability of a CNN to identify different types of pathologies in chest x-ray images. Moreover, since very large training sets are generally not available in the medical domain, we explore the feasibility of using a deep learning approach based on non-medical learning. We tested our algorithm on a dataset of 93 images. We use a CNN that was trained with ImageNet, a well-known large-scale non-medical image database. The best performance was achieved using a combination of features extracted from the CNN and a set of low-level features. We obtained an area under curve (AUC) of 0.93 for right pleural effusion detection, 0.89 for enlarged heart detection and 0.79 for classification between healthy and abnormal chest x-rays, where all pathologies are combined into one large class. This is a first-of-its-kind experiment that shows that deep learning with large-scale non-medical image databases may be sufficient for general medical image recognition tasks.
Ruiz-Montañez, G; Ragazzo-Sánchez, J A; Calderón-Santoyo, M; Velázquez-de la Cruz, G; de León, J A Ramírez; Navarro-Ocaña, A
2014-09-15
Bioactive compounds have become very important in the food and pharmaceutical markets, leading research efforts to seek efficient methods for extracting these bioactive substances. The objective of this research is to implement preparative-scale extraction of mangiferin and lupeol from mango fruit (Mangifera indica L.) of autochthonous and Ataulfo varieties grown in Nayarit, using emerging extraction techniques. Five extraction techniques were evaluated: maceration, Soxhlet, sonication (UAE), microwave (MAE) and high hydrostatic pressure (HHP). Two maturity stages (physiological and consumption) as well as peel and fruit pulp were evaluated for preparative-scale implementation. Peels from Ataulfo mango at the consumption maturity stage can be considered a source of mangiferin and lupeol using the UAE method, as it improves extraction efficiency by increasing yield and shortening time. Copyright © 2014 Elsevier Ltd. All rights reserved.
Heath, Jason E; McKenna, Sean A; Dewers, Thomas A; Roach, Jesse D; Kobos, Peter H
2014-01-21
CO2 storage efficiency is a metric that expresses the portion of the pore space of a subsurface geologic formation that is available to store CO2. Estimates of storage efficiency for large-scale geologic CO2 storage depend on a variety of factors including geologic properties and operational design. These factors govern estimates on CO2 storage resources, the longevity of storage sites, and potential pressure buildup in storage reservoirs. This study employs numerical modeling to quantify CO2 injection well numbers, well spacing, and storage efficiency as a function of geologic formation properties, open-versus-closed boundary conditions, and injection with or without brine extraction. The set of modeling runs is important as it allows the comparison of controlling factors on CO2 storage efficiency. Brine extraction in closed domains can result in storage efficiencies that are similar to those of injection in open-boundary domains. Geomechanical constraints on downhole pressure at both injection and extraction wells lower CO2 storage efficiency as compared to the idealized scenario in which the same volumes of CO2 and brine are injected and extracted, respectively. Geomechanical constraints should be taken into account to avoid potential damage to the storage site.
Gelenberg, Alan J; Shelton, Richard C; Crits-Christoph, Paul; Keller, Martin B; Dunner, David L; Hirschfeld, Robert M A; Thase, Michael E; Russell, James M; Lydiard, R Bruce; Gallop, Robert J; Todd, Linda; Hellerstein, David J; Goodnick, Paul J; Keitner, Gabor I; Stahl, Stephen M; Halbreich, Uriel; Hopkins, Heather S
2004-08-01
A continuation study of an extract of St. John's wort (Hypericum perforatum) for depression was performed in follow-up to an acute study that found no significant difference between St. John's wort extract and placebo. Seventeen subjects with DSM-IV-defined major depressive disorder who responded to St. John's wort extract in the acute-phase study (phase 1) were continued on double-blind treatment with the same preparation for 24 weeks. Ninety-five subjects who did not respond to either St. John's wort or placebo were treated with an antidepressant for 24 weeks. During antidepressant treatment, mean scores on the Hamilton Rating Scale for Depression for phase 1 nonresponders decreased significantly (p <.0001), with no significant difference between St. John's wort nonresponders and placebo nonresponders. Of the 17 subjects continued on treatment with St. John's wort extract, 5 (29.4%) relapsed. The subjects who did not respond to St. John's wort extract or placebo in phase 1 were, by and large, not resistant to antidepressant treatment. This suggests that the lack of efficacy found by Shelton et al. in the acute-phase study was unlikely to be the result of a high proportion of treatment-resistant subjects.
Detecting the red tide based on remote sensing data in optically complex East China Sea
NASA Astrophysics Data System (ADS)
Xu, Xiaohui; Pan, Delu; Mao, Zhihua; Tao, Bangyi; Liu, Qiong
2012-09-01
Red tide not only destroys marine fishery production, deteriorates the marine environment and affects the coastal tourist industry, but can also cause human poisoning, and even death, through consumption of toxic seafood contaminated by red tide organisms. Remote sensing technology offers large-scale, synchronized, rapid monitoring, making it one of the most important and effective means of red tide monitoring. This paper selects the high-frequency red tide areas of the East China Sea as the study area and MODIS/Aqua L2 data as the data source, and analyzes and compares the spectral differences between red tide and non-red tide water bodies across many historical events. Based on these spectral differences, an algorithm of Rrs555/Rrs488 > 1.5 is developed to extract red tide information. Applying the algorithm to the red tide event that occurred in the East China Sea on May 28, 2009 shows that the method can effectively determine the location of red tide occurrence; there is good correspondence between the red tide extraction result and the chlorophyll-a concentration retrieved by remote sensing, demonstrating that the algorithm can effectively determine the location of red tide and extract red tide information.
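The band-ratio test quoted in the abstract (Rrs555/Rrs488 > 1.5) is simple enough to sketch directly. The helper below is illustrative only; the variable names and the zero-reflectance no-data handling are assumptions, not part of the original algorithm.

```python
def red_tide_mask(rrs555, rrs488, threshold=1.5):
    """Flag pixels whose Rrs555/Rrs488 band ratio exceeds the threshold.

    rrs555, rrs488: equally sized sequences of remote-sensing reflectance
    values (sr^-1) from bands centred near 555 nm and 488 nm.
    Returns a list of booleans; True marks a candidate red-tide pixel.
    Non-positive denominator pixels are treated as no-data and set False.
    """
    mask = []
    for r555, r488 in zip(rrs555, rrs488):
        mask.append(r488 > 0 and (r555 / r488) > threshold)
    return mask

# Toy three-pixel scene: only the second pixel's ratio (0.009/0.005 = 1.8)
# exceeds the 1.5 threshold.
print(red_tide_mask([0.004, 0.009, 0.006], [0.005, 0.005, 0.005]))
```

In practice such a mask would be computed per pixel over the whole Level-2 scene and then compared against the chlorophyll-a product, as the abstract describes.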
Liu, Zhen; Wang, Jieyin; Gao, Wenyuan; Man, Shuli; Wang, Ying; Liu, Changxiao
2013-07-01
Saponins are active compounds in natural products. Many researchers have sought methods for determining their concentration in herbs. Some methods, such as solid-liquid extraction and solvent extraction, have been developed. However, methods for extracting the steroidal saponins from Paris polyphylla Smith var. yunnanensis (Liliaceae) have not been fully researched. To establish a simple extraction method for the separation of steroidal saponins from the rhizomes of P. polyphylla Smith var. yunnanensis, macroporous adsorption resins were used for the separation of steroidal saponins. To select the most suitable resins, seven kinds of macroporous resins were evaluated in this study. Static adsorption and desorption tests on the macroporous resins were performed. We also optimized the temperature and the ethanol concentration in the extraction method based on the contents of five kinds of saponins. We then compared the extraction method with two other methods. D101 resin demonstrated the best adsorption and desorption properties for steroidal saponins. Its adsorption data fit best to the Freundlich adsorption model. The content of steroidal saponins in the product was increased 4.83-fold, with recovery yields of 85.47%. The process achieved simple and effective enrichment and separation of steroidal saponins. The method provides a scientific basis for large-scale preparation of steroidal saponins from the Rhizoma Paridis and other plants.
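The abstract's finding that the adsorption data follow the Freundlich model corresponds to fitting q = Kf·C^(1/n). A minimal sketch of that fit via log-linear regression follows; the function name and the synthetic equilibrium data are hypothetical, chosen only to illustrate the calculation.

```python
import math

def fit_freundlich(c_eq, q_ads):
    """Fit the Freundlich isotherm q = Kf * C^(1/n) by least squares on
    the linearised form ln(q) = ln(Kf) + (1/n) * ln(C).

    c_eq: equilibrium concentrations; q_ads: adsorbed amounts.
    Returns (Kf, n).
    """
    xs = [math.log(c) for c in c_eq]
    ys = [math.log(q) for q in q_ads]
    m = len(xs)
    x_bar = sum(xs) / m
    y_bar = sum(ys) / m
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
             / sum((x - x_bar) ** 2 for x in xs))
    intercept = y_bar - slope * x_bar
    return math.exp(intercept), 1.0 / slope

# Synthetic equilibrium data generated from Kf = 2.0, n = 2.5,
# so the fit should recover those parameters.
c = [0.5, 1.0, 2.0, 4.0, 8.0]
q = [2.0 * ci ** (1 / 2.5) for ci in c]
kf, n = fit_freundlich(c, q)
print(round(kf, 3), round(n, 3))
```

A Freundlich n > 1, as recovered here, indicates favorable adsorption, which is consistent with a resin being useful for enrichment.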
Colonna, William; Brehm-Stecher, Byron; Shetty, Kalidas; Pometto, Anthony
2017-12-01
This study focused on advancing a rapid turbidimetric bioassay to screen antimicrobials using specific cocktails of targeted foodborne bacterial pathogens. Specifically, to show the relevance of this rapid screening tool, the antimicrobial potential of generally-recognized-as-safe calcium diacetate (DAX) and blends with cranberry (NC) and oregano (OX) natural extracts was evaluated. Furthermore, the same extracts were evaluated against beneficial lactic acid bacteria. The targeted foodborne pathogens evaluated were Escherichia coli O157:H7, Salmonella spp., Listeria monocytogenes, and Staphylococcus aureus, using optimized initial cocktails (∼10^8 colony-forming units/mL) containing strains isolated from human food outbreaks. Of all extracts evaluated, 0.51% (w/v) DAX in ethanol was the most effective against all four pathogens. However, DAX reduced to 0.26% and blended with ethanol extractions at DAX:OX (3:1) slightly outperformed or equaled the same levels of DAX alone. Subculture of wells in which no growth occurred after 1 week indicated that all water and ethanol extracts were bacteriostatic against the pathogens tested. All the targeted antimicrobials had no effect on the probiotic organism Lactobacillus plantarum. The use of such rapid screening methods combined with the use of multistrain cocktails of targeted foodborne pathogens from outbreaks will allow rapid large-scale screening of antimicrobials and enable further detailed studies in targeted model food systems.
NASA Astrophysics Data System (ADS)
Steiff, M.; Van Meter, K. J.; Basu, N. B.
2013-12-01
Lack of consistent water availability for irrigated agriculture is recognized as one of the primary constraints to meeting the UN Millennium Development Goals to alleviate hunger, and in semi-arid landscapes such as those of southern India, which are characterized by high intra-annual variability in rainfall, provision of capabilities for seasonal storage is recognized to be one of the key strategies towards alleviating water scarcity problems and ensuring food security. Although the issue of increased storage can be addressed by centralized infrastructure projects such as large-scale irrigation systems and dams, an alternative is the "soft path" approach, in which existing large-scale projects are complemented by small-scale, decentralized solutions. Such a decentralized approach has been utilized in southern India for thousands of years in the form of village rainwater harvesting tanks or ponds, providing a local and inherently sustainable approach to providing sufficient water for rice cultivation. Over the last century, however, large-scale canal projects and groundwater pumping have replaced rainwater harvesting as the primary source of irrigation water. But with groundwater withdrawals now exceeding recharge in many areas and water tables continuing to drop, many NGOs and government agencies are advocating for a revival of the older rainwater harvesting systems. Questions remain, however, regarding the limits to which rainwater harvesting can provide a solution to decades of water overexploitation. In the present work, we have utilized secondary data sources to analyze the linkages between the tank irrigation systems and the village communities that depend on them within the Gundar Basin of southern Tamil Nadu. Combining socioeconomic data with information regarding climate, land use, groundwater depletion, and tank density, we have developed indicators of sustainability for these systems. 
Using these indicators, we have attempted to unravel the close coupling that exists between tanks, the village communities, and the natural landscape within which they are embedded. Preliminary results suggest that groundwater over-extraction is in many cases negatively impacting the ability of the rainwater harvesting ponds to provide a reliable water supply. In addition, while the social and economic benefits provided by these ponds reduce community vulnerability to variations in the region's yearly monsoons, there can be negative environmental impacts. Large-scale rainwater harvesting, similar to groundwater extraction, can change the overall water balance of a watershed, leading to a tradeoff of water availability between socioeconomic and ecosystem demands. Although traditional rainwater harvesting practices may appear to be more sustainable than the current high levels of groundwater pumping, the two practices carried out in tandem can increase water consumption even further, pushing the system closer to a threshold beyond which a profound crisis may loom.
Modelling tidal current energy extraction in large area using a three-dimensional estuary model
NASA Astrophysics Data System (ADS)
Chen, Yaling; Lin, Binliang; Lin, Jie
2014-11-01
This paper presents a three-dimensional modelling study for simulating tidal current energy extraction over large areas, with a momentum sink term added to the momentum equations. Due to limits on computational capacity, the grid size of the numerical model is generally much larger than the turbine rotor diameter. Two models, i.e. a local grid refinement model and a coarse grid model, are employed, and an idealized estuary is set up. The local grid refinement model is constructed to simulate the power generation of an isolated turbine and its impacts on hydrodynamics. The model is then used to determine the deployment of the turbine farm and to quantify a combined thrust coefficient for multiple turbines located in a grid element of the coarse grid model. The model results indicate that the performance of power extraction is affected by array deployment, with more power generated by outer rows than inner rows due to the velocity-deficit influence of upstream turbines. Model results also demonstrate that the large-scale turbine farm has significant effects on the hydrodynamics. The tidal currents are attenuated within the turbine swept area, and both upstream and downstream of the array, while the currents above and below the turbines are accelerated, which contributes to speeding up the wake-mixing process behind the arrays. Water levels are raised at both low and high water when the turbine array spans the full width of the estuary. The magnitude of the water level change is found to increase with array expansion, especially at low water level.
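A common way to represent turbine thrust as a momentum sink is to distribute the rotor thrust F = ½·ρ·Ct·A·|u|·u over the volume of the grid cell containing the turbine. The sketch below shows this generic parameterisation; it is not necessarily the exact form used in the paper, and all numbers are illustrative.

```python
def turbine_momentum_sink(u, rho, ct, rotor_area, cell_volume):
    """Momentum sink (N/m^3) representing turbine thrust in a model cell.

    Distributes the rotor thrust F = 0.5 * rho * Ct * A * |u| * u over
    the cell volume, so the sink scales quadratically with the local
    velocity u (m/s) and is always directed against the flow.
    """
    thrust = 0.5 * rho * ct * rotor_area * abs(u) * u
    return -thrust / cell_volume

# Example: 2 m/s flow, Ct = 0.8, ~20 m rotor (A ≈ 314.16 m^2),
# in a 50 m x 50 m x 20 m coarse-grid cell.
s = turbine_momentum_sink(u=2.0, rho=1025.0, ct=0.8, rotor_area=314.16,
                          cell_volume=50 * 50 * 20)
print(round(s, 3))
```

Because the grid cell is much larger than the rotor, a whole row of turbines can be lumped into one cell by using a combined thrust coefficient, which is the calibration step the coarse-grid model performs against the locally refined model.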
Data Exploration using Unsupervised Feature Extraction for Mixed Micro-Seismic Signals
NASA Astrophysics Data System (ADS)
Meyer, Matthias; Weber, Samuel; Beutel, Jan
2017-04-01
We present a system for the analysis of data originating in a multi-sensor and multi-year experiment focusing on slope stability and its underlying processes in fractured permafrost rock walls, undertaken at 3500 m a.s.l. on the Matterhorn Hörnligrat (Zermatt, Switzerland). This system incorporates facilities for the transmission, management and storage of large volumes of data (7 GB/day), preprocessing and aggregation of multiple sensor types, machine-learning-based automatic feature extraction for micro-seismic and acoustic emission data, and interactive web-based visualization of the data. Specifically, a combination of three types of sensors is used to profile the frequency spectrum from 1 Hz to 80 kHz, with the goal of identifying the relevant destructive processes (e.g. micro-cracking and fracture propagation) leading to the eventual destabilization of large rock masses. The sensors installed for this profiling experiment (2 geophones, 1 accelerometer and 2 piezo-electric sensors for detecting acoustic emission) are further augmented with sensors originating from a previous activity focusing on long-term monitoring of temperature evolution and rock kinematics with the help of wireless sensor networks (crackmeters, cameras, weather station, rock temperature profiles, differential GPS) [Hasler2012]. In raw format, the data generated by the different types of sensors, specifically the micro-seismic and acoustic emission sensors, is strongly heterogeneous and in part unsynchronized, and the storage and processing demand is large. Therefore, a purpose-built signal preprocessing and event-detection system is used. While the analysis of data from each individual sensor follows established methods, the application of all these sensor types in combination within a field experiment is unique. 
Furthermore, experience and methods from using such sensors in laboratory settings cannot be readily transferred to the mountain field-site setting, with its scale and full exposure to the natural environment. Consequently, many state-of-the-art algorithms for big data analysis and event classification that require a ground-truth dataset cannot be applied. These challenges call for a tool for data exploration. In the presented system, data exploration is supported by unsupervised feature learning based on convolutional neural networks, which is used to automatically extract common features for preliminary clustering and outlier detection. With this information, an interactive web tool allows for fast identification of interesting time segments, on which segment-selective algorithms for visualization, feature extraction and statistics can be applied. The combination of manual labeling and unsupervised feature extraction provides an event catalog for the classification of different characteristic events related to the internal progression of micro-cracks in steep fractured bedrock permafrost. References Hasler, A., S. Gruber, and J. Beutel (2012), Kinematics of steep bedrock permafrost, J. Geophys. Res., 117, F01016, doi:10.1029/2011JF001981.
Analysis on Difference of Forest Phenology Extracted from EVI and LAI Based on PhenoCams
NASA Astrophysics Data System (ADS)
Wang, C.; Jing, L.; Qinhuo, L.
2017-12-01
Land surface phenology can make up for the deficiencies of field observation, with the advantage of capturing the continuous expression of phenology on a large scale. However, there is some variability in phenological metrics derived from different satellite time-series data of vegetation parameters. This paper aims to assess the difference between phenology information extracted from EVI and LAI time series. To achieve this, several web-camera sites were selected to analyze the characteristics of MODIS-EVI and MODIS-LAI time series from 2010 to 2014 for different forest types, including evergreen coniferous forest, evergreen broadleaf forest, deciduous coniferous forest and deciduous broadleaf forest. At the same time, satellite-based phenological metrics were extracted using a logistic fitting algorithm and compared with camera-based phenological metrics. Results show that the SOS and EOS extracted from LAI are close to bud burst and leaf defoliation respectively, while the SOS and EOS extracted from EVI are close to leaf unfolding and leaf coloring respectively. Thus the SOS extracted from LAI is earlier than that from EVI, while the EOS extracted from LAI is later than that from EVI at deciduous forest sites. Although the seasonal variation characteristics of evergreen forests are not apparent, significant discrepancies exist between the LAI and EVI time series. In addition, satellite- and camera-based phenological metrics generally agree well, but EVI correlates more strongly with the camera-based canopy greenness (green chromatic coordinate, gcc) than LAI does.
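The threshold-on-amplitude logic typically paired with logistic-fitted vegetation time series to locate the start of season (SOS) can be sketched as follows. This is a simplified illustration, not the authors' exact procedure; the 50% amplitude fraction and the synthetic EVI series are assumptions.

```python
import math

def logistic_greenup(t, vmin, vmax, k, t_mid):
    """Idealised spring green-up: a logistic rise from vmin to vmax,
    with slope parameter k and inflection (mid-green-up) day t_mid."""
    return vmin + (vmax - vmin) / (1.0 + math.exp(-k * (t - t_mid)))

def extract_sos(days, values, fraction=0.5):
    """Start of season: first day the series crosses a fixed fraction
    of its seasonal amplitude (a simple threshold variant of the
    logistic-based metrics used for satellite time series)."""
    vmin, vmax = min(values), max(values)
    threshold = vmin + fraction * (vmax - vmin)
    for d, v in zip(days, values):
        if v >= threshold:
            return d
    return None

days = list(range(1, 366, 8))  # 8-day composites, like MODIS products
evi = [logistic_greenup(d, 0.2, 0.8, 0.1, 130) for d in days]
print(extract_sos(days, evi))
```

Because EVI and LAI curves rise and fall at different points in canopy development, the same threshold applied to each series yields different SOS/EOS dates, which is the discrepancy the study quantifies.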
Scale-up and economic analysis of biodiesel production from municipal primary sewage sludge.
Olkiewicz, Magdalena; Torres, Carmen M; Jiménez, Laureano; Font, Josep; Bengoa, Christophe
2016-08-01
Municipal wastewater sludge is a promising lipid feedstock for biodiesel production, but the need to eliminate the high water content before lipid extraction is the main limitation for scaling up. This study evaluates the economic feasibility of biodiesel production directly from liquid primary sludge based on experimental data at laboratory scale. Computational tools were used for the modelling of the process scale-up and the different configurations of lipid extraction, in order to optimise this step, as it is the most expensive. The operational variables with a major influence on the cost were the extraction time and the amount of solvent. The optimised extraction process had a break-even biodiesel price of 1232 $/t, making it economically competitive with the current cost of fossil diesel. The proposed biodiesel production process from waste sludge eliminates the expensive step of sludge drying, lowering the biodiesel price. Copyright © 2016 Elsevier Ltd. All rights reserved.
Chan, A K; Singogo, E; Changamire, R; Ratsma, Y E C; Tassie, J-M; Harries, A D
2012-06-21
Rapid scale-up of antiretroviral therapy (ART) has challenged the health system in Malawi to monitor large numbers of patients effectively. To compare two methods of determining retention on treatment: quarterly ART clinic data aggregation vs. pharmacy stock cards. Between October 2010 and March 2011, data on ART outcomes were extracted from monitoring tools at five facilities. Pharmacy data on ART consumption were extracted. Workload for each method was observed and timed. We used intraclass correlation and Bland-Altman plots to compare the agreeability of both methods to determine treatment retention. There is wide variability between ART clinic cohort data and pharmacy data to determine treatment retention due to divergence in data at sites with large numbers of patients. However, there is a non-significant trend towards agreeability between the two methods (intraclass correlation coefficient > 0.9; P > 0.05). Pharmacy stock card monitoring is more time-efficient than quarterly ART data aggregation (81 min vs. 573 min). In low-resource settings, pharmacy records could be used to improve drug forecasting and estimate ART retention in a more time-efficient manner than quarterly data aggregation; however, a necessary precondition would be capacity building around pharmacy data management, particularly for large-sized cohorts.
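The Bland-Altman comparison used above reduces to the mean difference between the two methods and its 1.96·SD limits of agreement. A minimal sketch follows; the retention counts are hypothetical numbers for illustration only, not data from the study.

```python
def bland_altman(a, b):
    """Bland-Altman agreement statistics for two paired measurement methods.

    Returns (mean difference, lower limit, upper limit), where the limits
    of agreement are mean_diff ± 1.96 * SD of the paired differences
    (sample SD, n - 1 denominator).
    """
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd = (sum((d - mean_d) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean_d, mean_d - 1.96 * sd, mean_d + 1.96 * sd

# Patients retained per facility: clinic-cohort aggregation vs. pharmacy
# stock cards (hypothetical values).
clinic = [120, 85, 240, 60, 310]
pharmacy = [118, 88, 228, 61, 305]
bias, lo, hi = bland_altman(clinic, pharmacy)
print(round(bias, 2), round(lo, 2), round(hi, 2))
```

Wide limits of agreement driven by a few large-cohort sites would mirror the divergence the study observed at facilities with large patient numbers.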
The study of integration about measurable image and 4D production
NASA Astrophysics Data System (ADS)
Zhang, Chunsen; Hu, Pingbo; Niu, Weiyun
2008-12-01
In this paper, we create geospatial data for three-dimensional (3D) modeling by combining digital photogrammetry and digital close-range photogrammetry. For the large-scale geographical background, we establish a three-dimensional landscape model combining DEM and DOM, based on digital photogrammetry, which uses aerial image data to make "4D" (DOM: Digital Orthophoto Map, DEM: Digital Elevation Model, DLG: Digital Line Graphic and DRG: Digital Raster Graphic) products. For the buildings and other artificial features that users are interested in, we achieve three-dimensional reconstruction of the real features using digital close-range photogrammetry, through the following steps: data collection with non-metric cameras, camera calibration, feature extraction, image matching, and other steps. Finally, we combine the three-dimensional background and locally measured real images of these large geographic data and realize the integration of measurable real images and the 4D products. The article discusses the whole workflow and technology, achieving three-dimensional reconstruction and the integration of the large-scale three-dimensional landscape and the metric buildings.
Techniques for extracting single-trial activity patterns from large-scale neural recordings
Churchland, Mark M; Yu, Byron M; Sahani, Maneesh; Shenoy, Krishna V
2008-01-01
Summary Large, chronically-implanted arrays of microelectrodes are an increasingly common tool for recording from primate cortex, and can provide extracellular recordings from many (order of 100) neurons. While the desire for cortically-based motor prostheses has helped drive their development, such arrays also offer great potential to advance basic neuroscience research. Here we discuss the utility of array recording for the study of neural dynamics. Neural activity often has dynamics beyond that driven directly by the stimulus. While governed by those dynamics, neural responses may nevertheless unfold differently for nominally identical trials, rendering many traditional analysis methods ineffective. We review recent studies – some employing simultaneous recording, some not – indicating that such variability is indeed present both during movement generation, and during the preceding premotor computations. In such cases, large-scale simultaneous recordings have the potential to provide an unprecedented view of neural dynamics at the level of single trials. However, this enterprise will depend not only on techniques for simultaneous recording, but also on the use and further development of analysis techniques that can appropriately reduce the dimensionality of the data, and allow visualization of single-trial neural behavior. PMID:18093826
Applicability of SCAR markers to food genomics: olive oil traceability.
Pafundo, Simona; Agrimonti, Caterina; Maestri, Elena; Marmiroli, Nelson
2007-07-25
DNA analysis with molecular markers has opened a shortcut toward a genomic comprehension of complex organisms. The availability of micro-DNA extraction methods, coupled with selective amplification of the smallest extracted fragments with molecular markers, could equally bring a breakthrough in food genomics: the identification of original components in food. Amplified fragment length polymorphisms (AFLPs) have been instrumental in plant genomics because they allow rapid and reliable analysis of multiple, potentially polymorphic sites. Nevertheless, their direct application to the analysis of DNA extracted from food matrixes is complicated by the low quality of the extracted DNA: its high degradation and the presence of inhibitors of enzymatic reactions. The conversion of an AFLP fragment to a robust and specific single-locus PCR-based marker, therefore, could extend the use of molecular markers to large-scale analysis of complex agro-food matrixes. The present study reports the development of sequence characterized amplified regions (SCARs) starting from AFLP profiles of monovarietal olive oils analyzed on agarose gel; one of these was used to identify differences among 56 olive cultivars. All of the developed markers were amplified in olive oils in order to apply them to olive oil traceability.
Gao, Peng; Wang, Liping; Zhang, Yu-Yang; Huang, Yuan; Liao, Lei; Sutter, Peter; Liu, Kaihui; Yu, Dapeng; Wang, En-Ge
2016-09-14
In rechargeable lithium-ion batteries, rate capability and energy efficiency are largely governed by the lithium-ion transport dynamics and phase-transition pathways in the electrodes. Real-time, atomic-scale tracking of fully reversible lithium insertion and extraction processes in electrodes, which would ultimately lead to a mechanistic understanding of how electrodes function and why they fail, is highly desirable but very challenging. Here, we track lithium insertion and extraction in SnS2, a material dominated by van der Waals interactions, using in situ high-resolution transmission electron microscopy (TEM). We find that lithium insertion occurs via a fast two-phase reaction to form expanded and defective LiSnS2, while lithium extraction initially involves heterogeneous nucleation of intermediate superstructure Li0.5SnS2 domains 1-4 nm in size. Density functional theory calculations indicate that Li0.5SnS2 is kinetically favored and structurally stable. These asymmetric reaction pathways may supply enlightening insights into the mechanistic understanding of the underlying electrochemistry of layered electrode materials and also suggest possible alternatives to the accepted explanation of the origins of voltage hysteresis in intercalation electrode materials.
NASA Astrophysics Data System (ADS)
Dong, Di; Li, Ziwei; Liu, Zhaoqin; Yu, Yang
2014-03-01
This paper focuses on automated extraction and monitoring of coastlines by remote sensing techniques using multi-temporal Landsat imagery along Caofeidian, China. Caofeidian, one of the most economically active regions in China, has experienced dramatic change due to intensified human activities, such as land reclamation. These processes have caused morphological changes of the Caofeidian shoreline. In this study, shoreline extraction and change analysis are investigated. An algorithm based on image texture and mathematical morphology is proposed to automate coastline extraction. We tested this approach and found it capable of extracting coastlines from TM and ETM+ images with little human intervention. The detected coastline vectors are then imported into ArcGIS software, and the Digital Shoreline Analysis System (DSAS) is used to calculate change rates (the end point rate and the linear regression rate). The results show that in some parts of the study area remarkable coastline changes are observed, especially in the accretion rate. The abnormal accretion is mostly attributed to the large-scale land reclamation in Caofeidian during 2003 and 2004. We conclude that various construction projects, especially the land reclamation project, have caused the Caofeidian shoreline to change substantially, at rates far above natural levels.
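The morphology step of such a coastline-extraction pipeline can be sketched in a few lines: given a binary water mask (assumed here to come from an earlier thresholding or texture step), the shoreline is the set of water pixels removed by one binary erosion. The tiny synthetic mask and the cross-shaped structuring element are illustrative assumptions:

```python
import numpy as np

# Toy water/land mask: left half "sea" (True), right half "land" (False).
water = np.zeros((8, 8), dtype=bool)
water[:, :4] = True

def erode(mask):
    """Binary erosion with a 3x3 cross structuring element, no libraries."""
    out = mask.copy()
    out[1:, :]  &= mask[:-1, :]   # require the neighbour above
    out[:-1, :] &= mask[1:, :]    # require the neighbour below
    out[:, 1:]  &= mask[:, :-1]   # require the neighbour to the left
    out[:, :-1] &= mask[:, 1:]    # require the neighbour to the right
    return out

# Shoreline = water pixels whose neighbourhood touches land.
shoreline = water & ~erode(water)
print(np.argwhere(shoreline))     # the boundary column of the water mask
```

On real TM/ETM+ scenes the same idea applies to the classified water mask, after which the one-pixel-wide boundary can be vectorized for import into DSAS.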
Sun, Guibo; Webster, Chris; Ni, Michael Y; Zhang, Xiaohu
2018-05-07
Uncertainty with respect to built environment (BE) data collection, measure conceptualization, and spatial scales is evident in urban health research, but most findings come from relatively low-density contexts. We selected Hong Kong, an iconic high-density city, as the study area, as limited research has been conducted on uncertainty in such settings. We used geocoded home addresses (n=5732) from a large population-based cohort in Hong Kong to extract BE measures for the participants' places of residence based on an internationally recognized BE framework. The variability of the measures was mapped, and Spearman's rank correlations were calculated to assess how well the relationships among indicators are preserved across variables and spatial scales. We found extreme variations and uncertainties for the 180 measures collected using comprehensive data and advanced geographic information systems modelling techniques. We highlight the implications of the methodological selection and spatial scales of the measures. The results suggest that more robust information for urban health research in high-density cities would emerge if greater consideration were given to BE data, design methods, and the spatial scales of BE measures.
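The rank-correlation check described above can be sketched as follows: compute the same BE measure at two buffer scales and ask how well the ordering of addresses is preserved. The simulated measure, buffer sizes, and noise model are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical BE measure (e.g. a density count) at a 400 m buffer, and a
# noisy, attenuated version of it at an 800 m buffer.
density_400m = rng.gamma(2.0, 50.0, size=200)
density_800m = 0.7 * density_400m + 30.0 * rng.standard_normal(200)

def spearman(x, y):
    """Spearman's rho as the Pearson correlation of the ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean(); ry -= ry.mean()
    return (rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry))

rho = spearman(density_400m, density_800m)
print(round(rho, 2))   # high, but clearly below 1: the ranking shifts with scale
```

A rho well below 1 across scales is exactly the kind of scale sensitivity the study flags as a source of uncertainty in BE-health associations.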
Synthesis of streamflow recession curves in dry environments
NASA Astrophysics Data System (ADS)
Arciniega, Saul; Breña-Naranjo, Agustín; Pedrozo-Acuña, Adrían
2015-04-01
The elucidation and predictability of hydrological systems can benefit greatly from extracting observed patterns in processes, data, and models. This type of research framework in hydrology, also known as synthesis, has gained significant attention over the last decade. For instance, hydrological synthesis implies that the identification of patterns in catchment behavior can enhance the extrapolation of hydrological signatures over large spatial and temporal scales. Hydrological signatures during dry periods, such as streamflow recession curves (SRCs), are of special interest in regions coping with water scarcity. Indeed, the study of SRCs from observed hydrographs allows information to be extracted about the storage-discharge relationship of a specific catchment and some of its groundwater hydraulic properties. This work aims to perform a synthesis of SRCs in semi-arid and arid environments across Northern Mexico. Our dataset consisted of observed daily SRCs in 63 catchments with minimal human interference. Three streamflow recession extraction methods (Vogel, Brutsaert, and Aksoy-Wittenberg), four recession models (Maillet, Boussinesq, Coutagne, and Wittenberg), and three parameter estimation techniques (regressions, lower envelope, and data binning) were used to determine the combination of methods, processes, and models that best describes the SRCs at our study sites. Our results show that the extraction method proposed by Aksoy-Wittenberg, along with Coutagne's nonlinear recession model, provides a better approximation of SRCs across Northern Mexico, whereas regression was found to be the most adequate parameter estimation method. This study suggests that hydrological synthesis is a useful framework for identifying similar patterns and model parameters during dry periods across Mexico's water-limited environments.
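The simplest of the recession models compared above, the Maillet model, is the exponential Q(t) = Q0 exp(-t/k), and the regression-based parameter estimation amounts to ordinary least squares on log-discharge. A minimal sketch with synthetic data (the true parameter values and noise level are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic recession limb: Q(t) = Q0 * exp(-t/k) with mild multiplicative noise.
t = np.arange(30, dtype=float)        # days since recession onset
Q0_true, k_true = 12.0, 8.0           # m^3/s and days (assumed)
Q = Q0_true * np.exp(-t / k_true) * np.exp(0.02 * rng.standard_normal(30))

# log Q = log Q0 - t/k is linear in t, so a straight-line fit recovers both.
slope, intercept = np.polyfit(t, np.log(Q), 1)
k_hat, Q0_hat = -1.0 / slope, np.exp(intercept)
print(round(k_hat, 1), round(Q0_hat, 1))   # near the true 8.0 and 12.0
```

The nonlinear Coutagne and Wittenberg models favored by the study require nonlinear fitting of an extra exponent, but the same extract-then-regress workflow applies.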
Nagarajan, Mahesh B; Coan, Paola; Huber, Markus B; Diemoz, Paul C; Wismüller, Axel
2015-01-01
Phase contrast X-ray computed tomography (PCI-CT) has been demonstrated as a novel imaging technique that can visualize human cartilage with high spatial resolution and soft tissue contrast. Different textural approaches have been previously investigated for characterizing chondrocyte organization on PCI-CT to enable classification of healthy and osteoarthritic cartilage. However, the large size of feature sets extracted in such studies motivates an investigation into algorithmic feature reduction for computing efficient feature representations without compromising their discriminatory power. For this purpose, geometrical feature sets derived from the scaling index method (SIM) were extracted from 1392 volumes of interest (VOI) annotated on PCI-CT images of ex vivo human patellar cartilage specimens. The extracted feature sets were subject to linear and non-linear dimension reduction techniques as well as feature selection based on evaluation of mutual information criteria. The reduced feature set was subsequently used in a machine learning task with support vector regression to classify VOIs as healthy or osteoarthritic; classification performance was evaluated using the area under the receiver-operating characteristic (ROC) curve (AUC). Our results show that the classification performance achieved by 9-D SIM-derived geometric feature sets (AUC: 0.96 ± 0.02) can be maintained with 2-D representations computed from both dimension reduction and feature selection (AUC values as high as 0.97 ± 0.02). Thus, such feature reduction techniques can offer a high degree of compaction to large feature sets extracted from PCI-CT images while maintaining their ability to characterize the underlying chondrocyte patterns.
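The reduce-then-classify-then-score pipeline above can be sketched end to end. PCA and a nearest-centroid score stand in for the SIM-derived features and support vector regression used in the study, and the simulated 9-D feature vectors are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated 9-D feature vectors for two classes of VOIs.
n_per_class, n_features = 100, 9
healthy = rng.standard_normal((n_per_class, n_features))
osteo = rng.standard_normal((n_per_class, n_features)) + 1.0   # shifted class
X = np.vstack([healthy, osteo])
y = np.repeat([0, 1], n_per_class)

# Reduce to a 2-D representation via PCA (SVD of the centered data).
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                                              # (200, 2)

# Score each sample along the direction between the two class centroids.
d = Z[y == 1].mean(axis=0) - Z[y == 0].mean(axis=0)
scores = Z @ d

# AUC via the rank-sum (Mann-Whitney) formulation.
ranks = np.argsort(np.argsort(scores)) + 1
auc = (ranks[y == 1].sum() - n_per_class * (n_per_class + 1) / 2) / n_per_class**2
print(round(auc, 2))   # well above 0.5 despite the 2-D compression
```

The point mirrors the study's finding: if the discriminative structure lies in a low-dimensional subspace, aggressive feature reduction need not cost classification performance.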
Cloud-enabled large-scale land surface model simulations with the NASA Land Information System
NASA Astrophysics Data System (ADS)
Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.
2017-12-01
Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that previously took weeks or months can now be performed in minutes.
This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and describe the potential deployment of this information technology with other NASA applications.
"Non-cold" dark matter at small scales: a general approach
NASA Astrophysics Data System (ADS)
Murgia, R.; Merle, A.; Viel, M.; Totzauer, M.; Schneider, A.
2017-11-01
Structure formation at small cosmological scales provides an important frontier for dark matter (DM) research. Scenarios with small DM particle masses, large momenta or hidden interactions tend to suppress the gravitational clustering at small scales. The details of this suppression depend on the DM particle nature, allowing for a direct link between DM models and astrophysical observations. However, most of the astrophysical constraints obtained so far refer to a very specific shape of the power suppression, corresponding to thermal warm dark matter (WDM), i.e., candidates with a Fermi-Dirac or Bose-Einstein momentum distribution. In this work we introduce a new analytical fitting formula for the power spectrum, which is simple yet flexible enough to reproduce the clustering signal of large classes of non-thermal DM models, which are not at all adequately described by the oversimplified notion of WDM. We show that the formula is able to fully cover the parameter space of sterile neutrinos (whether resonantly produced or from particle decay), mixed cold and warm models, fuzzy dark matter, as well as other models suggested by effective theory of structure formation (ETHOS). Based on this fitting formula, we perform a large suite of N-body simulations and we extract important nonlinear statistics, such as the matter power spectrum and the halo mass function. Finally, we present first preliminary astrophysical constraints, based on linear theory, from both the number of Milky Way satellites and the Lyman-α forest. This paper is a first step towards a general and comprehensive modeling of small-scale departures from the standard cold DM model.
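A generalized transfer-function fit of the kind described above can be sketched as T(k) = [1 + (αk)^β]^γ, which reduces to the classic thermal-WDM form for β = 2ν, γ = -5/ν (with ν ≈ 1.12). The parameter values below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

def transfer(k, alpha, beta, gamma):
    """Square root of P_model(k) / P_CDM(k) for a generalized suppression."""
    return (1.0 + (alpha * k) ** beta) ** gamma

k = np.logspace(-1, 2, 200)            # wavenumber grid in h/Mpc (assumed range)
nu = 1.12

# Thermal-WDM-like suppression vs. a sharper, non-thermal-style cutoff.
T_wdm = transfer(k, alpha=0.05, beta=2 * nu, gamma=-5.0 / nu)
T_gen = transfer(k, alpha=0.05, beta=4.0, gamma=-1.0)

# The half-mode scale (where T drops to 1/2) is a common summary statistic.
k_half = k[np.argmin(np.abs(T_wdm - 0.5))]
print(round(float(k_half), 1))
```

Varying (α, β, γ) sweeps out the family of suppression shapes that the paper maps onto sterile-neutrino, mixed, fuzzy-DM and ETHOS-like models.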
The impact of galaxy formation on satellite kinematics and redshift-space distortions
NASA Astrophysics Data System (ADS)
Orsi, Álvaro A.; Angulo, Raúl E.
2018-04-01
Galaxy surveys aim to map the large-scale structure of the Universe and use redshift-space distortions to constrain deviations from general relativity and probe the existence of massive neutrinos. However, the amount of information that can be extracted is limited by the accuracy of theoretical models used to analyse the data. Here, by using the L-Galaxies semi-analytical model run over the Millennium-XXL N-body simulation, we assess the impact of galaxy formation on satellite kinematics and the theoretical modelling of redshift-space distortions. We show that different galaxy selection criteria lead to noticeable differences in the radial distributions and velocity structure of satellite galaxies. Specifically, whereas samples of stellar mass selected galaxies feature satellites that roughly follow the dark matter, emission line satellite galaxies are located preferentially in the outskirts of haloes and display net infall velocities. We demonstrate that capturing these differences is crucial for modelling the multipoles of the correlation function in redshift space, even on large scales. In particular, we show how modelling small-scale velocities with a single Gaussian distribution leads to a poor description of the measured clustering. In contrast, we propose a parametrization that is flexible enough to model the satellite kinematics and that leads to an accurate description of the correlation function down to sub-Mpc scales. We anticipate that our model will be a necessary ingredient in improved theoretical descriptions of redshift-space distortions, which together could result in significantly tighter cosmological constraints and a more optimal exploitation of future large data sets.
NASA Technical Reports Server (NTRS)
Baurle, R. A.
2015-01-01
Steady-state and scale-resolving simulations have been performed for flow in and around a model scramjet combustor flameholder. The cases simulated corresponded to those used to examine this flowfield experimentally using particle image velocimetry. A variety of turbulence models were used for the steady-state Reynolds-averaged simulations which included both linear and non-linear eddy viscosity models. The scale-resolving simulations used a hybrid Reynolds-averaged / large eddy simulation strategy that is designed to be a large eddy simulation everywhere except in the inner portion (log layer and below) of the boundary layer. Hence, this formulation can be regarded as a wall-modeled large eddy simulation. This effort was undertaken to formally assess the performance of the hybrid Reynolds-averaged / large eddy simulation modeling approach in a flowfield of interest to the scramjet research community. The numerical errors were quantified for both the steady-state and scale-resolving simulations prior to making any claims of predictive accuracy relative to the measurements. The steady-state Reynolds-averaged results showed a high degree of variability when comparing the predictions obtained from each turbulence model, with the non-linear eddy viscosity model (an explicit algebraic stress model) providing the most accurate prediction of the measured values. The hybrid Reynolds-averaged/large eddy simulation results were carefully scrutinized to ensure that even the coarsest grid had an acceptable level of resolution for large eddy simulation, and that the time-averaged statistics were acceptably accurate. The autocorrelation and its Fourier transform were the primary tools used for this assessment. The statistics extracted from the hybrid simulation strategy proved to be more accurate than the Reynolds-averaged results obtained using the linear eddy viscosity models. 
However, there was no predictive improvement noted over the results obtained from the explicit Reynolds stress model. Fortunately, the numerical error assessment at most of the axial stations used to compare with measurements clearly indicated that the scale-resolving simulations were improving (i.e. approaching the measured values) as the grid was refined. Hence, unlike a Reynolds-averaged simulation, the hybrid approach provides a mechanism to the end-user for reducing model-form errors.
Uniform competency-based local feature extraction for remote sensing images
NASA Astrophysics Data System (ADS)
Sedaghat, Amin; Mohammadi, Nazila
2018-01-01
Local feature detectors are widely used in many photogrammetry and remote sensing applications. The quantity and distribution of the local features play a critical role in the quality of the image matching process, particularly for multi-sensor high-resolution remote sensing image registration. However, conventional local feature detectors cannot extract desirable matched features, either in terms of the number of correct matches or the spatial and scale distribution, in multi-sensor remote sensing images. To address this problem, this paper proposes a novel method for uniform and robust local feature extraction for remote sensing images, based on a novel competency criterion and scale and location distribution constraints. The proposed method, called uniform competency (UC) local feature extraction, can be easily applied to any local feature detector for various kinds of applications. The proposed competency criterion is based on a weighted ranking process using three quality measures, including robustness, spatial saliency, and scale parameters, performed in a multi-layer gridding schema. For evaluation, five state-of-the-art local feature detector approaches, namely scale-invariant feature transform (SIFT), speeded up robust features (SURF), scale-invariant feature operator (SFOP), maximally stable extremal region (MSER), and Hessian-affine, are used. The proposed UC-based feature extraction algorithms were successfully applied to match various synthetic and real satellite image pairs, and the results demonstrate their capability to increase matching performance and improve the spatial distribution. The code to carry out the UC feature extraction is available from https://www.researchgate.net/publication/317956777_UC-Feature_Extraction.
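The gridded, competency-ranked selection idea can be sketched detector-agnostically: keep only the top-scoring features per grid cell so the surviving set is spatially uniform. The random "detections" and the single combined score below are illustrative stand-ins for the paper's weighted ranking of robustness, saliency, and scale:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical detector output: positions and a combined competency score.
n, img_w, img_h = 2000, 1000, 800
x, y = rng.uniform(0, img_w, n), rng.uniform(0, img_h, n)
score = rng.uniform(0, 1, n)

def uniform_select(x, y, score, grid=(10, 8), per_cell=5):
    """Return indices of the best `per_cell` features in each grid cell."""
    gx = np.minimum((x / img_w * grid[0]).astype(int), grid[0] - 1)
    gy = np.minimum((y / img_h * grid[1]).astype(int), grid[1] - 1)
    keep = []
    for cx in range(grid[0]):
        for cy in range(grid[1]):
            idx = np.flatnonzero((gx == cx) & (gy == cy))
            keep.extend(idx[np.argsort(score[idx])[::-1][:per_cell]])
    return np.array(keep)

kept = uniform_select(x, y, score)
print(len(kept))   # at most 10 * 8 * 5 = 400 features, spread over the image
```

Applied to the output of any of the detectors named above, this cell-wise ranking trades raw response strength for the uniform spatial coverage that multi-sensor registration needs.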
NASA Astrophysics Data System (ADS)
Gholizadeh, Asa; Kopaekova, Veronika; Rogass, Christian; Mielke, Christian; Misurec, Jan
2016-08-01
Systematic quantification and monitoring of forest biophysical and biochemical variables is required to assess the response of ecosystems to climate change. Remote sensing has been introduced as a time- and cost-efficient way to carry out large-scale monitoring of vegetation parameters. The red-edge position (REP) is a hyperspectrally detectable parameter that is sensitive to vegetation chlorophyll (Chl) content. In the current study, REP was modelled for the Norway spruce forest canopy resampled to HyMap and Sentinel-2 spectral resolution, as well as calculated from real HyMap and simulated Sentinel-2 data. Different REP extraction methods (4PLI, PF, LE, 4PLIH and 4PLIS) were assessed. The study points the way toward effective utilization of the forthcoming hyperspectral and superspectral remote sensing sensors in orbit to monitor vegetation attributes.
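The simplest of the extraction methods compared above, four-point linear interpolation (4PLI), estimates the REP as the wavelength where reflectance crosses the midpoint between the red trough and the NIR shoulder. The canopy reflectance values below are made up for illustration:

```python
# 4PLI red-edge position (after Guyot & Baret): interpolate between 700 and
# 740 nm to the wavelength where reflectance reaches the red-edge midpoint.
def rep_4pli(r670, r700, r740, r780):
    r_edge = (r670 + r780) / 2.0               # midpoint reflectance on the edge
    return 700.0 + 40.0 * (r_edge - r700) / (r740 - r700)

# Hypothetical canopy reflectances at 670, 700, 740, and 780 nm:
rep = rep_4pli(r670=0.04, r700=0.10, r740=0.32, r780=0.45)
print(round(rep, 1))   # a red-edge position in nm, between 700 and 740
```

The four wavelengths map directly onto Sentinel-2 band centers, which is what makes 4PLI-style variants attractive for superspectral sensors with only a handful of red-edge bands.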
NASA Astrophysics Data System (ADS)
Hazarika, S.; Mohanta, D.
2013-01-01
Naturally available green spinach, which is a rich source of potassium, was used as the key ingredient to extract mixed-phase ferroelectric crystals of nitrite and nitrate derivatives (KNO2 + KNO3). The KNO3 phase was found to be dominant for higher pH values, as revealed by the x-ray diffraction patterns. The characteristic optical absorption spectra exhibited intra-band π-π* electronic transitions, whereas Fourier transform infrared spectra exhibited characteristic N-O stretching vibrations. Differential scanning calorimetry revealed a broad endothermic peak at ˜121.8 °C, highlighting a transition from phase II to I via phase III of KNO3. Obtaining nanoscale ferroelectrics via the adoption of green synthesis is economically viable for large-scale production and possible application in ferroelectric elements/devices.
The Development of Duct for a Horizontal Axis Turbine Using CFD
NASA Astrophysics Data System (ADS)
Ghani, Mohamad Pauzi Abdul; Yaacob, Omar; Aziz, Azliza Abdul
2010-06-01
Malaysia is heavily dependent on fossil fuels to satisfy its energy demand. A renewable energy source that has recently attracted great interest is marine current energy, which is extracted by a device called a marine current turbine. This energy resource has great potential to be exploited on a large scale because of its predictability and intensity. This paper focuses on developing a Horizontal Axis Marine Current Turbine (HAMCT) rotor to extract marine current energy under conditions suitable for Malaysian seas. The work incorporates the characteristics of Malaysia's ocean, shallow water and low-speed currents, in developing the turbine. The HAMCT rotor is developed and simulated using CAD and CFD software for various combinations of inlet and outlet duct designs. The computer simulation results for the HAMCT under development are presented.
A scalable method for O-antigen purification applied to various Salmonella serovars
Micoli, F.; Rondini, S.; Gavini, M.; Pisoni, I.; Lanzilao, L.; Colucci, A.M.; Giannelli, C.; Pippi, F.; Sollai, L.; Pinto, V.; Berti, F.; MacLennan, C.A.; Martin, L.B.; Saul, A.
2014-01-01
The surface lipopolysaccharide of gram-negative bacteria is both a virulence factor and a B cell antigen. Antibodies against O-antigen of lipopolysaccharide may confer protection against infection, and O-antigen conjugates have been designed against multiple pathogens. Here, we describe a simplified methodology for extraction and purification of the O-antigen core portion of Salmonella lipopolysaccharide, suitable for large-scale production. Lipopolysaccharide extraction and delipidation are performed by acetic acid hydrolysis of whole bacterial culture and can take place directly in a bioreactor, without previous isolation and inactivation of bacteria. Further O-antigen core purification consists of rapid filtration and precipitation steps, without using enzymes or hazardous chemicals. The process was successfully applied to various Salmonella enterica serovars (Paratyphi A, Typhimurium, and Enteritidis), obtaining good yields of high-quality material, suitable for conjugate vaccine preparations. PMID:23142430
Music information retrieval in compressed audio files: a survey
NASA Astrophysics Data System (ADS)
Zampoglou, Markos; Malamos, Athanasios G.
2014-07-01
In this paper, we present an organized survey of the existing literature on music information retrieval systems in which descriptor features are extracted directly from compressed audio files, without prior decompression to pulse-code modulation format. Avoiding the decompression step and utilizing the readily available compressed-domain information can significantly lighten the computational cost of a music information retrieval system, allowing application to large-scale music databases. We identify a number of systems relying on compressed-domain information and form a systematic classification of the features they extract, the retrieval tasks they tackle, and the degree to which they achieve an actual increase in overall speed, as well as any resulting loss in accuracy. Finally, we discuss recent developments in the field and the potential research directions they open toward ultra-fast, scalable systems.
Thiazoline peptides and a tris-phenethyl urea from Didemnum molle with anti-HIV activity.
Lu, Zhenyu; Harper, Mary Kay; Pond, Christopher D; Barrows, Louis R; Ireland, Chris M; Van Wagoner, Ryan M
2012-08-24
As part of our screening for anti-HIV agents from marine invertebrates, the MeOH extract of Didemnum molle was tested and showed moderate in vitro anti-HIV activity. Bioassay-guided fractionation of a large-scale extract allowed the identification of two new cyclopeptides, mollamides E and F (1 and 2), and one new tris-phenethyl urea, molleurea A (3). The absolute configurations were established using the advanced Marfey's method. The three compounds were evaluated for anti-HIV activity in both an HIV integrase inhibition assay and a cytoprotective cell-based assay. Compound 2 was active in both assays with IC(50) values of 39 and 78 μM, respectively. Compound 3 was active only in the cytoprotective cell-based assay, with an IC(50) value of 60 μM.
Using NLP to identify cancer cases in imaging reports drawn from radiology information systems.
Patrick, Jon; Asgari, Pooyan; Li, Min; Nguyen, Dung
2013-01-01
A Natural Language Processing (NLP) classifier has been developed for the Victorian and NSW Cancer Registries to automatically identify cancer reports from imaging services, transmit them to the Registries, and then extract pertinent cancer information. Large-scale trials conducted on over 40,000 reports show the sensitivity for identifying reportable cancer reports is above 98%, with a specificity above 96%. Detection of tumour stream, report purpose, and a variety of extracted content generally achieves specificity above 90%. The differences in report layout and authoring strategies across imaging services appear to require different classifiers to retain this high level of accuracy. Linkage of the imaging data with existing registry records (hospital and pathology reports) to derive stage and recurrence of cancer has commenced and shown very promising results.
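The sensitivity and specificity figures quoted above come straight from confusion-matrix arithmetic. The counts below are illustrative, chosen to land near the reported percentages, not the trial's actual numbers:

```python
# Hypothetical confusion-matrix counts for ~40,000 imaging reports.
tp, fn = 4900, 100     # reportable cancer reports: caught vs missed
tn, fp = 34000, 1000   # non-reportable reports: correctly passed vs wrongly flagged

sensitivity = tp / (tp + fn)   # fraction of true cancer reports identified
specificity = tn / (tn + fp)   # fraction of non-cancer reports correctly excluded

print(round(sensitivity, 2), round(specificity, 2))
```

For a registry pipeline the asymmetry matters: a false negative is a missed cancer case, so tuning favors sensitivity even at some cost in specificity.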
Janssen, Renske H; Vincken, Jean-Paul; van den Broek, Lambertus A M; Fogliano, Vincenzo; Lakemond, Catriona M M
2017-03-22
Insects are considered a nutritionally valuable source of alternative proteins, and their efficient protein extraction is a prerequisite for large-scale use. The protein content is usually calculated from total nitrogen using the nitrogen-to-protein conversion factor (Kp) of 6.25. This factor overestimates the protein content, due to the presence of nonprotein nitrogen in insects. In this paper, a specific Kp of 4.76 ± 0.09 was calculated for larvae from Tenebrio molitor, Alphitobius diaperinus, and Hermetia illucens, using amino acid analysis. After protein extraction and purification, a Kp factor of 5.60 ± 0.39 was found for the larvae of three insect species studied. We propose to adopt these Kp values for determining protein content of insects to avoid overestimation of the protein content.
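The conversion the paper revises is simple arithmetic: crude protein = total nitrogen × Kp. The sample nitrogen content below is a made-up illustration; the two Kp factors are those discussed above:

```python
# Hypothetical total nitrogen for a dried insect-larvae sample.
nitrogen_pct = 9.0                        # g N per 100 g dry matter (assumed)

protein_standard = nitrogen_pct * 6.25    # conventional Kjeldahl factor
protein_insect = nitrogen_pct * 4.76      # whole-larvae factor proposed above

# Relative overestimation from using 6.25 on whole larvae.
overestimate = protein_standard / protein_insect - 1.0
print(protein_standard, protein_insect, round(overestimate * 100))
```

Because nonprotein nitrogen (e.g. from chitin) inflates the total, the conventional factor overstates whole-larvae protein content by roughly 31% relative to the insect-specific factor.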