Sample records for the query "provide large amounts"

  1. Medical Malpractice Damage Caps and Provider Reimbursement.

    PubMed

    Friedson, Andrew I

    2017-01-01

    A common state legislative maneuver to combat rising healthcare costs is to reform the tort system by implementing caps on noneconomic damages awardable in medical malpractice cases. Using the implementation of caps in several states and a large database of private insurance claims, I estimate the effect of damage caps on the amount providers charge to insurance companies as well as the amount that insurance companies reimburse providers for medical services. The amount providers charge insurers is unresponsive to tort reform, but the amount that insurers reimburse providers decreases for some procedures. Copyright © 2015 John Wiley & Sons, Ltd.

  2. OpenWebGlobe 2: Visualization of Complex 3D-Geodata in the (Mobile) Webbrowser

    NASA Astrophysics Data System (ADS)

    Christen, M.

    2016-06-01

    Providing worldwide high-resolution data for virtual globes involves compute- and storage-intensive data processing tasks. Furthermore, rendering complex 3D-Geodata, such as 3D-City models with an extremely high polygon count and a vast amount of textures, at interactive framerates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used for processing large amounts of geospatial data and also for providing 2D and 3D map data to a large number of (mobile) web clients. The paper shows the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2", which displays 3D-Geodata on nearly every device.

  3. Gigwa-Genotype investigator for genome-wide analyses.

    PubMed

    Sempéré, Guilhem; Philippe, Florian; Dereeper, Alexis; Ruiz, Manuel; Sarah, Gautier; Larmande, Pierre

    2016-06-06

    Exploring the structure of genomes and analyzing their evolution is essential to understanding the ecological adaptation of organisms. However, with the large amounts of data being produced by next-generation sequencing, computational challenges arise in terms of storage, search, sharing, analysis and visualization. This is particularly true with regards to studies of genomic variation, which are currently lacking scalable and user-friendly data exploration solutions. Here we present Gigwa, a web-based tool that provides an easy and intuitive way to explore large amounts of genotyping data by filtering it not only on the basis of variant features, including functional annotations, but also on genotype patterns. The data storage relies on MongoDB, which offers good scalability properties. Gigwa can handle multiple databases and may be deployed in either single- or multi-user mode. In addition, it provides a wide range of popular export formats. The Gigwa application is suitable for managing large amounts of genomic variation data. Its user-friendly web interface makes such processing widely accessible. It can either be simply deployed on a workstation or be used to provide a shared data portal for a given community of researchers.
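
    The filtering described above maps naturally onto MongoDB query operators. Below is a minimal pymongo sketch combining a variant-feature filter with a genotype-pattern filter; the collection layout and field names are assumptions for illustration, not Gigwa's actual schema.

      # Hypothetical sketch of Gigwa-style variant filtering in MongoDB.
      # Collection and field names are invented for illustration.
      from pymongo import MongoClient

      client = MongoClient("mongodb://localhost:27017")
      variants = client["genotyping"]["variants"]

      # Filter on variant features (type, functional annotation) and on a
      # genotype pattern (at least one homozygous-alternate sample).
      query = {
          "type": "SNP",
          "annotation.effect": "missense_variant",
          "samples": {"$elemMatch": {"genotype": "1/1"}},
      }

      for variant in variants.find(query).limit(10):
          print(variant["_id"], variant.get("position"))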

  4. Distributed Processing of Projections of Large Datasets: A Preliminary Study

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    Modern information needs have resulted in very large amounts of data being used in geographic information systems. However, problems arise when trying to project these data with both reasonable speed and reasonable accuracy: current single-threaded methods suffer from a trade-off between fast projection with poor accuracy and accurate projection with long processing time. A possible solution may be to combine accurate interpolation methods with distributed processing algorithms to quickly and accurately convert digital geospatial data between coordinate systems. Modern technology has made it possible to construct systems, such as Beowulf clusters, at low cost that provide access to supercomputer-class performance. Combining these techniques may result in the ability to use large amounts of geographic data in time-critical situations.

  5. User Oriented Techniques to Support Interaction and Decision Making with Large Educational Databases

    ERIC Educational Resources Information Center

    Hartley, Roger; Almuhaidib, Saud M. Y.

    2007-01-01

    Information Technology is developing rapidly and providing policy/decision makers with large amounts of information that require processing and analysis. Decision support systems (DSS) aim to provide tools that not only help such analyses, but enable the decision maker to experiment and simulate the effects of different policies and selection…

  6. A MBD-seq protocol for large-scale methylome-wide studies with (very) low amounts of DNA.

    PubMed

    Aberg, Karolina A; Chan, Robin F; Shabalin, Andrey A; Zhao, Min; Turecki, Gustavo; Staunstrup, Nicklas Heine; Starnawska, Anna; Mors, Ole; Xie, Lin Y; van den Oord, Edwin Jcg

    2017-09-01

    We recently showed that, after optimization, our methyl-CpG binding domain sequencing (MBD-seq) application approximates the methylome-wide coverage obtained with whole-genome bisulfite sequencing (WGB-seq), but at a cost that enables adequately powered large-scale association studies. A prior drawback of MBD-seq is the relatively large amount of genomic DNA (ideally >1 µg) required to obtain high-quality data. Biomaterials are typically expensive to collect, provide a finite amount of DNA, and may simply not yield sufficient starting material. The ability to use low amounts of DNA will increase the breadth and number of studies that can be conducted. Therefore, we further optimized the enrichment step. With this low starting material protocol, MBD-seq performed equally well, or better, than the protocol requiring ample starting material (>1 µg). Using only 15 ng of DNA as input, there is minimal loss in data quality, achieving 93% of the coverage of WGB-seq (with standard amounts of input DNA) at similar false-positive rates. Furthermore, across a large number of genomic features, the MBD-seq methylation profiles closely tracked those observed for WGB-seq with even slightly larger effect sizes. This suggests that MBD-seq provides similar information about the methylome and classifies methylation status somewhat more accurately. Performance decreases with <15 ng DNA as starting material but, even with as little as 5 ng, MBD-seq still achieves 90% of the coverage of WGB-seq with comparable genome-wide methylation profiles. Thus, the proposed protocol is an attractive option for adequately powered and cost-effective methylome-wide investigations using (very) low amounts of DNA.

  7. Managing Materials and Wastes for Homeland Security Incidents

    EPA Pesticide Factsheets

    To provide information on waste management planning and preparedness before a homeland security incident, including preparing for the large amounts of waste that would need to be managed when an incident occurs, such as a large-scale natural disaster.

  8. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on-demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  9. Signal and image processing algorithm performance in a virtual and elastic computing environment

    NASA Astrophysics Data System (ADS)

    Bennett, Kelly W.; Robertson, James

    2013-05-01

    The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development, and the resulting data volume and its associated high-performance computing needs increasingly challenge existing computing infrastructures. Purchasing computer power as a commodity using a cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions to develop and optimize algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results provide performance comparisons with existing infrastructure. A discussion of using cloud computing with government data addresses the best security practices that exist within cloud services, such as AWS.

  10. Sidewall-box airlift pump provides large flows for aeration, CO2 stripping, and water rotation in large dual-drain circular tanks

    USDA-ARS?s Scientific Manuscript database

    Conventional gas transfer technologies for aquaculture systems occupy a large amount of space, require a considerable capital investment, and can contribute to high electricity demand. In addition, diffused aeration in a circular culture tank can interfere with the hydrodynamics of water rotation a...

  11. Autonomous Object Characterization with Large Datasets

    DTIC Science & Technology

    2015-10-18

    desk, where a substantial amount of effort is required to transform raw photometry into a data product, minimizing the amount of time the analyst has...were used to explore concepts in satellite characterization and satellite state change. The first algorithm provides real-time stability estimation... Timely and effective space object (SO) characterization is a challenge, and requires advanced data processing techniques. Detection and identification

  12. Image acquisition in the Pi-of-the-Sky project

    NASA Astrophysics Data System (ADS)

    Jegier, M.; Nawrocki, K.; Poźniak, K.; Sokołowski, M.

    2006-10-01

    Modern astronomical image acquisition systems dedicated to sky surveys provide large amounts of data in a single measurement session. During one session that lasts a few hours it is possible to collect as much as 100 GB of data, which must be transferred from the camera and processed. This paper presents some aspects of image acquisition in a sky survey image acquisition system. It describes a dedicated USB Linux driver for the first version of the "Pi of the Sky" CCD camera (later versions also have an Ethernet interface) and the test program for the camera, together with a driver-wrapper providing core device functionality. Finally, the paper contains a description of an algorithm for matching several images based on image features, i.e. star positions and their brightness.
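
    As a rough illustration of the image-matching idea based on star positions and brightness, the toy sketch below estimates a translation between two frames from their brightest stars; it assumes a pure translation and rank-stable brightness, and it is not the project's actual algorithm.

      # Toy sketch: estimate the (dx, dy) shift between two star lists.
      # Assumes the frames differ by a pure translation and that the same
      # bright stars are detected in both; not the Pi of the Sky algorithm.
      import numpy as np

      def estimate_shift(stars_a, stars_b, n_bright=20):
          """stars_*: arrays of (x, y, brightness) rows."""
          # Keep the brightest stars, sorted by brightness rank.
          a = stars_a[np.argsort(stars_a[:, 2])[::-1][:n_bright], :2]
          b = stars_b[np.argsort(stars_b[:, 2])[::-1][:n_bright], :2]
          # Pair stars by rank; the median displacement is robust to a
          # few rank swaps or spurious detections.
          diffs = b - a
          return np.median(diffs[:, 0]), np.median(diffs[:, 1])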

  13. Models of resource planning during formation of calendar construction plans for erection of high-rise buildings

    NASA Astrophysics Data System (ADS)

    Pocebneva, Irina; Belousov, Vadim; Fateeva, Irina

    2018-03-01

    This article provides a methodical description of resource-time analysis for a wide range of requirements imposed on resource consumption processes in scheduling tasks during the construction of high-rise buildings and facilities. The core of the proposed approach is the set of resource models to be determined. Generalized network models are the elements of those models, and their number can be too large for each element to be analyzed individually. Therefore, the problem is to approximate the original resource model by simpler time models whose number is not very large.

  14. Treating mature stands for wildlife

    Treesearch

    William H. Healy; Gary F. Houf

    1989-01-01

    Stands older than 60 years or that are medium to large sawtimber size generally provide good wildlife habitat. Mature trees usually produce abundant mast and provide den sites (see fig. 1 in Note 9.04 Treating Immature Stands). The undergrowth in these stands produces moderate amounts of browse and herbage. Mature stands also provide opportunities for management...

  15. Geophysical data base

    NASA Technical Reports Server (NTRS)

    Williamson, M. R.; Kirschner, L. R.

    1975-01-01

    A general data-management system that provides a random-access capability for large amounts of data is described. The system operates on a CDC 6400 computer using a combination of magnetic tape and disk storage. A FORTRAN subroutine package is provided to simplify the maintenance and use of the data.

  16. 32 CFR 220.8 - Reasonable charges.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... standardized amounts, one for large urban, other urban/rural, and overseas area, utilizing the same... certain medications when these services are provided in a separate immunizations or shot clinic, are based...

  17. 32 CFR 220.8 - Reasonable charges.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... standardized amounts, one for large urban, other urban/rural, and overseas area, utilizing the same... certain medications when these services are provided in a separate immunizations or shot clinic, are based...

  18. 32 CFR 220.8 - Reasonable charges.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... standardized amounts, one for large urban, other urban/rural, and overseas area, utilizing the same... certain medications when these services are provided in a separate immunizations or shot clinic, are based...

  19. 32 CFR 220.8 - Reasonable charges.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... standardized amounts, one for large urban, other urban/rural, and overseas area, utilizing the same... certain medications when these services are provided in a separate immunizations or shot clinic, are based...

  20. Rose Hip

    MedlinePlus

    ... with your health provider. Aspirin: The body breaks down aspirin to get rid of it. Rose hip contains ... of vitamin C might decrease the breakdown of aspirin. Taking large amounts of rose hip along with ...

  1. Effects of Large Impacts on Mars: Implications for River Formation

    NASA Technical Reports Server (NTRS)

    Segura, T. L.; Toon, O. B.; Colaprete, A.; Zahnle, K.

    2002-01-01

    The Martian crater record provides ample evidence of the impacts of large (> 100 km) objects. These objects create hot global debris layers meters or more in depth, cause long term warming, and are capable of melting and precipitating a significant amount of water globally. Additional information is contained in the original extended abstract.

  2. Providing Author-Defined State Data Storage to Learning Objects

    ERIC Educational Resources Information Center

    Kassahun, Ayalew; Beulens, Adrie; Hartog, Rob

    2006-01-01

    Two major trends in eLearning are the shift from presentational towards activating learning objects and the shift from proprietary towards SCORM conformant delivery systems. In a large program on the design, development and use of digital learning material for food and biotechnology in higher education, a large amount of experience has been gained…

  3. A Semantic Graph Query Language

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaplan, I L

    2006-10-16

    Semantic graphs can be used to organize large amounts of information from a number of sources into one unified structure. A semantic query language provides a foundation for extracting information from the semantic graph. The graph query language described here provides a simple, powerful method for querying semantic graphs.
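
    The paper's query language itself is not reproduced here, but the graph-matching operation any such language builds on can be shown with a minimal triple-pattern matcher; the data and names below are invented for the example.

      # Minimal illustration of graph matching over a triple store.
      # Data and names are invented; this is not the paper's language.
      triples = [
          ("alice", "knows", "bob"),
          ("bob", "knows", "carol"),
          ("bob", "worksAt", "acme"),
      ]

      def match(pattern, triples):
          """Match one (s, p, o) pattern; None acts as a variable."""
          for t in triples:
              if all(q is None or q == v for q, v in zip(pattern, t)):
                  yield t

      # "Whom does bob know?" corresponds to the pattern (bob, knows, ?x).
      print(list(match(("bob", "knows", None), triples)))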

  4. Planetary Surface Visualization and Analytics

    NASA Astrophysics Data System (ADS)

    Law, E. S.; Solar System Treks Team

    2018-04-01

    An introduction to and update on the Solar System Treks Project, which provides a suite of interactive visualization and analysis tools enabling users (engineers, scientists, and the public) to access large amounts of mapped planetary data products.

  5. Chinese-American Parents' Perspectives about Using the Internet to Access Information for Children with Special Needs

    ERIC Educational Resources Information Center

    Zeng, Songtian; Cheatham, Gregory A.

    2017-01-01

    As the Internet contains large amounts of health- and education-related information, it provides a potentially efficient and affordable format for directly reaching a large number of families with evidence-based health- and education-related information for their children with disabilities. Little is known, however, about Internet…

  6. Optimal conditions for elution of hepatitis B antigen after adsorption onto colloidal silica.

    PubMed Central

    Pillot, J; Goueffon, S; Keros, R G

    1976-01-01

    Hepatitis B surface antigen (HBSAg) adsorbed from sera onto colloidal silica could be completely eluted through the use of 0.25% sodium deoxycholate in 0.01 M borax, pH 9.3, at 56 degrees C. The HBSAg recovered in the eluate represented 100% of that present in the original serum, and it was contaminated by only trace amounts of serum proteins (in decreasing amounts: beta-lipoprotein, immunoglobulin G, albumin). This preliminary step greatly facilitates purification of large amounts of HBSAg and provides small volumes of highly concentrated material for subsequent purification by density gradient centrifugation. PMID:9423

  7. A Case Study of the Uses Supported by Higher Education Computer Networks and an Analysis of Application Traffic

    ERIC Educational Resources Information Center

    Pisano, Mark

    2017-01-01

    Universities and Higher Education Institutions spend large sums of money to maintain and build network infrastructures. Current research and discussions in this area revolve around providing large amounts of bandwidth to students who live in a residence hall. However, there is a lack of information showing what is being used to support research…

  8. Bluetooth-based travel time/speed measuring systems development.

    DOT National Transportation Integrated Search

    2010-06-01

    Agencies in the Houston region have traditionally used toll tag readers to provide travel times on : freeways and High Occupancy Vehicle (HOV) lanes, but these systems require large amounts of costly and : physically invasive infrastructure. Bluetoot...

  9. The effect of spin in swing bowling in cricket: model trajectories for spin alone

    NASA Astrophysics Data System (ADS)

    Robinson, Garry; Robinson, Ian

    2015-02-01

    In ‘swing’ bowling, as employed by fast and fast-medium bowlers in cricket, back-spin along the line of the seam is normally applied in order to keep the seam vertical and to provide stability against ‘wobble’ of the seam. Whilst spin is normally thought of as primarily being the slow bowler's domain, the spin applied by the swing bowler has the side-effect of generating a lift or Magnus force. This force, depending on the orientation of the seam and hence that of the back-spin, can have a side-ways component as well as the expected vertical ‘lift’ component. The effect of the spin itself, in influencing the trajectory of the fast bowler's delivery, is normally not considered, presumably being thought of as negligible. The purpose of this paper is to investigate, using calculated model trajectories, the amount of side-ways movement due to the spin and to see how this predicted movement compares with the total observed side-ways movement. The size of the vertical lift component is also estimated. It is found that, although the spin is an essential part of the successful swing bowler's delivery, the amount of side-ways movement due to the spin itself amounts to a few centimetres or so, and is therefore small, but perhaps not negligible, compared to the total amount of side-ways movement observed. The spin does, however, provide a considerable amount of lift compared to the equivalent delivery bowled without spin, altering the point of pitching by up to 3 m, a very large amount indeed. Thus, for example, bowling a ball with the seam pointing directly down the pitch and not designed to swing side-ways at all, but with the amount of back-spin varied, could provide a very powerful additional weapon in the fast bowler's arsenal. So-called ‘sling bowlers’, who use a very low arm action, can take advantage of spin since effectively they can apply side-spin to the ball, giving rise to a large side-ways movement, ~20 cm or more, which certainly is significant. For a given amount of spin the amount of side-ways movement increases as the bowler's delivery arm becomes more horizontal. This technique could also be exploited by normal spin bowlers as well as swing bowlers.
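
    Model trajectories of this kind can be reproduced by numerically integrating gravity, drag, and a Magnus (spin) term. The sketch below uses illustrative coefficients and release conditions, not the paper's values; setting Cl to zero in the same run shows the spin-induced shift of the pitching point that the paper quantifies.

      # Minimal cricket-ball trajectory with drag and a Magnus force.
      # Coefficients and initial conditions are illustrative assumptions.
      import numpy as np

      m, r = 0.156, 0.036           # ball mass (kg) and radius (m)
      rho = 1.2                     # air density (kg/m^3)
      A = np.pi * r ** 2            # cross-sectional area (m^2)
      Cd, Cl = 0.4, 0.2             # assumed drag and lift coefficients
      g = np.array([0.0, 0.0, -9.81])

      def accel(v, spin_axis):
          speed = np.linalg.norm(v)
          drag = -0.5 * rho * A * Cd * speed * v / m
          # The Magnus force acts along spin_axis x velocity.
          magnus = 0.5 * rho * A * Cl * speed * np.cross(spin_axis, v) / m
          return g + drag + magnus

      # Euler integration from release until the ball pitches.
      p = np.array([0.0, 0.0, 2.2])     # release point (m)
      v = np.array([0.0, 38.0, -2.0])   # velocity (m/s), bowled along +y
      spin = np.array([1.0, 0.0, 0.0])  # unit spin axis: pure back-spin
      dt = 1e-3
      while p[2] > 0.0 and p[1] < 20.0:
          p, v = p + v * dt, v + accel(v, spin) * dt
      print("pitches at y =", round(p[1], 2), "m")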

  10. Large-scale retrieval for medical image analytics: A comprehensive review.

    PubMed

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges/opportunities of medical image analytics on a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena

    PubMed Central

    Tohsato, Yukako; Ho, Kenneth H. L.; Kyoda, Koji; Onami, Shuichi

    2016-01-01

    Motivation: Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. Results: We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. Availability and Implementation: SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp PMID:27412095

  12. SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena.

    PubMed

    Tohsato, Yukako; Ho, Kenneth H L; Kyoda, Koji; Onami, Shuichi

    2016-11-15

    Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp. © The Author 2016. Published by Oxford University Press.

  13. The Role of Popular Girls in Bullying and Intimidating Boys and Other Popular Girls in Secondary School

    ERIC Educational Resources Information Center

    Dytham, Siobhan

    2018-01-01

    Despite a large amount of research focusing on bullying and exclusion in secondary schools, there is far less research focusing on cross-gender bullying and 'popular' students who experience bullying. This research provides an analysis of interactions between male and female students (aged 13-14) in a school in England. The data provides multiple…

  14. The Impact of Pre-Kindergarten Programs on Student Achievement in Mississippi Elementary Schools

    ERIC Educational Resources Information Center

    Harges, Fletcher B.

    2017-01-01

    Each of the states bordering Mississippi invests large amounts of money in providing children with state-funded pre-k programs in their public schools. However, Mississippi falls behind these states and does not similarly invest in this effort to provide many of its children with the opportunity to attend state-funded pre-k programs. Because…

  15. Influence of the clearance on in-vitro tribology of large diameter metal-on-metal articulations pertaining to resurfacing hip implants.

    PubMed

    Rieker, Claude B; Schön, Rolf; Konrad, Reto; Liebentritt, Gernot; Gnepf, Patric; Shen, Ming; Roberts, Paul; Grigoris, Peter

    2005-04-01

    Large-diameter metal-on-metal articulations may provide an opportunity for wear reduction in total hip implants because earlier studies have shown that the formation of a fluid film that completely separates the bearing surfaces is theoretically possible. In such a lubrication mode and under ideal conditions, theoretically no wear occurs. Studies have suggested that the two primary parameters controlling the lubrication mode are the diameter and the clearance of the articulation. The goal of the present study was to experimentally investigate the influence of these two parameters on the wear behavior of large-diameter metal-on-metal articulations pertaining to resurfacing hip implants. The results of this in vitro investigation showed that longer running-in periods and higher amounts of running-in wear were associated with larger clearances.

  16. How to leverage a bad inventory situation.

    PubMed

    Horsfall, G A

    1998-11-01

    Small manufacturing companies have a hard time taking advantage of the price breaks that result from large purchase orders. Besides the greater amount of money involved, purchasing large quantities of items demands additional space for storing them. This article describes a company that created a separate inventory management and finance company to provide inventory management services to itself and to market these services to other small companies in its area.

  17. Lightweight Integrated Solar Array (LISA): Providing Higher Power to Small Spacecraft

    NASA Technical Reports Server (NTRS)

    Johnson, Les; Carr, John; Fabisinski, Leo; Lockett, Tiffany Russell

    2015-01-01

    Affordable and convenient access to electrical power is essential for all spacecraft and is a critical design driver for the next generation of smallsats, including CubeSats, which are currently extremely power limited. The Lightweight Integrated Solar Array (LISA), a concept designed, prototyped, and tested at the NASA Marshall Space Flight Center (MSFC) in Huntsville, Alabama provides an affordable, lightweight, scalable, and easily manufactured approach for power generation in space. This flexible technology has many wide-ranging applications from serving small satellites to providing abundant power to large spacecraft in GEO and beyond. By using very thin, ultraflexible solar arrays adhered to an inflatable or deployable structure, a large area (and thus large amount of power) can be folded and packaged into a relatively small volume.

  18. Design of a "Digital Atlas Vme Electronics" (DAVE) module

    NASA Astrophysics Data System (ADS)

    Goodrick, M.; Robinson, D.; Shaw, R.; Postranecky, M.; Warren, M.

    2012-01-01

    ATLAS-SCT has developed a new ATLAS trigger card, "Digital Atlas Vme Electronics" ("DAVE"). The unit is designed to provide a versatile array of interface and logic resources, including a large FPGA. It interfaces to both VME bus and USB hosts. DAVE aims to provide exact ATLAS CTP (ATLAS Central Trigger Processor) functionality, with random trigger, simple and complex deadtime, ECR (Event Counter Reset), BCR (Bunch Counter Reset) etc. being generated to give exactly the same conditions in standalone running as experienced in combined runs. DAVE provides additional hardware and a large amount of free firmware resources to allow users to add or change functionality. The combination of the large number of individually programmable inputs and outputs in various formats with a very large external RAM and other components, all connected to the FPGA, also makes DAVE a powerful and versatile FPGA utility card.

  19. Integrated sequencing of exome and mRNA of large-sized single cells.

    PubMed

    Wang, Lily Yan; Guo, Jiajie; Cao, Wei; Zhang, Meng; He, Jiankui; Li, Zhoufang

    2018-01-10

    Current approaches to single-cell DNA-RNA integrated sequencing make it difficult to call SNPs, because a large amount of DNA and RNA is lost during DNA-RNA separation. Here, we performed simultaneous single-cell exome and transcriptome sequencing on individual mouse oocytes. Using microinjection, we kept the nuclei intact to avoid DNA loss, while retaining the cytoplasm inside the cell membrane, to maximize the amount of DNA and RNA captured from the single cell. We then conducted exome sequencing on the isolated nuclei and mRNA sequencing on the enucleated cytoplasm. For single oocytes, exome-seq can cover up to 92% of the exome region with an average sequencing depth of 10+, while mRNA sequencing reveals more than 10,000 expressed genes in the enucleated cytoplasm, with similar performance for intact oocytes. This approach provides unprecedented opportunities to study DNA-RNA regulation, such as RNA editing at the single-nucleotide level in oocytes. In the future, this method can also be applied to other large cells, including neurons, large dendritic cells and large tumour cells, for integrated exome and transcriptome sequencing.

  20. A Discretization Algorithm for Meteorological Data and its Parallelization Based on Hadoop

    NASA Astrophysics Data System (ADS)

    Liu, Chao; Jin, Wen; Yu, Yuting; Qiu, Taorong; Bai, Xiaoming; Zou, Shuilong

    2017-10-01

    Meteorological observation data are voluminous, have many attributes whose values are continuous, and the correlations between elements matter for applications of the data. This paper is devoted to the problem of how to better discretize large meteorological datasets so that the knowledge hidden in them can be mined more effectively, and to improving discretization algorithms for large-scale data so that subsequent analysis of large meteorological data is well supported. A discretization algorithm based on information entropy and the inconsistency of meteorological attributes is proposed, and the algorithm is parallelized on the Hadoop platform. Finally, comparison tests validate the effectiveness of the proposed algorithm for discretization of large meteorological data.
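
    The entropy core of such a discretization can be sketched compactly: choose the cut point that minimizes the weighted class-label entropy of the two resulting intervals. The sketch below shows only this core on a toy attribute; the paper's full method also weighs attribute inconsistency and runs in parallel on Hadoop.

      # Entropy-based choice of a single cut point for one continuous
      # attribute (toy version; the full algorithm adds an inconsistency
      # measure and Hadoop parallelization).
      import math
      from collections import Counter

      def entropy(labels):
          n = len(labels)
          return -sum(c / n * math.log2(c / n)
                      for c in Counter(labels).values())

      def best_cut(values, labels):
          """Boundary minimizing weighted entropy of the two halves."""
          pairs = sorted(zip(values, labels))
          best = (float("inf"), None)
          for i in range(1, len(pairs)):
              left = [l for _, l in pairs[:i]]
              right = [l for _, l in pairs[i:]]
              w = (len(left) * entropy(left)
                   + len(right) * entropy(right)) / len(pairs)
              cut = (pairs[i - 1][0] + pairs[i][0]) / 2
              best = min(best, (w, cut))
          return best[1]

      print(best_cut([1, 2, 3, 10, 11, 12], list("aaabbb")))  # -> 6.5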

  1. Ion Heating During Local Helicity Injection Plasma Startup in the Pegasus ST

    NASA Astrophysics Data System (ADS)

    Burke, M. G.; Barr, J. L.; Bongard, M. W.; Fonck, R. J.; Hinson, E. T.; Perry, J. M.; Reusch, J. A.

    2015-11-01

    Plasmas in the Pegasus ST are initiated either through standard, MHD stable, inductive current drive or non-solenoidal local helicity injection (LHI) current drive with strong reconnection activity, providing a rich environment to study ion dynamics. During LHI discharges, a large amount of impurity ion heating has been observed, with the passively measured impurity Ti as high as 800 eV compared to Ti ~ 60 eV and Te ~ 175 eV during standard inductive current drive discharges. In addition, non-thermal ion velocity distributions are observed and appear to be strongest near the helicity injectors. The ion heating is hypothesized to be a result of large-scale magnetic reconnection activity, as the amount of heating scales with increasing fluctuation amplitude of the dominant, edge localized, n =1 MHD mode. An approximate temporal scaling of the heating with the amplitude of higher frequency magnetic fluctuations has also been observed, with large amounts of power spectral density present at several impurity ion cyclotron frequencies. Recent experiments have focused on investigating the impurity ion heating scaling with the ion charge to mass ratio as well as the reconnecting field strength. The ion charge to mass ratio was modified by observing different impurity charge states in similar LHI plasmas while the reconnecting field strength was modified by changing the amount of injected edge current. Work supported by US DOE grant DE-FG02-96ER54375.

  2. Evaluation of a Viscosity-Molecular Weight Relationship.

    ERIC Educational Resources Information Center

    Mathias, Lon J.

    1983-01-01

    Background information, procedures, and results are provided for a series of graduate/undergraduate polymer experiments. These include synthesis of poly(methyl methacrylate), a viscosity experiment (indicating the large effect that even small amounts of a polymer may have on solution properties), and measurement of weight-average molecular weight by light…

  3. More Bucks for Your Bang: Selective Fund Raising Pays Off.

    ERIC Educational Resources Information Center

    Buchanan, J. Scott

    1981-01-01

    Austin College's approach to raising large amounts of capital funds with severe limits on number of staff, size of budget, and available time is presented. Some suggestions on locating prospects, cultivating donors, involving the president and trustees and thanking donors are provided. (MLW)

  4. Indigenous Digital Collections

    ERIC Educational Resources Information Center

    Nakata, N. M.

    2007-01-01

    The intersection of public institutions managing large amounts of information and knowledge and new information and communication technologies has brought forward exciting and innovative changes to the ways information and knowledge have been traditionally managed. This paper provides a brief snapshot of some of the key issues facing the library…

  5. Nitrogen in agricultural systems: Implications for conservation policy

    USDA-ARS?s Scientific Manuscript database

    Nitrogen is an important agricultural input that is critical for providing food to feed a growing world population. However, the introduction of large amounts of reactive nitrogen into the environment has a number of undesirable impacts on water, terrestrial, and atmospheric resources. Careful manage...

  6. Analyzing large-scale spiking neural data with HRLAnalysis™

    PubMed Central

    Thibeault, Corey M.; O'Brien, Michael J.; Srinivasa, Narayan

    2014-01-01

    The additional capabilities provided by high-performance neural simulation environments and modern computing hardware have allowed for the modeling of increasingly large spiking neural networks. This is important for exploring more anatomically detailed networks, but the corresponding accumulation of data can make analyzing the results of these simulations difficult. This is further compounded by the fact that many existing analysis packages were not developed with large spiking data sets in mind. Presented here is a software suite developed not only to process the increased amount of spike-train data in a reasonable amount of time, but also to provide a user-friendly Python interface. We describe the design considerations, implementation and features of the HRLAnalysis™ suite. In addition, performance benchmarks demonstrating the speedup of this design compared to a published Python implementation are also presented. The result is a high-performance analysis toolkit that is not only usable and readily extensible, but also straightforward to interface with existing Python modules. PMID:24634655
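
    As a small illustration of the kind of spike-train post-processing such a suite performs, the sketch below bins spike times into a population firing-rate histogram with NumPy; it is illustrative only and not the HRLAnalysis API.

      # Bin spike times into a population firing-rate histogram.
      # Illustrative only; this is not the HRLAnalysis interface.
      import numpy as np

      def population_rate(spike_times_s, n_neurons, bin_s=0.01, t_end_s=1.0):
          """Mean firing rate (Hz) per time bin across the population."""
          bins = np.arange(0.0, t_end_s + bin_s, bin_s)
          counts, _ = np.histogram(spike_times_s, bins=bins)
          return counts / (n_neurons * bin_s)

      spikes = np.random.uniform(0.0, 1.0, size=50_000)  # 1 s of fake data
      print(population_rate(spikes, n_neurons=1_000)[:5])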

  7. Composing Data Parallel Code for a SPARQL Graph Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Villa, Oreste

    Big data analytics processes large amounts of data to extract knowledge from them. Semantic databases are big data applications that adopt the Resource Description Framework (RDF) to structure metadata through a graph-based representation. The graph-based representation provides several benefits, such as the possibility of performing in-memory processing with large amounts of parallelism. SPARQL is a language used to perform queries on RDF-structured data through graph matching. In this paper we present a tool that automatically translates SPARQL queries to parallel graph crawling and graph matching operations. The tool also supports complex SPARQL constructs, which require more than basic graph matching for their implementation. The tool generates parallel code annotated with OpenMP pragmas for x86 shared-memory multiprocessors (SMPs). With respect to commercial database systems such as Virtuoso, our approach reduces the memory occupation due to join operations and provides higher performance. We show the scaling of the automatically generated graph-matching code on a 48-core SMP.
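
    The decomposition described, partitioning the triple data and matching patterns over the partitions in parallel, can be mimicked in a few lines of Python; the tool itself emits OpenMP-annotated C, so the sketch below only illustrates the strategy.

      # Sketch of data-parallel triple matching over partitions; the
      # actual tool generates OpenMP-annotated C, not Python.
      from concurrent.futures import ProcessPoolExecutor
      from itertools import chain

      def match_chunk(args):
          pattern, chunk = args
          return [t for t in chunk
                  if all(q is None or q == v for q, v in zip(pattern, t))]

      def parallel_match(pattern, triples, workers=4):
          n = max(1, len(triples) // workers)
          chunks = [triples[i:i + n] for i in range(0, len(triples), n)]
          with ProcessPoolExecutor(workers) as pool:
              results = pool.map(match_chunk,
                                 [(pattern, c) for c in chunks])
          return list(chain.from_iterable(results))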

  8. Social modeling effects on young women's breakfast intake.

    PubMed

    Hermans, Roel C J; Herman, C Peter; Larsen, Junilla K; Engels, Rutger C M E

    2010-12-01

    Numerous studies have shown that the presence of others influences young women's food intake. They eat more when the other eats more, and eat less when the other eats less. However, most of these studies have focused on snack situations. The present study assesses the degree to which young women model the breakfast intake of a same-sex peer in a semi-naturalistic setting. The study took place in a laboratory setting at the Radboud University Nijmegen, the Netherlands, during the period January to April 2009. After completing three cover tasks, normal-weight participants (n=57) spent a 20-minute break with a peer who ate a large amount or a small amount of breakfast or no breakfast at all. The participants' total amount of energy consumed (in kilocalories) during the break was measured. An analysis of variance was used to examine whether young women modeled the breakfast intake of same-sex peers. Results indicate a main effect of breakfast condition, F(2,54)=8.44; P<0.01. Participants exposed to a peer eating nothing ate less than did participants exposed to a peer eating a small amount (d=0.85) or large amount of breakfast (d=1.23). Intake in the Small-Breakfast condition did not differ substantially from intake in the Large-Breakfast condition. The findings from the present study provide evidence that modeling effects of food intake are weaker in eating contexts in which scripts or routines guide an individual's eating behavior. Copyright © 2010 American Dietetic Association. Published by Elsevier Inc. All rights reserved.

  9. Geo-reCAPTCHA: Crowdsourcing large amounts of geographic information from earth observation data

    NASA Astrophysics Data System (ADS)

    Hillen, Florian; Höfle, Bernhard

    2015-08-01

    The reCAPTCHA concept provides a large amount of valuable information for various applications. First, it provides security, e.g., for a form on a website, by means of a test that only a human could solve. Second, the effort the user spends on this test is used to generate additional information, e.g., digitization of books or identification of house numbers. In this work, we present a concept for adapting the reCAPTCHA idea to create user-generated geographic information from earth observation data, and we describe the requirements arising during conception and implementation in detail. Furthermore, the essential parts of a Geo-reCAPTCHA system are described and then transferred to a prototype implementation. An empirical user study is conducted to investigate the Geo-reCAPTCHA approach, assessing the time taken and the quality of the resulting geographic information. Our results show that a Geo-reCAPTCHA on building digitization can be solved by the users of our study in a short amount of time (19.2 s on average) with an overall average digitization accuracy of 82.2%. In conclusion, Geo-reCAPTCHA has the potential to be a reasonable alternative to the typical reCAPTCHA, and to become a new data-rich channel of crowdsourced geographic information.

  10. Exploring the Amount and Type of Writing Instruction during Language Arts Instruction in Kindergarten Classrooms

    PubMed Central

    Puranik, Cynthia S.; Al Otaiba, Stephanie; Sidler, Jessica Folsom; Greulich, Luana

    2014-01-01

    The objective of this exploratory investigation was to examine the nature of writing instruction in kindergarten classrooms and to describe student writing outcomes at the end of the school year. Participants for this study included 21 teachers and 238 kindergarten children from nine schools. Classroom teachers were videotaped once each in the fall and winter during the 90 minute instructional block for reading and language arts to examine time allocation and the types of writing instructional practices taking place in the kindergarten classrooms. Classroom observation of writing was divided into student-practice variables (activities in which students were observed practicing writing or writing independently) and teacher-instruction variables (activities in which the teacher was observed providing direct writing instruction). In addition, participants completed handwriting fluency, spelling, and writing tasks. Large variability was observed in the amount of writing instruction occurring in the classroom, the amount of time kindergarten teachers spent on writing and in the amount of time students spent writing. Marked variability was also observed in classroom practices both within and across schools and this fact was reflected in the large variability noted in kindergartners’ writing performance. PMID:24578591

  11. Exploring the Amount and Type of Writing Instruction during Language Arts Instruction in Kindergarten Classrooms.

    PubMed

    Puranik, Cynthia S; Al Otaiba, Stephanie; Sidler, Jessica Folsom; Greulich, Luana

    2014-02-01

    The objective of this exploratory investigation was to examine the nature of writing instruction in kindergarten classrooms and to describe student writing outcomes at the end of the school year. Participants for this study included 21 teachers and 238 kindergarten children from nine schools. Classroom teachers were videotaped once each in the fall and winter during the 90 minute instructional block for reading and language arts to examine time allocation and the types of writing instructional practices taking place in the kindergarten classrooms. Classroom observation of writing was divided into student-practice variables (activities in which students were observed practicing writing or writing independently) and teacher-instruction variables (activities in which the teacher was observed providing direct writing instruction). In addition, participants completed handwriting fluency, spelling, and writing tasks. Large variability was observed in the amount of writing instruction occurring in the classroom, the amount of time kindergarten teachers spent on writing and in the amount of time students spent writing. Marked variability was also observed in classroom practices both within and across schools and this fact was reflected in the large variability noted in kindergartners' writing performance.

  12. Statistical Challenges in "Big Data" Human Neuroimaging.

    PubMed

    Smith, Stephen M; Nichols, Thomas E

    2018-01-17

    Smith and Nichols discuss "big data" human neuroimaging studies, with very large subject numbers and amounts of data. These studies provide great opportunities for making new discoveries about the brain but raise many new analytical challenges and interpretational risks. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. A mass storage system for supercomputers based on Unix

    NASA Technical Reports Server (NTRS)

    Richards, J.; Kummell, T.; Zarlengo, D. G.

    1988-01-01

    The authors present the design, implementation, and utilization of a large mass storage subsystem (MSS) for the numerical aerodynamics simulation. The MSS supports a large networked, multivendor Unix-based supercomputing facility. The MSS at Ames Research Center provides all processors on the numerical aerodynamics system processing network, from workstations to supercomputers, the ability to store large amounts of data in a highly accessible, long-term repository. The MSS uses Unix System V and is capable of storing hundreds of thousands of files ranging from a few bytes to 2 Gb in size.

  14. Efficient multifeature index structures for music data retrieval

    NASA Astrophysics Data System (ADS)

    Lee, Wegin; Chen, Arbee L. P.

    1999-12-01

    In this paper, we propose four index structures for music data retrieval. Based on suffix trees, we first develop two index structures called the combined suffix tree and the independent suffix trees. These methods still show shortcomings for some search functions, so we develop another index, called the Twin Suffix Trees, to overcome these problems. However, the Twin Suffix Trees lack scalability when the amount of music data becomes large. We therefore propose a fourth index, called the Grid-Twin Suffix Trees, to provide scalability and flexibility for a large amount of music data. Each index supports different search functions, such as exact search and approximate search, on different music features, such as melody, rhythm or both. We compare the performance of the different search functions applied to each index structure in a series of experiments.
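
    The exact-search function these indexes support can be illustrated with a plain suffix array over a melody encoded as a pitch string (the bisect key argument needs Python 3.10+); the real indexes additionally cover rhythm and approximate matching.

      # Exact melody search with a suffix array (requires Python 3.10+
      # for bisect's key argument). Simplified stand-in for the paper's
      # suffix-tree indexes, which also handle rhythm and fuzzy search.
      from bisect import bisect_left, bisect_right

      def build_suffix_array(s):
          return sorted(range(len(s)), key=lambda i: s[i:])

      def find(pattern, s, sa):
          key = lambda i: s[i:i + len(pattern)]
          lo = bisect_left(sa, pattern, key=key)
          hi = bisect_right(sa, pattern, key=key)
          return sorted(sa[lo:hi])  # start positions of all occurrences

      melody = "CDEFGCDEFG"              # pitches encoded as characters
      sa = build_suffix_array(melody)
      print(find("DEF", melody, sa))    # -> [1, 6]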

  15. Characterization and visualization of RNA secondary structure Boltzmann ensemble via information theory.

    PubMed

    Lin, Luan; McKerrow, Wilson H; Richards, Bryce; Phonsom, Chukiat; Lawrence, Charles E

    2018-03-05

    The nearest neighbor model and associated dynamic programming algorithms allow for the efficient estimation of the RNA secondary structure Boltzmann ensemble. However, because a given RNA secondary structure only contains a fraction of the possible helices that could form from a given sequence, the Boltzmann ensemble is multimodal. Several methods exist for clustering structures and finding those modes. However, less focus is given to exploring the underlying reason for this multimodality: the presence of conflicting basepairs. Information theory, or more specifically mutual information, provides a method to identify those basepairs that are key to the secondary structure. To this end we find the most informative basepairs and visualize their effect on the secondary structure. Knowing whether a most informative basepair is present tells us not only the status of that particular pair but also provides a large amount of information about which other pairs are present or not. We find that a few basepairs account for a large amount of the structural uncertainty. The identification of these pairs indicates small changes to sequence or stability that will have a large effect on structure. We provide a novel algorithm that uses mutual information to identify the key basepairs that lead to a multimodal Boltzmann distribution. We then visualize the effect of these pairs on the overall Boltzmann ensemble.
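
    The quantity being ranked is the standard mutual information between structural variables over the Boltzmann ensemble. For an indicator X of one candidate basepair and another binary structural variable Y, a standard formulation (consistent with, but not quoted from, the paper) is

      I(X;Y) = \sum_{x \in \{0,1\}} \sum_{y \in \{0,1\}} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}

    where the joint and marginal probabilities are estimated from structures sampled from the ensemble; a basepair is "most informative" when knowing whether it is present sharply reduces uncertainty about the rest of the structure.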

  16. Mobile English Vocabulary Learning Based on Concept-Mapping Strategy

    ERIC Educational Resources Information Center

    Liu, Pei-Lin

    2016-01-01

    Numerous researchers in education recognize that vocabulary is essential in foreign language learning. However, students often encounter vocabulary that is difficult to remember. Providing effective vocabulary learning strategies is therefore more valuable than teaching students a large amount of vocabulary. The purpose of this study was to…

  17. Multi-Media in USAF Pilot Training.

    ERIC Educational Resources Information Center

    Wood, Milton E.

    The flight-line portion of flying training has traditionally required large amounts of airborne practice under an apprenticeship form of instruction. New developments in educational technology, from both a philosophical and device point of view, provide new opportunities to train airborne skills in a ground environment. Through the use of…

  18. Helping Students Interpret Large-Scale Data Tables

    ERIC Educational Resources Information Center

    Prodromou, Theodosia

    2016-01-01

    New technologies have completely altered the ways that citizens can access data. Indeed, emerging online data sources give citizens access to an enormous amount of numerical information that provides new sorts of evidence used to influence public opinion. In this new environment, two trends have had a significant impact on our increasingly…

  19. Supporting Distance Learners: Making Practice More Effective

    ERIC Educational Resources Information Center

    Pratt, Keryn

    2015-01-01

    This paper reports on a qualitative evaluation of the postgraduate courses offered by distance in one university department. The types and amount of support provided to students was evaluated and compared with Simpson's (2008a) Proactive Motivational Support model (PaMS). While students were largely satisfied with the support they received during…

  20. Effect of weight-mile tax on road damage in Oregon

    DOT National Transportation Integrated Search

    1999-09-01

    Oregon's weight-mile tax was amended in 1990 to provide for a lower tax rate for trucks weighing more than 80,000 pounds if they added axles. The additional axles within a weight class reduce the amount of road damage. The tax break was largely based...

  1. The Role of Nuclear Power in Reducing Greenhouse Gas Emissions

    EPA Science Inventory

    For Frank Princiotta's book, Global Climate Change—The Technology Challenge. As this chapter will point out, nuclear energy is a low greenhouse gas emitter and is capable of providing large amounts of power using proven technology. In the immediate future, it can contribute to gr...

  2. The Application Law of Large Numbers That Predicts The Amount of Actual Loss in Insurance of Life

    NASA Astrophysics Data System (ADS)

    Tinungki, Georgina Maria

    2018-03-01

    The law of large numbers is a statistical concept that calculates the average number of events or risks in a sample or population in order to predict something: the larger the population used in the calculation, the more accurate the prediction. In the field of insurance, the law of large numbers is used to predict the risk of loss or claims among participants so that the premium can be calculated appropriately. For example, if on average one of every 100 insurance participants files an accident claim, then the premiums of 100 participants should be able to provide the sum assured for at least one accident claim. The more insurance participants are included in the calculation, the more precise the prediction of claims and the calculation of the premium. Life insurance, as a tool for spreading risk, can only work if a life insurance company is able to bear similar risks in large numbers; here the law of large numbers applies. The law of large numbers states that as the amount of exposure to losses increases, the predicted loss will be closer to the actual loss. The use of the law of large numbers allows the number of losses to be predicted better.
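
    The convergence the abstract appeals to is easy to demonstrate by simulation; the sketch below uses the abstract's illustrative 1-in-100 claim rate together with an assumed sum assured.

      # Law of large numbers in the insurance setting: the observed claim
      # frequency converges to the true rate (1 in 100 here) as the pool
      # grows, so the pure premium can be priced more precisely. The sum
      # assured is an assumed illustrative figure.
      import random

      random.seed(42)
      p_claim, sum_assured = 0.01, 100_000

      for n in (100, 1_000, 10_000, 1_000_000):
          claims = sum(random.random() < p_claim for _ in range(n))
          rate = claims / n
          print(f"n={n:>9,}  observed rate={rate:.4f}  "
                f"pure premium={rate * sum_assured:>8,.0f}")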

  3. CD-ROM technology at the EROS data center

    USGS Publications Warehouse

    Madigan, Michael E.; Weinheimer, Mary C.

    1993-01-01

    The vast amount of digital spatial data often required by a single user has created a demand for media alternatives to 1/2" magnetic tape. One such medium that has been recently adopted at the U.S. Geological Survey's EROS Data Center is the compact disc (CD). CD's are a versatile, dynamic, and low-cost method for providing a variety of data on a single media device and are compatible with various computer platforms. CD drives are available for personal computers, UNIX workstations, and mainframe systems, either directly connected, or through a network. This medium furnishes a quick method of reproducing and distributing large amounts of data on a single CD. Several data sets are already available on CD's, including collections of historical Landsat multispectral scanner data and biweekly composites of Advanced Very High Resolution Radiometer data for the conterminous United States. The EROS Data Center intends to provide even more data sets on CD's. Plans include specific data sets on a customized disc to fulfill individual requests, and mass production of unique data sets for large-scale distribution. Requests for a single compact disc-read only memory (CD-ROM) containing a large volume of data either for archiving or for one-time distribution can be addressed with a CD-write once (CD-WO) unit. Mass production and large-scale distribution will require CD-ROM replication and mastering.

  4. The Importance of Supporting Inferences with Evidence: Learning Lessons from Huffman (2014) in the Hope of Providing Stronger Evidence for Extensive Reading

    ERIC Educational Resources Information Center

    McLean, Stuart

    2016-01-01

    Stuart McLean refers in this commentary to Jeffrey Huffman's article "Reading Rate Gains during a One-Semester Extensive Reading Course" (v26 n2 p17-33 Oct 2014) [See: EJ1044344], in which Huffman reports that extensive reading (ER) was an effective way to provide large amounts of comprehensible input to foreign language learners, but…

  5. Energy efficient laboratory fume hood

    DOEpatents

    Feustel, Helmut E.

    2000-01-01

    The present invention provides a low energy consumption fume hood that provides an adequate level of safety while reducing the amount of air exhausted from the hood. A low-flow fume hood in accordance with the present invention works on the principle of providing an air supply, preferably with low turbulence intensity, in the face of the hood. The air flow supplied displaces the volume currently present in the hood's face without significant mixing between the two volumes and with minimum injection of air from either side of the flow. This air flow provides a protective layer of clean air between the contaminated low-flow fume hood work chamber and the laboratory room. Because this protective layer of air will be free of contaminants, even temporary mixing between the air in the face of the fume hood and room air, which may result from short-term pressure fluctuations or turbulence in the laboratory, will keep contaminants contained within the hood. Protection of the face of the hood by an air flow with low turbulence intensity in accordance with a preferred embodiment of the present invention largely reduces the need to exhaust large amounts of air from the hood. It has been shown that exhaust air flow reductions of up to 75% are possible without a decrease in the hood's containment performance.

  6. Moving zone Marangoni drying of wet objects using naturally evaporated solvent vapor

    DOEpatents

    Britten, Jerald A.

    1997-01-01

    A surface tension gradient driven flow (a Marangoni flow) is used to remove the thin film of water remaining on the surface of an object following rinsing. The process passively introduces, by natural evaporation and diffusion, minute amounts of alcohol (or other suitable material) vapor into the immediate vicinity of a continuously refreshed meniscus of deionized water or another aqueous-based, nonsurfactant rinsing agent. Used in conjunction with cleaning, developing or wet etching applications, rinsing coupled with Marangoni drying provides a single-step process for 1) cleaning, developing or etching, 2) rinsing, and 3) drying objects such as flat substrates or coatings on flat substrates without necessarily using heat, forced air flow, contact wiping, centrifugation or large amounts of flammable solvents. This process is useful in one-step cleaning and drying of large flat optical substrates, one-step developing/rinsing and drying or etching/rinsing/drying of large flat patterned substrates and flat panel displays during lithographic processing, and room-temperature rinsing/drying of other large parts, sheets or continuous rolls of material.

  7. Moving zone Marangoni drying of wet objects using naturally evaporated solvent vapor

    DOEpatents

    Britten, J.A.

    1997-08-26

    A surface tension gradient driven flow (a Marangoni flow) is used to remove the thin film of water remaining on the surface of an object following rinsing. The process passively introduces, by natural evaporation and diffusion, minute amounts of alcohol (or other suitable material) vapor into the immediate vicinity of a continuously refreshed meniscus of deionized water or another aqueous-based, nonsurfactant rinsing agent. Used in conjunction with cleaning, developing or wet etching applications, rinsing coupled with Marangoni drying provides a single-step process for (1) cleaning, developing or etching, (2) rinsing, and (3) drying objects such as flat substrates or coatings on flat substrates without necessarily using heat, forced air flow, contact wiping, centrifugation or large amounts of flammable solvents. This process is useful in one-step cleaning and drying of large flat optical substrates, one-step developing/rinsing and drying or etching/rinsing/drying of large flat patterned substrates and flat panel displays during lithographic processing, and room-temperature rinsing/drying of other large parts, sheets or continuous rolls of material. 5 figs.

  8. Viscoelastic Love-type surface waves

    USGS Publications Warehouse

    Borcherdt, Roger D.

    2008-01-01

    The general theoretical solution for Love-type surface waves in viscoelastic media provides theoretical expressions for the physical characteristics of the waves in elastic as well as anelastic media with arbitrary amounts of intrinsic damping. The general solution yields dispersion and absorption-coefficient curves for the waves as a function of frequency and the amount of intrinsic damping for any chosen viscoelastic model. Numerical results valid for a variety of viscoelastic models provide quantitative estimates of the physical characteristics of the waves pertinent to models of Earth materials ranging from small amounts of damping in the Earth’s crust to moderate and large amounts of damping in soft soils and water-saturated sediments. Numerical results, presented herein, are valid for a wide range of solids and applications.

  9. A model for prioritizing landfills for remediation and closure: A case study in Serbia.

    PubMed

    Ubavin, Dejan; Agarski, Boris; Maodus, Nikola; Stanisavljevic, Nemanja; Budak, Igor

    2018-01-01

    The existence of large numbers of landfills that do not fulfill sanitary prerequisites presents a serious hazard for the environment in lower income countries. One of the main hazards is landfill leachate that contains various pollutants and presents a threat to groundwater. Groundwater pollution from landfills depends on various mutually interconnected factors such as the waste type and amount, the amount of precipitation, the landfill location characteristics, and operational measures, among others. Considering these factors, lower income countries face a selection problem where landfills urgently requiring remediation and closure must be identified from among a large number of sites. The present paper proposes a model for prioritizing landfills for closure and remediation based on multicriteria decision making, in which the hazards of landfill groundwater pollution are evaluated. The parameters for the prioritization of landfills are the amount of waste disposed, the amount of precipitation, the vulnerability index, and the rate of increase of the amount of waste in the landfill. Verification was performed using a case study in Serbia where all municipal landfills were included and 128 landfills were selected for prioritization. The results of the evaluation of Serbian landfills, prioritizing sites for closure and remediation, are presented for the first time. Critical landfills are identified, and prioritization ranks for the selected landfills are provided. Integr Environ Assess Manag 2018;14:105-119. © 2017 SETAC.
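    The abstract does not spell out the aggregation scheme behind the multicriteria model, so the Python sketch below illustrates only the general idea: a hypothetical weighted sum over min-max-normalized criteria. The site names, criterion values, and weights are all invented for illustration, not the paper's values.

      # Illustrative weighted-sum prioritization over the four criteria named in
      # the abstract; all names, values, and weights are hypothetical.
      landfills = {
          # name: (waste_tonnes, precipitation_mm, vulnerability_idx, waste_growth)
          "site_a": (120_000, 650, 0.8, 0.05),
          "site_b": (40_000, 900, 0.4, 0.12),
          "site_c": (200_000, 500, 0.6, 0.02),
      }
      weights = (0.4, 0.2, 0.3, 0.1)   # assumed relative importance of criteria

      def normalize(values):
          # min-max scale a criterion to [0, 1]; constant criteria map to 0
          lo, hi = min(values), max(values)
          return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

      names = list(landfills)
      columns = list(zip(*landfills.values()))    # one tuple per criterion
      norm = [normalize(col) for col in columns]
      scores = {name: sum(w * norm[j][i] for j, w in enumerate(weights))
                for i, name in enumerate(names)}
      for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
          print(f"{name}: hazard score {s:.2f}")  # higher score = higher priority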

  10. Water demand-supply analysis in a large spatial area based on the processes of evapotranspiration and runoff

    PubMed Central

    Maruyama, Toshisuke

    2007-01-01

    To estimate the amount of evapotranspiration in a river basin, the “short period water balance method” was formulated. Then, by introducing the “complementary relationship method,” the amount of evapotranspiration was estimated seasonally, and with reasonable accuracy, for both small and large areas. Moreover, to accurately estimate river discharge in the low water season, the “weighted statistical unit hydrograph method” was proposed and a procedure for the calculation of the unit hydrograph was developed. Also, a new model, based on the “equivalent roughness method,” was successfully developed for the estimation of flood runoff from newly reclaimed farmlands. Based on the results of this research, a “composite reservoir model” was formulated to analyze the repeated use of irrigation water in large spatial areas. The application of this model to a number of watershed areas provided useful information with regard to the realities of water demand-supply systems in watersheds predominately dedicated to paddy fields, in Japan. PMID:24367144

  11. The Digital Slide Archive: A Software Platform for Management, Integration, and Analysis of Histology for Cancer Research.

    PubMed

    Gutman, David A; Khalilia, Mohammed; Lee, Sanghoon; Nalisnik, Michael; Mullen, Zach; Beezley, Jonathan; Chittajallu, Deepak R; Manthey, David; Cooper, Lee A D

    2017-11-01

    Tissue-based cancer studies can generate large amounts of histology data in the form of glass slides. These slides contain important diagnostic, prognostic, and biological information and can be digitized into expansive and high-resolution whole-slide images using slide-scanning devices. Effectively utilizing digital pathology data in cancer research requires the ability to manage, visualize, share, and perform quantitative analysis on these large amounts of image data, tasks that are often complex and difficult for investigators with the current state of commercial digital pathology software. In this article, we describe the Digital Slide Archive (DSA), an open-source web-based platform for digital pathology. DSA allows investigators to manage large collections of histologic images and integrate them with clinical and genomic metadata. The open-source model enables DSA to be extended to provide additional capabilities. Cancer Res; 77(21); e75-78. ©2017 AACR.

  12. Flexible services for the support of research.

    PubMed

    Turilli, Matteo; Wallom, David; Williams, Chris; Gough, Steve; Curran, Neal; Tarrant, Richard; Bretherton, Dan; Powell, Andy; Johnson, Matt; Harmer, Terry; Wright, Peter; Gordon, John

    2013-01-28

    Cloud computing has been increasingly adopted by users and providers to promote flexible, scalable and tailored access to computing resources. Nonetheless, the consolidation of this paradigm has uncovered some of its limitations. Initially devised by corporations with direct control over large amounts of computational resources, cloud computing is now being endorsed by organizations with limited resources or with a more articulated, less direct control over these resources. The challenge for these organizations is to leverage the benefits of cloud computing while dealing with limited and often widely distributed computing resources. This study focuses on the adoption of cloud computing by higher education institutions and addresses two main issues: flexible and on-demand access to a large amount of storage resources, and scalability across a heterogeneous set of cloud infrastructures. The proposed solutions leverage a federated approach to cloud resources in which users access multiple and largely independent cloud infrastructures through a highly customizable broker layer. This approach allows for a uniform authentication and authorization infrastructure, a fine-grained policy specification and the aggregation of accounting and monitoring. Within a loosely coupled federation of cloud infrastructures, users can access vast amounts of data without copying them across cloud infrastructures and can scale their resource provisions when the local cloud resources become insufficient.

  13. Influence of social norms and palatability on amount consumed and food choice.

    PubMed

    Pliner, Patricia; Mann, Nikki

    2004-04-01

    In two parallel studies, we examined the effect of social influence and palatability on amount consumed and on food choice. In Experiment 1, which looked at amount consumed, participants were provided with either palatable or unpalatable food; they were also given information about how much previous participants had eaten (large or small amounts) or were given no information. In the case of palatable food, participants ate more when led to believe that prior participants had eaten a great deal than when led to believe that prior participants had eaten small amounts or when provided with no information. This social-influence effect was not present when participants received unpalatable food. In Experiment 2, which looked at food choice, some participants learned that prior participants had chosen the palatable food, others learned that prior participants had chosen the unpalatable food, while still others received no information about prior participants' choices. The social-influence manipulation had no effect on participants' food choices; nearly all of them chose the palatable food. The results were discussed in the context of Crutchfield's (1955) distinction between judgments about matters of fact and judgments about preferences. The results were also used to illustrate the importance of palatability as a determinant of eating behavior.

  14. Estimation and change tendency of rape straw resource in Leshan

    NASA Astrophysics Data System (ADS)

    Guan, Qinlan; Gong, Mingfu

    2018-04-01

    Rape straw in the Leshan area consists of the rape stalks, including stems, leaves and pods, that remain after the rapeseed is removed. Leshan is one of the main rape planting areas in Sichuan Province, and its large planting area produces a large quantity of rape straw every year. Based on an analysis of trends in rapeseed planting area and rapeseed yield from 2008 to 2014, the change in rape straw resources in Leshan over that period was analyzed to provide a decision-making reference for the resource utilization of rape straw. The results showed that the amount of rape straw resources in Leshan was very large, exceeding 100,000 tons per year and increasing year by year; by 2014, the amount of rape straw resources in Leshan was close to 200,000 tons.

  15. Applications of Data Mining Methods in the Integrative Medical Studies of Coronary Heart Disease: Progress and Prospect

    PubMed Central

    Wang, Yixin; Guo, Fang

    2014-01-01

    A large number of studies show that real-world studies have stronger external validity than traditional randomized controlled trials and can evaluate the effect of interventions in a real clinical setting, which opens up a new path for research on integrative medicine in coronary heart disease. However, clinical data on integrative medicine in coronary heart disease are large in amount and complex in data type, making the choice of appropriate methodology a hot topic. Data mining techniques analyze and extract useful information and knowledge from mass data to guide practice. The present review provides insights into the main features of data mining and its applications in integrative medical studies of coronary heart disease, aiming to analyze the progress and prospects in this field. PMID:25544853

  16. Discovering Related Clinical Concepts Using Large Amounts of Clinical Notes

    PubMed Central

    Ganesan, Kavita; Lloyd, Shane; Sarkar, Vikren

    2016-01-01

    The ability to find highly related clinical concepts is essential for many applications, such as hypothesis generation, query expansion for medical literature search, search results filtering, ICD-10 code filtering and many others. While manually constructed medical terminologies such as SNOMED CT can surface certain related concepts, these terminologies are inadequate, as they depend on the expertise of several subject matter experts, making the terminology curation process open to geographic and language bias. In addition, these terminologies provide no quantifiable evidence on how related the concepts are. In this work, we explore an unsupervised graphical approach to mine related concepts by leveraging the volume within large amounts of clinical notes. Our evaluation shows that we are able to use a data-driven approach to discover highly related concepts for various search terms, including medications, symptoms and diseases. PMID:27656096
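    The graph construction itself is not detailed in the abstract; as a loose illustration of the underlying idea of mining relatedness from note volume, the toy Python sketch below ranks concepts co-occurring with a query term by pointwise mutual information. The notes and concept names are invented.

      import math
      from collections import Counter
      from itertools import combinations

      notes = [                      # toy "clinical notes" as concept sets
          {"metformin", "diabetes", "neuropathy"},
          {"metformin", "diabetes"},
          {"lisinopril", "hypertension"},
          {"diabetes", "neuropathy", "gabapentin"},
      ]
      n = len(notes)
      single = Counter(c for note in notes for c in note)
      pair = Counter(frozenset(p) for note in notes
                     for p in combinations(sorted(note), 2))

      def related(query, k=3):
          # rank concepts co-occurring with `query` by pointwise mutual information
          scored = []
          for p, cnt in pair.items():
              if query in p:
                  other = next(iter(p - {query}))
                  pmi = math.log((cnt / n) /
                                 ((single[query] / n) * (single[other] / n)))
                  scored.append((pmi, other))
          return sorted(scored, reverse=True)[:k]

      print(related("diabetes"))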

  17. Destruction of Navy Hazardous Wastes by Supercritical Water Oxidation

    DTIC Science & Technology

    1994-08-01

    cleaning and derusting (nitrite and citric acid solutions), electroplating (acids and metal-bearing solutions), electronics and refrigeration... Waste streams that contain a large amount of mineral-acid-forming chemical species or that contain a large amount of dissolved solids present a challenge to current SCWO technology. Approved for public release.

  18. Empirical relationships between tree fall and landscape-level amounts of logging and fire

    PubMed Central

    Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C.

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape. PMID:29474487

  19. Empirical relationships between tree fall and landscape-level amounts of logging and fire.

    PubMed

    Lindenmayer, David B; Blanchard, Wade; Blair, David; McBurney, Lachlan; Stein, John; Banks, Sam C

    2018-01-01

    Large old trees are critically important keystone structures in forest ecosystems globally. Populations of these trees are also in rapid decline in many forest ecosystems, making it important to quantify the factors that influence their dynamics at different spatial scales. Large old trees often occur in forest landscapes also subject to fire and logging. However, the effects on the risk of collapse of large old trees of the amount of logging and fire in the surrounding landscape are not well understood. Using an 18-year study in the Mountain Ash (Eucalyptus regnans) forests of the Central Highlands of Victoria, we quantify relationships between the probability of collapse of large old hollow-bearing trees at a site and the amount of logging and the amount of fire in the surrounding landscape. We found the probability of collapse increased with an increasing amount of logged forest in the surrounding landscape. It also increased with a greater amount of burned area in the surrounding landscape, particularly for trees in highly advanced stages of decay. The most likely explanation for elevated tree fall with an increasing amount of logged or burned areas in the surrounding landscape is change in wind movement patterns associated with cutblocks or burned areas. Previous studies show that large old hollow-bearing trees are already at high risk of collapse in our study area. New analyses presented here indicate that additional logging operations in the surrounding landscape will further elevate that risk. Current logging prescriptions require the protection of large old hollow-bearing trees on cutblocks. We suggest that efforts to reduce the probability of collapse of large old hollow-bearing trees on unlogged sites will demand careful landscape planning to limit the amount of timber harvesting in the surrounding landscape.

  20. Contrails of Small and Very Large Optical Depth

    NASA Technical Reports Server (NTRS)

    Atlas, David; Wang, Zhien

    2010-01-01

    This work deals with two kinds of contrails. The first comprises a large number of optically thin contrails near the tropopause. They are mapped geographically using a lidar to obtain their height and a camera to obtain azimuth and elevation. These high-resolution maps provide the local contrail geometry and the amount of optically clear atmosphere. The second kind is a single trail of unprecedentedly large optical thickness that occurs at a lower height. The latter was observed fortuitously when an aircraft moving along the wind direction passed over the lidar, thus providing measurements for more than 3 h and an equivalent distance of 620 km. It was also observed by Geostationary Operational Environmental Satellite (GOES) sensors. The lidar measured an optical depth of 2.3. The corresponding extinction coefficient of 0.023 per kilometer and ice water content of 0.063 grams per cubic meter are close to the maximum values found for midlatitude cirrus. The associated large radar reflectivity compares to that measured by ultrasensitive radar, thus providing support for the reality of the large optical depth.

  1. Real Time Search Algorithm for Observation Outliers During Monitoring Engineering Constructions

    NASA Astrophysics Data System (ADS)

    Latos, Dorota; Kolanowski, Bogdan; Pachelski, Wojciech; Sołoducha, Ryszard

    2017-12-01

    Real time monitoring of engineering structures in case of an emergency or disaster requires collection of a large amount of data to be processed by specific analytical techniques. A quick and accurate assessment of the state of the object is crucial for a probable rescue action. One of the more significant evaluation methods for large sets of data, either collected during a specified interval of time or permanently, is time series analysis. In this paper, a search algorithm is presented for those time series elements which deviate from their values expected during monitoring. Quick and proper detection of observations indicating anomalous behavior of the structure allows a variety of preventive actions to be taken. The mathematical formulae used in the algorithm provide maximal sensitivity to detect even minimal changes in the object's behavior. Sensitivity analyses were conducted for the moving average algorithm as well as for the Douglas-Peucker algorithm used in the generalization of linear objects in GIS. In addition to determining the size of deviations from the average, the so-called Hausdorff distance was used. The simulations carried out, and verification against laboratory survey data, showed that the approach provides sufficient sensitivity for automatic real time analysis of large amounts of data obtained from various sensors (total stations, leveling, cameras, radar).
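    As a rough illustration of the moving-average screening mentioned above, the Python sketch below flags readings that deviate from a trailing mean by more than a threshold number of standard deviations. The window size, threshold, and readings are hypothetical stand-ins, not the paper's tuned values.

      def flag_outliers(series, window=5, threshold=3.0):
          # flag points deviating from the trailing mean by > threshold * std
          flags = []
          for i in range(len(series)):
              past = series[max(0, i - window):i]
              if len(past) < window:
                  flags.append(False)          # not enough history yet
                  continue
              mean = sum(past) / window
              var = sum((x - mean) ** 2 for x in past) / window
              std = var ** 0.5 or 1e-9         # guard against zero variance
              flags.append(abs(series[i] - mean) > threshold * std)
          return flags

      readings = [10.0, 10.1, 9.9, 10.0, 10.2, 10.1, 14.5, 10.0]
      print([i for i, f in enumerate(flag_outliers(readings)) if f])  # -> [6]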

  2. Influences of large-scale convection and moisture source on monthly precipitation isotope ratios observed in Thailand, Southeast Asia

    NASA Astrophysics Data System (ADS)

    Wei, Zhongwang; Lee, Xuhui; Liu, Zhongfang; Seeboonruang, Uma; Koike, Masahiro; Yoshimura, Kei

    2018-04-01

    Many paleoclimatic records in Southeast Asia rely on rainfall isotope ratios as proxies for past hydroclimatic variability. However, the physical processes controlling modern rainfall isotopic behavior in the region are poorly constrained. Here, we combined isotopic measurements at six sites across Thailand with an isotope-incorporated atmospheric circulation model (IsoGSM) and the Hybrid Single-Particle Lagrangian Integrated Trajectory (HYSPLIT) model to investigate the factors that govern the variability of precipitation isotope ratios in this region. Results show that rainfall isotope ratios are correlated both with local rainfall amount and with regional outgoing longwave radiation, suggesting that rainfall isotope ratios in this region are controlled not only by local rain amount (the amount effect) but also by large-scale convection. As the region is a transition zone between the Indian monsoon and the western North Pacific monsoon, the spatial differences in observed precipitation isotopes among sites are associated with moisture source. These results highlight the importance of regional processes in determining rainfall isotope ratios in the tropics and provide constraints on the interpretation of paleo-precipitation isotope records in the context of regional climate dynamics.

  3. [Advances in the research of application of artificial intelligence in burn field].

    PubMed

    Li, H H; Bao, Z X; Liu, X B; Zhu, S H

    2018-04-20

    Artificial intelligence has become able, to some extent, to learn from and make judgments on large-scale data automatically. Drawing on databases of large amounts of burn data and on deep learning, artificial intelligence can assist burn surgeons in evaluating burn surface area, diagnosing burn depth, guiding fluid resuscitation during the shock stage, and predicting prognosis, with high accuracy. With the development of the technology, artificial intelligence can provide more accurate information for burn surgeons to use in making clinical diagnosis and treatment strategies.

  4. HyperCard K-12: Classroom Computer Learning Special Supplement Sponsored by Apple Computer.

    ERIC Educational Resources Information Center

    Classroom Computer Learning, 1989

    1989-01-01

    Follows the development of hypertext, which is the electronic movement of large amounts of text. Probes the use of the Macintosh HyperCard and its applications in education. Notes that programs are stackable in the computer. Provides a tool, resource, and stack directory along with tips for using HyperCard. (MVL)

  5. Reflection: A Key Component to Thinking Critically

    ERIC Educational Resources Information Center

    Colley, Binta M.; Bilics, Andrea R.; Lerch, Carol M.

    2012-01-01

    The ability to think critically is an important trait of all members of society. With today's multinational, multicultural, complex issues, citizens must be able to sift through large amounts of various data to make intelligent decisions. Thinking critically must be a focus of higher education in order to provide the intellectual training for its…

  6. PASSIVE SAMPLING OF GROUND WATER MONITORING WELLS WITHOUT PURGING MULTILEVEL WELL CHEMISTRY AND TRACER DISAPPEARANCE

    EPA Science Inventory

    It is essential that the sampling techniques utilized in groundwater monitoring provide data that accurately depicts the water quality of the sampled aquifer in the vicinity of the well. Due to the large amount of monitoring activity currently underway in the U.S.A. it is also im...

  7. Low flow fume hood

    DOEpatents

    Bell, Geoffrey C.; Feustel, Helmut E.; Dickerhoff, Darryl J.

    2002-01-01

    A fume hood is provided having an adequate level of safety while reducing the amount of air exhausted from the hood. A displacement flow fume hood works on the principle of a displacement flow, which displaces the volume currently present in the hood using a push-pull system. The displacement flow includes a plurality of air supplies which provide fresh air, preferably having laminar flow, to the fume hood. The displacement flow fume hood also includes an air exhaust which pulls air from the work chamber in a minimally turbulent manner. As the displacement flow produces a substantially consistent and minimally turbulent flow in the hood, inconsistent flow patterns associated with contaminant escape from the hood are minimized. The displacement flow fume hood largely reduces the need to exhaust large amounts of air from the hood. It has been shown that exhaust air flow reductions of up to 70% are possible without a decrease in the hood's containment performance. The fume hood also includes a number of structural adaptations which facilitate consistent and minimally turbulent flow within a fume hood.

  8. Survival of Glycolaldehyde and Production of Sugar Compounds via Comet Impact Delivery

    NASA Astrophysics Data System (ADS)

    Zellner, N.; McCaffrey, V.; Crake, C.; Butler, J.; Robbins, J.; Fodor, A.

    2017-12-01

    Impact experiments using glycolaldehyde (GLA), a two-carbon sugar precursor that has been detected in regions of the interstellar medium and on comets, have been conducted at the Experimental Impact Laboratory at NASA's Johnson Space Center. Samples of GLA and GLA mixed with montmorillonite clays were subjected to the pressure conditions that are found during impact delivery of biomolecules by comets, asteroids, or meteors; pressures ranged from 4.5 GPa to 25 GPa. Results show that large amounts of GLA survived the impacts and moderate amounts of threose, erythrose, and glycolic acid were produced in these impacts. Total amounts are dependent on impact pressure. Ethylene glycol (EG), a reduced variant of GLA that has also been detected in the interstellar medium and on comets, was also produced. The results of these experimental impacts provide evidence that large amounts of GLA, EG, and other biomolecules were available on habitable moons or planets, especially during the era of late heavy bombardment (about 4.2 to 3.7 billion years ago), when life may have been developing on Earth. The presence and availability of these biomolecules, under appropriate conditions, may be important for understanding the origin of life as we know it. Glycolaldehyde, in particular, may be an important molecule in the production of ribose, the five-carbon sugar in RNA.

  9. Roles and applications of biomedical ontologies in experimental animal science.

    PubMed

    Masuya, Hiroshi

    2012-01-01

    A huge amount of experimental data from past studies has played a vital role in the development of new knowledge and technologies in biomedical science. The importance of computational technologies for the reuse of data, data integration, and knowledge discoveries has also increased, providing means of processing large amounts of data. In recent years, information technologies related to "ontologies" have played more significant roles in the standardization, integration, and knowledge representation of biomedical information. This review paper outlines the history of data integration in biomedical science and its recent trends in relation to the field of experimental animal science.

  10. A simple biosynthetic pathway for large product generation from small substrate amounts

    NASA Astrophysics Data System (ADS)

    Djordjevic, Marko; Djordjevic, Magdalena

    2012-10-01

    A recently emerging discipline of synthetic biology has the aim of constructing new biosynthetic pathways with useful biological functions. A major application of these pathways is generating a large amount of the desired product. However, toxicity due to the possible presence of toxic precursors is one of the main problems for such production. We consider here the problem of generating a large amount of product from a potentially toxic substrate. To address this, we propose a simple biosynthetic pathway, which can be induced in order to produce a large number of the product molecules, by keeping the substrate amount at low levels. Surprisingly, we show that the large product generation crucially depends on fast non-specific degradation of the substrate molecules. We derive an optimal induction strategy, which allows as much as a three orders of magnitude increase in the product amount for biologically realistic parameter values. We point to a recently discovered bacterial immune system (CRISPR/Cas in E. coli) as a putative example of the pathway analysed here. We also argue that the scheme proposed here can be used not only as a stand-alone pathway, but also as a strategy to produce a large amount of the desired molecules with small perturbations of endogenous biosynthetic pathways.
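    The abstract describes the pathway qualitatively; the toy kinetic sketch below is not the authors' model, but it illustrates the core point with a hypothetical induced influx (k_in), fast non-specific substrate degradation (k_deg), and conversion to product (k_cat), integrated with a simple Euler scheme. All rate constants are invented.

      # Toy two-species model: substrate S stays low while product P accumulates.
      def simulate(k_in=1.0, k_deg=5.0, k_cat=0.5, t_end=50.0, dt=0.001):
          s = p = 0.0
          for _ in range(int(t_end / dt)):
              ds = k_in - (k_deg + k_cat) * s   # influx minus loss and conversion
              dp = k_cat * s                    # conversion to product
              s += ds * dt
              p += dp * dt
          return s, p

      s, p = simulate()
      print(f"steady-state substrate ~ {s:.3f}, accumulated product ~ {p:.1f}")
      # Substrate settles near k_in/(k_deg + k_cat), i.e. low, while product
      # keeps growing: large product yield at low (less toxic) substrate levels.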

  11. Preparation of PEMFC Electrodes from Milligram-Amounts of Catalyst Powder

    DOE PAGES

    Yarlagadda, Venkata; McKinney, Samuel E.; Keary, Cristin L.; ...

    2017-06-03

    Development of electrocatalysts with higher activity and stability is one of the highest priorities in enabling cost-competitive hydrogen-air fuel cells. Although the rotating disk electrode (RDE) technique is widely used to study new catalyst materials, it has often been shown to be an unreliable predictor of catalyst performance in actual fuel cell operation. Fabrication of membrane electrode assemblies (MEAs) for evaluation, which are more representative of actual fuel cells, generally requires relatively large amounts (>1 g) of catalyst material that are often not readily available in early stages of development. In this study, we present two MEA preparation techniques using as little as 30 mg of catalyst material, providing methods to conduct more meaningful MEA-based tests using research-level catalyst amounts.

  12. Natural gas hydrate occurrence and issues

    USGS Publications Warehouse

    Kvenvolden, K.A.

    1994-01-01

    Naturally occurring gas hydrate is found in sediment of two regions: (1) continental, including continental shelves, at high latitudes where surface temperatures are very cold, and (2) submarine outer continental margins where pressures are very high and bottom-water temperatures are near 0°C. Continental gas hydrate is found in association with onshore and offshore permafrost. Submarine gas hydrate is found in sediment of continental slopes and rises. The amount of methane present in gas hydrate is thought to be very large, but the estimates that have been made are more speculative than real. Nevertheless, at the present time there has been a convergence of ideas regarding the amount of methane in gas hydrate deposits worldwide at about 2 × 10¹⁶ m³ or 7 × 10¹⁷ ft³ = 7 × 10⁵ Tcf [Tcf = trillion (10¹²) ft³]. The potentially large amount of methane in gas hydrate and the shallow depth of gas hydrate deposits are two of the principal factors driving research concerning this substance. Such a large amount of methane, if it could be commercially produced, provides a potential energy resource for the future. Because gas hydrate is metastable, changes of surface pressure and temperature affect its stability. Destabilized gas hydrate beneath the sea floor leads to geologic hazards such as submarine mass movements. Examples of submarine slope failures attributed to gas hydrate are found worldwide. The metastability of gas hydrate may also have an effect on climate. The release of methane, a 'greenhouse' gas, from destabilized gas hydrate may contribute to global warming and be a factor in global climate change.
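    As a quick arithmetic check of the volume figures quoted above (a worked example, not from the paper):

      # Convert the estimated 2 x 10^16 m^3 of methane to cubic feet and Tcf
      # (1 Tcf = 10^12 ft^3); this reproduces the abstract's ~7 x 10^17 ft^3.
      M3_TO_FT3 = 35.3147            # cubic feet per cubic metre
      volume_m3 = 2e16
      volume_ft3 = volume_m3 * M3_TO_FT3
      print(f"{volume_ft3:.1e} ft^3")        # ~7.1e17 ft^3
      print(f"{volume_ft3 / 1e12:.1e} Tcf")  # ~7.1e5 Tcf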

  13. Microphysical, Macrophysical and Radiative Signatures of Volcanic Aerosols in Trade Wind Cumulus Observed by the A-Train

    NASA Technical Reports Server (NTRS)

    Yuan, T.; Remer, L. A.; Yu, H.

    2011-01-01

    Increased aerosol concentrations can raise planetary albedo not only by reflecting sunlight and increasing cloud albedo, but also by changing cloud amount. However, detecting the aerosol effect on cloud amount has been elusive to both observations and modeling due to potential buffering mechanisms and convolution with meteorology. Here, through a natural experiment provided by long-term degassing of a low-lying volcano and use of A-Train satellite observations, we show that modifications of trade cumulus cloud fields, including decreased droplet size, decreased precipitation efficiency and increased cloud amount, are associated with volcanic aerosols. In addition we find significantly higher cloud tops for polluted clouds. We demonstrate that the observed microphysical and macrophysical changes cannot be explained by synoptic meteorology or the orographic effect of the Hawaiian Islands. The "total shortwave aerosol forcing", resulting from direct and indirect forcings including both cloud albedo and cloud amount, is almost an order of magnitude higher than aerosol direct forcing alone. Furthermore, the precipitation reduction associated with enhanced aerosol leads to large changes in the energetics of air-sea exchange and the trade wind boundary layer. Our results represent the first observational evidence of a large-scale increase of cloud amount due to aerosols in a trade cumulus regime, which can be used to constrain the representation of aerosol-cloud interactions in climate models. The findings also have implications for volcano-climate interactions and climate mitigation research.

  14. Strategies for responding to RAC requests electronically.

    PubMed

    Schramm, Michael

    2012-04-01

    Providers that would like to respond to complex RAC reviews electronically should consider three strategies: Invest in an EHR software package or a high-powered scanner that can quickly scan large amounts of paper. Implement an audit software platform that will allow providers to manage the entire audit process in one place. Use a CONNECT-compatible gateway capable of accessing the Nationwide Health Information Network (the network on which the electronic submission of medical documentation program runs).

  15. Planetary Analogs in Antarctica: Icy Satellites

    NASA Technical Reports Server (NTRS)

    Malin, M. C.

    1985-01-01

    As part of a study, sponsored by the Antarctic Research Program, to provide semi-quantitative techniques for dating past Antarctic glaciations, field observations pertinent to other planets were also acquired. The extremely diverse surface conditions, marked by extreme cold and large amounts of ice, provide potential terrain and process analogs to the icy satellites of Jupiter and Saturn. Thin-ice tectonic features and explosion craters (on sea ice) and deformation features on thicker ice (glaciers) are specifically addressed.

  16. The Physician Payments Sunshine Act: Data Evaluation Regarding Payments to Ophthalmologists

    PubMed Central

    Chang, Jonathan S.

    2014-01-01

    Objective/Purpose: To review data for ophthalmologists published online from the Physician Payments Sunshine Act. Design: Retrospective data review using a publicly available electronic database. Methods/Main Outcome Measures: A database was downloaded from the Centers for Medicare and Medicaid Services (CMS) Website under Identified General Payments to Physicians and a primary specialty of ophthalmology. Basic statistical analysis was performed, including mean, median and range of payments, both for single payments and per provider. Data were also summarized by category of payment and geographic region and compared with other surgical subspecialties. Results: From August 1, 2013 to December 31, 2013, a total of 55,996 individual payments were reported to 9,855 ophthalmologists, for a total of $10,926,447. The mean amount received in a single payment was $195.13 (range $0.04–$193,073). The mean amount received per physician ID was $1,108 (range $1–$397,849) and the median amount $112.01. Consulting fees made up the largest percentage of fees. There was not a large difference in payments received by region. The mean payments for the subspecialties of dermatology, neurosurgery, orthopedic surgery and urology ranged from $954–$6,980, and median payments in each field by provider identifier ranged from $88–$173. Conclusions: A large amount of data was released by CMS for the Physician Payments Sunshine Act. In ophthalmology, mean and median payments per physician did not vary greatly from other surgical subspecialties. Most single payments were under $100, and most physicians received less than $500 in total payments. Payments for consulting made up the largest category of spending. How this affects patient perception, patient care and medical costs warrants further study. PMID:25578254
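    As an illustration of the per-provider summary statistics the abstract describes (mean, median, range), here is a minimal Python sketch over invented payment rows; it uses made-up data, not the CMS database or the authors' code.

      from collections import defaultdict
      from statistics import mean, median

      payments = [                    # (physician_id, amount_usd), toy examples
          ("doc1", 95.00), ("doc1", 310.50), ("doc2", 12.25),
          ("doc2", 88.00), ("doc2", 40.00), ("doc3", 5200.00),
      ]
      by_doc = defaultdict(list)
      for doc, amt in payments:
          by_doc[doc].append(amt)     # group payments by provider

      for doc, amts in by_doc.items():
          print(doc, f"total={sum(amts):.2f}", f"mean={mean(amts):.2f}",
                f"median={median(amts):.2f}", f"range={min(amts)}-{max(amts)}")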

  17. Chemical weathering on the North Island of New Zealand: CO2 consumption and fluxes of Sr and Os

    NASA Astrophysics Data System (ADS)

    Blazina, Tim; Sharma, Mukul

    2013-09-01

    We present Os and Sr isotope ratios and Os, Sr and major/trace element concentrations for river waters, spring waters and rains on the North Island of New Zealand (NINZ). The Os and Sr data are used to examine whether the NINZ is a significant contributor of unradiogenic Os and Sr to the oceans. Major element chemistry is used to quantify weathering and CO2 consumption rates on the island to investigate relationships between these processes and Os and Sr behavior. Chemical erosion rates and CO2 consumption rates across the island range from 44 to 555 km⁻² yr⁻¹ and 95 to 1900 × 10³ mol CO2 km⁻² yr⁻¹, respectively. Strontium fluxes for the island range from 177 to 16,100 mol km⁻² yr⁻¹, and the rivers have an average flux-normalized ⁸⁷Sr/⁸⁶Sr ratio of 0.7075. In agreement with previous studies, these findings provide further evidence that weathering of arc terrains contributes a disproportionately large amount of Sr to the oceans and consumes very large amounts of CO2 annually relative to their areal extent. However, the ⁸⁷Sr/⁸⁶Sr from the NINZ is not particularly unradiogenic, and the island is likely not contributing significant amounts of unradiogenic Sr to the oceans. Repeated Os analyses and bottle leaching experiments revealed extensive and variable sample contamination by Os leaching from rigorously precleaned LDPE bottles. An upper bound on the flux of Os from the NINZ can nevertheless be assessed and indicates that island arcs cannot provide significant amounts of unradiogenic Os to the oceans.

  18. Method and apparatus for controlling pitch and flap angles of a wind turbine

    DOEpatents

    Deering, Kenneth J [Seattle, WA; Wohlwend, Keith P [Issaquah, WA

    2009-05-12

    A wind turbine with improved response to wind conditions is provided. Blade flap angle motion is accompanied by a change in pitch angle by an amount defining a pitch/flap coupling ratio. The coupling ratio is non-constant as a function of a flap angle and is preferably a substantially continuous, non-linear function of flap angle. The non-constant coupling ratio can be provided by mechanical systems such as a series of linkages or by configuring electronic or other control systems and/or angle sensors. A link with a movable proximal end advantageously is part of the mechanical system. The system can provide relatively large coupling ratios and relatively large rates of coupling ratio changes especially for near-feather pitches and low flap angles.

  19. Use of cloud computing in biomedicine.

    PubMed

    Sobeslav, Vladimir; Maresova, Petra; Krejcar, Ondrej; Franca, Tanos C C; Kuca, Kamil

    2016-12-01

    Nowadays, biomedicine is characterised by a growing need for processing of large amounts of data in real time. This leads to new requirements for information and communication technologies (ICT). Cloud computing offers a solution to these requirements and provides many advantages, such as cost savings, elasticity and scalability of using ICT. The aim of this paper is to explore the concept of cloud computing and the related use of this concept in the area of biomedicine. Authors offer a comprehensive analysis of the implementation of the cloud computing approach in biomedical research, decomposed into infrastructure, platform and service layer, and a recommendation for processing large amounts of data in biomedicine. Firstly, the paper describes the appropriate forms and technological solutions of cloud computing. Secondly, the high-end computing paradigm of cloud computing aspects is analysed. Finally, the potential and current use of applications in scientific research of this technology in biomedicine is discussed.

  20. Fuzzy Document Clustering Approach using WordNet Lexical Categories

    NASA Astrophysics Data System (ADS)

    Gharib, Tarek F.; Fouad, Mohammed M.; Aref, Mostafa M.

    Text mining refers generally to the process of extracting interesting information and knowledge from unstructured text. This area is growing rapidly, mainly because of the strong need to analyse the huge amount of textual data that resides on internal file systems and the Web. Text document clustering provides an effective navigation mechanism to organize this large amount of data by grouping documents into a small number of meaningful classes. In this paper we propose a fuzzy text document clustering approach using WordNet lexical categories and the Fuzzy c-Means algorithm. Experiments are performed to compare the efficiency of the proposed approach with recently reported approaches. Experimental results show that fuzzy clustering leads to strong performance, with the Fuzzy c-Means algorithm outperforming classical clustering algorithms such as k-means and bisecting k-means in both clustering quality and running time efficiency.
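    For readers unfamiliar with the algorithm, a minimal numpy implementation of Fuzzy c-Means is sketched below. The WordNet lexical-category mapping the paper applies before clustering is omitted here, and the document vectors are toy term counts rather than real features.

      import numpy as np

      def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
          # X: (n_docs, n_features); returns cluster centers and soft memberships
          rng = np.random.default_rng(seed)
          u = rng.random((c, len(X)))
          u /= u.sum(axis=0)                     # memberships sum to 1 per doc
          for _ in range(iters):
              um = u ** m
              centers = um @ X / um.sum(axis=1, keepdims=True)
              d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-9
              u = 1.0 / (d ** (2 / (m - 1)))     # standard FCM membership update
              u /= u.sum(axis=0)
          return centers, u

      X = np.array([[5, 0, 1], [4, 1, 0], [0, 5, 1], [1, 4, 0]], float)
      centers, u = fuzzy_cmeans(X)
      print(u.round(2))   # soft membership of each document in each cluster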

  1. A Model for Education: Energy-Water Consumption Decision Making.

    ERIC Educational Resources Information Center

    Bontrager, Ralph L.; Hubbard, Charles W.

    Public schools are in a position to convince society at large of the national energy problem. There is a direct relationship between energy costs to the schools and the type of educational programs they can provide. While waiting for a national energy policy with a section devoted to schools, districts can calculate the amount and cost of energy…

  2. Scaffolding Word Solving While Reading: New Research Insights

    ERIC Educational Resources Information Center

    Rodgers, Emily

    2017-01-01

    For many teachers, the term "scaffolding" has come to mean providing just the right amount of help when a student encounters difficulty. However, there is another facet of scaffolding that has been largely ignored, and that is making decisions about what to focus on to help the student. In this article, new research findings are shared…

  3. Landsat continuity: issues and opportunities for land cover monitoring

    Treesearch

    Michael A. Wulder; Joanne C. White; Samuel N. Goward; Jeffrey G. Masek; James R. Irons; Martin Herold; Warren B. Cohen; Thomas R. Loveland; Curtis E. Woodcock

    2008-01-01

    Initiated in 1972, the Landsat program has provided a continuous record of Earth observation for 35 years. The assemblage of Landsat spatial, spectral, and temporal resolutions, over a reasonably sized image extent, results in imagery that can be processed to represent land cover over large areas with an amount of spatial detail that is absolutely unique and...

  4. Reading Rate Gains during a One-Semester Extensive Reading Course

    ERIC Educational Resources Information Center

    Huffman, Jeffrey

    2014-01-01

    Extensive reading (ER) is an effective way to provide large amounts of comprehensible input to foreign language learners, but many teachers and administrators remain unconvinced, and it has been argued that there is still insufficient evidence to support the claims that have been made regarding its benefits. Few studies have looked at ER's effect…

  5. Doctoral and Postdoctoral Education in Science and Engineering: Europe in the International Competition

    ERIC Educational Resources Information Center

    MOGUEROU, PHILIPPE

    2005-01-01

    In this article, we discuss the recent evolutions of science and engineering doctoral and postdoctoral education in Europe. Indeed, Ph.Ds are crucial to the conduct of research and innovation in the national innovation systems, as they provide a large amount of input into creating the competitive advantage, notably through basic research. First,…

  6. Integrated process development-a robust, rapid method for inclusion body harvesting and processing at the microscale level.

    PubMed

    Walther, Cornelia; Kellner, Martin; Berkemeyer, Matthias; Brocard, Cécile; Dürauer, Astrid

    2017-10-21

    Escherichia coli stores large amounts of highly pure product within inclusion bodies (IBs). To take advantage of this beneficial feature, after cell disintegration, the first step to optimal product recovery is efficient IB preparation. This step is also important in evaluating upstream optimization and process development, due to the potential impact of bioprocessing conditions on product quality and on the nanoscale properties of IBs. Proper IB preparation is often neglected, due to laboratory-scale methods requiring large amounts of materials and labor. Miniaturization and parallelization can accelerate analyses of individual processing steps and provide a deeper understanding of up- and downstream processing interdependencies. Consequently, reproducible, predictive microscale methods are in demand. In the present study, we complemented a recently established high-throughput cell disruption method with a microscale method for preparing purified IBs. This preparation provided results comparable to laboratory-scale IB processing with regard to impurity depletion and product loss. Furthermore, with this method, we performed a "design of experiments" study to demonstrate the influence of fermentation conditions on the performance of subsequent downstream steps and on product quality. We showed that this approach provided a 300-fold reduction in material consumption for each fermentation condition and a 24-fold reduction in processing time for 24 samples.

  7. Regenerative adsorbent heat pump

    NASA Technical Reports Server (NTRS)

    Jones, Jack A. (Inventor)

    1991-01-01

    A regenerative adsorbent heat pump process and system is provided which can regenerate a high percentage of the sensible heat of the system and at least a portion of the heat of adsorption. A series of at least four compressors containing an adsorbent is provided. A large amount of heat is transferred from compressor to compressor so that heat is regenerated. The process and system are useful for air conditioning rooms, providing room heat in the winter or for hot water heating throughout the year, and, in general, for pumping heat from a lower temperature to a higher temperature.

  8. Realizing the financial benefits of capitation arbitrage.

    PubMed

    Sussman, A J; Fairchild, D G; Colling, M C; Brennan, T A

    1999-11-01

    By anticipating the arbitrage potential of cash flow under budgeted capitation, healthcare organizations can make the best use of cash flow as a revenue-generating resource. Factors that determine the magnitude of the benefits for providers and insurers include settlement interval, withhold amount, which party controls the withhold, and incurred-but-not-reported expenses. In choosing how to structure these factors in their contract negotiations, providers and insurers should carefully assess whether capitation surpluses or deficits can be expected from the provider. In both instances, the recipient and magnitude of capitation arbitrage benefits are dictated largely by the performance of the provider.

  9. Influence of dietary fiber on luminal environment and morphology in the small and large intestine of sows.

    PubMed

    Serena, A; Hedemann, M S; Bach Knudsen, K E

    2008-09-01

    In this study, the effect of feeding different types and amounts of dietary fiber (DF) on luminal environment and morphology in the small and large intestine of sows was studied. Three diets, a low-fiber diet (LF) and 2 high-fiber diets (high fiber 1, HF1, and high fiber 2, HF2) were used. Diet LF (DF, 17%; soluble DF 4.6%) was based on wheat and barley, whereas the 2 high-fiber diets (HF1: DF, 43%; soluble DF, 11.0%; and HF2: DF, 45%; soluble DF, 7.6%) were based on wheat and barley supplemented with different coproducts from the vegetable food and agroindustry (HF1 and HF2: sugar beet pulp, potato pulp, and pectin residue; HF2: brewers spent grain, seed residue, and pea hull). The diets were fed for a 4-wk period to 12 sows (4 receiving each diet). Thereafter, the sows were killed 4 h postfeeding, and digesta and tissue samples were collected from various parts of the small and large intestine. The carbohydrates in the LF diet were well digested in the small intestine, resulting in less digesta in all segments of the intestinal tract. The fermentation of nonstarch polysaccharides in the large intestine was affected by the chemical composition and physicochemical properties. The digesta from pigs fed the LF diet provided low levels of fermentable carbohydrates that were depleted in proximal colon, whereas for pigs fed the 2 high-DF diets, the digesta was depleted of fermentable carbohydrates at more distal locations of the colon. The consequence was an increased retention time, greater DM percentage, decreased amount of material, and a decreased tissue weight after feeding the LF diet compared with the HF diets. The concentration of short-chain fatty acids was consistent with the fermentability of carbohydrates in the large intestine, but there was no effect of the dietary composition on the molar short-chain fatty acid proportions. It was further shown that feeding the diet providing the greatest amount of fermentable carbohydrates (diet HF1, which was high in soluble DF) resulted in significant morphological changes in the colon compared with the LF diet.

  10. Data Prospecting Framework - a new approach to explore "big data" in Earth Science

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Rushing, J.; Lin, A.; Kuo, K.

    2012-12-01

    Due to advances in sensors, computation and storage, the cost and effort required to produce large datasets have been significantly reduced. As a result, we are seeing a proliferation of large-scale data sets being assembled in almost every science field, especially in geosciences. Opportunities to exploit the "big data" are enormous, as new hypotheses can be generated by combining and analyzing large amounts of data. However, such a data-driven approach to science discovery assumes that scientists can find and isolate relevant subsets from vast amounts of available data. Current Earth Science data systems only provide data discovery through simple metadata and keyword-based searches and are not designed to support data exploration capabilities based on the actual content. Consequently, scientists often find themselves downloading large volumes of data, struggling with large amounts of storage and learning new analysis technologies that will help them separate the wheat from the chaff. New mechanisms of data exploration are needed to help scientists discover the relevant subsets. We present data prospecting, a new content-based data analysis paradigm to support data-intensive science. Data prospecting allows researchers to explore big data in determining and isolating data subsets for further analysis. This is akin to geo-prospecting, in which mineral sites of interest are determined over the landscape through screening methods. The resulting "data prospects" only provide an interaction with and feel for the data through first-look analytics; the researchers would still have to download the relevant datasets and analyze them deeply using their favorite analytical tools to determine if the datasets will yield new hypotheses. Data prospecting combines two traditional categories of data analysis, data exploration and data mining, within the discovery step. Data exploration utilizes manual/interactive methods for data analysis such as standard statistical analysis and visualization, usually on small datasets. On the other hand, data mining utilizes automated algorithms to extract useful information. Humans guide these automated algorithms and specify algorithm parameters (training samples, clustering size, etc.). Data prospecting combines these two approaches using high performance computing and new techniques for efficient distributed file access.
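    As one way to picture the "first-look analytics" idea, the sketch below streams a large value sequence and retains only constant-memory summaries, the kind of lightweight screening a data prospect could report before any full download. The generator is a stand-in for a real archive granule; none of the names come from the paper.

      def first_look(values):
          # one pass, O(1) memory: count, incremental mean, and extremes
          n, mean, lo, hi = 0, 0.0, float("inf"), float("-inf")
          for v in values:
              n += 1
              mean += (v - mean) / n      # incremental mean update
              lo, hi = min(lo, v), max(hi, v)
          return {"count": n, "mean": mean, "min": lo, "max": hi}

      simulated_granule = (0.1 * (i % 37) for i in range(1_000_000))
      print(first_look(simulated_granule))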

  11. Scaling up to address data science challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, Joanne R.

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.

  12. Scaling up to address data science challenges

    DOE PAGES

    Wendelberger, Joanne R.

    2017-04-27

    Statistics and Data Science provide a variety of perspectives and technical approaches for exploring and understanding Big Data. Partnerships between scientists from different fields such as statistics, machine learning, computer science, and applied mathematics can lead to innovative approaches for addressing problems involving increasingly large amounts of data in a rigorous and effective manner that takes advantage of advances in computing. Here, this article will explore various challenges in Data Science and will highlight statistical approaches that can facilitate analysis of large-scale data including sampling and data reduction methods, techniques for effective analysis and visualization of large-scale simulations, and algorithms and procedures for efficient processing.
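    Among the sampling and data reduction methods alluded to above, reservoir sampling is one standard streaming technique; the sketch below implements Algorithm R on toy data as an illustration, not necessarily the authors' approach.

      import random

      def reservoir_sample(stream, k, seed=42):
          # keep a uniform random sample of size k from a stream of unknown length
          rng = random.Random(seed)
          sample = []
          for i, item in enumerate(stream):
              if i < k:
                  sample.append(item)          # fill the reservoir first
              else:
                  j = rng.randint(0, i)        # replace with probability k/(i+1)
                  if j < k:
                      sample[j] = item
          return sample

      print(reservoir_sample(range(1_000_000), k=5))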

  13. Querying Large Biological Network Datasets

    ERIC Educational Resources Information Center

    Gulsoy, Gunhan

    2013-01-01

    New experimental methods have resulted in an increasing amount of genetic interaction data being generated every day. Biological networks are used to store the genetic interaction data gathered. The increasing amount of available data requires fast, large-scale analysis methods. Therefore, we address the problem of querying large biological network datasets.…

  14. Identifying and quantifying urban recharge: a review

    NASA Astrophysics Data System (ADS)

    Lerner, David N.

    2002-02-01

    The sources of and pathways for groundwater recharge in urban areas are more numerous and complex than in rural environments. Buildings, roads, and other surface infrastructure combine with man-made drainage networks to change the pathways for precipitation. Some direct recharge is lost, but additional recharge can occur from storm drainage systems. Large amounts of water are imported into most cities for supply, distributed through underground pipes, and collected again in sewers or septic tanks. The leaks from these pipe networks often provide substantial recharge. Sources of recharge in urban areas are identified through piezometry, chemical signatures, and water balances. All three approaches have problems. Recharge is quantified either by individual components (direct recharge, water-mains leakage, septic tanks, etc.) or holistically. Working with individual components requires large amounts of data, much of which is uncertain and is likely to lead to large uncertainties in the final result. Recommended holistic approaches include the use of groundwater modelling and solute balances, where various types of data are integrated. Urban recharge remains an under-researched topic, with few high-quality case studies reported in the literature.

  15. Risk assessment for ecotoxicity of pharmaceuticals--an emerging issue.

    PubMed

    Kar, Supratik; Roy, Kunal

    2012-03-01

    The existence of a large amount of pharmaceuticals and their active metabolites in the environment has recently been considered one of the most serious concerns in environmental science. A large diversity of pharmaceuticals has been found in the environment in considerable amounts that are not only destructive to the environment but also harmful to humans and animals. There is a considerable lack of knowledge about the environmental fate and quantification of a large number of pharmaceuticals. This communication reviews the literature on the occurrence of pharmaceuticals and their metabolites in the environment, their persistence, environmental fate and toxicity, as well as the application of theoretical, non-experimental, non-animal, alternative and, in particular, in silico methods to provide information about the basic physicochemical and fate properties of pharmaceuticals in the environment. The reader will gain an overview of risk assessment strategies for the ecotoxicity of pharmaceuticals and of advances in the application of quantitative structure-toxicity relationships (QSTR) in this field. This review justifies the need to develop more QSTR models for predicting the ecotoxicity of pharmaceuticals in order to reduce the time and cost involved in such exercises.

  16. Highly crystallized nanometer-sized zeolite a with large Cs adsorption capability for the decontamination of water.

    PubMed

    Torad, Nagy L; Naito, Masanobu; Tatami, Junichi; Endo, Akira; Leo, Sin-Yen; Ishihara, Shinsuke; Wu, Kevin C-W; Wakihara, Toru; Yamauchi, Yusuke

    2014-03-01

    Nanometer-sized zeolite A with a large cesium (Cs) uptake capability is prepared through a simple post-milling recrystallization method. This method is suitable for producing nanometer-sized zeolite on a large scale, as additional organic compounds are not needed to control zeolite nucleation and crystal growth. Herein, we perform a quartz crystal microbalance (QCM) study to evaluate the uptake ability of Cs ions by zeolite, to the best of our knowledge for the first time. In comparison to micrometer-sized zeolite A, nanometer-sized zeolite A can rapidly accommodate a larger amount of Cs ions into the zeolite crystal structure, owing to its high external surface area. Nanometer-sized zeolite is a promising candidate for the removal of radioactive Cs ions from polluted water. Our QCM study of Cs adsorption uptake behavior provides information on the adsorption kinetics (e.g., adsorption amounts and rates). This technique is applicable to other zeolites, which will be highly valuable for further consideration of radioactive Cs removal in the future. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Integrated Data Capturing Requirements for 3d Semantic Modelling of Cultural Heritage: the Inception Protocol

    NASA Astrophysics Data System (ADS)

    Di Giulio, R.; Maietti, F.; Piaia, E.; Medici, M.; Ferrari, F.; Turillazzi, B.

    2017-02-01

    The generation of high quality 3D models can still be very time-consuming and expensive, and the outcome of digital reconstructions is frequently provided in formats that are not interoperable and therefore cannot be easily accessed. This challenge is even more crucial for complex architectures and large heritage sites, which involve a large amount of data to be acquired, managed and enriched by metadata. In this framework, the ongoing EU-funded project INCEPTION - Inclusive Cultural Heritage in Europe through 3D semantic modelling - proposes a workflow aimed at the achievement of efficient 3D digitization methods, post-processing tools for enriched semantic modelling, and web-based solutions and applications to ensure wide access for experts and non-experts. In order to face these challenges and to start solving the issue of the large amount of captured data and time-consuming processes in the production of 3D digital models, an Optimized Data Acquisition Protocol (DAP) has been set up. Its purpose is to guide the processes of digitization of cultural heritage, respecting the needs, requirements and specificities of cultural assets.

  18. Evaluation of Flush-Mounted, S-Duct Inlets with Large Amounts of Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Berrier, Bobby L.; Morehouse, Melissa B.

    2003-01-01

    A new high Reynolds number test capability for boundary layer ingesting inlets has been developed for the NASA Langley Research Center 0.3-Meter Transonic Cryogenic Tunnel. Using this new capability, an experimental investigation of four S-duct inlet configurations with large amounts of boundary layer ingestion (nominal boundary layer thickness of about 40% of inlet height) was conducted at realistic operating conditions (high subsonic Mach numbers and full-scale Reynolds numbers). The objectives of this investigation were to 1) provide a database for CFD tool validation on boundary layer ingesting inlets operating at realistic conditions and 2) provide a baseline inlet for future inlet flow-control studies. Tests were conducted at Mach numbers from 0.25 to 0.83, Reynolds numbers (based on duct exit diameter) from 5.1 million to a full-scale value of 13.9 million, and inlet mass-flow ratios from 0.39 to 1.58 depending on Mach number. Results of this investigation indicate that inlet pressure recovery generally decreased and inlet distortion generally increased with increasing Mach number. Except at low Mach numbers, increasing inlet mass-flow increased pressure recovery and increased distortion. Increasing the amount of boundary layer ingestion (by decreasing inlet throat height) or ingesting a boundary layer with a distorted (adverse) profile decreased pressure recovery and increased distortion. Finally, increasing Reynolds number had almost no effect on inlet distortion but increased inlet recovery by about one-half percent at a Mach number near cruise.

  19. Analysis of volatile organic compounds. [trace amounts of organic volatiles in gas samples

    NASA Technical Reports Server (NTRS)

    Zlatkis, A. (Inventor)

    1977-01-01

    An apparatus and method are described for reproducibly analyzing trace amounts of a large number of organic volatiles existing in a gas sample. Direct injection of the trapped volatiles into a cryogenic precolumn provides a sharply defined plug. Applications of the method include: (1) analyzing the headspace gas of body fluids and comparing a profile of the organic volatiles with standard profiles for the detection and monitoring of disease; (2) analyzing the headspace gas of foods and beverages and comparing the profile with standard profiles to monitor and control flavor and aroma; and (3) analyses for determining the organic pollutants in air or water samples.

  20. Serum Amylase in Bulimia Nervosa and Purging Disorder: Differentiating the Association with Binge Eating versus Purging Behavior

    PubMed Central

    Wolfe, Barbara E.; Jimerson, David C.; Smith, Adrian; Keel, Pamela K.

    2011-01-01

    Objective Elevated serum amylase levels in bulimia nervosa (BN), associated with increased salivary gland size and self-induced vomiting in some patients, provide a possible marker of symptom severity. The goal of this study was to assess whether serum hyperamylasemia in BN is more closely associated with binge eating episodes involving consumption of large amounts of food or with purging behavior. Method Participants included women with BN (n=26); women with "purging disorder" (PD), a subtype of EDNOS characterized by recurrent purging in the absence of objectively large binge eating episodes (n=14); and healthy non-eating disorder female controls (n=32). There were no significant differences in age or body mass index (BMI) across groups. The clinical groups reported similar frequency of self-induced vomiting behavior and were free of psychotropic medications. Serum samples were obtained after overnight fast and were assayed for alpha-amylase by enzymatic method. Results Serum amylase levels were significantly elevated in BN (60.7 ± 25.4 international units [IU]/liter, mean ± sd) in comparison to PD (44.7 ± 17.1 IU/L, p < .02) and to Controls (49.3 ± 15.8, p < .05). Conclusion These findings provide evidence to suggest that it is recurrent binge eating involving large amounts of food, rather than self-induced vomiting, which contributes to elevated serum amylase values in BN. PMID:21781981

  1. Pattern Recognition for a Flight Dynamics Monte Carlo Simulation

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; Hurtado, John E.

    2011-01-01

    The design, analysis, and verification and validation of a spacecraft rely heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data, but flight dynamics engineers lack the time and resources to analyze it all. The growing amount of data combined with the diminishing available time of engineers motivates the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters and, most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
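
    The selection-plus-classification step described in the abstract can be prototyped with off-the-shelf tools. Below is a minimal sketch, not the authors' tool: synthetic Monte Carlo dispersions stand in for flight dynamics data, and a sequential feature selector wrapped around a k-nearest neighbor classifier surfaces the parameters that drive a hypothetical failure flag.

    ```python
    import numpy as np
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(1)
    # synthetic Monte Carlo dispersions: 1000 runs x 8 design parameters
    X = rng.normal(size=(1000, 8))
    # hypothetical failure flag driven jointly by parameters 2 and 5
    y = ((X[:, 2] - X[:, 5]) > 1.5).astype(int)

    knn = KNeighborsClassifier(n_neighbors=15)
    sfs = SequentialFeatureSelector(knn, n_features_to_select=2).fit(X, y)
    print("influential parameters:", np.flatnonzero(sfs.get_support()))
    ```

    On this toy problem the forward search typically recovers parameters 2 and 5; a kernel density estimate of each selected parameter, as in the abstract, would then show where in parameter space the failures concentrate.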

  2. [Performance enhancement by carbohydrate intake during sport: effects of carbohydrates during and after high-intensity exercise].

    PubMed

    Beelen, Milou; Cermak, Naomi M; van Loon, Luc J C

    2015-01-01

    Endogenous carbohydrate availability does not provide sufficient energy for prolonged moderate to high-intensity exercise. Carbohydrate ingestion during high-intensity exercise can therefore enhance performance. For exercise lasting 1 to 2.5 hours, athletes are advised to ingest 30-60 g of carbohydrates per hour. Well-trained endurance athletes competing for longer than 2.5 hours at high intensity can metabolise up to 90 g of carbohydrates per hour, provided that a mixture of glucose and fructose is ingested. Athletes participating in intermittent or team sports are advised to follow the same strategies, but the timing of carbohydrate intake depends on the type of sport. If top performance is required again within 24 hours after strenuous exercise, the advice is to replenish endogenous carbohydrate supplies quickly within the first few hours post-exercise by ingesting large amounts of carbohydrate (1.2 g/kg/h) or a lower amount of carbohydrate (0.8 g/kg/h) with a small amount of protein (0.2-0.4 g/kg/h).
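
    Since the guidance above is rate-based, absolute targets scale with body mass and the refuelling window. A small worked example applying the quoted rates; the 70 kg athlete and 4-hour window are illustrative assumptions, not values from the article:

    ```python
    def post_exercise_carbs(body_mass_kg, hours, with_protein=False):
        """Glycogen resynthesis targets from the review: 1.2 g/kg/h of
        carbohydrate alone, or 0.8 g/kg/h combined with 0.2-0.4 g/kg/h protein."""
        carb_rate = 0.8 if with_protein else 1.2
        return body_mass_kg * carb_rate * hours

    # a 70 kg athlete refuelling for 4 h: 336 g carbohydrate alone,
    # or 224 g carbohydrate plus 56-112 g protein
    print(post_exercise_carbs(70, 4), post_exercise_carbs(70, 4, with_protein=True))
    ```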

  3. Evaluation of methods for delineating riparian zones in a semi-arid montane watershed

    Treesearch

    Jessica A. Salo; David M. Theobald; Thomas C. Brown

    2016-01-01

    Riparian zones in semi-arid, mountainous regions provide a disproportionate amount of the available wildlife habitat and ecosystem services. Despite their importance, there is little guidance on the best way to map riparian zones for broad spatial extents (e.g., large watersheds) when detailed maps from field data or high-resolution imagery and terrain data...

  4. Radio Emission from Binary Stars

    NASA Astrophysics Data System (ADS)

    Hjellming, R.; Murdin, P.

    2000-11-01

    Stellar radio emission is most common in double star systems where each star provides something essential in producing the large amounts of radio radiation needed for it to be detectable by RADIO TELESCOPES. They transfer mass, supply energy or, when one of the stars is a NEUTRON STAR or BLACK HOLE, have the strong gravitational fields needed for the energetic particles and magnetic fields needed...

  5. Making It Count: A Guide to High-Impact Education Philanthropy.

    ERIC Educational Resources Information Center

    Finn, Chester E., Jr.; Amis, Kelly

    The level of philanthropic giving to K-12 education in the U.S. is at an all-time high, yet there are instances in which large amounts of money bestowed seem to have vanished into the system without leaving lasting footprints or a meaningful transformation. The aim of this guide is to provide practical advice for philanthropists who are tired of…

  6. Grand Tour outer planet missions definition phase. Part 1: Quantitative imaging of the outer planets and their satellites

    NASA Technical Reports Server (NTRS)

    Belton, M. J. S.; Aksnes, K.; Davies, M. E.; Hartmann, W. K.; Millis, R. L.; Owen, T. C.; Reilly, T. H.; Sagan, C.; Suomi, V. E.; Collins, S. A., Jr.

    1972-01-01

    A recommended imaging system is outlined for use aboard the Outer Planet Grand Tour Explorer. The system features the high angular resolution capacity necessary to accommodate large encounter distances, and to satisfy the demand for a reasonable amount of time coverage. Specifications for all components within the system are provided in detail.

  7. Ecological foundations for fire management in North American forest and shrubland ecosystems

    Treesearch

    J.E. Keeley; G.H. Aplet; N.L. Christensen; S.G. Conard; E.A. Johnson; P.N. Omi; D.L. Peterson; T.W. Swetnam

    2009-01-01

    This synthesis provides an ecological foundation for management of the diverse ecosystems and fire regimes of North America based on scientific principles of fire interactions with vegetation, fuels, and biophysical processes. Although a large amount of scientific data on fire exists, most of those data have been collected at small spatial and temporal scales. Thus, it...

  8. Computer User's Guide to the Protection of Information Resources. NIST Special Publication 500-171.

    ERIC Educational Resources Information Center

    Helsing, Cheryl; And Others

    Computers have changed the way information resources are handled. Large amounts of information are stored in one central place and can be accessed from remote locations. Users have a personal responsibility for the security of the system and the data stored in it. This document outlines the user's responsibilities and provides security and control…

  9. 1:1 Laptop Implications and District Policy Considerations

    ERIC Educational Resources Information Center

    Sauers, Nicholas J.

    2012-01-01

    Background. The state of Iowa has seen a drastic increase in the number of schools that provide one laptop for each student. These 1:1 schools have invested large amounts of time and money into becoming a 1:1 school. The current research on 1:1 schools is sparse, and policy makers are actively trying to evaluate those programs. Purpose. To…

  10. Methodological Challenges in the Analysis of MOOC Data for Exploring the Relationship between Discussion Forum Views and Learning Outcomes

    ERIC Educational Resources Information Center

    Bergner, Yoav; Kerr, Deirdre; Pritchard, David E.

    2015-01-01

    Determining how learners use MOOCs effectively is critical to providing feedback to instructors, schools, and policy-makers on this highly scalable technology. However, drawing inferences about student learning outcomes in MOOCs has proven to be quite difficult due to large amounts of missing data (of various kinds) and to the diverse population…

  11. Economic analysis of municipal wastewater utilization for thermoelectric power production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Safari, I.; Walker, M.; Abbasian, J.

    2011-01-01

    The thermoelectric power industry in the U.S. uses a large amount of freshwater. The large water demand is increasingly a problem, especially for new power plant development, as availability of freshwater for new uses diminishes in the United States. Reusing non-traditional water sources, such as treated municipal wastewater, provides one option to mitigate freshwater usage in the thermoelectric power industry. The amount of freshwater withdrawal that can be displaced with non-traditional water sources at a particular location requires evaluation of the water management and treatment requirements, considering the quality and abundance of the non-traditional water sources. This paper presents the development of an integrated costing model to assess the impact of degraded water treatment, as well as the implications of increased tube scaling in the main condenser. The model developed herein is used to perform case studies of various treatment, condenser cleaning and condenser configurations to provide insight into the ramifications of degraded water use in the cooling loops of thermoelectric power plants. Further, this paper lays the groundwork for the integration of relationships between degraded water quality, scaling characteristics and volatile emission within a recirculating cooling loop model.

  12. A novel methodology to estimate the evolution of construction waste in construction sites.

    PubMed

    Katz, Amnon; Baum, Hadassa

    2011-02-01

    This paper focuses on the accumulation of construction waste generated throughout the erection of new residential buildings. A special methodology was developed in order to provide a model that will predict the flow of construction waste. The amount of waste and its constituents, produced on 10 relatively large construction sites (7,000-32,000 m² of built area), was monitored periodically for a limited time. A model that predicts the accumulation of construction waste was developed based on these field observations. According to the model, waste accumulates in an exponential manner, i.e. smaller amounts are generated during the early stages of construction and increasing amounts are generated towards the end of the project. The total amount of waste from these sites was estimated at 0.2 m³ per 1 m² of floor area. A good correlation was found between the model predictions and actual data from the field survey. Copyright © 2010 Elsevier Ltd. All rights reserved.
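
    To illustrate how such a model might be applied, the sketch below normalizes a generic exponential accumulation curve to the paper's overall estimate of 0.2 m³ of waste per m² of floor area; the functional form and the curvature constant k are assumptions, since the fitted coefficients are not given in the abstract:

    ```python
    import numpy as np

    def waste_accumulated(progress, total_m3, k=3.0):
        """Hypothetical exponential accumulation curve: slow at the start,
        steep near completion, normalized so waste(1.0) == total_m3.
        `progress` is construction progress in [0, 1]; k sets the curvature."""
        return total_m3 * (np.exp(k * progress) - 1) / (np.exp(k) - 1)

    floor_area_m2 = 10_000
    total = 0.2 * floor_area_m2          # the paper's 0.2 m3 per m2 estimate
    for p in (0.25, 0.5, 0.75, 1.0):
        print(f"{p:.0%} complete: {waste_accumulated(p, total):7.0f} m3")
    ```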

  13. Requirement analysis to promote small-sized E-waste collection from consumers.

    PubMed

    Mishima, Kuniko; Nishimura, Hidekazu

    2016-02-01

    The collection and recycling of small-sized waste electrical and electronic equipment is an emerging problem, since these products contain certain amounts of critical metals and rare earths. Even if the amount is not large, having a few supply routes for such recycled resources could be a good strategy for staying competitive in a world of finite resources. Small-sized e-waste sometimes contains personal information, so consumers are often reluctant to put it into recycling bins. In order to promote the recycling of e-waste, collection of used products from consumers becomes important, and effective methods involving incentives for consumers might be necessary. Without such methods, it will be difficult to achieve the critical amounts necessary for an efficient recycling system. This article focuses on used mobile phones as a first case study, since they contain relatively large amounts of valuable metals compared with other small-sized waste electrical and electronic equipment, and a large number of them exist in the market. Surveys were carried out to determine what kinds of recycled material collection services are preferred by consumers. The results clarify that incentive or reward money alone is not a driving force for recycling behaviour. The article discusses the types of services required to promote recycling behaviour, and concludes that securing information, transferring data and providing proper information about resources and the environment can be effective tools to encourage recycling behaviour, together with a potential discount on the purchase of new products associated with the return of used mobile phones. © The Author(s) 2015.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burnett, R.A.

    A major goal of the Analysis of Large Data Sets (ALDS) research project at Pacific Northwest Laboratory (PNL) is to provide efficient data organization, storage, and access capabilities for statistical applications involving large amounts of data. As part of the effort to achieve this goal, a self-describing binary (SDB) data file structure has been designed and implemented together with a set of basic data manipulation functions and supporting SDB data access routines. Logical and physical data descriptors are stored in SDB files preceding the data values. SDB files thus provide a common data representation for interfacing diverse software components. This paper describes the various types of data descriptors and data structures permitted by the file design. Data buffering, file segmentation, and a segment overflow handler are also discussed.

  15. A sequence of physical processes quantified in LAOS by continuous local measures

    NASA Astrophysics Data System (ADS)

    Lee, Ching-Wei; Rogers, Simon A.

    2017-11-01

    The response to large amplitude oscillatory shear of a soft colloidal glass formed by a suspension of multiarm star polymers is investigated by means of well-defined continuous local measures. The local measures provide information regarding the transient elastic and viscous response of the material, as well as elastic extension via a shifting equilibrium position. It is shown that even when the amplitude of the strain is very large, cages reform and break twice per period and exhibit maximum elasticity around the point of zero stress. It is also shown that around the point of zero stress, the cages are extended by a nearly constant amount of approximately 5% at 1 rad/s and 7% at 10 rad/s, even when the total strain is as large as 420%. The results of this study provide a blueprint for a generic approach to elucidating the complex dynamics exhibited by soft materials under flow.

  16. The composition of the primitive atmosphere and the synthesis of organic compounds on the early Earth

    NASA Technical Reports Server (NTRS)

    Bada, J. L.; Miller, S. L.

    1985-01-01

    The generally accepted theory for the origin of life on the Earth requires that a large variety of organic compounds be present to form the first living organisms and to provide the energy sources for primitive life either directly or through various fermentation reactions. This can provide a strong constraint on discussions of the formation of the Earth and on the composition of the primitive atmosphere. In order for substantial amounts of organic compounds to have been present on the prebiological Earth, certain conditions must have existed. There is a large body of literature on the prebiotic synthesis of organic compounds in various postulated atmospheres. In this mixture of abiotically synthesized organic compounds, the amino acids are of special interest since they are utilized by modern organisms to synthesize structural materials and a large array of catalytic peptides.

  17. Sensor Alerting Capability

    NASA Astrophysics Data System (ADS)

    Henriksson, Jakob; Bermudez, Luis; Satapathy, Goutam

    2013-04-01

    There is a large amount of sensor data generated today by various sensors, from in-situ buoys to mobile underwater gliders. Providing sensor data to users through standardized services, language and data models is the promise of OGC's Sensor Web Enablement (SWE) initiative. As the amount of data grows, it is becoming difficult for data providers, planners and managers to ensure the reliability of data and services and to monitor critical data changes. Intelligent Automation Inc. (IAI) is developing a net-centric alerting capability to address these issues. The capability is built on Sensor Observation Services (SOSs), which are used to collect and monitor sensor data. Alerts can be configured at the service level and at the sensor data level. For example, the system can alert on irregular data delivery events or on a geo-temporal statistic of sensor data crossing a preset threshold. The capability provides multiple delivery mechanisms and protocols, including traditional techniques such as email and RSS. With this capability, decision makers can monitor their assets and data streams, correct failures or be alerted about an approaching phenomenon.
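
    A minimal sketch of the two alert classes mentioned above, service-level delivery monitoring and data-level threshold checks; the class, field names and thresholds are hypothetical, not IAI's implementation:

    ```python
    import statistics
    import time
    from dataclasses import dataclass

    @dataclass
    class ThresholdAlert:
        max_gap_s: float       # alert if no observation arrives within this gap
        stat_threshold: float  # alert if a rolling statistic crosses this value

        def check(self, timestamps, values, window=20):
            """Return the alerts raised by the latest observations."""
            alerts = []
            if time.time() - timestamps[-1] > self.max_gap_s:
                alerts.append("irregular data delivery")
            if statistics.mean(values[-window:]) > self.stat_threshold:
                alerts.append("statistic crossed threshold")
            return alerts

    # a feed that went quiet two minutes ago, with unremarkable values
    alert = ThresholdAlert(max_gap_s=60.0, stat_threshold=5.0)
    print(alert.check([time.time() - 120.0], [1.0, 2.0, 4.0]))
    ```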

  18. Pseudoelastic intramedullary nailing for tibio-talo-calcaneal arthrodesis.

    PubMed

    Yakacki, Christopher M; Gall, Ken; Dirschl, Douglas R; Pacaccio, Douglas J

    2011-03-01

    Tibio-talo-calcaneal (TTC) arthrodesis is a procedure to treat severe ankle arthropathy by providing a pain-free and stable fusion. Intramedullary (IM) nails offer a method of internal fixation for TTC arthrodesis by providing compressive stability, as well as shear and torsional rigidity. IM nails have been developed to apply compression to the TTC complex during installation; however, current designs are highly susceptible to a loss of compression when exposed to small amounts of bone resorption and cyclic loading. Nickel titanium (NiTi) is a shape-memory alloy capable of recovering large amounts of deformation via shape-memory or pseudoelasticity. Currently, the next generation of IM nails is being developed to utilize the adaptive, pseudoelastic properties of NiTi and provide a fusion nail that is resistant to loss of compression or loosening. Specifically, the pseudoelastic IM nail contains an internal NiTi compression element that applies sustained compression during the course of fusion, analogous to external fixators. © 2011 Expert Reviews Ltd

  19. A Framework for Real-Time Collection, Analysis, and Classification of Ubiquitous Infrasound Data

    NASA Astrophysics Data System (ADS)

    Christe, A.; Garces, M. A.; Magana-Zook, S. A.; Schnurr, J. M.

    2015-12-01

    Traditional infrasound arrays are generally expensive to install and maintain. There are ~10^3 infrasound channels on Earth today. The amount of data currently provided by legacy architectures can be processed on a modest server. However, the growing availability of low-cost, ubiquitous, and dense infrasonic sensor networks presents a substantial increase in the volume, velocity, and variety of data flow. Initial data from a prototype ubiquitous global infrasound network is already pushing the boundaries of traditional research server and communication systems, in particular when serving data products over heterogeneous, international network topologies. We present a scalable, cloud-based approach for capturing and analyzing large amounts of dense infrasonic data (>10^6 channels). We utilize Akka actors with WebSockets to maintain data connections with infrasound sensors. Apache Spark provides streaming, batch, machine learning, and graph processing libraries which will permit signature classification, cross-correlation, and other analytics in near real time. This new framework and approach provide significant advantages in scalability and cost.
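
    The Spark side of such a pipeline can be sketched in a few lines. This batch toy flags stations whose peak amplitude crosses a threshold; the schema, values and threshold are invented, and a production system would use Spark's streaming and MLlib APIs as the abstract describes:

    ```python
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("infrasound-sketch").getOrCreate()

    # toy stand-in for a dense sensor feed: (station, epoch seconds, pressure in Pa)
    rows = [("S1", 0.0, 0.010), ("S1", 1.0, 0.020), ("S1", 2.0, 1.900),
            ("S2", 0.0, 0.010), ("S2", 1.0, 0.015), ("S2", 2.0, 0.012)]
    df = spark.createDataFrame(rows, ["station", "t", "pressure_pa"])

    # per-station amplitude statistic; flag stations over a hypothetical threshold
    peaks = df.groupBy("station").agg(F.max(F.abs(F.col("pressure_pa"))).alias("peak"))
    peaks.filter(F.col("peak") > 1.0).show()  # -> only S1 is flagged

    spark.stop()
    ```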

  20. Accurate Inventories Of Irrigated Land

    NASA Technical Reports Server (NTRS)

    Wall, S.; Thomas, R.; Brown, C.

    1992-01-01

    System for taking land-use inventories overcomes two problems in estimating extent of irrigated land: only small portion of large state surveyed in given year, and aerial photographs made on 1 day out of year do not provide adequate picture of areas growing more than one crop per year. Developed for state of California as guide to controlling, protecting, conserving, and distributing water within state. Adapted to any large area in which large amounts of irrigation water needed for agriculture. Combination of satellite images, aerial photography, and ground surveys yields data for computer analysis. Analyst also consults agricultural statistics, current farm reports, weather reports, and maps. These information sources aid in interpreting patterns, colors, textures, and shapes on Landsat-images.

  1. Prediction of large negative shaded-side spacecraft potentials

    NASA Technical Reports Server (NTRS)

    Prokopenko, S. M. L.; Laframboise, J. G.

    1977-01-01

    A calculation by Knott, for the floating potential of a spherically symmetric synchronous-altitude satellite in eclipse, was adapted to provide simple calculations of upper bounds on negative potentials which may be achieved by electrically isolated shaded surfaces on spacecraft in sunlight. Large (approximately 60 percent) increases in predicted negative shaded-side potentials are obtained. To investigate effective potential barrier or angular momentum selection effects due to the presence of less negative sunlit-side or adjacent surface potentials, these expressions were replaced by the ion random current, which is a lower bound for convex surfaces when such effects become very severe. Further large increases in predicted negative potentials were obtained, amounting to a doubling in some cases.

  2. A fast boosting-based screening method for large-scale association study in complex traits with genetic heterogeneity.

    PubMed

    Wang, Lu-Yong; Fasulo, D

    2006-01-01

    Genome-wide association studies for complex diseases generate massive amounts of single nucleotide polymorphism (SNP) data. Univariate statistical tests (e.g., Fisher's exact test) are used to single out non-associated SNPs. However, disease-susceptible SNPs may have little marginal effect in the population and are unlikely to be retained after univariate tests. Also, model-based methods are impractical for large-scale datasets. Moreover, genetic heterogeneity makes it harder for traditional methods to identify the genetic causes of diseases. A more recent random forest method provides a more robust way of screening SNPs on the scale of thousands. However, for larger-scale data, such as Affymetrix Human Mapping 100K GeneChip data, a faster screening method is required for whole-genome, large-scale association analysis with genetic heterogeneity. We propose a boosting-based method for rapid screening in large-scale analysis of complex traits in the presence of genetic heterogeneity. It provides a relatively fast and fairly good tool for screening and limiting the candidate SNPs for further, more complex computational modeling tasks.
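
    A generic boosting-based screen in the same spirit can be mocked up with scikit-learn; this is not the authors' algorithm, and the genotype matrix and causal SNP indices are synthetic:

    ```python
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier

    rng = np.random.default_rng(2)
    n_samples, n_snps = 500, 2_000
    X = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)  # genotypes 0/1/2
    # heterogeneity toy: either of two disjoint causal SNPs raises disease risk
    risk = (X[:, 10] == 2) | (X[:, 77] == 2)
    y = (risk & (rng.random(n_samples) < 0.8)).astype(int)

    # boosted decision stumps; importances rank SNPs for follow-up modeling
    clf = AdaBoostClassifier(n_estimators=200).fit(X, y)
    top = np.argsort(clf.feature_importances_)[::-1][:10]
    print("candidate SNPs:", top)  # the causal indices should rank near the top
    ```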

  3. Assistance received by employed caregivers and their care recipients: who helps care recipients when caregivers work full time?

    PubMed

    Scharlach, Andrew E; Gustavson, Kristen; Dal Santo, Teresa S

    2007-12-01

    This study examined the association among caregiver labor force participation, employees' caregiving activities, and the amount and quality of care received by care recipients. Telephone interviews were conducted with 478 adults who were employed full time and 705 nonemployed adults who provided care to a family member or friend aged 50 or older, identified through random sampling of California households. We assessed care recipient impairment and service problems; the amounts and types of assistance received from caregivers, family and friends, and paid providers; and caregiver utilization of support services. Care recipients of caregivers employed full time were less likely to receive large amounts of care from their caregivers, more likely to receive personal care from paid care providers, more likely to use community services, and more likely to experience service problems than were care recipients of nonemployed caregivers. Employed caregivers were more likely to use caregiver support services than were nonemployed caregivers. Accommodation to caregiver full-time employment involves selective supplementation by caregivers and their care recipients, reflecting increased reliance on formal support services as well as increased vulnerability to service problems and unmet care recipient needs. These findings suggest the need for greater attention to the well-being of disabled elders whose caregivers are employed full time.

  4. An expert system for diagnosing environmentally induced spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David

    1992-01-01

    A new rule-based, machine-independent analytical tool was designed for diagnosing spacecraft anomalies using an expert system. Expert systems provide an effective method for saving knowledge, allow computers to sift through large amounts of data pinpointing significant parts, and, most importantly, use heuristics in addition to algorithms, which allow approximate reasoning and inference and the ability to attack problems not rigidly defined. The knowledge base consists of over two hundred (200) rules and provides links to historical and environmental databases. The environmental causes considered are bulk charging, single-event upsets (SEUs), surface charging, and total radiation dose. The system's driver translates forward-chaining rules into a backward-chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The modularity of the expert system allows for easy updates and modifications. It not only provides scientists with needed risk analysis and confidence not found in algorithmic programs, but is also an effective learning tool, and the window implementation makes it very easy to use. The system currently runs on a MicroVAX II at Goddard Space Flight Center (GSFC). The inference engine used is NASA's C Language Integrated Production System (CLIPS).

  5. Older adults' memory for medical information, effect of number and mode of presentation: An experimental study.

    PubMed

    Latorre-Postigo, José Miguel; Ros-Segura, Laura; Navarro-Bravo, Beatriz; Ricarte-Trives, Jorge Javier; Serrano-Selva, Juan Pedro; López-Torres-Hidalgo, Jesús

    2017-01-01

    To analyze different ways of presenting medical information to older adults, tailoring the information and its presentation to the characteristics of memory function in old age. Experimental study. We took into account the following variables: amount of information, type of information and mode of presentation, and time delay. The greater the number of recommendations, the lower the recall; visual presentation does not enhance verbal presentation; lifestyle information is recalled better than medication information; after ten minutes the percentage of memory decreases significantly; the first and last recommendations are better remembered. As a whole, these findings show that older adults remember more medical information when very few recommendations are provided in each session. It is inadvisable to overload older adults with a large amount of information: It is better to program more consultations and provide less information. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  6. Method for quantitative determination and separation of trace amounts of chemical elements in the presence of large quantities of other elements having the same atomic mass

    DOEpatents

    Miller, C.M.; Nogar, N.S.

    1982-09-02

    Photoionization via autoionizing atomic levels combined with conventional mass spectroscopy provides a technique for quantitative analysis of trace quantities of chemical elements in the presence of much larger amounts of other elements with substantially the same atomic mass. Ytterbium samples smaller than 10 ng have been detected using an ArF* excimer laser which provides the atomic ions for a time-of-flight mass spectrometer. Elemental selectivity of greater than 5:1 with respect to lutetium impurity has been obtained. Autoionization via a single photon process permits greater photon utilization efficiency because of its greater absorption cross section than bound-free transitions, while maintaining sufficient spectroscopic structure to allow significant photoionization selectivity between different atomic species. Separation of atomic species from others of substantially the same atomic mass is also described.

  7. Composite Electrolytes for Lithium Batteries: Ionic Liquids in APTES Crosslinked Polymers

    NASA Technical Reports Server (NTRS)

    Tigelaar, Dean M.; Meador, Mary Ann B.; Bennett, William R.

    2007-01-01

    Solvent free polymer electrolytes were made consisting of Li(+) and pyrrolidinium salts of trifluoromethanesulfonimide added to a series of hyperbranched poly(ethylene oxide)s (PEO). The polymers were connected by triazine linkages and crosslinked by a sol-gel process to provide mechanical strength. The connecting PEO groups were varied to help understand the effects of polymer structure on electrolyte conductivity in the presence of ionic liquids. Polymers were also made that contain poly(dimethylsiloxane) groups, which provide increased flexibility without interacting with lithium ions. When large amounts of ionic liquid are added, there is little dependence of conductivity on the polymer structure. However, when smaller amounts of ionic liquid are added, the inherent conductivity of the polymer becomes a factor. These electrolytes are more conductive than those made with high molecular weight PEO imbibed with ionic liquids at ambient temperatures, due to the amorphous nature of the polymer.

  8. Highly entangled states with almost no secrecy.

    PubMed

    Christandl, Matthias; Schuch, Norbert; Winter, Andreas

    2010-06-18

    In this Letter we illuminate the relation between entanglement and secrecy by providing the first example of a quantum state that is highly entangled, but from which, nevertheless, almost no secrecy can be extracted. More precisely, we provide two bounds on the bipartite entanglement of the totally antisymmetric state in dimension d×d. First, we show that the amount of secrecy that can be extracted from the state is low; to be precise it is bounded by O(1/d). Second, we show that the state is highly entangled in the sense that we need a large amount of singlets to create the state: entanglement cost is larger than a constant, independent of d. In order to obtain our results we use representation theory, linear programming, and the entanglement measure known as squashed entanglement. Our findings also clarify the relation between the squashed entanglement and the relative entropy of entanglement.

  9. Students Preserve an Emancipation Site with Archaeological Technology

    ERIC Educational Resources Information Center

    LaRue, Paul

    2010-01-01

    Samuel Gist was a wealthy British merchant who, toward the end of his life, lived in England, but owned a considerable amount of land with a large number of slaves in America. Upon his death in 1815, his will specified that within one year his slaves should be emancipated, and his estate was to provide them with a new beginning in the form of…

  10. Customization of Discriminant Function Analysis for Prediction of Solar Flares

    DTIC Science & Technology

    2005-03-01

    lives such as telecommunication, commercial airlines, electrical power, wireless services, and terrestrial weather tracking and forecasting ... the 1800's can wreak havoc on today's power, fuel, and telecommunication lines and finds its origin in solar activity. Enormous amounts of solar ... inducing potential differences across large areas of the surface. Earth-bound power, fuel, and telecommunication lines grounded to the Earth provide an

  11. Episodic melting and magmatic recycling along 50 Ma in the Variscan belt linked to the orogenic evolution in NW Iberia

    NASA Astrophysics Data System (ADS)

    Gutiérrez-Alonso, G.; López-Carmona, A.; García Acera, G.; Martín Garro, J.; Fernández-Suárez, J.; Gärtner, A.; Hofmann, M.

    2017-12-01

    The advent of a large amount of more precise U-Pb age data on Variscan granitoids from NW Iberia in recent years has provided a more focused picture of the magmatic history of the Western European Variscan belt (WEVB). Based on these data, three main pulses of magmatic activity seem to be well established.

  12. [Corruption and health care system].

    PubMed

    Marasović Šušnjara, Ivana

    2014-06-01

    Corruption is a global problem that occupies a special place in the health care system. A large number of participants in the health care system and numerous interactions among them provide opportunities for various forms of corruption, be it bribery, theft, bureaucratic corruption or incorrect information. Even though it is difficult to measure the amount of corruption in medicine, there are tools that allow framing possible interventions.

  13. A smart multisensor approach to assist blind people in specific urban navigation tasks.

    PubMed

    Ando, B

    2008-12-01

    Visually impaired people are often discouraged from using electronic aids due to complexity of operation, the large amount of training required, a non-optimized degree of information provided to the user, and high cost. In this paper, a new multisensor architecture is discussed that would help blind people perform urban mobility tasks. The device is based on a multisensor strategy and adopts smart signal processing.

  14. Accumulation of Reserve Carbohydrate by Rumen Protozoa and Bacteria in Competition for Glucose

    PubMed Central

    Denton, Bethany L.; Diese, Leanne E.; Firkins, Jeffrey L.

    2014-01-01

    The aim of this study was to determine if rumen protozoa could form large amounts of reserve carbohydrate compared to the amounts formed by bacteria when competing for glucose in batch cultures. We separated large protozoa and small bacteria from rumen fluid by filtration and centrifugation, recombined equal protein masses of each group into one mixture, and subsequently harvested (reseparated) these groups at intervals after glucose dosing. This method allowed us to monitor reserve carbohydrate accumulation of protozoa and bacteria individually. When mixtures were dosed with a moderate concentration of glucose (4.62 or 5 mM) (n = 2 each), protozoa accumulated large amounts of reserve carbohydrate; 58.7% (standard error of the mean [SEM], 2.2%) glucose carbon was recovered from protozoal reserve carbohydrate at time of peak reserve carbohydrate concentrations. Only 1.7% (SEM, 2.2%) was recovered in bacterial reserve carbohydrate, which was less than that for protozoa (P < 0.001). When provided a high concentration of glucose (20 mM) (n = 4 each), 24.1% (SEM, 2.2%) of glucose carbon was recovered from protozoal reserve carbohydrate, which was still higher (P = 0.001) than the 5.0% (SEM, 2.2%) glucose carbon recovered from bacterial reserve carbohydrate. Our novel competition experiments directly demonstrate that mixed protozoa can sequester sugar away from bacteria by accumulating reserve carbohydrate, giving protozoa a competitive advantage and stabilizing fermentation in the rumen. Similar experiments could be used to investigate the importance of starch sequestration. PMID:25548053

  15. Effects of entanglement in an ideal optical amplifier

    NASA Astrophysics Data System (ADS)

    Franson, J. D.; Brewster, R. A.

    2018-04-01

    In an ideal linear amplifier, the output signal is linearly related to the input signal, with an additive noise that is independent of the input. The decoherence of a quantum-mechanical state as a result of optical amplification is usually assumed to be due to the addition of quantum noise. Here we show that entanglement between the input signal and the amplifying medium can produce an exponentially large amount of decoherence in an ideal optical amplifier, even when the gain is arbitrarily close to unity and the added noise is negligible. These effects occur for macroscopic superposition states, where even a small amount of gain can leave a significant amount of which-path information in the environment. Our results show that the usual input/output relation of a linear amplifier does not provide a complete description of the output state when post-selection is used.
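
    For reference, the "usual input/output relation" the abstract refers to is, in its standard textbook form for a phase-insensitive linear amplifier (quoted from standard quantum optics, not from the paper itself):

    ```latex
    \hat{a}_{\mathrm{out}} = \sqrt{G}\,\hat{a}_{\mathrm{in}} + \sqrt{G-1}\,\hat{b}^{\dagger}
    ```

    Here G is the gain and \hat{b} an internal mode of the amplifying medium. The added-noise term vanishes as G approaches unity, which is why the paper's result, that entanglement with the medium can still decohere macroscopic superpositions at near-unity gain, goes beyond this relation.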

  16. System simulation application for determining the size of daily raw material purchases at PT XY

    NASA Astrophysics Data System (ADS)

    Napitupulu, H. L.

    2018-02-01

    Every manufacturing company needs to implement green production, including PT XY, a marine catch processing industry in Sumatera Utara Province. The company is engaged in the processing of squid for export purposes. The company's problem relates to the absence of a decision on the daily purchase amount of squid. The purchase of daily raw materials in varying quantities has caused the company to face either an excess or a shortage of raw materials. Low purchases of raw materials result in reduced productivity, while large purchases lead to increased cooling costs for storage of the excess raw material, as well as possible losses from damaged raw material. Therefore, it is necessary to determine the optimal amount of raw material to purchase each day. This can be determined by applying simulation: a system simulation can provide the expected optimal amount of daily raw material purchases.
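
    A minimal Monte Carlo sketch of the decision the paper describes: simulate uncertain daily demand and choose the purchase size that minimizes expected cost. The demand distribution and cost weights are illustrative assumptions, not PT XY's figures:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def expected_daily_cost(purchase_kg, n_days=10_000):
        """Monte Carlo sketch: trade off shortage (lost production) against
        cooling/storage of excess raw material. All figures are hypothetical."""
        demand = rng.normal(1_000, 150, n_days)        # kg of squid needed per day
        shortage = np.maximum(demand - purchase_kg, 0)
        excess = np.maximum(purchase_kg - demand, 0)
        return (shortage * 2.0 + excess * 0.5).mean()  # cost weights per kg

    candidates = range(800, 1400, 50)
    best = min(candidates, key=expected_daily_cost)
    print("purchase size with lowest simulated cost:", best, "kg")
    ```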

  17. Automatic generation of reports at the TELECOM SCC

    NASA Astrophysics Data System (ADS)

    Beltan, Thierry; Jalbaud, Myriam; Fronton, Jean Francois

    In-orbit satellite follow-up produces a number of reports on a regular basis (daily, weekly, quarterly, annually). Most of these documents reuse the information of former issues plus the increments of the last period of time. They are made up of text, tables, graphs or pictures. The system presented here is the SGMT (Systeme de Gestion de la Memoire Technique), which means Technical Memory Management System. It provides the system operators with tools to generate the greatest part of these reports as automatically as possible. It gives easy access to the reports, and the large amount of available memory enables the user to consult data on the complete lifetime of a satellite family.

  18. Using Visualization in Cockpit Decision Support Systems

    NASA Technical Reports Server (NTRS)

    Aragon, Cecilia R.

    2005-01-01

    In order to safely operate their aircraft, pilots must make rapid decisions based on integrating and processing large amounts of heterogeneous information. Visual displays are often the most efficient method of presenting safety-critical data to pilots in real time. However, care must be taken to ensure the pilot is provided with the appropriate amount of information to make effective decisions and not become cognitively overloaded. The results of two usability studies of a prototype airflow hazard visualization cockpit decision support system are summarized. The studies demonstrate that such a system significantly improves the performance of helicopter pilots landing under turbulent conditions. Based on these results, design principles and implications for cockpit decision support systems using visualization are presented.

  19. Transfer of interferon alfa into human breast milk.

    PubMed

    Kumar, A R; Hale, T W; Mock, R E

    2000-08-01

    Originally assumed to be antiviral substances, the efficacy of interferons in a number of pathologies, including malignancies, multiple sclerosis, and other immune syndromes, is increasingly recognized. This study provides data on the transfer of interferon alfa (2B) into human milk of a patient receiving massive intravenous doses for the treatment of malignant melanoma. Following an intravenous dose of 30 million IU, the amount of interferon transferred into human milk was only slightly elevated (1551 IU/mL) when compared to control milk (1249 IU/mL). These data suggest that even following enormous doses, interferon is probably too large in molecular weight to transfer into human milk in clinically relevant amounts.

  20. Totally Connected Healthcare with TV White Spaces.

    PubMed

    Katzis, Konstantinos; Jones, Richard W; Despotou, Georgios

    2017-01-01

    Recent technological advances in electronics, wireless communications and low-cost medical sensors have generated a plethora of Wearable Medical Devices (WMDs), which are capable of generating considerable amounts of new, unstructured real-time data. This contribution outlines how these data can be propagated to a healthcare system through the internet, using long-distance Radio Access Networks (RANs), and proposes a novel communication system architecture employing White Space Devices (WSD) to provide seamless connectivity to its users. Initial findings indicate that the proposed communication system can facilitate broadband services over a large geographical area, taking advantage of the freely available TV White Spaces (TVWS).

  1. Finite Element Analysis and Optimization of Flexure Bearing for Linear Motor Compressor

    NASA Astrophysics Data System (ADS)

    Khot, Maruti; Gawali, Bajirao

    Nowadays, linear motor compressors are commonly used in miniature cryocoolers instead of rotary compressors, because rotary compressors apply large radial forces to the piston which provide no useful work, cause a large amount of wear and usually require lubrication. Recent trends favour flexure-supported configurations for long life. The present work aims at the design and geometrical optimization of flexure bearings using finite element analysis and at the development of design charts for selection purposes. The work also covers the manufacturing of flexures using different materials and the experimental validation of the finite element analysis results.

  2. Large area single-mode parity-time-symmetric laser amplifiers.

    PubMed

    Miri, Mohammad-Ali; LiKamWa, Patrik; Christodoulides, Demetrios N

    2012-03-01

    By exploiting recent developments associated with parity-time (PT) symmetry in optics, we propose here a new avenue for realizing single-mode, large-area laser amplifiers. This can be accomplished by utilizing the abrupt symmetry-breaking transition that allows the fundamental mode to experience gain while keeping all the higher-order modes neutral. Such PT-symmetric structures can be realized by judiciously coupling two multimode waveguides, one exhibiting gain while the other exhibits an equal amount of loss. Pertinent examples are provided for both semiconductor and fiber laser amplifiers. © 2012 Optical Society of America
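
    The judicious coupling described above is usually analyzed with the standard two-waveguide PT coupled-mode model (textbook form, not reproduced from the paper):

    ```latex
    \frac{d}{dz}\begin{pmatrix} a_1 \\ a_2 \end{pmatrix}
      = \begin{pmatrix} g & i\kappa \\ i\kappa & -g \end{pmatrix}
        \begin{pmatrix} a_1 \\ a_2 \end{pmatrix},
    \qquad
    \lambda_{\pm} = \pm\sqrt{g^{2} - \kappa^{2}}
    ```

    For g < κ the eigenvalues are imaginary and both supermodes stay neutral; at g = κ the PT symmetry breaks and one supermode acquires net gain. Because the inter-waveguide coupling κ differs for each transverse mode pair, the structure can be designed so that only the fundamental mode crosses this threshold.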

  3. Making large amounts of meteorological plots easily accessible to users

    NASA Astrophysics Data System (ADS)

    Lamy-Thepaut, Sylvie; Siemen, Stephan; Sahin, Cihan; Raoult, Baudouin

    2015-04-01

    The European Centre for Medium-Range Weather Forecasts (ECMWF) is an international organisation providing its member organisations with forecasts in the medium time range of 3 to 15 days, and some longer-range forecasts for up to a year ahead, with varying degrees of detail. As part of its mission, ECMWF generates an increasing number of forecast data products for its users. To support the work of forecasters and researchers and to let them make the best use of ECMWF forecasts, the Centre also provides tools and interfaces to visualise its products. This allows users to explore forecasts without having to transfer large amounts of raw data. This is especially true for products based on ECMWF's 50-member ensemble forecast, where specific processing and visualisation are applied to extract information. Every day, thousands of raw data sets are pushed to ECMWF's interactive web charts application, ecCharts, and thousands of products are processed and pushed to ECMWF's institutional web site. ecCharts provides a highly interactive application to display and manipulate recent numerical forecasts for forecasters in national weather services and ECMWF's commercial customers. With ecCharts, forecasters are able to explore ECMWF's medium-range forecasts in far greater detail than has previously been possible on the web, as soon as the forecast becomes available. All ecCharts products are also available through a machine-to-machine web map service based on the OGC Web Map Service (WMS) standard. ECMWF's institutional web site provides access to a large number of graphical products. It was entirely redesigned last year and now shares the same infrastructure as ecCharts, benefiting from some ecCharts functionalities, for example the dashboard. The dashboard, initially developed for ecCharts, allows users to organise their own collection of products depending on their workflow, and is being further developed. In its first implementation, it presents the user's products in a single interface with fast access to the original product and possibilities of synchronous animation between them. Its functionality is being extended to give users the freedom to collect not only ecCharts 2D maps and graphs, but also other ECMWF web products such as monthly and seasonal products, scores, and observation monitoring. The dashboard will play a key role in helping users interpret the large amount of information that ECMWF provides. This talk will present examples of how the new user interface can organise complex meteorological maps and graphs, and will show the new possibilities users have gained by using the web as a medium.
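
    As an illustration of the machine-to-machine access mentioned above, an OGC WMS GetMap call is a plain HTTP request; the endpoint and layer name below are placeholders, not ECMWF's actual service:

    ```python
    import requests

    params = {
        "service": "WMS", "version": "1.3.0", "request": "GetMap",
        "layers": "some_forecast_layer",   # placeholder layer name
        "crs": "EPSG:4326", "bbox": "30,-30,75,50",
        "width": 800, "height": 600, "format": "image/png",
    }
    # placeholder endpoint; a real WMS would return the rendered chart as a PNG
    response = requests.get("https://example.org/wms", params=params, timeout=30)
    with open("chart.png", "wb") as f:
        f.write(response.content)
    ```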

  4. A large-scale solar dynamics observatory image dataset for computer vision applications.

    PubMed

    Kucuk, Ahmet; Banda, Juan M; Angryk, Rafal A

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Solar Dynamics Observatory (SDO) mission has given us unprecedented insight into the Sun's activity. By capturing approximately 70,000 images a day, this mission has created one of the richest and biggest repositories of solar image data available to mankind. With such massive amounts of information, researchers have been able to produce great advances in detecting solar events. In this resource, we compile SDO solar data into a single repository in order to provide the computer vision community with a standardized and curated large-scale dataset of several hundred thousand solar events found on high-resolution solar images. This publicly available resource, along with the generation source code, will accelerate computer vision research on NASA's solar image data by reducing the amount of time spent performing data acquisition and curation from the multiple sources we have compiled. By improving the quality of the data with thorough curation, we anticipate wider adoption and interest from both the computer vision and solar physics communities.

  5. A biomedical information system for retrieval and manipulation of NHANES data.

    PubMed

    Mukherjee, Sukrit; Martins, David; Norris, Keith C; Jenders, Robert A

    2013-01-01

    The retrieval and manipulation of data from large public databases like the U.S. National Health and Nutrition Examination Survey (NHANES) may require sophisticated statistical software and significant expertise that may be unavailable in the university setting. In response, we have developed the Data Retrieval And Manipulation System (DReAMS), an automated information system that handles data extraction and cleaning and then joins different subsets to produce analysis-ready output. The system is a browser-based data warehouse application in which the input data from flat files or operational systems are aggregated in a structured way so that the desired data can be read, recoded, queried, and extracted efficiently. The current pilot implementation of the system provides access to a limited portion of the NHANES database. We plan to increase the amount of data available through the system in the near future and to extend the techniques to other large databases from the CDU archive, which currently holds about 53 databases.
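
    As a minimal sketch of the extract-and-join step that DReAMS automates, the following assumes two publicly distributed NHANES component files (SAS transport format, keyed on the respondent identifier SEQN); the file names and the recoding shown are illustrative, not the system's actual workflow.

```python
# Sketch of joining NHANES component files with pandas. NHANES data are
# distributed as SAS transport (XPT) files keyed on the respondent id SEQN.
import pandas as pd

demo = pd.read_sas("DEMO_J.XPT", format="xport")  # demographics component
bp   = pd.read_sas("BPX_J.XPT", format="xport")   # blood pressure component

# Join the subsets on the respondent identifier to get analysis-ready rows.
merged = demo.merge(bp, on="SEQN", how="inner")

# Recode a variable, as the system does during cleaning (1=male, 2=female).
merged["sex"] = merged["RIAGENDR"].map({1: "male", 2: "female"})
print(merged.head())
```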

  6. Principles of gene microarray data analysis.

    PubMed

    Mocellin, Simone; Rossi, Carlo Riccardo

    2007-01-01

    The development of several gene expression profiling methods, such as comparative genomic hybridization (CGH), differential display, serial analysis of gene expression (SAGE), and gene microarray, together with the sequencing of the human genome, has provided an opportunity to monitor and investigate the complex cascade of molecular events leading to tumor development and progression. The availability of such large amounts of information has shifted the attention of scientists towards a nonreductionist approach to biological phenomena. High-throughput technologies can be used to follow changing patterns of gene expression over time. Among them, gene microarray has become prominent because it is easier to use, does not require large-scale DNA sequencing, and allows for the parallel quantification of thousands of genes from multiple samples. Gene microarray technology is rapidly spreading worldwide and has the potential to drastically change the therapeutic approach to patients affected by tumors. Therefore, it is of paramount importance for both researchers and clinicians to understand the principles underlying the analysis of the huge amounts of data generated with microarray technology.
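
    To make one of these principles concrete, the sketch below applies two elementary steps often used in microarray analysis, log2 transformation and per-gene fold change, to synthetic data; it is illustrative only and not drawn from the paper.

```python
# Illustrative numpy sketch of two basic microarray analysis steps:
# log2 transformation and per-gene fold change between conditions.
import numpy as np

rng = np.random.default_rng(0)
tumor  = rng.lognormal(mean=8, sigma=1, size=(1000, 5))  # 1000 genes x 5 arrays
normal = rng.lognormal(mean=8, sigma=1, size=(1000, 5))

log_tumor, log_normal = np.log2(tumor), np.log2(normal)

# log2 fold change: mean expression difference per gene across samples
log2_fc = log_tumor.mean(axis=1) - log_normal.mean(axis=1)
up_regulated = np.where(log2_fc > 1)[0]  # genes >2-fold higher in tumor
print(len(up_regulated), "genes up-regulated more than 2-fold")
```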

  7. Variable Stars in the Field of V729 Aql

    NASA Astrophysics Data System (ADS)

    Cagaš, P.

    2017-04-01

    Wide field instruments can be used to acquire light curves of tens or even hundreds of variable stars per night, which increases the probability of new discoveries of interesting variable stars and generally increases the efficiency of observations. At the same time, wide field instruments produce a large amount of data, which must be processed using advanced software. The traditional approach, typically used by amateur astronomers, requires an unacceptable amount of time to process each data set. New functionality, built into the SIPS software package, can shorten the time needed to obtain light curves by several orders of magnitude. Also, the newly introduced SILICUPS software is intended for post-processing of stored light curves. It can be used to visualize observations from many nights, to find variable star periods, to evaluate types of variability, etc. This work provides an overview of the tools used to process data from the large field of view around the variable star V729 Aql and demonstrates the results.
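
    The abstract does not name SILICUPS's period-finding algorithm; as one common approach, the sketch below recovers a period from a synthetic, unevenly sampled light curve with a Lomb-Scargle periodogram (via astropy).

```python
# Period search on a synthetic light curve with a Lomb-Scargle periodogram.
# This is a generic illustration, not the SILICUPS implementation.
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(3)
t = np.sort(rng.uniform(0, 30, 300))  # observation times (days), uneven sampling
true_period = 2.7
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / true_period) \
      + 0.02 * rng.normal(size=300)   # magnitudes with noise

frequency, power = LombScargle(t, mag).autopower()
best_period = 1 / frequency[np.argmax(power)]
print(f"recovered period: {best_period:.2f} d (true {true_period} d)")
```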

  8. Evaluation of Flush-Mounted, S-Duct Inlets With Large Amounts of Boundary Layer Ingestion

    NASA Technical Reports Server (NTRS)

    Berrier, Bobby L.; Morehouse, Melissa B.

    2003-01-01

    A new high Reynolds number test capability for boundary-layer-ingesting inlets has been developed for the NASA Langley Research Center 0.3-Meter Transonic Cryogenic Tunnel. Using this new capability, an experimental investigation of four S-duct inlet configurations with large amounts of boundary layer ingestion (nominal boundary layer thickness of about 40% of inlet height) was conducted at realistic operating conditions (high subsonic Mach numbers and full-scale Reynolds numbers). The objectives of this investigation were to 1) develop a new high Reynolds number, boundary-layer-ingesting inlet test capability, 2) evaluate the performance of several boundary-layer-ingesting S-duct inlets, 3) provide a database for CFD tool validation, and 4) provide a baseline inlet for future inlet flow-control studies. Tests were conducted at Mach numbers from 0.25 to 0.83, Reynolds numbers (based on duct exit diameter) from 5.1 million to a full-scale value of 13.9 million, and inlet mass-flow ratios from 0.39 to 1.58, depending on Mach number. Results of this investigation indicate that inlet pressure recovery generally decreased and inlet distortion generally increased with increasing Mach number. Except at low Mach numbers, increasing inlet mass-flow increased both pressure recovery and distortion. Increasing the amount of boundary layer ingestion (by decreasing inlet throat height and increasing inlet throat width) or ingesting a boundary layer with a distorted profile decreased pressure recovery and increased distortion. Finally, increasing Reynolds number had almost no effect on inlet distortion but increased inlet recovery by about one-half percent at a Mach number near cruise.

  9. Energy Contents of Frequently Ordered Restaurant Meals and Comparison with Human Energy Requirements and US Department of Agriculture Database Information: A Multisite Randomized Study

    PubMed Central

    Urban, Lorien E.; Weber, Judith L.; Heyman, Melvin B.; Schichtl, Rachel L.; Verstraete, Sofia; Lowery, Nina S.; Das, Sai Krupa; Schleicher, Molly M.; Rogers, Gail; Economos, Christina; Masters, William A.; Roberts, Susan B.

    2017-01-01

    Background: Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ~50% of US restaurants are individual or small-chain (non-chain) establishments that do not provide nutrition information. Objective: To measure the energy content of frequently ordered meals in non-chain restaurants in three US locations, and to compare it with the energy content of meals from large-chain restaurants, energy requirements, and food database information. Design: A multisite random-sampling protocol was used to measure the energy contents of the most frequently ordered meals from the most popular cuisines in non-chain restaurants, together with equivalent meals from large-chain restaurants. Setting: Meals were obtained from restaurants in San Francisco, CA; Boston, MA; and Little Rock, AR, between 2011 and 2014. Main outcome measures: Meal energy content determined by bomb calorimetry. Statistical analysis performed: Regional and cuisine differences were assessed using a mixed model with restaurant nested within region×cuisine as the random factor. Paired t tests were used to evaluate differences between non-chain and chain meals, human energy requirements, and food database values. Results: Meals from non-chain restaurants contained 1,205±465 kcal/meal, amounts that were not significantly different from equivalent meals from large-chain restaurants (+5.1%; P=0.41). There was a significant effect of cuisine on non-chain meal energy, and three of the four most popular cuisines (American, Italian, and Chinese) had the highest mean energy (1,495 kcal/meal). Ninety-two percent of meals exceeded typical energy requirements for a single eating occasion. Conclusions: Non-chain restaurants lacking nutrition information serve amounts of energy that are typically far in excess of human energy requirements for single eating occasions, and are equivalent to the amounts served by the large-chain restaurants that have previously been criticized for providing excess energy. Restaurants in general, rather than specific categories of restaurant, expose patrons to excessive portions that induce overeating through established biological mechanisms. PMID:26803805

  10. Energy Contents of Frequently Ordered Restaurant Meals and Comparison with Human Energy Requirements and U.S. Department of Agriculture Database Information: A Multisite Randomized Study.

    PubMed

    Urban, Lorien E; Weber, Judith L; Heyman, Melvin B; Schichtl, Rachel L; Verstraete, Sofia; Lowery, Nina S; Das, Sai Krupa; Schleicher, Molly M; Rogers, Gail; Economos, Christina; Masters, William A; Roberts, Susan B

    2016-04-01

    Excess energy intake from meals consumed away from home is implicated as a major contributor to obesity, and ∼50% of US restaurants are individual or small-chain (non-chain) establishments that do not provide nutrition information. The objective was to measure the energy content of frequently ordered meals in non-chain restaurants in three US locations, and to compare it with the energy content of meals from large-chain restaurants, energy requirements, and food database information. A multisite random-sampling protocol was used to measure the energy contents of the most frequently ordered meals from the most popular cuisines in non-chain restaurants, together with equivalent meals from large-chain restaurants. Meals were obtained from restaurants in San Francisco, CA; Boston, MA; and Little Rock, AR, between 2011 and 2014. Meal energy content was determined by bomb calorimetry. Regional and cuisine differences were assessed using a mixed model with restaurant nested within region×cuisine as the random factor. Paired t tests were used to evaluate differences between non-chain and chain meals, human energy requirements, and food database values. Meals from non-chain restaurants contained 1,205±465 kcal/meal, amounts that were not significantly different from equivalent meals from large-chain restaurants (+5.1%; P=0.41). There was a significant effect of cuisine on non-chain meal energy, and three of the four most popular cuisines (American, Italian, and Chinese) had the highest mean energy (1,495 kcal/meal). Ninety-two percent of meals exceeded typical energy requirements for a single eating occasion. Non-chain restaurants lacking nutrition information serve amounts of energy that are typically far in excess of human energy requirements for single eating occasions, and are equivalent to the amounts served by the large-chain restaurants that have previously been criticized for providing excess energy. Restaurants in general, rather than specific categories of restaurant, expose patrons to excessive portions that induce overeating through established biological mechanisms. Copyright © 2016 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  11. Quantum stochastic thermodynamics on harmonic networks

    DOE PAGES

    Deffner, Sebastian

    2016-01-04

    Fluctuation theorems are symmetry relations for the probability of observing a given amount of entropy production in a finite-time process. In a recent paper, Pigeon et al (2016 New J. Phys. 18 013009) derived fluctuation theorems for harmonic networks by means of large deviation theory, and illustrated their novel approach with various examples of experimentally relevant systems. As a main result, Pigeon et al provide new insight into how to consistently formulate quantum stochastic thermodynamics, together with new and robust tools for studying the thermodynamics of quantum harmonic networks.
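
    For context, the canonical detailed fluctuation theorem takes the following form; this is the textbook statement, not necessarily the specific network relations derived by Pigeon et al.

```latex
% Canonical detailed fluctuation theorem for the entropy production \Sigma
% (in units of k_B) over a finite-time process:
\[
  \frac{P(+\Sigma)}{P(-\Sigma)} = e^{\Sigma},
\]
% i.e., trajectories producing entropy are exponentially more likely than
% those consuming the same amount.
```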

  12. Extending Beowulf Clusters

    USGS Publications Warehouse

    Steinwand, Daniel R.; Maddox, Brian; Beckmann, Tim; Hamer, George

    2003-01-01

    Beowulf clusters can provide a cost-effective way to compute numerical models and process large amounts of remote sensing image data. Usually a Beowulf cluster is designed to accomplish a specific set of processing goals, and processing is very efficient when the problem remains inside the constraints of the original design. There are cases, however, when one might wish to compute a problem that is beyond the capacity of the local Beowulf system. In these cases, spreading the problem to multiple clusters or to other machines on the network may provide a cost-effective solution.

  13. Outlier Detection for Sensor Systems (ODSS): A MATLAB Macro for Evaluating Microphone Sensor Data Quality.

    PubMed

    Vasta, Robert; Crandell, Ian; Millican, Anthony; House, Leanna; Smith, Eric

    2017-10-13

    Microphone sensor systems provide information that may be used for a variety of applications. Such systems generate large amounts of data. One concern is microphone failure and the unusual values that may be generated as part of the information collection process. This paper describes methods and a MATLAB graphical interface that provide rapid evaluation of microphone performance and identify irregularities. The approach and interface are described. An application to a microphone array used in a wind tunnel illustrates the methodology.
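
    The paper does not spell out its detection rule, so the following is a generic sketch (in Python rather than MATLAB) of flagging anomalous channels with a robust median-absolute-deviation test, one common choice for this kind of screening.

```python
# Generic MAD-based outlier screen for per-microphone summary statistics.
# Data and threshold are synthetic; this is not the ODSS algorithm.
import numpy as np

def mad_outliers(values, threshold=3.5):
    """Return indices whose robust z-score exceeds the threshold."""
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    mad = np.median(np.abs(values - med))
    if mad == 0:
        return np.array([], dtype=int)
    robust_z = 0.6745 * (values - med) / mad  # 0.6745 scales MAD to ~sigma
    return np.where(np.abs(robust_z) > threshold)[0]

channel_rms = [0.52, 0.49, 0.51, 3.80, 0.50]  # synthetic per-microphone RMS
print(mad_outliers(channel_rms))              # -> [3], the failing channel
```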

  14. Family, society, economy and fertility in Bangladesh.

    PubMed

    Mannan, M A

    1989-09-01

    "This paper examines the socio-economic and cultural conditions under which the large family represents a rational economic goal for parents [in Bangladesh]." The author notes that rural children provide valuable labor services to parents during childhood, grown sons continue to support their parents financially and in other ways, and sons are the most reliable source of security in old age. Daughters, however, remain at home and cost a significant amount for dowries at marriage. It is concluded that prevailing socioeconomic conditions in Bangladesh still provide substantial support for high fertility and son preference. excerpt

  15. Quantum stochastic thermodynamics on harmonic networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deffner, Sebastian

    Fluctuation theorems are symmetry relations for the probability of observing a given amount of entropy production in a finite-time process. In a recent paper, Pigeon et al (2016 New J. Phys. 18 013009) derived fluctuation theorems for harmonic networks by means of large deviation theory, and illustrated their novel approach with various examples of experimentally relevant systems. As a main result, Pigeon et al provide new insight into how to consistently formulate quantum stochastic thermodynamics, together with new and robust tools for studying the thermodynamics of quantum harmonic networks.

  16. Variation of consumer contact with household products: a preliminary investigation.

    PubMed

    Weegels, M E; van Veen, M P

    2001-06-01

    Little information is available on product use by consumers, which severely hampers exposure estimation for consumer products. This article describes actual contact with several consumer products, specifically dishwashing detergents, cleaning products, and hair styling products. How and where products are handled, as well as the duration, frequency, and amount of use, were studied by means of diaries, in-home observations, and measurements. This study addressed the question, "To what extent are frequency, duration, and amount of use associated?" Findings showed that there was a large intra- as well as interindividual variation in frequency, duration, and amount of use, with the interindividual variation being considerably larger. At the same time, results showed that, for a given activity, users tended to follow their own routine. Few relations were found among frequency, duration, and amount of use. It was concluded that, among persons, frequency, duration, and amount of product use act in practice as independent parameters. Diaries appear to be quite suitable for gaining insight into frequently used products. Observations of usage, recorded on video, were indispensable for obtaining particular information on product use. In addition, home visits enabled the collection of specific measurements. Although diaries and home visits are time-consuming, the combination provided insight into variation as well as relations among frequency, duration, and amount of use.

  17. Exoplanet phase curves at large phase angles. Diagnostics for extended hazy atmospheres

    NASA Astrophysics Data System (ADS)

    García Muñoz, A.; Cabrera, J.

    2018-01-01

    At optical wavelengths, Titan's brightness at large Sun-Titan-observer phase angles significantly exceeds its dayside brightness. The brightening that occurs near back-illumination is due to moderately large haze particles in the moon's extended atmosphere that forward-scatter the incident sunlight. Motivated by this phenomenon, here we investigate the forward scattering from currently known exoplanets, its diagnostic possibilities, the observational requirements to resolve it, and potential implications. An analytical expression is derived for the amount of starlight forward-scattered by an exponential atmosphere that takes into account the finite angular size of the star. We use this expression to tentatively estimate how prevalent this phenomenon may be. Based on numerical calculations that consider exoplanet visibility, we identify numerous planets with predicted out-of-transit forward-scattering signals of up to tens of parts per million, provided that aerosols of ≳1 μm size form over an extended vertical region near the optical radius level. We propose that the interpretation of available optical phase curves should be revised to constrain the strength of this phenomenon, which might provide insight into aerosol scale heights and particle sizes. For the relatively general atmospheres considered here, forward scattering reduces the transmission-only transit depth by typically less than the equivalent of a scale height. For short-period exoplanets, the finite angular size of the star severely affects the amount of radiation scattered towards the observer at mid-transit.

  18. An SQL query generator for CLIPS

    NASA Technical Reports Server (NTRS)

    Snyder, James; Chirica, Laurian

    1990-01-01

    As expert systems become more widely used, their access to large amounts of external information becomes increasingly important. This information exists in several forms, such as statistical or tabular data, knowledge gained by experts, and large databases of information maintained by companies. Because many expert systems, including CLIPS, do not provide access to this external information, much of the usefulness of expert systems is left untapped. The scope of this paper is to describe a database extension for the CLIPS expert system shell. The current industry-standard database language is SQL. Due to SQL standardization, large amounts of information stored on various computers, potentially at different locations, will be more easily accessible. Expert systems should be able to directly access these existing databases rather than requiring information to be re-entered into the expert system environment. The ORACLE relational database management system (RDBMS) was used to provide a database connection within the CLIPS environment. To facilitate relational database access, a query generation system was developed as a CLIPS user function. Queries are entered in a CLIPS-like syntax and passed to the query generator, which constructs an SQL query and submits it to the ORACLE RDBMS for execution. The query results are asserted as CLIPS facts. The query generator was developed primarily for use within the ICADS project (Intelligent Computer Aided Design System) currently being developed by the CAD Research Unit at California Polytechnic State University (Cal Poly). In ICADS, several parallel or distributed expert systems access a common knowledge base of facts. Each expert system has a narrow domain of interest and therefore needs only certain portions of the information. The query generator provides a common method of accessing this information and allows the expert system to specify what data is needed without specifying how to retrieve it.
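
    A toy sketch of the translation idea follows; the pattern representation, table, and field names are invented for illustration, and a production system would use bind variables rather than string interpolation.

```python
# Toy illustration of query generation: translate a CLIPS-like fact pattern
# (fields bound to values or left as unbound variables) into an SQL SELECT.
# Not the original generator; a real system would use bind parameters
# instead of interpolating values into the SQL string.
def pattern_to_sql(table, constraints):
    """Build a SELECT from {field: value-or-None} constraints.

    None means the field is an unbound variable (selected, not filtered).
    """
    selected = [f for f, v in constraints.items() if v is None]
    where = [f"{f} = '{v}'" for f, v in constraints.items() if v is not None]
    sql = f"SELECT {', '.join(selected) or '*'} FROM {table}"
    if where:
        sql += " WHERE " + " AND ".join(where)
    return sql

# A pattern with "type" bound and "room_id"/"area" unbound:
print(pattern_to_sql("rooms", {"room_id": None, "area": None, "type": "office"}))
# -> SELECT room_id, area FROM rooms WHERE type = 'office'
```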

  19. Experimental study of a valveless pulse detonation rocket engine using nontoxic hypergolic propellants

    NASA Astrophysics Data System (ADS)

    Kan, Brandon K.

    A pulsed detonation rocket engine concept was explored through the use of hypergolic propellants in a fuel-centered pintle injector combustor. The combustor design yielded a simple open-ended chamber with a pintle-type injection element and pressure instrumentation. High-frequency pressure measurements from the first test series showed the presence of large pressure oscillations in excess of 2000 psia at frequencies between 400 and 600 Hz during operation. High-speed video confirmed the high-frequency pulsed behavior and large amounts of afterburning. Damaged hardware and instrumentation failure limited the amount of data gathered in the first test series, but the experiments met the original test objectives of producing large over-pressures in an open chamber. A second test series proceeded by replacing hardware and instrumentation, and the new data showed that pulsed events produced underexpanded exhaust prior to pulsing, peak pressures around 8000 psi, and operating frequencies between 400 and 800 Hz. Later hot-fires produced no pulsed behavior despite undamaged hardware. The research succeeded in producing pulsed combustion behavior using hypergolic fuels in a pintle injector setup and provided insights into design concepts that will assist future injector designs and experimental test setups.

  20. Application research on big data in energy conservation and emission reduction of transportation industry

    NASA Astrophysics Data System (ADS)

    Bai, Bingdong; Chen, Jing; Wang, Mei; Yao, Jingjing

    2017-06-01

    In the context of the big data age, transportation energy conservation and emission reduction is a natural big data industry. The planning, management, and decision-making aspects of transportation energy conservation and emission reduction should be supported by the analysis and forecasting of large amounts of data. With the development of information technology, such as smart cities and sensor-equipped roads, Internet-of-things data collection technologies are gradually becoming widespread. 3G/4G network transmission technology is developing rapidly, and large volumes of transportation energy-conservation and emission-reduction data are accumulating in different forms. The government should not only make good use of big data to solve the problems of energy conservation and emission reduction in transportation, but also explore and exploit the hidden value behind these large amounts of data. Based on an analysis of the basic characteristics and application technology of transportation energy-conservation and emission-reduction data, this paper carries out application research in the transportation energy conservation and emission reduction industry, so as to provide a theoretical basis and reference value for low-carbon management.

  1. Using Mosix for Wide-Area Compuational Resources

    USGS Publications Warehouse

    Maddox, Brian G.

    2004-01-01

    One of the problems with using traditional Beowulf-type distributed processing clusters is that they require an investment in dedicated computer resources. These resources are usually needed in addition to pre-existing ones such as desktop computers and file servers. Mosix is a series of modifications to the Linux kernel that creates a virtual computer, featuring automatic load balancing by migrating processes from heavily loaded nodes to less used ones. An extension of the Beowulf concept is to run a Mosix-enabled Linux kernel on a large number of computer resources in an organization. This configuration would provide a very large amount of computational resources based on pre-existing equipment. The advantage of this method is that it provides much more processing power than a traditional Beowulf cluster without the added costs of dedicating resources.

  2. The social computing room: a multi-purpose collaborative visualization environment

    NASA Astrophysics Data System (ADS)

    Borland, David; Conway, Michael; Coposky, Jason; Ginn, Warren; Idaszak, Ray

    2010-01-01

    The Social Computing Room (SCR) is a novel collaborative visualization environment for viewing and interacting with large amounts of visual data. The SCR consists of a square room with 12 projectors (3 per wall) used to display a single 360-degree desktop environment that provides a large physical real estate for arranging visual information. The SCR was designed to be cost-effective, collaborative, configurable, widely applicable, and approachable for naive users. Because the SCR displays a single desktop, a wide range of applications is easily supported, making it possible for a variety of disciplines to take advantage of the room. We provide a technical overview of the room and highlight its application to scientific visualization, arts and humanities projects, research group meetings, and virtual worlds, among other uses.

  3. Precision medicine for psychopharmacology: a general introduction.

    PubMed

    Shin, Cheolmin; Han, Changsu; Pae, Chi-Un; Patkar, Ashwin A

    2016-07-01

    Precision medicine is an emerging medical model that can provide accurate diagnoses and tailored therapeutic strategies for patients based on data pertaining to genes, microbiomes, environment, family history and lifestyle. Here, we provide basic information about precision medicine and newly introduced concepts, such as the precision medicine ecosystem and big data processing, and omics technologies including pharmacogenomics, pharmacometabolomics, pharmacoproteomics, pharmacoepigenomics, connectomics and exposomics. The authors review the current state of omics in psychiatry and the future direction of psychopharmacology as it moves towards precision medicine. Expert commentary: Advances in precision medicine have been facilitated by achievements in multiple fields, including large-scale biological databases, powerful methods for characterizing patients (such as genomics, proteomics, metabolomics, diverse cellular assays, and even social networks and mobile health technologies), and computer-based tools for analyzing large amounts of data.

  4. The use of satellite data to determine the distribution of ozone in the troposphere

    NASA Technical Reports Server (NTRS)

    Fishman, Jack; Watson, Catherine E.; Brackett, Vincent G.; Fakhruzzaman, Khan; Veiga, Robert E.

    1991-01-01

    Measurements from two independent satellite data sets have been used to derive the climatology of the integrated amount of ozone in the troposphere. These data have led to the finding that large amounts of ozone pollution are generated by anthropogenic activity originating from both the industrialized regions of the Northern Hemisphere and from the southern tropical regions of Africa. To verify the existence of this ozone anomaly at low latitudes, an ozonesonde capability has been established at Ascension Island (8 deg S, 15 deg W) since July 1990. According to the satellite analyses, Ascension Island is located downwind of the primary source region of this ozone pollution, which likely results from the photochemical oxidation of emissions emanating from the widespread burning of savannas and other biomass. These in situ measurements confirm the existence of large amounts of ozone in the lower atmosphere. A summary of these ozonesonde data to date will be presented. In addition, we will present some ozone profile measurements from SAGE II which can be used to provide upper tropospheric ozone measurements directly in the tropical troposphere. A preliminary comparison between the satellite observations and the ozonesonde profiles in the upper troposphere and lower stratosphere will also be presented.

  5. Smart nanogels at the air/water interface: structural studies by neutron reflectivity

    NASA Astrophysics Data System (ADS)

    Zielińska, Katarzyna; Sun, Huihui; Campbell, Richard A.; Zarbakhsh, Ali; Resmini, Marina

    2016-02-01

    The development of effective transdermal drug delivery systems based on nanosized polymers requires a better understanding of the behaviour of such nanomaterials at interfaces. N-Isopropylacrylamide-based nanogels synthesized with different percentages of N,N'-methylenebisacrylamide as cross-linker, ranging from 10 to 30%, were characterized at physiological temperature at the air/water interface, using neutron reflectivity (NR), with isotopic contrast variation, and surface tension measurements; this allowed us to resolve the adsorbed amount and the volume fraction of nanogels at the interface. A large conformational change for the nanogels results in strong deformations at the interface. As the percentage of cross-linker incorporated in the nanogels becomes higher, more rigid matrices are obtained, although less deformed, and the amount of adsorbed nanogels is increased. The data provide the first experimental evidence of structural changes of nanogels as a function of the degree of cross-linking at the air/water interface.

  6. Selected Papers on Low-Energy Antiprotons and Possible Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Noble, Robert

    1998-09-19

    The only realistic means by which to create a facility at Fermilab to produce large amounts of low energy antiprotons is to use resources which already exist. There is simply too little money and manpower at this point in time to generate new accelerators on a time scale before the turn of the century. Therefore, innovation is required to modify existing equipment to provide the services required by experimenters.

  7. Development and application of a soil organic matter-based soil quality index in mineralized terrane of the Western US

    Treesearch

    S. W. Blecker; L. L. Stillings; M. C. Amacher; J. A. Ippolito; N. M. DeCrappeo

    2012-01-01

    Soil quality indices provide a means of distilling large amounts of data into a single metric that evaluates the soil's ability to carry out key ecosystem functions. Primarily developed in agroecosystems, then forested ecosystems, an index using the relation between soil organic matter and other key soil properties in more semi-arid systems of the Western US...

  8. Artificial intelligence applications concepts for the remote sensing and earth science community

    NASA Technical Reports Server (NTRS)

    Campbell, W. J.; Roelofs, L. H.

    1984-01-01

    The following potential applications of AI to the study of earth science are described: (1) intelligent data management systems; (2) intelligent processing and understanding of spatial data; and (3) automated systems that perform tasks currently requiring large amounts of scientists' and engineers' time to complete. An example is provided of how an intelligent information system might operate to support an earth science project.

  9. JOVIAL/Ada Microprocessor Study.

    DTIC Science & Technology

    1982-04-01

    Study Final Technical Report. An interesting feature of the nodes is that they provide multiple virtual terminals, so it is possible to monitor several...Terminal Interface, Tasking, Exception Handling. A more elaborate system could allow such features as spooling, background jobs or multiple users. To a large...Another editor feature is the buffer. Buffers may hold small amounts of text or entire text objects. They allow multiple files to be edited simultaneously

  10. Lightweight Innovative Solar Array (LISA): Providing Higher Power to Small Spacecraft

    NASA Technical Reports Server (NTRS)

    Johnson, Les; Carr, John; Fabisinski, Leo; Russell, Tiffany; Smith, Leigh

    2015-01-01

    Affordable and convenient access to electrical power is essential for all spacecraft and is a critical design driver for the next generation of smallsats, including cubesats, which are currently extremely power limited. The Lightweight Innovative Solar Array (LISA), a concept designed, prototyped, and tested at the NASA Marshall Space Flight Center (MSFC) in Huntsville, Alabama provides an affordable, lightweight, scalable, and easily manufactured approach for power generation in space. This flexible technology has many wide-ranging applications from serving small satellites to providing abundant power to large spacecraft in GEO and beyond. By using very thin, ultra-flexible solar arrays adhered to an inflatable structure, a large area (and thus large amount of power) can be folded and packaged into a relatively small volume. The LISA array comprises a launch-stowed, orbit-deployed structure on which lightweight photovoltaic devices and, potentially, transceiver elements are embedded. The system will provide a 2.5 to 5 fold increase in specific power generation (Watts/kilogram) coupled with a >2x enhancement of stowed volume (Watts/cubic-meter) and a decrease in cost (dollars/Watt) when compared to state-of-the-art solar arrays.

  11. Large, horizontal-axis wind turbines

    NASA Technical Reports Server (NTRS)

    Linscott, B. S.; Perkins, P.; Dennett, J. T.

    1984-01-01

    The development of the technology for safe, reliable, environmentally acceptable large wind turbines that have the potential to generate a significant amount of electricity at costs competitive with conventional electric generating systems is presented. In addition, these large wind turbines must be fully compatible with electric utility operations and interface requirements. There are several ongoing large wind system development projects and applied research efforts directed toward meeting the technology requirements for utility applications. Detailed information on these projects is provided. The Mod-0 research facility and current applied research efforts in aerodynamics, structural dynamics and aeroelasticity, composite and hybrid composite materials, and multiple system interaction are described. A chronology of component research and technology development for large, horizontal-axis wind turbines is presented. Wind characteristics, wind turbine economics, and the impact of wind turbines on the environment are reported. The need for continued wind turbine research and technology development is explored. Over 40 references are cited and a bibliography is included.

  12. Risks of Large Portfolios

    PubMed Central

    Fan, Jianqing; Liao, Yuan; Shi, Xiaofeng

    2014-01-01

    The risk of a large portfolio is often estimated by substituting a good estimator of the volatility matrix. However, the accuracy of such a risk estimator is largely unknown. We study factor-based risk estimators for portfolios with a large number of assets, and introduce a high-confidence level upper bound (H-CLUB) to assess the accuracy of the estimation. The H-CLUB is constructed using the confidence interval of risk estimators with either known or unknown factors. We derive the limiting distribution of the estimated risks in high dimensionality. We find that when the dimension is large, the factor-based risk estimators have the same asymptotic variance whether or not the factors are known, and this variance is slightly smaller than that of the sample covariance-based estimator. Numerically, H-CLUB outperforms the traditional crude bounds and provides an insightful risk assessment. In addition, our simulated results quantify the relative error in the risk estimation, which is usually negligible using 3-month daily data. PMID:26195851
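
    For readers unfamiliar with the object being estimated, the sketch below computes portfolio risk under a synthetic factor-model covariance Σ = BFB' + D; it illustrates the quantity whose estimation error H-CLUB bounds, not the H-CLUB construction itself.

```python
# Sketch of factor-based portfolio risk: build Sigma = B F B' + D from a
# factor model, then evaluate the portfolio volatility sqrt(w' Sigma w).
# All data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
p, k = 200, 3                                  # 200 assets, 3 factors
B = rng.normal(size=(p, k))                    # factor loadings
F = np.diag([0.04, 0.02, 0.01])                # factor covariance
D = np.diag(rng.uniform(0.01, 0.05, size=p))   # idiosyncratic variances

Sigma = B @ F @ B.T + D                        # factor-model covariance matrix
w = np.full(p, 1.0 / p)                        # equal-weight portfolio

risk = np.sqrt(w @ Sigma @ w)                  # portfolio volatility
print(f"estimated portfolio risk: {risk:.4f}")
```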

  13. Use of tropical maize for bioethanol production

    USDA-ARS's Scientific Manuscript database

    Tropical maize is an alternative energy crop being considered as a feedstock for bioethanol production in the North Central and Midwest United States. Tropical maize is advantageous because it produces large amounts of soluble sugars in its stalks, creates a large amount of biomass, and requires lo...

  14. Large wood in the Snowy River estuary, Australia

    NASA Astrophysics Data System (ADS)

    Hinwood, Jon B.; McLean, Errol J.

    2017-02-01

    In this paper we report on 8 years of data collection and interpretation of large wood in the Snowy River estuary in southeastern Australia, providing quantitative data on the amount, sources, transport, decay, and geomorphic actions. No prior census data for an estuary is known to the authors despite their environmental and economic importance and the significant differences between a fluvial channel and an estuarine channel. Southeastern Australian estuaries contain a significant quantity of large wood that is derived from many sources, including river flood flows, local bank erosion, and anthropogenic sources. Wind and tide are shown to be as important as river flow in transporting and stranding large wood. Tidal action facilitates trapping of large wood on intertidal bars and shoals; but channels are wider and generally deeper, so log jams are less likely than in rivers. Estuarine large wood contributes to localised scour and accretion and hence to the modification of estuarine habitat, but in the study area it did not have large-scale impacts on the hydraulic gradients nor the geomorphology.

  15. Solutions for Mining Distributed Scientific Data

    NASA Astrophysics Data System (ADS)

    Lynnes, C.; Pham, L.; Graves, S.; Ramachandran, R.; Maskey, M.; Keiser, K.

    2007-12-01

    Researchers at the University of Alabama in Huntsville (UAH) and the Goddard Earth Sciences Data and Information Services Center (GES DISC) are working on approaches and methodologies facilitating the analysis of large amounts of distributed scientific data. Despite the existence of full-featured analysis tools, such as the Algorithm Development and Mining (ADaM) toolkit from UAH, and data repositories, such as the GES DISC, that provide online access to large amounts of data, there remain obstacles to getting the analysis tools and the data together in a workable environment. Does one bring the data to the tools or deploy the tools close to the data? The large size of many current Earth science datasets incurs significant overhead in network transfer for analysis workflows, even with the advanced networking capabilities that are available between many educational and government facilities. The UAH and GES DISC team is developing a capability to define analysis workflows using distributed services and online data resources. We are developing two solutions to this problem that address different analysis scenarios. The first is a Data Center Deployment of the analysis services for large data selections, orchestrated by a remotely defined analysis workflow. The second is a Data Mining Center approach that provides a cohesive analysis solution for smaller subsets of data. The two approaches can be complementary and thus give researchers the flexibility to exploit the best solution for their data requirements. The Data Center Deployment of the analysis services has been implemented by deploying ADaM web services at the GES DISC so they can access the data directly, without the need for network transfers. Using the Mining Workflow Composer, a user can define an analysis workflow that is then submitted through a Web Services interface to the GES DISC for execution by a processing engine. The workflow definition is composed, maintained, and executed at a distributed location, but most of the actual services comprising the workflow are local to the GES DISC data repository. Additional refinements will ultimately provide a package that is easily implemented and configured at additional data centers for the analysis of additional science datasets. Enhancements to the ADaM toolkit allow the staging of distributed data wherever the services are deployed, to support a Data Mining Center that can provide additional computational resources, large storage of output, easier addition and updating of available services, and access to data from multiple repositories. The Data Mining Center case gives researchers more flexibility to quickly try different workflow configurations and refine the process, using smaller amounts of data that may be transferred from distributed online repositories. This environment is sufficient for some analyses, but can also be used as an initial sandbox to test and refine a solution before staging the execution at a Data Center Deployment. The detection of airborne dust over both water and land in MODIS imagery, using mining services in both solutions, will be presented. Dust detection is just one example of the mining and analysis capabilities the proposed mining services solutions will provide to the science community. More information about the available services and the current status of this project is available at http://www.itsc.uah.edu/mws/
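
    The following schematic sketch illustrates the "send the workflow to the data" pattern described above: a workflow defined remotely as an ordered list of named steps, executed by a processing engine colocated with the repository. Step names, functions, and data are invented for illustration and are not the ADaM or Mining Workflow Composer interfaces.

```python
# Schematic of a remotely defined workflow executed near the data.
# The registry maps step names (as a workflow definition would reference
# them) to local services; names and data here are invented.
def subset(data, bbox):      # runs at the data center, avoiding transfer
    return [d for d in data if bbox[0] <= d["lat"] <= bbox[1]]

def classify_dust(data):     # stand-in for a mining service (e.g., on MODIS)
    return [dict(d, dust=d["reflectance"] > 0.3) for d in data]

REGISTRY = {"subset": subset, "classify_dust": classify_dust}

def run_workflow(workflow, data):
    """Apply each (step, kwargs) pair in order, as a processing engine would."""
    for step, kwargs in workflow:
        data = REGISTRY[step](data, **kwargs)
    return data

granules = [{"lat": 10.0, "reflectance": 0.4}, {"lat": 55.0, "reflectance": 0.1}]
print(run_workflow([("subset", {"bbox": (0, 30)}), ("classify_dust", {})], granules))
```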

  16. An Experimental Study of Medical Error Explanations: Do Apology, Empathy, Corrective Action, and Compensation Alter Intentions and Attitudes?

    PubMed

    Nazione, Samantha; Pace, Kristin

    2015-01-01

    Medical malpractice lawsuits are a growing problem in the United States, and there is much controversy regarding how to best address this problem. The medical error disclosure framework suggests that apologizing, expressing empathy, engaging in corrective action, and offering compensation after a medical error may improve the provider-patient relationship and ultimately help reduce the number of medical malpractice lawsuits patients bring to medical providers. This study provides an experimental examination of the medical error disclosure framework and its effect on amount of money requested in a lawsuit, negative intentions, attitudes, and anger toward the provider after a medical error. Results suggest empathy may play a large role in providing positive outcomes after a medical error.

  17. PARENTS’ UNDERSTANDING OF INFORMATION REGARDING THEIR CHILD’S POSTOPERATIVE PAIN MANAGEMENT

    PubMed Central

    Tait, Alan R.; Voepel-Lewis, Terri; Snyder, Robin M.; Malviya, Shobha

    2009-01-01

    Objectives Unlike information provided for research, information disclosed to patients for treatment or procedures is largely unregulated and, as such, there is likely considerable variability in the type and amount of disclosure. This study was designed to examine the nature of information provided to parents regarding options for postoperative pain control and their understanding thereof. Methods 187 parents of children scheduled to undergo a surgical procedure requiring inpatient postoperative pain control completed questionnaires that elicited information regarding their perceptions and understanding of, and satisfaction with, information regarding postoperative pain management. Results Results showed that there was considerable variability in the content and amount of information provided to parents based on the method of postoperative pain control provided. Parents whose child received Patient Controlled Analgesia (PCA) were given significantly (P< 0.025) more information on the risks and benefits compared to those receiving Nurse Controlled or intravenous-prn (NCA or IV) analgesia. Approximately one third of parents had no understanding of the risks associated with postoperative pain management. Parents who received pain information preoperatively and who were given information regarding the risks and benefits had improved understanding compared to parents who received no or minimal information (P< 0.001). Furthermore, information that was deemed unclear or insufficient resulted in decreased parental understanding. Discussion These results demonstrate the variability in the type and amount of information provided to parents regarding their child’s postoperative pain control and reinforce the importance of clear and full disclosure of pain information, particularly with respect to the risks and benefits. PMID:18716495

  18. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    NASA Technical Reports Server (NTRS)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations, and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both (co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise), although similar, serve different purposes. Data analytics takes a practitioner's approach, applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations, and other information, to better understand our Earth. The large variety of datasets (temporal and spatial differences, data types, formats, etc.) invites the need for data analytics skills that encompass the science domain and data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from the point of view of advancing science: on the continuum of ever-evolving data management systems, we need to understand and develop ways that allow the variety of data relationships to be examined, and information to be manipulated, such that knowledge can be enhanced to facilitate science. Recognizing the importance and potential impacts of the unlimited ways to co-analyze heterogeneous datasets, now and especially in the future, one of the objectives of the ESDA cluster is to facilitate the preparation of individuals to understand and apply the skills needed for Earth science data analytics. Pinpointing and communicating the needed skills and expertise is new, and not easy. Information technology is just beginning to provide the tools for advancing the analysis of heterogeneous datasets in a big way, thus providing the opportunity to discover unobvious scientific relationships previously invisible to the scientist's eye. And it is not easy: it takes individuals, or teams of individuals, with just the right combination of skills to understand the data and develop the methods to glean knowledge out of data and information. In addition, whereas definitions of data science and big data are (more or less) available (summarized in Reference 5), Earth science data analytics is virtually ignored in the literature, barring a few excellent sources.

  19. Aztec cannibalism: an ecological necessity?

    PubMed

    de Montellano, B R

    1978-05-12

    It has been proposed that Aztec human sacrifice and cannibalism can best be explained as a response to population pressure and famine. The greatest amount of cannibalism, however, coincided with times of harvest, not with periods of scarcity, and is better explained as a thanksgiving. Tenochtitlan received large quantities of food tribute and engaged in intensive (chinampa) agriculture. These two sources alone would have provided enough to feed practically the entire population of the city. The Aztecs also consumed various animals and insects that were good protein sources. The amount of protein available from human sacrifice would not have made a significant contribution to the diet. Cannibalism was not motivated by starvation but by a belief that this was a way to commune with the gods.

  20. Going green with eco-friendly dentistry.

    PubMed

    Avinash, Bhagyalakshmi; Avinash, B S; Shivalinga, B M; Jyothikiran, S; Padmini, M N

    2013-07-01

    Eco-friendly dentistry is currently transforming the medical and dental fields to decrease their impact on our natural environment and reduce the amount of waste being produced. Eco-friendly dentistry uses a sustainable approach to encourage dentists to implement new strategies that reduce the energy consumed and the large amount of waste produced by the industry. Many reasonable, practical, and easy alternatives exist that would reduce the environmental footprint of a dental office were it to follow the 'green' recommendations. Dentists should take a leading role in society by implementing 'green' initiatives to lessen their impact on the environment. This article provides a series of 'green' recommendations that dentists around the world can implement to become leading stewards of the environment.

  1. An intelligent load shedding scheme using neural networks and neuro-fuzzy.

    PubMed

    Haidar, Ahmed M A; Mohamed, Azah; Al-Dabbagh, Majid; Hussain, Aini; Masoum, Mohammad

    2009-12-01

    Load shedding is an essential requirement for maintaining the security of modern power systems, particularly in competitive energy markets. This paper proposes an intelligent scheme for fast and accurate load shedding that uses neural networks to predict the possible loss of load at an early stage and neuro-fuzzy logic to determine the amount of load to shed in order to avoid a cascading outage. A large-scale electrical power system was considered to validate the performance of the proposed technique in determining the amount of load to shed. The proposed techniques can provide tools for improving the reliability and continuity of the power supply. This was confirmed by the results obtained in this research, of which sample results are given in this paper.
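
    A minimal sketch of the two-stage idea follows: a neural network predicts the loss of load from system measurements, and a second stage maps that prediction to an amount to shed. The features, data, and the crude rule-based stand-in for the neuro-fuzzy stage are all synthetic assumptions, not the paper's model.

```python
# Two-stage load-shedding sketch: an MLP predicts loss of load; a simple
# piecewise policy (standing in for the neuro-fuzzy stage) sets the shed amount.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
X = rng.uniform(size=(500, 3))  # e.g. frequency dip, voltage sag, loading level
y = 0.6 * X[:, 0] + 0.3 * X[:, 2] + 0.05 * rng.normal(size=500)  # loss of load (p.u.)

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X, y)

def load_to_shed(predicted_loss):
    """Crude stand-in for the neuro-fuzzy stage: piecewise shedding policy."""
    if predicted_loss < 0.2:
        return 0.0
    return min(predicted_loss * 0.8, 0.5)  # shed at most 50% of load

loss = model.predict([[0.9, 0.5, 0.7]])[0]
print(f"predicted loss {loss:.2f} p.u. -> shed {load_to_shed(loss):.2f} p.u.")
```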

  2. Extemporaneous compounding of medicated ointments.

    PubMed

    Nagel, Karen; Ali, Fatima; Al-Khudari, Sarah; Khan, Ayesha; Patel, Khushbu; Patel, Nikunj; Desai, Archana

    2010-01-01

    Topical preparations represent a large percentage of compounded prescriptions, particularly in the area of dermatology. Properties of ointment bases vary greatly, and active ingredients are frequently added as aqueous or alcoholic solutions. Currently, there are no quantitative guidelines stating the various water and alcohol absorption capacity of different bases. A short experiment was designed to quantitate the amount of water or alcohol that could be absorbed by a series of ointment bases of varying types. Our findings may be used to assist compounding pharmacists in deciding what base is most suitable to use when considering the amount of water, alcohol, or any similar solvent needed to compound the preparation. A general overview of issues related to topical medication compounding is also provided in this article.

  3. Gene Expression Browser: Large-Scale and Cross-Experiment Microarray Data Management, Search & Visualization

    USDA-ARS's Scientific Manuscript database

    The amount of microarray gene expression data in public repositories has been increasing exponentially for the last couple of decades. High-throughput microarray data integration and analysis has become a critical step in exploring the large amount of expression data for biological discovery. Howeve...

  4. TransAtlasDB: an integrated database connecting expression data, metadata and variants

    PubMed Central

    Adetunji, Modupeore O; Lamont, Susan J; Schmidt, Carl J

    2018-01-01

    High-throughput transcriptome sequencing (RNAseq) is the universally applied method for target-free transcript identification and gene expression quantification, generating huge amounts of data. The difficulty of accessing such data and interpreting results can be a major impediment to postulating suitable hypotheses, so an innovative storage solution that addresses limitations such as hard disk storage requirements, efficiency, and reproducibility is paramount. By offering a uniform data storage and retrieval mechanism, various data can be compared and easily investigated. We present a sophisticated system, TransAtlasDB, which incorporates a hybrid architecture of both relational and NoSQL databases for fast and efficient data storage, processing, and querying of large datasets from transcript expression analysis, with corresponding metadata as well as gene-associated variants (such as SNPs) and their predicted gene effects. TransAtlasDB provides a data model for accurate storage of the large amounts of data derived from RNAseq analysis, together with methods of interacting with the database, either via command-line data management workflows, written in Perl, with useful functionalities that simplify the complexity of storing and manipulating the massive amounts of data generated from RNAseq analysis, or through the web interface. The database application is currently modeled to handle analysis data from agricultural species, and will be expanded to include more species groups. Overall, TransAtlasDB aims to serve as an accessible repository for the large, complex results files derived from RNAseq gene expression profiling and variant analysis. Database URL: https://modupeore.github.io/TransAtlasDB/ PMID:29688361
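
    The hybrid pattern can be miniaturized as follows: relational tables hold sample metadata while bulky per-sample results live in a document store. In this sketch, sqlite3 stands in for the relational side and JSON files for the NoSQL side; the real system's schema and its MongoDB usage differ.

```python
# Toy sketch of a hybrid relational/document storage pattern. Table and
# field names are invented; this is not the TransAtlasDB schema.
import json
import sqlite3
from pathlib import Path

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (id TEXT PRIMARY KEY, tissue TEXT, doc TEXT)")

def store(sample_id, tissue, expression, doc_dir=Path(".")):
    doc_path = doc_dir / f"{sample_id}.json"
    doc_path.write_text(json.dumps(expression))           # document-store side
    conn.execute("INSERT INTO samples VALUES (?, ?, ?)",  # relational side
                 (sample_id, tissue, str(doc_path)))

store("S1", "liver", {"GAPDH": 1523.2, "ACTB": 2210.9})

# Query metadata relationally, then fetch the linked expression document.
tissue, doc = conn.execute(
    "SELECT tissue, doc FROM samples WHERE id = 'S1'").fetchone()
print(tissue, json.loads(Path(doc).read_text()))
```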

  5. The effects of snowpack grain size on satellite passive microwave observations from the Upper Colorado River Basin

    USGS Publications Warehouse

    Josberger, E.G.; Gloersen, P.; Chang, A.; Rango, A.

    1996-01-01

    Understanding the passive microwave emissions of a snowpack, as observed by satellite sensors, requires knowledge of the snowpack properties: water equivalent, grain size, density, and stratigraphy. For the snowpack in the Upper Colorado River Basin, measurements of snow depth and water equivalent are routinely available from the U.S. Department of Agriculture, but extremely limited information is available for the other properties. To provide this information, a field program from 1984 to 1995 obtained profiles of snowpack grain size, density, and temperature near the time of maximum snow accumulation, at sites distributed across the basin. A synoptic basin-wide sampling program in 1985 showed that the snowpack exhibits consistent properties across large regions. Typically, the snowpack in the Wyoming region contains large amounts of depth hoar, with grain sizes up to 5 mm, while the snowpack in Colorado and Utah is dominated by rounded snow grains less than 2 mm in diameter. In the Wyoming region, large depth hoar crystals in shallow snowpacks yield the lowest emissivities or coldest brightness temperatures observed across the entire basin. Yearly differences in the average grain sizes result primarily from variations in the relative amount of depth hoar within the snowpack. The average grain size for the Colorado and Utah regions shows much less variation than do the grain sizes from the Wyoming region. Furthermore, the greatest amounts of depth hoar occur in the Wyoming region during 1987 and 1992, years with a strong El Niño-Southern Oscillation, but the Colorado and Utah regions do not show this behavior.

  6. Accumulation of reserve carbohydrate by rumen protozoa and bacteria in competition for glucose.

    PubMed

    Denton, Bethany L; Diese, Leanne E; Firkins, Jeffrey L; Hackmann, Timothy J

    2015-03-01

    The aim of this study was to determine if rumen protozoa could form large amounts of reserve carbohydrate compared to the amounts formed by bacteria when competing for glucose in batch cultures. We separated large protozoa and small bacteria from rumen fluid by filtration and centrifugation, recombined equal protein masses of each group into one mixture, and subsequently harvested (reseparated) these groups at intervals after glucose dosing. This method allowed us to monitor reserve carbohydrate accumulation of protozoa and bacteria individually. When mixtures were dosed with a moderate concentration of glucose (4.62 or 5 mM) (n = 2 each), protozoa accumulated large amounts of reserve carbohydrate; 58.7% (standard error of the mean [SEM], 2.2%) glucose carbon was recovered from protozoal reserve carbohydrate at time of peak reserve carbohydrate concentrations. Only 1.7% (SEM, 2.2%) was recovered in bacterial reserve carbohydrate, which was less than that for protozoa (P < 0.001). When provided a high concentration of glucose (20 mM) (n = 4 each), 24.1% (SEM, 2.2%) of glucose carbon was recovered from protozoal reserve carbohydrate, which was still higher (P = 0.001) than the 5.0% (SEM, 2.2%) glucose carbon recovered from bacterial reserve carbohydrate. Our novel competition experiments directly demonstrate that mixed protozoa can sequester sugar away from bacteria by accumulating reserve carbohydrate, giving protozoa a competitive advantage and stabilizing fermentation in the rumen. Similar experiments could be used to investigate the importance of starch sequestration. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  7. Martian stepped-delta formation by rapid water release.

    PubMed

    Kraal, Erin R; van Dijk, Maurits; Postma, George; Kleinhans, Maarten G

    2008-02-21

    Deltas and alluvial fans preserved on the surface of Mars provide an important record of surface water flow. Understanding how surface water flow could have produced the observed morphology is fundamental to understanding the history of water on Mars. To date, morphological studies have provided only minimum time estimates for the longevity of martian hydrologic events, which range from decades to millions of years. Here we use sand flume studies to show that the distinct morphology of martian stepped (terraced) deltas could only have originated from a single basin-filling event on a timescale of tens of years. Stepped deltas therefore provide a minimum and maximum constraint on the duration and magnitude of some surface flows on Mars. We estimate that the amount of water required to fill the basin and deposit the delta is comparable to the amount of water discharged by large terrestrial rivers, such as the Mississippi. The massive discharge, short timescale, and the associated short canyon lengths favour the hypothesis that stepped fans are terraced delta deposits draped over an alluvial fan and formed by water released suddenly from subsurface storage.

  8. Landspotting: Social gaming to collect vast amounts of data for satellite validation

    NASA Astrophysics Data System (ADS)

    Fritz, S.; Purgathofer, P.; Kayali, F.; Fellner, M.; Wimmer, M.; Sturn, T.; Triebnig, G.; Krause, S.; Schindler, F.; Kollegger, M.; Perger, C.; Dürauer, M.; Haberl, W.; See, L.; McCallum, I.

    2012-04-01

    At present there is no single satellite-derived global land cover product that is accurate enough to provide reliable estimates of forest or cropland area to determine, e.g., how much additional land is available to grow biofuels or to tackle problems of food security. The Landspotting Project aims to improve the quality of this land cover information by vastly increasing the amount of in-situ validation data available for calibration and validation of satellite-derived land cover. The Geo-Wiki (Geo-Wiki.org) system currently allows users to compare three satellite-derived land cover products and validate them using Google Earth. However, there is presently no incentive for anyone to provide these data, so the amount of validation through Geo-Wiki has been limited. Yet recent competitions have proven that incentive-driven campaigns can rapidly create large amounts of input. The Landspotting Project is taking a truly innovative approach through the development of the Landspotting game. The game engages users whilst simultaneously collecting a large amount of in-situ land cover information. The development of the game is informed by the current raft of successful social gaming that is available on the internet and as mobile applications, many of which are geo-spatial in nature. Games that are integrated within a social networking site such as Facebook illustrate the power to reach and continually engage a large number of individuals. The number of active Facebook users is estimated to be greater than 400 million, of whom 100 million access Facebook from mobile devices. The Landspotting game has game mechanics similar to those of the famous strategy game "Civilization" (i.e. build, harvest, research, war, diplomacy, etc.). When a player wishes to make a settlement, they must first classify the land cover over the area they wish to settle. As the game is played on the Earth's surface with Google Maps, we are able to record and store this land cover/land use classification geographically. Every player can play the game for free (i.e. a massive multiplayer online game). Furthermore, it is a social game on Facebook (e.g. invite friends, send friends messages, purchase gifts, help friends, post messages onto the wall, etc.). The game is played in a web browser, so it runs anywhere Flash is supported, without requiring the user to install anything additional. At the same time, the Geo-Wiki system will be modified to use the acquired in-situ validation information to create new outputs: a hybrid land cover map, which takes the best information from each individual product to create a single integrated version; a database of validation points that will be freely available to the land cover user community; and a facility that allows users to create a specific targeted validation area, which will then be provided to the crowdsourcing community for validation. These outputs will turn Geo-Wiki into a valuable system for earth system scientists.

  9. Biomedical imaging and sensing using flatbed scanners.

    PubMed

    Göröcs, Zoltán; Ozcan, Aydogan

    2014-09-07

    In this Review, we provide an overview of flatbed scanner based biomedical imaging and sensing techniques. The extremely large imaging field-of-view (e.g., ~600-700 cm(2)) of these devices coupled with their cost-effectiveness provide unique opportunities for digital imaging of samples that are too large for regular optical microscopes, and for collection of large amounts of statistical data in various automated imaging or sensing tasks. Here we give a short introduction to the basic features of flatbed scanners also highlighting the key parameters for designing scientific experiments using these devices, followed by a discussion of some of the significant examples, where scanner-based systems were constructed to conduct various biomedical imaging and/or sensing experiments. Along with mobile phones and other emerging consumer electronics devices, flatbed scanners and their use in advanced imaging and sensing experiments might help us transform current practices of medicine, engineering and sciences through democratization of measurement science and empowerment of citizen scientists, science educators and researchers in resource limited settings.

  10. Biomedical Imaging and Sensing using Flatbed Scanners

    PubMed Central

    Göröcs, Zoltán; Ozcan, Aydogan

    2014-01-01

    In this Review, we provide an overview of flatbed scanner based biomedical imaging and sensing techniques. The extremely large imaging field-of-view (e.g., ~600–700 cm2) of these devices coupled with their cost-effectiveness provide unique opportunities for digital imaging of samples that are too large for regular optical microscopes, and for collection of large amounts of statistical data in various automated imaging or sensing tasks. Here we give a short introduction to the basic features of flatbed scanners also highlighting the key parameters for designing scientific experiments using these devices, followed by a discussion of some of the significant examples, where scanner-based systems were constructed to conduct various biomedical imaging and/or sensing experiments. Along with mobile phones and other emerging consumer electronics devices, flatbed scanners and their use in advanced imaging and sensing experiments might help us transform current practices of medicine, engineering and sciences through democratization of measurement science and empowerment of citizen scientists, science educators and researchers in resource limited settings. PMID:24965011

  11. Consumption with Large Sip Sizes Increases Food Intake and Leads to Underestimation of the Amount Consumed

    PubMed Central

    Bolhuis, Dieuwerke P.; Lakemond, Catriona M. M.; de Wijk, Rene A.; Luning, Pieternel A.; de Graaf, Cees

    2013-01-01

    Background A number of studies have shown that bite and sip sizes influence the amount of food intake. Consuming with small sips instead of large sips means relatively more sips for the same amount of food to be consumed; people may believe that intake is higher which leads to faster satiation. This effect may be disturbed when people are distracted. Objective The objective of the study is to assess the effects of sip size in a focused state and a distracted state on ad libitum intake and on the estimated amount consumed. Design In this 3×2 cross-over design, 53 healthy subjects consumed ad libitum soup with small sips (5 g, 60 g/min), large sips (15 g, 60 g/min), and free sips (where sip size was determined by subjects themselves), in both a distracted and focused state. Sips were administered via a pump. There were no visual cues toward consumption. Subjects then estimated how much they had consumed by filling soup in soup bowls. Results Intake in the small-sip condition was ∼30% lower than in both the large-sip and free-sip conditions (P<0.001). In addition, subjects underestimated how much they had consumed in the large-sip and free-sip conditions (P<0.03). Distraction led to a general increase in food intake (P = 0.003), independent of sip size. Distraction did not influence sip size or estimations. Conclusions Consumption with large sips led to higher food intake, as expected. Large sips, that were either fixed or chosen by subjects themselves led to underestimations of the amount consumed. This may be a risk factor for over-consumption. Reducing sip or bite sizes may successfully lower food intake, even in a distracted state. PMID:23372657

  12. A single active trehalose-6-P synthase (TPS) and a family of putative regulatory TPS-like proteins in Arabidopsis.

    PubMed

    Vandesteene, Lies; Ramon, Matthew; Le Roy, Katrien; Van Dijck, Patrick; Rolland, Filip

    2010-03-01

    Higher plants typically do not produce trehalose in large amounts, but their genome sequences reveal large families of putative trehalose metabolism enzymes. An important regulatory role in plant growth and development is also emerging for the metabolic intermediate trehalose-6-P (T6P). Here, we present an update on Arabidopsis trehalose metabolism and a resource for further detailed analyses. In addition, we provide evidence that Arabidopsis encodes a single trehalose-6-P synthase (TPS) next to a family of catalytically inactive TPS-like proteins that might fulfill specific regulatory functions in actively growing tissues.

  13. SPARK GAP SWITCH

    DOEpatents

    Neal, R.B.

    1957-12-17

    An improved triggered spark gap switch is described, capable of precisely controllable firing time while switching very large amounts of power. The invention in general comprises three electrodes adjustably spaced and adapted to have a large potential impressed between the outer electrodes. The central electrode includes two separate elements electrically connected together and spaced apart to define a pair of spark gaps between the end electrodes. Means are provided to cause the gas flow in the switch to pass towards the central electrode, through a passage in each separate element, and out an exit disposed between the two separate central electrode elements in order to withdraw ions from the spark gap.

  14. LinkWinds: An Approach to Visual Data Analysis

    NASA Technical Reports Server (NTRS)

    Jacobson, Allan S.

    1992-01-01

    The Linked Windows Interactive Data System (LinkWinds) is a prototype visual data exploration and analysis system resulting from a NASA/JPL program of research into graphical methods for rapidly accessing, displaying and analyzing large multivariate multidisciplinary datasets. It is an integrated multi-application execution environment allowing the dynamic interconnection of multiple windows containing visual displays and/or controls through a data-linking paradigm. This paradigm, which results in a system much like a graphical spreadsheet, is not only a powerful method for organizing large amounts of data for analysis, but provides a highly intuitive, easy to learn user interface on top of the traditional graphical user interface.
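    The data-linking paradigm described above amounts to displays and controls subscribing to shared data objects, so that one change propagates to every connected window. The Python sketch below shows one plausible minimal mechanic; the class names and behavior are illustrative assumptions, not the actual LinkWinds API.

        # Minimal observer-style sketch of linked windows: all windows connected
        # to a DataLink are refreshed whenever new data is published to the link.
        class DataLink:
            def __init__(self):
                self._subscribers = []

            def connect(self, window):
                self._subscribers.append(window)

            def publish(self, data):
                for window in self._subscribers:
                    window.refresh(data)

        class Window:
            def __init__(self, name):
                self.name = name

            def refresh(self, data):
                print(f"{self.name} redrawn with {len(data)} samples")

        link = DataLink()
        for name in ("histogram", "scatter plot", "color map"):
            link.connect(Window(name))
        link.publish(list(range(1000)))  # one update redraws all three linked views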

  15. Model for fluorescence quenching in light harvesting complex II in different aggregation states.

    PubMed

    Andreeva, Atanaska; Abarova, Silvia; Stoitchkova, Katerina; Busheva, Mira

    2009-02-01

    Low-temperature (77 K) steady-state fluorescence emission spectroscopy and dynamic light scattering were applied to the main chlorophyll a/b protein light harvesting complex of photosystem II (LHC II) in different aggregation states to elucidate the mechanism of fluorescence quenching within LHC II oligomers. Evidence is presented that LHC II oligomers are heterogeneous and consist of large and small particles with different fluorescence yields. At intermediate detergent concentrations the mean size of the small particles is similar to that of trimers, while the size of large particles is comparable to that of aggregated trimers without added detergent. It is suggested that in small particles and trimers the emitter is monomeric chlorophyll, whereas in large aggregates there is also another emitter, which is a poorly fluorescing chlorophyll associate. A model, describing populations of antenna chlorophyll molecules in small and large aggregates in their ground and first singlet excited states, is considered. The model enables us to obtain the ratio of the singlet excited-state lifetimes in small and large particles, the relative amount of chlorophyll molecules in large particles, and the amount of quenchers as a function of the degree of aggregation. These dependencies reveal that the quenching of the chl a fluorescence upon aggregation is due to the formation of large aggregates and the increase in the amount of chlorophyll molecules forming these aggregates. As a consequence, the amount of quenchers, located in large aggregates, is increased, and their singlet excited-state lifetimes steeply decrease.
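    The two-emitter picture implies that the observed fluorescence is a population-weighted sum over the small- and large-aggregate pools. One schematic way to write this (an assumed form for illustration, not the paper's exact model) is

        F \propto n_{s}\,\frac{\tau_{s}}{\tau_{0}} + n_{L}\,\frac{\tau_{L}}{\tau_{0}},

    where n_s and n_L are the numbers of chlorophyll molecules in small and large particles, tau_s and tau_L their singlet excited-state lifetimes, and tau_0 the radiative lifetime; aggregation-induced quenching then appears as growth of n_L accompanied by a steep decrease of tau_L.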

  16. Deep learning-based fine-grained car make/model classification for visual surveillance

    NASA Astrophysics Data System (ADS)

    Gundogdu, Erhan; Parıldı, Enes Sinan; Solmaz, Berkan; Yücesoy, Veysel; Koç, Aykut

    2017-10-01

    Fine-grained object recognition is a challenging computer vision problem that has recently been addressed by utilizing deep Convolutional Neural Networks (CNNs). Nevertheless, the main disadvantage of classification methods relying on deep CNN models is the need for a considerably large amount of data. In addition, relatively little annotated data exists for real-world applications such as the recognition of car models in a traffic surveillance system. To this end, we concentrate on the classification of fine-grained car makes and/or models for visual surveillance with the help of two different domains. First, a large-scale dataset of approximately 900K images is constructed from a website that catalogs fine-grained car models, and a state-of-the-art CNN model is trained on this dataset according to the provided labels. The second domain is a set of over 260K images collected from a camera integrated into a traffic surveillance system, gathered by a license plate detection method running on top of a motion detection algorithm. An appropriately sized image region is cropped around each detected license plate location. These images and their labels, covering more than 30 classes, are employed to fine-tune the CNN model already trained on the large-scale dataset described above. To fine-tune the network, the last two fully-connected layers are randomly initialized and the remaining layers are fine-tuned on the second dataset. In this way, the transfer of a model learned on a large dataset to a smaller one is performed by utilizing both the limited annotated data of the traffic field and a large-scale dataset with available annotations. Our experimental results on both the validation dataset and the real field show that the proposed methodology performs favorably against training the CNN model from scratch.
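    The fine-tuning recipe in this abstract (randomly re-initialize the last two fully-connected layers, fine-tune the rest) can be sketched as follows. This is a hedged illustration, not the authors' code: VGG-16 with ImageNet weights stands in for the unspecified source-domain CNN, and the class count is taken from the "more than 30 classes" mentioned above.

        import torch
        import torch.nn as nn
        from torchvision import models

        num_classes = 30  # assumed surveillance-domain class count

        # Stand-in for the CNN trained on the large web-collected car dataset.
        net = models.vgg16(weights="IMAGENET1K_V1")

        # Randomly re-initialize the last two fully-connected layers.
        net.classifier[3] = nn.Linear(4096, 4096)
        net.classifier[6] = nn.Linear(4096, num_classes)

        # Fine-tune all layers on the smaller target-domain dataset; earlier
        # layers start from their source-domain weights rather than from scratch.
        optimizer = torch.optim.SGD(net.parameters(), lr=1e-3, momentum=0.9)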

  17. Design Enhancements to Facilitate a Sustainable and Energy Efficient Dining Facility (DFAC) in a Contingency Environment

    DTIC Science & Technology

    2014-09-01

    resources, and generate large amounts of food and solid waste daily. Almost all Contingency Basecamp (CB) DFACs provide individual paper and plastic ware...which is costly in terms of purchase, transportation, and disposal. This work analyzed the effects of replacing paper and plastic ware with...reusable materials, and of adding industrial dishwashers to reduce the logistical burden of using paper and plastic ware. Additional enhancements

  18. Charting the landscape of supercritical string theory.

    PubMed

    Hellerman, Simeon; Swanson, Ian

    2007-10-26

    Special solutions of string theory in supercritical dimensions can interpolate in time between theories with different numbers of spacetime dimensions and different amounts of world sheet supersymmetry. These solutions connect supercritical string theories to the more familiar string duality web in ten dimensions and provide a precise link between supersymmetric and purely bosonic string theories. Dimension quenching and c duality appear to be natural concepts in string theory, giving rise to large networks of interconnected theories.

  19. Latest research progress on food waste management: a comprehensive review

    NASA Astrophysics Data System (ADS)

    Zhu, Shangzhen; Gao, Hetong; Duan, Lunbo

    2018-05-01

    As abundant food supply has become a baseline measure of residents' rising living standards, the amount of food waste generated has grown considerably, and much attention has been drawn to this problem. This work gives an overview of the latest research on anaerobic digestion, composting, generalized management and other developments in food waste management. Different technologies are introduced and evaluated, and views on future research in this field are proposed.

  20. Measurement of charged particle transverse momentum spectra in deep inelastic scattering

    NASA Astrophysics Data System (ADS)

    Adloff, C.; Aid, S.; Anderson, M.; Andreev, V.; Andrieu, B.; Babaev, A.; Bähr, J.; Bán, J.; Ban, Y.; Baranov, P.; Barrelet, E.; Barschke, R.; Bartel, W.; Barth, M.; Bassler, U.; Beck, H. P.; Beck, M.; Behrend, H.-J.; Belousov, A.; Berger, Ch.; Bernardi, G.; Bertrand-Coremans, G.; Besançon, M.; Beyer, R.; Biddulph, P.; Bispham, P.; Bizot, J. C.; Blobel, V.; Borras, K.; Botterweck, F.; Boudry, V.; Braemer, A.; Braunschweig, W.; Brisson, V.; Brückner, W.; Bruel, P.; Bruncko, D.; Brune, C.; Buchholz, R.; Büngener, L.; Bürger, J.; Büsser, F. W.; Buniatian, A.; Burke, S.; Burton, M. J.; Calvet, D.; Campbell, A. T.; Carli, T.; Charlet, M.; Clarke, D.; Clegg, A. B.; Clerbaux, B.; Cocks, S.; Contreras, J. G.; Cormack, C.; Coughlan, J. A.; Courau, A.; Cousinou, M.-C.; Cozzika, G.; Criegee, L.; Cussans, D. G.; Cvach, J.; Dagoret, S.; Dainton, J. B.; Dau, W. D.; Daum, K.; David, M.; Davis, C. L.; Delcourt, B.; De Roeck, A.; De Wolf, E. A.; Dirkmann, M.; Dixon, P.; Di Nezza, P.; Dlugosz, W.; Dollfus, C.; Donovan, K. T.; Dowell, J. D.; Dreis, H. B.; Droutskoi, A.; Dünger, O.; Duhm, H.; Ebert, J.; Ebert, T. R.; Eckerlin, G.; Efremenko, V.; Egli, S.; Eichler, R.; Eisele, F.; Eisenhandler, E.; Elsen, E.; Erdmann, M.; Erdmann, W.; Fahr, A. B.; Favart, L.; Fedotov, A.; Felst, R.; Feltesse, J.; Ferencei, J.; Ferrarotto, F.; Flamm, K.; Fleischer, M.; Flieser, M.; Flügge, G.; Fomenko, A.; Formánek, J.; Foster, J. M.; Franke, G.; Fretwurst, E.; Gabathuler, E.; Gabathuler, K.; Gaede, F.; Garvey, J.; Gayler, J.; Gebauer, M.; Genzel, H.; Gerhards, R.; Glazov, A.; Goerlich, L.; Gogitidze, N.; Goldberg, M.; Goldner, D.; Golec-Biernat, K.; Gonzalez-Pineiro, B.; Gorelov, I.; Grab, C.; Grässler, H.; Greenshaw, T.; Griffiths, R. K.; Grindhammer, G.; Gruber, A.; Gruber, C.; Hadig, T.; Haidt, D.; Hajduk, L.; Haller, T.; Hampel, M.; Haynes, W. J.; Heinemann, B.; Heinzelmann, G.; Henderson, R. C. W.; Henschel, H.; Herynek, I.; Hess, M. F.; Hewitt, K.; Hildesheim, W.; Hiller, K. H.; Hilton, C. D.; Hladký, J.; Höppner, M.; Hoffmann, D.; Holtom, T.; Horisberger, R.; Hudgson, V. L.; Hütte, M.; Ibbotson, M.; Itterbeck, H.; Jacholkowska, A.; Jacobsson, C.; Jaffre, M.; Janoth, J.; Jansen, D. M.; Jansen, T.; Jönson, L.; Johnson, D. P.; Jung, H.; Kalmus, P. I. P.; Kander, M.; Kant, D.; Kaschowitz, R.; Kathage, U.; Katzy, J.; Kaufmann, H. H.; Kaufmann, O.; Kausch, M.; Kazarian, S.; Kenyon, I. R.; Kermiche, S.; Keuker, C.; Kiesling, C.; Klein, M.; Kleinwort, C.; Knies, G.; Köhler, T.; Köhne, J. H.; Kolanoski, H.; Kolya, S. D.; Korbel, V.; Kostka, P.; Kotelnikov, S. K.; Krämerkämper, T.; Krasny, M. W.; Krehbiel, H.; Krücker, D.; Küster, H.; Kuhlen, M.; Kurča, T.; Kurzhöfer, J.; Lacour, D.; Laforge, B.; Landon, M. P. J.; Lange, W.; Langenegger, U.; Lebedev, A.; Lehner, F.; Levonian, S.; Lindström, G.; Lindstroem, M.; Linsel, F.; Lipinski, J.; List, B.; Lobo, G.; Loch, P.; Lomas, J. W.; Lopez, G. C.; Lubimov, V.; Liike, D.; Lytkin, L.; Magnussen, N.; Malinovski, E.; Maraček, R.; Marage, P.; Marks, J.; Marshall, R.; Martens, J.; Martin, G.; Martin, R.; Martyn, H.-U.; Martyniak, J.; Mavroidis, T.; Maxfield, S. J.; McMahon, S. J.; Mehta, A.; Meier, K.; Metlica, F.; Meyer, A.; Meyer, A.; Meyer, H.; Meyer, J.; Meyer, P.-O.; Migliori, A.; Mikocki, S.; Milstead, D.; Moeck, J.; Moreau, F.; Morris, J. V.; Mroczko, E.; Müller, D.; Müller, G.; Müller, K.; Murín, P.; Nagovizin, V.; Nahnhauer, R.; Naroska, B.; Naumann, Th.; Négri, I.; Newman, P. R.; Newton, D.; Nguyen, H. K.; Nicholls, T. 
C.; Niebergall, F.; Niebuhr, C.; Niedzballa, Ch.; Niggli, H.; Nowak, G.; Noyes, G. W.; Nunnemann, T.; Nyberg-Werther, M.; Oakden, M.; Oberlack, H.; Olsson, J. E.; Ozerov, D.; Palmen, P.; Panaro, E.; Panitch, A.; Pascaud, C.; Patel, G. D.; Pawletta, H.; Peppel, E.; Perez, E.; Phillips, J. P.; Pieuchot, A.; Pitzl, D.; Pope, G.; Povh, B.; Prell, S.; Rabbertz, K.; Rädel, G.; Reimer, P.; Reinshagen, S.; Rick, H.; Riepenhausen, F.; Riess, S.; Rizvi, E.; Robmann, P.; Roloff, P. H. E.; Roosen, R.; Rosenbauer, K.; Rostovtsev, A.; Rouse, F.; Royon, C.; Rüter, K.; Rusakov, S.; Rybicki, K.; Sankey, D. P. C.; Schacht, P.; Schiek, S.; Schleif, S.; Schleper, P.; von Schlippe, W.; Schmidt, D.; Schmidt, G.; Schoeffel, L.; Schöning, A.; Schröder, V.; Schuhmann, E.; Schwab, B.; Sefkow, F.; Sell, R.; Semenov, A.; Shekelyan, V.; Sheviakov, I.; Shtarkov, L. N.; Siegmon, G.; Siewert, U.; Sirois, Y.; Skillicorn, I. O.; Smirnov, F.; Solochenko, V.; Soloviev, Y.; Specka, A.; Spiekermann, J.; Spielman, S.; Spitzer, H.; Squinabol, F.; Steffen, F.; Steinberg, F.; Steiner, H.; Steinhart, J.; Stella, B.; Stellberger, A.; Stier, P. J.; Stiewe, J.; Stößlein, U.; Stolze, K.; Straumann, U.; Struczinski, W.; Sutton, J. P.; Tapprogge, S.; Taševský, M.; Tchernyshov, V.; Tchetchelnitski, S.; Theissen, J.; Thiebaux, C.; Thompson, G.; Tobien, N.; Todenhagen, R.; Truöl, P.; Tsipolitis, G.; Turnau, J.; Tutas, J.; Tzamariudaki, E.; Uelkes, P.; Usik, A.; Valkár, S.; Valkárová, A.; Vallée, C.; Vandenplas, D.; Van Esch, P.; Van Mechelen, P.; Vazdik, Y.; Verrecchia, P.; Villet, G.; Wacker, K.; Wagener, A.; Wagener, M.; Waugh, B.; Weber, G.; Weber, M.; Wegener, D.; Wenger, A.; Wengler, T.; Werner, M.; West, L. R.; Wilksen, T.; Willard, S.; Winde, M.; Winter, G.-G.; Wittek, C.; Wobisch, M.; Wünsch, E.; Žáček, J.; Zarbock, D.; Zhang, Z.; Zhokin, A.; Zini, P.; Zomer, F.; Zsembery, J.; Zuber, K.; zur Nedden, M.; H1 Collaboration

    1997-02-01

    Transverse momentum spectra of charged particles produced in deep inelastic scattering are measured as a function of the kinematic variables x and Q^2 using the H1 detector at the ep collider HERA. The data are compared to different parton emission models, either with or without ordering of the emissions in transverse momentum. The data provide evidence for a relatively large amount of parton radiation between the current and the remnant systems.

  1. The future of consumer cameras

    NASA Astrophysics Data System (ADS)

    Battiato, Sebastiano; Moltisanti, Marco

    2015-03-01

    In the last two decades multimedia devices, and in particular imaging devices (camcorders, tablets, mobile phones, etc.), have spread dramatically. Moreover, their increasing computational performance, combined with higher storage capability, allows them to process large amounts of data. In this paper an overview of current trends in the consumer camera market and technology is given, providing also some details about the recent past (from the Digital Still Camera up to today) and forthcoming key issues.

  2. Solid wood timber products consumption in major end uses in the United States, 1950-2009 : a technical document supporting the Forest Service 2010 RPA assessment

    Treesearch

    David B. McKeever; James L. Howard

    2011-01-01

    Solid wood timber products provide important raw materials to the construction, manufacturing, and shipping sectors of the U.S. economy. Nearly all new single-family houses and low-rise multifamily residential structures are wood framed and sheathed. Large amounts of solid wood timber products are also used in the construction of new nonresidential buildings, and in...

  3. The Response of Nitrifying Bacteria to Treatments of N-Serve and Roundup in Continuous-Flow Soil Columns

    DTIC Science & Technology

    1988-07-15

    Science Society of America, Inc. Atlas, R. V. and R. Bartha. 1987. Microbial Ecology: Fundamentals and Applications, 2nd Edition. Benjamin/Cummings...Thompson and Troeh, 1978). However, many nutrient cycling pathways are mediated by only a few genera of bacteria (Atlas and Bartha, 1987). So...mole of ammonium and nitrite oxidized, respectively (Atlas and Bartha, 1987). Therefore, large amounts of substrate must be oxidized to provide

  4. Matrix Determination of Reflectance of Hidden Object via Indirect Photography

    DTIC Science & Technology

    2012-03-01

    the hidden object. This thesis provides an alternative method of processing the camera images by modeling the system as a set of transport and...Distribution Function (BRDF). Figure 1. Indirect photography with camera field of view dictated by point of illumination. 1.3 Research Focus In an...would need to be modeled using radiometric principles. A large amount of the improvement in this process was due to the use of a blind

  5. Balancing Information Analysis and Decision Value: A Model to Exploit the Decision Process

    DTIC Science & Technology

    2011-12-01

    technical intelligence, e.g. signals and sensors (SIGINT and MASINT), imagery (IMINT), as well as human and open source intelligence (HUMINT and OSINT ...Clark 2006). The ability to capture large amounts of data and the plenitude of modern intelligence information sources provides a rich cache of...many techniques for managing information collected and derived from these sources, the exploitation of intelligence assets for decision-making

  6. Machinery of protein folding and unfolding.

    PubMed

    Zhang, Xiaodong; Beuron, Fabienne; Freemont, Paul S

    2002-04-01

    During the past two years, a large amount of biochemical, biophysical and low- to high-resolution structural data have provided mechanistic insights into the machinery of protein folding and unfolding. It has emerged that dual functionality in terms of folding and unfolding might exist for some systems. The majority of folding/unfolding machines adopt oligomeric ring structures in a cooperative fashion and utilise the conformational changes induced by ATP binding/hydrolysis for their specific functions.

  7. Extending the data dictionary for data/knowledge management

    NASA Technical Reports Server (NTRS)

    Hydrick, Cecile L.; Graves, Sara J.

    1988-01-01

    Current relational database technology provides the means for efficiently storing and retrieving large amounts of data. By combining techniques learned from the field of artificial intelligence with this technology, it is possible to expand the capabilities of such systems. This paper suggests using the expanded domain concept, an object-oriented organization, and the storing of knowledge rules within the relational database as a solution to the unique problems associated with CAD/CAM and engineering data.
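    A minimal sketch of the idea of storing knowledge rules as ordinary rows alongside the data dictionary is shown below; the schema and the rule are hypothetical, chosen only to illustrate how a relational store can hold and serve such rules.

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE rules (domain TEXT, condition TEXT, action TEXT)")
        db.execute("INSERT INTO rules VALUES "
                   "('CAD', 'material = steel', 'min_wall_mm >= 2.0')")

        # The application layer retrieves and applies the rules for a given domain.
        for condition, action in db.execute(
                "SELECT condition, action FROM rules WHERE domain = 'CAD'"):
            print(f"IF {condition} THEN enforce {action}")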

  8. Artificial maturation of an immature sulfur- and organic matter-rich limestone from the Ghareb Formation, Jordan

    USGS Publications Warehouse

    Koopmans, M.P.; Rijpstra, W.I.C.; De Leeuw, J. W.; Lewan, M.D.; Damste, J.S.S.

    1998-01-01

    An immature (Ro = 0.39%), S-rich (Sorg/C = 0.07), organic matter-rich (19.6 wt.% TOC) limestone from the Ghareb Formation (Upper Cretaceous) in Jordan was artificially matured by hydrous pyrolysis (200, 220, ..., 300 °C; 72 h) to study the effect of progressive diagenesis and early catagenesis on the amounts and distributions of hydrocarbons, organic sulfur compounds and S-rich geomacromolecules. The use of internal standards allowed the determination of absolute amounts. With increasing thermal maturation, large amounts of alkanes and alkylthiophenes with predominantly linear carbon skeletons are generated from the kerogen. The alkylthiophene isomer distributions do not change significantly with increasing thermal maturation, indicating the applicability of alkylthiophenes as biomarkers at relatively high levels of thermal maturity. For a given carbon skeleton, the saturated hydrocarbon, alkylthiophenes and alkylbenzo[b]thiophenes are stable forms at relatively high temperatures, whereas the alkylsulfides are not stable. The large amount of alkylthiophenes produced relative to the alkanes may be explained by the large number of monosulfide links per carbon skeleton. These results are in good agreement with those obtained previously for an artificial maturation series of an immature S-rich sample from the Gessoso-solfifera Formation.

  9. Flexible horseshoe

    DOEpatents

    Ford, Donald F.

    1985-01-01

    A screw-on horseshoe formed from a plastic material is disclosed. A flex joint is provided that allows the horseshoe to expand and contract as pressure is applied to the horse's hoof, thereby reducing friction between the hoof and the shoe. The horseshoe also provides a lip portion projecting upwardly from a horseshoe base portion to protect the horse hoof wall from obstacles encountered during the movement of the horse. A novel screw having a double helix thread pattern including a high thread pattern and a low thread pattern is used to fasten the horseshoe to the horse's hoof without piercing the hoof wall. The screw includes a keyed recessed self-holding head that is complementary to, and therefore readily driven by, a power drill. The result is a lightweight yet wear-resistant horseshoe that can be attached to a horse's hoof with a minimum of labor and a minimum of damage to the hoof, and that can be constructed in many styles and sizes to match a large variety of horse uses.

  10. Mandatory Provider Review And Pain Clinic Laws Reduce The Amounts Of Opioids Prescribed And Overdose Death Rates.

    PubMed

    Dowell, Deborah; Zhang, Kun; Noonan, Rita K; Hockenberry, Jason M

    2016-10-01

    To address the opioid overdose epidemic in the United States, states have implemented policies to reduce inappropriate opioid prescribing. These policies could affect the coincident heroin overdose epidemic by either driving the substitution of heroin for opioids or reducing simultaneous use of both substances. We used IMS Health's National Prescription Audit and government mortality data to examine the effect of these policies on opioid prescribing and on prescription opioid and heroin overdose death rates in the United States during 2006-13. The analysis revealed that combined implementation of mandated provider review of state-run prescription drug monitoring program data and pain clinic laws reduced opioid amounts prescribed by 8 percent and prescription opioid overdose death rates by 12 percent. We also observed relatively large but statistically insignificant reductions in heroin overdose death rates after implementation of these policies. This combination of policies was effective, but broader approaches to address these coincident epidemics are needed. Project HOPE—The People-to-People Health Foundation, Inc.

  11. Lime kiln dust as a potential raw material in portland cement manufacturing

    USGS Publications Warehouse

    Miller, M. Michael; Callaghan, Robert M.

    2004-01-01

    In the United States, the manufacture of portland cement involves burning in a rotary kiln a finely ground proportional mix of raw materials. The raw material mix provides the required chemical combination of calcium, silicon, aluminum, iron, and small amounts of other ingredients. The majority of calcium is supplied in the form of calcium carbonate usually from limestone. Other sources including waste materials or byproducts from other industries can be used to supply calcium (or lime, CaO), provided they have sufficiently high CaO content, have low magnesia content (less than 5 percent), and are competitive with limestone in terms of cost and adequacy of supply. In the United States, the lime industry produces large amounts of lime kiln dust (LKD), which is collected by dust control systems. This LKD may be a supplemental source of calcium for cement plants, if the lime and cement plants are located near enough to each other to make the arrangement economical.

  12. Corner-cutting mining assembly

    DOEpatents

    Bradley, J.A.

    1981-07-01

    This invention resulted from a contract with the United States Department of Energy and relates to a mining tool. More particularly, the invention relates to an assembly capable of drilling a hole having a square cross-sectional shape with radiused corners. In mining operations in which conventional auger-type drills are used to form a series of parallel, cylindrical holes in a coal seam, a large amount of coal remains in place in the seam because the shape of the holes leaves thick webs between the holes. A higher percentage of coal can be mined from a seam by a means capable of drilling holes having a substantially square cross section. It is an object of this invention to provide an improved mining apparatus by means of which the amount of coal recovered from a seam deposit can be increased. Another object of the invention is to provide a drilling assembly which cuts corners in a hole having a circular cross section. These objects and other advantages are attained by a preferred embodiment of the invention.

  13. Big Data Challenges in Climate Science: Improving the Next-Generation Cyberinfrastructure

    NASA Technical Reports Server (NTRS)

    Schnase, John L.; Lee, Tsengdar J.; Mattmann, Chris A.; Lynnes, Christopher S.; Cinquini, Luca; Ramirez, Paul M.; Hart, Andre F.; Williams, Dean N.; Waliser, Duane; Rinsland, Pamela

    2016-01-01

    The knowledge we gain from research in climate science depends on the generation, dissemination, and analysis of high-quality data. This work comprises technical practice as well as social practice, both of which are distinguished by their massive scale and global reach. As a result, the amount of data involved in climate research is growing at an unprecedented rate. Climate model intercomparison (CMIP) experiments, the integration of observational data and climate reanalysis data with climate model outputs, as seen in the Obs4MIPs, Ana4MIPs, and CREATE-IP activities, and the collaborative work of the Intergovernmental Panel on Climate Change (IPCC) provide examples of the types of activities that increasingly require an improved cyberinfrastructure for dealing with large amounts of critical scientific data. This paper provides an overview of some of climate science's big data problems and the technical solutions being developed to advance data publication, climate analytics as a service, and interoperability within the Earth System Grid Federation (ESGF), the primary cyberinfrastructure currently supporting global climate research activities.

  14. Radiographic contrast media conservation systems.

    PubMed

    1996-11-01

    During procedures (such as cardiac catheterizations) that use contrast media, or dyes, large amounts of unused dye can become contaminated with the patient's blood and must be thrown out. Radiographic contrast media conservation systems (RCMCSs) are administration sets designed to reduce the waste of these expensive dyes by (1) isolating the bags or bottles supplying the dyes from contamination, allowing the remaining contents to be used again on another patient, and (2) minimizing the amount of dye left in the administration set (and therefore discarded) at the end of the procedure. In this Evaluation, we examined five burette RCMCSs from three suppliers. We tested the systems for their ability to protect the source containers from contamination, for their performance and design features, and for their ease of use. We also provide information on transfer line assembly RCMCSs, which perform some of the same functions but without using a burette, in a supplementary article within this Evaluation. In addition, a brief Purchasing Guide provides recommendations on deciding whether to purchase RCMCSs and how to choose among systems.

  15. Genetic and phenological variation of tocochromanol (vitamin E) content in wild (Daucus carota L. var. carota) and domesticated carrot (D. carota L. var. sativa)

    PubMed Central

    Luby, Claire H; Maeda, Hiroshi A; Goldman, Irwin L

    2014-01-01

    Carrot roots (Daucus carota L. var. sativa) produce tocochromanol compounds, collectively known as vitamin E. However, little is known about their types and amounts. Here we determined the range and variation in types and amounts of tocochromanols in a variety of cultivated carrot accessions throughout carrot postharvest storage and reproductive stages and in wild-type roots (Daucus carota L. var. carota). Of eight possible tocochromanol compounds, we detected and quantified α-, and the combined peak for β- and γ- forms of tocopherols and tocotrienols. Significant variation in amounts of tocochromanol compounds was observed across accessions and over time. Large increases in α-tocopherol were noted during both reproductive growth and the postharvest stages. The variation of tocochromanols in carrot root tissue provides useful information for future research seeking to understand the role of these compounds in carrot root tissue or to breed varieties with increased levels of these compounds. PMID:26504534

  16. Effects of water soaking and/or sodium polystyrene sulfonate addition on potassium content of foods.

    PubMed

    Picq, Christian; Asplanato, Marion; Bernillon, Noémie; Fabre, Claudie; Roubeix, Mathilde; Ricort, Jean-Marc

    2014-09-01

    In this study, we determined, by atomic absorption spectrophotometry, the potassium amount leached by soaking or boiling foods identified by children suffering from chronic renal failure as "pleasure food" and that they cannot eat because of their low-potassium diet, and evaluated whether addition of sodium polystyrene sulfonate resin (i.e. Kayexalate®) during soaking or boiling modulated potassium loss. A significant amount of potassium content was removed by soaking (16% for chocolate and potato, 26% for apple, 37% for tomato and 41% for banana) or boiling in a large amount of water (73% for potato). Although Kayexalate® efficiently dose-dependently removed potassium from drinks (by 48% to 73%), resin addition during soaking or boiling did not eliminate more potassium from solid foods. Our results therefore provide useful information for dietitians who elaborate menus for people on potassium-restricted diets and would give an interesting alternative to the systematic elimination of all potassium-rich foods from their diet.

  17. Nonparametric Density Estimation Based on Self-Organizing Incremental Neural Network for Large Noisy Data.

    PubMed

    Nakamura, Yoshihiro; Hasegawa, Osamu

    2017-01-01

    With the ongoing development and expansion of communication networks and sensors, massive amounts of data are continuously generated in real time from real environments. Predicting the distribution underlying such data in advance is difficult; furthermore, the data include substantial amounts of noise. These factors make it difficult to estimate probability densities. To handle these issues and massive amounts of data, we propose a nonparametric density estimator that rapidly learns data online and has high robustness. Our approach is an extension of both kernel density estimation (KDE) and a self-organizing incremental neural network (SOINN); therefore, we call our approach KDESOINN. An SOINN provides a clustering method that represents the given data as a network of prototypes; more specifically, an SOINN can learn the distribution underlying the given data. Using this information, KDESOINN estimates the probability density function. The results of our experiments show that KDESOINN outperforms or achieves performance comparable to the current state-of-the-art approaches in terms of robustness, learning time, and accuracy.
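    The core idea (place kernels on learned prototype nodes rather than on every raw sample, weighting each by the number of samples it summarizes) can be sketched in one dimension as follows. The fixed bandwidth here is a simplifying assumption; the published method derives local bandwidths from the network structure.

        import numpy as np

        def kde_over_prototypes(x, prototypes, counts, h=0.5):
            """Estimate p(x) from prototype locations and their sample counts."""
            weights = counts / counts.sum()
            kernels = np.exp(-0.5 * ((x - prototypes) / h) ** 2) / (h * np.sqrt(2 * np.pi))
            return float(np.sum(weights * kernels))

        protos = np.array([0.0, 1.2, 4.8])  # prototype positions learned online
        counts = np.array([120, 300, 80])   # samples absorbed by each prototype
        print(kde_over_prototypes(1.0, protos, counts))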

  18. Foam rheology at large deformation

    NASA Astrophysics Data System (ADS)

    Géminard, J.-C.; Pastenes, J. C.; Melo, F.

    2018-04-01

    Large deformations are prone to cause irreversible changes in materials structure, generally leading to either material hardening or softening. Aqueous foam is a metastable disordered structure of densely packed gas bubbles. We report on the mechanical response of a foam layer subjected to quasistatic periodic shear at large amplitude. We observe that, upon increasing shear, the shear stress follows a universal curve that is nearly exponential and tends to an asymptotic stress value interpreted as the critical yield stress at which the foam structure is completely remodeled. Relevant trends of the foam mechanical response to cycling are mathematically reproduced through a simple law accounting for the amount of plastic deformation upon increasing stress. This view provides a natural interpretation to stress hardening in foams, demonstrating that plastic effects are present in this material even for minute deformation.
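    One functional form consistent with the behavior described (a nearly exponential approach to an asymptotic stress) is, as an assumed illustration rather than the paper's fitted law,

        \sigma(\gamma) = \sigma_{y}\left(1 - e^{-\gamma/\gamma_{c}}\right),

    where sigma_y is the asymptotic critical yield stress at which the foam structure is completely remodeled and gamma_c a characteristic strain for plastic rearrangement.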

  19. Visual attention mitigates information loss in small- and large-scale neural codes

    PubMed Central

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing – or selective attention – is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502
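    Reconstructing a stimulus representation from a large-scale activity pattern is commonly done by inverting a linear encoding model. The sketch below shows that generic approach with synthetic numbers; it is an assumption about the general technique, not the authors' exact pipeline.

        import numpy as np

        rng = np.random.default_rng(1)
        n_units, n_channels = 50, 8
        W = rng.random((n_units, n_channels))  # channel weights fit on training data
        b = rng.random(n_units)                # measured population activity pattern

        # Least-squares inversion: estimated feature-channel responses for b.
        chan_resp, *_ = np.linalg.lstsq(W, b, rcond=None)
        print(chan_resp.round(2))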

  20. Modification of the fault logic circuit of a high-energy linear accelerator to accommodate selectively coded, large-field wedges.

    PubMed

    Miller, R W; van de Geijn, J

    1987-01-01

    A modification to the fault logic circuit that controls the collimator (COLL) fault is described. This modification permits the use of large-field wedges by adding an additional input into the reference voltage that determines the fault condition. The resistor controlling the amount of additional voltage is carried on board each wedge, within the wedge plug. This allows each wedge to determine its own, individual field size limit. Additionally, if no coding resistor is provided, the factory-supplied reference voltage is used, which sets the maximum allowable field size to 15 cm. This permits the use of factory-supplied wedges in conjunction with selected, large-field wedges, allowing proper sensing of the field size maximum in all conditions.

  1. Errors in Measuring Water Potentials of Small Samples Resulting from Water Adsorption by Thermocouple Psychrometer Chambers 1

    PubMed Central

    Bennett, Jerry M.; Cortes, Peter M.

    1985-01-01

    The adsorption of water by thermocouple psychrometer assemblies is known to cause errors in the determination of water potential. Experiments were conducted to evaluate the effect of sample size and psychrometer chamber volume on measured water potentials of leaf discs, leaf segments, and sodium chloride solutions. Reasonable agreement was found between soybean (Glycine max L. Merr.) leaf water potentials measured on 5-millimeter radius leaf discs and large leaf segments. Results indicated that while errors due to adsorption may be significant when using small volumes of tissue, if sufficient tissue is used the errors are negligible. Because of the relationship between water potential and volume in plant tissue, the errors due to adsorption were larger with turgid tissue. Large psychrometers which were sealed into the sample chamber with latex tubing appeared to adsorb more water than those sealed with flexible plastic tubing. Estimates are provided of the amounts of water adsorbed by two different psychrometer assemblies and the amount of tissue sufficient for accurate measurements of leaf water potential with these assemblies. It is also demonstrated that water adsorption problems may have generated low water potential values which in prior studies have been attributed to large cut surface area to volume ratios. PMID:16664367

  2. Errors in measuring water potentials of small samples resulting from water adsorption by thermocouple psychrometer chambers.

    PubMed

    Bennett, J M; Cortes, P M

    1985-09-01

    The adsorption of water by thermocouple psychrometer assemblies is known to cause errors in the determination of water potential. Experiments were conducted to evaluate the effect of sample size and psychrometer chamber volume on measured water potentials of leaf discs, leaf segments, and sodium chloride solutions. Reasonable agreement was found between soybean (Glycine max L. Merr.) leaf water potentials measured on 5-millimeter radius leaf discs and large leaf segments. Results indicated that while errors due to adsorption may be significant when using small volumes of tissue, if sufficient tissue is used the errors are negligible. Because of the relationship between water potential and volume in plant tissue, the errors due to adsorption were larger with turgid tissue. Large psychrometers which were sealed into the sample chamber with latex tubing appeared to adsorb more water than those sealed with flexible plastic tubing. Estimates are provided of the amounts of water adsorbed by two different psychrometer assemblies and the amount of tissue sufficient for accurate measurements of leaf water potential with these assemblies. It is also demonstrated that water adsorption problems may have generated low water potential values which in prior studies have been attributed to large cut surface area to volume ratios.

  3. Application of human induced pluripotent stem cells to model fibrodysplasia ossificans progressiva.

    PubMed

    Barruet, Emilie; Hsiao, Edward C

    2018-04-01

    Fibrodysplasia ossificans progressiva (FOP) is a genetic condition characterized by massive heterotopic ossification. FOP patients have mutations in the Activin A type I receptor (ACVR1), a bone morphogenetic protein (BMP) receptor. FOP is a progressive and debilitating disease characterized by bone formation flares that often occur after trauma. Since it is often difficult or impossible to obtain large amounts of tissue from human donors due to the risks of inciting more heterotopic bone formation, human induced pluripotent stem cells (hiPSCs) provide an attractive source for establishing in vitro disease models and for applications in drug screening. hiPSCs have the ability to self-renew, allowing researchers to obtain large amounts of starting material. hiPSCs also have the potential to differentiate into any cell type in the body. In this review, we discuss how the application of hiPSC technology to studying FOP has changed our perspectives on FOP disease pathogenesis. We also consider ongoing challenges and emerging opportunities for the use of human iPSCs in drug discovery and regenerative medicine. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Program Analyzes Radar Altimeter Data

    NASA Technical Reports Server (NTRS)

    Vandemark, Doug; Hancock, David; Tran, Ngan

    2004-01-01

    A computer program has been written to perform several analyses of radar altimeter data. The program was designed to improve on previous methods of analysis of altimeter engineering data by (1) facilitating and accelerating the analysis of large amounts of data in a more direct manner and (2) improving the ability to estimate performance of radar-altimeter instrumentation and provide data corrections. The data in question are openly available to the international scientific community and can be downloaded from anonymous file-transfer-protocol (FTP) locations that are accessible via links from altimetry Web sites. The software estimates noise in range measurements, estimates corrections for electromagnetic bias, and performs statistical analyses on various parameters for comparison of different altimeters. Whereas prior techniques used to perform similar analyses of altimeter range noise require comparison of data from repetitions of satellite ground tracks, the present software uses a high-pass filtering technique to obtain similar results from single satellite passes. Elimination of the requirement for repeat-track analysis facilitates the analysis of large amounts of satellite data to assess subtle variations in range noise.
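    The single-pass noise estimate works by high-pass filtering the along-track range measurements so that the slowly varying geophysical signal drops out, leaving a residual whose spread reflects instrument noise. The sketch below illustrates that idea on synthetic data; the filter order and cutoff are illustrative choices, not the program's actual settings.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def range_noise(ranges, fs=1.0, cutoff=0.1):
            # High-pass filter removes the slow signal; the residual is noise.
            b, a = butter(4, cutoff, btype="highpass", fs=fs)
            residual = filtfilt(b, a, ranges)
            return residual.std()

        rng = np.random.default_rng(0)
        # Synthetic track: a slowly wandering surface plus white instrument noise.
        track = np.cumsum(rng.normal(0, 0.01, 2000)) + rng.normal(0, 0.03, 2000)
        print(f"estimated range noise: {range_noise(track):.3f} m")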

  5. STEPP--Search Tool for Exploration of Petri net Paths: a new tool for Petri net-based path analysis in biochemical networks.

    PubMed

    Koch, Ina; Schueler, Markus; Heiner, Monika

    2005-01-01

    To understand biochemical processes caused by, e. g., mutations or deletions in the genome, the knowledge of possible alternative paths between two arbitrary chemical compounds is of increasing interest for biotechnology, pharmacology, medicine, and drug design. With the steadily increasing amount of data from high-throughput experiments new biochemical networks can be constructed and existing ones can be extended, which results in many large metabolic, signal transduction, and gene regulatory networks. The search for alternative paths within these complex and large networks can provide a huge amount of solutions, which can not be handled manually. Moreover, not all of the alternative paths are generally of interest. Therefore, we have developed and implemented a method, which allows us to define constraints to reduce the set of all structurally possible paths to the truly interesting path set. The paper describes the search algorithm and the constraints definition language. We give examples for path searches using this dedicated special language for a Petri net model of the sucrose-to-starch breakdown in the potato tuber.
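    The kind of constrained path search the tool performs can be illustrated with a toy breadth-first enumeration over a reaction graph. The network below is a loose caricature of the sucrose-to-starch example, and the simple exclusion constraint stands in for the tool's much richer constraint definition language.

        from collections import deque

        network = {"sucrose": ["glucose", "fructose"],
                   "glucose": ["G6P"], "fructose": ["F6P"],
                   "G6P": ["G1P", "F6P"], "G1P": ["starch"], "F6P": ["G6P"]}

        def paths(src, dst, forbidden=frozenset()):
            queue = deque([[src]])
            while queue:
                path = queue.popleft()
                if path[-1] == dst:
                    yield path
                    continue
                for nxt in network.get(path[-1], []):
                    # Skip revisits (cycles) and any node the constraint excludes.
                    if nxt not in path and nxt not in forbidden:
                        queue.append(path + [nxt])

        # Constraint: report only paths that avoid the intermediate F6P.
        print(list(paths("sucrose", "starch", forbidden={"F6P"})))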

  6. STEPP - Search Tool for Exploration of Petri net Paths: A New Tool for Petri Net-Based Path Analysis in Biochemical Networks.

    PubMed

    Koch, Ina; Schüler, Markus; Heiner, Monika

    2011-01-01

    To understand biochemical processes caused by, e.g., mutations or deletions in the genome, the knowledge of possible alternative paths between two arbitrary chemical compounds is of increasing interest for biotechnology, pharmacology, medicine, and drug design. With the steadily increasing amount of data from high-throughput experiments new biochemical networks can be constructed and existing ones can be extended, which results in many large metabolic, signal transduction, and gene regulatory networks. The search for alternative paths within these complex and large networks can provide a huge amount of solutions, which can not be handled manually. Moreover, not all of the alternative paths are generally of interest. Therefore, we have developed and implemented a method, which allows us to define constraints to reduce the set of all structurally possible paths to the truly interesting path set. The paper describes the search algorithm and the constraints definition language. We give examples for path searches using this dedicated special language for a Petri net model of the sucrose-to-starch breakdown in the potato tuber. http://sanaga.tfh-berlin.de/~stepp/

  7. Advantages of Parallel Processing and the Effects of Communications Time

    NASA Technical Reports Server (NTRS)

    Eddy, Wesley M.; Allman, Mark

    2000-01-01

    Many computing tasks involve heavy mathematical calculations, or analyzing large amounts of data. These operations can take a long time to complete using only one computer. Networks such as the Internet provide many computers with the ability to communicate with each other. Parallel or distributed computing takes advantage of these networked computers by arranging them to work together on a problem, thereby reducing the time needed to obtain the solution. The drawback to using a network of computers to solve a problem is the time wasted in communicating between the various hosts. The application of distributed computing techniques to a space environment or to use over a satellite network would therefore be limited by the amount of time needed to send data across the network, which would typically take much longer than on a terrestrial network. This experiment shows how much faster a large job can be performed by adding more computers to the task, what role communications time plays in the total execution time, and the impact a long-delay network has on a distributed computing system.
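    A first-order model of the trade-off this experiment measures (a textbook approximation, not the paper's fitted result): with N networked hosts,

        T(N) \approx \frac{T_{\mathrm{comp}}}{N} + T_{\mathrm{comm}}(N),
        \qquad S(N) = \frac{T(1)}{T(N)},

    so the speedup S(N) saturates once the communication term, which grows with the number of hosts and with link latency, dominates the shrinking per-host compute term. On a long-delay satellite network T_comm is large from the outset, sharply capping the useful number of hosts.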

  8. The global technical potential of bio-energy in 2050 considering sustainability constraints

    PubMed Central

    Haberl, Helmut; Beringer, Tim; Bhattacharya, Sribas C; Erb, Karl-Heinz; Hoogwijk, Monique

    2010-01-01

    Bio-energy, that is, energy produced from organic non-fossil material of biological origin, is promoted as a substitute for non-renewable (e.g., fossil) energy to reduce greenhouse gas (GHG) emissions and dependency on energy imports. At present, global bio-energy use amounts to approximately 50 EJ/yr, about 10% of humanity's primary energy supply. We here review recent literature on the amount of bio-energy that could be supplied globally in 2050, given current expectations on technology, food demand and environmental targets ('technical potential'). Recent studies span a large range of global bio-energy potentials from ≈30 to over 1000 EJ/yr. In our opinion, the high end of the range is implausible because of (1) overestimation of the area available for bio-energy crops due to insufficient consideration of constraints (e.g., area for food, feed or nature conservation) and (2) too high yield expectations resulting from extrapolation of plot-based studies to large, less productive areas. According to this review, the global technical primary bio-energy potential in 2050 is in the range of 160-270 EJ/yr if sustainability criteria are considered. The potential of bio-energy crops is at the lower end of previously published ranges, while residues from food production and forestry could provide significant amounts of energy based on an integrated optimization ('cascade utilization') of biomass flows. PMID:24069093

  9. Dynamics and manipulation of entanglement in coupled harmonic systems with many degrees of freedom

    NASA Astrophysics Data System (ADS)

    Plenio, M. B.; Hartley, J.; Eisert, J.

    2004-03-01

    We study the entanglement dynamics of a system consisting of a large number of coupled harmonic oscillators in various configurations and for different types of nearest-neighbour interactions. For a one-dimensional chain, we provide compact analytical solutions and approximations to the dynamical evolution of the entanglement between spatially separated oscillators. Key properties such as the speed of entanglement propagation, the maximum amount of transferred entanglement and the efficiency for the entanglement transfer are computed. For harmonic oscillators coupled by springs, corresponding to a phonon model, we observe a non-monotonic transfer efficiency in the initially prepared amount of entanglement, i.e. an intermediate amount of initial entanglement is transferred with the highest efficiency. In contrast, within the framework of the rotating-wave approximation (as appropriate, e.g. in quantum optical settings) one finds a monotonic behaviour. We also study geometrical configurations that are analogous to quantum optical devices (such as beamsplitters and interferometers) and observe characteristic differences when initially thermal or squeezed states are entering these devices. We show that these devices may be switched on and off by changing the properties of an individual oscillator. They may therefore be used as building blocks of large fixed and pre-fabricated but programmable structures in which quantum information is manipulated through propagation. We discuss briefly possible experimental realizations of systems of interacting harmonic oscillators in which these effects may be confirmed experimentally.
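
    For orientation, the spring-coupled ("phonon") chain studied here is the standard harmonic Hamiltonian; a generic sketch of its form is below, with uniform mass m, frequency ω, and coupling κ (symbols chosen here for illustration, not taken from the paper):

        H = \sum_{i=1}^{N} \left( \frac{p_i^2}{2m} + \frac{m \omega^2}{2} x_i^2 \right)
            + \frac{\kappa}{2} \sum_{i=1}^{N-1} \left( x_i - x_{i+1} \right)^2

    Under the rotating-wave approximation, the coupling terms that do not conserve excitation number are dropped; that modelling difference is what separates the monotonic from the non-monotonic transfer efficiency contrasted in the abstract.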

  10. Proxy system modeling of tree-ring isotope chronologies over the Common Era

    NASA Astrophysics Data System (ADS)

    Anchukaitis, K. J.; LeGrande, A. N.

    2017-12-01

    The Asian monsoon can be characterized in terms of both precipitation variability and atmospheric circulation across a range of spatial and temporal scales. While multicentury time series of tree-ring widths at hundreds of sites across Asia provide estimates of past rainfall, the oxygen isotope ratios of annual rings may reveal broader regional hydroclimate and atmosphere-ocean dynamics. Tree-ring oxygen isotope chronologies from Monsoon Asia have been interpreted to reflect a local 'amount effect', relative humidity, source water and seasonality, and winter snowfall. Here, we use an isotope-enabled general circulation model simulation from the NASA Goddard Institute for Space Studies (GISS) Model E and a proxy system model of the oxygen isotope composition of tree-ring cellulose to interpret the large-scale and local climate controls on δ18O chronologies. Broad-scale dominant signals are associated with a suite of covarying hydroclimate variables including growing season rainfall amounts, relative humidity, and vapor pressure deficit. Temperature and source water influences are region-dependent, as are the simulated tree-ring isotope signals associated with the El Niño Southern Oscillation (ENSO) and large-scale indices of the Asian monsoon circulation. At some locations, including southern coastal Viet Nam, local precipitation isotope ratios and the resulting simulated δ18O tree-ring chronologies reflect upstream rainfall amounts and atmospheric circulation associated with monsoon strength and wind anomalies.

  11. Objectives and metrics for wildlife monitoring

    USGS Publications Warehouse

    Sauer, J.R.; Knutson, M.G.

    2008-01-01

    Monitoring surveys allow managers to document system status and provide the quantitative basis for management decision-making, and large amounts of effort and funding are devoted to monitoring. Still, monitoring surveys often fall short of providing required information; inadequacies exist in survey designs, analysis procedures, or in the ability to integrate the information into an appropriate evaluation of management actions. We describe current uses of monitoring data, provide our perspective on the value and limitations of current approaches to monitoring, and set the stage for 3 papers that discuss current goals and implementation of monitoring programs. These papers were derived from presentations at a symposium at The Wildlife Society's 13th Annual Conference in Anchorage, Alaska, USA, in 2006.

  12. SLIDE - a web-based tool for interactive visualization of large-scale -omics data.

    PubMed

    Ghosh, Soumita; Datta, Abhik; Tan, Kaisen; Choi, Hyungwon

    2018-06-28

    Data visualization is often regarded as a post hoc step for verifying statistically significant results in the analysis of high-throughput data sets. This common practice leaves a large amount of raw data behind, from which more information can be extracted. However, existing solutions do not provide capabilities to explore large-scale raw datasets using biologically sensible queries, nor do they allow real-time customization of graphics based on user interaction. To address these drawbacks, we have designed an open-source, web-based tool called Systems-Level Interactive Data Exploration, or SLIDE, to visualize large-scale -omics data interactively. SLIDE's interface makes it easier for scientists to explore quantitative expression data in multiple resolutions in a single screen. SLIDE is publicly available under the BSD license both as an online version and as a stand-alone version at https://github.com/soumitag/SLIDE. Supplementary information is available at Bioinformatics online.

  13. A MODIFIED METHOD OF OBTAINING LARGE AMOUNTS OF RICKETTSIA PROWAZEKI BY ROENTGEN IRRADIATION OF RATS

    PubMed Central

    Macchiavello, Atilio; Dresser, Richard

    1935-01-01

    The radiation method described by Zinsser and Castaneda for obtaining large amounts of Rickettsia has been carried out successfully with an ordinary radiographic machine. This allows the extension of the method to those communities which do not possess a high voltage Roentgen therapy unit as originally employed. PMID:19870416

  14. Profiling of lipid and glycogen accumulations under different growth conditions in the sulfothermophilic red alga Galdieria sulphuraria.

    PubMed

    Sakurai, Toshihiro; Aoki, Motohide; Ju, Xiaohui; Ueda, Tatsuya; Nakamura, Yasunori; Fujiwara, Shoko; Umemura, Tomonari; Tsuzuki, Mikio; Minoda, Ayumi

    2016-01-01

    The unicellular red alga Galdieria sulphuraria grows efficiently and produces a large amount of biomass in acidic conditions at high temperatures. It has great potential to produce biofuels and other beneficial compounds without becoming contaminated with other organisms. In G. sulphuraria, biomass measurements and glycogen and lipid analyses demonstrated that the amounts and compositions of glycogen and lipids differed when cells were grown under autotrophic, mixotrophic, and heterotrophic conditions. Maximum biomass production was obtained in the mixotrophic culture. High amounts of glycogen were obtained in the mixotrophic cultures, while the amounts of neutral lipids were similar between mixotrophic and heterotrophic cultures. The amounts of neutral lipids were among the highest reported for red algae, including thermophiles. Glycogen structure and fatty acid compositions largely depended on the growth conditions. Copyright © 2015. Published by Elsevier Ltd.

  15. Economical ground data delivery

    NASA Technical Reports Server (NTRS)

    Markley, Richard W.; Byrne, Russell H.; Bromberg, Daniel E.

    1994-01-01

    Data delivery in the Deep Space Network (DSN) involves transmission of a small amount of constant, high-priority traffic and a large amount of bursty, low priority data. The bursty traffic may be initially buffered and then metered back slowly as bandwidth becomes available. Today both types of data are transmitted over dedicated leased circuits. The authors investigated the potential of saving money by designing a hybrid communications architecture that uses leased circuits for high-priority network communications and dial-up circuits for low-priority traffic. Such an architecture may significantly reduce costs and provide an emergency backup. The architecture presented here may also be applied to any ground station-to-customer network within the range of a common carrier. The authors compare estimated costs for various scenarios and suggest security safeguards that should be considered.
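
    The buffer-and-meter behaviour described for the bursty low-priority stream is essentially a rate limiter in front of a queue. Below is a minimal sketch assuming a token-bucket meter; the class, rates, and packet sizes are all hypothetical, not taken from the DSN design.

        import collections

        # Sketch: absorb a burst of low-priority data, then meter it out at a
        # fixed rate as bandwidth permits. All rates/sizes are hypothetical.
        class MeteredBuffer:
            def __init__(self, rate_bps, bucket_bits):
                self.rate, self.capacity = rate_bps, bucket_bits
                self.tokens, self.queue = bucket_bits, collections.deque()

            def enqueue(self, packet_bits):
                self.queue.append(packet_bits)       # burst is absorbed here

            def tick(self, dt):
                """Advance time by dt seconds; return bits released downstream."""
                self.tokens = min(self.capacity, self.tokens + self.rate * dt)
                sent = 0
                while self.queue and self.queue[0] <= self.tokens:
                    bits = self.queue.popleft()
                    self.tokens -= bits
                    sent += bits
                return sent

        buf = MeteredBuffer(rate_bps=64_000, bucket_bits=64_000)
        for _ in range(100):                         # a 100-packet burst arrives
            buf.enqueue(12_000)
        print(sum(buf.tick(1.0) for _ in range(30)), "bits metered out in 30 s")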

  16. A Study of Time Dependent Response of Ceramic Materials

    NASA Technical Reports Server (NTRS)

    Hemann, John

    1997-01-01

    The research accomplishments under this grant were very extensive in the areas of the development of computer software for the design of ceramic materials. Rather than try to summarize all this research I have enclosed research papers and reports which were completed with the funding provided by the grant. These papers and reports are listed below. Additionally a large amount of technology transfer occurred in this project and a significant number of national awards were received.

  17. Guinea: Background and Relations with the United States

    DTIC Science & Technology

    2010-07-19

    salary arrears of $1,100 to each soldier, sack the defense minister, and grant promotions to junior officers, ending the uprising. In mid-June 2008 ... printing large amounts of new currency in 2009), and the freezing of some foreign aid. Guinea's external debt burden—$3.1 billion in ... period. IDA also provides grants to countries at risk of debt distress. The HIPC Initiative is a comprehensive approach to debt reduction for

  18. Semantically Enhanced Recommender Systems

    NASA Astrophysics Data System (ADS)

    Ruiz-Montiel, Manuela; Aldana-Montes, José F.

    Recommender Systems have become a significant area in the context of web personalization, given the large amount of available data. Ontologies can be widely taken advantage of in recommender systems, since they provide a means of classifying and discovering new information about the items to recommend, about user profiles, and even about their context. We have developed a semantically enhanced recommender system based on such ontologies. In this paper we present a description of the proposed system.

  19. Test Review: Woodcock, R. W., Schrank, F. A., Mather, N., & McGrew, K. S. (2007). "Woodcock-Johnson III Tests of Achievement, Form C/Brief Battery." Rolling Meadows, IL: Riverside

    ERIC Educational Resources Information Center

    Grenwelge, Cheryl H.

    2009-01-01

    The Woodcock Johnson III Brief Assessment is a "maximum performance test" (Reynolds, Livingston, & Willson, 2006) that is designed to assess the upper levels of knowledge and skills of the test taker, using both power and speed to obtain a large amount of information in a short period of time. The Brief Assessment also provides an adequate…

  20. Gaining Momentum: How Media Influences Public Opinion To Push Civil-Military Decision Makers Into Formulating Foreign Policy

    DTIC Science & Technology

    2016-02-09

    GAINING MOMENTUM: HOW MEDIA INFLUENCES PUBLIC OPINION TO PUSH CIVIL-MILITARY DECISION MAKERS INTO ... engagements from the past, evidence suggests the media or press does have an influence over public opinion, especially during times of war and humanitarian ... changes and that leaders must take into consideration that public opinion and the media may provide a large amount of influence over how the nation

  1. Engineering Design Handbook. Development Guide for Reliability. Part Two. Design for Reliability

    DTIC Science & Technology

    1976-01-01

    Component failure rates, however, have been recorded by many sources as a function of use and environment. Some of these sources are listed in Refs. 13-17 ... other systems capable of creating an explosive reaction. The second category is fairly obvious and includes many variations on methods for providing ... about them. 4. Ability to detect signals (including patterns) in high noise environments. 5. Ability to store large amounts of information for long

  2. A Large-scale Benchmark Dataset for Event Recognition in Surveillance Video

    DTIC Science & Technology

    2011-06-01

    orders of magnitude larger than existing datasets such as CAVIAR [7]. TRECVID 2008 airport dataset [16] contains 100 hours of video, but it provides only ... entire human figure (e.g., above shoulder), amounting to 500% human-to-video ... Some statistics are approximate, obtained from the CAVIAR 1st scene and ... and diversity in both collection sites and viewpoints. In comparison to surveillance datasets such as CAVIAR [7] and TRECVID [16] shown in Fig. 3

  3. Characterizing variable biogeochemical changes during the treatment of produced oilfield waste.

    PubMed

    Hildenbrand, Zacariah L; Santos, Inês C; Liden, Tiffany; Carlton, Doug D; Varona-Torres, Emmanuel; Martin, Misty S; Reyes, Michelle L; Mulla, Safwan R; Schug, Kevin A

    2018-09-01

    At the forefront of the discussions about climate change and energy independence has been the process of hydraulic fracturing, which utilizes large amounts of water, proppants, and chemical additives to stimulate sequestered hydrocarbons from impermeable subsurface strata. This process also produces large amounts of heterogeneous flowback and formation waters, the subsurface disposal of which has most recently been linked to the induction of anthropogenic earthquakes. As such, the management of these waste streams has provided a newfound impetus to explore recycling alternatives to reduce the reliance on subsurface disposal and fresh water resources. However, the biogeochemical characteristics of produced oilfield waste render its recycling and reutilization for production well stimulation a substantial challenge. Here we present a comprehensive analysis of produced waste from the Eagle Ford shale region before, during, and after treatment through adjustable separation, flocculation, and disinfection technologies. The collection of bulk measurements revealed significant reductions in suspended and dissolved constituents that could otherwise preclude untreated produced water from being utilized for production well stimulation. Additionally, a significant step-wise reduction in pertinent scaling and well-fouling elements was observed, in conjunction with notable fluctuations in the microbiomes of highly variable produced waters. Collectively, these data provide insight into the efficacies of available water treatment modalities within the shale energy sector, which is currently challenged with improving the environmental stewardship of produced water management. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. The battle for hearts and minds: who is communicating most effectively with the cosmetic marketplace?

    PubMed

    Camp, Matthew C; Wong, Wendy W; Mussman, Jason L; Gupta, Subhas C

    2010-01-01

    Cosmetic surgery, historically the purview of plastic surgeons, has in recent years seen an influx of practitioners from other fields of training. Many of these new providers are savvy in marketing and public relations and are beginning to control a surprisingly large amount of cosmetic patient care. The purpose of this study is to measure the amount of traffic being attracted to the Web sites of individual practitioners and organizations vying for cosmetic patients. This study investigates the trends of the past 12 months and identifies changes of special concern to plastic surgeons. The Web sites of 1307 cosmetic providers were monitored over a year's time. The Web activity of two million individuals whose computers were loaded with a self-reporting software package was recorded and analyzed. The Web sites were analyzed according to the specialty training of the site owner and total unique visits per month were tallied for the most prominent specialties. The dominant Web sites were closely scrutinized and the Web optimization strategies of each were also examined. There is a tremendous amount of Web activity surrounding cosmetic procedures and the amount of traffic on the most popular sites is continuing to grow. Also, a large sum of money is being expended to channel Web traffic, with sums in the thousands of dollars being spent daily by top Web sites. Overall in the past year, the private Web sites of plastic surgeons have increased their reach by 10%, growing from 200,000 to approximately 220,000 unique visitors monthly. Plastic surgery remains the specialty with the largest number of Web visitors per month. However, when combined, the private Web sites of all other providers of aesthetic services have significantly outpaced plastic surgery's growth. The traffic going to non-plastic surgeons has grown by 50% (200,000 visitors per month in September 2008 to 300,000 visitors monthly in September 2009). For providers of aesthetic services, communication with the public is of utmost importance. The Web has become the single most important information resource for consumers because of easy access. Plastic surgeons are facing significant competition for the attention of potential patients, with increasingly sophisticated Web sites and listing services being set up by independent parties. It is important for plastic surgeons to become familiar with the available Internet tools for communication with potential patients and to aggressively utilize these tools for effective practice building.

  5. GeoNotebook: Browser based Interactive analysis and visualization workflow for very large climate and geospatial datasets

    NASA Astrophysics Data System (ADS)

    Ozturk, D.; Chaudhary, A.; Votava, P.; Kotfila, C.

    2016-12-01

    Jointly developed by Kitware and NASA Ames, GeoNotebook is an open source tool designed to give the maximum amount of flexibility to analysts, while dramatically simplifying the process of exploring geospatially indexed datasets. Packages like Fiona (backed by GDAL), Shapely, Descartes, Geopandas, and PySAL provide a stack of technologies for reading, transforming, and analyzing geospatial data. Combined with the Jupyter notebook and libraries like matplotlib/Basemap, it is possible to generate detailed geospatial visualizations. Unfortunately, the visualizations generated are either static or do not perform well for very large datasets, and this setup requires a great deal of boilerplate code to create and maintain. Other extensions exist to remedy these problems, but they provide a separate map for each input cell and do not support map interactions that feed back into the Python environment. To support interactive data exploration and visualization on large datasets, we have developed an extension to the Jupyter notebook that provides a single dynamic map that can be managed from the Python environment, and that can communicate back with a server which can perform operations like data subsetting on a cloud-based cluster.
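
    As a point of reference, the boilerplate being replaced looks like the following: read a layer with GeoPandas, subset it spatially, and render a static map. This is a minimal sketch; the file name, bounding box, and attribute column are hypothetical.

        import geopandas as gpd
        import matplotlib.pyplot as plt

        # Read a geospatially indexed layer (GeoPandas wraps Fiona/GDAL and
        # Shapely under the hood).
        gdf = gpd.read_file("regions.shp")         # hypothetical file

        # Spatial subset via the .cx coordinate indexer: keep features that
        # intersect a lon/lat window (hypothetical bounds).
        subset = gdf.cx[-105:-95, 35:45]

        # Static rendering; this is the step an interactive notebook map replaces.
        ax = subset.plot(column="mean_temp",       # hypothetical attribute
                         legend=True, figsize=(8, 6))
        ax.set_title("Static subset plot")
        plt.savefig("subset.png")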

  6. Mouthwash overdose

    MedlinePlus

    ... are: chlorhexidine gluconate, ethanol (ethyl alcohol), hydrogen peroxide, and methyl salicylate ... amounts of alcohol (drunkenness). Swallowing large amounts of methyl salicylate and hydrogen peroxide may also cause serious stomach ...

  7. Dawn of Advanced Molecular Medicine: Nanotechnological Advancements in Cancer Imaging and Therapy

    PubMed Central

    Kaittanis, Charalambos; Shaffer, Travis M.; Thorek, Daniel L. J.; Grimm, Jan

    2014-01-01

    Nanotechnology plays an increasingly important role not only in our everyday life (with all its benefits and dangers) but also in medicine. Nanoparticles are to date the most intriguing option to deliver high concentrations of agents specifically and directly to cancer cells; therefore, a wide variety of these nanomaterials has been developed and explored. These span the range from simple nanoagents to sophisticated smart devices for drug delivery or imaging. Nanomaterials usually provide a large surface area, allowing for decoration with a large amount of moieties on the surface for either additional functionalities or targeting. Besides using particles solely for imaging purposes, they can also carry as a payload a therapeutic agent. If both are combined within the same particle, a theranostic agent is created. The sophistication of highly developed nanotechnology targeting approaches provides a promising means for many clinical implementations and can provide improved applications for otherwise suboptimal formulations. In this review we will explore nanotechnology both for imaging and therapy to provide a general overview of the field and its impact on cancer imaging and therapy. PMID:25271430

  8. Lessons learnt on the analysis of large sequence data in animal genomics.

    PubMed

    Biscarini, F; Cozzi, P; Orozco-Ter Wengel, P

    2018-04-06

    The 'omics revolution has made a large amount of sequence data available to researchers and the industry. This has had a profound impact in the field of bioinformatics, stimulating unprecedented advancements in this discipline. Mostly, this is usually looked at from the perspective of human 'omics, in particular human genomics. Plant and animal genomics, however, have also been deeply influenced by next-generation sequencing technologies, with several genomics applications now popular among researchers and the breeding industry. Genomics tends to generate huge amounts of data, and genomic sequence data account for an increasing proportion of big data in biological sciences, due largely to decreasing sequencing and genotyping costs and to large-scale sequencing and resequencing projects. The analysis of big data poses a challenge to scientists, as data gathering currently takes place at a faster pace than does data processing and analysis, and the associated computational burden is increasingly taxing, making even simple manipulation, visualization, and transfer of data cumbersome operations. The time consumed by the processing and analysing of huge data sets may be at the expense of data quality assessment and critical interpretation. Additionally, when analysing lots of data, something is likely to go awry (the software may crash or stop), and it can be very frustrating to track the error. We herein review the most relevant issues related to tackling these challenges and problems, from the perspective of animal genomics, and provide researchers that lack extensive computing experience with guidelines that will help when processing large genomic data sets. © 2018 Stichting International Foundation for Animal Genetics.
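
    One guideline-level point, processing big files as streams instead of loading them whole, can be shown in a short sketch that tallies variants per chromosome from a compressed VCF while holding only one line in memory at a time. The file name is hypothetical.

        import gzip
        from collections import Counter

        # Stream a gzip-compressed VCF line by line: memory use stays constant
        # no matter how large the file is.
        counts = Counter()
        with gzip.open("cohort.vcf.gz", "rt") as vcf:   # hypothetical file
            for line in vcf:
                if line.startswith("#"):                # skip header lines
                    continue
                counts[line.split("\t", 1)[0]] += 1     # first column = CHROM

        for chrom, n in counts.most_common():
            print(f"{chrom}\t{n}")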

  9. Advances in the Biology and Chemistry of Sialic Acids

    PubMed Central

    Chen, Xi; Varki, Ajit

    2010-01-01

    Sialic acids are a subset of nonulosonic acids, which are nine-carbon alpha-keto aldonic acids. Naturally existing sialic acid-containing structures are presented in different sialic acid forms, various sialyl linkages, and on diverse underlying glycans. They play important roles in biological, pathological, and immunological processes. Sialobiology has been a challenging and yet attractive research area. Recent advances in chemical and chemoenzymatic synthesis as well as large-scale E. coli cell-based production have provided a large library of sialoside standards and derivatives in amounts sufficient for structure-activity relationship studies. Sialoglycan microarrays provide an efficient platform for quick identification of preferred ligands for sialic acid-binding proteins. Future research on sialic acid will continue to be at the interface of chemistry and biology. Research efforts will not only lead to a better understanding of the biological and pathological importance of sialic acids and their diversity, but could also lead to the development of therapeutics. PMID:20020717

  10. Potential release of fibers from burning carbon composites. [aircraft fires

    NASA Technical Reports Server (NTRS)

    Bell, V. L.

    1980-01-01

    A comprehensive experimental carbon fiber source program was conducted to determine the potential for the release of conductive carbon fibers from burning composites. Laboratory testing determined the relative importance of several parameters influencing the amounts of single fibers released, while large-scale aviation jet fuel pool fires provided realistic confirmation of the laboratory data. The dimensions and size distributions of fire-released carbon fibers were determined, not only for those of concern in an electrical sense, but also for those of potential interest from a health and environmental standpoint. Fire plume and chemistry studies were performed with large pool fires to provide an experimental input into an analytical modelling of simulated aircraft crash fires. A study of a high voltage spark system resulted in a promising device for the detection, counting, and sizing of electrically conductive fibers, for both active and passive modes of operation.

  11. Collapse of axion stars

    DOE PAGES

    Eby, Joshua; Leembruggen, Madelyn; Suranyi, Peter; ...

    2016-12-15

    Axion stars, gravitationally bound states of low-energy axion particles, have a maximum mass allowed by gravitational stability. Weakly bound states obtaining this maximum mass have sufficiently large radii such that they are dilute, and as a result, they are well described by a leading-order expansion of the axion potential. Here, heavier states are susceptible to gravitational collapse. Inclusion of higher-order interactions, present in the full potential, can give qualitatively different results in the analysis of collapsing heavy states, as compared to the leading-order expansion. In this work, we find that collapsing axion stars are stabilized by repulsive interactions present in the full potential, providing evidence that such objects do not form black holes. In the last moments of collapse, the binding energy of the axion star grows rapidly, and we provide evidence that a large amount of its energy is lost through rapid emission of relativistic axions.
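
    The distinction the authors draw can be seen directly in the textbook axion potential: the quartic term of its expansion is attractive, while the next order is repulsive, and it is the higher-order terms that can halt the collapse. Sketching the generic form (f is the axion decay constant; this is the standard cosine potential, not necessarily the exact one used in the paper):

        V(\phi) = m^2 f^2 \left[ 1 - \cos\left( \frac{\phi}{f} \right) \right]
                = \frac{m^2}{2} \phi^2 - \frac{m^2}{24 f^2} \phi^4
                  + \frac{m^2}{720 f^4} \phi^6 - \cdots

    A leading-order (dilute) treatment keeps only the attractive φ⁴ self-interaction, which is why it predicts unchecked collapse; the repulsive φ⁶ and higher terms of the full cosine are what can stabilize the collapsing star.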

  12. Lithium wall conditioning by high frequency pellet injection in RFX-mod

    NASA Astrophysics Data System (ADS)

    Innocente, P.; Mansfield, D. K.; Roquemore, A. L.; Agostini, M.; Barison, S.; Canton, A.; Carraro, L.; Cavazzana, R.; De Masi, G.; Fassina, A.; Fiameni, S.; Grando, L.; Rais, B.; Rossetto, F.; Scarin, P.

    2015-08-01

    In the RFX-mod reversed field pinch experiment, lithium wall conditioning has been tested with multiple aims: to improve density control, to reduce impurities, and to increase energy and particle confinement time. Large single lithium pellet injection, a lithium capillary-pore system, and lithium evaporation have been used for lithiumization. The last two methods, which presently provide the best results in tokamak devices, have limited applicability in the RFX-mod device due to the magnetic field characteristics and geometrical constraints. On the other hand, the first technique did not allow injection of large amounts of lithium. To improve the deposition, small lithium multi-pellet injection has recently been tested in RFX-mod. In this paper we compare lithium multi-pellet injection to the other techniques. Multi-pellet injection gave more uniform Li deposition than the evaporator, but provided similar effects on plasma parameters, showing that further optimization is required.

  13. The use of electrochemistry for the synthesis of 17 alpha-hydroxyprogesterone by a fusion protein containing P450c17.

    PubMed

    Estabrook, R W; Shet, M S; Faulkner, K; Fisher, C W

    1996-11-01

    A method has been developed for the commercial application of the unique oxygen chemistry catalyzed by various cytochrome P450s. This is illustrated here for the synthesis of hydroxylated steroids. This method requires the preparation of large amounts of enzymatically functional P450 proteins that can serve as catalysts and a technique for providing electrons at an economically acceptable cost. To generate large amounts of enzymatically active recombinant P450s we have engineered the cDNAs for various P450s, including bovine adrenal P450c17, by linking them to a modified cDNA for rat NADPH-P450 reductase and placing them in the plasmid pCWori+. Transformation of E. coli results in the high level expression of an enzymatically active protein that can be easily purified by affinity chromatography. Incubation of the purified enzyme with steroid in a reaction vessel containing a platinum electrode and a Ag/AgCl electrode couple poised at -650 mV, together with the electromotively active redox mediator, cobalt sepulchrate, results in the 17 alpha-hydroxylation of progesterone at rates as high as 25 nmoles of progesterone hydroxylated/min/nmole of P450. Thus, high concentrations of hydroxylated steroids can be produced with incubation conditions of hours duration without the use of costly NADPH. Similar experiments have been carried out for the generation of the 6 beta-hydroxylation product of testosterone (using a fusion protein containing human P450 3A4). It is apparent that this method is applicable to many other P450 catalyzed reactions for the synthesis of large amounts of hydroxylated steroid metabolites. The electrochemical system is also applicable to drug discovery studies for the characterization of drug metabolites.

  14. Rapid Classification of Ordinary Chondrites Using Raman Spectroscopy

    NASA Technical Reports Server (NTRS)

    Fries, M.; Welzenbach, L.

    2014-01-01

    Classification of ordinary chondrites is typically done through measurements of the composition of olivine and pyroxenes. Historically, this measurement has usually been performed via electron microprobe, oil immersion, or other methods, which can be costly through lost sample material during thin section preparation. Raman microscopy can perform the same measurements but considerably faster and with much less sample preparation, allowing for faster classification. Raman spectroscopy can facilitate more rapid classification of large numbers of chondrites, such as those retrieved from North Africa and potentially Antarctica, those present in large collections, or those submitted to a curation facility by the public. With development, this approach may provide a completely automated classification method for all chondrite types.

  15. Axisymmetric force-free magnetosphere in the exterior of a neutron star - II. Maximum storage and open field energies

    NASA Astrophysics Data System (ADS)

    Kojima, Yasufumi; Okamoto, Satoki

    2018-04-01

    A magnetar's magnetosphere gradually evolves by the injection of energy and helicity from the interior. Axisymmetric static solutions for a relativistic force-free magnetosphere with a power-law current model are numerically obtained. They provide information about the configurations in which the stored energy is large. The energy along a sequence of equilibria increases and becomes sufficient to open the magnetic field. A magnetic flux rope, in which a large amount of toroidal field is confined, is formed in the vicinity of the star, for states exceeding the open field energy. These states are energetically metastable, and the excess energy may be ejected as a magnetar outburst.

  16. SUNY beamline facilities at the National Synchrotron Light Source (Final Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coppens, Philip

    2003-06-22

    The DOE-sponsored SUNY synchrotron project has involved close cooperation among faculty at several SUNY campuses. A large number of students and postdoctoral associates have participated in its operation, which was centered at the X3 beamline of the National Synchrotron Light Source at Brookhaven National Laboratory. Four stations with capabilities for Small Angle Scattering; Single Crystal, Powder, and Surface Diffraction; and EXAFS were designed and operated, with the capability to perform experiments at very low as well as elevated temperatures and under high vacuum. A large amount of cutting-edge science was performed at the facility, which in addition provided excellent training for students and postdoctoral scientists in the field.

  17. Gas Production Within Stromatolites Across the Archean: Evidence For Ancient Microbial Metabolisms

    NASA Astrophysics Data System (ADS)

    Wilmeth, D.; Corsetti, F. A.; Berelson, W.; Beukes, N. J.; Awramik, S. M.; Petryshyn, V. A.

    2017-12-01

    Identifying the presence of specific microbial metabolisms in the Archean is a fundamental goal of deep-time geobiology. Certain fenestral textures within Archean stromatolites provide evidence for the presence of gas, and therefore gas-releasing metabolisms, within ancient microbial mats. Paleoenvironmental analysis indicates many of the stromatolites formed in shallow, agitated aqueous environments, with relatively rapid gas production and lithification of fenestrae. Proposed gases include oxygen, carbon dioxide, methane, hydrogen sulfide, and various nitrogen species, produced by appropriate metabolisms. This study charts the presence of gas-related fenestrae in Archean stromatolites over time, and examines the potential for various metabolisms to produce fenestral textures. Fenestral textures are present in Archean stromatolites on at least four separate cratons from 3.5 to 2.5 Ga. Fenestrae are preserved in carbonate and chert microbialites of various morphologies, including laminar, domal, and conical forms. Extensive fenestral textures, with dozens of fenestrae along individual laminae, are especially prevalent in Neoarchean stromatolites (2.8-2.5 Ga). The volume of gas within Archean microbial mats was estimated by measuring fenestrae in ancient stromatolites and bubbles within modern mats. The time needed for metabolisms to produce appropriate gas volumes was calculated using modern rates obtained from the literature. Given the paleoenvironmental conditions, the longer a metabolism takes to make large amounts of gas, the less likely large bubbles will remain long enough to become preserved. Additionally, limiting reactants were estimated for each metabolism using previous Archean geochemical models. Metabolisms with limited reactants are less likely to produce large amounts of gas. Oxygenic photosynthesis can produce large amounts of gas within minutes, and the necessary reactants (carbon dioxide and water) were readily available in Archean environments. In the absence of clear sedimentary or geochemical evidence for abundant hydrogen or oxidized sulfur and nitrogen species during stromatolite morphogenesis, oxygenic photosynthesis is the metabolism with the highest potential for producing fenestrae before the Great Oxidation Event.
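
    The preservation argument reduces to simple arithmetic: the fill time of a fenestra is its gas volume divided by the mat's gas production rate, and slower metabolisms imply longer-lived, less preservable bubbles. A toy version of that calculation is below; all volumes and rates are hypothetical placeholders, not values from the study.

        # Toy fill-time estimate: how long a metabolism needs to fill a fenestra.
        # All numbers are hypothetical placeholders, not values from the study.
        bubble_volume_ml = 0.05                    # a small fenestra

        rates_ml_per_hour = {                      # hypothetical production rates
            "fast phototrophic metabolism": 0.5,
            "slow anaerobic metabolism": 0.002,
        }
        for metabolism, rate in rates_ml_per_hour.items():
            hours = bubble_volume_ml / rate
            print(f"{metabolism}: {hours:.2f} h to fill the fenestra")
        # In agitated shallow water, a bubble that takes a day to form is far
        # less likely to survive long enough to be lithified.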

  18. Local-search based prediction of medical image registration error

    NASA Astrophysics Data System (ADS)

    Saygili, Görkem

    2018-03-01

    Medical image registration is a crucial task in many different medical imaging applications. Hence, a considerable amount of work has been published recently that aims to predict the error in a registration without any human effort. If provided, these error predictions can be used as feedback to the registration algorithm to further improve its performance. Recent methods generally start with extracting image-based and deformation-based features, then apply feature pooling, and finally train a Random Forest (RF) regressor to predict the real registration error. Image-based features can be calculated after applying a single registration but provide limited accuracy, whereas deformation-based features, such as the variation of the deformation vector field, may require up to 20 registrations, which is a considerably time-consuming task. This paper proposes to use features extracted by a local search algorithm as image-based features to estimate the error of a registration. The proposed method comprises a local search algorithm to find corresponding voxels between registered image pairs and, based on the amount of shift and stereo confidence measures, it predicts the amount of registration error in millimetres densely using an RF regressor. Compared to other algorithms in the literature, the proposed algorithm does not require multiple registrations, can be efficiently implemented on a Graphical Processing Unit (GPU), and can still provide highly accurate error predictions in the presence of large registration errors. Experimental results with real registrations on a public dataset indicate a substantially high accuracy achieved by using features from the local search algorithm.
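
    The prediction step is standard supervised regression: per-voxel features from the local search (shift magnitudes and confidence measures) are fed to a Random Forest trained against known registration errors. A minimal sketch with scikit-learn follows, using synthetic stand-in features; the paper's actual feature extraction is not reproduced here.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        # Synthetic stand-ins: per-voxel local-search shift magnitudes, two
        # confidence measures, and a known registration error in mm (toy truth).
        n_voxels = 5000
        shift_mag = rng.uniform(0, 10, n_voxels)
        confidence = rng.uniform(0, 1, (n_voxels, 2))
        X = np.column_stack([shift_mag, confidence])
        y = shift_mag * confidence[:, 0] + rng.normal(0, 0.5, n_voxels)

        # Train an RF regressor to predict dense registration error.
        rf = RandomForestRegressor(n_estimators=100, random_state=0)
        rf.fit(X[:4000], y[:4000])
        pred = rf.predict(X[4000:])
        print("mean abs. error on held-out voxels (mm):",
              float(np.abs(pred - y[4000:]).mean()))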

  19. Crosslinked elastic fibers are necessary for low energy loss in the ascending aorta.

    PubMed

    Kim, Jungsil; Staiculescu, Marius Catalin; Cocciolone, Austin J; Yanagisawa, Hiromi; Mecham, Robert P; Wagenseil, Jessica E

    2017-08-16

    In the large arteries, it is believed that elastin provides the resistance to stretch at low pressure, while collagen provides the resistance to stretch at high pressure. It is also thought that elastin is responsible for the low energy loss observed with cyclic loading. These tenets are supported through experiments that alter component amounts through protease digestion, vessel remodeling, normal growth, or in different artery types. Genetic engineering provides the opportunity to revisit these tenets through the loss of expression of specific wall components. We used newborn mice lacking elastin (Eln-/-) or two key proteins (lysyl oxidase, Lox-/-, or fibulin-4, Fbln4-/-) that are necessary for the assembly of mechanically-functional elastic fibers to investigate the contributions of elastic fibers to large artery mechanics. We determined component content and organization and quantified the nonlinear and viscoelastic mechanical behavior of Eln-/-, Lox-/-, and Fbln4-/- ascending aorta and their respective controls. We confirmed that the lack of elastin, fibulin-4, or lysyl oxidase leads to absent or highly fragmented elastic fibers in the aortic wall and a 56-97% decrease in crosslinked elastin amounts. We found that the resistance to stretch at low pressure is decreased only in Eln-/- aorta, confirming the role of elastin in the nonlinear mechanical behavior of the aortic wall. Dissipated energy with cyclic loading and unloading is increased 53-387% in Eln-/-, Lox-/-, and Fbln4-/- aorta, indicating that not only elastin, but properly assembled and crosslinked elastic fibers, are necessary for low energy loss in the aorta. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Radioisotope fueled pulsed power generation system for propulsion and electrical power for deep space missions

    NASA Astrophysics Data System (ADS)

    Howe, Troy

    Space exploration missions to the moon, Mars, and other celestial bodies have allowed for great scientific leaps to enhance our knowledge of the universe; yet the astronomical cost of these missions limits their utility to only a few select agencies. Reducing the cost of exploratory space travel will give rise to a new era of exploration, where private investors, universities, and world governments can send satellites to far-off planets and gather important data. By using radioisotope power sources and thermal storage devices, a duty cycle can be introduced to extract large amounts of energy in short amounts of time, allowing for efficient space travel. The same device can also provide electrical power for subsystems such as communications, drills, lasers, or other components that can provide valuable scientific information. This project examines the use of multiple radioisotope sources combined with a thermal capacitor using Phase Change Materials (PCMs), which can collect energy over a period of time. The result of this design culminates in a variety of possible spacecraft with their own varying costs, transit times, and objectives. Among the most promising are missions to Mars that cost less than $17M, missions that can provide power to satellite constellations for decades, and missions that can deliver large, Opportunity-sized (185 kg) payloads to Mars for less than $53M, all made available to a much wider range of customers with commercially available satellite launches from Earth. The true cost of such progress, though, lies in the sometimes substantial increase in transit times for these missions.
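
    The duty-cycle idea is plain energy bookkeeping: a small radioisotope source charges a phase-change thermal capacitor for a long time, and the stored energy is then drawn down quickly for a short, high-power burst. A back-of-the-envelope sketch follows; every number is hypothetical, chosen only to show the leverage involved.

        # Duty-cycle energy bookkeeping for a radioisotope + thermal-capacitor
        # system. All numbers are hypothetical.
        p_source_w = 100.0                 # steady radioisotope thermal output
        t_charge_s = 10 * 3600             # charge the PCM capacitor for 10 h
        efficiency = 0.3                   # usable fraction of stored energy
        t_burst_s  = 60.0                  # one-minute high-power burst

        stored_j  = p_source_w * t_charge_s              # energy banked in PCM
        p_burst_w = efficiency * stored_j / t_burst_s    # deliverable burst power

        print(f"stored energy: {stored_j / 1e6:.1f} MJ")
        print(f"burst power:   {p_burst_w / 1e3:.1f} kW "
              f"({p_burst_w / p_source_w:.0f}x the steady source power)")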

  1. Cold Trap Dismantling and Sodium Removal at a Fast Breeder Reactor - 12327

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graf, A.; Petrick, H.; Stutz, U.

    2012-07-01

    The first German prototype Fast Breeder Nuclear Reactor (KNK) is currently being dismantled after being the only operating Fast Breeder-type reactor in Germany. As this reactor type used sodium as a coolant in its primary and secondary circuits, seven cold traps containing various amounts of partially activated sodium needed to be disposed of as part of the dismantling. The resulting combined difficulties of radioactive contamination and high chemical reactivity were handled by treating the cold traps differently depending on their size and the amount of sodium contained inside. Six small cold traps were processed onsite by cutting them up into small parts using a band saw under a protective atmosphere. The sodium was then converted to sodium hydroxide by using water. The remaining large cold trap could not be handled in the same way due to its dimensions (2.9 m x 1.1 m) and the declared amount of sodium inside (1,700 kg). It was therefore manually dismantled inside a large box filled with a protective atmosphere, while the resulting pieces were packaged for later burning in a special facility. The experiences gained by KNK during this process may be advantageous for future dismantling projects in similar sodium-cooled reactors worldwide. The dismantling of a prototype fast breeder reactor provides the challenge not only of dismantling radioactive materials but also of handling sodium-contaminated or sodium-containing components. The treatment of sodium requires additional equipment and installations to ensure safe handling. Since it is not permitted to bring sodium into a repository, all sodium has to be neutralized, either through a controlled reaction with water or by incineration. The resulting components can be disposed of as normal radioactive waste with no further conditions. The handling of sodium needs skilled and experienced workers to minimize the inherent risks. The example of the disposal of the large KNK cold trap shows that interaction with other, including foreign, decommissioning projects can provide solutions which were unknown before. (authors)

  2. The distribution of soil phosphorus for global biogeochemical modeling

    DOE PAGES

    Yang, Xiaojuan; Post, Wilfred M.; Thornton, Peter E.; ...

    2013-04-16

    Phosphorus (P) is a major element required for biological activity in terrestrial ecosystems. Although the total P content in most soils can be large, only a small fraction is available or in an organic form for biological utilization, because it is bound either in incompletely weathered mineral particles, adsorbed on mineral surfaces, or, over the time of soil formation, made unavailable by secondary mineral formation (occluded). In order to adequately represent phosphorus availability in global biogeochemistry–climate models, a representation of the amount and form of P in soils globally is required. We develop an approach that builds on existing knowledge of soil P processes and databases of parent material and soil P measurements to provide spatially explicit estimates of different forms of naturally occurring soil P on the global scale. We assembled data on the various forms of phosphorus in soils globally, chronosequence information, and several global spatial databases to develop a map of total soil P and the distribution among mineral bound, labile, organic, occluded, and secondary P forms in soils globally. The amount of P, to 50 cm soil depth, in the soil labile, organic, occluded, and secondary pools is 3.6 ± 3, 8.6 ± 6, 12.2 ± 8, and 3.2 ± 2 Pg P (petagrams of P, 1 Pg = 1 × 10^15 g), respectively. The amount in soil mineral particles to the same depth is estimated at 13.0 ± 8 Pg P, for a global soil total of 40.6 ± 18 Pg P. The large uncertainty in our estimates reflects our limited understanding of the processes controlling soil P transformations during pedogenesis and a deficiency in the number of soil P measurements. In spite of the large uncertainty, the estimated global spatial variation and distribution of different soil P forms presented in this study will be useful for global biogeochemistry models that include P as a limiting element in biological production, by providing initial estimates of the available soil P for plant uptake and microbial utilization.

  3. Pilot-Induced Oscillation Prediction With Three Levels of Simulation Motion Displacement

    NASA Technical Reports Server (NTRS)

    Schroeder, Jeffery A.; Chung, William W. Y.; Tran, Duc T.; Laforce, Soren; Bengford, Norman J.

    2001-01-01

    Simulator motion platform characteristics were examined to determine if the amount of motion affects pilot-induced oscillation (PIO) prediction. Five test pilots evaluated how susceptible 18 different sets of pitch dynamics were to PIOs with three different levels of simulation motion platform displacement: large, small, and none. The pitch dynamics were those of a previous in-flight experiment, some of which elicited PIOs. These in-flight results served as truth data for the simulation; as such, the in-flight experiment was replicated as much as possible. Objective and subjective data were collected and analyzed. With large motion, PIO and handling qualities ratings matched the flight data more closely than did small motion or no motion. Also, regardless of the aircraft dynamics, large motion increased pilot confidence in assigning handling qualities ratings, reduced safety pilot trips, and lowered touchdown velocities. While both large and small motion provided a pitch rate cue of high fidelity, only large motion presented the pilot with a high fidelity vertical acceleration cue.

  4. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  5. Unit-Dose Bags For Formulating Intravenous Solutions

    NASA Technical Reports Server (NTRS)

    Finley, Mike; Kipp, Jim; Scharf, Mike; Packard, Jeff; Owens, Jim

    1993-01-01

    Smaller unit-dose flowthrough bags devised for use with large-volume parenteral (LVP) bags in preparing sterile intravenous solutions. Premeasured amount of solute stored in such unit-dose bag flushed by predetermined amount of water into LVP bag. Relatively small number of LVP bags used in conjunction with smaller unit-dose bags to formulate large number of LVP intravenous solutions in nonsterile environment.

  6. Taking Energy to the Physics Classroom from the Large Hadron Collider at CERN

    ERIC Educational Resources Information Center

    Cid, Xabier; Cid, Ramon

    2009-01-01

    In 2008, the greatest experiment in history began. When in full operation, the Large Hadron Collider (LHC) at CERN will generate the greatest amount of information that has ever been produced in an experiment before. It will also reveal some of the most fundamental secrets of nature. Despite the enormous amount of information available on this…

  7. Patchy reaction-diffusion and population abundance: the relative importance of habitat amount and arrangement

    Treesearch

    Curtis H. Flather; Michael Bevers

    2002-01-01

    A discrete reaction-diffusion model was used to estimate long-term equilibrium populations of a hypothetical species inhabiting patchy landscapes to examine the relative importance of habitat amount and arrangement in explaining population size. When examined over a broad range of habitat amounts and arrangements, population size was largely determined by a pure amount...
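
    A discrete reaction-diffusion model of this kind fits in a few lines: logistic growth on habitat cells plus nearest-neighbour dispersal, iterated to a long-term equilibrium. The sketch below is a generic illustration, not the authors' model; the landscape, parameters, and off-habitat mortality are all hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        # Random patchy landscape: 1 = habitat, 0 = non-habitat (hypothetical).
        habitat = (rng.random((50, 50)) < 0.3).astype(float)
        n = habitat * 0.1                  # initial population density

        r, K, d = 0.5, 1.0, 0.2            # growth rate, capacity, dispersal
        for _ in range(500):               # iterate toward equilibrium
            growth = r * n * (1 - n / K) * habitat
            # Nearest-neighbour dispersal (np.roll wraps: toroidal landscape).
            neighbours = (np.roll(n, 1, 0) + np.roll(n, -1, 0) +
                          np.roll(n, 1, 1) + np.roll(n, -1, 1))
            n = n + growth - d * n + d * neighbours / 4
            n = n * np.where(habitat > 0, 1.0, 0.5)   # mortality off habitat

        print("equilibrium population:", round(float(n.sum()), 1))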

  8. Resolving the tips of the tree of life: How much mitochondrial data do we need?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonett, Ronald M.; Macey, J. Robert; Boore, Jeffrey L.

    2005-04-29

    Mitochondrial (mt) DNA sequences are used extensively to reconstruct evolutionary relationships among recently diverged animals, and have constituted the most widely used markers for species- and generic-level relationships for the last decade or more. However, most studies to date have employed relatively small portions of the mt-genome. In contrast, complete mt-genomes primarily have been used to investigate deep divergences, including several studies of the amount of mt sequence necessary to recover ancient relationships. We sequenced and analyzed 24 complete mt-genomes from a group of salamander species exhibiting divergences typical of those in many species-level studies. We present the first comprehensive investigation of the amount of mt sequence data necessary to consistently recover the mt-genome tree at this level, using parsimony and Bayesian methods. Both methods of phylogenetic analysis revealed extremely similar results. A surprising number of well supported, yet conflicting, relationships were found in trees based on fragments less than ≈2000 nucleotides (nt), typical of the vast majority of the thousands of mt-based studies published to date. Large amounts of data (11,500+ nt) were necessary to consistently recover the whole mt-genome tree. Some relationships consistently were recovered with fragments of all sizes, but many nodes required the majority of the mt-genome to stabilize, particularly those associated with short internal branches. Although moderate amounts of data (2000-3000 nt) were adequate to recover mt-based relationships for which most nodes were congruent with the whole mt-genome tree, many thousands of nucleotides were necessary to resolve rapid bursts of evolution. Recent advances in genomics are making collection of large amounts of sequence data highly feasible, and our results provide the basis for comparative studies of other closely related groups to optimize mt sequence sampling and phylogenetic resolution at the 'tips' of the Tree of Life.

  9. Effects of season and nitrogen supply on the partitioning of recently fixed carbon in understory vegetation using a 13CO2 pulse labeling technique

    NASA Astrophysics Data System (ADS)

    Hasselquist, Niles; Metcalfe, Daniel; Högberg, Peter

    2013-04-01

    Vegetation research in boreal forests has traditionally been focused on trees, with little attention given to understory vegetation. However, understory vegetation has been identified as a key driver for the functioning of boreal forests and may play an important role in the amount of carbon (C) that is entering and leaving these forested ecosystems. We conducted a large-scale 13C pulse labeling experiment to better understand how recently fixed C is allocated in the understory vegetation characteristic of boreal forests. We used transparent plastic chambers to pulse label the understory vegetation with enriched 13CO2 in the early (June) and late (August) growing seasons. This study was also replicated across a nitrogen (N) fertilization treatment to better understand the effects of N availability on C allocation patterns. We present data on the amount of 13C label found in different components of the understory vegetation (i.e. leaves, stems, lichens, mosses, rhizomes and fine roots) as well as in CO2 efflux. Additionally, we provide estimates of the mean residence time (MRT) of C among the different components and examine how the MRT of C is affected by seasonality and N availability. Seasonality had a large effect on how recently fixed C is allocated in understory vegetation, whereas N fertilization influenced the MRT of C in the different components of ericaceous vegetation. Moreover, there was a general trend that N additions increased the amount of 13C in CO2 efflux compared to the amount of 13C in biomass, suggesting that N fertilization may lead to an increase in the utilization of recently fixed C, whereas N-limitation promotes the storage of recently fixed C.

  10. The Influence of Cloud Field Uniformity on Observed Cloud Amount

    NASA Astrophysics Data System (ADS)

    Riley, E.; Kleiss, J.; Kassianov, E.; Long, C. N.; Riihimaki, L.; Berg, L. K.

    2017-12-01

    Two ground-based measurements of cloud amount include cloud fraction (CF) obtained from time series of zenith-pointing radar-lidar observations and fractional sky cover (FSC) acquired from a Total Sky Imager (TSI). In comparison with the radars and lidars, the TSI has a considerably larger field of view (FOV 100° vs. 0.2°) and therefore is expected to have a different sensitivity to inhomogeneity in a cloud field. Radiative transfer calculations based on cloud properties retrieved from narrow-FOV overhead cloud observations may differ from shortwave and longwave flux observations due to spatial variability in local cloud cover. This bias will impede radiative closure for sampling reasons rather than because of the accuracy of cloud microphysics retrievals or radiative transfer calculations. Furthermore, the comparison between observed and modeled cloud amount from large eddy simulation (LES) models may be affected by cloud field inhomogeneity. The main goal of our study is to estimate the anticipated impact of cloud field inhomogeneity on the level of agreement between CF and FSC. We focus on shallow cumulus clouds observed at the U.S. Department of Energy Atmospheric Radiation Measurement Facility's Southern Great Plains (SGP) site in Oklahoma, USA. Our analysis identifies cloud field inhomogeneity using a novel metric that quantifies the spatial and temporal uniformity of FSC over 100-degree FOV TSI images. We demonstrate that (1) large differences between CF and FSC are partly attributable to increases in inhomogeneity and (2) using the uniformity metric can provide a meaningful assessment of uncertainties in observed cloud amount to aid in comparing ground-based measurements to radiative transfer or LES model outputs at SGP.
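
    The mismatch between the two measures, and the role of the uniformity metric, can be stated concretely: CF is the cloudy fraction of a narrow-FOV zenith sample, FSC is the cloudy fraction of a wide-FOV sky image, and a simple (hypothetical) uniformity metric is the spread of FSC across sub-regions of that image. A sketch on a synthetic cloud mask, with all data hypothetical:

        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic wide-FOV sky image: 1 = cloudy, 0 = clear (hypothetical).
        # Clouds sit on one side of the sky, making the field inhomogeneous.
        sky = np.zeros((100, 100))
        sky[:, :40] = rng.random((100, 40)) < 0.8

        fsc = sky.mean()                   # fractional sky cover (wide FOV)
        cf = sky[50, 50]                   # narrow-FOV zenith sample (a single
                                           # pixel here; a time series in practice)

        # Simple non-uniformity metric: spread of FSC across image sub-blocks.
        blocks = [sky[i:i + 25, j:j + 25].mean()
                  for i in range(0, 100, 25) for j in range(0, 100, 25)]
        print(f"FSC={fsc:.2f}  zenith sample={cf:.0f}  "
              f"non-uniformity={np.std(blocks):.2f}")
        # A large spread flags scenes where narrow-FOV CF and wide-FOV FSC are
        # expected to disagree.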

  11. Actual consumption amount of personal care products reflecting Japanese cosmetic habits.

    PubMed

    Yamaguchi, Masahiko; Araki, Daisuke; Kanamori, Takeshi; Okiyama, Yasuko; Seto, Hirokazu; Uda, Masaki; Usami, Masahito; Yamamoto, Yutaka; Masunaga, Takuji; Sasa, Hitoshi

    2017-01-01

    Safety assessments of cosmetics are carried out by identifying possible harmful effects of substances in cosmetic products and assessing the exposure to products containing these substances. The present study provided data on the amounts of cosmetic products consumed in Japan to enhance and complement the existing data from Europe and the United States, i.e., the West. The outcomes of this study increase the accuracy of exposure assessments and enable more sophisticated risk assessment as a part of the safety assessment of cosmetic products. Actual amounts of products applied were calculated by determining the difference in the weight of products before and after use by approximately 300 subjects. The results of the study of skincare products revealed that in comparison with the West, large amounts of lotions and emulsions were applied, whereas lower amounts of cream and essence were applied in Japan. In the study of sunscreen products, actual measured values during outdoor leisure use were obtained, and these were lower than the values from the West. The study of the use of facial mask packs yielded data on typical Japanese sheet-type impregnated masks and revealed that high amounts were applied. Furthermore, data were obtained on cleansing foams, makeup removers and makeup products. The data from the present study enhance and complement existing information and will facilitate more sophisticated risk assessments. The present results should be extremely useful in safety assessments of newly developed cosmetic products and to regulatory authorities in Japan and around the world.

  12. Opportunistic citizen science data transform understanding of species distributions, phenology, and diversity gradients for global change research.

    PubMed

    Soroye, Peter; Ahmed, Najeeba; Kerr, Jeremy T

    2018-06-19

    Opportunistic citizen science (CS) programs allow volunteers to report species observations from anywhere, at any time, and can assemble large volumes of historic and current data at faster rates than more coordinated programs with standardized data collection. This can quickly provide large amounts of species distributional data, but whether this focus on participation comes at a cost in data quality is not clear. While automated and expert vetting can increase data reliability, there is no guarantee that opportunistic data will do anything more than confirm information from professional surveys. Here, we use eButterfly, an opportunistic CS program, and a comparable dataset of professionally collected observations, to measure the amount of new distributional species information that opportunistic CS generates. We also test how well opportunistic CS can estimate regional species richness for a large group of taxa (>300 butterfly species) across a broad area. We find that eButterfly contributes new distributional information for >80% of species, and that opportunistically submitting observations allowed volunteers to spot species ~35 days earlier than professionals. While eButterfly did a relatively poor job at predicting regional species richness by itself (detecting only about 35-57% of species per region), it significantly contributed to regional species richness when used with the professional dataset (adding ~3 species that had gone undetected in professional surveys per region). Overall, we find that the opportunistic CS model can provide substantial complementary species information when used alongside professional survey data. Our results suggest that data from opportunistic CS programs in conjunction with professional datasets can strongly increase the capacity of researchers to estimate species richness, and provide unique information on species distributions and phenologies that are relevant to the detection of the biological consequences of global change. This article is protected by copyright. All rights reserved.

  13. Enhancement of in vitro Guayule propagation

    NASA Technical Reports Server (NTRS)

    Dastoor, M. N.; Schubert, W. W.; Petersen, G. R. (Inventor)

    1982-01-01

    A method for stimulating in vitro propagation of Guayule from a nutrient medium containing Guayule tissue by adding a substituted trialkyl amine bioinducing agent to the nutrient medium is described. Selective or differentiated propagation of shoots or callus is obtained by varying the amounts of substituted trialkyl amine present in the nutrient medium. The luxuriant growth provided may be processed for its polyisoprene content or may be transferred to a rooting medium for production of whole plants as identical clones of the original tissue. The method also provides for the production of large numbers of Guayule plants having identical desirable properties such as high polyisoprene levels.

  14. Diabetes Interactive Atlas

    PubMed Central

    Burrows, Nilka R.; Geiss, Linda S.

    2014-01-01

    The Diabetes Interactive Atlas is a recently released Web-based collection of maps that allows users to view geographic patterns and examine trends in diabetes and its risk factors over time across the United States and within states. The atlas provides maps, tables, graphs, and motion charts that depict national, state, and county data. Large amounts of data can be viewed in various ways simultaneously. In this article, we describe the design and technical issues for developing the atlas and provide an overview of the atlas’ maps and graphs. The Diabetes Interactive Atlas improves visualization of geographic patterns, highlights observation of trends, and demonstrates the concomitant geographic and temporal growth of diabetes and obesity. PMID:24503340

  15. Visual attention mitigates information loss in small- and large-scale neural codes.

    PubMed

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-04-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires that sensory signals are processed in a manner that protects information about relevant stimuli from degradation. Such selective processing--or selective attention--is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, thereby providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Handspinning Enabled Highly Concentrated Carbon Nanotubes with Controlled Orientation in Nanofibers

    PubMed Central

    Lee, Hoik; Watanabe, Kei; Kim, Myungwoong; Gopiraman, Mayakrishnan; Song, Kyung-Hun; Lee, Jung Soon; Kim, Ick Soo

    2016-01-01

    The novel handspinning (HS) method was invented by mimicking methods commonly observed in daily life. The use of HS allows us to fabricate carbon nanotube-reinforced nanofibers (CNT-reinforced nanofibers) by addressing three significant challenges: (i) the difficulty of forming nanofibers at high concentrations of CNTs, (ii) aggregation of the CNTs, and (iii) control of the orientation of the CNTs. The handspun nanofibers showed better physical properties than fibers fabricated by conventional methods such as electrospinning. Handspun nanofibers retain a larger amount of CNTs than electrospun nanofibers, and the CNTs are easily aligned uniaxially. We attribute these improvements to the simple mechanical stretching force of the HS process, which orients the nanofillers along the force direction without agglomeration, increasing the contact area between the CNTs and the polymer matrix and thereby enhancing their interactions. HS is a simple and straightforward method because it does not require an electric field, so any kind of polymer and solvent can be used. Furthermore, it is feasible to retain large amounts of various nanofillers in the fibers to enhance their physical and chemical properties. Therefore, HS provides an effective pathway to create new types of reinforced nanofibers with outstanding properties. PMID:27876892

  17. Time and Space Efficient Algorithms for Two-Party Authenticated Data Structures

    NASA Astrophysics Data System (ADS)

    Papamanthou, Charalampos; Tamassia, Roberto

    Authentication is increasingly relevant to data management. Data is being outsourced to untrusted servers, and clients want to securely update and query their data. For example, in database outsourcing, a client's database is stored and maintained by an untrusted server. Also, in simple storage systems, clients can store very large amounts of data but want assurance of their integrity when they retrieve them. In this paper, we present a model and protocol for two-party authentication of data structures. Namely, a client outsources its data structure and verifies that the answers to its queries have not been tampered with. We provide efficient algorithms to securely outsource a skip list with logarithmic time overhead at the server and client and logarithmic communication cost, thus providing an efficient authentication primitive for outsourced data, both structured (e.g., relational databases) and semi-structured (e.g., XML documents). In our technique, the client stores only a constant amount of space, which is optimal. Our two-party authentication framework can be deployed on top of existing storage applications, thus providing an efficient authentication service. Finally, we present experimental results that demonstrate the practical efficiency and scalability of our scheme.
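
    The constant client-side storage mentioned above is the hallmark of hash-based authentication schemes: the client keeps only a root digest and checks each answer against a short proof. The following is a minimal sketch of that general idea using a Merkle-style proof check; the proof layout and function names are illustrative assumptions, not the paper's skip-list protocol:

        import hashlib

        def sha(data: bytes) -> bytes:
            return hashlib.sha256(data).digest()

        def verify(root: bytes, answer: bytes, proof) -> bool:
            """Recompute the root digest from an answer and a proof, given
            as a list of (sibling_digest, sibling_is_left) pairs. The
            client stores only `root`, a constant amount of space."""
            digest = sha(answer)
            for sibling, is_left in proof:
                digest = sha(sibling + digest) if is_left else sha(digest + sibling)
            return digest == root

    The server returns the query answer together with the proof; any tampering with the answer or the outsourced structure changes the recomputed digest and the check fails.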

  18. Fatty acids in energy metabolism of the central nervous system.

    PubMed

    Panov, Alexander; Orynbayeva, Zulfiya; Vavilin, Valentin; Lyakhovich, Vyacheslav

    2014-01-01

    In this review, we analyze the current hypotheses regarding energy metabolism in neurons and astroglia. Recently, it was shown that up to 20% of the brain's total energy is provided by mitochondrial oxidation of fatty acids. However, the existing hypotheses consider glucose, or its derivative lactate, as the only main energy substrate for the brain. Astroglia metabolically support the neurons by providing lactate as a substrate for neuronal mitochondria. In addition, significant amounts of the neuromediators glutamate and GABA are transported into neurons and also serve as substrates for mitochondria. Thus, neuronal mitochondria may simultaneously oxidize several substrates. Astrocytes have to replenish the pool of neuromediators by de novo synthesis, which requires large amounts of energy. In this review, we attempt to reconcile β-oxidation of fatty acids by astrocytic mitochondria with the existing hypothesis on regulation of aerobic glycolysis. We suggest that, under conditions of neuronal excitation, both metabolic pathways may exist simultaneously. We provide experimental evidence that isolated neuronal mitochondria may oxidize palmitoyl carnitine in the presence of other mitochondrial substrates. We also suggest that variations in the brain mitochondrial metabolic phenotype may be associated with different mtDNA haplogroups.

  19. Fatty Acids in Energy Metabolism of the Central Nervous System

    PubMed Central

    Orynbayeva, Zulfiya; Vavilin, Valentin; Lyakhovich, Vyacheslav

    2014-01-01

    In this review, we analyze the current hypotheses regarding energy metabolism in neurons and astroglia. Recently, it was shown that up to 20% of the brain's total energy is provided by mitochondrial oxidation of fatty acids. However, the existing hypotheses consider glucose, or its derivative lactate, as the only main energy substrate for the brain. Astroglia metabolically support the neurons by providing lactate as a substrate for neuronal mitochondria. In addition, significant amounts of the neuromediators glutamate and GABA are transported into neurons and also serve as substrates for mitochondria. Thus, neuronal mitochondria may simultaneously oxidize several substrates. Astrocytes have to replenish the pool of neuromediators by de novo synthesis, which requires large amounts of energy. In this review, we attempt to reconcile β-oxidation of fatty acids by astrocytic mitochondria with the existing hypothesis on regulation of aerobic glycolysis. We suggest that, under conditions of neuronal excitation, both metabolic pathways may exist simultaneously. We provide experimental evidence that isolated neuronal mitochondria may oxidize palmitoyl carnitine in the presence of other mitochondrial substrates. We also suggest that variations in the brain mitochondrial metabolic phenotype may be associated with different mtDNA haplogroups. PMID:24883315

  20. An Extensible Processing Framework for Eddy-covariance Data

    NASA Astrophysics Data System (ADS)

    Durden, D.; Fox, A. M.; Metzger, S.; Sturtevant, C.; Durden, N. P.; Luo, H.

    2016-12-01

    The evolution of large data-collecting networks has led not only to an increase in available information but also to greater complexity in analyzing the observations. Timely dissemination of readily usable data products necessitates a streaming processing framework that is both automatable and flexible. Tower networks such as ICOS, Ameriflux, and NEON exemplify this issue by requiring large amounts of data to be processed from dispersed measurement sites. Eddy-covariance data from across the NEON network are expected to amount to 100 gigabytes per day. The complexity of the algorithmic processing necessary to produce high-quality data products, together with the continued development of new analysis techniques, led to the development of a modular R package, eddy4R. This allows algorithms provided by NEON and the larger community to be deployed in streaming processing and used by community members alike. In order to control the processing environment, provide a proficient parallel processing structure, and ensure dependencies are available during processing, we chose Docker as our "Development and Operations" (DevOps) platform. The Docker framework allows our processing algorithms to be developed, maintained and deployed at scale. Additionally, the eddy4R-Docker framework fosters community use and extensibility via pre-built Docker images and the GitHub distributed version control system. The capability to process large data sets relies on efficient input and output of data, data compressibility to reduce compute resource loads, and the ability to easily package metadata. The Hierarchical Data Format (HDF5) is a file format that can meet these needs. A NEON-standard HDF5 file structure and metadata attributes allow users to explore larger data sets in an intuitive "directory-like" structure adopting the NEON data product naming conventions.
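
    The "directory-like" HDF5 layout described above can be sketched with h5py in Python (the framework itself is an R package); the group path, dataset name, and attribute below are hypothetical placeholders, not NEON's actual naming convention:

        import h5py
        import numpy as np

        # Hierarchical layout: site group -> data-product group -> dataset,
        # with units and other metadata attached as attributes.
        # All names here are illustrative only.
        with h5py.File("tower_example.h5", "w") as f:
            dset = f.create_dataset("SITE01/fluxCo2/turb",
                                    data=np.random.rand(1440),
                                    compression="gzip")
            dset.attrs["units"] = "umol m-2 s-1"

        # Read back by navigating the path like a directory tree.
        with h5py.File("tower_example.h5", "r") as f:
            print(f["SITE01/fluxCo2/turb"].attrs["units"])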

  1. 1H NMR quantitative determination of photosynthetic pigments from green beans (Phaseolus vulgaris L.).

    PubMed

    Valverde, Juan; This, Hervé

    2008-01-23

    Using 1H nuclear magnetic resonance spectroscopy (1D and 2D), the two types of photosynthetic pigments (chlorophylls and their derivatives, and carotenoids) of "green beans" (immature pods of Phaseolus vulgaris L.) were analyzed. Compared to other analytical methods (light spectroscopy or chromatography), 1H NMR spectroscopy is a fast analytical method that provides more information on chlorophyll derivatives (allomers and epimers) than ultraviolet-visible spectroscopy. Moreover, it gives a large amount of data without prior chromatographic separation.

  2. Techniques for increasing the efficiency of Earth gravity calculations for precision orbit determination

    NASA Technical Reports Server (NTRS)

    Smith, R. L.; Lyubomirsky, A. S.

    1981-01-01

    Two techniques were analyzed. The first is a representation using Chebyshev expansions in three-dimensional cells. The second technique employs a temporary file for storing the components of the nonspherical gravity force. Computer storage requirements and relative CPU time requirements are presented. The Chebyshev gravity representation can provide a significant reduction in CPU time in precision orbit calculations, but at the cost of a large amount of direct-access storage space, which is required for a global model.
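
    As a one-dimensional sketch of the first technique (the cell structure and the offline coefficient fitting are omitted, and the coefficients below are arbitrary placeholders), evaluating a stored Chebyshev expansion is a cheap polynomial evaluation compared with re-summing a full spherical-harmonic gravity model at every integration step:

        import numpy as np
        from numpy.polynomial import chebyshev as C

        # Placeholder coefficients, as if fitted offline for one cell and
        # read back from direct-access storage during orbit integration.
        coeffs = np.array([1.0, -0.3, 0.05, -0.004])
        x = np.linspace(-1.0, 1.0, 5)   # position normalized to the cell
        print(C.chebval(x, coeffs))     # fast evaluation inside the cell

    The CPU savings come precisely from this trade: per-cell polynomial evaluation is fast, but coefficients must be stored for every cell, which is the large direct-access storage cost the abstract notes.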

  3. Phoenix Missile Hypersonic Testbed (PMHT): Project Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    An overview is provided of research into a low-cost hypersonic flight test capability intended to increase the amount of hypersonic flight data and help bridge the large developmental gap between ground testing/analysis and major flight demonstrator X-planes. The major objectives included: developing an air-launched missile booster research testbed; accurately delivering research payloads to hypersonic test conditions through programmable guidance; low cost; a high flight rate (a minimum of two flights per year); and utilization of surplus air-launched missiles and NASA aircraft.

  4. Technology Requirements for Information Management

    NASA Technical Reports Server (NTRS)

    Graves, Sara; Knoblock, Craig A.; Lannom, Larry

    2002-01-01

    This report provides the results of a panel study conducted into the technology requirements for information management in support of application domains of particular government interest, including digital libraries, mission operations, and scientific research. The panel concluded that it was desirable to have a coordinated program of R&D that pursues a science of information management focused on an environment typified by applications of government interest - highly distributed with very large amounts of data and a high degree of heterogeneity of sources, data, and users.

  5. Avian influenza virus and free-ranging wild birds

    USGS Publications Warehouse

    Dierauf, Leslie A.; Karesh, W.B.; Ip, Hon S.; Gilardi, K.V.; Fischer, John R.

    2006-01-01

    Recent media and news reports and other information implicate wild birds in the spread of highly pathogenic avian influenza in Asia and Eastern Europe. Although there is little information concerning highly pathogenic avian influenza viruses in wild birds, scientists have amassed a large amount of data on low-pathogenicity avian influenza viruses during decades of research with wild birds. This knowledge can provide sound guidance to veterinarians, public health professionals, the general public, government agencies, and other entities with concerns about avian influenza.

  6. Entropy of gaseous boron monobromide

    NASA Astrophysics Data System (ADS)

    Wang, Jian-Feng; Peng, Xiao-Long; Zhang, Lie-Hui; Wang, Chao-Wen; Jia, Chun-Sheng

    2017-10-01

    We present an explicit representation of the molar entropy of gaseous boron monobromide in terms of experimental values of only three molecular constants. Through comparison of theoretically calculated results and experimental data, we find that the molar entropy of gaseous boron monobromide can be well predicted by employing the improved Manning-Rosen oscillator to describe the internal vibration of the boron monobromide molecule. The present approach also provides opportunities for theoretical prediction of the molar entropy of other gases without the use of large amounts of experimental spectroscopic data.

  7. Privacy Challenges of Genomic Big Data.

    PubMed

    Shen, Hong; Ma, Jian

    2017-01-01

    With the rapid advancement of high-throughput DNA sequencing technologies, genomics has become a big data discipline where large-scale genetic information of human individuals can be obtained efficiently at low cost. However, such massive amounts of personal genomic data create tremendous challenges for privacy, especially given the emergence of the direct-to-consumer (DTC) industry that provides genetic testing services. Here we review recent developments in genomic big data and their implications for privacy. We also discuss the current dilemmas and future challenges of genomic privacy.

  8. New Engineering Solutions in Creation of Mini-BOF for Metallic Waste Recycling

    NASA Astrophysics Data System (ADS)

    Eronko, S. P.; Gorbatyuk, S. M.; Oshovskaya, E. V.; Starodubtsev, B. I.

    2017-12-01

    New engineering solutions used in the design of a mini melting unit capable of recycling industrial and domestic metallic waste with a high content of harmful impurities are provided. The high efficiency of the process technology implemented with its use is achieved through intensified heat and mass transfer in the molten metal bath, controlled charging of large amounts of lump and fine reagents into the bath, and cut-off of remaining process slag during metal tapping into the teeming ladle.

  9. Combines Attitude Control and Energy Storage for Small Satellites using Variable Speed Control Moment Gyroscopes

    DTIC Science & Technology

    2008-06-24

    Excerpted fragments: the CMG torque amplification property, in which a small amount of CMG gimbal motor input torque results in a relatively large slewing torque, gives it a... properties these actuators provide [74–77]. 2.1.3 Magnetic Levitation and Bearing Technology: magnetic bearings for flywheel rotor suspension have a rich... [Residue of Table 2.1, "Example of Magnetic and Ball Bearing Properties" [21]: columns list bearing ID, OD and height (mm), radial static capacity (N), and maximum speed (RPM); only the first row label, MB-R-25-205 (25 mm ID), survives.]

  10. Quantitative mapping of rainfall rates over the oceans utilizing Nimbus-5 ESMR data

    NASA Technical Reports Server (NTRS)

    Rao, M. S. V.; Abbott, W. V.

    1976-01-01

    The electrically scanning microwave radiometer (ESMR) data from the Nimbus 5 satellite were used to deduce estimates of precipitation amount over the oceans. An atlas of global oceanic rainfall was prepared, and the global rainfall maps were analyzed and related to available ground truth information as well as to large-scale processes in the atmosphere. It was concluded that the ESMR system provides the most reliable and direct approach yet known for the estimation of rainfall over sparsely documented, wide oceanic regions.

  11. Universal SaaS platform of internet of things for real-time monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tongke; Wu, Gang

    2018-04-01

    Real-time monitoring, as one type of IoT (Internet of Things) service, has a wide range of application scenarios. To support rapid construction and deployment of applications and avoid repetitive development work, this paper designs and develops a universal SaaS platform of IoT for real-time monitoring. Evaluation shows that this platform can provide SaaS service to multiple tenants and achieve high real-time performance when large numbers of devices are connected.

  12. Experience in highly parallel processing using DAP

    NASA Technical Reports Server (NTRS)

    Parkinson, D.

    1987-01-01

    Distributed Array Processors (DAP) have been in day to day use for ten years and a large amount of user experience has been gained. The profile of user applications is similar to that of the Massively Parallel Processor (MPP) working group. Experience has shown that contrary to expectations, highly parallel systems provide excellent performance on so-called dirty problems such as the physics part of meteorological codes. The reasons for this observation are discussed. The arguments against replacing bit processors with floating point processors are also discussed.

  13. Human interface to large multimedia databases

    NASA Astrophysics Data System (ADS)

    Davis, Ben; Marks, Linn; Collins, Dave; Mack, Robert; Malkin, Peter; Nguyen, Tam

    1994-04-01

    The emergence of high-speed networking for multimedia will have the effect of turning the computer screen into a window on a very large information space. As this information space increases in size and complexity, providing users with easy and intuitive means of accessing information will become increasingly important. Providing access to large amounts of text has been the focus of work for hundreds of years and has resulted in the evolution of a set of standards, from the Dewey Decimal System for libraries to the recently proposed ANSI standards for representing information on-line: KIF, Knowledge Interchange Format, and CGs, Conceptual Graphs. Certain problems remain unsolved by these efforts, though: how to let users know the contents of the information space, so that they know whether or not they want to search it in the first place, how to facilitate browsing, and, more specifically, how to facilitate visual browsing. These issues are particularly important for users in educational contexts and have been the focus of much of our recent work. In this paper we discuss some of the solutions we have prototyped: specifically, visual means, visual browsers, and visual definitional sequences.

  14. How much land is needed for feral pig hunting in Hawai'i?

    USGS Publications Warehouse

    Hess, Steven C.; Jacobi, James D.

    2014-01-01

    Hunting is often considered to be incompatible with conservation of native biota and watershed functions in Hawai'i. Management actions for conservation generally exclude large non-native mammals from natural areas, thereby reducing the amount of land available for hunting activities and the maintenance of sustainable game populations. One useful way to address the minimum amount of land that must be allocated for hunting in Hawai'i is to determine the land area necessary to sustain populations of hunted animals at current levels of public harvest. We ask: what is the total amount of land necessary to provide sustained-yield hunting of game meat for food at the current harvest level on Hawai'i Island if only feral pigs (Sus scrofa) were to be harvested? We used a simple analysis to estimate that 1,317.6-1,651.4 km2 would be necessary to produce 187,333.6 kg of feral pig meat annually, based on the range of dressed weight per whole pig, the proportion of a pig population that can be sustainably removed annually, and the density of pig populations in the wild. This area comprises 12.6-15.8% of the total land area of Hawai'i Island, but more likely represents 27.6-43.5% of areas that may be compatible with sustained-yield hunting.
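
    The quoted area range follows from a simple sustained-yield calculation; a sketch with placeholder parameters (the study's actual dressed weights, removal fractions, and pig densities are not reproduced here):

        # Land needed = annual harvest / (dressed weight per pig
        #   * sustainable removal fraction * pigs per km^2).
        # Parameter values below are illustrative assumptions only.
        HARVEST_KG = 187_333.6   # annual harvest from the abstract

        def area_km2(dressed_kg, removal_frac, density_per_km2):
            yield_per_km2 = dressed_kg * removal_frac * density_per_km2
            return HARVEST_KG / yield_per_km2

        print(area_km2(dressed_kg=25.0, removal_frac=0.3, density_per_km2=15.0))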

  15. Old age and underlying interstitial abnormalities are risk factors for development of ARDS after pleurodesis using limited amount of large particle size talc.

    PubMed

    Shinno, Yuki; Kage, Hidenori; Chino, Haruka; Inaba, Atsushi; Arakawa, Sayaka; Noguchi, Satoshi; Amano, Yosuke; Yamauchi, Yasuhiro; Tanaka, Goh; Nagase, Takahide

    2018-01-01

    Talc pleurodesis is commonly performed to manage refractory pleural effusion or pneumothorax. It is considered as a safe procedure as long as a limited amount of large particle size talc is used. However, acute respiratory distress syndrome (ARDS) is a rare but serious complication after talc pleurodesis. We sought to determine the risk factors for the development of ARDS after pleurodesis using a limited amount of large particle size talc. We retrospectively reviewed patients who underwent pleurodesis with talc or OK-432 at the University of Tokyo Hospital. Twenty-seven and 35 patients underwent chemical pleurodesis using large particle size talc (4 g or less) or OK-432, respectively. Four of 27 (15%) patients developed ARDS after talc pleurodesis. Patients who developed ARDS were significantly older than those who did not (median 80 vs 66 years, P = 0.02) and had a higher prevalence of underlying interstitial abnormalities on chest computed tomography (CT; 2/4 vs 1/23, P < 0.05). No patient developed ARDS after pleurodesis with OK-432. This is the first case series of ARDS after pleurodesis using a limited amount of large particle size talc. Older age and underlying interstitial abnormalities on chest CT seem to be risk factors for developing ARDS after talc pleurodesis. © 2017 Asian Pacific Society of Respirology.

  16. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  17. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  18. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  19. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  20. 9 CFR 381.412 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  1. 9 CFR 317.312 - Reference amounts customarily consumed per eating occasion.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... appropriate national food consumption surveys. (2) The Reference Amounts are calculated for an infant or child... are based on data set forth in appropriate national food consumption surveys. Such Reference Amounts... child under 4 years of age. (3) An appropriate national food consumption survey includes a large sample...

  2. Bag For Formulating And Dispersing Intravenous Solution

    NASA Technical Reports Server (NTRS)

    Kipp, Jim; Owens, Jim; Scharf, Mike; Finley, Mike; Dudar, Tom; Veillon, Joe; Ogle, Jim

    1993-01-01

    A large-volume parenteral (LVP) bag in which a predetermined amount of sterile solution is formulated by combining a premeasured, prepackaged amount of sterile solute with a predetermined amount of water. The bag is designed to hold a predetermined amount, typically 1 L, of sterile solution. Sterility of the solution is maintained during mixing by passing the water into the bag through a sterilizing filter. The system can be used in the field or in hospitals lacking proper sterile facilities, and in field research.

  3. Low-authority control synthesis for large space structures

    NASA Technical Reports Server (NTRS)

    Aubrun, J. N.; Margulies, G.

    1982-01-01

    The control of vibrations of large space structures by distributed sensors and actuators is studied. A procedure is developed for calculating the feedback loop gains required to achieve specified amounts of damping. For moderate damping (Low Authority Control) the procedure is purely algebraic, but it can be applied iteratively when larger amounts of damping are required and is generalized for arbitrary time invariant systems.

  4. A divide-and-conquer algorithm for large-scale de novo transcriptome assembly through combining small assemblies from existing algorithms.

    PubMed

    Sze, Sing-Hoi; Parrott, Jonathan J; Tarone, Aaron M

    2017-12-06

    While the continued development of high-throughput sequencing has facilitated studies of entire transcriptomes in non-model organisms, the incorporation of an increasing number of RNA-Seq libraries has made de novo transcriptome assembly difficult. Although algorithms that can assemble a large amount of RNA-Seq data are available, they are generally very memory-intensive and can only be used to construct small assemblies. We develop a divide-and-conquer strategy that allows these algorithms to be utilized by subdividing a large RNA-Seq data set into small libraries. Each individual library is assembled independently by an existing algorithm, and a merging algorithm is developed to combine these assemblies by picking a subset of high-quality transcripts to form a large transcriptome. Compared to existing algorithms that return a single assembly directly, this strategy achieves accuracy comparable to or better than that of memory-efficient algorithms that can process a large amount of RNA-Seq data, and comparable to or slightly below that of memory-intensive algorithms that can only be used to construct small assemblies. Our divide-and-conquer strategy allows memory-intensive de novo transcriptome assembly algorithms to be utilized to construct large assemblies.
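
    A minimal sketch of the divide-and-conquer pattern under stated assumptions: each small library has already been assembled by an external tool, and the per-library assemblies are combined by keeping transcripts not contained in an already-kept, longer one. Substring containment is a crude stand-in for the paper's quality-based transcript selection:

        def merge_assemblies(assemblies, min_len=20):
            """Combine per-library assemblies into one transcriptome by
            keeping transcripts that are long enough and not substrings
            of an already-kept transcript."""
            kept = []
            for transcripts in assemblies:
                for t in sorted(transcripts, key=len, reverse=True):
                    if len(t) >= min_len and not any(t in k for k in kept):
                        kept.append(t)
            return kept

        # Toy example with two independently assembled 'libraries'.
        lib1 = ["ATCGATCGATCGATCGATCGAT", "GGGCCC"]
        lib2 = ["ATCGATCGATCGATCGATCGATAA", "TTTTAAAACCCCGGGGTTTTAAAA"]
        print(merge_assemblies([lib1, lib2]))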

  5. Towards Personalized Medicine Mediated by in Vitro Virus-Based Interactome Approaches

    PubMed Central

    Ohashi, Hiroyuki; Miyamoto-Sato, Etsuko

    2014-01-01

    We have developed a simple in vitro virus (IVV) selection system based on cell-free co-translation, using a highly stable and efficient mRNA display method. The IVV system is applicable to the high-throughput and comprehensive analysis of proteins and protein–ligand interactions. Huge amounts of genomic sequence data have been generated over the last decade. The accumulated genetic alterations and the interactome networks identified within cells represent a universal feature of a disease, and knowledge of these aspects can help to determine the optimal therapy for the disease. The concept of the “integrome” has been developed as a means of integrating large amounts of data. We have developed an interactome analysis method aimed at providing individually-targeted health care. We also consider future prospects for this system. PMID:24756093

  6. Analysis of inorganic and organic constituents of myrrh resin by GC-MS and ICP-MS: An emphasis on medicinal assets.

    PubMed

    Ahamad, Syed Rizwan; Al-Ghadeer, Abdul Rahman; Ali, Raisuddin; Qamar, Wajhul; Aljarboa, Suliman

    2017-07-01

    The aim of the present investigation was to explore the constituents of the Arabian myrrh resin obtained from Commiphora myrrha. The organic and inorganic composition of the myrrh gum resin was investigated using gas chromatography-mass spectrometry (GC-MS) and inductively coupled plasma-mass spectrometry (ICP-MS). The ICP-MS analysis reveals the presence of various inorganic elements in significant amounts in the myrrh resin. The elements found to be present in large amounts include calcium, magnesium, aluminum, phosphorus, chlorine, chromium, bromine and scandium. The important organic constituents identified in the myrrh ethanolic extract include limonene, curzerene, germacrene B, isocericenine, myrcenol, beta-selinene, and spathulenol. The present work complements other myrrh-associated investigations done in the past and provides additional data for future research.

  7. Circulating plant miRNAs can regulate human gene expression in vitro

    PubMed Central

    Pastrello, Chiara; Tsay, Mike; McQuaid, Rosanne; Abovsky, Mark; Pasini, Elisa; Shirdel, Elize; Angeli, Marc; Tokar, Tomas; Jamnik, Joseph; Kotlyar, Max; Jurisicova, Andrea; Kotsopoulos, Joanne; El-Sohemy, Ahmed; Jurisica, Igor

    2016-01-01

    While Brassica oleracea vegetables have been linked to cancer prevention, the exact mechanism remains unknown. Regulation of gene expression by cross-species microRNAs has been previously reported; however, its link to cancer suppression remains unexplored. In this study we address both issues. We confirm plant microRNAs in human blood in a large nutrigenomics study cohort and in a randomized dose-controlled trial, finding a significant positive correlation between the daily amount of broccoli consumed and the amount of microRNA in the blood. We also demonstrate that Brassica microRNAs regulate expression of human genes and proteins in vitro, and that microRNAs cooperate with other Brassica-specific compounds in a possible cancer-preventive mechanism. Combined, we provide strong evidence and a possible multimodal mechanism for broccoli in cancer prevention. PMID:27604570

  8. Assimilation of granite by basaltic magma at Burnt Lava flow, Medicine Lake volcano, northern California: Decoupling of heat and mass transfer

    USGS Publications Warehouse

    Grove, T.L.; Kinzler, R.J.; Baker, M.B.; Donnelly-Nolan, J. M.; Lesher, C.E.

    1988-01-01

    At Medicine Lake volcano, California, andesite of the Holocene Burnt Lava flow has been produced by fractional crystallization of parental high alumina basalt (HAB) accompanied by assimilation of granitic crustal material. Burnt Lava contains inclusions of quenched HAB liquid, a potential parent magma of the andesite, highly melted granitic crustal xenoliths, and xenocryst assemblages which provide a record of the fractional crystallization and crustal assimilation process. Samples of granitic crustal material occur as xenoliths in other Holocene and Pleistocene lavas, and these xenoliths are used to constrain geochemical models of the assimilation process. A large amount of assimilation accompanied fractional crystallization to produce the contaminated Burnt Lava andesites. Models which assume that assimilation and fractionation occurred simultaneously estimate the ratio of assimilation to fractional crystallization (R) to be >1, and best fits to all geochemical data are at an R value of 1.35 at F=0.68. Petrologic evidence, however, indicates that the assimilation process did not involve continuous addition of granitic crust as fractionation occurred. Instead, heat and mass transfer were separated in space and time. During the assimilation process, HAB magma underwent large amounts of fractional crystallization which were not accompanied by significant amounts of assimilation. This fractionation process supplied heat to melt granitic crust. The models proposed to explain the contamination process involve fractionation, replenishment by parental HAB, and mixing of evolved and parental magmas with melted granitic crust. © 1988 Springer-Verlag.

  9. Supply of large woody debris in a stream channel

    USGS Publications Warehouse

    Diehl, Timothy H.; Bryan, Bradley A.

    1993-01-01

    The amount of large woody debris that potentially could be transported to bridge sites was assessed in the basin of the West Harpeth River in Tennessee in the fall of 1992. The assessment was based on inspections of study sites at 12 bridges and examination of channel reaches between bridges. It involved estimating the amount of woody material at least 1.5 meters long, stored in the channel, and not rooted in soil. Study of multiple sites allowed estimation of the amount, characteristics, and sources of debris stored in the channel, and identification of geomorphic features of the channel associated with debris production. Woody debris is plentiful in the channel network, and much of the debris could be transported by a large flood. Tree trunks with attached root masses are the dominant large debris type. Death of these trees is primarily the result of bank erosion. Bank instability seems to be the basin characteristic most useful in identifying basins with a high potential for abundant production of debris.

  10. The economics of clinical genetics services. III. Cognitive genetics services are not self-supporting.

    PubMed Central

    Bernhardt, B A; Pyeritz, R E

    1989-01-01

    We investigated the amount of time required to provide, and the charges and reimbursement for, cognitive genetics services in four clinical settings. In a prenatal diagnostic center, a mean of 3 h/couple was required to provide counseling and follow-up services with a mean charge of $30/h and collection of $27/h. Only 49% of personnel costs were covered by income from patient charges. In a genetics clinic in a private specialty hospital, 5.5 and 2.75 h were required to provide cognitive services to each new and follow-up family, respectively. The mean charge for each new family was $25/h and for follow-up families $13/h. The amount collected was less than 25% of that charged. In a pediatric genetics clinic in a large teaching hospital, new families required a mean of 4 h and were charged $28/h; follow-up families also required a mean of 4 h, and were charged $15/h. Only 55% of the amounts charged were collected. Income from patient charges covered only 69% of personnel costs. In a genetics outreach setting, 5 and 4.5 h were required to serve new and follow-up families, respectively. Charges were $25/h and $12/h, and no monies were collected. In all clinic settings, less than one-half of the total service time was that of a physician, and more than one-half of the service time occurred before and after the clinic visit. In no clinic setting were cognitive genetics services self-supporting. Means to improve the financial base of cognitive genetics services include improving collections, increasing charges, developing fee schedules, providing services more efficiently, and seeking state, federal, and foundation support for services. PMID:2912071

  11. Generation Expansion Planning With Large Amounts of Wind Power via Decision-Dependent Stochastic Programming

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhan, Yiduo; Zheng, Qipeng P.; Wang, Jianhui

    Power generation expansion planning needs to deal with future uncertainties carefully, given that the invested generation assets will be in operation for a long time. Many stochastic programming models have been proposed to tackle this challenge. However, most previous works assume predetermined future uncertainties (i.e., fixed random outcomes with given probabilities). In several recent studies of generation assets' planning (e.g., thermal versus renewable), new findings show that the investment decisions could affect the future uncertainties as well. To this end, this paper proposes a multistage decision-dependent stochastic optimization model for long-term large-scale generation expansion planning, where large amounts of wind power are involved. In the decision-dependent model, the future uncertainties are not only affecting but also affected by the current decisions. In particular, the probability distribution function is determined by not only input parameters but also decision variables. To deal with the nonlinear constraints in our model, a quasi-exact solution approach is then introduced to reformulate the multistage stochastic investment model to a mixed-integer linear programming model. The wind penetration, investment decisions, and the optimality of the decision-dependent model are evaluated in a series of multistage case studies. The results show that the proposed decision-dependent model provides effective optimization solutions for long-term generation expansion planning.
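
    The distinguishing feature can be written compactly. In a generic two-stage form (the notation here is generic, not the paper's exact multistage formulation), the scenario probabilities depend on the first-stage investment decision x rather than being fixed constants:

        \min_{x \in X} \; c^{\top} x \; + \sum_{\omega \in \Omega} p_{\omega}(x) \, Q(x, \omega)

    Here Q(x, ω) is the second-stage operating cost under scenario ω. In a conventional stochastic program, p_ω would not depend on x; making it decision-dependent introduces the nonlinearity that the quasi-exact mixed-integer linear reformulation described above is designed to handle.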

  12. Large-scale production and isolation of Candida biofilm extracellular matrix.

    PubMed

    Zarnowski, Robert; Sanchez, Hiram; Andes, David R

    2016-12-01

    The extracellular matrix of biofilm is unique to the biofilm lifestyle, and it has key roles in community survival. A complete understanding of the biochemical nature of the matrix is integral to the understanding of the roles of matrix components. This knowledge is a first step toward the development of novel therapeutics and diagnostics to address persistent biofilm infections. Many of the assay methods needed for refined matrix composition analysis require milligram amounts of material that is separated from the cellular components of these complex communities. The protocol described here explains the large-scale production and isolation of the Candida biofilm extracellular matrix. To our knowledge, the proposed procedure is the only currently available approach in the field that yields milligram amounts of biofilm matrix. This procedure first requires biofilms to be seeded in large-surface-area roller bottles, followed by cell adhesion and biofilm maturation during continuous movement of the medium across the surface of the rotating bottle. The formed matrix is then separated from the entire biomass using sonication, which efficiently removes the matrix without perturbing the fungal cell wall. Subsequent filtration, dialysis and lyophilization steps result in a purified matrix product sufficient for biochemical, structural and functional assays. The overall protocol takes ∼11 d to complete. This protocol has been used for Candida species, but, using the troubleshooting guide provided, it could be adapted for other fungi or bacteria.

  13. Genomics Portals: integrative web-platform for mining genomics data.

    PubMed

    Shinde, Kaustubh; Phatak, Mukta; Johannes, Freudenberg M; Chen, Jing; Li, Qian; Vineet, Joshi K; Hu, Zhen; Ghosh, Krishnendu; Meller, Jaroslaw; Medvedovic, Mario

    2010-01-13

    A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into the functioning of living systems. The Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc.), and the integration with an extensive knowledge base that can be used in such analysis. The integrated access to primary genomics data, functional knowledge and analytical tools makes the Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org.

  14. Genomics Portals: integrative web-platform for mining genomics data

    PubMed Central

    2010-01-01

    Background A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into the functioning of living systems. Results The Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc.), and the integration with an extensive knowledge base that can be used in such analysis. Conclusion The integrated access to primary genomics data, functional knowledge and analytical tools makes the Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org. PMID:20070909

  15. Diet - liver disease

    MedlinePlus

    ... of toxic waste products. Increasing your intake of carbohydrates to be in proportion with the amount of ... severe liver disease include: Eat large amounts of carbohydrate foods. Carbohydrates should be the major source of ...

  16. Finding Atmospheric Composition (AC) Metadata

    NASA Technical Reports Server (NTRS)

    Strub, Richard F.; Falke, Stefan; Fiakowski, Ed; Kempler, Steve; Lynnes, Chris; Goussev, Oleg

    2015-01-01

    The Atmospheric Composition Portal (ACP) is an aggregator and curator of information related to remotely sensed atmospheric composition data and analysis. It uses existing tools and technologies and, where needed, enhances those capabilities to provide interoperable access, tools, and contextual guidance for scientists and value-adding organizations using remotely sensed atmospheric composition data. The initial focus is on Essential Climate Variables identified by the Global Climate Observing System - CH4, CO, CO2, NO2, O3, SO2 and aerosols. This poster addresses our efforts in building the ACP Data Table, an interface to help discover and understand remotely sensed data that are related to atmospheric composition science and applications. We harvested the GCMD, CWIC, and GEOSS metadata catalogs using machine-to-machine technologies (OpenSearch, Web Services). We also manually investigated the plethora of CEOS data provider portals and other catalogs where the data might be aggregated. This poster presents our experience of the excellence, variety, and challenges we encountered. Conclusions: (1) The significant benefits that the major catalogs provide are their machine-to-machine tools, such as OpenSearch and Web Services, rather than any GUI usability improvements, owing to the large amount of data in their catalogs. (2) There is a trend at the large catalogs toward simulating small data provider portals through advanced services. (3) Populating metadata catalogs using ISO 19115 is too complex for users to do in a consistent way, difficult to parse visually or with XML libraries, and too complex for Java XML binders like CASTOR. (4) The ability to search for IDs first and then for data (GCMD and ECHO) is better for machine-to-machine operations than the timeouts experienced when returning the entire metadata entry at once. (5) Metadata harvest and export activities between the major catalogs have led to a significant amount of duplication. (This is currently being addressed.) (6) Most (if not all) Earth science atmospheric composition data providers store a reference to their data at GCMD.

  17. Automated Atmospheric Composition Dataset Level Metadata Discovery. Difficulties and Surprises

    NASA Astrophysics Data System (ADS)

    Strub, R. F.; Falke, S. R.; Kempler, S.; Fialkowski, E.; Goussev, O.; Lynnes, C.

    2015-12-01

    The Atmospheric Composition Portal (ACP) is an aggregator and curator of information related to remotely sensed atmospheric composition data and analysis. It uses existing tools and technologies and, where needed, enhances those capabilities to provide interoperable access, tools, and contextual guidance for scientists and value-adding organizations using remotely sensed atmospheric composition data. The initial focus is on Essential Climate Variables identified by the Global Climate Observing System - CH4, CO, CO2, NO2, O3, SO2 and aerosols. This poster addresses our efforts in building the ACP Data Table, an interface to help discover and understand remotely sensed data that are related to atmospheric composition science and applications. We harvested the GCMD, CWIC, and GEOSS metadata catalogs using machine-to-machine technologies (OpenSearch, Web Services). We also manually investigated the plethora of CEOS data provider portals and other catalogs where the data might be aggregated. This poster presents our experience of the excellence, variety, and challenges we encountered. Conclusions: (1) The significant benefits that the major catalogs provide are their machine-to-machine tools, such as OpenSearch and Web Services, rather than any GUI usability improvements, owing to the large amount of data in their catalogs. (2) There is a trend at the large catalogs toward simulating small data provider portals through advanced services. (3) Populating metadata catalogs using ISO 19115 is too complex for users to do in a consistent way, difficult to parse visually or with XML libraries, and too complex for Java XML binders like CASTOR. (4) The ability to search for IDs first and then for data (GCMD and ECHO) is better for machine-to-machine operations than the timeouts experienced when returning the entire metadata entry at once. (5) Metadata harvest and export activities between the major catalogs have led to a significant amount of duplication. (This is currently being addressed.) (6) Most (if not all) Earth science atmospheric composition data providers store a reference to their data at GCMD.

  18. Storage and Retrieval of Large RDF Graph Using Hadoop and MapReduce

    NASA Astrophysics Data System (ADS)

    Farhan Husain, Mohammad; Doshi, Pankil; Khan, Latifur; Thuraisingham, Bhavani

    Handling huge amounts of data scalably has long been a matter of concern, and the same is true for semantic web data. Current semantic web frameworks lack this ability. In this paper, we describe a framework that we built using Hadoop to store and retrieve large numbers of RDF triples. We describe our schema for storing RDF data in the Hadoop Distributed File System. We also present our algorithms to answer a SPARQL query, making use of Hadoop's MapReduce framework to actually answer the queries. Our results reveal that we can store huge amounts of semantic web data in Hadoop clusters built mostly from cheap commodity-class hardware and still answer queries fast enough. We conclude that ours is a scalable framework, able to handle large amounts of RDF data efficiently.
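
    In the Hadoop Streaming style such a framework can rely on, a single SPARQL triple pattern reduces to a filter over stored N-Triples lines; the sketch below is a hedged illustration with placeholder URIs, not the authors' actual schema or query algorithm:

        import sys

        # Hadoop Streaming mapper: emit the subject of every stored triple
        # matching one fixed predicate/object pattern (i.e., one SPARQL
        # triple pattern). Predicate and object URIs are placeholders.
        PRED = "<http://example.org/type>"
        OBJ = "<http://example.org/Person>"

        for line in sys.stdin:
            parts = line.rstrip(" .\n").split(None, 2)
            if len(parts) == 3 and parts[1] == PRED and parts[2] == OBJ:
                print(parts[0])

    Multi-pattern queries would then join the emitted bindings in the reduce phase, which is where a MapReduce-based engine does the bulk of its work.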

  19. Investigating uplift in the South-Western Barents Sea using sonic and density well log measurements

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Ellis, M.

    2014-12-01

    Sediments in the Barents Sea have undergone large amounts of uplift due to Plio-Pleistocene deglaciation as well as Palaeocene-Eocene Atlantic rifting. Uplift affects reservoir quality, seal capacity and fluid migration, so reliable uplift estimates are important for properly evaluating petroleum prospectivity. To this end, a number of quantification methods have been proposed, such as Apatite Fission Track Analysis (AFTA) and the integration of seismic surveys with well log data. AFTA usually provides accurate uplift estimates, but the data are limited because of its high cost. Seismic surveys can provide good uplift estimates when well data are available for calibration, but the uncertainty can be large in areas with little or no well data. We estimated South-Western Barents Sea uplift based on well data from the Norwegian Petroleum Directorate. Primary assumptions include time-irreversible shale compaction trends and a universal normal compaction trend for a specified formation. Sonic and density logs from two Cenozoic shale formation intervals, Kolmule and Kolje, were used for the study. For each formation, we studied logs of all released wells and established exponential normal compaction trends based on a single well. That well was then deemed the reference well, and relative uplift was calculated at other well locations from the offset from the normal compaction trend. We found that the amount of uplift increases from SW to NE, with a maximum difference of 1,447 m from the Kolje FM estimate and 699 m from the Kolmule FM estimate. The average standard deviation of the estimated uplift is 130 m for the Kolje FM and 160 m for the Kolmule FM using the density log. While results from density logs and sonic logs agree well in general, the density log provides slightly better results in terms of higher consistency and lower standard deviation. Our results agree qualitatively with published papers, with some differences in the actual amounts of uplift, and are considered more accurate because of the higher resolution of the log-scale data used.
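
    The estimation step admits a compact sketch: fit an exponential normal compaction trend at the reference well, then take uplift at another well as the depth shift needed to move its observations back onto that trend. The trend form and parameter values below are placeholders, not the study's fitted values:

        import numpy as np

        # Assumed exponential normal compaction trend for sonic transit time:
        #   dt(z) = dt_m + (dt0 - dt_m) * exp(-c * z)
        # dt0 (surface) and dt_m (matrix) in us/ft; c in 1/m. Placeholders.
        DT0, DT_M, C = 180.0, 60.0, 5e-4

        def uplift_m(dt_obs, z_obs_m):
            """Equivalent (maximum-burial) depth minus present depth;
            a positive value indicates net uplift."""
            z_eq = -np.log((dt_obs - DT_M) / (DT0 - DT_M)) / C
            return z_eq - z_obs_m

        print(uplift_m(dt_obs=100.0, z_obs_m=1200.0))  # ~1000 m of uplift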

  20. Ensemble Simulation of the Atmospheric Radionuclides Discharged by the Fukushima Nuclear Accident

    NASA Astrophysics Data System (ADS)

    Sekiyama, Thomas; Kajino, Mizuo; Kunii, Masaru

    2013-04-01

    Enormous amounts of radionuclides were discharged into the atmosphere by the nuclear accident at the Fukushima Daiichi nuclear power plant (FDNPP) after the earthquake and tsunami on 11 March 2011. The radionuclides were dispersed from the power plant and deposited mainly over eastern Japan and the North Pacific Ocean. Many numerical simulations of the radionuclide dispersion and deposition have been attempted since the nuclear accident. However, none of them were able to perfectly simulate the distribution of dose rates observed after the accident over eastern Japan. This was partly due to errors in the wind vectors and precipitation used in the numerical simulations; unfortunately, these deterministic simulations could not address the probability distribution of the simulation results and errors. Therefore, an ensemble simulation of the atmospheric radionuclides was performed using the ensemble Kalman filter (EnKF) data assimilation system coupled with the Japan Meteorological Agency (JMA) non-hydrostatic mesoscale model (NHM); this mesoscale model has been used operationally for daily weather forecasts by JMA. Meteorological observations were provided to the EnKF data assimilation system from the JMA operational-weather-forecast dataset. Through this ensemble data assimilation, twenty members of the meteorological analysis over eastern Japan from 11 to 31 March 2011 were successfully obtained. Using these meteorological ensemble analysis members, the radionuclide behavior in the atmosphere, such as advection, convection, diffusion, dry deposition, and wet deposition, was simulated. This ensemble simulation provided multiple results for the radionuclide dispersion and distribution. Because a large ensemble deviation indicates low accuracy of the numerical simulation, probabilistic information is obtainable from the ensemble simulation results. For example, the uncertainty of precipitation triggered the uncertainty of wet deposition, which in turn triggered the uncertainty of atmospheric radionuclide amounts. The remaining radionuclides were then transported downwind; consequently, the uncertainty signal of the radionuclide amounts propagated downwind. This signal propagation was seen in the ensemble simulation by tracking the areas of large deviation in radionuclide concentration and deposition. These statistics can provide information useful for the probabilistic prediction of radionuclides.
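    The probabilistic use of ensemble spread described above can be illustrated with a minimal sketch: given the fields produced by the twenty members, the per-grid-cell standard deviation flags where the simulation is least trustworthy. The array shapes and values below are stand-ins, not the study's data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Stand-in for 20 ensemble members of a simulated deposition field on a
    # small grid; in the study, each member is driven by one EnKF
    # meteorological analysis member.
    n_members, ny, nx = 20, 50, 50
    members = rng.lognormal(mean=0.0, sigma=0.5, size=(n_members, ny, nx))

    ens_mean = members.mean(axis=0)           # best estimate of deposition
    ens_std = members.std(axis=0, ddof=1)     # spread = simulation uncertainty

    # Relative spread highlights where the simulation is least trustworthy,
    # e.g. downwind of uncertain precipitation (wet deposition).
    relative_spread = ens_std / ens_mean
    print("grid cells with spread > 50% of the mean:",
          int((relative_spread > 0.5).sum()))
    ```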

  1. The Ethics of Paid Plasma Donation: A Plea for Patient Centeredness.

    PubMed

    Farrugia, Albert; Penrod, Joshua; Bult, Jan M

    2015-12-01

    Plasma protein therapies (PPTs) are a group of essential medicines extracted from human plasma through processes of industrial-scale fractionation. They are used primarily to treat a number of rare, chronic disorders ensuing from inherited or acquired deficiencies of a number of physiologically essential proteins. These disorders include hemophilia A and B, different immunodeficiencies, and alpha 1-antitrypsin deficiency. In addition, acute blood loss, burns and sepsis are treated by PPTs. Hence, a population of vulnerable and very sick individuals is dependent on these products. In addition, the continued well-being of large sections of the community, including pregnant women and their children, travelers, and workers exposed to infectious risk, is also subject to the availability of these therapies. Manufacturing them in adequate amounts requires large volumes of human plasma as the starting material of a complex purification process. Mainstream blood transfusion services, run primarily by the not-for-profit sector, have attempted to provide this plasma through the separation of blood donations, but have failed to provide sufficient amounts to meet the clinical demand. The collection of plasma from donors willing to commit to the process of plasmapheresis, which is not only time consuming but requires a long-term, continuing commitment, generates much higher amounts of plasma and has been an activity historically separate from the blood transfusion sector and run by commercial companies. These companies now supply two-thirds of the growing global need for these therapies, while the mainstream government-run blood sector continues to supply a shrinking proportion. The private-sector plasmapheresis activity which provides the bulk of treatment products has been compensating donors in order to recognize the time and effort required. Recent activities have reignited the debate regarding the ethical and medical aspects of such compensation. In this work, we review the landscape, assess the contributions made by the compensated and non-compensated sectors, and synthesize the likely outcomes for the relevant patient communities of perturbing the current paradigm of compensated plasma donation. We conclude that the current era of "Patient Centeredness" in health care demands the continuation and extension of paid plasma donation.

  2. Stopover decision during migration: physiological conditions predict nocturnal restlessness in wild passerines.

    PubMed

    Fusani, Leonida; Cardinale, Massimiliano; Carere, Claudio; Goymann, Wolfgang

    2009-06-23

    During migration, a number of bird species rely on stopover sites for resting and feeding before and after crossing ecological barriers such as deserts or seas. The duration of a stopover depends on the combined effects of environmental factors, endogenous programmes and physiological conditions. Previous studies indicated that lean birds prolong their refuelling stopover compared with fat birds; however, the quantitative relationship between physiological conditions and stopover behaviour has not been studied yet. Here, we tested in a large sample of free-living birds of three European passerines (whinchats, Saxicola rubetra, garden warblers, Sylvia borin and whitethroats, Sylvia communis) whether the amount of migratory restlessness (Zugunruhe) shown at a stopover site depends on physiological conditions. An integrated measure of condition based on body mass, amount of subcutaneous fat and thickness of pectoral muscles strongly predicted the intensity of Zugunruhe shown in recording cages in the night following capture. These results provide novel and robust quantitative evidence in support of the hypothesis that the amount of energy reserves plays a major role in determining the stopover duration in migratory birds.

  3. Stopover decision during migration: physiological conditions predict nocturnal restlessness in wild passerines

    PubMed Central

    Fusani, Leonida; Cardinale, Massimiliano; Carere, Claudio; Goymann, Wolfgang

    2009-01-01

    During migration, a number of bird species rely on stopover sites for resting and feeding before and after crossing ecological barriers such as deserts or seas. The duration of a stopover depends on the combined effects of environmental factors, endogenous programmes and physiological conditions. Previous studies indicated that lean birds prolong their refuelling stopover compared with fat birds; however, the quantitative relationship between physiological conditions and stopover behaviour has not been studied yet. Here, we tested in a large sample of free-living birds of three European passerines (whinchats, Saxicola rubetra, garden warblers, Sylvia borin and whitethroats, Sylvia communis) whether the amount of migratory restlessness (Zugunruhe) shown at a stopover site depends on physiological conditions. An integrated measure of condition based on body mass, amount of subcutaneous fat and thickness of pectoral muscles strongly predicted the intensity of Zugunruhe shown in recording cages in the night following capture. These results provide novel and robust quantitative evidence in support of the hypothesis that the amount of energy reserves plays a major role in determining the stopover duration in migratory birds. PMID:19324648

  4. Artificial selection reveals the energetic expense of producing larger eggs.

    PubMed

    Pick, Joel L; Hutter, Pascale; Ebneter, Christina; Ziegler, Ann-Kathrin; Giordano, Marta; Tschirren, Barbara

    2016-01-01

    The amount of resources provided by the mother before birth has important and long-lasting effects on offspring fitness. Despite this, there is a large amount of variation in maternal investment seen in natural populations. Life-history theory predicts that this variation is maintained through a trade-off between the benefits of high maternal investment for the offspring and the costs of high investment for the mother. However, the proximate mechanisms underlying these costs of reproduction are not well understood. Here we used artificial selection for high and low maternal egg investment in a precocial bird, the Japanese quail (Coturnix japonica), to quantify the costs of maternal reproductive investment. We show that females from the high maternal investment lines had significantly larger reproductive organs, which explained their overall larger body mass, and resulted in a higher resting metabolic rate (RMR). Contrary to our expectations, this increase in metabolic activity did not lead to a higher level of oxidative damage. This study is the first to provide experimental evidence for metabolic costs of increased per-offspring investment.

  5. Robotic liquid handling and automation in epigenetics.

    PubMed

    Gaisford, Wendy

    2012-10-01

    Automated liquid-handling robots and high-throughput screening (HTS) are widely used in the pharmaceutical industry for the screening of large compound libraries, small molecules for activity against disease-relevant target pathways, or proteins. HTS robots capable of low-volume dispensing reduce assay setup times and provide highly accurate and reproducible dispensing, minimizing variation between sample replicates and eliminating the potential for manual error. Low-volume automated nanoliter dispensers ensure accuracy of pipetting within volume ranges that are difficult to achieve manually. In addition, they have the ability to potentially expand the range of screening conditions from often limited amounts of valuable sample, as well as reduce the usage of expensive reagents. The ability to accurately dispense lower volumes provides the potential to achieve a greater amount of information than could be otherwise achieved using manual dispensing technology. With the emergence of the field of epigenetics, an increasing number of drug discovery companies are beginning to screen compound libraries against a range of epigenetic targets. This review discusses the potential for the use of low-volume liquid-handling robots for molecular biological applications such as quantitative PCR and epigenetics.

  6. Effects of low-density feeding on elk–fetus contact rates on Wyoming feedgrounds

    USGS Publications Warehouse

    Creech, Tyler G.; Cross, Paul C.; Scurlock, Brandon M.; Maichak, Eric J.; Rogerson, Jared D.; Henningsen, John C.; Creel, Scott

    2012-01-01

    High seroprevalence of Brucella abortus among elk on Wyoming feedgrounds suggests that supplemental feeding may influence parasite transmission and disease dynamics by altering the rate at which elk contact infectious materials in their environment. We used proximity loggers and video cameras to estimate rates of elk-to-fetus contact (the primary source of brucellosis transmission) during winter supplemental feeding. We compared contact rates during high-density and low-density (LD) feeding treatments that provided the same total amount of food distributed over different areas. Low-density feeding led to >70% reductions in the total number of contacts and the number of individuals contacting a fetus. Proximity loggers and video cameras provided similar estimates of elk–fetus contact rates. Elk contacted fetuses and random control points equally, suggesting that elk were not attracted to fetuses but encountered them incidentally while feeding. The modeled relationship between contact rate and disease prevalence is nonlinear, and LD feeding may result in large reductions in brucellosis prevalence, but this depends on the amount of transmission that occurs on and off feedgrounds.

  7. The use of dual mode thermionic reactors in supporting Earth orbital and space exploration missions

    NASA Astrophysics Data System (ADS)

    Zubrin, Robert M.; Sulmeisters, Tal K.

    1993-01-01

    Missions requiring large amounts of electric power to support their payload functions can be enabled through the employment of nuclear electric power reactors, which in some cases can also assist the mission by making possible the employment of high specific impulse electric propulsion. However, it is found that the practicality and versatility of using a power reactor to provide advanced propulsion is enormously enhanced if the reactor is configured in such a way as to allow it to generate a certain amount of direct thrust as well. The use of such a system allows the creation of a common bus upper stage that can provide both high power and high impulse (with short orbit transfer times). It is shown that such a system, termed an Integral Power and Propulsion Stage (IPAPS), is optimal for supporting many Earth, lunar, planetary and asteroidal observation, exploration, and communication support missions, and it is therefore recommended that the nuclear power reactor ultimately selected by the government for development and production be one that can be configured for such a function.

  8. Foundations for context-aware information retrieval for proactive decision support

    NASA Astrophysics Data System (ADS)

    Mittu, Ranjeev; Lin, Jessica; Li, Qingzhe; Gao, Yifeng; Rangwala, Huzefa; Shargo, Peter; Robinson, Joshua; Rose, Carolyn; Tunison, Paul; Turek, Matt; Thomas, Stephen; Hanselman, Phil

    2016-05-01

    Intelligence analysts and military decision makers are faced with an onslaught of information. From the now ubiquitous presence of intelligence, surveillance, and reconnaissance (ISR) platforms providing large volumes of sensor data, to vast amounts of open source data in the form of news reports, blog postings, or social media postings, the amount of information available to a modern decision maker is staggering. Whether tasked with leading a military campaign or providing support for a humanitarian mission, being able to make sense of all the information available is a challenge. Due to the volume and velocity of this data, automated tools are required to help support reasoned, human decisions. In this paper we describe several automated techniques that are targeted at supporting decision making. Our approaches include modeling the kinematics of moving targets as motifs; developing normalcy models and detecting anomalies in kinematic data; automatically classifying the roles of users in social media; and modeling geo-spatial regions based on the behavior that takes place in them. These techniques cover a wide range of potential decision maker needs.

  9. Land cover change monitoring within the east central Louisiana study site: A case for large area surveys with LANDSAT multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Burns, G. S.

    1983-01-01

    Results established for four digital procedures developed for characterizing the radiometric changes between multidate LANDSAT spectral data sets into meaningful measures of land cover/use dynamics are documented. Each technique's performance was contrasted against digitized land use change maps, which were produced from contemporaneous, retrospective aerophoto coverage, in a cell-by-cell comparison over a one-half by one degree area in east central Louisiana as a standard for comparison. The four techniques identify a 10.5 to 13.0% loss in forestland area over a five-year period; however, they differ more in how accurately this amount of change is distributed, the need for ancillary ground truth, and the amount of usable information that is extractable. All require some method of digitally co-registering the two data sets. All are capable of providing tabular statistics as well as map products. Two are capable of detecting changes and identifying their locations. The other two, in addition to this, provide information to qualify land cover conditions at each end of the study interval.

  10. Raytheon RSP2 Cryocooler Low Temperature Testing and Design Enhancements

    NASA Astrophysics Data System (ADS)

    Hon, R. C.; Kirkconnell, C. S.; Shrago, J. A.

    2010-04-01

    The High Capacity Raytheon Stirling/Pulse Tube Hybrid 2-Stage cryocooler (HC-RSP2) was originally developed to provide simultaneous cooling at temperatures of 85 K and 35 K. During testing performed in 2008 it was demonstrated that this stock-configuration cryocooler is capable of providing significant amounts of heat lift at 2nd stage temperatures as low as 12 K, and modeling indicated that minor changes to the 2nd stage inertance tube/surge volume setup could yield improved performance. These changes were implemented and the cooler was successfully retested, producing >350 mW of heat lift at 12 K. A comprehensive redesign of the system has been performed, the result of which is a robust 2-stage cryocooler system that is intended to efficiently produce relatively large amounts of cooling at 2nd stage temperatures <12 K. This cryocooler, called the Low Temperature RSP2 (LT-RSP2) will be fabricated and tested over the next 12 months. This paper reports on the recently-completed test activities, as well as details relating to the system redesign. Expected performance, mass and packaging volume are addressed.

  11. A thermal and chemical degradation approach to decipher pristane and phytane precursors in sedimentary organic matter

    USGS Publications Warehouse

    Koopmans, M.P.; Rijpstra, W.I.C.; Klapwijk, M.M.; De Leeuw, J. W.; Lewan, M.D.; Sinninghe, Damste J.S.

    1999-01-01

    A thermal and chemical degradation approach was followed to determine the precursors of pristane (Pr) and phytane (Ph) in samples from the Gessoso-solfifera, Ghareb and Green River Formations. Hydrous pyrolysis of these samples yields large amounts of Pr and Ph carbon skeletons, indicating that their precursors are predominantly sequestered in high-molecular-weight fractions. However, chemical degradation of the polar fraction and the kerogen of the unheated samples generally does not release large amounts of Pr and Ph. Additional information on the precursors of Pr and Ph is obtained from flash pyrolysis analyses of kerogens and residues after hydrous pyrolysis and after chemical degradation. Multiple precursors for Pr and Ph are recognised in these three samples. The main increase of the Pr/Ph ratio with increasing maturation temperature, which is associated with strongly increasing amounts of Pr and Ph, is probably due to the higher amount of precursors of Pr compared to Ph, and not to the different timing of generation of Pr and Ph.

  12. Spontaneous, generalized lipidosis in captive greater horseshoe bats (Rhinolophus ferrumequinum).

    PubMed

    Gozalo, Alfonso S; Schwiebert, Rebecca S; Metzner, Walter; Lawson, Gregory W

    2005-11-01

    During a routine 6-month quarantine period, 3 of 34 greater horseshoe bats (Rhinolophus ferrumequinum) captured in mainland China and transported to the United States for use in echolocation studies were found dead with no prior history of illness. All animals were in good body condition at the time of death. At necropsy, a large amount of white fat was found within the subcutis, especially in the sacrolumbar region. The liver, kidneys, and heart were diffusely tan in color. Microscopic examination revealed that hepatocytes throughout the liver were filled with lipid, and in some areas, lipid granulomas were present. Renal lesions included moderate amounts of lipid in the cortical tubular epithelium and large amounts of protein and lipid within Bowman's capsules in the glomeruli. In addition, one bat had large lipid vacuoles diffusely distributed throughout the myocardium. The exact pathologic mechanism inducing the hepatic, renal, and cardiac lipidosis is unknown. The horseshoe bats were captured during hibernation and immediately transported to the United States. It is possible that the large amount of fat stored coupled with changes in photoperiod, lack of exercise, and/or the stress of captivity might have contributed to altering the normal metabolic processes, leading to anorexia and consequently lipidosis in these animals.

  13. Method for large-scale fabrication of atomic-scale structures on material surfaces using surface vacancies

    DOEpatents

    Lim, Chong Wee; Ohmori, Kenji; Petrov, Ivan Georgiev; Greene, Joseph E.

    2004-07-13

    A method for forming atomic-scale structures on a surface of a substrate on a large-scale includes creating a predetermined amount of surface vacancies on the surface of the substrate by removing an amount of atoms on the surface of the material corresponding to the predetermined amount of the surface vacancies. Once the surface vacancies have been created, atoms of a desired structure material are deposited on the surface of the substrate to enable the surface vacancies and the atoms of the structure material to interact. The interaction causes the atoms of the structure material to form the atomic-scale structures.

  14. Expert system shell to reason on large amounts of data

    NASA Technical Reports Server (NTRS)

    Giuffrida, Gionanni

    1994-01-01

    Current database management systems (DBMSs) do not provide a sophisticated environment for developing rule-based expert system applications. Some of the new DBMSs come with some sort of rule mechanism; these are active and deductive database systems. However, neither is fully featured enough to support complete rule-based implementations. On the other hand, current expert system shells do not provide any link to external databases. That is, all the data are kept in the system working memory, and such working memory is maintained in main memory. For some applications the limited size of the available working memory can constrain development; typically these are applications which require reasoning over huge amounts of data, all of which do not fit into the computer's main memory. Moreover, in some cases these data may already be available in database systems and continuously updated while the expert system is running. This paper proposes an architecture which employs knowledge-discovery techniques to reduce the amount of data to be stored in main memory; in this architecture a standard DBMS is coupled with a rule-based language. The data are stored in the DBMS. An interface between the two systems is responsible for inducing knowledge from the set of relations. Such induced knowledge is then transferred to the rule-based language's working memory.
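    A minimal sketch of this kind of coupling (hypothetical names; sqlite3 standing in for the external DBMS) is shown below: rules carry a SQL condition, and the interface layer evaluates each condition against the database so that only matching facts, never the full relations, enter working memory. This illustrates the general idea, not the paper's specific architecture.

    ```python
    import sqlite3

    # A DBMS stands in for the expert system's external fact store; only
    # query results, not whole relations, ever enter working memory.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
    db.executemany("INSERT INTO readings VALUES (?, ?)",
                   [("temp", 104.0), ("temp", 98.5), ("pressure", 2.1)])

    # Rules are (name, SQL condition, action); the interface layer evaluates
    # the condition against the DBMS and fires the action on matching rows.
    RULES = [
        ("high-temp-alarm",
         "SELECT sensor, value FROM readings WHERE sensor='temp' AND value > 100",
         lambda row: print(f"ALARM: {row[0]} = {row[1]}")),
    ]

    def run_rules(db, rules):
        for name, condition, action in rules:
            for row in db.execute(condition):   # fetch only matching facts
                action(row)

    run_rules(db, RULES)   # prints: ALARM: temp = 104.0
    ```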

  15. A microfluidic system with integrated molecular imprinting polymer films for surface plasmon resonance detection

    NASA Astrophysics Data System (ADS)

    Huang, Shih-Chiang; Lee, Gwo-Bin; Chien, Fan-Ching; Chen, Shean-Jen; Chen, Wen-Janq; Yang, Ming-Chang

    2006-07-01

    This paper presents a novel microfluidic system with integrated molecular imprinting polymer (MIP) films designed for surface plasmon resonance (SPR) biosensing of multiple nanoscale biomolecules. The innovative microfluidic chip uses pneumatic microvalves and micropumps to transport a precise amount of the biosample through multiple microchannels to sensing regions containing the locally spin-coated MIP films. The signals of SPR biosensing are basically proportional to the number of molecules adsorbed on the MIP films. Hence, a precise control of flow rates inside microchannels is important to determine the adsorption amount of the molecules in the SPR/MIP chips. The integration of micropumps and microvalves can automate the sample introduction process and precisely control the amount of the sample injection to the microfluidic system. The proposed biochip enables the label-free biosensing of biomolecules in an automatic format, and provides a highly sensitive, highly specific and high-throughput detection performance. Three samples, i.e. progesterone, cholesterol and testosterone, are successfully detected using the developed system. The experimental results show that the proposed SPR/MIP microfluidic chip provides a comparable sensitivity to that of large-scale SPR techniques, but with reduced sample consumption and an automatic format. As such, the developed biochip has significant potential for a wide variety of nanoscale biosensing applications. The preliminary results of the current paper were presented at Transducers 2005, Seoul, Korea, 5-9 June 2005.

  16. Solar array flight experiment

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Emerging satellite designs require increasing amounts of electrical power to operate spacecraft instruments and to provide environments suitable for human habitation. In the past, electrical power was generated by covering rigid honeycomb panels with solar cells. This technology results in unacceptable weight and volume penalties when large amounts of power are required. To fill the need for large-area, lightweight solar arrays, a fabrication technique in which solar cells are attached to a copper printed circuit laminated to a plastic sheet was developed. The result is a flexible solar array with one-tenth the stowed volume and one-third the weight of comparably sized rigid arrays. An automated welding process developed to attach the cells to the printed circuit guarantees repeatable welds that are more tolerant of severe environments than conventional soldered connections. To demonstrate the flight readiness of this technology, the Solar Array Flight Experiment (SAFE) was developed and flown on the space shuttle Discovery in September 1984. The tests showed the modes and frequencies of the array to be very close to preflight predictions. Structural damping, however, was higher than anticipated. Electrical performance of the active solar panel was also tested. The flight performance and postflight data evaluation are described.

  17. Parallel hyperspectral compressive sensing method on GPU

    NASA Astrophysics Data System (ADS)

    Bernabé, Sergio; Martín, Gabriel; Nascimento, José M. P.

    2015-10-01

    Remote hyperspectral sensors collect large amounts of data per flight, usually with low spatial resolution. It is known that the bandwidth of the connection between the satellite/airborne platform and the ground station is limited, so an onboard compression method is desirable to reduce the amount of data to be transmitted. This paper presents a parallel implementation of a compressive sensing method, called parallel hyperspectral coded aperture (P-HYCA), for graphics processing units (GPUs) using the compute unified device architecture (CUDA). This method takes into account two main properties of hyperspectral datasets, namely the high correlation existing among the spectral bands and the generally low number of endmembers needed to explain the data, which largely reduces the number of measurements necessary to correctly reconstruct the original data. Experimental results conducted using synthetic and real hyperspectral datasets on two different GPU architectures by NVIDIA, GeForce GTX 590 and GeForce GTX TITAN, reveal that the use of GPUs can provide real-time compressive sensing performance. The achieved speedup is up to 20 times when compared with the processing time of HYCA running on one core of the Intel i7-2600 CPU (3.4 GHz), with 16 Gbytes of memory.
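    The key premise, that hyperspectral cubes are well approximated by a few endmembers and so need far fewer measurements than bands, can be illustrated with a small NumPy sketch. This is a generic low-rank compressive-sensing toy on synthetic data, not the authors' P-HYCA/CUDA implementation; all names are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy hyperspectral cube: 100 pixels x 50 bands generated from 3
    # endmembers, so the data matrix has rank 3.
    n_pix, n_bands, n_end = 100, 50, 3
    E = rng.random((n_bands, n_end))            # endmember signatures
    A = rng.dirichlet(np.ones(n_end), n_pix)    # per-pixel abundances
    X = A @ E.T                                 # pixels x bands

    # Compressive measurements: each pixel is projected to m << n_bands values.
    m = 10
    Phi = rng.normal(size=(m, n_bands)) / np.sqrt(m)
    Y = X @ Phi.T

    # If the spectral subspace (here, E) is known or estimated, reconstruction
    # reduces to a small least-squares problem per pixel.
    A_hat = np.linalg.lstsq(Phi @ E, Y.T, rcond=None)[0].T
    X_hat = A_hat @ E.T
    print("relative error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
    ```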

  18. Distributed memory parallel Markov random fields using graph partitioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heinemann, C.; Perciano, T.; Ushizima, D.

    Markov random field (MRF) based algorithms have attracted a large amount of interest in image analysis due to their ability to exploit contextual information about data. Image data generated by experimental facilities, though, continue to grow larger and more complex, making them more difficult to analyze in a reasonable amount of time. Applying image processing algorithms to large datasets requires alternative approaches to circumvent performance problems. Aiming to provide scientists with a new tool to recover valuable information from such datasets, we developed a general-purpose distributed memory parallel MRF-based image analysis framework (MPI-PMRF). MPI-PMRF overcomes performance and memory limitations by distributing data and computations across processors. The proposed approach was successfully tested with synthetic and experimental datasets. Additionally, the performance of the MPI-PMRF framework is analyzed through a detailed scalability study. We show that a performance increase is obtained while maintaining a segmentation accuracy higher than 98%. The contributions of this paper are: (a) development of a distributed memory MRF framework; (b) measurement of the performance increase of the proposed approach; (c) verification of segmentation accuracy in both synthetic and experimental, real-world datasets.
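    The partitioning idea can be sketched in pure Python: split the image into row blocks with one-row halos (the boundary data MPI ranks would exchange), run an MRF update on each block in a separate process, and stitch the results. The sketch below uses a process pool and a single iterated-conditional-modes (ICM) sweep on a binary denoising model as a stand-in for the paper's MPI framework and MRF formulation; it is illustrative only.

    ```python
    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    def icm_sweep(args):
        """One ICM sweep of a binary MRF over a row block padded with halo rows."""
        labels, observed, beta = args
        out = labels.copy()
        h, w = labels.shape
        for i in range(1, h - 1):          # interior rows only; halos are fixed
            for j in range(w):
                nbrs = [labels[i - 1, j], labels[i + 1, j]]
                if j > 0:
                    nbrs.append(labels[i, j - 1])
                if j < w - 1:
                    nbrs.append(labels[i, j + 1])
                energies = [(x != observed[i, j]) +
                            beta * sum(x != n for n in nbrs) for x in (0, 1)]
                out[i, j] = int(np.argmin(energies))
        return out[1:-1]                   # strip halo rows

    def parallel_icm(observed, beta=1.5, n_blocks=4):
        labels = observed.copy()
        padded_l = np.pad(labels, ((1, 1), (0, 0)), mode="edge")
        padded_o = np.pad(observed, ((1, 1), (0, 0)), mode="edge")
        bounds = np.linspace(0, labels.shape[0], n_blocks + 1, dtype=int)
        # Each block carries one halo row above and below its own rows.
        tasks = [(padded_l[a:b + 2], padded_o[a:b + 2], beta)
                 for a, b in zip(bounds[:-1], bounds[1:])]
        with ProcessPoolExecutor() as pool:
            blocks = list(pool.map(icm_sweep, tasks))
        return np.vstack(blocks)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        clean = np.zeros((64, 64), dtype=int)
        clean[16:48, 16:48] = 1
        noisy = np.where(rng.random(clean.shape) < 0.1, 1 - clean, clean)
        restored = parallel_icm(noisy)
        print("noisy errors:   ", int((noisy != clean).sum()))
        print("restored errors:", int((restored != clean).sum()))
    ```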

  19. Multimodal Word Meaning Induction From Minimal Exposure to Natural Text.

    PubMed

    Lazaridou, Angeliki; Marelli, Marco; Baroni, Marco

    2017-04-01

    By the time they reach early adulthood, English speakers are familiar with the meaning of thousands of words. In the last decades, computational simulations known as distributional semantic models (DSMs) have demonstrated that it is possible to induce word meaning representations solely from word co-occurrence statistics extracted from a large amount of text. However, while these models learn in batch mode from large corpora, human word learning proceeds incrementally after minimal exposure to new words. In this study, we run a set of experiments investigating whether minimal distributional evidence from very short passages suffices to trigger successful word learning in subjects, testing their linguistic and visual intuitions about the concepts associated with new words. After confirming that subjects are indeed very efficient distributional learners even from small amounts of evidence, we test a DSM on the same multimodal task, finding that it behaves in a remarkably human-like way. We conclude that DSMs provide a convincing computational account of word learning even at the early stages in which a word is first encountered, and the way they build meaning representations can offer new insights into human language acquisition. Copyright © 2017 Cognitive Science Society, Inc.
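    A minimal sketch of the batch DSM idea the abstract starts from: represent each word by its co-occurrence counts within a small window and compare words by cosine similarity. This is the generic count-based construction on a toy corpus, not the specific multimodal model evaluated in the study.

    ```python
    import numpy as np

    corpus = ("the cat chased the mouse . the dog chased the cat . "
              "the mouse ate cheese . the dog ate a bone .").split()

    vocab = sorted(set(corpus))
    idx = {w: i for i, w in enumerate(vocab)}

    # Co-occurrence counts within a symmetric window of 2 tokens.
    window = 2
    M = np.zeros((len(vocab), len(vocab)))
    for i, w in enumerate(corpus):
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if i != j:
                M[idx[w], idx[corpus[j]]] += 1

    def cosine(u, v):
        return u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)

    # Words that appear in similar contexts get similar vectors.
    print("cat~dog:   ", round(cosine(M[idx["cat"]], M[idx["dog"]]), 3))
    print("cat~cheese:", round(cosine(M[idx["cat"]], M[idx["cheese"]]), 3))
    ```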

  20. OpenElectrophy: An Electrophysiological Data- and Analysis-Sharing Framework

    PubMed Central

    Garcia, Samuel; Fourcaud-Trocmé, Nicolas

    2008-01-01

    Progress in experimental tools and design is allowing the acquisition of increasingly large datasets. Storage, manipulation and efficient analysis of such large amounts of data are now a primary issue. We present OpenElectrophy, an electrophysiological data- and analysis-sharing framework developed to fill this niche. It stores all experiment data and meta-data in a single central MySQL database, and provides a graphic user interface to visualize and explore the data, and a library of functions for user analysis scripting in Python. It implements multiple spike-sorting methods, and oscillation detection based on the ridge extraction methods due to Roux et al. (2007). OpenElectrophy is open source and is freely available for download at http://neuralensemble.org/trac/OpenElectrophy. PMID:19521545

  1. Are Social Networking Sites Making Health Behavior Change Interventions More Effective? A Meta-Analytic Review.

    PubMed

    Yang, Qinghua

    2017-03-01

    The increasing popularity of social networking sites (SNSs) has drawn scholarly attention in recent years, and a large amount of effort has been made in applying SNSs to health behavior change interventions. However, these interventions showed mixed results, with effect sizes (Cohen's d) ranging widely from -1.17 to 1.28. To provide a better understanding of the effectiveness of SNS-based interventions, a meta-analysis of 21 studies examining the effects of health interventions using SNSs was conducted. Results indicated that health behavior change interventions using SNSs are effective in general, but the effects were moderated by health topic, methodological features, and participant features. Theoretical and practical implications of the findings are discussed.

  2. Effects of consumption of choline and lecithin on neurological and cardiovascular systems.

    PubMed

    Wood, J L; Allison, R G

    1982-12-01

    This report concerns possible adverse health effects and benefits that might result from consumption of large amounts of choline, lecithin, or phosphatidylcholine. Indications from preliminary investigations that administration of choline or lecithin might alleviate some neurological disturbances, prevent hypercholesteremia and atherosclerosis, and restore memory and cognition have resulted in much research and public interest. Symptoms of tardive dyskinesia and Alzheimer's disease have been ameliorated in some patients, and varied responses have been observed in the treatment of Gilles de la Tourette's disease, Friedreich's ataxia, levodopa-induced dyskinesia, mania, Huntington's disease, and myasthenic syndrome. Further clinical trials, especially in conjunction with cholinergic drugs, are considered worthwhile but will require sufficient amounts of pure phosphatidylcholine. The public has access to large amounts of commercial lecithin. Because high intakes of lecithin or choline produce acute gastrointestinal distress, sweating, salivation, and anorexia, it is improbable that individuals will incur lasting health hazards from self-administration of either compound. Development of depression or supersensitivity of dopamine receptors and disturbance of the cholinergic-dopaminergic-serotonergic balance is a concern with prolonged, repeated intakes of large amounts of lecithin.

  3. Grandmothers' productivity and the HIV/AIDS pandemic in sub-Saharan Africa.

    PubMed

    Bock, John; Johnson, Sara E

    2008-06-01

    The human immunodeficiency virus (HIV)/acquired immune deficiency syndrome (AIDS) pandemic has left large numbers of orphans in sub-Saharan Africa. Botswana has an HIV prevalence rate of approximately 40% in adults. Morbidity and mortality are high, and in a population of 1.3 million there are nearly 50,000 children who have lost one or both parents to HIV/AIDS. The extended family, particularly grandparents, absorbs much of the childrearing responsibility. This creates large amounts of additional work, especially for grandmothers. The embodied capital model and the grandmother hypothesis are both derived from life history theory within evolutionary ecology, and both predict that one important factor in the evolution of the human extended family structure is that postreproductive individuals such as grandmothers provide substantial support for their grandchildren's survival. Data collected in the pre-pandemic context in a traditional multi-ethnic community in the Okavango Delta of Botswana are analyzed to calculate the amount of work effort provided to a household by women of different ages. Results show that the contributions of older and younger women to the household, in terms of both productivity and childrearing, are qualitatively and quantitatively different. These results indicate that it is unrealistic to expect older women to be able to compensate for the loss of younger women's contributions to the household, and suggest that interventions be specifically designed to support older women based on the types of activities in which they engage that affect child survival, growth, and development.

  4. Uncertainty In Greenhouse Gas Emissions On Carbon Sequestration In Coastal and Freshwater Wetlands of the Mississippi River Delta: A Subsiding Coastline as a Proxy for Future Global Sea Level

    NASA Astrophysics Data System (ADS)

    White, J. R.; DeLaune, R. D.; Roy, E. D.; Corstanje, R.

    2014-12-01

    The highly visible phenomenon of wetland loss in coastal Louisiana (LA) is examined through the prism of carbon accumulation, wetland loss, and greenhouse gas (GHG) emissions. The Mississippi River Deltaic region experiences higher relative sea level rise due to coupled subsidence and eustatic sea level rise, allowing this region to serve as a proxy for future projected global sea level rise. Carbon storage or sequestration in rapidly subsiding LA coastal marsh soils is estimated based on vertical marsh accretion and areal change data. While coastal marshes sequester significant amounts of carbon through vertical accretion, large amounts of carbon previously sequestered in the soil profile are lost through annual deterioration of these coastal marshes as well as through GHG emissions. Efforts are underway in Louisiana to access the carbon credit market in order to provide significant funding for coastal restoration projects. However, there is very large uncertainty in GHG emission rates related to both marsh type and temporal (daily and seasonal) effects. Very little data currently exist to address this uncertainty, which can significantly affect the carbon credit value of a particular wetland system. We provide an analysis of GHG emission rates for coastal freshwater, brackish, and salt marshes compared to the net soil carbon sequestration rate. Results demonstrate that there is very high uncertainty in GHG emissions, which can substantially alter the carbon credit value of a particular wetland system.

  5. Unprecedented Arctic Ozone Loss in 2011

    NASA Image and Video Library

    2011-10-02

    In mid-March 2011, NASA's Aura spacecraft observed ozone in Earth's stratosphere -- low ozone amounts are shown in purple and grey colors, and large amounts of chlorine monoxide are shown in dark blue colors.

  6. Oil recovery by alkaline waterflooding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooke, C.E. Jr.; Williams, R.E.; Kolodzie, P.A.

    1974-01-01

    Flooding of oil containing organic acids with alkaline water under favorable conditions can result in recovery of around 50% of the residual oil left in a watered-out model. A high recovery efficiency results from the formation of a bank of viscous water-in-oil emulsion as surface active agents (soaps) are created by reactions of base in the water with the organic acids in the oil. The type and amount of organic acids in the oil, the pH and salt content of the water, and the amount of fines in the porous medium are the primary factors which determine the amount of additional oil recovered by this method. Interaction of alkaline water with reservoir rock largely determines the amount of chemical needed to flood a reservoir. Laboratory investigations using synthetic oils and crude oils show the importance of oil-water and liquid-solid interfacial properties to the results of an alkaline waterflood. A small field test demonstrated that emulsion banks can be formed in the reservoir and that chemical costs can be reasonable in selected reservoirs. Although studies have provided many qualitative guidelines for evaluating the feasibility of alkaline waterflooding, the economic attractiveness of the process must be considered on an individual reservoir basis.

  7. Using neural networks to model the behavior and decisions of gamblers, in particular, cyber-gamblers.

    PubMed

    Chan, Victor K Y

    2010-03-01

    This article describes the use of neural networks (a type of artificial intelligence) and an empirical data sample of, inter alia, the amounts of bets laid and the winnings/losses made in successive games by a number of cyber-gamblers to longitudinally model gamblers' behavior and decisions as to such bet amounts and the temporal trajectory of winnings/losses. The data was collected by videoing Texas Holdem gamblers at a cyber-gambling website. Six "persistent" gamblers were identified, totaling 675 games. The neural networks on average were able to predict bet amounts and cumulative winnings/losses in successive games accurately to three decimal places of the dollar. A more important conclusion is that the influence of a gambler's skills, strategies, and personality on his/her successive bet amounts and cumulative winnings/losses is almost totally reflected by the pattern(s) of his/her winnings/losses in the few initial games and his/her gambling account balance. This partially invalidates gamblers' illusions and fallacies that they can outperform others or even bankers. For government policy-makers, gambling industry operators, economists, sociologists, psychiatrists, and psychologists, this article provides models for gamblers' behavior and decisions. It also explores and exemplifies the usefulness of neural networks and artificial intelligence at large in the research on gambling.
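    A rough illustration of this modeling setup (synthetic data and a hypothetical behavioral rule; the article's actual network architecture and variables are not specified here): a small feed-forward regressor predicts the next bet amount from the outcomes of a gambler's few initial games plus the current account balance, mirroring the article's conclusion about which inputs carry most of the signal.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(7)

    # Synthetic stand-in for the article's data: features are the win/loss
    # outcomes of a gambler's first 3 games plus the current account balance;
    # the target is the bet amount laid in the next game.
    n = 600
    first_games = rng.normal(0, 50, size=(n, 3))      # early wins/losses ($)
    balance = rng.uniform(100, 1000, size=(n, 1))     # account balance ($)
    X = np.hstack([first_games, balance])
    # Hypothetical behaviour: bets scale with balance and early success.
    y = 0.05 * balance[:, 0] + 0.2 * first_games.mean(axis=1) + rng.normal(0, 2, n)

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
    model.fit(X[:500], y[:500])
    print("R^2 on held-out games:", round(model.score(X[500:], y[500:]), 3))
    ```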

  8. Direct Retrieval of Sulfur Dioxide Amount and Altitude from Spaceborne Hyperspectral UV Measurements: Theory and Application

    NASA Technical Reports Server (NTRS)

    Yang, Kau; Liu, Xiong; Bhartia, Pawan K.; Krotkov, Nickolay A.; Carn, Simon A.; Hughes, Eric J.; Krueger, Arlin J.; Spurr, Robert D.; Trahan, Samuel G.

    2010-01-01

    We describe the physical processes by which a vertically localized absorber perturbs the top-of-atmosphere solar backscattered ultraviolet (UV) radiance. The distinct spectral responses to perturbations of an absorber in its column amount and layer altitude provide the basis for a practical satellite retrieval technique, the Extended Iterative Spectral Fitting (EISF) algorithm, for the simultaneous retrieval of these quantities of a SO2 plume. In addition, the EISF retrieval provides an improved UV aerosol index for quantifying the spectral contrast of apparent scene reflectance at the bottom of atmosphere bounded by the surface and/or cloud; hence it can be used for detection of the presence or absence of UV absorbing aerosols. We study the performance and characterize the uncertainties of the EISF algorithm using synthetic backscattered UV radiances, retrievals from which can be compared with those used in the simulation. Our findings indicate that the presence of aerosols (both absorbing and nonabsorbing) does not cause large errors in EISF retrievals under most observing conditions when they are located below the SO2 plume. The EISF retrievals assuming a homogeneous field of view can provide accurate column amounts for inhomogeneous scenes, but they always underestimate the plume altitudes. The EISF algorithm reduces systematic errors present in existing linear retrieval algorithms that use prescribed SO2 plume heights. Applying the EISF algorithm to Ozone Monitoring Instrument satellite observations of the recent Kasatochi volcanic eruption, we demonstrate the successful retrieval of effective plume altitude of volcanic SO2, and we also show the improvement in accuracy in the corresponding SO2 columns.

  9. Preliminary report on ground-water conditions in the Cloquet area, Carlton County, Minnesota

    USGS Publications Warehouse

    Akin, P.D.

    1951-01-01

    A study of the geology and ground-water conditions in the area including Cloquet, Minn., was begun by the United States Geological Survey in 1948 in financial cooperation with the Minnesota State Department of Conservation, at the request of the city of Cloquet for assistance in locating large additional ground-water supplies for industrial and municipal use. The location of the area is shown on figure 1. Although the present municipal wells provide a fairly adequate supply for current municipal needs, which averaged about three-quarters of a million gallons a day in 1946, there is great need for large supplies of good water, on the order of 10 million gallons a day, for use by the paper mills and other industries there. At present the industries are using water from the St. Louis River, but the water is unsatisfactory and expensive to use because it contains a large amount of objectionable organic material.

  10. Impact of Medicare on the Use of Medical Services by Disabled Beneficiaries, 1972-1974

    PubMed Central

    Deacon, Ronald W.

    1979-01-01

    The extension of Medicare coverage in 1973 to disabled persons receiving cash benefits under the Social Security Act provided an opportunity to examine the impact of health insurance coverage on utilization and expenses for Part B services. Data on medical services used both before and after coverage, collected through the Current Medicare Survey, were analyzed. Results indicate that access to care (as measured by the number of persons using services) increased slightly, while the rate of use did not. The large increase in the number of persons eligible for Medicare reflected the large increase in the number of cash beneficiaries. Significant increases also were found in the amount charged for medical services. The absence of large increases in access and service use may be attributed, in part, to the already existing source of third party payment available to disabled cash beneficiaries in 1972, before Medicare coverage. PMID:10316939

  11. Honeycomb: Visual Analysis of Large Scale Social Networks

    NASA Astrophysics Data System (ADS)

    van Ham, Frank; Schulz, Hans-Jörg; Dimicco, Joan M.

    The rise in the use of social network sites allows us to collect large amounts of user-reported data on social structures, and analysis of this data could provide useful insights for many of the social sciences. This analysis is typically the domain of Social Network Analysis, and visualization of these structures often proves invaluable in understanding them. However, currently available visual analysis tools are not very well suited to handle the massive scale of this network data, and often resort to displaying small ego networks or heavily abstracted networks. In this paper, we present Honeycomb, a visualization tool that is able to deal with much larger scale data (with millions of connections), which we illustrate by using a large scale corporate social networking site as an example. Additionally, we introduce a new probability-based network metric to guide users to potentially interesting or anomalous patterns, and discuss lessons learned during design and implementation.

  12. Interferon γ limits the effectiveness of melanoma peptide vaccines.

    PubMed

    Cho, Hyun-Il; Lee, Young-Ran; Celis, Esteban

    2011-01-06

    The development of effective therapeutic vaccines to generate tumor-reactive cytotoxic T lymphocytes (CTLs) continues to be a top research priority. However, in spite of some promising results, there are no clear examples of vaccines that eradicate established tumors. Most vaccines are ineffective because they generate low numbers of CTLs and because numerous immunosuppressive factors abound in tumor-bearing hosts. We designed a peptide vaccine that produces large numbers of tumor-reactive CTLs in a mouse model of melanoma. Surprisingly, CTL tumor recognition and antitumor effects decreased in the presence of interferon γ (IFNγ), a cytokine that can provide therapeutic benefit. Tumors exposed to IFNγ evade CTLs by inducing large amounts of noncognate major histocompatibility complex class I molecules, which limit T-cell activation and effector function. Our results demonstrate that peptide vaccines can eradicate large, established tumors in circumstances under which the inhibitory activities of IFNγ are curtailed.

  13. SILVA tree viewer: interactive web browsing of the SILVA phylogenetic guide trees.

    PubMed

    Beccati, Alan; Gerken, Jan; Quast, Christian; Yilmaz, Pelin; Glöckner, Frank Oliver

    2017-09-30

    Phylogenetic trees are an important tool to study the evolutionary relationships among organisms. The huge number of available taxa poses difficulties for their interactive visualization. This hampers interaction with users, who provide feedback for the further improvement of the taxonomic framework. The SILVA Tree Viewer is a web application designed for visualizing large phylogenetic trees without requiring the download of any software tool or data files. The SILVA Tree Viewer is based on Web Geographic Information Systems (Web-GIS) technology with a PostgreSQL backend. It enables zoom and pan functionalities similar to Google Maps. The SILVA Tree Viewer enables access to two phylogenetic (guide) trees provided by the SILVA database: the SSU Ref NR99, inferred from high-quality, full-length small subunit sequences clustered at 99% sequence identity, and the LSU Ref, inferred from high-quality, full-length large subunit sequences. The Tree Viewer provides tree navigation, search and browse tools, as well as an interactive feedback system to collect all kinds of requests, ranging from taxonomy to data curation and improving the tool itself.

  14. Global Static Indexing for Real-Time Exploration of Very Large Regular Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pascucci, V; Frank, R

    2001-07-23

    In this paper we introduce a new indexing scheme for progressive traversal and visualization of large regular grids. We demonstrate the potential of our approach by providing a tool that displays at interactive rates planar slices of scalar field data with very modest computing resources. We obtain unprecedented results both in terms of absolute performance and, more importantly, in terms of scalability. On a laptop computer we provide real-time interaction with a 2048³ grid (8 giga-nodes) using only 20 MB of memory. On an SGI Onyx we slice interactively an 8192³ grid (1/2 tera-nodes) using only 60 MB of memory. The scheme relies simply on the determination of an appropriate reordering of the rectilinear grid data and a progressive construction of the output slice. The reordering minimizes the amount of I/O performed during the out-of-core computation. The progressive and asynchronous computation of the output provides flexible quality/speed tradeoffs and a time-critical and interruptible user interface.
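    Although the paper's exact indexing scheme is not spelled out in the abstract, reorderings of this kind are commonly built on bit-interleaved (Z-order/Morton) indices, which keep spatially close grid nodes close in the file and so bound the I/O a planar slice touches. A minimal 3D Morton-encoding sketch, offered as an illustration rather than the authors' scheme:

    ```python
    def part1by2(x: int) -> int:
        """Spread the low 10 bits of x so they occupy every third bit position."""
        x &= 0x000003FF
        x = (x ^ (x << 16)) & 0xFF0000FF
        x = (x ^ (x << 8)) & 0x0300F00F
        x = (x ^ (x << 4)) & 0x030C30C3
        x = (x ^ (x << 2)) & 0x09249249
        return x

    def morton3d(i: int, j: int, k: int) -> int:
        """Interleave the bits of (i, j, k) into a single Z-order index."""
        return part1by2(i) | part1by2(j) << 1 | part1by2(k) << 2

    # Neighbouring grid nodes map to nearby file offsets, so a planar slice
    # touches a small number of contiguous disk blocks instead of the whole file.
    for p in [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1), (2, 0, 0)]:
        print(p, "->", morton3d(*p))
    ```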

  15. Directions for lunar construction - A derivation of requirements from a construction scenario analysis

    NASA Technical Reports Server (NTRS)

    Dias, William S.; Matijevic, Jacob R.; Venkataraman, Subramani T.; Smith, Jeffrey H.; Lindemann, Randel A.; Levin, Richard R.

    1992-01-01

    This paper provides an initial trade-off study among several lunar construction options available to the Space Exploration Initiative. The relative time effectiveness of Extra-Vehicular Activity (EVA), Intra-Vehicular Activity (IVA), and Earth-based remote control assembly and construction methods are studied. Also considered is whether there is any construction time savings to building roads in advance, or surveying the construction sites with orbiters or rovers in advance. The study was conducted by adding detail to a potentially real scenario - a nuclear power plant - and applying time multipliers for the various control options and terrain alternatives, provided by roboticists among the authors. The authors conclude that IVA is a faster construction method than either EVA or construction conducted remotely from Earth. Surveying proposed sites in advance, with orbiters and rovers, provides a significant time savings through adding to certainty, and therefore may be cost effective. Developing a heavy-lift launch capability and minimizing assembly and construction processes by landing large payloads is probably worthwhile to the degree possible, as construction activities would use a large amount of surface operations time.

  16. Using geoneutrinos to constrain the radiogenic power in the Earth's mantle

    NASA Astrophysics Data System (ADS)

    Šrámek, Ondřej; Roskovec, Bedřich; Wipperfurth, Scott A.; Xi, Yufei; McDonough, William F.

    2017-04-01

    The Earth's engine is driven by unknown proportions of primordial energy and heat produced in radioactive decay. Unfortunately, competing models of Earth's composition reveal an order of magnitude uncertainty in the amount of radiogenic power driving mantle dynamics. Together with established geoscientific disciplines (seismology, geodynamics, petrology, mineral physics), experimental particle physics now brings additional constraints to our understanding of mantle energetics. Measurements of the Earth's flux of geoneutrinos, electron antineutrinos emitted in β- decays of naturally occurring radionuclides, reveal the amount of uranium and thorium in the Earth and set limits on the amount of radiogenic power in the planet. Comparison of the flux measured at large underground neutrino experiments with geologically informed predictions of geoneutrino emission from the crust provide the critical test needed to define the mantle's radiogenic power. Measuring geoneutrinos at oceanic locations, distant from nuclear reactors and continental crust, would best reveal the mantle flux and by performing a coarse scale geoneutrino tomography could even test the hypothesis of large heterogeneous structures in deep mantle enriched in heat-producing elements. The current geoneutrino detecting experiments, KamLAND in Japan and Borexino in Italy, will by year ~2020 be supplemented with three more experiments: SNO+ in Canada, and JUNO and Jinping in China. We predict the geoneutrino flux at all experimental sites. Within ~8 years from today, the combination of data from all experiments will exclude end-member compositional models of the silicate Earth at the 1σ level, reveal the radiogenic contribution to the global surface heat loss, and provide tight limits on radiogenic power in the Earth's mantle. Additionally, we discuss how the geoneutrino measurements at the three relatively near-lying (≤ 3000 km) detectors KamLAND, JUNO, and Jinping may be harnessed to improve the regional models of the lithosphere. Šrámek, O. et al. Revealing the Earth's mantle from the tallest mountains using the Jinping Neutrino Experiment. Sci. Rep. 6, 33034; doi:10.1038/srep33034 (2016).

  17. A comparison of solute-transport solution techniques based on inverse modelling results

    USGS Publications Warehouse

    Mehl, S.; Hill, M.C.

    2000-01-01

    Five common numerical techniques (finite difference, predictor-corrector, total-variation-diminishing, method-of-characteristics, and modified-method-of-characteristics) were tested using simulations of a controlled conservative tracer-test experiment through a heterogeneous, two-dimensional sand tank. The experimental facility was constructed using randomly distributed homogeneous blocks of five sand types. This experimental model provides an outstanding opportunity to compare the solution techniques because of the heterogeneous hydraulic conductivity distribution of known structure, and the availability of detailed measurements with which to compare simulated concentrations. The present work uses this opportunity to investigate how three common types of results - simulated breakthrough curves, sensitivity analysis, and calibrated parameter values - change in this heterogeneous situation, given the different methods of simulating solute transport. The results show that simulated peak concentrations, even at very fine grid spacings, varied because of different amounts of numerical dispersion. Sensitivity analysis results were robust in that they were independent of the solution technique. They revealed extreme correlation between hydraulic conductivity and porosity, and that the breakthrough curve data did not provide enough information about the dispersivities to estimate individual values for the five sands. However, estimated hydraulic conductivity values are significantly influenced by both the large possible variations in model dispersion and the amount of numerical dispersion present in the solution technique.

  18. An intelligent tool for activity data collection.

    PubMed

    Sarkar, A M Jehad

    2011-01-01

    Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data for not only evaluating their performance but also training the systems for better functioning. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collection. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool which has the ability to provide such datasets inexpensively without physically deploying the testbeds. It can be used as an inexpensive alternative technique to collect human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment. It also provides a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge and the user-given inputs. The activity knowledge is mined from the web. We have performed two experiments to validate the tool's performance in producing reliable datasets.

  19. The research and development of water resources management information system based on ArcGIS

    NASA Astrophysics Data System (ADS)

    Cui, Weiqun; Gao, Xiaoli; Li, Yuzhi; Cui, Zhencai

    Because water resources management involves large amounts of data of diverse types and formats, we built a water resources calculation model and established a water resources management information system based on the ArcGIS and Visual Studio.NET development platforms. The system integrates spatial data and attribute data organically and manages them uniformly. It can analyze spatial data, support bidirectional queries between maps and data, generate various charts and report forms automatically, link multimedia information, and manage the underlying database. It thus provides spatial and static synthetic information services for the study, management, and decision-making of water resources, regional geology, the eco-environment, and related fields.

  20. Encryption and decryption algorithm using algebraic matrix approach

    NASA Astrophysics Data System (ADS)

    Thiagarajan, K.; Balasubramanian, P.; Nagaraj, J.; Padmashree, J.

    2018-04-01

    Cryptographic algorithms provide security of data against attacks during encryption and decryption. However, they are computationally intensive processes which consume large amounts of CPU time and space at the time of encryption and decryption. The goal of this paper is to study the encryption and decryption algorithm and to find the space complexity of the encrypted and decrypted data produced by the algorithm. In this paper, we encrypt and decrypt the message using a key with the help of a cyclic square matrix, an approach applicable to any number of words, including those with many characters and long words. We also discuss the time complexity of the algorithm. The proposed algorithm is simple but difficult to break.
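
    The abstract does not spell out the cyclic-matrix construction, so the following is only a minimal sketch of key-matrix (Hill-cipher-style) encryption in the same spirit; the key, block size, and byte-wise modulus are illustrative assumptions, not the authors' scheme.

```python
import numpy as np

# Hypothetical sketch: Hill-cipher-style encryption with an invertible
# key matrix over bytes (mod 256). This only illustrates the
# matrix-algebra mechanics, not the paper's cyclic-matrix construction.
MOD = 256

def modinv_matrix(key, mod=MOD):
    """Invert an integer matrix modulo `mod` via adjugate/determinant."""
    det = int(round(np.linalg.det(key)))
    det_inv = pow(det % mod, -1, mod)      # raises ValueError if not invertible
    adj = np.round(det * np.linalg.inv(key)).astype(int)  # adjugate = det * K^-1
    return (det_inv * adj) % mod

def encrypt(block, key):
    return key.dot(block) % MOD            # c = K p (mod 256)

def decrypt(block, key_inv):
    return key_inv.dot(block) % MOD        # p = K^-1 c (mod 256)

key = np.array([[3, 3], [2, 5]])           # det = 9, coprime to 256
key_inv = modinv_matrix(key)
plain = np.frombuffer(b"hi", dtype=np.uint8).astype(int)
cipher = encrypt(plain, key)
assert np.array_equal(decrypt(cipher, key_inv), plain)
```

    The key matrix must have a determinant coprime to the modulus; otherwise no modular inverse exists and decryption is impossible.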

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carter, Faustin Wirkus; Khaire, Trupti S.; Novosad, Valentyn

    We present "scraps" (SuperConducting Analysis and Plotting Software), a Python package designed to aid in the analysis and visualization of large amounts of superconducting resonator data, specifically complex transmission as a function of frequency, acquired at many different temperatures and driving powers. The package includes a least-squares fitting engine as well as a Monte-Carlo Markov Chain sampler for sampling the posterior distribution given priors, marginalizing over nuisance parameters, and estimating covariances. A set of plotting tools for generating publication-quality figures is also provided in the package. Lastly, we discuss the functionality of the software and provide some examples of its utility on data collected from a niobium-nitride coplanar waveguide resonator fabricated at Argonne National Laboratory.

  2. Detection of tiny amounts of fissile materials in large-sized containers with radioactive waste

    NASA Astrophysics Data System (ADS)

    Batyaev, V. F.; Skliarov, S. V.

    2018-01-01

    The paper is devoted to non-destructive control of tiny amounts of fissile materials in large-sized containers filled with radioactive waste (RAW). The aim of this work is to model an active neutron interrogation facility for detection of fissile materials inside NZK type containers with RAW and determine the minimal detectable mass of U-235 as a function of various parameters: matrix type, nonuniformity of container filling, neutron generator parameters (flux, pulse frequency, pulse duration), and measurement time. As a result, the dependence of minimal detectable mass on the location of fissile materials inside the container is shown. Nonuniformity of the thermal neutron flux inside a container is the main reason for the spatial heterogeneity of minimal detectable mass inside a large-sized container. Our experiments with tiny amounts of uranium-235 (<1 g) confirm the detection of fissile materials in NZK containers by using the active neutron interrogation technique.

  3. Sound Levels in East Texas Schools.

    ERIC Educational Resources Information Center

    Turner, Aaron Lynn

    A survey of sound levels was taken in several Texas schools to determine the amount of noise and sound present by size of class, type of activity, location of building, and the presence of air conditioning and large amounts of glass. The data indicate that class size and relative amounts of glass have no significant bearing on the production of…

  4. What Determines the Amount Students Borrow? Revisiting the Crisis-Convenience Debate

    ERIC Educational Resources Information Center

    Hart, Natala K.; Mustafa, Shoumi

    2008-01-01

    Recent studies have questioned the wisdom in blaming college costs for the escalation of student loans. It would appear that less affluent students borrow large amounts because inexpensive subsidized loans are available. This study attempted to verify the claim, estimating a model of the amount of loan received by students as a function of net…

  5. Beverage consumption among European adolescents in the HELENA study.

    PubMed

    Duffey, K J; Huybrechts, I; Mouratidou, T; Libuda, L; Kersting, M; De Vriendt, T; Gottrand, F; Widhalm, K; Dallongeville, J; Hallström, L; González-Gross, M; De Henauw, S; Moreno, L A; Popkin, B M

    2012-02-01

    Our objective was to describe the fluid and energy consumption of beverages in a large sample of European adolescents. We used data from 2741 European adolescents residing in 8 countries participating in the Healthy Lifestyle in Europe by Nutrition in Adolescence Cross-Sectional Study (HELENA-CSS). We averaged two 24-h recalls, collected using the HELENA-dietary assessment tool. By gender and age subgroup (12.5-14.9 years and 15-17.5 years), we examined per capita and per consumer fluid (milliliters (ml)) and energy (kilojoules (kJ)) intake from beverages and percentage consuming 10 different beverage groups. Mean beverage consumption was 1611 ml/day in boys and 1316 ml/day in girls. Energy intake from beverages was about 1966 kJ/day and 1289 kJ/day in European boys and girls, respectively, with sugar-sweetened beverages (SSBs) (carbonated and non-carbonated beverages, including soft drinks, fruit drinks and powders/concentrates) contributing to daily energy intake more than other groups of beverages. Boys and older adolescents consumed the largest amount of per capita total energy from beverages. Among all age and gender subgroups, SSBs, sweetened milk (including chocolate milk and flavored yogurt drinks all with added sugar), low-fat milk and fruit juice provided the highest amount of per capita energy. Water was consumed by the largest percentage of adolescents followed by SSBs, fruit juice and sweetened milk. Among consumers, water provided the greatest fluid intake and sweetened milk accounted for the largest amount of energy intake followed by SSBs. Patterns of energy intake from each beverage varied between countries. European adolescents consume an average of 1455 ml/day of beverages, with the largest proportion of consumers and the largest fluid amount coming from water. Beverages provide 1609 kJ/day, of which 30.4%, 20.7% and 18.1% comes from SSBs, sweetened milk and fruit juice, respectively.

  6. Beverage consumption among European adolescents in the HELENA Study

    PubMed Central

    Duffey, K.J.; Huybrechts, I.; Mouratidou, T.; Libuda, L.; Kersting, M.; DeVriendt, T.; Gottrand, F.; Widhalm, K.; Dallongeville, J.; Hallström, L.; González-Gross, M.; DeHenauw, S.; Moreno, L.A.; Popkin, B.M.

    2012-01-01

    Background and Objective Our objective was to describe the fluid and energy consumption of beverages in a large sample of European adolescents Methods We used data from 2,741 European adolescents residing in 8 countries participating in the Healthy Lifestyle in Europe by Nutrition in Adolescence Cross Sectional Study (HELENA-CSS). We averaged two 24-hour recalls, collected using the HELENA-dietary assessment tool. By gender and age subgroup (12.5–14.9 y and 15–17.5 y), we examined per capita and per consumer fluid (milliliters [mL]) and energy (kilojoules [kJ]) intake from beverages and percent consuming ten different beverage groups. Results Mean beverage consumption was 1611 ml/d in boys and 1316 ml/d in girls. Energy intake from beverages was about 1966 kJ/d and 1289 kJ/d in European boys and girls respectively, with sugar-sweetened beverages (carbonated and non-carbonated beverages, including soft drinks, fruit drinks and powders/concentrates) contributing to daily energy intake more than other groups of beverages. Boys and older adolescents consumed the largest amount of per capita total energy from beverages. Among all age and gender subgroups, sugar-sweetened beverages, sweetened milk (including chocolate milk and flavored yogurt drinks all with added sugar), low-fat milk, and fruit juice provided the highest amount of per capita energy. Water was consumed by the largest percent of adolescents followed by sugar-sweetened beverages, fruit juice, and sweetened milk. Among consumers, water provided the greatest fluid intake and sweetened milk accounted for the largest amount of energy intake followed by sugar-sweetened beverages. Patterns of energy intake from each beverage varied between countries. Conclusions European adolescents consume an average of 1455 ml/d of beverages, with the largest proportion of consumers and the largest fluid amount coming from water. Beverages provide 1609 kJ/d, of which 30.4%, 20.7%, and 18.1% comes from sugar-sweetened beverages, sweetened milk, and fruit juice respectively. PMID:21952695

  7. Quick Estimation Model for the Concentration of Indoor Airborne Culturable Bacteria: An Application of Machine Learning.

    PubMed

    Liu, Zhijian; Li, Hao; Cao, Guoqing

    2017-07-30

    Indoor airborne culturable bacteria are sometimes harmful to human health. Therefore, a quick estimation of their concentration is particularly necessary. However, measuring the indoor microorganism concentration (e.g., bacteria) usually requires a large amount of time, economic cost, and manpower. In this paper, we aim to provide a quick solution: using knowledge-based machine learning to estimate the concentration of indoor airborne culturable bacteria from several measurable indoor environmental indicators, including indoor particulate matter (PM2.5 and PM10), temperature, relative humidity, and CO₂ concentration. Our results show that a general regression neural network (GRNN) model can provide a quick and decent estimation, based on model training and testing using an experimental database with 249 data groups.
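
    A GRNN is essentially Nadaraya-Watson kernel regression, so its core can be sketched in a few lines. The training rows, units, and bandwidth below are hypothetical placeholders, not the paper's data; in practice the five indicators would first be standardized.

```python
import numpy as np

# Minimal sketch of a general regression neural network (GRNN):
# each training target is weighted by a Gaussian kernel on the
# distance between the query and the corresponding training input.

def grnn_predict(X_train, y_train, x, sigma=1.0):
    d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances
    w = np.exp(-d2 / (2.0 * sigma ** 2))         # kernel weights
    return np.dot(w, y_train) / np.sum(w)        # weighted average

# Hypothetical rows: [PM2.5, PM10, temp (C), RH (%), CO2 (ppm)]
X_train = np.array([[35.0, 60.0, 22.0, 45.0, 600.0],
                    [80.0, 120.0, 26.0, 70.0, 900.0],
                    [15.0, 30.0, 20.0, 35.0, 450.0]])
y_train = np.array([320.0, 840.0, 150.0])        # CFU/m^3, hypothetical

x_new = np.array([40.0, 70.0, 23.0, 50.0, 650.0])
print(grnn_predict(X_train, y_train, x_new, sigma=50.0))
```

    The single smoothing parameter sigma is what makes GRNN training fast: there is nothing else to fit, which suits the "quick estimation" goal of the paper.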

  8. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system.

    PubMed

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is therefore needed to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
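
    The second detection stage maps directly onto standard image-processing primitives. Below is a hedged Python sketch of that stage plus the GMM step, using scikit-image and scikit-learn on a synthetic frame; the thresholds, min_distance, and two-component GMM are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import threshold_otsu
from skimage.segmentation import watershed
from skimage.feature import peak_local_max
from skimage.measure import regionprops
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
image = rng.random((256, 256))       # stand-in for a real grayscale frame

# Stage 1a: coarse extraction of cell regions
mask = image > threshold_otsu(image)

# Stage 1b: split touching cells via distance transform + watershed
distance = ndi.distance_transform_edt(mask)
coords = peak_local_max(distance, labels=mask, min_distance=5)
markers = np.zeros_like(mask, dtype=int)
markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
labels = watershed(-distance, markers, mask=mask)

# Stage 2: classify cells by simple shape features with a GMM
feats = np.array([[r.area, r.eccentricity] for r in regionprops(labels)])
if len(feats) >= 2:
    classes = GaussianMixture(n_components=2, random_state=0).fit_predict(feats)
```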

  9. High-speed cell recognition algorithm for ultrafast flow cytometer imaging system

    NASA Astrophysics Data System (ADS)

    Zhao, Wanyue; Wang, Chao; Chen, Hongwei; Chen, Minghua; Yang, Sigang

    2018-04-01

    An optical time-stretch flow imaging system enables high-throughput examination of cells/particles with unprecedented high speed and resolution. A significant amount of raw image data is produced. A high-speed cell recognition algorithm is therefore needed to analyze large amounts of data efficiently. A high-speed cell recognition algorithm consisting of two-stage cascaded detection and Gaussian mixture model (GMM) classification is proposed. The first stage of detection extracts cell regions. The second stage integrates distance transform and the watershed algorithm to separate clustered cells. Finally, the cells detected are classified by GMM. We compared the performance of our algorithm with support vector machine. Results show that our algorithm increases the running speed by over 150% without sacrificing the recognition accuracy. This algorithm provides a promising solution for high-throughput and automated cell imaging and classification in the ultrafast flow cytometer imaging platform.

  10. Imaging samples larger than the field of view: the SLS experience

    NASA Astrophysics Data System (ADS)

    Vogiatzis Oikonomidis, Ioannis; Lovric, Goran; Cremona, Tiziana P.; Arcadu, Filippo; Patera, Alessandra; Schittny, Johannes C.; Stampanoni, Marco

    2017-06-01

    Volumetric datasets with micrometer spatial and sub-second temporal resolutions are nowadays routinely acquired using synchrotron X-ray tomographic microscopy (SRXTM). Although SRXTM technology allows the examination of multiple samples with short scan times, many specimens are larger than the field-of-view (FOV) provided by the detector. The extension of the FOV in the direction perpendicular to the rotation axis remains non-trivial. We present a method that can efficiently increase the FOV by merging volumetric datasets obtained by region-of-interest tomographies at different 3D positions of the sample, with a minimal amount of artefacts and with the ability to handle large amounts of data. The method has been successfully applied for the three-dimensional imaging of a small number of mouse lung acini of intact animals, where pixel sizes down to the micrometer range and short exposure times are required.

  11. Visualizing Internet routing changes.

    PubMed

    Lad, Mohit; Massey, Dan; Zhang, Lixia

    2006-01-01

    Today's Internet provides a global data delivery service to millions of end users, and routing protocols play a critical role in this service. It is important to be able to identify and diagnose any problems occurring in Internet routing. However, the Internet's sheer size makes this task difficult. One cannot easily extract the most important or relevant routing information from the large amounts of data collected from multiple routers. To tackle this problem, we have developed Link-Rank, a tool to visualize Internet routing changes at the global scale. Link-Rank weighs links in a topological graph by the number of routes carried over each link and visually captures changes in link weights in the form of a topological graph with adjustable size. Using Link-Rank, network operators can easily observe important routing changes from massive amounts of routing data, discover otherwise unnoticed routing problems, understand the impact of topological events, and infer root causes of observed routing changes.
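
    The core bookkeeping is simple enough to sketch. Below is a hypothetical, minimal Python reconstruction of the link-weighting idea (routes as AS paths, weights as route counts, a diff between two snapshots); it is not Link-Rank's actual implementation.

```python
import networkx as nx

# Weigh each directed link by the number of routes that traverse it,
# then diff two snapshots to surface routing changes.

def link_weights(routes):
    g = nx.DiGraph()
    for path in routes:                      # each route is an AS path
        for u, v in zip(path, path[1:]):
            g.add_edge(u, v)
            g[u][v]["weight"] = g[u][v].get("weight", 0) + 1
    return g

before = link_weights([[1, 2, 3], [1, 2, 4], [5, 2, 3]])
after = link_weights([[1, 6, 3], [1, 2, 4], [5, 2, 3]])
for u, v in set(before.edges) | set(after.edges):
    delta = (after.get_edge_data(u, v, {"weight": 0})["weight"]
             - before.get_edge_data(u, v, {"weight": 0})["weight"])
    if delta:
        print(f"link {u}->{v}: {delta:+d} routes")
```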

  12. Low Data Drug Discovery with One-Shot Learning.

    PubMed

    Altae-Tran, Han; Ramsundar, Bharath; Pappu, Aneesh S; Pande, Vijay

    2017-04-26

    Recent advances in machine learning have made significant contributions to drug discovery. Deep neural networks in particular have been demonstrated to provide significant boosts in predictive power when inferring the properties and activities of small-molecule compounds (Ma, J. et al. J. Chem. Inf. 2015, 55, 263-274). However, the applicability of these techniques has been limited by the requirement for large amounts of training data. In this work, we demonstrate how one-shot learning can be used to significantly lower the amounts of data required to make meaningful predictions in drug discovery applications. We introduce a new architecture, the iterative refinement long short-term memory, that, when combined with graph convolutional neural networks, significantly improves learning of meaningful distance metrics over small-molecules. We open source all models introduced in this work as part of DeepChem, an open-source framework for deep-learning in drug discovery (Ramsundar, B. deepchem.io. https://github.com/deepchem/deepchem, 2016).

  13. Quantifying functional mobility progress for chronic disease management.

    PubMed

    Boyle, Justin; Karunanithi, Mohan; Wark, Tim; Chan, Wilbur; Colavitti, Christine

    2006-01-01

    A method for quantifying improvements in functional mobility is presented based on patient-worn accelerometer devices. For patients with cardiovascular, respiratory, or other chronic disease, increasing the amount of functional mobility is a large component of rehabilitation programs. We have conducted an observational trial on the use of accelerometers for quantifying mobility improvements in a small group of chronic disease patients (n=15, 48-86 yrs). Cognitive impairments precluded complex instrumentation of patients, and movement data were obtained from a single 2-axis accelerometer device worn at the hip. In our trial, movement data collected from accelerometer devices were classified into Lying vs Sitting/Standing vs Walking/Activity movements. This classification enabled the amount of walking to be quantified and graphically presented to clinicians and carers for feedback on exercise efficacy. Presenting long-term trends in these data to patients also provides valuable feedback for self-managed care and assists with compliance.
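
    The abstract does not give the classification rule, so the following Python sketch only illustrates the kind of two-step logic (movement intensity first, then posture from the gravity component) such a classifier might use; all thresholds and axis conventions are assumptions.

```python
import numpy as np

def classify_epoch(ax, ay, fs=50):
    """Classify one epoch of raw x/y acceleration (in g) sampled at fs Hz."""
    # Movement intensity: standard deviation of the signal magnitude
    mag = np.sqrt(ax ** 2 + ay ** 2)
    if np.std(mag) > 0.15:          # hypothetical activity threshold
        return "Walking/Activity"
    # Posture: with the device at the hip, gravity loads the (assumed)
    # vertical axis when upright and the lateral axis when lying down
    if np.abs(np.mean(ax)) > 0.7:   # ax ~ vertical axis, in g
        return "Sitting/Standing"
    return "Lying"

t = np.arange(0, 5, 1 / 50)
walking_x = 1.0 + 0.4 * np.sin(2 * np.pi * 2 * t)   # synthetic gait bounce
print(classify_epoch(walking_x, np.zeros_like(t)))  # -> Walking/Activity
```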

  14. ABS-FishCount: An Agent-Based Simulator of Underwater Sensors for Measuring the Amount of Fish.

    PubMed

    García-Magariño, Iván; Lacuesta, Raquel; Lloret, Jaime

    2017-11-13

    Underwater sensors provide one of the possibilities to explore oceans, seas, rivers, fish farms and dams, which all together cover most of our planet's area. Simulators can be helpful to test and discover some possible strategies before implementing these in real underwater sensors. This speeds up the development of research theories so that these can be implemented later. In this context, the current work presents an agent-based simulator for defining and testing strategies for measuring the amount of fish by means of underwater sensors. The current approach is illustrated with the definition and assessment of two strategies for measuring fish. One of these two corresponds to a simple control mechanism, while the other is an experimental strategy and includes an implicit coordination mechanism. The experimental strategy showed a statistically significant improvement over the control one in the reduction of errors with a large Cohen's d effect size of 2.55.

  15. The Genetic Program of Pancreatic β-Cell Replication In Vivo

    PubMed Central

    Klochendler, Agnes; Caspi, Inbal; Corem, Noa; Moran, Maya; Friedlich, Oriel; Elgavish, Sharona; Nevo, Yuval; Helman, Aharon; Glaser, Benjamin; Eden, Amir; Itzkovitz, Shalev

    2016-01-01

    The molecular program underlying infrequent replication of pancreatic β-cells remains largely inaccessible. Using transgenic mice expressing green fluorescent protein in cycling cells, we sorted live, replicating β-cells and determined their transcriptome. Replicating β-cells upregulate hundreds of proliferation-related genes, along with many novel putative cell cycle components. Strikingly, genes involved in β-cell functions, namely, glucose sensing and insulin secretion, were repressed. Further studies using single-molecule RNA in situ hybridization revealed that in fact, replicating β-cells double the amount of RNA for most genes, but this upregulation excludes genes involved in β-cell function. These data suggest that the quiescence-proliferation transition involves global amplification of gene expression, except for a subset of tissue-specific genes, which are “left behind” and whose relative mRNA amount decreases. Our work provides a unique resource for the study of replicating β-cells in vivo. PMID:26993067

  16. Characterizing Lunch Meals Served and Consumed by Preschool Children in Head Start

    PubMed Central

    Liu, Yan; Stuff, Janice E; Fisher, Jennifer O; Mendoza, Jason A; O’Neil, Carol E

    2014-01-01

    Objective To examine the variability of food portions served and consumed by African-American and Hispanic-American preschoolers attending Head Start (HS). Design Cross-Sectional. Setting Food consumption by preschoolers (n=796) enrolled in 16 HS centers in Houston, Texas (51% boys, 42% African-American, mean age 4 years) was assessed during three days of lunch meals using digital photography. Descriptive statistics and multi-level regression models, adjusting for classroom and school clustering effects, were determined. Subjects HS preschoolers 3–5 years. Results Mean amount served was 2428 kilojoule (kJ) (580 kilocalories [kcal]), and 572 grams. Mean intake was 1421 kJ (339 kcal), and 331 grams: 20% protein, 46% carbohydrate, 34% fat. Plate waste was 43% (range: 38% [fruit] to 61% [vegetables]). Mean coefficient of variation (CV) of food served was 29%: 33% entrée, 44% vegetables, 60% fruit, and 76% starches. Mean CV of food consumed was 46%: 58% entrée, 86% fruit, 96% vegetables, and 111% starches. Total gram amount of food served was positively correlated with consumption (r = 0.43, p<0.001). Conclusion Plate waste and variation in amounts served and consumed were substantial; amounts served were associated with amounts consumed. Large portion sizes may contribute to pediatric obesity by promoting excessive intake at meals. Understanding factors influencing portion sizes provides insight about specific intervention strategies that can be used in obesity prevention programs. PMID:23701867

  17. Application of multivariate statistical techniques in microbial ecology

    PubMed Central

    Paliy, O.; Shankar, V.

    2016-01-01

    Recent advances in high-throughput methods of molecular analyses have led to an explosion of studies generating large-scale ecological datasets. An especially noticeable effect has been attained in the field of microbial ecology, where new experimental approaches have provided in-depth assessments of the composition, functions, and dynamic changes of complex microbial communities. Because even a single high-throughput experiment produces large amounts of data, powerful statistical techniques of multivariate analysis are well suited to analyze and interpret these datasets. Many different multivariate techniques are available, and often it is not clear which method should be applied to a particular dataset. In this review we describe and compare the most widely used multivariate statistical techniques, including exploratory, interpretive, and discriminatory procedures. We consider several important limitations and assumptions of these methods, and we present examples of how these approaches have been utilized in recent studies to provide insight into the ecology of the microbial world. Finally, we offer suggestions for the selection of appropriate methods based on the research question and dataset structure. PMID:26786791
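
    As a concrete instance of the exploratory class of methods discussed, here is a minimal Python sketch of unconstrained ordination (PCA) on a hypothetical sample-by-taxon abundance matrix; the data, the log1p transform, and the component count are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
abundance = rng.poisson(5, size=(20, 100)).astype(float)  # 20 samples, 100 taxa

log_abund = np.log1p(abundance)        # tame skewed count data
pca = PCA(n_components=2)
scores = pca.fit_transform(log_abund)  # sample coordinates for an ordination plot
print(pca.explained_variance_ratio_)   # variance captured by each axis
```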

  18. Metadata and annotations for multi-scale electrophysiological data.

    PubMed

    Bower, Mark R; Stead, Matt; Brinkmann, Benjamin H; Dufendach, Kevin; Worrell, Gregory A

    2009-01-01

    The increasing use of high-frequency (kHz), long-duration (days) intracranial monitoring from multiple electrodes during pre-surgical evaluation for epilepsy produces large amounts of data that are challenging to store and maintain. Descriptive metadata and clinical annotations of these large data sets also pose challenges to simple, often manual, methods of data analysis. The problems of reliable communication of metadata and annotations between programs, the maintenance of the meanings within that information over long time periods, and the flexibility to re-sort data for analysis place differing demands on data structures and algorithms. Solutions to these individual problem domains (communication, storage and analysis) can be configured to provide easy translation and clarity across the domains. The Multi-scale Annotation Format (MAF) provides an integrated metadata and annotation environment that maximizes code reuse, minimizes error probability and encourages future changes by reducing the tendency to over-fit information technology solutions to current problems. An example of a graphical utility for generating and evaluating metadata and annotations for "big data" files is presented.

  19. Fast segmentation of stained nuclei in terabyte-scale, time resolved 3D microscopy image stacks.

    PubMed

    Stegmaier, Johannes; Otte, Jens C; Kobitski, Andrei; Bartschat, Andreas; Garcia, Ariel; Nienhaus, G Ulrich; Strähle, Uwe; Mikut, Ralf

    2014-01-01

    Automated analysis of multi-dimensional microscopy images has become an integral part of modern research in life science. Most available algorithms that provide sufficient segmentation quality, however, are infeasible for a large amount of data due to their high complexity. In this contribution we present a fast parallelized segmentation method that is especially suited for the extraction of stained nuclei from microscopy images, e.g., of developing zebrafish embryos. The idea is to transform the input image based on gradient and normal directions in the proximity of detected seed points such that it can be handled by straightforward global thresholding like Otsu's method. We evaluate the quality of the obtained segmentation results on a set of real and simulated benchmark images in 2D and 3D and show the algorithm's superior performance compared to other state-of-the-art algorithms. We achieve an up to ten-fold decrease in processing times, allowing us to process large data sets while still providing reasonable segmentation results.

  20. Research on Optimization of GLCM Parameter in Cell Classification

    NASA Astrophysics Data System (ADS)

    Zhang, Xi-Kun; Hou, Jie; Hu, Xin-Hua

    2016-05-01

    Real-time classification of biological cells according to their 3D morphology is highly desired in a flow cytometer setting. A gray level co-occurrence matrix (GLCM) algorithm has been developed to extract feature parameters from measured diffraction images, but its heavy computational load makes it difficult to integrate into a real-time system. An optimization of the GLCM algorithm is provided based on correlation analysis of GLCM parameters. The results of GLCM analysis and subsequent classification demonstrate that the optimized method can lower the time complexity significantly without loss of classification accuracy.
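
    In Python terms, the feature-extraction step and a correlation-based pruning of redundant GLCM parameters might look like the sketch below; the distances, angles, gray levels, and the 0.9 correlation cutoff are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(1)
images = [rng.integers(0, 64, (128, 128), dtype=np.uint8) for _ in range(10)]

props = ["contrast", "homogeneity", "energy", "correlation"]
feature_rows = []
for img in images:
    glcm = graycomatrix(img, distances=[1, 2], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    feature_rows.append([graycoprops(glcm, p).mean() for p in props])
X = np.array(feature_rows)

# Drop one member of each highly correlated feature pair to cut computation
corr = np.corrcoef(X, rowvar=False)
redundant = {j for i in range(len(props)) for j in range(i + 1, len(props))
             if abs(corr[i, j]) > 0.9}
kept = [p for k, p in enumerate(props) if k not in redundant]
print(kept)
```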

  1. Adolescent accessory navicular.

    PubMed

    Leonard, Zachary C; Fortin, Paul T

    2010-06-01

    Accessory tarsal navicular is a common anomaly in the human foot. It should be in the differential of medial foot pain. A proper history and physical, along with imaging modalities, can lead to the diagnosis. Often, classification of the ossicle and amount of morbidity guide treatment. Nonsurgical measures can provide relief. A variety of surgical procedures have been used with good results. Our preferred method is excision for small ossicles and segmental fusion after removal of the synchondrosis for large ossicles. In addition, pes planovalgus deformities need to be addressed concomitantly. Copyright 2010 Elsevier Inc. All rights reserved.

  2. Pharmacological management of binge eating disorder: current and emerging treatment options

    PubMed Central

    McElroy, Susan L; Guerdjikova, Anna I; Mori, Nicole; O’Melia, Anne M

    2012-01-01

    Growing evidence suggests that pharmacotherapy may be beneficial for some patients with binge eating disorder (BED), an eating disorder characterized by repetitive episodes of uncontrollable consumption of abnormally large amounts of food without inappropriate weight loss behaviors. In this paper, we provide a brief overview of BED and review the rationales and data supporting the effectiveness of specific medications or medication classes in treating patients with BED. We conclude by summarizing these data, discussing the role of pharmacotherapy in the BED treatment armamentarium, and suggesting future areas for research. PMID:22654518

  3. Recent Developments in Hyperspectral Imaging for Assessment of Food Quality and Safety

    PubMed Central

    Huang, Hui; Liu, Li; Ngadi, Michael O.

    2014-01-01

    Hyperspectral imaging which combines imaging and spectroscopic technology is rapidly gaining ground as a non-destructive, real-time detection tool for food quality and safety assessment. Hyperspectral imaging could be used to simultaneously obtain large amounts of spatial and spectral information on the objects being studied. This paper provides a comprehensive review on the recent development of hyperspectral imaging applications in food and food products. The potential and future work of hyperspectral imaging for food quality and safety control is also discussed. PMID:24759119

  4. The NIH Roadmap Epigenomics Program data resource

    PubMed Central

    Chadwick, Lisa Helbling

    2012-01-01

    The NIH Roadmap Reference Epigenome Mapping Consortium is developing a community resource of genome-wide epigenetic maps in a broad range of human primary cells and tissues. There are large amounts of data already available, and a number of different options for viewing and analyzing the data. This report will describe key features of the websites where users will find data, protocols and analysis tools developed by the consortium, and provide a perspective on how this unique resource will facilitate and inform human disease research, both immediately and in the future. PMID:22690667

  5. The NIH Roadmap Epigenomics Program data resource.

    PubMed

    Chadwick, Lisa Helbling

    2012-06-01

    The NIH Roadmap Reference Epigenome Mapping Consortium is developing a community resource of genome-wide epigenetic maps in a broad range of human primary cells and tissues. There are large amounts of data already available, and a number of different options for viewing and analyzing the data. This report will describe key features of the websites where users will find data, protocols and analysis tools developed by the consortium, and provide a perspective on how this unique resource will facilitate and inform human disease research, both immediately and in the future.

  6. A Versatile Rocket Engine Hot Gas Facility

    NASA Technical Reports Server (NTRS)

    Green, James M.

    1993-01-01

    The capabilities of a versatile rocket engine facility, located in the Rocket Laboratory at the NASA Lewis Research Center, are presented. The gaseous hydrogen/oxygen facility can be used for thermal shock and hot gas testing of materials and structures as well as rocket propulsion testing. Testing over a wide range of operating conditions in both fuel and oxygen rich regimes can be conducted, with cooled or uncooled test specimens. The size and location of the test cell provide the ability to conduct large amounts of testing in short time periods with rapid turnaround between programs.

  7. Python-based geometry preparation and simulation visualization toolkits for STEPS

    PubMed Central

    Chen, Weiliang; De Schutter, Erik

    2014-01-01

    STEPS is a stochastic reaction-diffusion simulation engine that implements a spatial extension of Gillespie's Stochastic Simulation Algorithm (SSA) in complex tetrahedral geometries. An extensive Python-based interface is provided to STEPS so that it can interact with the large number of scientific packages in Python. However, a gap existed between the interfaces of these packages and the STEPS user interface, where supporting toolkits could reduce the amount of scripting required for research projects. This paper introduces two new supporting toolkits that support geometry preparation and visualization for STEPS simulations. PMID:24782754

  8. Implementation of NFC technology for industrial applications: case flexible production

    NASA Astrophysics Data System (ADS)

    Sallinen, Mikko; Strömmer, Esko; Ylisaukko-oja, Arto

    2007-09-01

    Near Field Communication (NFC) technology enables flexible short-range communication. It has a large number of envisaged applications in the consumer, welfare, and industrial sectors. Compared with other short-range communication technologies such as Bluetooth or Wibree, it provides advantages that we introduce in this paper. We present an example of applying NFC technology to an industrial application where simple tasks can be automated and the industrial assembly process can be improved radically by replacing manual paperwork and increasing traceability of products during production.

  9. Method for producing thin graphite flakes with large aspect ratios

    DOEpatents

    Bunnell, L. Roy

    1993-01-01

    A method for making graphite flakes of high aspect ratio by the steps of providing a strong concentrated acid and heating the graphite in the presence of the acid for a time and at a temperature effective to intercalate the acid in the graphite; heating the intercalated graphite at a rate and to a temperature effective to exfoliate the graphite in discrete layers; and subjecting the graphite layers to ultrasonic energy, mechanical shear forces, or freezing in an amount effective to separate the layers into discrete flakes.

  10. Data::Downloader

    NASA Technical Reports Server (NTRS)

    Duggan, Brian

    2012-01-01

    Downloading and organizing large amounts of files is challenging, and often done using ad hoc methods. This software is capable of downloading and organizing files as an OpenSearch client. It can subscribe to RSS (Really Simple Syndication) feeds and Atom feeds containing arbitrary metadata, and maintains a local content-addressable data store. It uses existing standards for obtaining the files, and uses efficient techniques for storing the files. Novel features include symbolic links to maintain a sane directory structure, checksums for validating file integrity during transfer and storage, and flexible use of server-provided metadata.
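
    The content-addressable store plus symlink idea generalizes well. Here is a hypothetical minimal sketch in Python (Data::Downloader itself is Perl; the layout, names, and SHA-256 choice are illustrative assumptions, not its actual on-disk format).

```python
import hashlib
import os

STORE = "store"   # hypothetical root of the content-addressable store

def add_file(src_path: str, logical_path: str) -> str:
    """Ingest a file: store it under its SHA-256, then link a friendly name."""
    with open(src_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    dest = os.path.join(STORE, digest[:2], digest)
    os.makedirs(os.path.dirname(dest), exist_ok=True)
    if not os.path.exists(dest):          # dedup: identical content stored once
        os.replace(src_path, dest)
    os.makedirs(os.path.dirname(logical_path) or ".", exist_ok=True)
    if os.path.lexists(logical_path):
        os.remove(logical_path)
    os.symlink(os.path.abspath(dest), logical_path)
    return digest   # the checksum doubles as an integrity check on re-read
```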

  11. Multiple resonant railgun power supply

    DOEpatents

    Honig, E.M.; Nunnally, W.C.

    1985-06-19

    A multiple repetitive resonant railgun power supply provides energy for repetitively propelling projectiles from a pair of parallel rails. A plurality of serially connected paired parallel rails are powered by similar power supplies. Each supply comprises an energy storage capacitor, a storage inductor to form a resonant circuit with the energy storage capacitor and a magnetic switch to transfer energy between the resonant circuit and the pair of parallel rails for the propelling of projectiles. The multiple serial operation permits relatively small energy components to deliver overall relatively large amounts of energy to the projectiles being propelled.

  12. The Design and Development of a Management Information System for the Monterey Navy Flying Club.

    DTIC Science & Technology

    1986-03-27

    Management Information System for the Monterey Navy Flying Club. It supplies the tools necessary to enable the club manager to maintain all club records and generate required administrative and financial reports. The Monterey Navy Flying Club has one of the largest memberships of the Navy sponsored flying clubs. As a result of this large membership and the amount of manual paperwork required to properly maintain club records, the Manager's ability to provide necessary services and reports is severely hampered. The implementation of an efficient

  13. An analytical benchmark and a Mathematica program for MD codes: Testing LAMMPS on the 2nd generation Brenner potential

    NASA Astrophysics Data System (ADS)

    Favata, Antonino; Micheletti, Andrea; Ryu, Seunghwa; Pugno, Nicola M.

    2016-10-01

    An analytical benchmark and a simple consistent Mathematica program are proposed for graphene and carbon nanotubes, which may serve to test any molecular dynamics code implemented with REBO potentials. By exploiting the benchmark, we checked the results produced by LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator) when adopting the second-generation Brenner potential. We show that this code, in its current implementation, produces results which are offset from those of the benchmark by a significant amount, and we provide evidence of the reason.

  14. The development of non-coding RNA ontology.

    PubMed

    Huang, Jingshan; Eilbeck, Karen; Smith, Barry; Blake, Judith A; Dou, Dejing; Huang, Weili; Natale, Darren A; Ruttenberg, Alan; Huan, Jun; Zimmermann, Michael T; Jiang, Guoqian; Lin, Yu; Wu, Bin; Strachan, Harrison J; de Silva, Nisansa; Kasukurthi, Mohan Vamsi; Jha, Vikash Kumar; He, Yongqun; Zhang, Shaojie; Wang, Xiaowei; Liu, Zixing; Borchert, Glen M; Tan, Ming

    2016-01-01

    Identification of non-coding RNAs (ncRNAs) has been significantly improved over the past decade. On the other hand, semantic annotation of ncRNA data is facing critical challenges due to the lack of a comprehensive ontology to serve as common data elements and data exchange standards in the field. We developed the Non-Coding RNA Ontology (NCRO) to handle this situation. By providing a formally defined ncRNA controlled vocabulary, the NCRO aims to fill a specific and highly needed niche in semantic annotation of large amounts of ncRNA biological and clinical data.

  15. Multiple resonant railgun power supply

    DOEpatents

    Honig, Emanuel M.; Nunnally, William C.

    1988-01-01

    A multiple repetitive resonant railgun power supply provides energy for repetitively propelling projectiles from a pair of parallel rails. A plurality of serially connected paired parallel rails are powered by similar power supplies. Each supply comprises an energy storage capacitor, a storage inductor to form a resonant circuit with the energy storage capacitor and a magnetic switch to transfer energy between the resonant circuit and the pair of parallel rails for the propelling of projectiles. The multiple serial operation permits relatively small energy components to deliver overall relatively large amounts of energy to the projectiles being propelled.

  16. DNA Yield From Tissue Samples in Surgical Pathology and Minimum Tissue Requirements for Molecular Testing.

    PubMed

    Austin, Melissa C; Smith, Christina; Pritchard, Colin C; Tait, Jonathan F

    2016-02-01

    Complex molecular assays are increasingly used to direct therapy and provide diagnostic and prognostic information but can require relatively large amounts of DNA. Our objective was to provide data to pathologists to help them assess tissue adequacy and to give prospective guidance on the amount of tissue that should be procured. We used slide-based measurements to establish a relationship between processed tissue volume and DNA yield by A260 from 366 formalin-fixed, paraffin-embedded tissue samples submitted for the 3 most common molecular assays performed in our laboratory (EGFR, KRAS, and BRAF). We determined the average DNA yield per unit of tissue volume, and we used the distribution of DNA yields to calculate the minimum volume of tissue that should yield sufficient DNA 99% of the time. All samples with a volume greater than 8 mm³ yielded at least 1 μg of DNA, and more than 80% of samples producing less than 1 μg were extracted from less than 4 mm³ of tissue. Nine cubic millimeters of tissue should produce more than 1 μg of DNA 99% of the time. We conclude that 2 tissue cores, each 1 cm long and obtained with an 18-gauge needle, will almost always provide enough DNA for complex multigene assays, and our methodology may be readily extrapolated to individual institutional practice.
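
    As a rough arithmetic check on the closing claim (assuming a nominal 18-gauge inner diameter of about 0.84 mm, which the abstract does not state):

```latex
V_{\text{core}} = \pi r^2 L = \pi\,(0.42\ \text{mm})^2\,(10\ \text{mm}) \approx 5.5\ \text{mm}^3,
\qquad
2\,V_{\text{core}} \approx 11\ \text{mm}^3 > 9\ \text{mm}^3 .
```

    Two such cores therefore comfortably exceed the 9 mm³ threshold derived above.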

  17. Sediment budget analysis from Landslide debris and river channel change during the extreme event - example of Typhoon Morakot at Laonong river, Taiwan

    NASA Astrophysics Data System (ADS)

    Chang, Kuo-Jen; Huang, Yu-Ting; Huang, Mei-Jen; Chiang, Yi-Lin; Yeh, En-Chao; Chao, Yu-Jui

    2014-05-01

    Taiwan, owing to its high seismicity and high annual rainfall, experiences numerous landslides every year, with severe impacts across the island. Typhoon Morakot brought extreme, prolonged rainfall to Taiwan in August 2009, causing huge losses of life and property in central and southern Taiwan. Laonong River is the largest tributary of Gaoping River; its length is 137 km and its basin area is 1373 km2. Typhoon Morakot brought more than 2000 mm of rainfall to the region, with maximum rainfall exceeding 100 mm/hr. The heavy rains triggered many landslides, and debris flowed into the river, producing accumulation and erosion on river banks in different areas and causing severe disasters within the Laonong River drainage. In the past, studies of sediment blockage of river channels usually relied on field investigation, but owing to inconvenient transportation, topographic barriers, or remote locations, such surveys sometimes can hardly be completed. In recent years, the rapid development of remote sensing technology has improved image resolution and quality significantly. Remote sensing technology can provide a wide range of image data and thus essential, precious information. Furthermore, although the amount of sediment transport can be estimated by using data such as rainfall, river flux, and suspended loads, the migration of large debris cannot be studied via those data. However, landslides, debris flow, and river sediment transport in a catchment area can be evaluated readily by analyzing digital terrain models (DTMs). The purpose of this study is to investigate the phenomenon of river migration and to evaluate the amount of migration along the Laonong River by analyzing DEMs from before and after Typhoon Morakot. The DEMs were built using aerial images taken by a digital mapping camera (DMC) and by an airborne digital scanner 40 (ADS40) before and after the typhoon event. The results show that the typhoon caused serious lateral erosion of the Laonong River, especially in Yushan National Park and the midstream region, whereas lateral erosion in the downstream region is less obvious. Meanwhile, the siltation depth resulting from Typhoon Morakot is larger in the upstream region than in the midstream and downstream regions. The amount of landslide debris created by Typhoon Morakot was too excessive to be transported, so material simply silted in place in the upstream region, as in the midstream area. Because the amount of river slope erosion and sediment collapse in the downstream region is less than in the upstream and midstream regions, the amount of river erosion there is slightly larger than the amount of river siltation. The goals of this project are to decipher the sliding process and morphologic changes of large landslide areas, to quantify sediment transport and budgets, and to investigate the phenomenon of river migration. The results of this study provide not only a geomatics and GIS dataset of the hazards, but also essential geomorphologic information for other studies and for hazard mitigation and planning.
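
    The core DEM-differencing step lends itself to a short sketch. The following hypothetical Python fragment subtracts a pre-event DEM from a post-event DEM and integrates the positive and negative cells into siltation and erosion volumes; the 2 m grid spacing and the synthetic arrays are illustrative assumptions.

```python
import numpy as np

cell_area = 2.0 * 2.0                      # m^2 per DEM cell (assumed spacing)
rng = np.random.default_rng(4)
dem_before = rng.random((500, 500)) * 5    # stand-ins for real pre/post DEMs
dem_after = dem_before + rng.normal(0, 0.5, (500, 500))

dz = dem_after - dem_before                # elevation change per cell (m)
siltation = dz[dz > 0].sum() * cell_area   # m^3 deposited
erosion = -dz[dz < 0].sum() * cell_area    # m^3 removed
print(f"siltation {siltation:.0f} m^3, erosion {erosion:.0f} m^3")
```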

  18. Estimating tobacco consumption in remote Aboriginal communities using retail sales data: some challenges and opportunities.

    PubMed

    MacLaren, David; Redman-MacLaren, Michelle; Clough, Alan

    2010-07-01

    To describe and discuss challenges and opportunities encountered when estimating tobacco consumption in six remote Aboriginal communities using tobacco sales data from retail outlets. We consider tobacco sales data collected from retail outlets selling tobacco to six Aboriginal communities in two similar but separate studies. Despite challenges--including that not all outlets provided data, that data were not uniform across outlets (sales and invoice data), changes in the format of the data, personnel changes or management restructures, and anomalies in the data and changes in community populations--tobacco consumption was estimated and returned through project newsletters and community feedback sessions. Amounts of tobacco sold were reported using graphs in newsletters and pictures of items common to the community in community feedback sessions. Despite the inherent limitations of estimating tobacco consumption using tobacco sales data, returning the amount of tobacco sold to communities provided an opportunity to discuss tobacco consumption and a focal point for individual and community action. Using this method, however, may require that large and sustained changes be observed over time to evaluate whether initiatives to reduce tobacco consumption have been effective. Estimating tobacco consumption in remote Aboriginal communities using tobacco sales data from retail outlets requires careful consideration of many logistical, social, cultural and geographic challenges.

  19. Supraventricular tachycardia induced by chocolate: is chocolate too sweet for the heart?

    PubMed

    Parasramka, Saurabh; Dufresne, Alix

    2012-09-01

    Conflicting studies have been published concerning the association between chocolate and cardiovascular diseases. Fewer articles have described the potential arrhythmogenic risk related to chocolate intake. We present a case of paroxysmal supraventricular tachycardia in a woman after consumption of a large quantity of chocolate. A 53-year-old woman with no significant medical history presented to us with complaints of palpitations and shortness of breath after consuming large amounts of chocolate. Electrocardiogram showed supraventricular tachycardia at 165 beats per minute, which was restored to sinus rhythm after an adenosine bolus injection. Electrophysiology studies showed atrioventricular nodal reentry tachycardia, which was treated with radiofrequency ablation. Chocolate contains caffeine and theobromine, which are methylxanthines, competitive antagonists of adenosine with arrhythmogenic potential. Our case describes an episode of tachycardia precipitated by consumption of a large amount of chocolate in a patient with an underlying substrate. There are occasional case reports describing an association between chocolate, caffeine, and arrhythmias. A large Danish study, however, did not find any association between the amount of daily caffeine consumption and the risk of arrhythmia.

  20. A Cost Benefit Analysis of Emerging LED Water Purification Systems in Expeditionary Environments

    DTIC Science & Technology

    2017-03-23

    the initial contingency response phase, ROWPUs are powered by large generators which require relatively large amounts of fossil fuels. The amount of...they attract and cling together forming a larger particle (Chem Treat, 2016). Flocculation is the addition of a polymer to water that clumps...smaller particles together to form larger particles. The idea for both methods is that larger particles will either settle out of or be removed from the

  1. Galaxy And Mass Assembly (GAMA): the connection between metals, specific SFR and H I gas in galaxies: the Z-SSFR relation

    NASA Astrophysics Data System (ADS)

    Lara-López, M. A.; Hopkins, A. M.; López-Sánchez, A. R.; Brough, S.; Colless, M.; Bland-Hawthorn, J.; Driver, S.; Foster, C.; Liske, J.; Loveday, J.; Robotham, A. S. G.; Sharp, R. G.; Steele, O.; Taylor, E. N.

    2013-06-01

    We study the interplay between gas phase metallicity (Z), specific star formation rate (SSFR) and neutral hydrogen gas (H I) for galaxies of different stellar masses. Our study uses spectroscopic data from Galaxy and Mass Assembly and Sloan Digital Sky Survey (SDSS) star-forming galaxies, as well as H I detection from the Arecibo Legacy Fast Arecibo L-band Feed Array (ALFALFA) and Galex Arecibo SDSS Survey (GASS) public catalogues. We present a model based on the Z-SSFR relation that shows that at a given stellar mass, depending on the amount of gas, galaxies will follow opposite behaviours. Low-mass galaxies with a large amount of gas will show high SSFR and low metallicities, while low-mass galaxies with small amounts of gas will show lower SSFR and high metallicities. In contrast, massive galaxies with a large amount of gas will show moderate SSFR and high metallicities, while massive galaxies with small amounts of gas will show low SSFR and low metallicities. Using ALFALFA and GASS counterparts, we find that the amount of gas is related to those drastic differences in Z and SSFR for galaxies of a similar stellar mass.

  2. Closha: bioinformatics workflow system for the analysis of massive sequencing data.

    PubMed

    Ko, GunHwan; Kim, Pan-Gyu; Yoon, Jongcheol; Han, Gukhee; Park, Seong-Jin; Song, Wangho; Lee, Byungwook

    2018-02-19

    While next-generation sequencing (NGS) costs have fallen in recent years, the cost and complexity of computation remain substantial obstacles to the use of NGS in bio-medical care and genomic research. The rapidly increasing amounts of data available from the new high-throughput methods have made data processing infeasible without automated pipelines. The integration of data and analytic resources into workflow systems provides a solution to the problem by simplifying the task of data analysis. To address this challenge, we developed a cloud-based workflow management system, Closha, to provide fast and cost-effective analysis of massive genomic data. We implemented complex workflows making optimal use of high-performance computing clusters. Closha allows users to create multi-step analyses using drag and drop functionality and to modify the parameters of pipeline tools. Users can also import Galaxy pipelines into Closha. Closha is a hybrid system that enables users to use both analysis programs providing traditional tools and MapReduce-based big data analysis programs simultaneously in a single pipeline. Thus, the execution of analytics algorithms can be parallelized, speeding up the whole process. We also developed a high-speed data transmission solution, KoDS, to transmit a large amount of data at a fast rate. KoDS has a file transfer speed of up to 10 times that of normal FTP and HTTP. The computer hardware for Closha is 660 CPU cores and 800 TB of disk storage, enabling 500 jobs to run at the same time. Closha is a scalable, cost-effective, and publicly available web service for large-scale genomic data analysis. Closha supports the reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Closha provides a user-friendly interface that helps genomic scientists derive accurate results from NGS platform data. The Closha cloud server is freely available for use at http://closha.kobic.re.kr/.

  3. Safe Upper-Bounds Inference of Energy Consumption for Java Bytecode Applications

    NASA Technical Reports Server (NTRS)

    Navas, Jorge; Mendez-Lojo, Mario; Hermenegildo, Manuel V.

    2008-01-01

    Many space applications such as sensor networks, on-board satellite-based platforms, on-board vehicle monitoring systems, etc. handle large amounts of data and analysis of such data is often critical for the scientific mission. Transmitting such large amounts of data to the remote control station for analysis is usually too expensive for time-critical applications. Instead, modern space applications are increasingly relying on autonomous on-board data analysis. All these applications face many resource constraints. A key requirement is to minimize energy consumption. Several approaches have been developed for estimating the energy consumption of such applications (e.g. [3, 1]) based on measuring actual consumption at run-time for large sets of random inputs. However, this approach has the limitation that it is in general not possible to cover all possible inputs. Using formal techniques offers the potential for inferring safe energy consumption bounds, thus being specially interesting for space exploration and safety-critical systems. We have proposed and implemented a general framework for resource usage analysis of Java bytecode [2]. The user defines a set of resource(s) of interest to be tracked and some annotations that describe the cost of some elementary elements of the program for those resources. These values can be constants or, more generally, functions of the input data sizes. The analysis then statically derives an upper bound on the amount of those resources that the program as a whole will consume or provide, also as functions of the input data sizes. This article develops a novel application of the analysis of [2] to inferring safe upper bounds on the energy consumption of Java bytecode applications. We first use a resource model that describes the cost of each bytecode instruction in terms of the joules it consumes. With this resource model, we then generate energy consumption cost relations, which are then used to infer safe upper bounds. How energy consumption for each bytecode instruction is measured is beyond the scope of this paper. Instead, this paper is about how to infer safe energy consumption estimations assuming that those energy consumption costs are provided. For concreteness, we use a simplified version of an existing resource model [1] in which an energy consumption cost for individual Java opcodes is defined.
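
    To make the idea concrete, here is a hypothetical Python sketch of how per-instruction energy costs combine into a closed-form upper bound as a function of input size; the opcode set, joule values, and the loop cost relation are all illustrative assumptions, not the framework's actual model.

```python
# Per-opcode energy costs in joules (hypothetical values).
COST = {"iload": 4e-9, "iadd": 6e-9, "istore": 5e-9, "if_icmplt": 7e-9}

def loop_body_cost():
    """Energy of one iteration of a hypothetical summation loop."""
    return COST["iload"] + COST["iadd"] + COST["istore"] + COST["if_icmplt"]

def energy_upper_bound(n: int) -> float:
    """Closed-form bound E(n) = c_setup + n * c_body for the loop.

    A static analysis derives such expressions symbolically from the
    cost relations; here we simply evaluate the resulting bound.
    """
    c_setup = COST["iload"] + COST["istore"]
    return c_setup + n * loop_body_cost()

print(f"E(10^6) <= {energy_upper_bound(10**6):.3e} J")
```

    The point of the formal approach is that this bound holds for every input of size n, whereas run-time measurement over random inputs can only sample the input space.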

  4. Wireless in-situ Sensor Network for Agriculture and Water Monitoring on a River Basin Scale in Southern Finland: Evaluation from a Data User’s Perspective

    PubMed Central

    Kotamäki, Niina; Thessler, Sirpa; Koskiaho, Jari; Hannukkala, Asko O.; Huitu, Hanna; Huttula, Timo; Havento, Jukka; Järvenpää, Markku

    2009-01-01

    Sensor networks are increasingly being implemented for environmental monitoring and agriculture to provide spatially accurate and continuous environmental information and (near) real-time applications. These networks provide a large amount of data, which poses challenges for ensuring data quality and extracting relevant information. In the present paper we describe a river basin scale wireless sensor network for agriculture and water monitoring. The network, called SoilWeather, is unique and the first of this type in Finland. The performance of the network is assessed from the user and maintainer perspectives, concentrating on data quality, network maintenance and applications. The results showed that the SoilWeather network has been functioning in a relatively reliable way, but also that the maintenance and data quality assurance by automatic algorithms and calibration samples require a lot of effort, especially in continuous water monitoring over large areas. We see great benefits in sensor networks enabling continuous, real-time monitoring, while data quality control and maintenance efforts highlight the need for tight collaboration between sensor and sensor network owners to decrease costs and increase the quality of the sensor data in large scale applications. PMID:22574050

  5. Radiation Exposure Analyses Supporting the Development of Solar Particle Event Shielding Technologies

    NASA Technical Reports Server (NTRS)

    Walker, Steven A.; Clowdsley, Martha S.; Abston, H. Lee; Simon, Matthew A.; Gallegos, Adam M.

    2013-01-01

    NASA has plans for long duration missions beyond low Earth orbit (LEO). Outside of LEO, large solar particle events (SPEs), which occur sporadically, can deliver a very large dose in a short amount of time. The relatively low proton energies make SPE shielding practical, and the possibility of the occurrence of a large event drives the need for SPE shielding for all deep space missions. The Advanced Exploration Systems (AES) RadWorks Storm Shelter Team was charged with developing minimal mass SPE storm shelter concepts for missions beyond LEO. The concepts developed included "wearable" shields, shelters that could be deployed at the onset of an event, and augmentations to the crew quarters. The radiation transport codes, human body models, and vehicle geometry tools contained in the On-Line Tool for the Assessment of Radiation In Space (OLTARIS) were used to evaluate the protection provided by each concept within a realistic space habitat and provide the concept designers with shield thickness requirements. Several different SPE models were utilized to examine the dependence of the shield requirements on the event spectrum. This paper describes the radiation analysis methods and the results of these analyses for several of the shielding concepts.

  6. Demonstration of Hadoop-GIS: A Spatial Data Warehousing System Over MapReduce.

    PubMed

    Aji, Ablimit; Sun, Xiling; Vo, Hoang; Liu, Qiaoling; Lee, Rubao; Zhang, Xiaodong; Saltz, Joel; Wang, Fusheng

    2013-11-01

    The proliferation of GPS-enabled devices and the rapid improvement of scientific instruments have resulted in massive amounts of spatial data in the last decade. Support of high performance spatial queries on large volumes of data has become increasingly important in numerous fields, and it requires a scalable and efficient spatial data warehousing solution, as existing approaches exhibit scalability limitations and efficiency bottlenecks for large scale spatial applications. In this demonstration, we present Hadoop-GIS - a scalable and high performance spatial query system over MapReduce. Hadoop-GIS provides an efficient spatial query engine to process spatial queries, data- and space-based partitioning, and query pipelines that parallelize queries implicitly on MapReduce. Hadoop-GIS also provides an expressive, SQL-like spatial query language for workload specification. We will demonstrate how spatial queries are expressed in spatially extended SQL queries and submitted through a command line/web interface for execution. In parallel to our system demonstration, we explain the system architecture and the details of how queries are translated to MapReduce operators, optimized, and executed on Hadoop. In addition, we will showcase how the system can be used to support two representative real world use cases: large scale pathology analytical imaging, and geo-spatial data warehousing.
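    As a rough illustration of space-based partitioning (not Hadoop-GIS code), the sketch below maps point objects to grid tiles and runs a distance join independently within each tile, the way mappers and reducers would. The tile size and data are invented, and a real system would also replicate objects near tile boundaries to neighboring tiles.

    ```python
    from collections import defaultdict

    TILE = 10.0  # hypothetical tile size

    def map_to_tiles(points):
        """Map phase: assign each (id, (x, y)) point to its grid tile."""
        tiles = defaultdict(list)
        for pid, (x, y) in points:
            tiles[(int(x // TILE), int(y // TILE))].append((pid, (x, y)))
        return tiles

    def reduce_tile(objs_a, objs_b, radius=1.0):
        """Reduce phase: within one tile, emit pairs closer than `radius`."""
        pairs = []
        for ida, (xa, ya) in objs_a:
            for idb, (xb, yb) in objs_b:
                if (xa - xb) ** 2 + (ya - yb) ** 2 <= radius ** 2:
                    pairs.append((ida, idb))
        return pairs

    a = map_to_tiles([(1, (3.0, 4.0)), (2, (12.0, 4.0))])
    b = map_to_tiles([(7, (3.5, 4.2)), (8, (25.0, 9.0))])
    for tile in a.keys() & b.keys():          # tiles are processed in parallel
        print(reduce_tile(a[tile], b[tile]))  # -> [(1, 7)]
    ```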

  7. A MapReduce approach to diminish imbalance parameters for big deoxyribonucleic acid dataset.

    PubMed

    Kamal, Sarwar; Ripon, Shamim Hasnat; Dey, Nilanjan; Ashour, Amira S; Santhi, V

    2016-07-01

    In the age of the information superhighway, big data play a significant role in information processing, extraction, retrieval and management. In computational biology, the continuous challenge is to manage the biological data. Data mining techniques are sometimes imperfect for new space and time requirements. Thus, it is critical to process massive amounts of data to retrieve knowledge. The existing software and automated tools to handle big data sets are not sufficient. As a result, an expandable mining technique that enfolds the large storage and processing capability of distributed or parallel processing platforms is essential. In this analysis, a contemporary distributed clustering methodology for imbalanced data reduction using a k-nearest neighbor (K-NN) classification approach is introduced. The pivotal objective of this work is to represent real training data sets with a reduced number of elements or instances. These reduced data sets ensure faster data classification and standard storage management with less sensitivity. However, general data reduction methods cannot manage very big data sets. To minimize these difficulties, a MapReduce-oriented framework is designed using various clusters of automated contents, comprising multiple algorithmic approaches. To test the proposed approach, a real DNA (deoxyribonucleic acid) dataset that consists of 90 million pairs has been used. The proposed model reduces imbalanced data sets derived from large-scale data sets without loss of accuracy. The obtained results show that the MapReduce-based K-NN classifier provides accurate results for big DNA data. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
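    The schematic sketch below (not the paper's implementation) shows the flavor of MapReduce-style instance reduction: each partition is condensed with a simple 1-NN rule, and the retained prototypes are concatenated. The dataset and partition count are invented.

    ```python
    import math

    def nearest(prototypes, x):
        """Return the prototype (point, label) closest to x."""
        return min(prototypes, key=lambda p: math.dist(p[0], x))

    def condense(partition):
        """Keep only instances that the 1-NN rule misclassifies so far
        (a condensed-nearest-neighbour style reduction)."""
        kept = [partition[0]]
        for x, label in partition[1:]:
            if nearest(kept, x)[1] != label:
                kept.append((x, label))
        return kept

    def map_reduce_reduce(dataset, n_partitions=4):
        parts = [dataset[i::n_partitions] for i in range(n_partitions)]  # map
        return [p for part in parts for p in condense(part)]             # reduce

    data = [((float(i), float(i % 7)), i % 2) for i in range(1000)]
    print(len(map_reduce_reduce(data)), "prototypes kept from", len(data))
    ```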

  8. Mining Critical Metals and Elements from Seawater: Opportunities and Challenges.

    PubMed

    Diallo, Mamadou S; Kotte, Madhusudhana Rao; Cho, Manki

    2015-08-18

    The availability and sustainable supply of technology metals and valuable elements is critical to the global economy. There is a growing realization that the development and deployment of the clean energy technologies and sustainable products and manufacturing industries of the 21st century will require large amounts of critical metals and valuable elements including rare-earth elements (REEs), platinum group metals (PGMs), lithium, copper, cobalt, silver, and gold. Advances in industrial ecology, water purification, and resource recovery have established that seawater is an important and largely untapped source of technology metals and valuable elements. This feature article discusses the opportunities and challenges of mining critical metals and elements from seawater. We highlight recent advances and provide an outlook of the future of metal mining and resource recovery from seawater.

  9. Anatomy of a flaring proto-planetary disk around a young intermediate-mass star.

    PubMed

    Lagage, Pierre-Olivier; Doucet, Coralie; Pantin, Eric; Habart, Emilie; Duchêne, Gaspard; Ménard, François; Pinte, Christophe; Charnoz, Sébastien; Pel, Jan-Willem

    2006-10-27

    Although planets are being discovered around stars more massive than the Sun, information about the proto-planetary disks where such planets have built up is sparse. We have imaged mid-infrared emission from polycyclic aromatic hydrocarbons at the surface of the disk surrounding the young intermediate-mass star HD 97048 and characterized the disk. The disk is in an early stage of evolution, as indicated by its large content of dust and its hydrostatic flared geometry, indicative of the presence of a large amount of gas that is well mixed with dust and gravitationally stable. The disk is a precursor of debris disks found around more-evolved A stars such as beta-Pictoris and provides the rare opportunity to witness the conditions prevailing before (or during) planet formation.

  10. Bismuth Subgallate Toxicity in the Age of Online Supplement Use.

    PubMed

    Sampognaro, Paul; Vo, Kathy T; Richie, Megan; Blanc, Paul D; Keenan, Kevin

    2017-11-01

    Bismuth salts have been used to treat gastroenterological disorders and are readily available over-the-counter and via the internet. Even though generally considered safe, bismuth compounds can cause a syndrome of subacute, progressive encephalopathy when taken in large quantities. We present the case of a woman who developed progressive encephalopathy, aphasia, myoclonus, and gait instability after chronically ingesting large amounts of bismuth subgallate purchased from a major online marketing website to control symptoms of irritable bowel syndrome. After an extensive neurological work-up, elevated bismuth levels in her blood, urine, and cerebrospinal fluid confirmed the diagnosis of bismuth-related neurotoxicity. She improved slowly following cessation of exposure. This case highlights bismuth subgallate as a neurotoxic bismuth formulation and reminds providers of the potential for safety misconceptions about positively reviewed online supplements.

  11. Method for forming a chemical microreactor

    DOEpatents

    Morse, Jeffrey D [Martinez, CA; Jankowski, Alan [Livermore, CA

    2009-05-19

    Disclosed is a chemical microreactor that provides a means to generate hydrogen fuel from liquid sources such as ammonia, methanol, and butane through steam reforming processes when mixed with an appropriate amount of water. The microreactor contains capillary microchannels with integrated resistive heaters to facilitate the occurrence of catalytic steam reforming reactions. Two distinct embodiment styles are discussed. One embodiment style employs a packed catalyst capillary microchannel and at least one porous membrane. Another embodiment style employs a porous membrane with a large surface area or a porous membrane support structure containing a plurality of porous membranes having a large surface area in the aggregate, i.e., greater than about 1 m^2/cm^3. Various methods to form packed catalyst capillary microchannels, porous membranes and porous membrane support structures are also disclosed.

  12. Antibody recognition of the glycoprotein g of viral haemorrhagic septicemia virus (VHSV) purified in large amounts from insect larvae

    PubMed Central

    2011-01-01

    Background There are currently no purification methods capable of producing the large amounts of fish rhabdoviral glycoprotein G (gpG) required for diagnosis and immunisation purposes or for studying the structure and molecular mechanisms of action of this molecule (i.e. pH-dependent membrane fusion). As a result of the unavailability of large amounts of the gpG from viral haemorrhagic septicaemia rhabdovirus (VHSV), one of the most dangerous viruses affecting cultured salmonid species, research interests in this field are severely hampered. Previous purification methods to obtain recombinant gpG from VHSV in E. coli, yeast and baculovirus grown in insect cells have not produced soluble conformations or acceptable yields. The development of large-scale purification methods for gpGs will also further research into other fish rhabdoviruses, such as infectious haematopoietic necrosis virus (IHNV), spring viremia of carp virus (SVCV), hirame rhabdovirus (HIRRV) and snakehead rhabdovirus (SHRV). Findings Here we designed a method to produce milligram amounts of soluble VHSV gpG. Only the transmembrane and carboxy terminal-deleted (amino acid 21 to 465) gpG was efficiently expressed in insect larvae. Recognition of G21-465 by β-mercaptoethanol-dependent neutralizing monoclonal antibodies (N-MAbs) and pH-dependent recognition by sera from VHSV-hyperimmunized or VHSV-infected rainbow trout (Oncorhynchus mykiss) were demonstrated. Conclusions Given that the purified G21-465 conserved some of its most important properties, this method might be suitable for the large-scale production of fish rhabdoviral gpGs for use in diagnosis, fusion and antigenicity studies. PMID:21693048

  13. Antibody recognition of the glycoprotein g of viral haemorrhagic septicemia virus (VHSV) purified in large amounts from insect larvae.

    PubMed

    Encinas, Paloma; Gomez-Sebastian, Silvia; Nunez, Maria Carmen; Gomez-Casado, Eduardo; Escribano, Jose M; Estepa, Amparo; Coll, Julio

    2011-06-21

    There are currently no purification methods capable of producing the large amounts of fish rhabdoviral glycoprotein G (gpG) required for diagnosis and immunisation purposes or for studying the structure and molecular mechanisms of action of this molecule (i.e. pH-dependent membrane fusion). As a result of the unavailability of large amounts of the gpG from viral haemorrhagic septicaemia rhabdovirus (VHSV), one of the most dangerous viruses affecting cultured salmonid species, research interests in this field are severely hampered. Previous purification methods to obtain recombinant gpG from VHSV in E. coli, yeast and baculovirus grown in insect cells have not produced soluble conformations or acceptable yields. The development of large-scale purification methods for gpGs will also further research into other fish rhabdoviruses, such as infectious haematopoietic necrosis virus (IHNV), spring viremia of carp virus (SVCV), hirame rhabdovirus (HIRRV) and snakehead rhabdovirus (SHRV). Here we designed a method to produce milligram amounts of soluble VHSV gpG. Only the transmembrane and carboxy terminal-deleted (amino acid 21 to 465) gpG was efficiently expressed in insect larvae. Recognition of G21-465 by β-mercaptoethanol-dependent neutralizing monoclonal antibodies (N-MAbs) and pH-dependent recognition by sera from VHSV-hyperimmunized or VHSV-infected rainbow trout (Oncorhynchus mykiss) were demonstrated. Given that the purified G21-465 conserved some of its most important properties, this method might be suitable for the large-scale production of fish rhabdoviral gpGs for use in diagnosis, fusion and antigenicity studies.

  14. Large temporal scale and capacity subsurface bulk energy storage with CO2

    NASA Astrophysics Data System (ADS)

    Saar, M. O.; Fleming, M. R.; Adams, B. M.; Ogland-Hand, J.; Nelson, E. S.; Randolph, J.; Sioshansi, R.; Kuehn, T. H.; Buscheck, T. A.; Bielicki, J. M.

    2017-12-01

    Decarbonizing energy systems by increasing the penetration of variable renewable energy (VRE) technologies requires efficient short- to long-term energy storage. Very large amounts of energy can be stored in the subsurface as heat and/or pressure energy in order to provide both short- and long-term (seasonal) storage, depending on the implementation. This energy storage approach can be quite efficient, especially where geothermal energy is naturally added to the system. Here, we present subsurface heat and/or pressure energy storage with supercritical carbon dioxide (CO2) and discuss the system's efficiency and deployment options, as well as its advantages and disadvantages compared to several other energy storage options. CO2-based subsurface bulk energy storage has the potential to be particularly efficient and large-scale, both temporally (i.e., seasonal) and spatially. The latter refers to the amount of energy that can be stored underground using CO2 at a geologically conducive location, potentially enabling the storage of excess power from a substantial portion of the power grid. The implication is that it would be possible to employ centralized energy storage for (a substantial part of) the power grid where the geology enables CO2-based bulk subsurface energy storage, while the VRE technologies (solar, wind) are located elsewhere on that same power grid, where (solar, wind) conditions are ideal. However, this may require reinforcing the power grid's transmission lines in certain parts of the grid to enable high-load power transmission from/to a few locations.

  15. Were Ocean Impacts an Important Mechanism to Deliver Meteoritic Organic Matter to the Early Earth? Some Inferences from Eltanin

    NASA Technical Reports Server (NTRS)

    Kyte, Frank T.; Gersonde, Rainer; Kuhn, Gerhard

    2002-01-01

    Several workers have addressed the potential for extraterrestrial delivery of volatiles, including water and complex organic compounds, to the early Earth. For example, Chyba and Sagan (1992) argued that since impacts would destroy organic matter, most extraterrestrial organics must be delivered in the fine fractions of interplanetary dust. More recent computer simulations (Pierazzo and Chyba, 1999), however, have shown that substantial amounts of amino acids may survive the impacts of large (km-sized) comets and that this may exceed the amounts derived from IDPs or Miller-Urey synthesis in the atmosphere. Once an ocean developed on the early Earth, impacts of small asteroids and comets into deep-ocean basins were potentially common and may have been the most likely events to deliver large amounts of organics. The deposits of the late Pliocene impact of the Eltanin asteroid into the Bellingshausen Sea provide the only record of a deep-ocean (approx. 5 km) impact that can be used to constrain models of these events. This impact was first discovered in 1981 as an Ir anomaly in sediment cores collected by the USNS Eltanin in 1965 (Kyte et al., 1981). In 1995, Polarstern expedition ANT XII/4 made the first geological survey of the suspected impact region. Three sediment cores sampled around the San Martin seamounts (approx. 57.5S, 91W) contained well-preserved impact deposits that include disturbed ocean sediments and meteoritic impact ejecta (Gersonde et al., 1997). The latter is composed of shock-melted asteroidal materials and unmelted meteorites. In 2001, the FS Polarstern returned to the impact area during expedition ANT XVIII/5a. At least 16 cores were recovered that contain ejecta deposits. These cores and geophysical data from the expedition can be used to map the effects of the impact over a large region of the ocean floor.

  16. System Learning via Exploratory Data Analysis: Seeing Both the Forest and the Trees

    NASA Astrophysics Data System (ADS)

    Habash Krause, L.

    2014-12-01

    As the amount of observational Earth and Space Science data grows, so does the need for learning and employing data analysis techniques that can extract meaningful information from those data. Space-based and ground-based data sources from all over the world are used to inform Earth and Space environment models. However, with such a large amount of data comes a need to organize those data in a way such that trends within the data are easily discernible. This can be tricky due to the interaction between physical processes that leads to partial correlation of variables or multiple interacting sources of causality. With the suite of Exploratory Data Analysis (EDA) data mining codes available at MSFC, we have the capability to analyze large, complex data sets and quantitatively distinguish fundamentally independent effects from consequential or derived effects. We have used these techniques to examine the accuracy of ionospheric climate models with respect to trends in ionospheric parameters and space weather effects. In particular, these codes have been used to 1) provide summary "at-a-glance" surveys of large data sets through categorization and/or evolution over time to identify trends, distribution shapes, and outliers, 2) discern the underlying "latent" variables which share common sources of causality, and 3) establish a new set of basis vectors by computing Empirical Orthogonal Functions (EOFs), which represent the maximum amount of variance for each principal component. Some of these techniques are easily implemented in the classroom using standard MATLAB functions, some of the more advanced applications require the statistical toolbox, and applications to unique situations require more sophisticated levels of programming. This paper will present an overview of the range of tools available and how they might be used for a variety of time series Earth and Space Science data sets. Examples of feature recognition from both 1D and 2D (e.g. imagery) time series data sets will be presented.
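    EOFs of the kind mentioned in item 3 follow directly from the singular value decomposition of the mean-removed data matrix. A minimal sketch (in Python with NumPy rather than MATLAB, using random stand-in data) is:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 12))      # 500 time samples, 12 stations/variables
    X -= X.mean(axis=0)                 # remove the time mean at each location

    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    eofs = Vt                           # rows are the EOF spatial patterns
    pcs = U * s                         # principal-component time series
    var_explained = s**2 / np.sum(s**2) # fraction of variance per EOF

    print("variance captured by first two EOFs:", var_explained[:2].sum())
    ```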

  17. Deconstructing calsequestrin. Complex buffering in the calcium store of skeletal muscle

    PubMed Central

    Royer, Leandro; Ríos, Eduardo

    2009-01-01

    Since its discovery in 1971, calsequestrin has been recognized as the main Ca2+ binding protein inside the sarcoplasmic reticulum (SR), the organelle that stores and upon demand mobilizes Ca2+ for contractile activation of muscle. This article reviews the potential roles of calsequestrin in excitation–contraction coupling of skeletal muscle. It first considers the quantitative demands for a structure that binds Ca2+ inside the SR in view of the amounts of the ion that must be mobilized to elicit muscle contraction. It briefly discusses existing evidence, largely gathered in cardiac muscle, of two roles for calsequestrin: as Ca2+ reservoir and as modulator of the activity of Ca2+ release channels, and then considers the results of an incipient body of work that manipulates the cellular endowment of calsequestrin. The observations include evidence that both the Ca2+ buffering capacity of calsequestrin in solution and that of the SR in intact cells decay as the free Ca2+ concentration is lowered. Together with puzzling observations of increase of Ca2+ inside the SR, in cells or vesicular fractions, upon activation of Ca2+ release, this is interpreted as evidence that the Ca2+ buffering in the SR is non-linear, and is optimized for support of Ca2+ release at the physiological levels of SR Ca2+ concentration. Such non-linearity of buffering is qualitatively explained by a speculation that puts together ideas first proposed by others. The speculation pictures calsequestrin polymers as ‘wires’ that both bind Ca2+ and efficiently deliver it near the release channels. In spite of the kinetic changes, the functional studies reveal that cells devoid of calsequestrin are still capable of releasing large amounts of Ca2+ into the myoplasm, consistent with the long term viability and apparent good health of mice engineered for calsequestrin ablation. The experiments therefore suggest that other molecules are capable of providing sites for reversible binding of large amounts of Ca2+ inside the sarcoplasmic reticulum. PMID:19403601

  18. Privacy and medical information on the Internet.

    PubMed

    Nelson, Steven B

    2006-02-01

    Health-care consumers are beginning to realize the presence and value of health-care information available on the Internet, but they need to be aware of risks that may be involved. In addition to delivering information, some Web sites collect information. Though not all of the information might be classified as protected health information, consumers need to realize what is collected and how it might be used. Consumers should know a Web site's privacy policy before divulging any personal information. Health-care providers have a responsibility to know what information they are collecting and why. Web servers may collect large amounts of visitor information by default, and they should be modified to limit data collection to only what is necessary. Providers need to be cognizant of the many regulations concerning collection and disclosure of information obtained from consumers. Providers should also provide an easily understood privacy policy for users.

  19. Effect of noble gases on an atmospheric greenhouse /Titan/.

    NASA Technical Reports Server (NTRS)

    Cess, R.; Owen, T.

    1973-01-01

    Several models for the atmosphere of Titan have been investigated, taking into account various combinations of neon and argon. The investigation shows that the addition of large amounts of Ne and/or Ar will substantially reduce the hydrogen abundance required for a given greenhouse effect. The fact that a large amount of neon should be present if the atmosphere is a relic of the solar nebula is an especially attractive feature of the models, because it is hard to justify appropriate abundances of other enhancing agents.

  20. Spectral Characterization of the Wave Energy Resource for Puerto Rico (PR) and the United States Virgin Islands (USVI)

    NASA Astrophysics Data System (ADS)

    Garcia, C. G.; Canals, M.; Irizarry, A. A.

    2016-02-01

    A significant number of wave energy assessments have been carried out in recent years due to the development of ocean energy markets worldwide. Energy contained in surface gravity waves is scattered along frequency components that can be described using wave spectra. Correspondingly, characterization and quantification of harvestable wave energy is inherently dictated by the nature of the two-dimensional wave spectrum. The present study uses spectral wave data from the operational SWAN-based CariCOOS Nearshore Wave Model to evaluate the capture efficiency of multiple wave energy converters (WECs). This study revolves around accurately estimating available wave energy as a function of varying spectral distributions, effectively providing detailed insight into local wave conditions for PR and USVI and the resulting available-energy to generated-power ratio. In particular, the results provide a comprehensive characterization of three years' worth of SWAN-based datasets by outlining where higher concentrations of wave energy are localized in the spectrum. Subsequently, the aforementioned datasets were processed to quantify the amount of energy incident on two proposed sites located in PR and USVI. Results were largely influenced by local trade wind activity, which drives predominant sea states, and by the amount of North Atlantic swells that propagate towards the region. Each wave event was numerically analyzed in the frequency domain to evaluate the capacity of a WEC to perform under different spectral distribution scenarios, allowing a correlation between electrical power output and spectral energy distribution to be established.
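    For readers unfamiliar with the computation, wave power follows from integrating the spectrum against the group velocity. The sketch below uses the standard deep-water relation J = ρ g ∫ S(f) c_g(f) df with c_g = g/(4πf) and an invented spectral shape; it is a generic textbook estimate, not the study's SWAN processing.

    ```python
    import numpy as np

    RHO, G = 1025.0, 9.81               # seawater density (kg/m^3), gravity (m/s^2)

    f = np.linspace(0.05, 0.5, 100)     # frequency bins (Hz)
    df = f[1] - f[0]
    S = 0.5 * np.exp(-((f - 0.1) / 0.03) ** 2)  # hypothetical spectrum (m^2/Hz)

    cg = G / (4.0 * np.pi * f)          # deep-water group velocity (m/s)
    J = RHO * G * np.sum(S * cg) * df   # energy flux per metre of crest (W/m)
    print(f"wave power: {J / 1000.0:.1f} kW/m")
    ```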

  1. Impact of the BALLOTS Shared Cataloging System on the Amount of Change in the Library Technical Processing Department.

    ERIC Educational Resources Information Center

    Kershner, Lois M.

    The amount of change resulting from the implementation of the Bibliographic Automation of Large Library Operations using a Time-sharing System (BALLOTS) is analyzed, in terms of (1) physical room arrangement, (2) work procedure, and (3) organizational structure. Also considered is the factor of amount of time the new system has been in use.…

  2. An Earth-System Approach to Understanding the Deepwater Horizon Oil Spill

    ERIC Educational Resources Information Center

    Robeck, Edward

    2011-01-01

    The Deepwater Horizon explosion on April 20, 2010, and the subsequent release of oil into the Gulf of Mexico created an ecological disaster of immense proportions. The estimates of the amounts of oil, whether for the amount released per day or the total amount of oil disgorged from the well, call on numbers so large they defy the capacity of most…

  3. The impact of a windshield in a tipping bucket rain gauge on the reduction of losses in precipitation measurements during snowfall events

    NASA Astrophysics Data System (ADS)

    Buisan, Samuel T.; Collado, Jose Luis; Alastrue, Javier

    2016-04-01

    The amount of snow available controls the ecology and hydrological response of mountainous areas and cold regions, and affects economic activities including winter tourism, hydropower generation, floods and water supply. Accurate measurement of snowfall accumulation is critical, and a source of error, for the evaluation and verification of numerical weather forecast, hydrological and climate models. It is well known that the undercatch of solid precipitation resulting from wind-induced updrafts at the gauge orifice is the main factor affecting the quality and accuracy of snowfall precipitation measurements. This effect can be reduced by the use of different windshields. Overall, tipping bucket rain gauges (TPBRGs) provide a large percentage of precipitation amount measurements in all climate regimes, estimated at about 80% of the total of observations by automatic instruments. In the frame of the WMO-SPICE project, we compared at the Formigal-Sarrios station (Spanish Pyrenees, 1800 m a.s.l.) the measured precipitation in two heated TPBRGs, one of them protected with a single Alter windshield in order to reduce the wind bias. Results were contrasted with precipitation measured using the SPICE reference gauge (OTT Pluvio2) in a Double Fence Intercomparison Reference (DFIR). The shield reduced undercatch by up to 40% when wind speed exceeded 6 m/s, and the differences from the reference gauge reached values higher than 70%. The inaccuracy of these measurements has a significant impact on nowcasting operations and climatology in Spain, especially during some heavy snowfall episodes. Also, hydrological models showed better agreement with observed river flows when including the precipitation not accounted for during these snowfall events. The conclusions of this experiment will be used to take decisions on the suitability of installing windshields at stations characterized by a large quantity of snowfall during the winter season, which are mainly located in northern Spain.

  4. Lossless Astronomical Image Compression and the Effects of Random Noise

    NASA Technical Reports Server (NTRS)

    Pence, William

    2009-01-01

    In this paper we compare a variety of modern image compression methods on a large sample of astronomical images. We begin by demonstrating from first principles how the amount of noise in the image pixel values sets a theoretical upper limit on the lossless compression ratio of the image. We derive simple procedures for measuring the amount of noise in an image and for quantitatively predicting how much compression will be possible. We then compare the traditional technique of using the GZIP utility to externally compress the image, with a newer technique of dividing the image into tiles, and then compressing and storing each tile in a FITS binary table structure. This tiled-image compression technique offers a choice of other compression algorithms besides GZIP, some of which are much better suited to compressing astronomical images. Our tests on a large sample of images show that the Rice algorithm provides the best combination of speed and compression efficiency. In particular, Rice typically produces 1.5 times greater compression and provides much faster compression speed than GZIP. Floating point images generally contain too much noise to be effectively compressed with any lossless algorithm. We have developed a compression technique which discards some of the useless noise bits by quantizing the pixel values as scaled integers. The integer images can then be compressed by a factor of 4 or more. Our image compression and uncompression utilities (called fpack and funpack) that were used in this study are publicly available from the HEASARC web site. Users may run these stand-alone programs to compress and uncompress their own images.
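    The quantization step described above can be sketched in a few lines. The noise estimator and the choice of q below are illustrative assumptions, not the actual fpack implementation:

    ```python
    import numpy as np

    def quantize(image, sigma, q=4.0):
        """Store floating-point pixels as scaled integers, with one integer
        step equal to sigma / q, discarding noise bits below that level."""
        scale = sigma / q
        return np.round(image / scale).astype(np.int32), scale

    rng = np.random.default_rng(1)
    img = 1000.0 + rng.normal(scale=5.0, size=(256, 256))  # synthetic noisy image

    # Robust noise estimate: for Gaussian noise, the median absolute
    # pixel-to-pixel difference is about 0.954 * sigma.
    sigma = np.median(np.abs(np.diff(img, axis=1))) / 0.9539

    ints, scale = quantize(img, sigma)
    print("noise sigma ~", round(sigma, 2),
          "; integer range:", ints.min(), ints.max())
    ```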

  5. Image processing for improved eye-tracking accuracy

    NASA Technical Reports Server (NTRS)

    Mulligan, J. B.; Watson, A. B. (Principal Investigator)

    1997-01-01

    Video cameras provide a simple, noninvasive method for monitoring a subject's eye movements. An important concept is that of the resolution of the system, which is the smallest eye movement that can be reliably detected. While hardware systems are available that estimate direction of gaze in real-time from a video image of the pupil, such systems must limit image processing to attain real-time performance and are limited to a resolution of about 10 arc minutes. Two ways to improve resolution are discussed. The first is to improve the image processing algorithms that are used to derive an estimate. Off-line analysis of the data can improve resolution by at least one order of magnitude for images of the pupil. A second avenue by which to improve resolution is to increase the optical gain of the imaging setup (i.e., the amount of image motion produced by a given eye rotation). Ophthalmoscopic imaging of retinal blood vessels provides increased optical gain and improved immunity to small head movements but requires a highly sensitive camera. The large number of images involved in a typical experiment imposes great demands on the storage, handling, and processing of data. A major bottleneck had been the real-time digitization and storage of large amounts of video imagery, but recent developments in video compression hardware have made this problem tractable at a reasonable cost. Images of both the retina and the pupil can be analyzed successfully using a basic toolbox of image-processing routines (filtering, correlation, thresholding, etc.), which are, for the most part, well suited to implementation on vectorizing supercomputers.
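    As a toy example of the kind of off-line processing described (thresholding plus weighted averaging), the sketch below estimates a pupil center with sub-pixel precision via an intensity-weighted centroid. The synthetic frame and threshold are invented, and this is not the authors' algorithm.

    ```python
    import numpy as np

    def pupil_centroid(frame, threshold):
        """Estimate the pupil centre as the weighted centroid of dark pixels;
        averaging over many pixels yields sub-pixel resolution."""
        mask = frame < threshold              # the pupil is darker than the iris
        ys, xs = np.nonzero(mask)
        w = threshold - frame[ys, xs]         # darker pixels weigh more
        return np.sum(xs * w) / np.sum(w), np.sum(ys * w) / np.sum(w)

    frame = np.full((120, 160), 200.0)        # synthetic frame: bright background
    yy, xx = np.mgrid[:120, :160]
    frame[(xx - 80.3) ** 2 + (yy - 60.7) ** 2 < 15 ** 2] = 40.0  # dark pupil disk
    print(pupil_centroid(frame, 100.0))       # close to (80.3, 60.7)
    ```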

  6. THE CONTRIBUTION OF CORONAL JETS TO THE SOLAR WIND

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lionello, R.; Török, T.; Titov, V. S.

    Transient collimated plasma eruptions in the solar corona, commonly known as coronal (or X-ray) jets, are among the most interesting manifestations of solar activity. It has been suggested that these events contribute to the mass and energy content of the corona and solar wind, but the extent of these contributions remains uncertain. We have recently modeled the formation and evolution of coronal jets using a three-dimensional (3D) magnetohydrodynamic (MHD) code with thermodynamics in a large spherical domain that includes the solar wind. Our model is coupled to 3D MHD flux-emergence simulations, i.e., we use boundary conditions provided by such simulations to drive a time-dependent coronal evolution. The model includes parametric coronal heating, radiative losses, and thermal conduction, which enables us to simulate the dynamics and plasma properties of coronal jets in a more realistic manner than done so far. Here, we employ these simulations to calculate the amount of mass and energy transported by coronal jets into the outer corona and inner heliosphere. Based on observed jet-occurrence rates, we then estimate the total contribution of coronal jets to the mass and energy content of the solar wind to (0.4–3.0)% and (0.3–1.0)%, respectively. Our results are largely consistent with the few previous rough estimates obtained from observations, supporting the conjecture that coronal jets provide only a small amount of mass and energy to the solar wind. We emphasize, however, that more advanced observations and simulations (including parametric studies) are needed to substantiate this conjecture.

  7. Electroweak corrections to hadronic production of W bosons at large transverse momenta

    NASA Astrophysics Data System (ADS)

    Kühn, Johann H.; Kulesza, A.; Pozzorini, S.; Schulze, M.

    2008-07-01

    To match the precision of present and future measurements of W-boson production at hadron colliders, electroweak radiative corrections must be included in the theory predictions. In this paper we consider their effect on the transverse momentum (p_T) distribution of W bosons, with emphasis on large p_T. We evaluate the full electroweak O(α) corrections to the processes pp → W + jet and pp̄ → W + jet, including virtual and real photonic contributions. We present the explicit expressions in analytical form for the virtual corrections and provide results for the real corrections, discussing in detail the treatment of soft and collinear singularities. We also provide compact approximate expressions which are valid in the high-energy region, where the electroweak corrections are strongly enhanced by logarithms of ŝ/M_W^2. These expressions describe the complete asymptotic behaviour at one loop as well as the leading and next-to-leading logarithms at two loops. Numerical results are presented for proton-proton collisions at 14 TeV and proton-antiproton collisions at 2 TeV. The corrections are negative and their size increases with p_T. At the LHC, where transverse momenta of 2 TeV or more can be reached, the one- and two-loop corrections amount to up to -40% and +10%, respectively, and will be important for a precise analysis of W production. At the Tevatron, transverse momenta up to 300 GeV are within reach. In this case the electroweak corrections amount to up to -10% and are thus larger than the expected statistical error.

  8. GWASeq: targeted re-sequencing follow up to GWAS.

    PubMed

    Salomon, Matthew P; Li, Wai Lok Sibon; Edlund, Christopher K; Morrison, John; Fortini, Barbara K; Win, Aung Ko; Conti, David V; Thomas, Duncan C; Duggan, David; Buchanan, Daniel D; Jenkins, Mark A; Hopper, John L; Gallinger, Steven; Le Marchand, Loïc; Newcomb, Polly A; Casey, Graham; Marjoram, Paul

    2016-03-03

    For the last decade the conceptual framework of the Genome-Wide Association Study (GWAS) has dominated the investigation of human disease and other complex traits. While GWAS have been successful in identifying a large number of variants associated with various phenotypes, the overall amount of heritability explained by these variants remains small. This raises the question of how best to follow up on a GWAS, localize the causal variants accounting for GWAS hits, and as a consequence explain more of the so-called "missing" heritability. Advances in high throughput sequencing technologies now allow for the efficient and cost-effective collection of vast amounts of fine-scale genomic data to complement GWAS. We investigate these issues using a colon cancer dataset. After QC, our data consisted of 1993 cases and 899 controls. Using marginal tests of association, we identify 10 variants distributed among six targeted regions that are significantly associated with colorectal cancer, with eight of the variants being novel to this study. Additionally, we perform so-called 'SNP-set' tests of association and identify two sets of variants that implicate both common and rare variants in the etiology of colorectal cancer. Here we present a large-scale targeted re-sequencing resource focusing on genomic regions implicated in colorectal cancer susceptibility previously identified in several GWAS, which aims to 1) provide fine-scale targeted sequencing data for fine-mapping and 2) provide data resources to address methodological questions regarding the design of sequencing-based follow-up studies to GWAS. Additionally, we show that this strategy successfully identifies novel variants associated with colorectal cancer susceptibility and can implicate both common and rare variants.
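    A burden-style collapsing test is one common form of 'SNP-set' test. The sketch below, with simulated genotypes and not necessarily the exact test used in the study, illustrates the idea of testing a region's combined rare-variant carrier counts against case status:

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    rng = np.random.default_rng(42)
    # Hypothetical 0/1/2 genotypes for 10 rare variants in one target region.
    cases = rng.binomial(2, 0.02, size=(1993, 10))
    controls = rng.binomial(2, 0.01, size=(899, 10))

    def carriers(g):
        """Number of subjects carrying >= 1 rare allele anywhere in the set."""
        return int((g.sum(axis=1) > 0).sum())

    table = [[carriers(cases), len(cases) - carriers(cases)],
             [carriers(controls), len(controls) - carriers(controls)]]
    chi2, p, _, _ = chi2_contingency(table)
    print("burden test p-value:", p)
    ```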

  9. Design of capacity incentive and energy compensation for demand response programs

    NASA Astrophysics Data System (ADS)

    Liu, Zhoubin; Cui, Wenqi; Shen, Ran; Hu, Yishuang; Wu, Hui; Ye, Chengjin

    2018-02-01

    Variability and uncertainties caused by renewable energy sources have called for large amounts of balancing services. Demand side resources (DSRs) can be a good alternative to traditional generating units for providing balancing services. In areas where the electricity market has not been fully established, e.g., China, DSRs can help balance the power system through incentive-based demand response programs. However, there is a lack of information about the interruption cost of consumers in these areas, making it hard to determine a rational amount of capacity incentive and energy compensation for the participants of demand response programs. This paper proposes an algorithm to calculate the amount of capacity incentive and energy compensation for demand response programs when information about interruption cost is lacking. Available statistical information on interruption cost in referenced areas is selected as the reference data. The interruption cost of the targeted area is converted from the referenced area by product per electricity consumption. On this basis, capacity incentive and energy compensation are obtained so as to minimize the payment to consumers. Moreover, the loss of consumers is guaranteed to be covered by the revenue they earn from load serving entities.
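    One way to read the stated design goal, minimizing the payment while covering participants' losses, is as a small linear program. The sketch below, with entirely invented numbers and a hypothetical per-group coverage constraint, is an interpretation of that goal rather than the paper's algorithm:

    ```python
    from scipy.optimize import linprog

    # Per participant group: enrolled capacity (kW), expected curtailed
    # energy (kWh), and estimated interruption cost (all values invented).
    groups = [(100.0, 300.0, 900.0),
              (50.0,  250.0, 600.0),
              (200.0, 400.0, 1500.0)]

    total_cap = sum(g[0] for g in groups)
    total_en = sum(g[1] for g in groups)

    # Variables x = [capacity_price, energy_price]; minimise total payment.
    c = [total_cap, total_en]
    # Coverage per group: cap*K_i + en*E_i >= loss_i, written as <= for linprog.
    A_ub = [[-k, -e] for k, e, _ in groups]
    b_ub = [-loss for _, _, loss in groups]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print("capacity incentive %.2f /kW, energy compensation %.2f /kWh"
          % tuple(res.x))
    ```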

  10. Dutch food bank parcels do not meet nutritional guidelines for a healthy diet.

    PubMed

    Neter, Judith E; Dijkstra, S Coosje; Visser, Marjolein; Brouwer, Ingeborg A

    2016-08-01

    Nutritional intakes of food bank recipients and consequently their health status largely rely on the availability and quality of donated food in provided food parcels. In this cross-sectional study, the nutritional quality of ninety-six individual food parcels was assessed and compared with the Dutch nutritional guidelines for a healthy diet. Furthermore, we assessed how food bank recipients use the contents of the food parcel. Therefore, 251 Dutch food bank recipients from eleven food banks throughout the Netherlands filled out a general questionnaire. The provided amounts of energy (19 849 (sd 162 615) kJ (4744 (sd 38 866) kcal)), protein (14·6 energy percentages (en%)) and SFA (12·9 en%) in a single-person food parcel for one single day were higher than the nutritional guidelines, whereas the provided amounts of fruits (97 (sd 1441) g) and fish (23 (sd 640) g) were lower. The number of days for which macronutrients, fruits, vegetables and fish were provided for a single-person food parcel ranged from 1·2 (fruits) to 11·3 (protein) d. Of the participants, only 9·5 % bought fruits and 4·6 % bought fish to supplement the food parcel, 39·4 % used all foods provided and 75·7 % were (very) satisfied with the contents of the food parcel. Our study shows that the nutritional content of food parcels provided by Dutch food banks is not in line with the nutritional guidelines. Improving the quality of the parcels is likely to positively impact the dietary intake of this vulnerable population subgroup.

  11. Coupling a basin erosion and river sediment transport model into a large scale hydrological model: an application in the Amazon basin

    NASA Astrophysics Data System (ADS)

    Buarque, D. C.; Collischonn, W.; Paiva, R. C. D.

    2012-04-01

    This study presents the first application and preliminary results of the large scale hydrodynamic/hydrological model MGB-IPH with a new module to predict the spatial distribution of basin erosion and river sediment transport at a daily time step. MGB-IPH is a large-scale, distributed and process based hydrological model that uses a catchment based discretization and the Hydrological Response Units (HRU) approach. It uses physically based equations to simulate the hydrological processes, such as the Penman-Monteith model for evapotranspiration, and uses the Muskingum-Cunge approach and a full 1D hydrodynamic model for river routing, including backwater effects and seasonal flooding. The sediment module of the MGB-IPH model is divided into two components: 1) prediction of erosion over the basin and sediment yield to the river network; 2) sediment transport along the river channels. Both MGB-IPH and the sediment module use GIS tools to display relevant maps and to extract parameters from the SRTM DEM (a 15" resolution was adopted). Using the catchment discretization, the sediment module applies the Modified Universal Soil Loss Equation to predict soil loss from each HRU, considering three sediment classes defined according to soil texture: sand, silt and clay. The effects of topography on soil erosion are estimated by a two-dimensional slope length (LS) factor, computed using the contributing area approach, and a local slope steepness (S), both estimated for each DEM pixel using GIS algorithms. The amount of sediment released to the catchment river reach each day is calculated using a linear reservoir. Once the sediment reaches the river, it is transported along the river channel using an advection equation for silt and clay and a sediment continuity equation for sand. A sediment balance based on the Yang sediment transport capacity, allowing the computation of the amount of erosion and deposition along the rivers, is performed for sand particles as bed load, whilst no erosion or deposition is allowed for silt and clay. The model was first applied to the Madeira River basin, one of the major tributaries of the Amazon River (~1.4×10^6 km^2), accounting for 35% of the suspended sediment annually transported by the Amazon River to the ocean. Model results agree with observed data, mainly at monthly and annual time scales. The spatial distribution of soil erosion within the basin showed a large amount of sediment being delivered from the Andean regions of Bolivia and Peru. The spatial distribution of mean annual sediment along the river network showed that the Madre de Dios, Mamoré and Beni rivers transport the largest amounts of sediment. Simulated daily suspended solid discharge agrees with observed data. The model is able to provide temporally and spatially distributed estimates of soil loss over the basin, to indicate locations with a tendency for erosion or deposition along the rivers, and to reproduce long term sediment yield at several locations. Although model results are encouraging, further effort is needed to validate the model considering the scarcity of data at large scale.
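    For reference, the MUSLE event sediment yield is commonly written (Williams, 1975) as 11.8 (Q q_p)^0.56 K LS C P. A minimal sketch with illustrative inputs, not the study's calibrated values, follows:

    ```python
    def musle_sediment_yield(runoff_volume_m3, peak_flow_m3s, K, LS, C, P):
        """Event sediment yield in tonnes:
        Y = 11.8 * (Q * qp)^0.56 * K * LS * C * P."""
        return 11.8 * (runoff_volume_m3 * peak_flow_m3s) ** 0.56 * K * LS * C * P

    # Hypothetical HRU: 5000 m^3 of runoff peaking at 2.5 m^3/s on
    # moderately erodible cropland (K, LS, C, P values are invented).
    print(musle_sediment_yield(5000.0, 2.5, K=0.3, LS=1.2, C=0.2, P=1.0))
    ```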

  12. Toward high-throughput genotyping: dynamic and automatic software for manipulating large-scale genotype data using fluorescently labeled dinucleotide markers.

    PubMed

    Li, J L; Deng, H; Lai, D B; Xu, F; Chen, J; Gao, G; Recker, R R; Deng, H W

    2001-07-01

    To efficiently manipulate large amounts of genotype data generated with fluorescently labeled dinucleotide markers, we developed a Microsoft database management system. The system offers several advantages. First, it accommodates the dynamic nature of the accumulation of genotype data during the genotyping process; some data need to be confirmed or replaced by repeat lab procedures. Raw genotype data can be imported easily and continuously and incorporated into the database during a genotyping process that may continue over an extended period of time in large projects. Second, almost all of the procedures are automatic, including auto-comparison of raw data read by different technicians from the same gel, auto-adjustment among allele fragment-size data from cross-runs or cross-platforms, auto-binning of alleles, and auto-compilation of genotype data for suitable programs to perform inheritance checks in pedigrees. Third, the system provides functions to track electrophoresis gel files to locate the gel or sample sources for any resultant genotype data, which is extremely helpful for double-checking the consistency of raw and final data and for directing repeat experiments. In addition, the user-friendly graphic interface renders processing of large amounts of data much less labor-intensive. Furthermore, the system has built-in mechanisms to detect some genotyping errors and to assess the quality of genotype data, which are then summarized in automatically generated statistical reports. The system can easily handle >500,000 genotype data entries, a number more than sufficient for typical whole-genome linkage studies. The modules and programs we developed can be extended to other database platforms, such as Microsoft SQL Server, if the capability to handle still greater quantities of genotype data simultaneously is desired.
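    As one concrete example of the automation described, the toy sketch below implements a simple gap-based allele-binning rule: fragment-size readings separated by gaps larger than a tolerance form separate bins. The tolerance and readings are invented, and this is not the published system's algorithm.

    ```python
    def bin_alleles(sizes, tol=0.6):
        """Group sorted fragment sizes into bins split at gaps > tol,
        keyed by each bin's mean size."""
        ordered = sorted(sizes)
        bins, current = [], [ordered[0]]
        for s in ordered[1:]:
            if s - current[-1] <= tol:
                current.append(s)
            else:
                bins.append(current)
                current = [s]
        bins.append(current)
        return {round(sum(b) / len(b), 2): b for b in bins}

    readings = [150.1, 150.3, 149.9, 152.2, 152.4, 154.0, 154.1]
    print(bin_alleles(readings))   # three bins near 150.1, 152.3 and 154.05
    ```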

  13. Conflict Misleads Large Carnivore Management and Conservation: Brown Bears and Wolves in Spain.

    PubMed

    Fernández-Gil, Alberto; Naves, Javier; Ordiz, Andrés; Quevedo, Mario; Revilla, Eloy; Delibes, Miguel

    2016-01-01

    Large carnivores inhabiting human-dominated landscapes often interact with people and their properties, leading to conflict scenarios that can mislead carnivore management and, ultimately, jeopardize conservation. In northwest Spain, brown bears Ursus arctos are strictly protected, whereas sympatric wolves Canis lupus are subject to lethal control. We explored ecological, economic and societal components of conflict scenarios involving large carnivores and damages to human properties. We analyzed the relation between complaints of depredations by bears and wolves on beehives and livestock, respectively, and bear and wolf abundance, livestock heads, number of culled wolves, amount of paid compensations, and media coverage. We also evaluated the efficiency of wolf culling to reduce depredations on livestock. Bear damages to beehives correlated positively to the number of female bears with cubs of the year. Complaints of wolf predation on livestock were unrelated to livestock numbers; instead, they correlated positively to the number of wild ungulates harvested during the previous season, the number of wolf packs, and to wolves culled during the previous season. Compensations for wolf complaints were fivefold higher than for bears, but media coverage of wolf damages was thirtyfold higher. Media coverage of wolf damages was unrelated to the actual costs of wolf damages, but the amount of news correlated positively to wolf culling. However, wolf culling was followed by an increase in compensated damages. Our results show that culling of the wolf population failed in its goal of reducing damages, and suggest that management decisions are at least partly mediated by press coverage. We suggest that our results provide insight to similar scenarios, where several species of large carnivores share the landscape with humans, and management may be reactive to perceived conflicts.

  14. Conflict Misleads Large Carnivore Management and Conservation: Brown Bears and Wolves in Spain

    PubMed Central

    Fernández-Gil, Alberto; Naves, Javier; Ordiz, Andrés; Quevedo, Mario; Revilla, Eloy; Delibes, Miguel

    2016-01-01

    Large carnivores inhabiting human-dominated landscapes often interact with people and their properties, leading to conflict scenarios that can mislead carnivore management and, ultimately, jeopardize conservation. In northwest Spain, brown bears Ursus arctos are strictly protected, whereas sympatric wolves Canis lupus are subject to lethal control. We explored ecological, economic and societal components of conflict scenarios involving large carnivores and damages to human properties. We analyzed the relation between complaints of depredations by bears and wolves on beehives and livestock, respectively, and bear and wolf abundance, livestock heads, number of culled wolves, amount of paid compensations, and media coverage. We also evaluated the efficiency of wolf culling to reduce depredations on livestock. Bear damages to beehives correlated positively to the number of female bears with cubs of the year. Complaints of wolf predation on livestock were unrelated to livestock numbers; instead, they correlated positively to the number of wild ungulates harvested during the previous season, the number of wolf packs, and to wolves culled during the previous season. Compensations for wolf complaints were fivefold higher than for bears, but media coverage of wolf damages was thirtyfold higher. Media coverage of wolf damages was unrelated to the actual costs of wolf damages, but the amount of news correlated positively to wolf culling. However, wolf culling was followed by an increase in compensated damages. Our results show that culling of the wolf population failed in its goal of reducing damages, and suggest that management decisions are at least partly mediated by press coverage. We suggest that our results provide insight to similar scenarios, where several species of large carnivores share the landscape with humans, and management may be reactive to perceived conflicts. PMID:26974962

  15. Learning Setting-Generalized Activity Models for Smart Spaces

    PubMed Central

    Cook, Diane J.

    2011-01-01

    The data mining and pervasive computing technologies found in smart homes offer unprecedented opportunities for providing context-aware services, including health monitoring and assistance to individuals experiencing difficulties living independently at home. In order to provide these services, smart environment algorithms need to recognize and track activities that people normally perform as part of their daily routines. However, activity recognition has typically involved gathering and labeling large amounts of data in each setting to learn a model for activities in that setting. We hypothesize that generalized models can be learned for common activities that span multiple environment settings and resident types. We describe our approach to learning these models and demonstrate the approach using eleven CASAS datasets collected in seven environments. PMID:21461133

  16. An evaluation of contractor projected and actual costs

    NASA Technical Reports Server (NTRS)

    Kwiatkowski, K. A.; Buffalano, C.

    1974-01-01

    GSFC contractors with cost-plus contracts provide cost estimates for each of the next four quarters on a quarterly basis. Actual expenditures over a two-year period were compared to the estimates, and the data were sorted in different ways to answer several questions and to quantify observations, such as: How much does the accuracy of estimates degrade as they are made further into the future? Are estimates made for small dollar amounts more accurate than those made for large dollar amounts? Other government agencies and private companies with cost-plus contracts may be interested in this analysis as a potential method of contract management for their organizations. It provides them with the different methods one organization is beginning to use to control costs.

  17. Maximizing Macromolecule Crystal Size for Neutron Diffraction Experiments

    NASA Technical Reports Server (NTRS)

    Judge, R. A.; Kephart, R.; Leardi, R.; Myles, D. A.; Snell, E. H.; vanderWoerd, M.; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    A challenge in neutron diffraction experiments is growing large (greater than 1 cu mm) macromolecule crystals. In taking up this challenge we have used statistical experiment design techniques to quickly identify crystallization conditions under which the largest crystals grow. These techniques provide the maximum information for minimal experimental effort, allowing optimal screening of crystallization variables in a simple experimental matrix, using the minimum amount of sample. Analysis of the results quickly tells the investigator which conditions are the most important for the crystallization. These can then be used to maximize the crystallization results in terms of reducing crystal numbers and providing large crystals of suitable habit. We have used these techniques to grow large crystals of glucose isomerase. Glucose isomerase is an industrial enzyme used extensively in the food industry for the conversion of glucose to fructose. The aim of this study is the elucidation of the enzymatic mechanism at the molecular level. The accurate determination of hydrogen positions, which is critical for this, is a requirement that neutron diffraction is uniquely suited for. Preliminary neutron diffraction experiments with these crystals conducted at the Institut Laue-Langevin (Grenoble, France) reveal diffraction to beyond 2.5 angstrom. Macromolecular crystal growth is a process involving many parameters, and statistical experimental design is naturally suited to this field. These techniques are sample independent and provide an experimental strategy to maximize crystal volume and habit for neutron diffraction studies.
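    A two-level full-factorial matrix is the simplest instance of such a screening design. The sketch below enumerates the runs for three hypothetical crystallization factors (names and levels invented):

    ```python
    from itertools import product

    # Hypothetical crystallization factors at low/high screening levels.
    factors = {
        "protein_mg_ml": (20, 40),
        "precipitant_pct": (10, 30),
        "pH": (6.5, 8.0),
    }

    # Each run is one combination of levels: 2^3 = 8 experiments.
    runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
    for i, run in enumerate(runs, 1):
        print(i, run)

    # Once crystal size is scored per run, the main effect of a factor is the
    # mean response at its high level minus the mean at its low level.
    ```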

  18. Interior properties of the inner Saturnian moons from space astrometry data

    NASA Astrophysics Data System (ADS)

    Lainey, Valery; Noyelles, Benoît; Cooper, Nick; Murray, Carl; Park, Ryan; Rambaux, Nicolas

    2018-04-01

    During thirteen years in orbit around Saturn before its final plunge, the Cassini spacecraft provided more than ten thousand astrometric measurements. Such large amounts of accurate data enable the search for extremely faint signals in the orbital motion of the moons. Among those, the detection of the dynamical feedback of the rotation of the inner moons of Saturn on their respective orbits becomes possible. Using all the currently available astrometric data associated with Atlas, Prometheus, Pandora, Janus and Epimetheus, we provide a detailed analysis of the ISS data, with special emphasis on their statistical behavior and source of biases. Then, we try quantifying the physical librations of Prometheus, Pandora, Epimetheus and Janus from the monitoring of their orbits. Last, we show how introducing measurements directly derived from imaging can provide tighter constraints on these quantities.

  19. Quick Estimation Model for the Concentration of Indoor Airborne Culturable Bacteria: An Application of Machine Learning

    PubMed Central

    Liu, Zhijian; Li, Hao; Cao, Guoqing

    2017-01-01

    Indoor airborne culturable bacteria are sometimes harmful to human health. Therefore, a quick estimation of their concentration is particularly necessary. However, measuring the indoor microorganism concentration (e.g., bacteria) usually requires a large amount of time, economic cost, and manpower. In this paper, we aim to provide a quick solution: using knowledge-based machine learning to provide quick estimation of the concentration of indoor airborne culturable bacteria only with the inputs of several measurable indoor environmental indicators, including: indoor particulate matter (PM2.5 and PM10), temperature, relative humidity, and CO2 concentration. Our results show that a general regression neural network (GRNN) model can sufficiently provide a quick and decent estimation based on the model training and testing using an experimental database with 249 data groups. PMID:28758941
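    A GRNN is essentially a kernel-weighted average of training targets (Nadaraya-Watson regression). The sketch below uses random stand-in data for the five indicators, and the bandwidth sigma is an assumed value:

    ```python
    import numpy as np

    def grnn_predict(X_train, y_train, X_query, sigma=0.5):
        """GRNN: y(x) = sum_i w_i y_i / sum_i w_i,
        with w_i = exp(-||x - x_i||^2 / (2 sigma^2))."""
        d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        w = np.exp(-d2 / (2.0 * sigma ** 2))
        return (w @ y_train) / w.sum(axis=1)

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(249, 5))   # 249 groups x 5 indicators (PM2.5, PM10, T, RH, CO2)
    y = X @ np.array([3.0, 1.0, 0.5, 2.0, 1.5]) + rng.normal(0, 0.1, 249)
    print(grnn_predict(X, y, X[:3]))  # smoothed estimates near the true values
    ```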

  20. From data to the decision: A software architecture to integrate predictive modelling in clinical settings.

    PubMed

    Martinez-Millana, A; Fernandez-Llatas, C; Sacchi, L; Segagni, D; Guillen, S; Bellazzi, R; Traver, V

    2015-08-01

    The application of statistics and mathematics to large amounts of data is providing healthcare systems with new tools for screening and managing multiple diseases. Nonetheless, these tools have many technical and clinical limitations because they are tied to datasets with specific characteristics. This proposition paper describes a novel architecture focused on providing a validation framework for discrimination and prediction models in the screening of Type 2 diabetes. To that end, the architecture has been designed to gather different data sources under a common data structure and to be controlled by a centralized component (the Orchestrator) in charge of directing the interaction flows among data sources, models, and graphical user interfaces. This approach aims to overcome the data dependency of the models by providing a framework for validating them as they are used within clinical settings.
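
    A toy sketch of the centralized Orchestrator idea is given below; all class, method, and data names are hypothetical stand-ins, not taken from the paper.

    ```python
    from typing import Callable, Dict, List

    class Orchestrator:
        """Routes records from data sources through models to user interfaces."""
        def __init__(self) -> None:
            self.sources: Dict[str, Callable[[], List[dict]]] = {}
            self.models: Dict[str, Callable[[dict], float]] = {}
            self.views: List[Callable[[str, str, float], None]] = []

        def register_source(self, name, fetch): self.sources[name] = fetch
        def register_model(self, name, score): self.models[name] = score
        def register_view(self, push): self.views.append(push)

        def run(self) -> None:
            # Direct the interaction flow: fetch, score, then display.
            for src, fetch in self.sources.items():
                for record in fetch():
                    for model, score in self.models.items():
                        for push in self.views:
                            push(src, model, score(record))

    # Hypothetical wiring: one EHR-like source, one screening model, one view.
    orc = Orchestrator()
    orc.register_source("ehr", lambda: [{"age": 54, "bmi": 31.2}])
    orc.register_model("t2d_screen", lambda r: 0.01 * r["age"] + 0.02 * r["bmi"])
    orc.register_view(lambda s, m, v: print(f"{s}/{m}: risk={v:.2f}"))
    orc.run()
    ```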

  1. FISH Oracle 2: a web server for integrative visualization of genomic data in cancer research.

    PubMed

    Mader, Malte; Simon, Ronald; Kurtz, Stefan

    2014-03-31

    A comprehensive view on all relevant genomic data is instrumental for understanding the complex patterns of molecular alterations typically found in cancer cells. One of the most effective ways to rapidly obtain an overview of genomic alterations in large amounts of genomic data is the integrative visualization of genomic events. We developed FISH Oracle 2, a web server for the interactive visualization of the different kinds of downstream-processed genomics data typically available in cancer research. A powerful search interface and a fast visualization engine provide highly interactive visualization of such data. High-quality image export enables life scientists to easily communicate their results. A comprehensive data administration component keeps track of the available data sets. We applied FISH Oracle 2 to published data and found evidence that, in colorectal cancer cells, the gene TTC28 may be inactivated in two different ways, a fact that had not been published before. The interactive nature of FISH Oracle 2 and the possibility to store, select and visualize large amounts of downstream-processed data support life scientists in generating hypotheses. The export of high-quality images supports explanatory data visualization, simplifying the communication of new biological findings. A FISH Oracle 2 demo server and the software are available at http://www.zbh.uni-hamburg.de/fishoracle.

  2. Anomalous Ion Heating, Intrinsic and Induced Rotation in the Pegasus Toroidal Experiment

    NASA Astrophysics Data System (ADS)

    Burke, M. G.; Barr, J. L.; Bongard, M. W.; Fonck, R. J.; Hinson, E. T.; Perry, J. M.; Redd, A. J.; Thome, K. E.

    2014-10-01

    Pegasus plasmas are initiated through either standard, MHD stable, inductive current drive or non-solenoidal local helicity injection (LHI) current drive with strong reconnection activity, providing a rich environment to study ion dynamics. During LHI discharges, a large amount of anomalous impurity ion heating has been observed, with Ti ~ 800 eV but Te < 100 eV. The ion heating is hypothesized to be a result of large-scale magnetic reconnection activity, as the amount of heating scales with increasing fluctuation amplitude of the dominant, edge localized, n = 1 MHD mode. Chordal Ti spatial profiles indicate centrally peaked temperatures, suggesting a region of good confinement near the plasma core surrounded by a stochastic region. LHI plasmas are observed to rotate, perhaps due to an inward radial current generated by the stochastization of the plasma edge by the injected current streams. H-mode plasmas are initiated using a combination of high-field side fueling and Ohmic current drive. This regime shows a significant increase in rotation shear compared to L-mode plasmas. In addition, these plasmas have been observed to rotate in the counter-Ip direction without any external momentum sources. The intrinsic rotation direction is consistent with predictions from the saturated Ohmic confinement regime. Work supported by US DOE Grant DE-FG02-96ER54375.

  3. Large repayments of premium subsidies may be owed to the IRS if family income changes are not promptly reported.

    PubMed

    Jacobs, Ken; Graham-Squire, Dave; Gould, Elise; Roby, Dylan

    2013-09-01

    Subsidies for health insurance premiums under the Affordable Care Act are refundable tax credits. They can be taken when taxes are filed or in advance, as reductions in monthly premiums that must be reconciled at tax filing. Recipients who take subsidies in advance will receive tax refunds if their subsidies were too small but will have to make repayments if their subsidies were too high. We analyzed predicted repayments and refunds for people receiving subsidies, using California as a case study. We found that many families could owe large repayments to the Internal Revenue Service at their next tax filing. If income changes were reported and credits adjusted in a timely manner throughout the tax year, the number of filers owing repayments would be reduced by 7-41 percent and the median size of repayments reduced by as much as 61 percent (depending on the level of changes reported and the method used to adjust the subsidy amounts). We recommend that the health insurance exchanges mandated by the Affordable Care Act educate consumers about how the subsidies work and the need to promptly report income changes. We also recommend that they provide tools and assistance to determine the amount of subsidies that enrollees should take in advance.

  4. Test Data Monitor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosas, Joseph

    The National Security Campus (NSC) collects a large amount of test data used to accept high-value, high-rigor product. The data has historically been used to support root cause analysis when anomalies are detected in downstream processes. The opportunity to use the data for predictive failure analysis, however, had never been exploited. The primary goal of the Test Data Monitor (TDM) software is to provide automated capabilities to analyze data in near-real-time and report trends that foreshadow actual product failures. To date, the aerospace industry as a whole has struggled to utilize collected data to the degree that modern technology allows. As a result of the innovation behind TDM, Honeywell is able to monitor millions of data points through a multitude of SPC algorithms continuously and autonomously, so that personnel can more efficiently and accurately direct their attention to suspect processes or features. TDM's capabilities have been recognized by our U.S. Department of Energy National Nuclear Security Administration (NNSA) sponsor for potential use at other sites within the NNSA. This activity supports multiple initiatives, including expectations of the NNSA and broader corporate goals that center on data-based quality controls in production.
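
    The record does not name TDM's actual algorithms, so purely as an illustration of SPC-style screening, here is a minimal control-chart check applying two classic Western Electric-style rules to a data series.

    ```python
    import statistics

    def spc_flags(series, window=30):
        """Flag indices that break two classic control-chart rules:
        a point beyond 3 sigma, or 9 consecutive points on one side of
        the mean. (A simplified stand-in for TDM's SPC screening.)"""
        mu = statistics.fmean(series[:window])   # baseline from early data
        sd = statistics.stdev(series[:window])
        flags = []
        for i, x in enumerate(series):
            beyond_3sigma = abs(x - mu) > 3 * sd
            run_of_9 = i >= 8 and all(
                (v > mu) == (x > mu) for v in series[i - 8:i]
            )
            if beyond_3sigma or run_of_9:
                flags.append(i)
        return flags

    series = [0.1, -0.1, 0.2, 0.0, -0.2, 0.1] * 5 + [2.5]
    print(spc_flags(series))   # the final out-of-control point is flagged
    ```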

  5. Action of the chemical agent fipronil on the reproductive process of semi-engorged females of the tick Rhipicephalus sanguineus (Latreille, 1806) (Acari: Ixodidae). Ultrastructural evaluation of ovary cells.

    PubMed

    Oliveira, Patrícia Rosa de; Bechara, Gervásio Henrique; Morales, Maria Aparecida Marin; Mathias, Maria Izabel Camargo

    2009-06-01

    The ovary of the tick Rhipicephalus sanguineus consists of a wall of epithelial cells and a large number of oocytes in five different developmental stages (I-V), which are attached to the wall by a pedicel. The present study provides ultrastructural information on the dose-response effects of the acaricide fipronil (Frontline) on the ovaries of semi-engorged R. sanguineus females and demonstrates some of the defense mechanisms oocytes use to protect themselves against this chemical agent. Individuals were divided into four groups: group I served as the control, while groups II, III and IV were treated with fipronil at concentrations of 1, 5 and 10 ppm, respectively. Fipronil at 10 ppm had the strongest effect on oocyte development; at this concentration, even oocytes that reached the final developmental stage exhibited damaged cell structures. Moreover, damage to cellular components such as the plasma membrane, mitochondria and protein granules (reflecting altered protein synthesis), as well as cellular defense mechanisms such as increased numbers of cytoplasmic microtubules and large amounts of digestive vacuoles and myelin figures, could only be observed in fipronil-treated R. sanguineus by ultrastructural examination.

  6. A comparative evaluation of supervised and unsupervised representation learning approaches for anaplastic medulloblastoma differentiation

    NASA Astrophysics Data System (ADS)

    Cruz-Roa, Angel; Arevalo, John; Basavanhally, Ajay; Madabhushi, Anant; González, Fabio

    2015-01-01

    Learning data representations directly from the data itself is an approach that has shown great success in different pattern recognition problems, outperforming state-of-the-art feature extraction schemes for tasks in computer vision, speech recognition and natural language processing. Representation learning applies unsupervised and supervised machine learning methods to large amounts of data to find building blocks that better represent the information in it. Digitized histopathology images represent a very good testbed for representation learning since they involve large amounts of highly complex visual data. This paper presents a comparative evaluation of different supervised and unsupervised representation learning architectures to address open questions about which type of architecture (deep or shallow) and which type of learning (unsupervised or supervised) is optimal. We limit ourselves to addressing these questions in the context of distinguishing between anaplastic and non-anaplastic medulloblastomas in routine haematoxylin and eosin stained images. The unsupervised approaches evaluated were sparse autoencoders and topographic reconstruction independent component analysis, and the supervised approach was convolutional neural networks. Experimental results show that shallow architectures with more neurons outperform deeper architectures when local spatial invariances are not taken into account, and that topographic constraints provide features invariant to scale and rotation that are useful for efficient tumor differentiation.
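
    As a rough illustration of the unsupervised branch, the sketch below trains a plain undercomplete autoencoder on synthetic "patches" and extracts the hidden-layer representation. Note the assumptions: the data are random placeholders, and the sparsity penalty used by the paper's sparse autoencoder is omitted because sklearn's MLP does not expose one.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    # Synthetic stand-in for small image patches (e.g., 8x8, flattened).
    rng = np.random.default_rng(0)
    patches = rng.random((500, 64))

    # Undercomplete autoencoder: reproduce the input through a narrow
    # hidden layer, then reuse the hidden activations as learned features.
    ae = MLPRegressor(hidden_layer_sizes=(16,), activation="logistic",
                      max_iter=2000, random_state=0)
    ae.fit(patches, patches)

    # Encode: one forward pass through the first (encoding) layer.
    hidden = 1.0 / (1.0 + np.exp(-(patches @ ae.coefs_[0] + ae.intercepts_[0])))
    print(hidden.shape)   # (500, 16) learned representation
    ```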

  7. A tool for optimization of the production and user analysis on the Grid

    NASA Astrophysics Data System (ADS)

    Grigoras, Costin; Carminati, Federico; Vladimirovna Datskova, Olga; Schreiner, Steffen; Lee, Sehoon; Zhu, Jianlin; Gheata, Mihaela; Gheata, Andrei; Saiz, Pablo; Betev, Latchezar; Furano, Fabrizio; Mendez Lorenzo, Patricia; Grigoras, Alina Gabriela; Bagnasco, Stefano; Peters, Andreas Joachim; Saiz Santos, Maria Dolores

    2011-12-01

    With the LHC and ALICE entering full operation and production modes, the number of simulation, RAW data processing, and end-user analysis tasks is increasing. The efficient management of all these tasks, which differ widely in lifecycle, amount of processed data, and methods of analyzing the end result, required the development and deployment of new tools in addition to the existing Grid infrastructure. To facilitate the management of large-scale simulation and raw data reconstruction tasks, ALICE has developed a production framework called the Lightweight Production Manager (LPM). The LPM automatically submits jobs to the Grid based on triggers and conditions, for example after the completion of a physics run. It follows the evolution of each job and publishes the results on the web for worldwide access by ALICE physicists. This framework is tightly integrated with the ALICE Grid framework AliEn. In addition to publishing job status, LPM also provides a fully authenticated interface to the AliEn Grid catalogue for browsing and downloading files, and in the near future it will provide simple types of data analysis through ROOT plugins. The framework is also being extended to allow management of end-user jobs.

  8. Using GRACE to constrain precipitation amount over cold mountainous basins

    NASA Astrophysics Data System (ADS)

    Behrangi, Ali; Gardner, Alex S.; Reager, John T.; Fisher, Joshua B.

    2017-01-01

    Despite their importance for hydrology and climate-change studies, current quantitative knowledge on the amount and distribution of precipitation in mountainous and high-elevation regions is limited due to instrumental and retrieval shortcomings. Here, by focusing on two large endorheic basins in High Mountain Asia, we show that satellite gravimetry (Gravity Recovery and Climate Experiment (GRACE)) can be used to provide an independent estimate of monthly accumulated precipitation using the mass balance equation. Results showed that the GRACE-based precipitation estimate agrees best with most of the commonly used precipitation products in summer, but deviates from them in the cold months, when the other products are expected to have larger errors. Most of the products capture about 50% or less of the total precipitation estimated using GRACE in winter. Overall, the Global Precipitation Climatology Project (GPCP) showed better agreement with the GRACE estimate than other products, yet on average GRACE showed 30% more annual precipitation than GPCP in the study basins. In basins of appropriate size that lack dense ground measurements, as is typically the case in cold mountainous regions, we find GRACE can be a viable alternative for constraining monthly and seasonal precipitation estimates from remotely sensed precipitation products that show large bias.
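
    The mass balance referred to above can be written out explicitly; the notation below is generic rather than taken from the paper.

    ```latex
    % Monthly basin water balance: precipitation P balances
    % evapotranspiration ET, surface outflow R, and the change in total
    % water storage \Delta S over the month \Delta t. For an endorheic
    % basin, R \approx 0, so the GRACE-observed storage change yields an
    % independent precipitation estimate.
    \[
      P = ET + R + \frac{\Delta S}{\Delta t},
      \qquad R \approx 0 \;\Rightarrow\; P \approx ET + \frac{\Delta S}{\Delta t}
    \]
    ```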

  9. Offset Initial Sodium Loss To Improve Coulombic Efficiency and Stability of Sodium Dual-Ion Batteries.

    PubMed

    Ma, Ruifang; Fan, Ling; Chen, Suhua; Wei, Zengxi; Yang, Yuhua; Yang, Hongguan; Qin, Yong; Lu, Bingan

    2018-05-09

    Sodium dual-ion batteries (NDIBs) have recently attracted extensive attention because of their low cost and the abundance of sodium resources. However, the low capacity of the carbonaceous anode reduces the energy density, and the formation of the solid-electrolyte interphase (SEI) on the anode during the initial cycles consumes large amounts of Na+ from the electrolyte, which results in low Coulombic efficiency and inferior stability of the NDIBs. To address these issues, a phosphorus-doped soft carbon (P-SC) anode combined with a presodiation process is developed to enhance the performance of NDIBs. Phosphorus doping enhances the electric conductivity and further improves the sodium storage properties. In addition, an SEI can preform on the anode during the presodiation process, so the anode need not consume large amounts of Na+ to form the SEI during cycling. Consequently, NDIBs with the P-SC anode after presodiation exhibit high Coulombic efficiency (over 90%) and long cycle stability (81 mA h g-1 at 1000 mA g-1 after 900 cycles with 81.8% capacity retention), far superior to unsodiated NDIBs. This work may provide guidance for developing high-performance NDIBs in the future.

  10. View the label before you view the movie: A field experiment into the impact of Portion size and Guideline Daily Amounts labelling on soft drinks in cinemas

    PubMed Central

    2011-01-01

    Background: Large soft drink sizes increase consumption and thereby contribute to obesity. Portion size labelling may help consumers to select more appropriate food portions. This study aimed to assess the effectiveness of portion size and caloric Guideline Daily Amounts (GDA) labelling on consumers' portion size choices and consumption of regular soft drinks. Methods: A field experiment took place on two consecutive evenings in a Dutch cinema. Participants (n = 101) were asked to select one of five portion sizes of a soft drink. Consumers were provided with either portion size and caloric GDA labelling (experimental condition) or with millilitre information (control condition). Results: Labelling neither stimulated participants to choose small portion sizes (OR = .75, p = .61, CI: .25-2.25) nor dissuaded them from choosing large portion sizes (OR = .51, p = .36, CI: .12-2.15). Conclusions: Portion size and caloric GDA labelling were found to have no effect on soft drink intake. Further research among a larger group of participants, combined with pricing strategies, is required. The results of this study are relevant for the current public health debate on food labelling. PMID:21645373

  11. Kinetics of methane hydrate replacement with carbon dioxide and nitrogen gas mixture using in situ NMR spectroscopy.

    PubMed

    Cha, Minjun; Shin, Kyuchul; Lee, Huen; Moudrakovski, Igor L; Ripmeester, John A; Seo, Yutaek

    2015-02-03

    In this study, the kinetics of methane replacement with carbon dioxide and nitrogen gas in methane gas hydrate prepared in porous silica gel matrices has been studied by in situ (1)H and (13)C NMR spectroscopy. The replacement process was monitored by in situ (1)H NMR spectra, where about 42 mol % of the methane in the hydrate cages was replaced in 65 h. Large amounts of free water were not observed during the replacement process, indicating a spontaneous replacement reaction upon exposing methane hydrate to carbon dioxide and nitrogen gas mixture. From in situ (13)C NMR spectra, we confirmed that the replacement ratio was slightly higher in small cages, but due to the composition of structure I hydrate, the amount of methane evolved from the large cages was larger than that of the small cages. Compositional analysis of vapor and hydrate phases was also carried out after the replacement reaction ceased. Notably, the composition changes in hydrate phases after the replacement reaction would be affected by the difference in the chemical potential between the vapor phase and hydrate surface rather than a pore size effect. These results suggest that the replacement technique provides methane recovery as well as stabilization of the resulting carbon dioxide hydrate phase without melting.

  12. Betweenness-Based Method to Identify Critical Transmission Sectors for Supply Chain Environmental Pressure Mitigation.

    PubMed

    Liang, Sai; Qu, Shen; Xu, Ming

    2016-02-02

    To develop industry-specific policies for mitigating environmental pressures, previous studies have primarily focused on identifying sectors that directly generate large amounts of environmental pressures (the production-based method) or indirectly drive large amounts of environmental pressures through supply chains (e.g., the consumption-based method). In addition to those sectors, which are important producers or drivers of environmental pressure, there exist sectors that are also important to environmental pressure mitigation as transmission centers. Economy-wide environmental pressure mitigation might be achieved by improving the production efficiency of these key transmission sectors, that is, using less upstream input to produce unitary output. We develop a betweenness-based method to measure the importance of transmission sectors, borrowing the betweenness concept from network analysis. We quantify the betweenness of sectors by examining supply chain paths, extracted from structural path analysis, that pass through a particular sector. Taking China as an example, we find that the critical transmission sectors identified by the betweenness-based method are not always identifiable by existing methods. This indicates that the betweenness-based method can provide additional insights, unobtainable with existing methods, into the roles individual sectors play in generating economy-wide environmental pressures. The betweenness-based method proposed here can therefore complement existing methods in guiding sector-level environmental pressure mitigation strategies.
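
    The borrowed graph concept can be illustrated directly. The sketch below computes ordinary betweenness centrality on a toy sector network with hypothetical sector names; the paper's measure instead weights supply-chain paths extracted from structural path analysis, so this shows only the underlying intuition of "flow transmitted through a sector."

    ```python
    import networkx as nx

    # Toy inter-sector network; edges point downstream along supply chains,
    # and all sector names are hypothetical.
    G = nx.DiGraph()
    G.add_edges_from([
        ("mining", "metals"), ("power", "metals"), ("power", "machinery"),
        ("metals", "machinery"), ("metals", "construction"),
        ("machinery", "construction"),
    ])

    # Betweenness centrality: the share of shortest paths passing through
    # each node -- high values mark transmission-center candidates.
    for sector, b in sorted(nx.betweenness_centrality(G).items(),
                            key=lambda kv: -kv[1]):
        print(f"{sector:12s} {b:.3f}")
    ```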

  13. The Coriolis Program.

    ERIC Educational Resources Information Center

    Lissaman, P. B. S.

    1979-01-01

    Detailed are the history, development, and future objectives of the Coriolis program, a project designed to place large turbine units in the Florida Current that would generate large amounts of electric power. (BT)

  14. Cutting Edge: Protection by Antiviral Memory CD8 T Cells Requires Rapidly Produced Antigen in Large Amounts.

    PubMed

    Remakus, Sanda; Ma, Xueying; Tang, Lingjuan; Xu, Ren-Huan; Knudson, Cory; Melo-Silva, Carolina R; Rubio, Daniel; Kuo, Yin-Ming; Andrews, Andrew; Sigal, Luis J

    2018-05-15

    Numerous attempts to produce antiviral vaccines by harnessing memory CD8 T cells have failed. A barrier to progress is that we do not know what makes an Ag a viable target of protective CD8 T cell memory. We found that in mice susceptible to lethal mousepox (the mouse homolog of human smallpox), a dendritic cell vaccine that induced memory CD8 T cells fully protected mice when the infecting virus produced Ag in large quantities and with rapid kinetics. Protection did not occur when the Ag was produced in low amounts, even with rapid kinetics, and protection was only partial when the Ag was produced in large quantities but with slow kinetics. Hence, the amount and timing of Ag expression appear to be key determinants of memory CD8 T cell antiviral protective immunity. These findings may have important implications for vaccine design. Copyright © 2018 by The American Association of Immunologists, Inc.

  15. Profiling Oman education data using data mining approach

    NASA Astrophysics Data System (ADS)

    Alawi, Sultan Juma Sultan; Shaharanee, Izwan Nizal Mohd; Jamil, Jastini Mohd

    2017-10-01

    Nowadays, the large amount of data generated by application services in different learning fields has led to new challenges in education. An education portal is an important system supporting the development of the education field. This paper presents innovative data mining techniques to understand and summarize the Oman education data generated by the Ministry of Education Oman's "Educational Portal". The research performs student profiling on the Omani student database, utilizing the k-means clustering technique to determine students' profiles. A total of 42,484 student records from the Sultanate of Oman were extracted for this study. The findings show the practicality of the clustering technique for investigating student profiles, allowing a better understanding of students' behavior and academic performance. The Oman Education Portal contains large amounts of user activity and interaction data; analysis of these data can help educators improve student performance and recognize students who need additional attention.
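
    A minimal version of the k-means profiling step might look like the following; the three student indicators, their values, and the cluster count are hypothetical stand-ins, not the study's actual 42,484-record feature set.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Synthetic records: attendance rate, average grade, portal logins/week.
    rng = np.random.default_rng(42)
    students = np.vstack([
        rng.normal([0.95, 85, 12], [0.03, 5, 3], size=(100, 3)),  # engaged
        rng.normal([0.70, 60, 3], [0.10, 8, 2], size=(100, 3)),   # at risk
    ])

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(students)
    for c, centre in enumerate(km.cluster_centers_):
        print(f"profile {c}: attendance={centre[0]:.2f}, "
              f"grade={centre[1]:.1f}, logins/wk={centre[2]:.1f}")
    ```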

  16. Information Management System Supporting a Multiple Property Survey Program with Legacy Radioactive Contamination.

    PubMed

    Stager, Ron; Chambers, Douglas; Wiatzka, Gerd; Dupre, Monica; Callough, Micah; Benson, John; Santiago, Erwin; van Veen, Walter

    2017-04-01

    The Port Hope Area Initiative is a project mandated and funded by the Government of Canada to remediate properties with legacy low-level radioactive waste contamination in the Town of Port Hope, Ontario. The management and use of the large amounts of data from surveys of some 4800 properties is a significant task critical to the success of the project. A large amount of information is generated through the surveys, including scheduling of individual field visits to the properties, capture of field data, laboratory sample tracking, QA/QC, property report generation, and project management reporting. Web-mapping tools were used to track and display the temporal progress of various tasks and facilitated consideration of spatial associations among contamination levels. The IM system facilitated the management and integrity of the large amounts of information collected, the evaluation of spatial associations, automated report production, and consistent, traceable execution for this project. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Controlled artificial upwelling in a fjord to combat toxic algae

    NASA Astrophysics Data System (ADS)

    McClimans, T. A.; Hansen, A. H.; Fredheim, A.; Lien, E.; Reitan, K. I.

    2003-04-01

    During the summer, primary production in the surface layers of some fjords depletes the nutrients to the degree that some species of toxic algae dominate the flora. We describe an experiment employing a bubble curtain to lift significant amounts of nutrient-rich seawater into the photic zone and provide an environment in which useful algae can survive. The motivation for the experiment is to provide a local region in which mussels can be cleansed of the effects of toxic algae. Three 100-m-long perforated pipes were suspended at 40 m depth in the Arnafjord, a side arm of the Sognefjord, and large amounts of compressed air were supplied over a period of three weeks. The deeper water mixed with the surface water and flowed out of the mixing region at 5 to 15 m depth. Within a few days, the mixture of nutrient-rich water covered most of the inner portion of the Arnafjord. Within 10 days, plankton samples showed that the artificial upwelling produced the desired type of algae and excluded the toxic blooms that were occurring outside the manipulated fjord arm. The project (DETOX) is supported by the Norwegian ministries of Fisheries, Agriculture and Public Administration.

  18. Dilute Acid and Autohydrolysis Pretreatment

    NASA Astrophysics Data System (ADS)

    Yang, Bin; Wyman, Charles E.

    Exposure of cellulosic biomass to temperatures of about 120-210°C can remove most of the hemicellulose and produce cellulose-rich solids from which high glucose yields are possible with cellulase enzymes. Furthermore, the use of dilute sulfuric acid in this pretreatment operation can increase recovery of hemicellulose sugars substantially to about 85-95% of the maximum possible versus only about 65% if no acid is employed. The use of small-diameter tubes makes it possible to employ high solids concentrations similar to those preferred for commercial operations, with rapid heat-up, good temperature control, and accurate closure of material balances. Mixed reactors can be employed to pretreat larger amounts of biomass than possible in such small-diameter tubes, but solids concentrations are limited to about 15% or less to provide uniform temperatures. Pretreatment of large amounts of biomass at high solids concentrations is best carried out using direct steam injection and rapid pressure release, but closure of material balances in such “steam gun” devices is more difficult. Although flow of water alone or containing dilute acid is not practical commercially, such flow-through configurations provide valuable insight into biomass deconstruction kinetics not possible in the batch tubes, mixed reactors, or steam gun systems.

  19. Data identification for improving gene network inference using computational algebra.

    PubMed

    Dimitrova, Elena; Stigler, Brandilyn

    2014-11-01

    Identification of models of gene regulatory networks is sensitive to the amount of data used as input. Considering the substantial cost of conducting experiments, it is valuable to have an estimate of the amount of data required to infer the network structure. To minimize wasted resources, it is also beneficial to know which data are necessary to identify the network. Both knowledge of the data and knowledge of the terms in polynomial models are often required a priori in model identification. In applications, the structure of a polynomial model is unlikely to be known, which may force data sets to be unnecessarily large in order to identify a model. Furthermore, none of the known results provides any strategy for constructing data sets that uniquely identify a model. We provide a specialization of an existing criterion for deciding when a set of data points identifies a minimal polynomial model once its monomial terms have been specified. We then relax the requirement of knowledge of the monomials and present results for model identification given only the data. Finally, we present a method for constructing data sets that identify minimal polynomial models.

  20. Revealing the Earth's mantle from the tallest mountains using the Jinping Neutrino Experiment.

    PubMed

    Šrámek, Ondřej; Roskovec, Bedřich; Wipperfurth, Scott A; Xi, Yufei; McDonough, William F

    2016-09-09

    The Earth's engine is driven by unknown proportions of primordial energy and heat produced by radioactive decay. Unfortunately, competing models of Earth's composition reveal an order-of-magnitude uncertainty in the amount of radiogenic power driving mantle dynamics. Recent measurements of the Earth's flux of geoneutrinos, electron antineutrinos from terrestrial natural radioactivity, reveal the amount of uranium and thorium in the Earth and set limits on the residual proportion of primordial energy. Comparison of the flux measured at large underground neutrino experiments with geologically informed predictions of geoneutrino emission from the crust provides the critical test needed to define the mantle's radiogenic power. Measurement at an oceanic location, distant from nuclear reactors and continental crust, would best reveal the mantle flux; however, no such experiment is anticipated. We predict the geoneutrino flux at the site of the Jinping Neutrino Experiment (Sichuan, China). Within 8 years, the combination of existing data and measurements from experiments soon to come online, including Jinping, will exclude end-member models at the 1σ level, define the mantle's radiogenic contribution to the surface heat loss, set limits on the composition of the silicate Earth, and provide significant parameter bounds for models defining the mode of mantle convection.

  1. The Effects of Source, Revision Possibility, and Amount of Feedback on Marketing Students' Impressions of Feedback on an Assignment

    ERIC Educational Resources Information Center

    Ackerman, David S.; Dommeyer, Curt J.; Gross, Barbara L.

    2017-01-01

    This study examines how three factors affect students' reactions to critical feedback on an assignment--amount of feedback (none vs. low amount vs. high amount), source of feedback (instructor-provided feedback vs. peer-provided feedback), and the situational context of the feedback (revision of paper is or is not possible). An incomplete 3 × 2 ×…

  2. Data Compression Algorithm Architecture for Large Depth-of-Field Particle Image Velocimeters

    NASA Technical Reports Server (NTRS)

    Bos, Brent; Memarsadeghi, Nargess; Kizhner, Semion; Antonille, Scott

    2013-01-01

    A large depth-of-field particle image velocimeter (PIV) is designed to characterize dynamic dust environments on planetary surfaces. The instrument detects lofted dust particles and senses the number of particles per unit volume, measuring their sizes, velocities (both speed and direction), and, for large particles, shape factors. To measure these particle characteristics in flight, the instrument gathers two-dimensional image data at a high frame rate, typically >4,000 Hz, generating approximately 6 GB of data for every second of operation. To characterize a dynamic planetary dust environment, the instrument would have to operate for at least several minutes during an observation period, easily producing more than a terabyte of data per observation. Given current technology, this amount of data would be very difficult to store onboard a spacecraft and downlink to Earth. Since 2007, innovators have been developing an autonomous image analysis algorithm architecture for the PIV instrument to greatly reduce the amount of data it must store and downlink. The algorithm analyzes PIV images and automatically reduces the image information down to only the particle measurement data of interest, reducing the amount of data handled by a factor of more than 10^3. The innovation is now fairly mature, with a functional algorithm architecture and several key pieces of algorithm logic that have been proven with field test data acquired using a proof-of-concept PIV instrument.
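
    The quoted volumes can be sanity-checked with simple arithmetic; the five-minute observation length below is an assumption standing in for "at least several minutes."

    ```python
    # Back-of-envelope check of the data volumes quoted above.
    raw_rate_gbs = 6               # GB generated per second of operation
    observation_s = 5 * 60         # assumed 5-minute observation

    raw_volume_gb = raw_rate_gbs * observation_s
    print(f"raw volume per observation: ~{raw_volume_gb / 1000:.1f} TB")  # ~1.8 TB

    reduction = 1e3                # onboard reduction factor of >10^3
    print(f"after onboard reduction: ~{raw_volume_gb / reduction:.1f} GB")
    ```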

  3. Cloud radiative effects and changes simulated by the Coupled Model Intercomparison Project Phase 5 models

    NASA Astrophysics Data System (ADS)

    Shin, Sun-Hee; Kim, Ok-Yeon; Kim, Dongmin; Lee, Myong-In

    2017-07-01

    Using 32 CMIP5 (Coupled Model Intercomparison Project Phase 5) models, this study examines the fidelity of simulated cloud amounts and their radiative effects (CREs) in the historical runs driven by observed external radiative forcing for 1850-2005, and their future changes in the RCP (Representative Concentration Pathway) 4.5 scenario runs for 2006-2100. Validation metrics for the historical run are designed to examine the accuracy of the representation of spatial patterns in the climatological mean and in annual and interannual variations of clouds and CREs. The models show large spread in the simulated cloud amounts, particularly the low cloud amount. The observed relationships between cloud amount and the controlling large-scale environment are also reproduced with widely varying skill across models. Based on the validation metrics, four models (ACCESS1.0, ACCESS1.3, HadGEM2-CC, and HadGEM2-ES) are selected as the best models, and their average performs more skillfully than the multimodel ensemble average. All models project global-mean SST warming as greenhouse gases increase, but the magnitude varies across simulations between 1 and 2 K, largely because of differences in the change of cloud amount and distribution. The models that simulate more SST warming show a greater increase in net CRE, due to reduced low cloud and increased incoming shortwave radiation, particularly over marine boundary layer regions in the subtropics. The selected best-performing models project a significant reduction in global-mean cloud amount of about -0.99% K-1 and a net radiative warming of 0.46 W m-2 K-1, suggesting a positive cloud feedback on global warming.

  4. Overuse or underuse? An observation of pesticide use in China.

    PubMed

    Zhang, Chao; Hu, Ruifa; Shi, Guanming; Jin, Yanhong; Robson, Mark G; Huang, Xusheng

    2015-12-15

    Pesticide use has increased dramatically worldwide, especially in China, where farmers apply a wide variety of pesticides in large amounts to control crop pests. While Chinese farmers are often criticized for pesticide overuse, this study shows the coexistence of overuse and underuse of pesticides, based on survey data on pesticide use in rice, cotton, maize, and wheat production in three provinces of China. A novel index-amount approach is proposed to convert the amounts of multiple pesticides used to control the same pest into an index amount of a reference pesticide. We compare the summed index amount with the recommended dosage range of the reference pesticide to classify whether pesticides are overused or underused. Examining a total of 107 pesticides used to control up to 54 crop pests in rice, cotton, maize, and wheat production, we find that pesticide overuse and underuse coexist. In particular, pesticide overuse is detected in more than half of the total cases for 9 crop pest species, while pesticide underuse accounts for more than 20% of the total cases for 11 pests. We further suggest that a lack of knowledge and information on pesticide use and pest control among Chinese farmers may cause this coexistence, and our analysis provides indirect evidence that the commercialized agricultural extension system in China probably contributes to it. To improve pesticide use, it is urgent to reestablish the monitoring and forecasting system for pest control in China. Copyright © 2015 Elsevier B.V. All rights reserved.
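
    One plausible reading of the index-amount conversion is sketched below; the scaling rule, dose figures, and recommended range are hypothetical assumptions, since the record does not give the paper's exact formula.

    ```python
    def index_amount(applications, ref_dose, doses):
        """Convert amounts of several pesticides applied against one pest
        into an equivalent amount of a single reference pesticide by
        scaling each amount by (reference dose / own recommended dose).
        This is an assumed form of the paper's conversion."""
        return sum(amt * ref_dose / doses[name]
                   for name, amt in applications.items())

    # Hypothetical recommended doses (g a.i./ha) and field applications.
    doses = {"ref": 100.0, "pest_a": 50.0, "pest_b": 200.0}
    used = {"pest_a": 40.0, "pest_b": 150.0}    # amounts actually applied

    total = index_amount(used, doses["ref"], doses)
    low, high = 80.0, 120.0                     # recommended range for "ref"
    verdict = ("overuse" if total > high
               else "underuse" if total < low else "within range")
    print(f"index amount = {total:.0f} g/ha -> {verdict}")   # 155 -> overuse
    ```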

  5. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  6. Novel Bioreactor Platform for Scalable Cardiomyogenic Differentiation from Pluripotent Stem Cell-Derived Embryoid Bodies.

    PubMed

    Rungarunlert, Sasitorn; Ferreira, Joao N; Dinnyes, Andras

    2016-01-01

    Generation of cardiomyocytes from pluripotent stem cells (PSCs) is a common and valuable approach for producing large amounts of cells for various applications, including assays and models for drug development, cell-based therapies, and tissue engineering. All these applications would benefit from a reliable bioreactor-based methodology to consistently generate homogeneous PSC-derived embryoid bodies (EBs) at large scale, which can then undergo cardiomyogenic differentiation. The goal of this chapter is to describe a scalable method for consistently generating large amounts of homogeneous and synchronized EBs from PSCs. The method utilizes a slow-turning lateral vessel bioreactor to direct EB formation and subsequent cardiomyogenic lineage differentiation.

  7. Synthesis of large scale graphene oxide using plasma enhanced chemical vapor deposition method and its application in humidity sensing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yang; Chen, Yuming, E-mail: yumingchen@fudan.edu.cn

    2016-03-14

    Large-scale graphene oxide (GO) is directly synthesized on copper (Cu) foil by a plasma enhanced chemical vapor deposition method at 500 °C and even lower temperatures. Compared to the modified Hummers' method, the GO sheets obtained here are large, and their size scales with the size of the Cu foil. The oxygen-containing groups in the GO are introduced through the residual gas in the methane source (99.9% purity). To protect the Cu surface from bombardment by ions in the plasma, a low-intensity discharge is used. Our experiments reveal that the growth temperature has an important influence on the carbon-to-oxygen (C/O) ratio in the GO and also affects the amount of π-π* bonds between carbon atoms. Preliminary experiments on a 6 mm × 12 mm GO-based humidity sensor show that the synthesized GO responds well to humidity changes. Our GO synthesis method may provide another route to obtaining large-scale GO for gas sensing and other applications.

  8. Transmitted wavefront testing with large dynamic range based on computer-aided deflectometry

    NASA Astrophysics Data System (ADS)

    Wang, Daodang; Xu, Ping; Gong, Zhidong; Xie, Zhongmin; Liang, Rongguang; Xu, Xinke; Kong, Ming; Zhao, Jun

    2018-06-01

    Transmitted wavefront testing is in demand for the performance evaluation of transmission optics and transparent glass, and the achievable dynamic range is a key issue. A computer-aided deflectometric testing method with fringe projection is proposed for the accurate testing of transmitted wavefronts over a large dynamic range. Ray tracing of the modeled testing system is carried out to achieve virtual ‘null’ testing of transmitted wavefront aberrations. The ray aberration is obtained from the ray tracing result and the measured slope, from which the test wavefront aberration can be reconstructed. To eliminate testing-system modeling errors, a system geometry calibration based on computer-aided reverse optimization is applied to realize accurate testing. Both numerical simulation and experiments demonstrate the feasibility and high accuracy of the proposed method. Compared with the interferometric method, the proposed method achieves a large dynamic range, providing a simple, low-cost and accurate way to test transmitted wavefronts from various kinds of optics and large numbers of industrial transmission elements.

  9. Hydrocyclone/Filter for Concentrating Biomarkers from Soil

    NASA Technical Reports Server (NTRS)

    Ponce, Adrian; Obenhuber, Donald

    2008-01-01

    The hydrocyclone-filtration extractor (HFE), now undergoing development, is a simple, robust apparatus for processing large amounts of soil to extract trace amounts of microorganisms, soluble organic compounds, and other biomarkers from soil and to concentrate the extracts in amounts sufficient to enable such traditional assays as cell culturing, deoxyribonucleic acid (DNA) analysis, and isotope analysis. Originally intended for incorporation into a suite of instruments for detecting signs of life on Mars, the HFE could also be used on Earth for similar purposes, including detecting trace amounts of biomarkers or chemical wastes in soils.

  10. Unmanned Aerial Remote Sensing Facility of Wageningen UR: Overview of Activities

    NASA Astrophysics Data System (ADS)

    Masselink, Rens; Keesstra, Saskia; Baartman, Jantiene; Bartholomeus, Harm; Kooistra, Lammert

    2017-04-01

    The term nature-based solutions (NBS) refers to the sustainable management and use of nature for tackling societal challenges. The objectives of implementing NBS are to solve environmental issues that affect the human economy and welfare while simultaneously increasing sustainability and biodiversity. Primary goals for the implementation of NBS include flood protection through river restoration, erosion control, and limiting nutrient transport from agricultural fields into surface waters. For NBS to have a real effect on these issues, they need to be integrated over relatively large areas. Unmanned aerial systems (UASs) provide a platform to view and assess relatively large areas in a short amount of time and at short time intervals, which allows UAS data to be employed for assessing the functioning of NBS. Examples include mapping the extent of inundated area during flooding or the migration of river meanders after (several) large events. Repeat surveys shed light on the evolution of the NBS, at both small and large scales. In this project, we are looking for effective ways to integrate UAS data and field-based measurements to obtain knowledge on the functioning of NBS. Several methods of using UAS to assess NBS implementation, measure NBS effectiveness, and study the impact of NBS will be presented.

  11. An Intelligent Tool for Activity Data Collection

    PubMed Central

    Jehad Sarkar, A. M.

    2011-01-01

    Activity recognition systems using simple and ubiquitous sensors require a large variety of real-world sensor data, not only for evaluating their performance but also for training the systems to function better. However, a tremendous amount of effort is required to set up an environment for collecting such data. For example, expertise and resources are needed to design and install the sensors, controllers, network components, and middleware just to perform basic data collection. It is therefore desirable to have a data collection method that is inexpensive, flexible, user-friendly, and capable of providing large and diverse activity datasets. In this paper, we propose an intelligent activity data collection tool that can provide such datasets inexpensively, without physically deploying testbeds. It can be used as an inexpensive alternative technique for collecting human activity data. The tool provides a set of web interfaces to create a web-based activity data collection environment, along with a web-based experience sampling tool to take the user's activity input. The tool generates an activity log using its activity knowledge, mined from the web, and the user-given inputs. We have performed two experiments to validate the tool's performance in producing reliable datasets. PMID:22163832

  12. Interannual kinetics (2010-2013) of large wood in a river corridor exposed to a 50-year flood event and fluvial ice dynamics

    NASA Astrophysics Data System (ADS)

    Boivin, Maxime; Buffin-Bélanger, Thomas; Piégay, Hervé

    2017-02-01

    Semi-alluvial rivers of the Gaspé Peninsula, Québec, are prone to produce and transport vast quantities of large wood (LW). The high rate of lateral erosion owing to high-energy flows and noncohesive banks is the main process leading to the recruitment of large wood, which in turn initiates complex patterns of wood accumulation and re-entrainment within the active channel. The delta of the Saint-Jean River (SJR) has accumulated large annual wood fluxes since 1960, culminating in a wood raft more than 3 km in length by 2014. To document the kinetics of large wood in the main channel of the SJR, four annual surveys were carried out from 2010 to 2013 to locate and describe more than 1000 large wood jams (LWJ) and 2000 individual pieces of large wood (LWI) along a 60-km river section. Airborne and ground photo/video images were used to estimate the wood volume introduced by lateral erosion and to identify the local geomorphic conditions that control wood mobility and deposition. Video camera analysis allowed the examination of transport rates during three hydrometeorological events for specific river sections. Results indicate that the volume of LW recruited between 2010 and 2013 represents 57% of the total LW production over the 2004-2013 period. Volumes of wood deposited along the 60-km section were four times higher in 2013 than in 2010. Increases in wood amount occurred mainly in the upper alluvial sections of the river, whereas decreases were observed in the semi-alluvial middle sections. Observations suggest that the 50-year flood of 2010 produced large amounts of LW that were only partly exported out of the basin, so that a significant amount was still available for subsequent floods. Large wood continued to accumulate after this flood until a similar flood or an ice-breakup event could remobilise these LW accumulations within the river corridor. Ice-jam floods transport large amounts of wood during events with fairly low flow but do not contribute significantly to recruitment (ca. 10 to 30%). It is thus fairly probable that the wood export peak observed at the river mouth in 2012, a year with no flood, which was similar in magnitude to that of the 2010 flood, is mainly linked to the ice-breakup events that occurred in March 2012.

  13. Source-Receptor Relationship Analysis of the Atmospheric Deposition of PAHs Subject to Long-Range Transport in Northeast Asia.

    PubMed

    Inomata, Yayoi; Kajino, Mizuo; Sato, Keiichi; Kurokawa, Junichi; Tang, Ning; Ohara, Toshimasa; Hayakawa, Kazuichi; Ueda, Hiromasa

    2017-07-18

    The source-receptor relationships for the atmospheric deposition of PAHs in Northeast Asia were investigated using an Eulerian regional-scale aerosol chemical transport model. Dry deposition (DD) of PAHs was controlled by wind flow patterns, whereas wet deposition (WD) depended on precipitation in addition to wind flow patterns. The contribution of WD was approximately 50-90% of total deposition, except during winter in Northern China (NCHN) and Eastern Russia (ERUS), because of the low amounts of precipitation there. The amount of PAH deposition showed clear seasonal variation, high in winter and low in summer, in the downwind (South Korea, Japan) and oceanic receptor regions. In the downwind region, the contributions from NCHN (WD 28-52%; DD 54-55%) and Central China (CCHN) (WD 43-65%; DD 33-38%) were large in winter, whereas self-contributions (WD 20-51%; DD 79-81%) were relatively high in summer. In the oceanic receptor region, the deposition amount decreased with distance from the Asian continent. The amount of DD was strongly influenced by emissions from neighboring domains, while the contributions of WD from NCHN (16-20%) and CCHN (28-35%) were large. The large contributions from China in summer to the downwind region were linked to the vertical transport of PAHs over the Asian continent associated with convection.

  14. Highly Sensitive GMO Detection Using Real-Time PCR with a Large Amount of DNA Template: Single-Laboratory Validation.

    PubMed

    Mano, Junichi; Hatano, Shuko; Nagatomi, Yasuaki; Futo, Satoshi; Takabatake, Reona; Kitta, Kazumi

    2018-03-01

    Current genetically modified organism (GMO) detection methods allow for sensitive detection. However, a further increase in sensitivity will enable more efficient testing for large grain samples and reliable testing for processed foods. In this study, we investigated real-time PCR-based GMO detection methods using a large amount of DNA template. We selected target sequences that are commonly introduced into many kinds of GM crops, i.e., 35S promoter and nopaline synthase (NOS) terminator. This makes the newly developed method applicable to a wide range of GMOs, including some unauthorized ones. The estimated LOD of the new method was 0.005% of GM maize events; to the best of our knowledge, this method is the most sensitive among the GM maize detection methods for which the LOD was evaluated in terms of GMO content. A 10-fold increase in the DNA amount as compared with the amount used under common testing conditions gave an approximately 10-fold reduction in the LOD without PCR inhibition. Our method is applicable to various analytical samples, including processed foods. The use of other primers and fluorescence probes would permit highly sensitive detection of various recombinant DNA sequences besides the 35S promoter and NOS terminator.

  15. Dynamic travel information personalized and delivered to your cell phone : addendum.

    DOT National Transportation Integrated Search

    2011-03-01

    Real-time travel information must reach a significant number of travelers to create a large amount of travel behavior change. For this project, since the TRAC-IT mobile phone application is used to monitor user context in terms of location, the mobil...

  16. New insights into liquid chromatography for more eco-friendly analysis of pharmaceuticals.

    PubMed

    Shaaban, Heba

    2016-10-01

    Greening the analytical methods used for the analysis of pharmaceuticals has received great interest, aimed at eliminating or minimizing the amount of organic solvents consumed daily worldwide without loss of chromatographic performance. Traditional analytical LC techniques employed in pharmaceutical analysis consume tremendous amounts of hazardous solvents and consequently generate large amounts of waste. The monetary and ecological impact of using large amounts of solvents, and of waste disposal, has motivated the analytical community to search for alternatives that replace polluting analytical methodologies with clean ones. In this context, implementing the principles of green analytical chemistry (GAC) in analytical laboratories is highly desirable. This review gives a comprehensive overview of the different green LC pathways for implementing GAC principles in analytical laboratories and focuses on evaluating the greenness of LC analytical procedures. It presents green LC approaches for eco-friendly analysis of pharmaceuticals in industrial, biological, and environmental matrices. Graphical abstract: Green pathways of liquid chromatography for more eco-friendly analysis of pharmaceuticals.

  17. 49 CFR 40.193 - What happens when an employee does not provide a sufficient amount of urine for a drug test?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... sufficient amount of urine for a drug test? 40.193 Section 40.193 Transportation Office of the Secretary of... § 40.193 What happens when an employee does not provide a sufficient amount of urine for a drug test... sufficient amount of urine to permit a drug test (i.e., 45 mL of urine). (b) As the collector, you must do...

  18. 49 CFR 40.193 - What happens when an employee does not provide a sufficient amount of urine for a drug test?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... sufficient amount of urine for a drug test? 40.193 Section 40.193 Transportation Office of the Secretary of... § 40.193 What happens when an employee does not provide a sufficient amount of urine for a drug test... sufficient amount of urine to permit a drug test (i.e., 45 mL of urine). (b) As the collector, you must do...

  19. 49 CFR 40.193 - What happens when an employee does not provide a sufficient amount of urine for a drug test?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... sufficient amount of urine for a drug test? 40.193 Section 40.193 Transportation Office of the Secretary of... § 40.193 What happens when an employee does not provide a sufficient amount of urine for a drug test... sufficient amount of urine to permit a drug test (i.e., 45 mL of urine). (b) As the collector, you must do...

  20. 49 CFR 40.193 - What happens when an employee does not provide a sufficient amount of urine for a drug test?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... sufficient amount of urine for a drug test? 40.193 Section 40.193 Transportation Office of the Secretary of... § 40.193 What happens when an employee does not provide a sufficient amount of urine for a drug test... sufficient amount of urine to permit a drug test (i.e., 45 mL of urine). (b) As the collector, you must do...
