Sample records for process vast amounts

  1. Advanced Natural Language Processing and Temporal Mining for Clinical Discovery

    ERIC Educational Resources Information Center

    Mehrabi, Saeed

    2016-01-01

    There has been a vast and growing amount of healthcare data, especially with the rapid adoption of electronic health records (EHRs) as a result of the HITECH Act of 2009. It is estimated that around 80% of the clinical information resides in the unstructured narrative of an EHR. Recently, natural language processing (NLP) techniques have offered…

  2. A Hybrid Approach for Translational Research

    ERIC Educational Resources Information Center

    Webster, Yue Wang

    2010-01-01

    Translational research has proven to be a powerful process that bridges the gap between basic science and medical practice. The complexity of translational research is two-fold: integration of vast amount of information in disparate silos, and dissemination of discoveries to stakeholders with different interests. We designed and implemented a…

  3. Sharing Local Revenue: One District's Perspective

    ERIC Educational Resources Information Center

    Cline, David S.

    2011-01-01

    The vast majority of U.S. school districts are considered independent and have taxing authority; the remaining districts rely on revenue and budgetary approval from their local government. In the latter case, localities often use some form of negotiated process to determine the amount of revenue their school districts will receive. Typically, a…

  4. Making Sense of Dollars and Cents

    ERIC Educational Resources Information Center

    Sorenson, Richard

    2010-01-01

    Principals must devote a vast amount of time and energy to campus funding and budgetary issues because budgeting and accounting procedures are an integral part of an effective instructional program. In fact, a principal's role in the budgetary process significantly impacts both budget development and instructional planning. Principals who fail to…

  5. Meaning Apprehension in the Cerebral Hemispheres

    ERIC Educational Resources Information Center

    Kandhadai, Padmapriya A.

    2009-01-01

    When we hear a word, it is remarkable how we store, activate and rapidly retrieve a vast amount of relevant information within a few hundred milliseconds. This thesis examines how meaning is processed in parallel--but with critical differences--between the two hemispheres of the brain. Event-related brain potentials (ERP) were used to examine…

  6. Designing Hypercontextualized Games: A Case Study with LieksaMyst

    ERIC Educational Resources Information Center

    Sedano, Carolina Islas; Sutinen, Erkki; Vinni, Mikko; Laine, Teemu H.

    2012-01-01

    Digital technology empowers one to access vast amounts of on-line data. From a learning perspective, however, it is difficult to access meaningful on-site information within a given context. The Hypercontextualized Game (HCG) design model interweaves on-site resources, translated as content, and the digital game. As a local game design process,…

  7. Myers-Briggs Personality Type and Adolescent Coping in the College Search

    ERIC Educational Resources Information Center

    Golden, Thomas Courtenay

    2009-01-01

    The college choice requires the adolescent to gather and synthesize vast amounts of information, reconcile sometimes competing personal and familial goals, and manage a range of emotions. This decision process represents a major developmental crisis with which the adolescent must cope. Scholars have noted that psychological strain and heightened…

  8. Seeking an Online Social Media Radar

    ERIC Educational Resources Information Center

    ter Veen, James

    2014-01-01

    Purpose: The purpose of this paper is to explore how the application of Systems Engineering tools and techniques can be applied to rapidly process and analyze the vast amounts of data present in social media in order to yield practical knowledge for Command and Control (C2) systems. Design/methodology/approach: Based upon comparative analysis of…

  9. Getting it right [Editorial]

    Treesearch

    William M. Block

    2007-01-01

    Manuscripts contain a vast amount of information. Some of this information summarizes the state-of-knowledge and sets the stage for the paper. Other information presents data and summarizes analysis. Lastly, results are interpreted in the form of a discussion and management implications. Although a number of checks in the review and editorial processes catch errors...

  10. OpenWebGlobe 2: Visualization of Complex 3D-Geodata in the (Mobile) Webbrowser

    NASA Astrophysics Data System (ADS)

    Christen, M.

    2016-06-01

    Providing worldwide high-resolution data for virtual globes involves compute- and storage-intensive tasks for processing data. Furthermore, rendering complex 3D geodata, such as 3D city models with an extremely high polygon count and a vast amount of textures, at interactive frame rates is still a very challenging task, especially on mobile devices. This paper presents an approach for processing, caching and serving massive geospatial data in a cloud-based environment for large-scale, out-of-core, highly scalable 3D scene rendering on a web-based virtual globe. Cloud computing is used for processing large amounts of geospatial data and for providing 2D and 3D map data to a large number of (mobile) web clients. The paper describes the approach for processing, rendering and caching very large datasets in the currently developed virtual globe "OpenWebGlobe 2", which displays 3D geodata on nearly every device.

  11. Developing a Content Strategy for an Academic Library Website

    ERIC Educational Resources Information Center

    Blakiston, Rebecca

    2013-01-01

    Academic library websites contain a vast amount of complex content and, all too often, there is a lack of established process for creating, updating, and deleting that content. There is no clear vision or purpose to the content, and numerous staff members are expected to maintain content with little guidance. Because of this, many library websites…

  12. An Administrator's Guide to Microcomputer Resources. Research & Development Series No. 239B.

    ERIC Educational Resources Information Center

    Zahniser, Gale; And Others

    This guide is designed to help educators sort through the vast amount of information that exists about the educational use of microcomputers. The first of five chapters takes the educational administrator through the decision process that is typically associated with choosing and adopting microcomputers for the school. For each point in this…

  13. Safeguarding Canadian Arctic Sovereignty Against Conventional Threats

    DTIC Science & Technology

    2009-06-01

    The effects of climate change, as well as national interests in controlling the vast amounts of natural resources in the Arctic, seem to be... Keywords: Canadian Sovereignty, Climate Change, Military Capabilities for Arctic Operations. Safeguarding Canadian Arctic Sovereignty Against Conventional Threats, by MAJ Dave Abboud, Canadian Forces, 95 pages.

  14. On the value of information for Industry 4.0

    NASA Astrophysics Data System (ADS)

    Omenzetter, Piotr

    2018-03-01

    Industry 4.0, or the fourth industrial revolution, which blurs the boundaries between the physical and the digital, is underpinned by vast amounts of data collected by sensors that monitor processes and components of smart factories, which continuously communicate with one another and with the network hubs via the internet of things. Yet collecting those vast amounts of data, which are inherently imperfect and burdened with uncertainties and noise, entails costs including hardware and software, data storage, processing, interpretation and integration into the decision-making process, to name just a few of the main expenditures. This paper discusses a framework for rationalizing the adoption of (big) data collection for Industry 4.0. Pre-posterior Bayesian decision analysis is used to that end, and the evolution of an industrial process over time is conceptualized as a stochastic, observable and controllable dynamical system. The chief underlying motivation is to use the collected data in such a way as to derive the most benefit from them, by successfully trading off the management of risks pertinent to failure of the monitored processes and/or their components against the cost of data collection, processing and interpretation. This enables the formulation of optimization problems for data collection, e.g. for selecting the monitoring system type, topology and/or time of deployment. An illustrative example, monitoring the operation of an assembly line and optimizing the topology of a monitoring system, is provided to illustrate the theoretical concepts.
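
    The pre-posterior analysis the abstract refers to can be illustrated with a toy calculation. The sketch below is not from the paper; the two-state process, prior, sensor likelihoods and costs are all assumed values, chosen only to show how the expected value of monitoring data is obtained by comparing the prior optimal cost with the observation-averaged posterior optimal cost.

    ```python
    # Illustrative sketch only (not from the paper): a toy pre-posterior Bayesian
    # value-of-information calculation for a monitored process with two states.
    # All priors, costs, and sensor likelihoods below are assumed for illustration.

    prior = {"ok": 0.9, "failing": 0.1}          # prior belief about the process state
    failure_cost = 100_000.0                      # cost incurred if a failing process runs on
    intervention_cost = 20_000.0                  # cost of stopping/repairing the process
    sensor_likelihood = {                         # P(alarm | state) for an imperfect sensor
        "ok": 0.05,
        "failing": 0.90,
    }

    def expected_cost(belief_failing: float) -> float:
        """Expected cost of the better of the two actions under a given belief."""
        run_on = belief_failing * failure_cost
        intervene = intervention_cost
        return min(run_on, intervene)

    # Prior (no monitoring) expected cost.
    prior_cost = expected_cost(prior["failing"])

    # Pre-posterior analysis: average the optimal posterior cost over possible observations.
    p_alarm = sum(prior[s] * sensor_likelihood[s] for s in prior)
    post_failing_alarm = prior["failing"] * sensor_likelihood["failing"] / p_alarm
    post_failing_quiet = prior["failing"] * (1 - sensor_likelihood["failing"]) / (1 - p_alarm)
    preposterior_cost = (p_alarm * expected_cost(post_failing_alarm)
                         + (1 - p_alarm) * expected_cost(post_failing_quiet))

    value_of_information = prior_cost - preposterior_cost
    print(f"Expected value of monitoring data: {value_of_information:,.0f}")
    ```

    Deploying the monitoring system is worthwhile only when this value exceeds the cost of collecting, processing and interpreting the data, which is the trade-off the paper formalizes.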

  15. NMSBA - RS21.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kinnan, Mark K.; Valerio, Richard Arthur; Flanagan, Tatiana Paz

    2016-12-01

    This report gives introductory guidance on the level of effort required to create a data warehouse for mining data. Numerous tutorials have been provided to demonstrate the process of downloading raw data, processing the raw data, and importing the data into a PostgreSQL database. Additional information and a tutorial have been provided on setting up a Hadoop cluster for storing vast amounts of data. This report has been generated as a deliverable for a New Mexico Small Business Assistance (NMSBA) project.
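
    As a rough illustration of the download-process-import workflow the tutorials cover, the sketch below loads a processed CSV file into a PostgreSQL table with psycopg2. The file name, table schema and connection details are assumptions made up for the example, not values from the report.

    ```python
    # Minimal sketch (assumptions, not from the report): load a processed CSV file
    # into a PostgreSQL table, as in the download -> process -> import workflow
    # the report's tutorials describe. Connection details and schema are made up.
    import csv
    import psycopg2

    conn = psycopg2.connect(host="localhost", dbname="warehouse",
                            user="etl_user", password="secret")
    with conn, conn.cursor() as cur:
        cur.execute("""
            CREATE TABLE IF NOT EXISTS measurements (
                recorded_at TIMESTAMP,
                sensor_id   TEXT,
                value       DOUBLE PRECISION
            )
        """)
        with open("processed_data.csv", newline="") as f:
            reader = csv.reader(f)
            next(reader)  # skip the header row
            cur.executemany(
                "INSERT INTO measurements (recorded_at, sensor_id, value) VALUES (%s, %s, %s)",
                reader,
            )
    conn.close()
    ```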

  16. Dynamic Database. Efficiently Convert Massive Quantities of Sensor Data into Actionable Information for Tactical Commanders

    DTIC Science & Technology

    2000-06-01

    As the number of sensors, platforms, exploitation sites, and command and control nodes continues to grow in response to Joint Vision 2010 information dominance requirements, Commanders and analysts will have an ever increasing need to collect and process vast amounts of data over wide areas using a large number of disparate sensors and information gathering sources.

  17. Application of Climate Assessment Tool (CAT) to estimate climate variability impacts on nutrient loading from local watersheds

    Treesearch

    Ying Ouyang; Prem B. Parajuli; Gary Feng; Theodor D. Leininger; Yongshan Wan; Padmanava Dash

    2018-01-01

    A vast amount of future climate scenario datasets, created by climate models such as general circulation models (GCMs), have been used in conjunction with watershed models to project future climate variability impact on hydrological processes and water quality. However, these low spatial-temporal resolution datasets are often difficult to downscale spatially and...

  18. Natural Language Processing: A Tutorial. Revision

    DTIC Science & Technology

    1990-01-01

    English in word-for-word language translations. An oft-repeated (although fictional) anecdote illustrates the ... English by a language translation program, became: "The vodka is strong but the steak is rotten." The point made is that vast amounts of knowledge...are required for effective language translations. The initial goal for Language Translation was "fully-automatic high-quality translation" (FAHQT).

  19. Abduction, Deduction and Induction: Can These Concepts Be Used for an Understanding of Methodological Processes in Interpretative Case Studies?

    ERIC Educational Resources Information Center

    Åsvoll, Håvard

    2014-01-01

    Within the area of interpretative case studies, there appears to be a vast amount of literature about theoretical interpretations as the main analytical strategy. In light of this theoretically based strategy in case studies, this article presents an extended perspective based on Charles Sanders Peirce's concepts of abduction, deduction and…

  20. [The need to develop demographic census systems for Latin America].

    PubMed

    Silva, A

    1987-01-01

    The author presents the case for developing new software packages specifically designed to process population census information for Latin America. The focus is on the problems faced by developing countries in handling vast amounts of data in an efficient way. First, the basic methods of census data processing are discussed, then brief descriptions of some of the available software are included. Finally, ways in which data processing programs could be geared toward and utilized for improving the accuracy of Latin American censuses in the 1990s are proposed.

  1. Sensible use of antisense: how to use oligonucleotides as research tools.

    PubMed

    Myers, K J; Dean, N M

    2000-01-01

    In the past decade, there has been a vast increase in the amount of gene sequence information that has the potential to revolutionize the way diseases are both categorized and treated. Old diagnoses, largely anatomical or descriptive in nature, are likely to be superseded by the molecular characterization of the disease. The recognition that certain genes drive key disease processes will also enable the rational design of gene-specific therapeutics. Antisense oligonucleotides represent a technology that should play multiple roles in this process.

  2. CD-ROM-aided Databases

    NASA Astrophysics Data System (ADS)

    Sano, Tomoyuki; Suzuki, Masataka; Nishida, Hideo

    The development of a CAI system using CD-ROM and NAPLPS (North American Presentation Level Protocol Syntax) was carried out at Himeji Dokkyo University. The characteristics of CAI using CD-ROM in an information-processing course for liberal arts students are described. In this system, the computer program and a vast amount of voice and graphics data are stored on a CD-ROM. It is very effective in improving students' learning ability.

  3. Cloud-Based Applications for Organizing and Reviewing Plastic Surgery Content

    PubMed Central

    Luan, Anna; Momeni, Arash; Lee, Gordon K.

    2015-01-01

    Cloud-based applications including Box, Dropbox, Google Drive, Evernote, Notability, and Zotero are available for smartphones, tablets, and laptops and have revolutionized the manner in which medical students and surgeons read and utilize plastic surgery literature. Here we provide an overview of the use of Cloud computing in practice and propose an algorithm for organizing the vast amount of plastic surgery literature. Given the incredible amount of data being produced in plastic surgery and other surgical subspecialties, it is prudent for plastic surgeons to lead the process of providing solutions for the efficient organization and effective integration of the ever-increasing data into clinical practice. PMID:26576208

  4. Assessing the Performance of Human-Automation Collaborative Planning Systems

    DTIC Science & Technology

    2011-06-01

    processing and incorporating vast amounts of incoming information into their solutions. However, these algorithms are brittle and unable to account for...planning system, a descriptive Mission Performance measure may address the total travel time on the path or the cost of the path (e.g. total work...minimizing costs or collisions [4, 32, 33]. Error measures for such a path planning system may track how many collisions occur or how much threat

  5. Pupillary Response as an Indicator of Processing Demands Within a Supervisory Control Simulation Environment

    DTIC Science & Technology

    2015-05-07

    Yerkes, R.M., and Dodson, J.D. (1908). The relation of strength of stimulus to rapidity of habit-formation. Journal of Comparative Neurology and Psychology 18, 459-482. ...the amount of mental effort exerted (Kahneman, 1973; Beatty & Lucero-Wagoner, 2000; Andreassi, 2007). The vast majority of these studies, however...of this initial study, the authors were interested in investigating whether pupillometry data collected in a realistic UAV supervisory control

  6. SHARED TECHNOLOGY TRANSFER PROGRAM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Griffin, John M.; Haut, Richard C.

    2008-03-07

    The program established a collaborative process with domestic industries for the purpose of sharing Navy-developed technology. Private sector businesses were educated so as to increase their awareness of the vast amount of technologies that are available, with an initial focus on technology applications that are related to the Hydrogen, Fuel Cells and Infrastructure Technologies (Hydrogen) Program of the U.S. Department of Energy. Specifically, the project worked to increase industry awareness of the vast technology resources available to them that have been developed with taxpayer funding. NAVSEA-Carderock and the Houston Advanced Research Center teamed with Nicholls State University to catalog NAVSEA-Carderock unclassified technologies, rated the level of readiness of the technologies and established a web-based catalog of the technologies. In particular, the catalog contains technology descriptions, including testing summaries and overviews of related presentations.

  7. Towards engineering of hormonal crosstalk in plant immunity.

    PubMed

    Shigenaga, Alexandra M; Berens, Matthias L; Tsuda, Kenichi; Argueso, Cristiana T

    2017-08-01

    Plant hormones regulate physiological responses in plants, including responses to pathogens and beneficial microbes. The last decades have provided a vast amount of evidence about the contribution of different plant hormones to plant immunity, and also of how they cooperate to orchestrate immunity activation, in a process known as hormone crosstalk. In this review we highlight the complexity of hormonal crosstalk in immunity and approaches currently being used to further understand this process, as well as perspectives to engineer hormone crosstalk for enhanced pathogen resistance and overall plant fitness. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Automatic building identification under bomb damage conditions

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Noll, Warren; Barker, Joseph; Wunsch, Donald C., II

    2009-05-01

    Given the vast amount of image intelligence utilized in support of planning and executing military operations, a passive automated image processing capability for target identification is urgently required. Furthermore, transmitting large image streams from remote locations would quickly use the available bandwidth (BW), precipitating the need for processing to occur at the sensor location. This paper addresses the problem of automatic target recognition for battle damage assessment (BDA). We utilize an Adaptive Resonance Theory approach to cluster templates of target buildings. The results show that the network successfully classifies targets from non-targets in a virtual test bed environment.

  9. NASA Automatic Information Security Handbook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This handbook details the Automated Information Security (AIS) management process for NASA. Automated information system security is becoming an increasingly important issue for all NASA managers; rapid advancements in computer and network technologies and the demanding nature of space exploration and space research have made NASA increasingly dependent on automated systems to store, process, and transmit vast amounts of mission support information, hence the need for AIS systems and management. This handbook provides consistent policies, procedures, and guidance to assure that an aggressive and effective AIS program is developed, implemented, and sustained at all NASA organizations and NASA support contractors.

  10. The iMars web-GIS - spatio-temporal data queries and single image web map services

    NASA Astrophysics Data System (ADS)

    Walter, S. H. G.; Steikert, R.; Schreiner, B.; Sidiropoulos, P.; Tao, Y.; Muller, J.-P.; Putry, A. R. D.; van Gasselt, S.

    2017-09-01

    We introduce a new approach for a system dedicated to planetary surface change detection by simultaneous visualisation of single-image time series in a multi-temporal context. In the context of the EU FP-7 iMars project we process and ingest vast amounts of automatically co-registered (ACRO) images. The basis of the co-registration is the high-precision HRSC multi-orbit quadrangle image mosaics, which are based on bundle-block-adjusted multi-orbit HRSC DTMs.

  11. Human Factors in Intelligence, Surveillance, and Reconnaissance: Gaps for Soldiers and Technology Recommendations

    DTIC Science & Technology

    2014-07-01

    technology work seeks to address gaps in the management, processing, and fusion of heterogeneous (i.e., soft and hard) information to aid human decision...and bandwidth) to exploit the vast and growing amounts of data [16], [17]. There is also a broad research program on techniques for soft and hard ...Mott, G. de Mel, and T. Pham, "Integrating hard and soft information sources for D2D using controlled natural language," in Proc. Information Fusion

  12. Raman Laser Spectrometer (RLS) on-board data processing and compression

    NASA Astrophysics Data System (ADS)

    Diaz, C.; Lopez, G.; Hermosilla, I.; Catalá, A.; Rodriguez, J. A.; Perez, C.; Diaz, E.

    2013-09-01

    The Raman Laser Spectrometer (RLS) is one of the Pasteur Payload instruments, within the ESA's Aurora Exploration Programme, ExoMars mission. Particularly, the RLS scientific objectives are as follows: identify organic compounds and search for life; identify the mineral products and indicators of biological activities; characterize mineral phases produced by water-related processes; characterize igneous minerals and their alteration products; characterize the water/geochemical environment as a function of depth in the shallow subsurface. A straightforward approach to operating the instrument would result in a vast amount of spectrum images. A flexible on-board data processing concept has been designed to adapt the scientific return to the sample nature and the data downlink bandwidth.

  13. Currently available methodologies for the processing of intravascular ultrasound and optical coherence tomography images.

    PubMed

    Athanasiou, Lambros; Sakellarios, Antonis I; Bourantas, Christos V; Tsirka, Georgia; Siogkas, Panagiotis; Exarchos, Themis P; Naka, Katerina K; Michalis, Lampros K; Fotiadis, Dimitrios I

    2014-07-01

    Optical coherence tomography and intravascular ultrasound are the most widely used methodologies in clinical practice as they provide high resolution cross-sectional images that allow comprehensive visualization of the lumen and plaque morphology. Several methods have been developed in recent years to process the output of these imaging modalities, which allow fast, reliable and reproducible detection of the luminal borders and characterization of plaque composition. These methods have proven useful in the study of the atherosclerotic process as they have facilitated analysis of a vast amount of data. This review presents currently available intravascular ultrasound and optical coherence tomography processing methodologies for segmenting and characterizing the plaque area, highlighting their advantages and disadvantages, and discusses the future trends in intravascular imaging.

  14. Kudi: A free open-source python library for the analysis of properties along reaction paths.

    PubMed

    Vogt-Geisse, Stefan

    2016-05-01

    With increasing computational capabilities, an ever-growing amount of data is generated in computational chemistry, containing a vast amount of chemically relevant information. It is therefore imperative to create new computational tools to process and extract this information in a sensible way. Kudi is an open-source library that aids in the extraction of chemical properties from reaction paths. The straightforward structure of Kudi makes it easy to use, allows effortless implementation of new capabilities, and permits extension to any quantum chemistry package. A use case for Kudi is shown for the tautomerization reaction of formic acid. Kudi is available free of charge at www.github.com/stvogt/kudi.

  15. Service Bundle Recommendation for Person-Centered Care Planning in Cities.

    PubMed

    Kotoulas, Spyros; Daly, Elizabeth; Tommasi, Pierpaolo; Kishimoto, Akihiro; Lopez, Vanessa; Stephenson, Martin; Botea, Adi; Sbodio, Marco; Marinescu, Radu; Rooney, Ronan

    2016-01-01

    Providing appropriate support for the most vulnerable individuals carries enormous societal significance and economic burden. Yet finding the right balance between costs, estimated effectiveness and the experience of the care recipient is a daunting task that requires considering vast amounts of information. We present a system that helps care teams choose the optimal combination of providers for a set of services. We draw from techniques in Open Data processing, semantic processing, faceted exploration, visual analytics, transportation analytics and multi-objective optimization. We present an implementation of the system using data from New York City and illustrate the feasibility of these technologies to guide care workers in care planning.

  16. Automated Meteor Detection by All-Sky Digital Camera Systems

    NASA Astrophysics Data System (ADS)

    Suk, Tomáš; Šimberová, Stanislava

    2017-12-01

    We have developed a set of methods to detect meteor light traces captured by all-sky CCD cameras. Operating at small automatic observatories (stations), these cameras create a network spread over a large territory. Image data coming from these stations are merged in one central node. Since a vast amount of data is collected by the stations in a single night, robotic storage and analysis are essential to processing. The proposed methodology is adapted to data from a network of automatic stations equipped with digital fish-eye cameras and includes data capturing, preparation, pre-processing, analysis, and finally recognition of objects in time sequences. In our experiments we utilized real observed data from two stations.
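
    The record does not spell out the detection algorithms, but a minimal frame-differencing pass gives a feel for the kind of automated analysis involved. The sketch below is a generic, assumed approach (not the authors' method): it flags frames in which enough pixels brighten suddenly relative to the previous frame.

    ```python
    # Illustrative sketch only: a naive frame-differencing detector for bright
    # transient trails in a night-sky image sequence. This is a generic approach
    # assumed for illustration, not the detection method developed by the authors.
    import numpy as np

    def detect_transients(frames: np.ndarray, threshold: float = 25.0, min_pixels: int = 40):
        """frames: array of shape (n_frames, height, width), grayscale intensities."""
        candidates = []
        for i in range(1, len(frames)):
            diff = frames[i].astype(np.float32) - frames[i - 1].astype(np.float32)
            bright = diff > threshold            # pixels that suddenly brightened
            if bright.sum() >= min_pixels:       # enough pixels to look like a trail
                ys, xs = np.nonzero(bright)
                candidates.append({"frame": i,
                                   "centroid": (float(ys.mean()), float(xs.mean())),
                                   "n_pixels": int(bright.sum())})
        return candidates

    # Example with synthetic data: a dark sky with one bright streak in frame 3.
    sky = np.full((5, 64, 64), 10.0)
    sky[3, 30:33, 10:50] += 200.0
    print(detect_transients(sky))
    ```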

  17. Molecular signatures from omics data: from chaos to consensus.

    PubMed

    Sung, Jaeyun; Wang, Yuliang; Chandrasekaran, Sriram; Witten, Daniela M; Price, Nathan D

    2012-08-01

    In the past 15 years, new "omics" technologies have made it possible to obtain high-resolution molecular snapshots of organisms, tissues, and even individual cells at various disease states and experimental conditions. It is hoped that these developments will usher in a new era of personalized medicine in which an individual's molecular measurements are used to diagnose disease, guide therapy, and perform other tasks more accurately and effectively than is possible using standard approaches. There now exists a vast literature of reported "molecular signatures". However, despite some notable exceptions, many of these signatures have suffered from limited reproducibility in independent datasets, insufficient sensitivity or specificity to meet clinical needs, or other challenges. In this paper, we discuss the process of molecular signature discovery on the basis of omics data. In particular, we highlight potential pitfalls in the discovery process, as well as strategies that can be used to increase the odds of successful discovery. Despite the difficulties that have plagued the field of molecular signature discovery, we remain optimistic about the potential to harness the vast amounts of available omics data in order to substantially impact clinical practice. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Enabling Big Geoscience Data Analytics with a Cloud-Based, MapReduce-Enabled and Service-Oriented Workflow Framework

    PubMed Central

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists. PMID:25742012
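
    As a minimal illustration of the MapReduce pattern such a framework parallelizes, the sketch below computes a per-grid-cell mean from comma-separated observations using the mrjob library. This is an assumed, simplified stand-in, not the authors' Hadoop/HBase implementation; the input format and field names are made up.

    ```python
    # Minimal sketch of the MapReduce pattern the framework builds on, written with
    # the mrjob library (an assumption; the authors use Hadoop/HBase directly).
    # Input lines are assumed to look like: "lat_bin,lon_bin,temperature".
    from mrjob.job import MRJob

    class MRCellMean(MRJob):
        def mapper(self, _, line):
            lat_bin, lon_bin, temp = line.split(",")
            # Key each observation by its grid cell so values can be reduced per cell.
            yield (lat_bin, lon_bin), float(temp)

        def reducer(self, cell, temps):
            values = list(temps)
            # Emit the mean observation for every grid cell.
            yield cell, sum(values) / len(values)

    if __name__ == "__main__":
        MRCellMean.run()
    ```

    Run locally with, for example, python mr_cell_mean.py observations.csv; the same job can be pointed at a Hadoop cluster without changing the mapper or reducer.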

  19. Stochastic theory of nonequilibrium steady states and its applications. Part I

    NASA Astrophysics Data System (ADS)

    Zhang, Xue-Juan; Qian, Hong; Qian, Min

    2012-01-01

    The concepts of equilibrium and nonequilibrium steady states are introduced in the present review as mathematical concepts associated with stationary Markov processes. For both discrete stochastic systems with master equations and continuous diffusion processes with Fokker-Planck equations, the nonequilibrium steady state (NESS) is characterized in terms of several key notions which are originated from nonequilibrium physics: time irreversibility, breakdown of detailed balance, free energy dissipation, and positive entropy production rate. After presenting this NESS theory in pedagogically accessible mathematical terms that require only a minimal amount of prerequisites in nonlinear differential equations and the theory of probability, it is applied, in Part I, to two widely studied problems: the stochastic resonance (also known as coherent resonance) and molecular motors (also known as Brownian ratchet). Although both areas have advanced rapidly on their own with a vast amount of literature, the theory of NESS provides them with a unifying mathematical foundation. Part II of this review contains applications of the NESS theory to processes from cellular biochemistry, ranging from enzyme catalyzed reactions, kinetic proofreading, to zeroth-order ultrasensitivity.

  20. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    PubMed

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework, techniques are proposed that leverage cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data. A service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  1. FPGA-based prototype storage system with phase change memory

    NASA Astrophysics Data System (ADS)

    Li, Gezi; Chen, Xiaogang; Chen, Bomy; Li, Shunfen; Zhou, Mi; Han, Wenbing; Song, Zhitang

    2016-10-01

    With the ever-increasing amount of data being stored via social media, mobile telephony base stations, network devices, etc., database systems face severe bandwidth bottlenecks when moving vast amounts of data from storage to the processing nodes. At the same time, Storage Class Memory (SCM) technologies such as Phase Change Memory (PCM), with unique features like fast read access, high density, non-volatility, byte-addressability, positive response to increasing temperature, superior scalability, and zero standby leakage, have changed the landscape of modern computing and storage systems. In such a scenario, we present a storage system called FLEET which can off-load partial or whole SQL queries from the CPU to the storage engine. FLEET uses an FPGA rather than conventional CPUs to implement the off-load engine due to its highly parallel nature. We have implemented an initial prototype of FLEET with PCM-based storage. The results demonstrate that significant performance and CPU utilization gains can be achieved by pushing selected query processing components inside PCM-based storage.

  2. Text Mining in Biomedical Domain with Emphasis on Document Clustering.

    PubMed

    Renganathan, Vinaitheerthan

    2017-07-01

    With the exponential increase in the number of articles published every year in the biomedical domain, there is a need to build automated systems to extract unknown information from the articles published. Text mining techniques enable the extraction of unknown knowledge from unstructured documents. This paper reviews text mining processes in detail and the software tools available to carry out text mining. It also reviews the roles and applications of text mining in the biomedical domain. Text mining processes, such as search and retrieval of documents, pre-processing of documents, natural language processing, methods for text clustering, and methods for text classification are described in detail. Text mining techniques can facilitate the mining of vast amounts of knowledge on a given topic from published biomedical research articles and draw meaningful conclusions that are not possible otherwise.
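
    One of the steps the review covers, document clustering, can be sketched in a few lines with scikit-learn: TF-IDF weighting followed by k-means. The toy abstracts, cluster count and parameters below are placeholders for illustration, not material from the paper.

    ```python
    # A minimal sketch (illustrative, not from the review) of one step it describes:
    # clustering documents by TF-IDF similarity with scikit-learn.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    abstracts = [
        "EGFR mutations predict response to tyrosine kinase inhibitors in lung cancer",
        "Gut microbiome composition is altered in inflammatory bowel disease",
        "Tyrosine kinase inhibitor resistance mechanisms in EGFR-mutant tumours",
        "Microbiota transplantation as a therapy for recurrent intestinal inflammation",
    ]

    vectorizer = TfidfVectorizer(stop_words="english")   # pre-processing + term weighting
    X = vectorizer.fit_transform(abstracts)

    kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
    for doc, label in zip(abstracts, kmeans.labels_):
        print(label, doc[:60])
    ```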

  3. FLIPing heterokaryons to analyze nucleo-cytoplasmic shuttling of yeast proteins.

    PubMed

    Belaya, Katsiaryna; Tollervey, David; Kos, Martin

    2006-05-01

    Nucleo-cytoplasmic shuttling is an important feature of proteins involved in nuclear export/import of RNAs, proteins, and also large ribonucleoprotein complexes such as ribosomes. The vast amount of proteomic data available shows that many of these processes are highly dynamic. Therefore, methods are needed to reliably assess whether a protein shuttles between nucleus and cytoplasm, and the kinetics with which it exchanges. Here we describe a combination of the classical heterokaryon assay with fluorescence recovery after photobleaching (FRAP) and fluorescence loss in photobleaching (FLIP) techniques, which allows an assessment of the kinetics of protein shuttling in the yeast Saccharomyces cerevisiae.

  4. The role of black holes in galaxy formation and evolution.

    PubMed

    Cattaneo, A; Faber, S M; Binney, J; Dekel, A; Kormendy, J; Mushotzky, R; Babul, A; Best, P N; Brüggen, M; Fabian, A C; Frenk, C S; Khalatyan, A; Netzer, H; Mahdavi, A; Silk, J; Steinmetz, M; Wisotzki, L

    2009-07-09

    Virtually all massive galaxies, including our own, host central black holes ranging in mass from millions to billions of solar masses. The growth of these black holes releases vast amounts of energy that powers quasars and other weaker active galactic nuclei. A tiny fraction of this energy, if absorbed by the host galaxy, could halt star formation by heating and ejecting ambient gas. A central question in galaxy evolution is the degree to which this process has caused the decline of star formation in large elliptical galaxies, which typically have little cold gas and few young stars, unlike spiral galaxies.

  5. Telearch - Integrated visual simulation environment for collaborative virtual archaeology.

    NASA Astrophysics Data System (ADS)

    Kurillo, Gregorij; Forte, Maurizio

    Archaeologists collect vast amounts of digital data around the world; however, they lack tools for integration and collaborative interaction to support the reconstruction and interpretation process. The TeleArch software aims to integrate different data sources and provide real-time interaction tools for remote collaboration of geographically distributed scholars inside a shared virtual environment. The framework also includes audio, 2D and 3D video streaming technology to facilitate the remote presence of users. In this paper, we present several experimental case studies to demonstrate the integration and interaction with 3D models and geographical information system (GIS) data in this collaborative environment.

  6. Bottom to Top Approach for Railway KPI Generation

    NASA Astrophysics Data System (ADS)

    Villarejo, Roberto; Johansson, Carl-Anders; Leturiondo, Urko; Simon, Victor; Seneviratne, Dammika; Galar, Diego

    2017-09-01

    Railway maintenance, especially on infrastructure, produces a vast amount of data. However, having data is not synonymous with having information; rather, data must be processed to extract information. In railway maintenance, the development of key performance indicators (KPIs) linked to punctuality or capacity can help plan and schedule maintenance, thus aligning the maintenance department with corporate objectives. There is a need for an improved method to analyse railway data to find the relevant KPIs. The system should support maintainers, answering such questions as what maintenance should be done, where and when. The system should equip the user with knowledge of the infrastructure's condition and configuration, and the traffic situation, so maintenance resources can be targeted to only those areas needing work. The amount of information is vast, so it must be hierarchized and aggregated; users must filter out the useless indicators. Data are fused by compiling several individual indicators into a single index; the resulting composite indicators measure multidimensional concepts which cannot be captured by a single index. The paper describes a method of monitoring a complex entity. In this scenario, a plurality of use indices and weighting values are used to create a composite and aggregated use index from a combination of lower-level use indices and weighting values. The resulting composite and aggregated indicators can be a decision-making tool for asset managers at different hierarchical levels.

  7. How attention gates social interactions.

    PubMed

    Capozzi, Francesca; Ristic, Jelena

    2018-05-25

    Social interactions are at the core of social life. However, humans selectively choose their exchange partners and do not engage in all available opportunities for social encounters. In this review, we argue that attentional systems play an important role in guiding the selection of social interactions. Supported by both classic and emerging literature, we identify and characterize the three core processes (perception, interpretation, and evaluation) that interact with attentional systems to modulate selective responses to social environments. Perceptual processes facilitate attentional prioritization of social cues. Interpretative processes link attention with understanding of cues' social meanings and agents' mental states. Evaluative processes determine the perceived value of the source of social information. The interplay between attention and these three routes of processing places attention in a powerful role to manage the selection of the vast amount of social information that individuals encounter on a daily basis and, in turn, gate the selection of social interactions. © 2018 New York Academy of Sciences.

  8. Evaluating White Space of a Malaysian Secondary ELT Textbook

    ERIC Educational Resources Information Center

    Jin, Ng Yu

    2010-01-01

    Students are believed to learn at an optimum level with materials that have a lot of white space, since a vast number of words and a limited amount of white space, especially in textbooks, may contribute to increased anxiety among learners (Tomlinson, 1998). This study evaluates the amount of white space in terms of pixels in the…

  9. Lusitanization and Bakhtinian Perspectives on the Role of Portuguese in Angola and East Timor

    ERIC Educational Resources Information Center

    Makoni, Sinfree Bullock; Severo, Cristine

    2015-01-01

    A vast amount of literature addresses issues surrounding English and French in colonial and post-colonial communities. However, relative to the spread of English and French language ideology, a limited amount of literature exists on Lusitanization (i.e. the spread of Portuguese colonial ideology by Portugal during colonialism and the role of…

  10. Discarding the Throwaway Society. Worldwatch Paper 101.

    ERIC Educational Resources Information Center

    Young, John E.

    Today's industrial economies were founded on the use of vast quantities of materials and energy, and the economic health of nations has often been equated with the amount they consumed. The amount of materials that originally enters an economy tells nothing about the material's eventual fate or its contribution to human well-being. It tells a good…

  11. Selective attention in multi-chip address-event systems.

    PubMed

    Bartolozzi, Chiara; Indiveri, Giacomo

    2009-01-01

    Selective attention is the strategy used by biological systems to cope with the inherent limits in their available computational resources, in order to efficiently process sensory information. The same strategy can be used in artificial systems that have to process vast amounts of sensory data with limited resources. In this paper we present a neuromorphic VLSI device, the "Selective Attention Chip" (SAC), which can be used to implement these models in multi-chip address-event systems. We also describe a real-time sensory-motor system, which integrates the SAC with a dynamic vision sensor and a robotic actuator. We present experimental results from each component in the system, and demonstrate how the complete system implements a real-time stimulus-driven selective attention model.

  12. Telecommunications issues of intelligent database management for ground processing systems in the EOS era

    NASA Technical Reports Server (NTRS)

    Touch, Joseph D.

    1994-01-01

    Future NASA earth science missions, including the Earth Observing System (EOS), will be generating vast amounts of data that must be processed and stored at various locations around the world. Here we present a stepwise-refinement of the intelligent database management (IDM) of the distributed active archive center (DAAC - one of seven regionally-located EOSDIS archive sites) architecture, to showcase the telecommunications issues involved. We develop this architecture into a general overall design. We show that the current evolution of protocols is sufficient to support IDM at Gbps rates over large distances. We also show that network design can accommodate a flexible data ingestion storage pipeline and a user extraction and visualization engine, without interference between the two.

  13. The Livermore Brain: Massive Deep Learning Networks Enabled by High Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Barry Y.

    The proliferation of inexpensive sensor technologies like the ubiquitous digital image sensors has resulted in the collection and sharing of vast amounts of unsorted and unexploited raw data. Companies and governments who are able to collect and make sense of large datasets to help them make better decisions more rapidly will have a competitive advantage in the information era. Machine Learning technologies play a critical role in automating the data understanding process; however, to be maximally effective, useful intermediate representations of the data are required. These representations or "features" are transformations of the raw data into a form where patterns are more easily recognized. Recent breakthroughs in Deep Learning have made it possible to learn these features from large amounts of labeled data. The focus of this project is to develop and extend Deep Learning algorithms for learning features from vast amounts of unlabeled data and to develop the HPC neural network training platform to support the training of massive network models. This LDRD project succeeded in developing new unsupervised feature learning algorithms for images and video and created a scalable neural network training toolkit for HPC. Additionally, this LDRD helped create the world's largest freely-available image and video dataset supporting open multimedia research and used this dataset for training our deep neural networks. This research helped LLNL capture several work-for-others (WFO) projects, attract new talent, and establish collaborations with leading academic and commercial partners. Finally, this project demonstrated the successful training of the largest unsupervised image neural network using HPC resources and helped establish LLNL leadership at the intersection of Machine Learning and HPC research.

  14. Advanced Integrated Power Systems (AIPS)

    DTIC Science & Technology

    2012-10-08

    to the vast amount of DC devices (especially electronics such as computers, etc.). The system would have AC inverters in ...allowed the generator to cycle on and off, a system with added energy storage plus significant amounts of solar energy, and a system with the same solar...fuel (Shaffer March 2009). This equates to roughly half of the fuel in theater being used to deliver

  15. Apples to Apples: Towards a Pan-Canadian Common University Data Set

    ERIC Educational Resources Information Center

    Junor, Sean; Kramer, Miriam; Usher, Alex

    2006-01-01

    Most universities in Canada are spending enormous amounts of effort collecting vast amounts of data, but are still--in some quarters--perceived to be unable to report data effectively. In other countries, this kind of problem has been resolved in two different ways. But in Canada, this hasn't happened. The purpose of this paper is to shed some…

  16. A Framework for Distributed Problem Solving

    NASA Astrophysics Data System (ADS)

    Leone, Joseph; Shin, Don G.

    1989-03-01

    This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with a magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process which primarily relies on the ability to make associations between vast amounts of related concepts, sort out the combined results, and promote the most plausible ones. The amplification process is discussed in detail. An implementation of the amplification process is presented. The process is illustrated by an example.
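
    A toy sketch of the two information-flow mechanisms described above follows. It only illustrates the idea of amplification (expanding an agenda into more specific subtasks down the hierarchy) and aggregation (combining reports into a resolution on the way up); it is not the AM/AG implementation, and the agenda string and branching factor are arbitrary.

    ```python
    # Toy illustration of amplification/aggregation in a task hierarchy.
    # Assumed structure for demonstration only, not the AM/AG implementation.
    def amplify(agenda: str, depth: int) -> list:
        """Expand an agenda into more specific subtasks for the levels below."""
        if depth == 0:
            return [agenda]
        return [sub for i in range(2)                      # two processing units per level
                for sub in amplify(f"{agenda}.{i}", depth - 1)]

    def aggregate(partial_results: list) -> str:
        """Combine reports from lower units into a single resolution."""
        return " | ".join(partial_results)

    subtasks = amplify("recall:concept", depth=2)          # downward amplification
    results = [f"result({t})" for t in subtasks]           # each leaf unit works its subtask
    print(aggregate(results))                              # upward aggregation
    ```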

  17. Text Mining in Biomedical Domain with Emphasis on Document Clustering

    PubMed Central

    2017-01-01

    Objectives With the exponential increase in the number of articles published every year in the biomedical domain, there is a need to build automated systems to extract unknown information from the articles published. Text mining techniques enable the extraction of unknown knowledge from unstructured documents. Methods This paper reviews text mining processes in detail and the software tools available to carry out text mining. It also reviews the roles and applications of text mining in the biomedical domain. Results Text mining processes, such as search and retrieval of documents, pre-processing of documents, natural language processing, methods for text clustering, and methods for text classification are described in detail. Conclusions Text mining techniques can facilitate the mining of vast amounts of knowledge on a given topic from published biomedical research articles and draw meaningful conclusions that are not possible otherwise. PMID:28875048

  18. There’s carbon in them thar hills: But how much? Could Pacific Northwest forests store more?

    Treesearch

    Andrea Watts; Andrew Gray; Thomas Whittier

    2017-01-01

    As a signatory to the United Nations Framework Convention on Climate Change, the United States annually compiles a report on the nation’s carbon flux—the amount of carbon emitted into the atmosphere compared to the amount stored by terrestrial landscapes. Forests store vast amounts of carbon, but it’s not fully understood how a forest’s storage capacity fluctuates as...

  19. Radiance calibration of the High Altitude Observatory white-light coronagraph on Skylab

    NASA Technical Reports Server (NTRS)

    Poland, A. I.; Macqueen, R. M.; Munro, R. H.; Gosling, J. T.

    1977-01-01

    The processing of over 35,000 photographs of the solar corona obtained by the white-light coronagraph on Skylab is described. Calibration of the vast amount of data was complicated by temporal effects of radiation fog and latent image loss. These effects were compensated for by imaging a calibration step wedge on each data frame. Absolute calibration of the wedge was accomplished through comparison with a set of previously calibrated glass opal filters. Analysis employed average characteristic curves derived from measurements of step wedges from many frames within a given camera half-load. The net absolute accuracy of a given radiance measurement is estimated to be 20%.

  20. Cryogenic and radiation hard ASIC design for large format NIR/SWIR detector

    NASA Astrophysics Data System (ADS)

    Gao, Peng; Dupont, Benoit; Dierickx, Bart; Müller, Eric; Verbruggen, Geert; Gielis, Stijn; Valvekens, Ramses

    2014-10-01

    An ASIC is developed for the control and data quantization of large-format NIR/SWIR detector arrays. Both cryogenic operation and the space radiation environment are considered during the design. The ASIC can therefore be integrated in the cryogenic chamber, which significantly reduces the number of long wires going in and out of the chamber, benefiting EMI and noise performance as well as the power consumption of the cooling system and interfacing circuits. In this paper, we describe the development of this prototype ASIC for image sensor driving and signal processing, as well as testing at both room and cryogenic temperatures.

  1. Logic programming and metadata specifications

    NASA Technical Reports Server (NTRS)

    Lopez, Antonio M., Jr.; Saacks, Marguerite E.

    1992-01-01

    Artificial intelligence (AI) ideas and techniques are critical to the development of intelligent information systems that will be used to collect, manipulate, and retrieve the vast amounts of space data produced by 'Missions to Planet Earth.' Natural language processing, inference, and expert systems are at the core of this space application of AI. This paper presents logic programming as an AI tool that can support inference (the ability to draw conclusions from a set of complicated and interrelated facts). It reports on the use of logic programming in the study of metadata specifications for a small problem domain of airborne sensors, and the dataset characteristics and pointers that are needed for data access.

  2. Information specialist for a coming age (9)

    NASA Astrophysics Data System (ADS)

    Shibata, Ryosuke

    As competition among enterprises has become more severe, the role of the information center has increased. The larger an organization becomes through diversified business operations, the harder it is for the personnel in charge of business planning to find the information they need. The role of the information center is to help users find the appropriate information. Enterprises must also select, from a vast amount of supporting information, the items that give some indication when constructing strategy. If the information center serves to select, analyze and process such information, that is exactly what is categorized as strategic information activity. To promote those activities, we have to consider how information centers should be positioned inside the enterprise.

  3. A Comprehensive Computer Package for Ambulatory Surgical Facilities

    PubMed Central

    Kessler, Robert R.

    1980-01-01

    Ambulatory surgical centers are a cost effective alternative to hospital surgery. Their increasing popularity has contributed to heavy case loads, an accumulation of vast amounts of medical and financial data and economic pressures to maintain a tight control over “cash flow”. Computerization is now a necessity to aid ambulatory surgical centers to maintain their competitive edge. An on-line system is especially necessary as it allows interactive scheduling of surgical cases, immediate access to financial data and rapid gathering of medical and statistical information. This paper describes the significant features of the computer package in use at the Salt Lake Surgical Center, which processes 500 cases per month.

  4. Geological applications of machine learning on hyperspectral remote sensing data

    NASA Astrophysics Data System (ADS)

    Tse, C. H.; Li, Yi-liang; Lam, Edmund Y.

    2015-02-01

    The CRISM imaging spectrometer orbiting Mars has been producing a vast amount of data in the visible to infrared wavelengths in the form of hyperspectral data cubes. These data, compared with those obtained from previous remote sensing techniques, yield an unprecedented level of detailed spectral resolution in addition to an ever-increasing level of spatial information. A major challenge brought about by the data is the burden of processing and interpreting these datasets and extracting the relevant information from them. This research aims to approach the challenge by exploring machine learning methods, especially unsupervised learning, to achieve cluster density estimation and classification, and ultimately to devise an efficient means of identifying minerals. A set of software tools has been constructed in Python to access and experiment with CRISM hyperspectral cubes selected from two specific Mars locations. A machine learning pipeline is proposed and unsupervised learning methods were applied to pre-processed datasets. The resulting data clusters are compared with the published ASTER spectral library and browse data products from the Planetary Data System (PDS). The results demonstrated that this approach is capable of processing the huge amount of hyperspectral data and potentially providing guidance to scientists for more detailed studies.
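
    A minimal sketch of the kind of unsupervised step described, assuming nothing about the actual pipeline: reshape a hyperspectral cube so each pixel is one spectrum, cluster the spectra with k-means, and map the labels back to image coordinates. The cube here is synthetic random data, and the band count and number of clusters are arbitrary placeholders.

    ```python
    # Sketch of an unsupervised clustering step for a hyperspectral cube.
    # The cube is synthetic; CRISM data dimensions and the cluster count are assumptions.
    import numpy as np
    from sklearn.cluster import KMeans

    rows, cols, bands = 50, 60, 100
    cube = np.random.rand(rows, cols, bands)        # stand-in for a calibrated data cube

    spectra = cube.reshape(-1, bands)               # one row per pixel spectrum
    labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(spectra)
    cluster_map = labels.reshape(rows, cols)        # spatial map of spectral clusters

    print(cluster_map.shape, np.unique(cluster_map))
    ```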

  5. UK to train 100 PhD students in data science

    NASA Astrophysics Data System (ADS)

    Allen, Michael

    2017-12-01

    A new PhD programme to develop techniques to handle the vast amounts of data being generated by experiments and facilities has been launched by the UK's Science and Technology Facilities Council (STFC).

  6. Geomorphologic Map of Titan's Polar Terrains

    NASA Astrophysics Data System (ADS)

    Birch, S. P. D.; Hayes, A. G.; Malaska, M. J.; Lopes, R. M. C.; Schoenfeld, A.; Williams, D. A.

    2016-06-01

    Titan's lakes and seas contain vast amounts of information regarding the history and evolution of Saturn's largest moon. To understand this landscape, we created a geomorphologic map, and then used our map to develop an evolutionary model.

  7. Applying Social Tagging to Manage Cognitive Load in a Web 2.0 Self-Learning Environment

    ERIC Educational Resources Information Center

    Huang, Yueh-Min; Huang, Yong-Ming; Liu, Chien-Hung; Tsai, Chin-Chung

    2013-01-01

    Web-based self-learning (WBSL) has received a lot of attention in recent years due to the vast amount of varied materials available in the Web 2.0 environment. However, this large amount of material also has resulted in a serious problem of cognitive overload that degrades the efficacy of learning. In this study, an information graphics method is…

  8. A study of TRIGLYCINE SULFATE (TGS) crystals from the International Microgravity Laboratory Mission (IML-1)

    NASA Technical Reports Server (NTRS)

    Lal, R. B.

    1992-01-01

    Preliminary evaluation of the data was made during the hologram processing procedure. A few representative holograms were selected and reconstructed in the HGS; photographs of sample particle images were made to illustrate the resolution of all three particle sizes. Based on these evaluations, slight modifications were requested in the hologram processing procedure to optimize the hologram exposure in the vicinity of the crystal. Preliminary looks at the data showed that we were able to see and track all three sizes of particles throughout the chamber. Because of the vast amount of data available in the holograms, it was recommended that we produce a detailed data reduction plan with prioritization of the different types of data that can be extracted from the holograms.

  9. Distributed and parallel approach for handle and perform huge datasets

    NASA Astrophysics Data System (ADS)

    Konopko, Joanna

    2015-12-01

    Big Data refers to dynamic, large, and disparate volumes of data coming from many different sources (tools, machines, sensors, mobile devices) that are uncorrelated with each other. It requires new, innovative, and scalable technology to collect, host, and analytically process the vast amount of data. A proper architecture for a system that processes huge data sets is needed. In this paper, distributed and parallel system architectures are compared using the example of the MapReduce (MR) Hadoop platform and a parallel database platform (DBMS). The paper also analyzes the problem of extracting and handling valuable information from petabytes of data. Both paradigms, MapReduce and parallel DBMS, are described and compared. A hybrid architecture is also proposed and could be used to solve the analyzed problem of storing and processing Big Data.
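
    To make the MapReduce paradigm referenced above concrete, the fragment below expresses a simple aggregation (counting records per key) as map and reduce functions in plain Python. It is a conceptual sketch only; a real Hadoop job would distribute the same two functions across many nodes, and the record format is assumed for the example.

```python
# Conceptual MapReduce sketch: count records per source key (pure Python, single process).
from itertools import groupby
from operator import itemgetter

def map_phase(record):
    # Emit (key, 1) pairs; the key here is an assumed 'source' field of the record.
    yield (record["source"], 1)

def reduce_phase(key, values):
    # Sum all counts emitted for the same key.
    return (key, sum(values))

records = [{"source": "sensor"}, {"source": "mobile"}, {"source": "sensor"}]

# Shuffle/sort step: group intermediate pairs by key, as the framework would.
pairs = sorted(kv for r in records for kv in map_phase(r))
result = [reduce_phase(k, (v for _, v in grp)) for k, grp in groupby(pairs, key=itemgetter(0))]
print(result)  # [('mobile', 1), ('sensor', 2)]
```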

  10. Mining chemical information from open patents

    PubMed Central

    2011-01-01

    Linked Open Data presents an opportunity to vastly improve the quality of science in all fields by increasing the availability and usability of the data upon which it is based. In the chemical field, there is a huge amount of information available in the published literature, the vast majority of which is not available in machine-understandable formats. PatentEye, a prototype system for the extraction and semantification of chemical reactions from the patent literature, has been implemented and is discussed. A total of 4444 reactions were extracted from 667 patent documents that comprised 10 weeks' worth of publications from the European Patent Office (EPO), with a precision of 78% and recall of 64% with regard to determining the identity and amount of reactants employed, and an accuracy of 92% with regard to product identification. NMR spectra reported as product characterisation data are additionally captured. PMID:21999425

  11. Long term trending of engineering data for the Hubble Space Telescope

    NASA Technical Reports Server (NTRS)

    Cox, Ross M.

    1993-01-01

    A major goal in spacecraft engineering analysis is the detection of component failures before the fact. Trending is the process of monitoring subsystem states to discern unusual behaviors. This involves reducing vast amounts of data about a component or subsystem into a form that helps humans discern underlying patterns and correlations. A long term trending system has been developed for the Hubble Space Telescope. Besides processing the data for 988 distinct telemetry measurements each day, it produces plots of 477 important parameters for the entire 24 hours. Daily updates to the trend files also produce 339 thirty-day trend plots each month. The total system combines command procedures to control the execution of the C-based data processing program, user-written FORTRAN routines, and commercial off-the-shelf plotting software. This paper includes a discussion of the performance of the trending system and of its limitations.
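
    The core of such trending is the reduction step: collapsing a day's worth of samples for each telemetry measurement into a few summary statistics that can be appended to a trend file and plotted. The sketch below shows one minimal way to do that in Python; the sample format, chosen statistics, and measurement names are assumptions for illustration, not the HST system's actual design.

```python
# Minimal daily-trend reduction sketch: per-measurement min/mean/max (assumed data format).
from collections import defaultdict
from statistics import mean

# Assumed raw telemetry: (measurement_id, value) samples for one day.
samples = [("BATT_TEMP", 21.3), ("BATT_TEMP", 22.1), ("BUS_VOLT", 28.0), ("BUS_VOLT", 27.8)]

by_id = defaultdict(list)
for mid, value in samples:
    by_id[mid].append(value)

# One summary record per measurement per day, ready to append to a long-term trend file.
daily_trend = {mid: (min(v), mean(v), max(v)) for mid, v in by_id.items()}
for mid, (lo, avg, hi) in sorted(daily_trend.items()):
    print(f"{mid}: min={lo:.2f} mean={avg:.2f} max={hi:.2f}")
```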

  12. Cesium separation from contaminated milk using magnetic particles containing crystalline silicotitantes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nunez, L.; Kaminski, M.; Chemical Engineering

    2000-11-01

    The Chernobyl nuclear reactor disaster in 1986 contaminated vast regions of prime grazing land. Subsequently, milk produced in the region has been contaminated with small amounts of the long-lived fission product cesium-137, and the Ukraine is seeking to deploy a simple separation process that will remove the Cs and preserve the nutritional value of the milk. Tiny magnetic particles containing crystalline silicotitanates (CST) have been manufactured and tested to this end. The results show that partitioning efficiency is optimized with low ratios of particle mass to volume. To achieve 90% Cs decontamination in a single-stage process, <3 g of magnetic CST per liter of milk is sufficient with a 30-min mixing time. A two-stage process would utilize <0.4 g/L per stage. The modeling of the magnetic CST system described herein can be achieved rather simply, which is important for deployment in the affected Ukraine region.
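
    The single-stage versus two-stage sorbent requirements follow from simple staging arithmetic. The relation below is a hedged illustration assuming two identical, independent contact stages; the study's own model may include sorption isotherms and kinetics that this ignores.

```latex
% Residual cesium fraction after n identical stages, each removing a fraction f:
%   R_n = (1 - f)^n
% For an overall decontamination of 90% (R_2 = 0.10) achieved in two stages:
%   (1 - f)^2 = 0.10  =>  f = 1 - sqrt(0.10) ~ 0.68,
% so each stage only needs roughly 68% removal, which is why the per-stage sorbent
% loading (< 0.4 g/L) can be far below the single-stage requirement (< 3 g/L).
\[
  R_n = (1-f)^n, \qquad (1-f)^2 = 0.10 \;\Rightarrow\; f = 1-\sqrt{0.10}\approx 0.68 .
\]
```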

  13. EPA Science Matters Newsletter: EPA Recovery Champions Help Effort to Save Threatened Owl (Published January 2014)

    EPA Pesticide Factsheets

    Learn about HexSim, a program developed by the EPA that incorporates vast amounts of available data about dwindling wildlife species, such as spotted owls, to create scenarios involving virtual populations.

  14. Visually Managing IPsec

    DTIC Science & Technology

    2010-03-01

    The United States Air Force relies heavily on computer networks to transmit vast amounts of information throughout its organizations and with agencies... concepts are presented and explored. Chapter II provides background information on the current technologies that…

  15. [Contribution and challenges of Big Data in oncology].

    PubMed

    Saintigny, Pierre; Foy, Jean-Philippe; Ferrari, Anthony; Cassier, Philippe; Viari, Alain; Puisieux, Alain

    2017-03-01

    Since the first draft of the human genome sequence was published in 2001, the cost of sequencing has dramatically decreased. The development of new technologies such as next generation sequencing led to a comprehensive characterization of a large number of tumors of various types as well as to significant advances in precision medicine. Despite the valuable information this technological revolution has made it possible to produce, the vast amount of data generated resulted in the emergence of new challenges for the biomedical community, such as data storage, processing and mining. Here, we describe the contribution and challenges of Big Data in oncology. Copyright © 2016 Société Française du Cancer. Published by Elsevier Masson SAS. All rights reserved.

  16. A gender- and sexual orientation-dependent spatial attentional effect of invisible images.

    PubMed

    Jiang, Yi; Costello, Patricia; Fang, Fang; Huang, Miner; He, Sheng

    2006-11-07

    Human observers are constantly bombarded with a vast amount of information. Selective attention helps us to quickly process what is important while ignoring the irrelevant. In this study, we demonstrate that information that has not entered observers' consciousness, such as interocularly suppressed (invisible) erotic pictures, can direct the distribution of spatial attention. Furthermore, invisible erotic information can either attract or repel observers' spatial attention depending on their gender and sexual orientation. While unaware of the suppressed pictures, heterosexual males' attention was attracted to invisible female nudes, heterosexual females' attention was attracted to invisible male nudes, gay males behaved similarly to heterosexual females, and gay/bisexual females performed in-between heterosexual males and females.

  17. Utilization of ontology look-up services in information retrieval for biomedical literature.

    PubMed

    Vishnyakova, Dina; Pasche, Emilie; Lovis, Christian; Ruch, Patrick

    2013-01-01

    With the vast amount of biomedical data available, we face the necessity of improving information retrieval processes in the biomedical domain. The use of biomedical ontologies has facilitated the combination of various data sources (e.g. scientific literature, clinical data repositories) by increasing the quality of information retrieval and reducing maintenance effort. In this context, we developed Ontology Look-up Services (OLS) based on the NEWT and MeSH vocabularies. Our services were involved in several information retrieval tasks such as gene/disease normalization. The implementation of the OLS services significantly accelerated the extraction of particular biomedical facts by structuring and enriching the data context. Precision in the normalization tasks was boosted by about 20%.

  18. A Fault-Tolerant Radiation-Robust Mass Storage Concept for Highly Scaled Flash Memory

    NASA Astrophysics Data System (ADS)

    Fuchs, Cristian M.; Trinitis, Carsten; Appel, Nicolas; Langer, Martin

    2015-09-01

    Future space missions will require vast amounts of data to be stored and processed aboard spacecraft. While satisfying operational mission requirements, storage systems must guarantee data integrity and recover damaged data throughout the mission. NAND-flash memories have become popular for space-borne high-performance mass memory scenarios, though future storage concepts will rely upon highly scaled flash or other memory technologies. With modern flash memory, single-bit erasure coding and RAID-based concepts are insufficient. Thus, a fully run-time configurable, high-performance, dependable storage concept requiring only a minimal set of logic or software is presented. The solution is based on composite erasure coding and can be adjusted for altered mission duration or changing environmental conditions.
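
    As a toy illustration of erasure coding in general (the composite scheme mentioned above is more elaborate and is not reproduced here), the sketch below stripes data across a few assumed blocks, stores a single XOR parity block, and reconstructs any one lost block from the survivors.

```python
# Toy single-parity erasure code: any one lost data block can be rebuilt via XOR.
from functools import reduce

def xor_blocks(blocks):
    # Byte-wise XOR of equally sized blocks.
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

data_blocks = [b"CHUN", b"K-ON", b"E--X"]          # assumed fixed-size stripes
parity = xor_blocks(data_blocks)                    # stored alongside the data

# Simulate losing block 1, then recover it from the remaining blocks plus parity.
survivors = [data_blocks[0], data_blocks[2], parity]
recovered = xor_blocks(survivors)
assert recovered == data_blocks[1]
print(recovered)  # b'K-ON'
```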

  19. A gender- and sexual orientation-dependent spatial attentional effect of invisible images

    PubMed Central

    Jiang, Yi; Costello, Patricia; Fang, Fang; Huang, Miner; He, Sheng

    2006-01-01

    Human observers are constantly bombarded with a vast amount of information. Selective attention helps us to quickly process what is important while ignoring the irrelevant. In this study, we demonstrate that information that has not entered observers' consciousness, such as interocularly suppressed (invisible) erotic pictures, can direct the distribution of spatial attention. Furthermore, invisible erotic information can either attract or repel observers' spatial attention depending on their gender and sexual orientation. While unaware of the suppressed pictures, heterosexual males' attention was attracted to invisible female nudes, heterosexual females' attention was attracted to invisible male nudes, gay males behaved similarly to heterosexual females, and gay/bisexual females performed in-between heterosexual males and females. PMID:17075055

  20. Superflares and Giant Planets

    NASA Astrophysics Data System (ADS)

    Rubenstein, Eric P.

    2001-02-01

    Nine solar analogues, stars similar in size and composition to the Sun, are known to have produced enormous flares. These outbursts, which were from 100 to 10 million times the size of even the largest solar flares, have puzzled astronomers, because sunlike stars should in theory vary little in brightness. A likely explanation is that these stars have unseen planetary companions circling in close orbits. Giant planets with large magnetic fields would, over time, entangle the magnetic fields of the parent stars. Eventually, the stretched and twisted magnetic-field lines would break and reattach themselves in a less complicated arrangement. This process, called magnetic reconnection, neatly explains how vast amounts of energy can be released so suddenly from superflaring solar analogues.

  1. The impact of parent-child interaction on brain structures: cross-sectional and longitudinal analyses.

    PubMed

    Takeuchi, Hikaru; Taki, Yasuyuki; Hashizume, Hiroshi; Asano, Kohei; Asano, Michiko; Sassa, Yuko; Yokota, Susumu; Kotozaki, Yuka; Nouchi, Rui; Kawashima, Ryuta

    2015-02-04

    There is a vast amount of evidence from psychological studies that the amount of parent-child interaction affects the development of children's verbal skills and knowledge. However, despite the vast amount of literature, brain structural development associated with the amount of parent-child interaction has never been investigated. In the present human study, we used voxel-based morphometry to measure regional gray matter density (rGMD) and examined cross-sectional correlations between the amount of time spent with parents and rGMD among 127 boys and 135 girls. We also assessed correlations between the amount of time spent with parents and longitudinal changes that occurred a few years later among 106 boys and 102 girls. After correcting for confounding factors, we found negative effects of spending time with parents on rGMD in areas in the bilateral superior temporal gyrus (STG) via cross-sectional analyses as well as in the contingent areas of the right STG. We also confirmed positive effects of spending time with parents on the Verbal Comprehension score in cross-sectional and longitudinal analyses. rGMD in partly overlapping or contingent areas of the right STG was negatively correlated with age and the Verbal Comprehension score in cross-sectional analyses. Subsequent analyses revealed verbal parent-child interactions have similar effects on Verbal Comprehension scores and rGMD in the right STG in both cross-sectional and longitudinal analyses. These findings indicate that parent-child interactions affect the right STG, which may be associated with verbal skills. Copyright © 2015 the authors 0270-6474/15/352233-13$15.00/0.

  2. Perspectives of intellectual processing of large volumes of astronomical data using neural networks

    NASA Astrophysics Data System (ADS)

    Gorbunov, A. A.; Isaev, E. A.; Samodurov, V. A.

    2018-01-01

    In the process of astronomical observations, vast amounts of data are collected. The BSA (Big Scanning Antenna) of the LPI, used in the study of impulse phenomena, logs 87.5 GB of data daily (32 TB per year). These data have important implications for both short- and long-term monitoring of various classes of radio sources (including radio transients of different nature), monitoring of the Earth's ionosphere and the interplanetary and interstellar plasma, and the search for and monitoring of different classes of radio sources. Within the framework of these studies, 83096 individual pulse events were discovered (in the study interval of July 2012 - October 2013), which may correspond to pulsars, scintillating sources, and fast radio transients. The detected impulse events are intended to be used to filter subsequent observations. The study proposes an approach based on a multilayer artificial neural network, which processes the raw input data; after processing by the hidden layer, the output layer produces a class for each impulsive event.
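
    The classification step described above can be pictured with a small multilayer perceptron. The sketch below trains scikit-learn's MLPClassifier on synthetic pulse feature vectors; the feature count, class labels, and network size are assumptions for illustration and do not reflect the BSA pipeline's actual inputs.

```python
# Minimal multilayer-perceptron sketch for classifying pulse events (synthetic data).
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
n_events, n_features = 400, 16            # assumed: 16 summary features per pulse event
X = rng.normal(size=(n_events, n_features))
y = rng.integers(0, 3, size=n_events)     # assumed classes: 0=pulsar, 1=scintillation, 2=transient

clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X, y)

# Predicted class for a new event's feature vector.
print(clf.predict(rng.normal(size=(1, n_features))))
```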

  3. Is Industry Managing Its Wastes Properly?

    ERIC Educational Resources Information Center

    Environmental Science and Technology, 1975

    1975-01-01

    Industry is faced with handling, disposing and recovering vast amounts of waste, much of it as a result of present pollution control technology. Industry has found the technology available, expensive and, without regulation, easy to ignore. Many industries are therefore improperly managing their wastes. (BT)

  4. Ubiquitous UAVs: a cloud based framework for storing, accessing and processing huge amount of video footage in an efficient way

    NASA Astrophysics Data System (ADS)

    Efstathiou, Nectarios; Skitsas, Michael; Psaroudakis, Chrysostomos; Koutras, Nikolaos

    2017-09-01

    Nowadays, video surveillance cameras are used for the protection and monitoring of a huge number of facilities worldwide. An important element in such surveillance systems is the use of aerial video streams originating from onboard sensors located on Unmanned Aerial Vehicles (UAVs). Video surveillance using UAVs produces a vast amount of video that must be transmitted, stored, analyzed and visualized in real time. As a result, the introduction and development of systems able to handle huge amounts of data become a necessity. In this paper, a new approach for the collection, transmission and storage of aerial videos and metadata is introduced. The objective of this work is twofold: first, the integration of the appropriate equipment in order to capture and transmit real-time video, including metadata (i.e. position coordinates, target), from the UAV to the ground and, second, the utilization of the ADITESS Versatile Media Content Management System (VMCMS-GE) for storing the video stream and the appropriate metadata. Beyond storage, VMCMS-GE provides other efficient management capabilities such as searching and processing of videos, along with video transcoding. For the evaluation and demonstration of the proposed framework, we execute a use case in which the surveillance of critical infrastructure and the detection of suspicious activities are performed. Transcoding of the collected video is a subject of this evaluation as well.

  5. Technology in the high entropy world.

    PubMed

    Tambo, N

    2006-01-01

    Modern growing society is mainly driven by oil and may be designated a "petroleum civilisation". However, the basic energy used to drive the global ecosystem is solar radiation. The amount of fossil energy consumption is minimal in the whole global energy balance. Economic growth is mainly controlled by the fossil (commercial) energy consumption rate in urban areas. Water and sanitation systems bridge economic activities and global ecosystems. Therefore, vast amounts of high entropy solar energy should always be taken into account in the water industry. Only in urban/industrial areas, where most of the GDP is earned, are commercial-energy-driven systems inevitably introduced, with maximum effort for energy saving. A water district concept to ensure appropriate quality use with the least deterioration of the environment is proposed. In other areas, decentralised water and sanitation systems driven on soft energy paths would be recommended. A process and system designed on a high entropy energy system would be the foundation for a future urban metabolic system revolution for when oil-based energy becomes scarce.

  6. An adaptive process-based cloud infrastructure for space situational awareness applications

    NASA Astrophysics Data System (ADS)

    Liu, Bingwei; Chen, Yu; Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik; Rubin, Bruce

    2014-06-01

    Space situational awareness (SSA) and defense space control capabilities are top priorities for groups that own or operate man-made spacecraft. Also, with the growing amount of space debris, there is an increased demand for contextual understanding that necessitates the capability of collecting and processing a vast amount of sensor data. Cloud computing, which features scalable and flexible storage and computing services, has been recognized as an ideal candidate that can meet the large-data contextual challenges posed by SSA. Cloud computing consists of physical service providers and middleware virtual machines together with infrastructure, platform, and software as a service (IaaS, PaaS, SaaS) models. However, the typical Virtual Machine (VM) abstraction is on a per-operating-system basis, which is too low-level and limits the flexibility of a mission application architecture. In response to this technical challenge, a novel adaptive process-based cloud infrastructure for SSA applications is proposed in this paper. In addition, the details of the design rationale and a prototype are examined. The SSA Cloud (SSAC) conceptual capability will potentially support space situation monitoring and tracking, object identification, and threat assessment. Lastly, the benefits of more granular and flexible cloud computing resource allocation are illustrated for data processing and implementation considerations within a representative SSA system environment. We show that container-based virtualization performs better than hypervisor-based virtualization technology in an SSA scenario.

  7. Association mapping across numerous traits reveals patterns of functional variation in maize

    USDA-ARS?s Scientific Manuscript database

    Phenotypic variation in natural populations results from a combination of genetic effects, environmental effects, and gene-by-environment interactions. Despite the vast amount of genomic data becoming available, many pressing questions remain about the nature of genetic mutations that underlie funct...

  8. Data Destruction

    ERIC Educational Resources Information Center

    Bergren, Martha Dewey

    2005-01-01

    School nurses are caretakers of a vast amount of sensitive student and family health information. In schools, older computer hardware that previously stored education records is recycled for less demanding student and employee functions. Sensitive data must be adequately erased before electronic storage devices are reassigned or are discarded.…

  9. Culturing Conceptions: From First Principles

    ERIC Educational Resources Information Center

    Roth, Wolff-Michael; Lee, Yew Jin; Hwang, SungWon

    2008-01-01

    Over the past three decades, science educators have accumulated a vast amount of information on conceptions--variously defined as beliefs, ontologies, cognitive structures, mental models, or frameworks--that generally (at least initially) have been derived from interviews about certain topics. During the same time period, cultural studies has…

  10. Building a SEM Analytics Reporting Portfolio

    ERIC Educational Resources Information Center

    Goff, Jay W.; Williams, Brian G.; Kilgore, Wendy

    2016-01-01

    Effective strategic enrollment management (SEM) efforts require vast amounts of internal and external data to ensure that meaningful reporting and analysis systems can assist managers in decision making. A wide range of information is integral for leading effective and efficient student recruitment and retention programs. This article is designed…

  11. 15 CFR 1180.1 - Purpose and scope.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TECHNICAL INFORMATION SERVICE, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL.... (a) The purpose of this regulation is to facilitate public access to the vast amount of scientific... regulation provides a variety of methods for federal agencies to adopt to ensure the timely transfer to the...

  12. 15 CFR 1180.1 - Purpose and scope.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) TECHNOLOGY ADMINISTRATION, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL AND... purpose of this regulation is to facilitate public access to the vast amount of scientific, technical and... variety of methods for federal agencies to adopt to ensure the timely transfer to the National Technical...

  13. 15 CFR 1180.1 - Purpose and scope.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...) TECHNOLOGY ADMINISTRATION, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL AND... purpose of this regulation is to facilitate public access to the vast amount of scientific, technical and... variety of methods for federal agencies to adopt to ensure the timely transfer to the National Technical...

  14. 15 CFR 1180.1 - Purpose and scope.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ...) TECHNOLOGY ADMINISTRATION, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL AND... purpose of this regulation is to facilitate public access to the vast amount of scientific, technical and... variety of methods for federal agencies to adopt to ensure the timely transfer to the National Technical...

  15. 15 CFR 1180.1 - Purpose and scope.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) TECHNOLOGY ADMINISTRATION, DEPARTMENT OF COMMERCE TRANSFER BY FEDERAL AGENCIES OF SCIENTIFIC, TECHNICAL AND... purpose of this regulation is to facilitate public access to the vast amount of scientific, technical and... variety of methods for federal agencies to adopt to ensure the timely transfer to the National Technical...

  16. Ratio, Proportion and Scaling. Mathematics Resource Project.

    ERIC Educational Resources Information Center

    Hoffer, Shirley Ann, Ed.

    The Mathematics Resource Project has as its goal the production of topical resources for teachers, drawn from the vast amounts of available material. This experimental edition on Ratio, Proportion, and Scaling, contains a teaching emphasis section, a classroom materials section, and teacher commentaries. The teaching emphasis section stresses…

  17. Human Resources and the Internet.

    ERIC Educational Resources Information Center

    Cohen, Suzanne; Joseph, Deborah

    Concerned about falling behind the technology curve, organizations are using the Internet or intranets to provide and communicate information to their employees and create more efficient workplaces. The Internet is not just a "network of computer networks," but a medium conveying a vast, diverse amount of information. This publication is…

  18. The National Information Infrastructure: Agenda for Action.

    ERIC Educational Resources Information Center

    Department of Commerce, Washington, DC. Information Infrastructure Task Force.

    The National Information Infrastructure (NII) is planned as a web of communications networks, computers, databases, and consumer electronics that will put vast amounts of information at the users' fingertips. Private sector firms are beginning to develop this infrastructure, but essential roles remain for the Federal Government. The National…

  19. Equity Sensitivity in Illinois Public School Teachers

    ERIC Educational Resources Information Center

    Grossi, Robert G.

    2013-01-01

    Research supports the importance of teacher quality on effective student learning. School districts recognize this fact and focus extensively on hiring quality teachers and improving teaching skills through professional development programs. Amazingly, despite common sense and a vast amount of research that reflects that employee performance is a…

  20. Representing the Past by Solid Modeling + Golden Ratio Analysis

    ERIC Educational Resources Information Center

    Ding, Suining

    2008-01-01

    This paper describes the procedures for reconstructing ancient architecture using solid modeling with geometric analysis, especially Golden Ratio analysis. In the past, the recovery and reconstruction of ruins required bringing together fragments of evidence and vast amounts of measurements from archaeological sites. Although researchers and…

  1. Studies of the phytoplankton and soil algae of two strip-mine impoundments in Tuscarawas County, Ohio

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richards, J.N.

    1973-01-01

    The process of strip-mining leaves vast areas that are denuded of vegetation and are open to primary succession by organisms such as algae. Acid strip-mine impoundments are formed by either man-made or natural processes. These impoundments are remnants of old strip-mine pits that have been filled with runoff water. The water chemistry of these ponds reflects the chemistry of the earth strata above the coal seam that was mined. These young impoundments or ponds are extremely low in pH and quite acidic due to the presence of great amounts of sulfuric acid. Algae that are found in these types of habitats exhibit a tolerance to acid conditions and are considered to be acidophilic. Few species of algae are known to be common components of these habitats.

  2. VEP contrast sensitivity responses reveal reduced functional segregation of mid and high filters of visual channels in autism.

    PubMed

    Jemel, Boutheina; Mimeault, Daniel; Saint-Amour, Dave; Hosein, Anthony; Mottron, Laurent

    2010-06-01

    Despite the vast amount of behavioral data showing a pronounced tendency in individuals with autism spectrum disorder (ASD) to process fine visual details, much less is known about the neurophysiological characteristics of spatial vision in ASD. Here, we address this issue by assessing the contrast sensitivity response properties of the early visual-evoked potentials (VEPs) to sine-wave gratings of low, medium and high spatial frequencies in adults with ASD and in an age- and IQ-matched control group. Our results show that while VEP contrast responses to low and high spatial frequency gratings did not differ between ASD and controls, early VEPs to mid spatial frequency gratings exhibited similar response characteristics as those to high spatial frequency gratings in ASD. Our findings show evidence for an altered functional segregation of early visual channels, especially those responsible for processing mid- and high-frequency spatial scales.

  3. A Framework of Hyperspectral Image Compression using Neural Networks

    DOE PAGES

    Masalmah, Yahya M.; Martínez Nieves, Christian; Rivera Soto, Rafael; ...

    2015-01-01

    Hyperspectral image analysis has gained great attention due to its wide range of applications. Hyperspectral images provide a vast amount of information about underlying objects in an image by using a large range of the electromagnetic spectrum for each pixel. However, since the same image is taken multiple times using distinct electromagnetic bands, the size of such images tends to be significant, which leads to greater processing requirements. The aim of this paper is to present a proposed framework for image compression and to study the possible effects of spatial compression on the quality of unmixing results. Image compression allows us to reduce the dimensionality of an image while still preserving most of the original information, which could lead to faster image processing. Lastly, this paper presents preliminary results of different training techniques used in an Artificial Neural Network (ANN) based compression algorithm.
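
    One common way to realize ANN-based compression is an autoencoder: a network trained to reproduce its input through a narrow hidden layer, whose activations then serve as the compressed representation. The sketch below is a minimal, hedged stand-in using scikit-learn on synthetic spectra; the actual architecture and training techniques of the framework above are not specified here.

```python
# Toy autoencoder-style compression of pixel spectra (synthetic data, assumed sizes).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.random((500, 64))                 # 500 pixel spectra with 64 bands (assumed)

# Train the network to reconstruct its own input through a 12-unit bottleneck.
ae = MLPRegressor(hidden_layer_sizes=(12,), activation="relu", max_iter=2000, random_state=0)
ae.fit(X, X)

# Compressed representation = bottleneck activations (64 -> 12 values per pixel).
codes = np.maximum(0.0, X @ ae.coefs_[0] + ae.intercepts_[0])

# Reconstruction from the codes, for a rough quality check.
X_hat = codes @ ae.coefs_[1] + ae.intercepts_[1]
print(codes.shape, float(np.mean((X - X_hat) ** 2)))
```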

  4. Integrating Scientific Array Processing into Standard SQL

    NASA Astrophysics Data System (ADS)

    Misev, Dimitar; Bachhuber, Johannes; Baumann, Peter

    2014-05-01

    We live in a time that is dominated by data. Data storage is cheap and more applications than ever accrue vast amounts of data. Storing the emerging multidimensional data sets efficiently, however, and allowing them to be queried by their inherent structure, is a challenge many databases have to face today. Despite the fact that multidimensional array data is almost always linked to additional, non-array information, array databases have mostly developed separately from relational systems, resulting in a disparity between the two database categories. The current SQL standard and SQL DBMSs support arrays - and in an extension also multidimensional arrays - but do so in a very rudimentary and inefficient way. This poster demonstrates the practicality of an SQL extension for array processing, implemented in a proof-of-concept multi-faceted system that manages a federation of array and relational database systems, providing transparent, efficient and scalable access to the heterogeneous data in them.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kyle, Jennifer E.; Zhang, Xing; Weitz, Karl K.

    Understanding how biological molecules are generated, metabolized and eliminated in living systems is important for interpreting processes such as immune response and disease pathology. While genomic and proteomic studies have provided vast amounts of information over the last several decades, interest in lipidomics has also grown due to improved analytical technologies revealing altered lipid metabolism in type 2 diabetes, cancer, and lipid storage disease. Liquid chromatography and mass spectrometry (LC-MS) measurements are currently the dominant approach for characterizing the lipidome by providing detailed information on the spatial and temporal composition of lipids. However, interpreting lipids' biological roles is challenging due to the existence of numerous structural and stereoisomers (i.e. distinct acyl chain and double-bond positions), which are unresolvable using present LC-MS approaches. Here we show that combining structurally-based ion mobility spectrometry (IMS) with LC-MS measurements distinguishes lipid isomers and allows insight into biological and disease processes.

  6. U.S. Air Forces Aerial Spray Mission: Should the Department of Defense Continue to Operate this Weapon of Mass Dispersion

    DTIC Science & Technology

    2015-12-01

    ...pesticide application over farm fields to produce a better crop. On 3 August 1921, in a joint effort between the U.S. Army Signal Corps in Dayton, Ohio... pesticide dissemination because of the relatively small amount of product needed to spray for nuisance insects over a vast area. The ULV system is... pesticide per minute. Applications that require massive amounts of liquid herbicide to neutralize cheatgrass and other fire-prone, invasive vegetation on…

  7. Pathogenesis of Recalcitrant Chronic Rhinosinusitis: The Emerging Role of Innate Immune Cells.

    PubMed

    Kong, Il Gyu; Kim, Dae Woo

    2018-04-01

    Chronic rhinosinusitis (CRS) is a major part of the recalcitrant inflammatory diseases of the upper airway and imposes an enormous socioeconomic burden. T helper (Th) 2 type immune responses recruiting eosinophils have been the best-known immune players in CRS pathogenesis, especially in western countries. As a vast amount of research has recently accumulated to elucidate the pathogenic mechanisms of CRS, heterogeneous inflammatory processes have been found to be related to the phenotypes of CRS. More recently, cells other than T cells have come into focus in CRS pathogenesis, such as epithelial cells, macrophages, innate lymphoid cells, and neutrophils. Here, we review the recent research focusing on the innate immune cells related to CRS pathogenesis.

  8. Listeria Genomics

    NASA Astrophysics Data System (ADS)

    Cabanes, Didier; Sousa, Sandra; Cossart, Pascale

    The opportunistic intracellular foodborne pathogen Listeria monocytogenes has become a paradigm for the study of host-pathogen interactions and bacterial adaptation to mammalian hosts. Analysis of L. monocytogenes infection has provided considerable insight into how bacteria invade cells, move intracellularly, and disseminate in tissues, as well as tools to address fundamental processes in cell biology. Moreover, the vast amount of knowledge that has been gathered through in-depth comparative genomic analyses and in vivo studies makes L. monocytogenes one of the most well-studied bacterial pathogens. This chapter provides an overview of progress in the exploration of genomic, transcriptomic, and proteomic data in Listeria spp. to understand genome evolution and diversity, as well as physiological aspects of metabolism used by bacteria when growing in diverse environments, in particular in infected hosts.

  9. Combined semantic and similarity search in medical image databases

    NASA Astrophysics Data System (ADS)

    Seifert, Sascha; Thoma, Marisa; Stegmaier, Florian; Hammon, Matthias; Kramer, Martin; Huber, Martin; Kriegel, Hans-Peter; Cavallaro, Alexander; Comaniciu, Dorin

    2011-03-01

    The current diagnostic process at hospitals is mainly based on reviewing and comparing images coming from multiple time points and modalities in order to monitor disease progression over a period of time. For ambiguous cases, however, the radiologist relies heavily on reference literature or a second opinion. Although there are vast amounts of acquired images stored in PACS systems that could be reused for decision support, these data sets suffer from weak search capabilities. Thus, we present a search methodology that enables the physician to carry out intelligent search scenarios on medical image databases, combining ontology-based semantic search and appearance-based similarity search. It enabled the elimination of 12% of the top-ten hits that would arise without taking the semantic context into account.
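
    The combination described above can be sketched as a two-stage query: first restrict candidates by a semantic annotation (e.g. an anatomical concept), then rank the survivors by appearance similarity of feature vectors. The example below is a hedged illustration with made-up annotations and features, not the system's actual ontology or descriptors.

```python
# Sketch: ontology-filtered candidates ranked by cosine similarity of appearance features.
import numpy as np

# Assumed image index: id -> (semantic concept, appearance feature vector).
index = {
    "img1": ("liver", np.array([0.9, 0.1, 0.3])),
    "img2": ("liver", np.array([0.2, 0.8, 0.5])),
    "img3": ("lung",  np.array([0.9, 0.1, 0.3])),
}

def search(concept, query_vec, top_k=2):
    # Stage 1: semantic filter keeps only images annotated with the queried concept.
    candidates = {i: v for i, (c, v) in index.items() if c == concept}
    # Stage 2: rank remaining images by cosine similarity to the query appearance vector.
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    ranked = sorted(candidates.items(), key=lambda kv: cosine(query_vec, kv[1]), reverse=True)
    return [i for i, _ in ranked[:top_k]]

print(search("liver", np.array([1.0, 0.0, 0.2])))  # ['img1', 'img2']
```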

  10. Subliminal and Supraliminal Processing of Facial Expression of Emotions: Brain Oscillation in the Left/Right Frontal Area

    PubMed Central

    Balconi, Michela; Ferrari, Chiara

    2012-01-01

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, although it remains questionable whether it is possible to assign a specific function to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored within the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at six facial expressions of emotions (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate the alpha power. Moreover, it was found that there was an increased right frontal activity for negative emotions vs. an increased left response for positive emotion. The significance of facial expressions was adduced to elucidate the differing cortical responses to emotional types. PMID:24962767

  11. Volume and Value of Big Healthcare Data.

    PubMed

    Dinov, Ivo D

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase in computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions such as: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions.

  12. Subliminal and supraliminal processing of facial expression of emotions: brain oscillation in the left/right frontal area.

    PubMed

    Balconi, Michela; Ferrari, Chiara

    2012-03-26

    The unconscious effects of an emotional stimulus have been highlighted by a vast amount of research, although it remains questionable whether it is possible to assign a specific function to cortical brain oscillations in the unconscious perception of facial expressions of emotions. Alpha band variation was monitored within the right and left cortical sides while subjects consciously (supraliminal stimulation) or unconsciously (subliminal stimulation) processed facial patterns. Twenty subjects looked at six facial expressions of emotions (anger, fear, surprise, disgust, happiness, sadness, and neutral) under two different conditions: supraliminal (200 ms) vs. subliminal (30 ms) stimulation (140 target-mask pairs for each condition). The results showed that conscious/unconscious processing and the significance of the stimulus can modulate the alpha power. Moreover, it was found that there was an increased right frontal activity for negative emotions vs. an increased left response for positive emotion. The significance of facial expressions was adduced to elucidate the differing cortical responses to emotional types.

  13. Volume and Value of Big Healthcare Data

    PubMed Central

    Dinov, Ivo D.

    2016-01-01

    Modern scientific inquiries require significant data-driven evidence and trans-disciplinary expertise to extract valuable information and gain actionable knowledge about natural processes. Effective evidence-based decisions require collection, processing and interpretation of vast amounts of complex data. The Moore's and Kryder's laws of exponential increase in computational power and information storage, respectively, dictate the need for rapid trans-disciplinary advances, technological innovation and effective mechanisms for managing and interrogating Big Healthcare Data. In this article, we review important aspects of Big Data analytics and discuss important questions such as: What are the challenges and opportunities associated with this biomedical, social, and healthcare data avalanche? Are there innovative statistical computing strategies to represent, model, analyze and interpret Big heterogeneous data? We present the foundation of a new compressive big data analytics (CBDA) framework for representation, modeling and inference of large, complex and heterogeneous datasets. Finally, we consider specific directions likely to impact the process of extracting information from Big healthcare data, translating that information to knowledge, and deriving appropriate actions. PMID:26998309

  14. Anima: Modular Workflow System for Comprehensive Image Data Analysis

    PubMed Central

    Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa

    2014-01-01

    Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and pre-processing to segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima. PMID:25126541

  15. Wind erosion of cropland in the northwestern Tarim Basin

    USDA-ARS?s Scientific Manuscript database

    The Aksu region within the Tarim Basin is a major source of windblown dust due to aridity and vast areas under intensive irrigated crop production. Despite the importance of crop production to the local economy and sustenance, little is known about the amount of soil eroded by wind from agricultural...

  16. Digital Geodata Traces--New Challenges for Geographic Education

    ERIC Educational Resources Information Center

    Hohnle, Steffen; Michel, Boris; Glasze, Georg; Uphues, Rainer

    2013-01-01

    Young people in modern societies consciously (e.g. Facebook) or unconsciously (e.g. some Google services) produce a vast amount of geodata. Using relational databases, private companies are capable of creating very precise profiles of the individual user and his/her spatial practices from this data. This almost inevitably prompts questions…

  17. Medusahead invasion along unimproved roads, animal trails, and random transects

    USDA-ARS?s Scientific Manuscript database

    Medusahead, an exotic annual grass, is rapidly spreading and causing ecological damage across the western United States. It is critical that land managers prioritize where they direct treatment and monitoring efforts due to the vast areas this exotic plant occupies and the limited amount of resourc...

  18. University Endowment Reform: A Dialogue

    ERIC Educational Resources Information Center

    Miller, Charles; Munson, Lynne

    2008-01-01

    In late September 2007, the issue of wealthy university endowments became front page news. Members of the Senate Finance Committee, most notably Sen. Charles Grassley (R-IA), questioned why some endowments were amassing vast amounts of tax-subsidized wealth while simultaneously raising tuition on average families to greater and greater levels. The…

  19. Detection and quantification of fugitive emissions from Colorado oil and gas production operations using remote monitoring

    EPA Science Inventory

    Western states contain vast amounts of oil and gas production. For example, Weld County Colorado contains approximately 25,000 active oil and gas well sites with associated production operations. There is little information on the air pollutant emission potential from this source...

  20. Geometry + Technology = Proof

    ERIC Educational Resources Information Center

    Lyublinskaya, Irina; Funsch, Dan

    2012-01-01

    Several interactive geometry software packages are available today to secondary school teachers. An example is The Geometer's Sketchpad[R] (GSP), also known as Dynamic Geometry[R] software, developed by Key Curriculum Press. This numeric based technology has been widely adopted in the last twenty years, and a vast amount of creativity has been…

  1. Understanding Education Involving Geovisual Analytics

    ERIC Educational Resources Information Center

    Stenliden, Linnea

    2013-01-01

    Handling the vast amounts of data and information available in contemporary society is a challenge. Geovisual Analytics provides technology designed to increase the effectiveness of information interpretation and analytical task solving. To date, little attention has been paid to the role such tools can play in education and to the extent to which…

  2. How to study effectively

    PubMed Central

    Fowler, Alexander; Whitehurst, Katharine; Al Omran, Yasser; Rajmohan, Shivanchan; Udeaja, Yagazie; Koshy, Kiron

    2017-01-01

    The ability to study effectively is an essential part of completing a medical degree. To cope with the vast amount of information and skills that need to be acquired, it is necessary to develop effective study techniques. In this article we outline the various methods students can use to excel in upcoming examinations. PMID:29177223

  3. Predictors of Academic Achievement and Their Possible Applications

    ERIC Educational Resources Information Center

    Lockshin, Jeffrey; Zamkov, Oleg

    2009-01-01

    A significant amount of attention has been given to the predictors of academic achievement in higher education. However, the vast majority of articles have centred on entrance criteria and the learning approaches or personal habits of students. Investigations into how achievement depends on student efforts, being almost invariably based on…

  4. A Framework for Teaching Social and Environmental Sustainability to Undergraduate Business Majors

    ERIC Educational Resources Information Center

    Brumagim, Alan L.; Cann, Cynthia W.

    2012-01-01

    The authors outline an undergraduate exercise to help students more fully understand the environmental and social justice aspects of business sustainability activities. A simple hierarchical framework, based on Maslow's (1943) work, was utilized to help the students understand, analyze, and judge the vast amount of corporate sustainability…

  5. Attitudes toward Elementary School Student Retention.

    ERIC Educational Resources Information Center

    Faerber, Kay; Van Dusseldorp, Ralph

    Nonpromotion of elementary school students is a highly controversial and emotional issue, and a vast amount of literature has been devoted to the topic. With the current emphasis on raising academic standards in public schools, more and more educators are viewing "social promotion" with disfavor. This study was conducted to determine current…

  6. Children's Attitudes toward Older Adults and Aging: A Synthesis of Research

    ERIC Educational Resources Information Center

    Gilbert, Cara N.; Ricketts, Kristina G.

    2008-01-01

    This paper serves as a summation of literature on children's attitudes toward older adults and aging. Research indicates that the vast amount of information available provides varying levels of understanding toward children's actual views of older adults. Differences between measurements, settings, and procedures stand as barriers in…

  7. Quantitative Literacy for Undergraduate Business Students in the 21st Century

    ERIC Educational Resources Information Center

    McClure, Richard; Sircar, Sumit

    2008-01-01

    The current business environment is awash in vast amounts of data that ongoing transactions continually generate. Leading-edge corporations are using business analytics to achieve competitive advantage. However, educators are not adequately preparing business school students in quantitative methods to meet this challenge. For more than half a…

  8. Beyond a Binary: The Lives of Gender-Nonconforming Youth

    ERIC Educational Resources Information Center

    Rankin, Sue; Beemyn, Genny

    2012-01-01

    The vast majority of college students, classroom faculty, student affairs educators, and administrators have a tremendous amount to learn about gender diversity. For this majority and for gender-nonconforming students and educators, opportunities are all but untapped to leverage this diversity to enhance learning, as well as to support and…

  9. Long-Term Trends in Ecological Systems: A Basis for Understanding Responses to Global Change

    USDA-ARS?s Scientific Manuscript database

    The Eco Trends Editorial Committee sorted through vast amounts of historical and ongoing data from 50 ecological sites in the continental United States including Alaska, several islands, and Antarctica to present in a logical format the variables commonly collected. This report presents a subset of...

  10. Developing a Dynamic Inference Expert System to Support Individual Learning at Work

    ERIC Educational Resources Information Center

    Hung, Yu Hsin; Lin, Chun Fu; Chang, Ray I.

    2015-01-01

    In response to the rapid growth of information in recent decades, knowledge-based systems have become an essential tool for organizational learning. The application of electronic performance-support systems in learning activities has attracted considerable attention from researchers. Nevertheless, the vast, ever-increasing amount of information is…

  11. A Generic Archive Protocol and an Implementation

    NASA Astrophysics Data System (ADS)

    Jordan, J. M.; Jennings, D. G.; McGlynn, T. A.; Ruggiero, N. G.; Serlemitsos, T. A.

    1993-01-01

    Archiving vast amounts of data has become a major part of every scientific space mission today. GRASP, the Generic Retrieval/Archive Services Protocol, addresses the question of how to archive the data collected in an environment where the underlying hardware archives and computer hosts may be rapidly changing.

  12. Major soybean maturity gene haplotypes revealed by SNPViz analysis of 72 sequenced soybean genomes

    USDA-ARS?s Scientific Manuscript database

    In this Genomics Era, vast amounts of next generation sequencing data have become publicly-available for multiple genomes across hundreds of species. Analysis of these large-scale datasets can become cumbersome, especially when comparing nucleotide polymorphisms across many samples within a dataset...

  13. The Artful Dodgers: Directors of Ethnic Studies Programs

    ERIC Educational Resources Information Center

    Fenchak, Paul

    1974-01-01

    Of the vast amount of ethnic studies promoted by state education departments, one conclusion can be easily made: East European ethnic studies are conspicuously absent. Suggestions are given to increase the awareness of Eastern European culture and heritage and promote their inclusion in formal ethnic studies curriculum. (Author/DE)

  14. Developing Citizenship through Honors

    ERIC Educational Resources Information Center

    Hester, Jacob Andrew; Besing, Kari Lynn

    2017-01-01

    For decades, research has shown that higher levels of education correspond to increased interest in politics and civic engagement. Despite the vast amount of scholarly attention, why this link exists is still disputed. One theory about the connection is the civic education hypothesis, which claims that the causal link between education and civic…

  15. Tracking Actual Usage: The Attention Metadata Approach

    ERIC Educational Resources Information Center

    Wolpers, Martin; Najjar, Jehad; Verbert, Katrien; Duval, Erik

    2007-01-01

    The information overload in learning and teaching scenarios is a main hindering factor for efficient and effective learning. New methods are needed to help teachers and students in dealing with the vast amount of available information and learning material. Our approach aims to utilize contextualized attention metadata to capture behavioural…

  16. Analysis of spatio-temporal land cover changes for hydrological impact assessment within the Nyando River Basin of Kenya.

    PubMed

    Olang, Luke Omondi; Kundu, Peter; Bauer, Thomas; Fürst, Josef

    2011-08-01

    The spatio-temporal changes in the land cover states of the Nyando Basin were investigated for auxiliary hydrological impact assessment. The predominant land cover types whose conversions could influence the hydrological response of the region were selected. Six Landsat images for 1973, 1986, and 2000 were processed to discern the changes based on a methodology that employs a hybrid of supervised and unsupervised classification schemes. The accuracy of the classifications was assessed using reference datasets processed in a GIS with the help of ground-based information obtained through participatory mapping techniques. To assess the possible hydrological effect of the detected changes during storm events, a physically based lumped approach for infiltration loss estimation was employed within five selected sub-basins. The results obtained indicated that forests in the basin declined by 20% while agricultural fields expanded by 16% during the entire period of study. Apparent from the land cover conversion matrices was that the majority of the forest decline was a consequence of agricultural expansion. The model results revealed that infiltration amounts decreased by between 6% and 15%. The headwater regions with the vast deforestation were noted to be more vulnerable to the land cover change effects. Despite the haphazard land use patterns and uncertainties related to poor data quality for environmental monitoring and assessment, the study exposed the vast degradation and hence the need for sustainable land use planning for enhanced catchment management purposes.
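
    For readers unfamiliar with lumped infiltration-loss estimation, the sketch below solves the Green-Ampt cumulative-infiltration relation for a single storm by fixed-point iteration. Green-Ampt is only one common physically based formulation, chosen here for illustration; the parameter values are invented and the study's actual model and parameters are not given in this abstract.

```python
# Green-Ampt cumulative infiltration F(t) under ponded conditions (illustrative parameters).
import math

def green_ampt_F(t_hr, K=1.0, psi=11.0, d_theta=0.3, tol=1e-6):
    """Solve F = K*t + psi*d_theta*ln(1 + F/(psi*d_theta)) by fixed-point iteration.
    K: saturated hydraulic conductivity (cm/h), psi: wetting-front suction head (cm),
    d_theta: soil moisture deficit (-). All values here are assumed for the example."""
    s = psi * d_theta
    F = K * t_hr  # initial guess
    for _ in range(100):
        F_new = K * t_hr + s * math.log(1.0 + F / s)
        if abs(F_new - F) < tol:
            break
        F = F_new
    return F

# Cumulative infiltration (cm) after a 2-hour storm; a lower K (e.g. following a land cover
# change that degrades the soil surface) yields a smaller F and therefore more storm runoff.
print(round(green_ampt_F(2.0), 3))
```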

  17. A systems biology approach to predict and characterize human gut microbial metabolites in colorectal cancer.

    PubMed

    Wang, QuanQiu; Li, Li; Xu, Rong

    2018-04-18

    Colorectal cancer (CRC) is the second leading cause of cancer-related deaths. It is estimated that about half the cases of CRC occurring today are preventable. Recent studies showed that human gut microbiota and their collective metabolic outputs play important roles in CRC. However, the mechanisms by which human gut microbial metabolites interact with host genetics in contributing to CRC remain largely unknown. We hypothesize that computational approaches that integrate and analyze vast amounts of publicly available biomedical data have great potential in better understanding how human gut microbial metabolites are mechanistically involved in CRC. Leveraging vast amounts of publicly available data, we developed a computational algorithm to predict human gut microbial metabolites for CRC. We validated the prediction algorithm by showing that previously known CRC-associated gut microbial metabolites ranked highly (mean ranking: top 10.52%; median ranking: 6.29%; p-value: 3.85E-16). Moreover, we identified new gut microbial metabolites likely associated with CRC. Through computational analysis, we propose potential roles for tartaric acid, the top-ranked metabolite, in CRC etiology. In summary, our data-driven, computation-based study generated a large number of associations that could serve as a starting point for further experiments to refute or validate these microbial metabolite associations in CRC.

  18. Study of Using Solar Thermal Power for the Margarine Melting Heat Process.

    PubMed

    Sharaf Eldean, Mohamed A; Soliman, A M

    2015-04-01

    The heating process for melting margarine requires a vast amount of thermal energy due to margarine's high melting point and the size of the reservoir it is contained in. Existing methods to heat margarine have a high hourly cost of production and use fossil fuels, which have been shown to have a negative impact on the environment. Thus, we perform an analytical feasibility study of using solar thermal power as an alternative energy source for the margarine melting process. In this study, the efficiency and cost effectiveness of a parabolic trough collector (PTC) solar field are compared with those of a steam boiler. Different working fluids through the solar field (water vapor and Therminol-VP1 heat transfer oil (HTO)) are also investigated. The results reveal that the total hourly cost ($/h) of the conventional configuration is much greater than that of the solar applications regardless of the type of working fluid. Moreover, the conventional configuration has a negative impact on the environment, increasing emissions of CO2, CO, and NO2 by 117.4 kg/day, 184 kg/day, and 74.7 kg/day, respectively. The optimized period of melt and tank volume, at temperature differences not exceeding 25 °C, are found to be 8-10 h and 100 m3, respectively. The solar PTC operated with water and steam as the working fluid is recommended as a viable alternative for the margarine melting heating process.

  19. The JCSG high-throughput structural biology pipeline.

    PubMed

    Elsliger, Marc André; Deacon, Ashley M; Godzik, Adam; Lesley, Scott A; Wooley, John; Wüthrich, Kurt; Wilson, Ian A

    2010-10-01

    The Joint Center for Structural Genomics high-throughput structural biology pipeline has delivered more than 1000 structures to the community over the past ten years. The JCSG has made a significant contribution to the overall goal of the NIH Protein Structure Initiative (PSI) of expanding structural coverage of the protein universe, as well as making substantial inroads into structural coverage of an entire organism. Targets are processed through an extensive combination of bioinformatics and biophysical analyses to efficiently characterize and optimize each target prior to selection for structure determination. The pipeline uses parallel processing methods at almost every step in the process and can adapt to a wide range of protein targets from bacterial to human. The construction, expansion and optimization of the JCSG gene-to-structure pipeline over the years have resulted in many technological and methodological advances and developments. The vast number of targets and the enormous amounts of associated data processed through the multiple stages of the experimental pipeline required the development of a variety of valuable resources that, wherever feasible, have been converted to free-access web-based tools and applications.

  20. Harnessing the Power of Education Research Databases with the Pearl-Harvesting Methodological Framework for Information Retrieval

    ERIC Educational Resources Information Center

    Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter

    2010-01-01

    Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge of finding pertinent information among the volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…

  1. Using Cluster Analysis for Data Mining in Educational Technology Research

    ERIC Educational Resources Information Center

    Antonenko, Pavlo D.; Toy, Serkan; Niederhauser, Dale S.

    2012-01-01

    Cluster analysis is a group of statistical methods that has great potential for analyzing the vast amounts of web server-log data to understand student learning from hyperlinked information resources. In this methodological paper we provide an introduction to cluster analysis for educational technology researchers and illustrate its use through…

  2. A Comparison of Robbers' Use of Physical Coercion in Commercial and Street Robberies

    ERIC Educational Resources Information Center

    McCluskey, John D.

    2013-01-01

    The face-to-face confrontation involved in the crime of robbery renders vast amounts of financial, physical, and psychological injury in the United States. This study developed hypotheses from existing literature regarding salient situational factors associated with the prevalence of overt physical coercion during commercial and street robberies.…

  3. WhoKnows? Evaluating Linked Data Heuristics with a Quiz that Cleans up DBpedia

    ERIC Educational Resources Information Center

    Waitelonis, Jorg; Ludwig, Nadine; Knuth, Magnus; Sack, Harald

    2011-01-01

    Purpose: Linking Open Data (LOD) provides a vast amount of well structured semantic information, but many inconsistencies may occur, especially if the data are generated with the help of automated methods. Data cleansing approaches enable detection of inconsistencies and overhauling of affected data sets, but they are difficult to apply…

  4. Foundation: Transforming data bases into knowledge bases

    NASA Technical Reports Server (NTRS)

    Purves, R. B.; Carnes, James R.; Cutts, Dannie E.

    1987-01-01

    One approach to transforming information stored in relational data bases into knowledge based representations and back again is described. This system, called Foundation, allows knowledge bases to take advantage of vast amounts of pre-existing data. A benefit of this approach is inspection, and even population, of data bases through an intelligent knowledge-based front-end.

  5. Using berry impact recording device for bruising assessment in southern highbush blueberry

    USDA-ARS?s Scientific Manuscript database

    Blueberries are prone to bruise damage, and bruising leads to a rapid increase in the amount of decay. Due to excessive bruising damage caused by machine harvesters, the vast majority of the fruit destined for the fresh market is currently hand-harvested in the United States. The industry needs m...

  6. Navigational Support in Lifelong Learning: Enhancing Effectiveness through Indirect Social Navigation

    ERIC Educational Resources Information Center

    Janssen, Jose; van den Berg, Bert; Tattersall, Colin; Hummel, Hans; Koper, Rob

    2007-01-01

    Efficient and effective lifelong learning requires that learners can make well informed choices from a vast amount of learning opportunities. This article proposes to support learners by drawing on principles of self-organization and indirect social navigation; by analysing choices made by learners who went before and feeding this information back…

  7. Effects of Long-Term Representations on Free Recall of Unrelated Words

    ERIC Educational Resources Information Center

    Katkov, Mikhail; Romani, Sandro; Tsodyks, Misha

    2015-01-01

    Human memory stores vast amounts of information. Yet recalling this information is often challenging when specific cues are lacking. Here we consider an associative model of retrieval where each recalled item triggers the recall of the next item based on the similarity between their long-term neuronal representations. The model predicts that…

  8. Encyclopedia of Smoking and Tobacco.

    ERIC Educational Resources Information Center

    Hirschfelder, Arlene B.

    This encyclopedia presents an extensive listing of current and historical information relating to tobacco. It aims to provide accurate, current, and balanced information to people of all viewpoints and on both sides of the smoking debate. The A-to-Z format makes a vast amount of current information easily accessible. Over 600 entries are compiled…

  9. The Internet, The Hidden Web, and Useful Web Resources: ERIC, ERIC/CASS, & The Virtual Library.

    ERIC Educational Resources Information Center

    Kirkman, Chris; Frady, Allen; Walz, Garry R.

    Counselors and educators face a constant struggle to keep abreast of the vast amounts of new information available, to assess this information, and to continue gathering even more information. Individuals' information-searching strategies often take considerable time and cause considerable frustration in getting the desired results. While increasing…

  10. Teaching the Survey Non-Traditional Style

    ERIC Educational Resources Information Center

    Eichhorn, Niels

    2013-01-01

    Teaching survey courses at the university level can be a difficult task. The vast majority of students have to take survey classes as part of their curriculum and, as a result, bring a fair amount of resentment and/or ambivalence with them. Furthermore, many students already arrive on campus with negative opinions about history classes. This…

  11. Filtering Data for Detecting Differential Development

    ERIC Educational Resources Information Center

    Brinkhuis, Matthieu J. S.; Bakker, Marjan; Maris, Gunter

    2015-01-01

    The amount of data available in the context of educational measurement has vastly increased in recent years. Such data are often incomplete, involve tests administered at different time points and during the course of many years, and can therefore be quite challenging to model. In addition, intermediate results like grades or report cards being…

  12. Taking Care of Business: The Repercussions of Commodified Electronic Literacy.

    ERIC Educational Resources Information Center

    Dickinson, Sandra C.

    The corporate takeover of the Internet has moved literacy into uncharted territory. Couched in capitalist metaphors of liberation, choice, utility, and desirability, an expanded communications network that provides quick and easy access to vast amounts of information appeals to the national psyche. The commodification of literacy as a result of…

  13. Pathfinding in the Research Forest: The Pearl Harvesting Method for Effective Information Retrieval

    ERIC Educational Resources Information Center

    Sandieson, Robert

    2006-01-01

    Knowledge of empirical research has become important for everyone involved in education and special education. Policy, practice, and informed reporting rely on locating and understanding unfiltered, original source material. Although access to vast amounts of research has been greatly facilitated by online databases, such as ERIC and PsychInfo,…

  14. Forest fires in the insular Caribbean

    Treesearch

    A.M.J. Robbins; C.M. Eckelmann; M. Quinones

    2008-01-01

    This paper presents a summary of the forest fire reports in the insular Caribbean derived from both management reports and an analysis of publicly available Moderate Resolution Imaging Spectroradiometer (MODIS) satellite active fire products from the region. A vast difference between the number of fires reported by land managers and fire points in the MODIS Fire...

  15. Exploring Barriers to the Categorization of Electronic Content in a Global Professional Services Firm

    ERIC Educational Resources Information Center

    Totterdale, Robert L.

    2009-01-01

    Businesses have always maintained records pertinent to the enterprise. With over 90% of new business records now estimated to be available in electronic form, organizations struggle to manage these vast amounts of electronic content while at the same time meeting collaboration, knowledge management, regulatory, and compliance needs. This case…

  16. AERIS : Assessment and Fusion of Commercial Vehicle Electronic Control Unit (ECU) Data for Real-Time Emission Modeling

    DOT National Transportation Integrated Search

    2012-06-01

    Heavy-duty trucks (HDTs) play a significant role in the freight transportation sector in the U.S. However, they consume a vast amount of fuel and are a significant source of both greenhouse gas and criteria pollutant emissions. In order to properly d...

  17. Longitudinal Study of First-Time Freshmen Using Data Mining

    ERIC Educational Resources Information Center

    Nandeshwar, Ashutosh R.

    2010-01-01

    In the modern world, higher education is transitioning from enrollment mode to recruitment mode. This shift paved the way for institutional research and policy making from a historical data perspective. More and more universities in the U.S. are implementing and using enterprise resource planning (ERP) systems, which collect vast amounts of data.…

  18. Toward New Data and Information Management Solutions for Data-Intensive Ecological Research

    ERIC Educational Resources Information Center

    Laney, Christine Marie

    2013-01-01

    Ecosystem health is deteriorating in many parts of the world due to direct and indirect anthropogenic pressures. Generating accurate, useful, and impactful models of past, current, and future states of ecosystem structure and function is a complex endeavor that often requires vast amounts of data from multiple sources and knowledge from…

  19. Blended Learning over Two Decades

    ERIC Educational Resources Information Center

    Zhonggen, Yu; Yuexiu, Zhejiang

    2015-01-01

    The 21st century has witnessed vast amounts of research into blended learning since the advent of online learning made blended learning possible in the early 1990s. The theme of this paper is blended learning in mainstream disciplinary communities. In particular, the paper reports on findings from the last two decades which looked…

  20. Brain Matters: Translating Research into Classroom Practice.

    ERIC Educational Resources Information Center

    Wolfe, Patricia

    Maintaining that educators need a functional understanding of the brain and how it operates in order to teach effectively and to critically analyze the vast amount of neuroscientific information being published, this book provides information on brain-imaging techniques and the anatomy and physiology of the brain. The book also introduces a model…

  1. Geomorphology of Titan's polar terrains: Using the landscape's topographic form to constrain surface processes

    NASA Astrophysics Data System (ADS)

    Birch, S. P.; Hayes, A. G., Jr.; Dietrich, W. E.; Howard, A. D.; Malaska, M. J.; Moore, J. M.; Mastrogiuseppe, M.; White, O. L.; Hofgartner, J. D.; Soderblom, J. M.; Barnes, J. W.; Bristow, C.; Kirk, R. L.; Turtle, E. P.; Wood, C. A.; Stofan, E. R.

    2015-12-01

    Driven by an expansive atmosphere, Titan's lakes, seas and accompanying hydrological cycle hold vast amounts of information regarding the history and evolution of Titan. To understand these features, we constructed a geomorphologic map of Titan's polar terrains using a combination of the Cassini SAR, ISS, VIMS, and topographic datasets. In combining SAR, ISS, and VIMS imagery with topographic data, our geomorphic map reveals a stratigraphic sequence from which we infer formation processes. In mapping both the South and North poles with the same morphologic units, we conclude that processes that dominated the North Pole also operated in the South. Large seas, which are currently methane/ethane filled in the North and dry in the South, characterize both poles. The current day dichotomy may result only from differing initial conditions. Regions removed from the mare are dominated by smooth, undulating plains, bounded by moderately dissected uplands that are discretized into observable drainage basins. These plains contain the highest density of filled and empty lake depressions, which appear morphologically distinct from the larger mare. The thicknesses of these undulating plains are retrieved from the depths of the embedded empty depressions, which are up to 800 m deep. The development of such large deposits and the surrounding hillslopes can be explained by the presence of previously vast polar oceans. Larger liquid bodies would have allowed for a sustained accumulation of soluble and insoluble sediments from Titan's lower latitudes. Two plausible evolutionary scenarios are seas that were slightly larger, followed by tectonic uplift, or oceans that were much larger, which have since lost most of their volume over time to methane photolysis. In either scenario, thick sedimentary deposits of soluble materials are required to have been emplaced prior to the formation of the small lake depressions.

  2. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  3. Smart algorithms and adaptive methods in computational fluid dynamics

    NASA Astrophysics Data System (ADS)

    Tinsley Oden, J.

    1989-05-01

    A review is presented of the use of smart algorithms which employ adaptive methods in processing large amounts of data in computational fluid dynamics (CFD). Smart algorithms use a rationally based set of criteria for automatic decision making in an attempt to produce optimal simulations of complex fluid dynamics problems. The information needed to make these decisions is not known beforehand and evolves in structure and form during the numerical solution of flow problems. Once the code makes a decision based on the available data, the structure of the data may change, and the criteria may be reapplied in order to direct the analysis toward an acceptable end. Intelligent decisions are made by processing vast amounts of data that evolve unpredictably during the calculation. The basic components of adaptive methods and their application to complex problems of fluid dynamics are reviewed: (1) data structures, i.e., what approaches are available for modifying the data structures of an approximation so as to reduce errors; (2) error estimation, i.e., what techniques exist for estimating the evolution of error in a CFD calculation; and (3) solvers, i.e., what algorithms are available that can function on changing meshes. Numerical examples which demonstrate the viability of these approaches are presented.
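
    The estimate–adapt–re-solve cycle summarized above can be sketched as a simple loop: compute a per-cell error indicator, split the cells whose indicator exceeds a tolerance, and repeat. The one-dimensional piecewise-linear example below is a minimal, assumed illustration of that cycle, not one of the CFD algorithms reviewed in the paper.

    ```python
    import math

    # Minimal adaptive-refinement sketch (assumed illustration): estimate a per-cell
    # error indicator, refine cells above a tolerance, and repeat until acceptable.
    def adapt(f, a, b, tol=1e-3, max_passes=20):
        nodes = [a, b]
        for _ in range(max_passes):
            new_nodes, refined = [], False
            for left, right in zip(nodes, nodes[1:]):
                new_nodes.append(left)
                mid = 0.5 * (left + right)
                # Indicator: deviation of f from its linear interpolant at the cell midpoint.
                if abs(f(mid) - 0.5 * (f(left) + f(right))) > tol:
                    new_nodes.append(mid)   # "data structure" update: split the cell
                    refined = True
            new_nodes.append(nodes[-1])
            nodes = new_nodes
            if not refined:                 # every cell meets the error criterion
                break
        return nodes

    mesh = adapt(lambda x: math.tanh(20 * (x - 0.5)), 0.0, 1.0)
    print(len(mesh), "nodes, clustered near the steep gradient at x = 0.5")
    ```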

  4. A method for interactive specification of multiple-block topologies

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.; Mccann, Karen M.

    1991-01-01

    A method is presented for dealing with the vast amount of topological and other data which must be specified to generate a multiple-block computational grid. Specific uses of the graphical capabilities of a powerful scientific workstation are described which reduce the burden on the user of collecting and formatting such large amounts of data. A program to implement this method, 3DPREP, is described. A plotting transformation algorithm, some useful software tools, notes on programming, and a database organization are also presented. Example grids developed using the method are shown.

  5. Relational-database model for improving quality assurance and process control in a composite manufacturing environment

    NASA Astrophysics Data System (ADS)

    Gentry, Jeffery D.

    2000-05-01

    A relational database is a powerful tool for collecting and analyzing the vast amounts of interrelated data associated with the manufacture of composite materials. A relational database contains many individual database tables that store data that are related in some fashion. Manufacturing process variables as well as quality assurance measurements can be collected and stored in database tables indexed according to lot numbers, part types, or individual serial numbers. Relationships between the manufacturing process and product quality can then be correlated over a wide range of product types and process variations. This paper presents details on how relational databases are used to collect, store, and analyze process variables and quality assurance data associated with the manufacture of advanced composite materials. Important considerations are covered, including how the various types of data are organized and how relationships between the data are defined. Employing relational database techniques to establish correlative relationships between process variables and quality assurance measurements is then explored. Finally, the benefits of database techniques such as data warehousing, data mining, and web-based client/server architectures are discussed in the context of composite material manufacturing.
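
    A hedged sketch of the table layout and join described above — process variables and quality assurance measurements in separate tables keyed by lot number, related through a query — is shown below. The table and column names are assumptions for illustration, not the paper's schema.

    ```python
    import sqlite3

    # Illustrative schema only; column names and keys are assumptions.
    con = sqlite3.connect(":memory:")
    con.executescript("""
    CREATE TABLE process (lot_number TEXT, cure_temp_c REAL, cure_time_min REAL);
    CREATE TABLE qa      (lot_number TEXT, void_content_pct REAL, tensile_mpa REAL);
    """)
    con.executemany("INSERT INTO process VALUES (?,?,?)",
                    [("L001", 177, 120), ("L002", 182, 115), ("L003", 171, 130)])
    con.executemany("INSERT INTO qa VALUES (?,?,?)",
                    [("L001", 1.2, 690), ("L002", 2.8, 655), ("L003", 0.9, 702)])

    # Relate manufacturing conditions to quality measurements lot by lot.
    rows = con.execute("""
        SELECT p.lot_number, p.cure_temp_c, q.void_content_pct, q.tensile_mpa
        FROM process p JOIN qa q ON p.lot_number = q.lot_number
        ORDER BY p.cure_temp_c""").fetchall()
    for row in rows:
        print(row)
    ```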

  6. The use of expanded clay dust in paint manufacturing

    NASA Astrophysics Data System (ADS)

    Sverguzova, S. V.; Sapronova, Zh A.; Starostina, Yu L.; Belovodskiy, E. A.

    2018-01-01

    Increased production of useful products is accompanied by the formation and accumulation of vast amounts of industrial waste, the bulk of which is not recycled. One example of such waste is the bag-filter dust from ceramsite (expanded clay) production. At large enterprises, 7-8 tons of this dust can be generated per day, amounting to 10-15% of the feedstock mass. This work studies the use of ceramsite production dust as a filler pigment in a red-brown organic mixed primer. For comparison, red iron oxide pigment (Pg FGM) was used. The results showed that primer formulated with expanded clay dust has a short drying time and meets all regulatory requirements.

  7. Kill ratio calculation for in-line yield prediction

    NASA Astrophysics Data System (ADS)

    Lorenzo, Alfonso; Oter, David; Cruceta, Sergio; Valtuena, Juan F.; Gonzalez, Gerardo; Mata, Carlos

    1999-04-01

    The search for better yields in IC manufacturing calls for smarter use of the vast amount of data that can be generated by a world-class production line. In this scenario, in-line inspection processes produce thousands of wafer maps, defect counts, defect types, and pictures every day. A step forward is to correlate these with the other big data-generating area: test. In this paper, we present how these data can be put together and correlated to obtain a very useful yield-prediction tool. This correlation will first allow us to calculate the kill ratio, i.e. the probability that a defect of a certain size in a certain layer kills the die. We then use that number to estimate the cosmetic yield that a wafer will have.
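
    The kill-ratio idea — the empirical probability that a defect of a given size in a given layer kills its die, estimated by overlaying inspection defect maps with electrical test results — and its use in a rough yield estimate can be sketched as below. The binning by (layer, size) and the simple independent-defect survival model are assumptions for illustration, not necessarily the authors' formulation.

    ```python
    from collections import defaultdict

    # Assumed record format: (layer, defect_size_um, die_failed) for each die that
    # received a defect, obtained by overlaying inspection maps with test maps.
    observations = [
        ("metal1", 0.8, True), ("metal1", 0.8, False), ("metal1", 0.8, True),
        ("metal1", 0.3, False), ("metal1", 0.3, False), ("via1", 1.2, True),
    ]

    counts = defaultdict(lambda: [0, 0])          # (layer, size) -> [killed, total]
    for layer, size, failed in observations:
        counts[(layer, size)][1] += 1
        counts[(layer, size)][0] += int(failed)
    kill_ratio = {k: killed / total for k, (killed, total) in counts.items()}
    print(kill_ratio)

    # Rough survival estimate for a die hit by several defects, treating defects as
    # independent: multiply (1 - kill_ratio) over the defects observed on that die.
    defects_on_die = [("metal1", 0.8), ("metal1", 0.3)]
    survival = 1.0
    for d in defects_on_die:
        survival *= 1.0 - kill_ratio.get(d, 0.0)
    print("estimated survival probability: %.2f" % survival)
    ```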

  8. Expanding the Availability of Lightweight Aluminum Alloy Armor Plate Procured from Detailed Military Specifications

    NASA Astrophysics Data System (ADS)

    Doherty, Kevin; Squillacioti, Richard; Cheeseman, Bryan; Placzankis, Brian; Gallardy, Denver

    For many years, the range of aluminum alloys for armor plate applications obtainable in accordance with detailed military specifications was very limited. However, the development of improved aluminum alloys for aerospace and other applications has provided an opportunity to modernize the Army portfolio for ground vehicle armor applications. While the benefits of offering additional alloy choices to vehicle designers are obvious, the process of creating detailed military specifications for armor plate applications is not trivial. A significant amount of material and testing is required to develop the details required by an armor plate specification. Because of the vast number of material programs that require standardization, and the limited manpower and funds available as a result of Standardization Reform in 1995, a need statement from a vehicle program office is typically required to justify and sponsor the work. This presentation will focus on recent aluminum alloy armor plate specifications that have added capability to vehicle designers' selection of armor materials, offering possible benefits such as lower cost, higher strength, better ballistic and corrosion resistance, improved weldability, etc.

  9. Mining moving object trajectories in location-based services for spatio-temporal database update

    NASA Astrophysics Data System (ADS)

    Guo, Danhuai; Cui, Weihong

    2008-10-01

    Advances in wireless transmission and mobile technology applied to LBS (Location-Based Services) flood us with vast amounts of moving-object data. The data gathered from the position sensors of mobile phones, PDAs, or vehicles hide interesting and valuable knowledge and describe the behavior of moving objects. In previous work, the correlation between the temporal movement patterns of moving objects and the spatio-temporal attributes of geographic features was ignored, and the value of spatio-temporal trajectory data was not fully exploited. Urban expansion and frequent changes to town plans leave a large amount of outdated or imprecise data in the spatial databases of LBS, and these cannot be updated in a timely and efficient manner by manual processing. In this paper we introduce a data mining approach to movement pattern extraction for moving objects, build a model to describe the relationship between the movement patterns of LBS mobile objects and their environment, and put forward a spatio-temporal database update strategy for LBS databases based on spatio-temporal mining of trajectories. Experimental evaluation reveals excellent performance of the proposed model and strategy. Our original contributions include the formulation of a model of the interaction between a trajectory and its environment, the design of a spatio-temporal database update strategy based on moving-object data mining, and the experimental application of spatio-temporal database update by mining moving-object trajectories.

  10. Current and potential uses of bioactive molecules from marine processing waste.

    PubMed

    Suleria, Hafiz Ansar Rasul; Masci, Paul; Gobe, Glenda; Osborne, Simone

    2016-03-15

    Food industries produce huge amounts of processing waste that is often disposed of, incurring expense and impacting the environment. For these and other reasons, food processing waste streams, in particular marine processing waste streams, are gaining popularity amongst pharmaceutical, cosmetic and nutraceutical industries as sources of bioactive molecules. In the last 30 years, there has been a gradual increase in processed marine products with a concomitant increase in waste streams that include viscera, heads, skins, fins, bones, trimmings and shellfish waste. In 2010, these waste streams equated to approximately 24 million tonnes of mostly unused resources. Marine processing waste streams not only represent an abundant resource, they are also enriched with structurally diverse molecules that possess a broad panel of bioactivities including anti-oxidant, anti-coagulant, anti-thrombotic, anti-cancer and immune-stimulatory activities. Retrieval and characterisation of bioactive molecules from marine processing waste also contributes valuable information to the vast field of marine natural product discovery. This review summarises the current use of bioactive molecules from marine processing waste in different products and industries. Moreover, this review summarises new research into processing waste streams and the potential for adoption by industries in the creation of new products containing marine processing waste bioactives. © 2015 Society of Chemical Industry.

  11. Hybrid Text: An Engaging Genre to Teach Content Area Material across the Curriculum

    ERIC Educational Resources Information Center

    Bintz, William P.; Ciecierski, Lisa M.

    2017-01-01

    The Common Core State Standards for English language arts expect that teachers will use narrative and informational texts to teach content area material across the curriculum. However, many teachers at all grade levels struggle to incorporate both kinds of text, especially given the vast amount of specialized content they are required to teach.…

  12. Tuning In: Using the News for a Content-Based ESL Class

    ERIC Educational Resources Information Center

    Moglen, Daniel

    2014-01-01

    Vast amounts of daily news content are widely available and easily accessible, and they can be converted into materials for intermediate and advanced ESL classes. This article will describe the why and how for integrating news media sources into a multiskills ESL classroom. Through the news, students are immediately engaged with the material…

  13. AUTOMATED IDENTIFICATION AND SORTING OF RARE EARTH ELEMENTS IN AN E-WASTE RECYCLING STREAM - PHASE I

    EPA Science Inventory

    Electronic waste (e-waste) is one of the most rapidly growing waste problems worldwide. Improper handling of e-waste results in vast amounts of toxic waste being sent to landfill and leaching into the water supply. Due to these concerns, e-waste recycling is a rapidly gro...

  14. Probabilistic Gait Classification in Children with Cerebral Palsy: A Bayesian Approach

    ERIC Educational Resources Information Center

    Van Gestel, Leen; De Laet, Tinne; Di Lello, Enrico; Bruyninckx, Herman; Molenaers, Guy; Van Campenhout, Anja; Aertbelien, Erwin; Schwartz, Mike; Wambacq, Hans; De Cock, Paul; Desloovere, Kaat

    2011-01-01

    Three-dimensional gait analysis (3DGA) generates a wealth of highly variable data. Gait classifications help to reduce, simplify and interpret this vast amount of 3DGA data and thereby assist and facilitate clinical decision making in the treatment of CP. CP gait is often a mix of several clinically accepted distinct gait patterns. Therefore,…

  15. AUTOMATED REMOVAL OF BROMINATED FLAME RETARDANT MATERIAL FROM A MIXED E-WASTE PLASTICS RECYCLING STREAM - PHASE I

    EPA Science Inventory

    Electronic waste (e-waste) is one of the most rapidly growing waste problems worldwide. Improper handling of e-waste results in vast amounts of toxic waste being sent to landfills and leaching into the water supply. Because of these concerns, e-waste recycling is a rapidly gro...

  16. AUTOMATED REMOVAL OF BROMINATED FLAME RETARDANT MATERIAL FROM A MIXED E-WASTE PLASTICS RECYCLING STREAM - PHASE II

    EPA Science Inventory

    Electronic waste (e-waste) is one of the most rapidly growing waste problems worldwide. Improper handling of e-waste results in vast amounts of toxic waste being sent to landfill and leaching into the water supply. Due to these concerns, e-waste recycling is a rapidly growing...

  17. Research, Development & Dissemination in Educational Planning & Educational Facilities Planning. Project SIMU-School.

    ERIC Educational Resources Information Center

    Hunt, Lester W.; Burr, Donald F.

    The Simu-School Program and the National Center for Educational Planning were conceived because of the need for (1) expertise in educational planning, (2) a system to collect and assemble the vast amount of knowledge concerning education, (3) community involvement in planning, and (4) a system to accurately interpret today's data in planning for…

  18. Arnold's Advantages: How Governor Schwarzenegger Acquired English through De Facto Bilingual Education

    ERIC Educational Resources Information Center

    Ramos, Francisco; Krashen, Stephen

    2013-01-01

    Governor Arnold Schwarzenegger has repeatedly mentioned that immigrants to the United States should do what he did to acquire English: Avoid using their first languages and speak, listen to, and read a vast amount of materials in English--a combination he referred to as "immersion." Yet, Schwarzenegger's real path to successful English…

  19. The Teacher as Instructional Designer: A Search Through the Curriculum Maze.

    ERIC Educational Resources Information Center

    Jacko, Carol M.; Garman, Noreen M.

    The vast amount of literature on curriculum and instructional design does not establish a clear frame of reference to help teachers utilize and operationalize concepts and information. Four specific problem areas are of special concern in the literature: (1) the terminology and definitions are confusing; (2) the place of the classroom teacher is…

  20. New Metaphors for Organizing Data Could Change the Nature of Computers.

    ERIC Educational Resources Information Center

    Young, Jeffrey R.

    1997-01-01

    Based on the idea that the current framework for organizing electronic data does not take advantage of the mind's ability to make connections among disparate pieces of information, several projects at universities around the country are taking new approaches to classification and storage of vast amounts of computerized data. The new systems take…

  1. Mining for Gold: Utilizing SEC Filings to Develop MBA Students' Understanding of Legal Concepts

    ERIC Educational Resources Information Center

    Willey, Susan; Sherman, Peggy

    2010-01-01

    Many MBA classes, such as those in accounting and finance, require students to examine securities filings to perform financial analyses of companies. Often, however, students are unaware of the vast amount of additional information that can be obtained from a company's securities filings and other public information. Much of this information…

  2. Make the Conference Come to You

    ERIC Educational Resources Information Center

    Behrens, Susan J.

    2008-01-01

    In today's environmentally aware climate, the author relates that she sees the traditional academic conference in a new light. Many people travel great distances and use vast amounts of resources to stand in front of other energy consumers and read a paper aloud. People do not even call it giving a talk anymore; it is giving a paper. The author…

  3. Using Digital Participatory Research to Foster Glocal Competence: Constructing Multimedia Projects as a Form of Global and Civic Citizenship

    ERIC Educational Resources Information Center

    Mathews, Sarah A.

    2016-01-01

    Digital Participatory Research (DPR) combines grass-roots participatory research and photojournalism, asks students to investigate assets and issues within their community, and facilitates civic participation by using problem-posing and praxis-orientated methods. Although there is a vast amount of research documenting the impact of DPR at the…

  4. Quasiregularity and Its Discontents: The Legacy of the Past Tense Debate

    ERIC Educational Resources Information Center

    Seidenberg, Mark S.; Plaut, David C.

    2014-01-01

    Rumelhart and McClelland's chapter about learning the past tense created a degree of controversy extraordinary even in the adversarial culture of modern science. It also stimulated a vast amount of research that advanced the understanding of the past tense, inflectional morphology in English and other languages, the nature of linguistic…

  5. Making It Relevant: How a Black Male Teacher Sustained Professional Relationships through Culturally Responsive Discourse

    ERIC Educational Resources Information Center

    Thomas, Ebony Elizabeth; Warren, Chezare A.

    2017-01-01

    What we know about the experiences of black teachers is limited, especially considering the vast amount of research conducted on and about black boys and young men. This article describes and analyzes how a black teacher at a suburban high school in the Midwestern United States negotiated professional relationships through culturally relevant…

  6. Tennessee's Forests, 2004

    Treesearch

    Christopher M. Oswalt; Sonja N. Oswalt; Tony G. Johnson; James L. Chamberlain; KaDonna C. Randolph; John W. Coulston

    2009-01-01

    Forest land area in Tennessee amounted to 13.78 million acres. About 125 different species, mostly hardwood, account for an estimated 22.6 billion cubic feet of all growing-stock volume on timberland in the State. Hardwood forest types occupy the vast majority of the State's forest land, and oak-hickory is the dominant forest-type group, accounting for about 10.1...

  7. Rescuing Language Education from the Neoliberal Disaster: Culturometric Predictions and Analyses of Future Policy

    ERIC Educational Resources Information Center

    Boufoy-Bastick, Béatrice

    2015-01-01

    Over the last three decades neoliberal government policies have spread successfully around the world with disastrous effects on the social infrastructures of many countries. Neoliberal policies move vast amounts of public money into private hands increasing the gap between rich and poor and decimating social support services for the majority of…

  8. Teachers' Perspectives of Lower Secondary School Students in Streamed Classes--A Western Australian Case Study

    ERIC Educational Resources Information Center

    Johnston, Olivia; Wildy, Helen

    2018-01-01

    Streaming in secondary schools is not beneficial for improving student outcomes; vast amounts of educational research indicate that it does not improve academic results and that it increases inequity. Yet teachers often prefer working in streamed classes, and research shows that teachers mediate the effects of streaming on students.…

  9. Understanding brains: details, intuition, and big data.

    PubMed

    Marder, Eve

    2015-05-01

    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  10. Race and Ethnicity in the Genome Era: The Complexity of the Constructs

    ERIC Educational Resources Information Center

    Bonham, Vence L.; Warshauer-Baker, Esther; Collins, Francis S.

    2005-01-01

    The vast amount of biological information that is now available through the completion of the Human Genome Project presents opportunities and challenges. The genomic era has the potential to advance an understanding of human genetic variation and its role in human health and disease. A challenge for genomics research is to understand the…

  11. Research on Substance Abuse: Alcohol, Drugs, Tobacco. Matrix No. 14.

    ERIC Educational Resources Information Center

    Robins, Lee N.

    In the last few years, a vast amount of research has accumulated with respect to American children's use of legal and illicit drugs. This research has included cross-sectional studies (which have attempted to determine current drug usage, age of onset for each drug used, and maximum frequency of use in the lifetime); longitudinal studies (which…

  12. How Does Target Know so Much about Its Customers? Utilizing Customer Analytics to Make Marketing Decisions

    ERIC Educational Resources Information Center

    Corrigan, Hope B.; Craciun, Georgiana; Powell, Allison M.

    2014-01-01

    Every time shoppers make a purchase at a store or browse a Web site, customer behavior is tracked, analyzed, and perhaps shared with other businesses. Target Corporation is a leader in analyzing vast amounts of data to identify buying patterns, improve customer satisfaction, predict future trends, select promotional strategies, and increase…

  13. The Effects of Class Size on Student Achievement in Intermediate Level Elementary Students

    ERIC Educational Resources Information Center

    McInerney, Melissa

    2014-01-01

    Class size and student achievement have been debated for decades. The vast amount of research on this topic is either conflicting or inconclusive. There are large and small scale studies that support both sides of this dilemma (Achilles, Nye, Boyd-Zaharias, Fulton, & Cain, 1994; Glass & Smith, 1979; Slavin, 1989). Class size reduction is a…

  14. Translating Knowledge on Poverty to Humanize Care: Benefits and Synergies of Community Engagement with the Arts

    ERIC Educational Resources Information Center

    Lévesque, Martine Cécile; Dupéré, Sophie; Morin, Nathalie; Côté, Johanne; Roberge, Nancy; Laurin, Isabelle; Charbonneau, Anne; Loignon, Christine; Bedos, Christophe

    2015-01-01

    The knowledge translation movement in health has led to the production of vast amounts of knowledge tools aimed at broadening clinicians' evidence base and improving the quality and efficacy of their practices. However important, these tools, largely oriented towards biomedical and technological aspects of care, are of limited potential for…

  15. Using natural language processing to identify problem usage of prescription opioids.

    PubMed

    Carrell, David S; Cronkite, David; Palmer, Roy E; Saunders, Kathleen; Gross, David E; Masters, Elizabeth T; Hylan, Timothy R; Von Korff, Michael

    2015-12-01

    Accurate and scalable surveillance methods are critical to understand widespread problems associated with misuse and abuse of prescription opioids and for implementing effective prevention and control measures. Traditional diagnostic coding incompletely documents problem use. Relevant information for each patient is often obscured in vast amounts of clinical text. We developed and evaluated a method that combines natural language processing (NLP) and computer-assisted manual review of clinical notes to identify evidence of problem opioid use in electronic health records (EHRs). We used the EHR data and text of 22,142 patients receiving chronic opioid therapy (≥70 days' supply of opioids per calendar quarter) during 2006-2012 to develop and evaluate an NLP-based surveillance method and compare it to traditional methods based on International Classification of Diseases, Ninth Revision (ICD-9) codes. We developed a 1288-term dictionary for clinician mentions of opioid addiction, abuse, misuse or overuse, and an NLP system to identify these mentions in unstructured text. The system distinguished affirmative mentions from those that were negated or otherwise qualified. We applied this system to 7,336,445 electronic chart notes of the 22,142 patients. Trained abstractors using a custom computer-assisted software interface manually reviewed 7751 chart notes (from 3156 patients) selected by the NLP system and classified each note as to whether or not it contained textual evidence of problem opioid use. Traditional diagnostic codes for problem opioid use were found for 2240 (10.1%) patients. NLP-assisted manual review identified an additional 728 (3.1%) patients with evidence of clinically diagnosed problem opioid use in clinical notes. Inter-rater reliability among pairs of abstractors reviewing notes was high, with kappa=0.86 and 97% agreement for one pair, and kappa=0.71 and 88% agreement for another pair. Scalable, semi-automated NLP methods can efficiently and accurately identify evidence of problem opioid use in vast amounts of EHR text. Incorporating such methods into surveillance efforts may increase prevalence estimates by as much as one-third relative to traditional methods. Copyright © 2015. Published by Elsevier Ireland Ltd.
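
    The core matching step described above — scanning note text for dictionary terms and separating affirmative mentions from negated or qualified ones — can be sketched in a few lines. The terms, negation cues, and context window below are illustrative assumptions; the study's 1288-term dictionary and NLP system are far more elaborate.

    ```python
    import re

    # Tiny illustrative dictionary and negation cues (assumptions, not the study's lexicon).
    TERMS = ["opioid abuse", "opioid addiction", "opioid misuse", "overuse of opioids"]
    NEGATION_CUES = ["no ", "denies", "without", "negative for", "no evidence of"]
    WINDOW = 40  # characters of preceding context scanned for a negation cue

    def find_mentions(note):
        mentions = []
        lowered = note.lower()
        for term in TERMS:
            for match in re.finditer(re.escape(term), lowered):
                context = lowered[max(0, match.start() - WINDOW):match.start()]
                status = "negated" if any(cue in context for cue in NEGATION_CUES) else "affirmative"
                mentions.append((term, status))
        return mentions

    note = "Patient denies opioid misuse. History significant for opioid addiction in 2010."
    print(find_mentions(note))
    # [('opioid addiction', 'affirmative'), ('opioid misuse', 'negated')]
    ```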

  16. FPGA cluster for high-performance AO real-time control system

    NASA Astrophysics Data System (ADS)

    Geng, Deli; Goodsell, Stephen J.; Basden, Alastair G.; Dipper, Nigel A.; Myers, Richard M.; Saunter, Chris D.

    2006-06-01

    Whilst the high throughput and low latency requirements for the next generation of AO real-time control systems have posed a significant challenge to von Neumann architecture processor systems, the Field Programmable Gate Array (FPGA) has emerged as a long-term solution with high performance on throughput and excellent predictability on latency. Moreover, FPGA devices have highly capable programmable interfacing, which leads to more highly integrated systems. Nevertheless, a single FPGA is still not enough: multiple FPGA devices need to be clustered to perform the required subaperture processing and the reconstruction computation. In an AO real-time control system, the memory bandwidth is often the bottleneck of the system, simply because a vast amount of supporting data, e.g. pixel calibration maps and the reconstruction matrix, needs to be accessed within a short period. The cluster, as a general computing architecture, has excellent scalability in processing throughput, memory bandwidth, memory capacity, and communication bandwidth. Problems such as task distribution, node communication, and system verification are discussed.

  17. Computer simulation and evaluation of edge detection algorithms and their application to automatic path selection

    NASA Technical Reports Server (NTRS)

    Longendorfer, B. A.

    1976-01-01

    The construction of an autonomous roving vehicle requires the development of complex data-acquisition and processing systems, which determine the path along which the vehicle travels. Thus, a vehicle must possess algorithms which can (1) reliably detect obstacles by processing sensor data, (2) maintain a constantly updated model of its surroundings, and (3) direct its immediate actions to further a long range plan. The first function consisted of obstacle recognition. Obstacles may be identified by the use of edge detection techniques. Therefore, the Kalman Filter was implemented as part of a large scale computer simulation of the Mars Rover. The second function consisted of modeling the environment. The obstacle must be reconstructed from its edges, and the vast amount of data must be organized in a readily retrievable form. Therefore, a Terrain Modeller was developed which assembled and maintained a rectangular grid map of the planet. The third function consisted of directing the vehicle's actions.
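
    The second function above — maintaining a constantly updated rectangular grid model of the surroundings — can be illustrated with a toy occupancy-grid sketch. The cell size, evidence counting, and threshold are assumptions for illustration, not the design of the original Terrain Modeller.

    ```python
    # Toy occupancy-grid terrain model (assumed illustration): each cell accumulates
    # obstacle detections so the map can be updated incrementally from sensor data.
    CELL_SIZE = 0.5  # metres per grid cell (assumed)

    class TerrainGrid:
        def __init__(self, width, height):
            self.cells = [[0] * width for _ in range(height)]

        def mark_obstacle(self, x, y):
            col, row = int(x / CELL_SIZE), int(y / CELL_SIZE)
            if 0 <= row < len(self.cells) and 0 <= col < len(self.cells[0]):
                self.cells[row][col] += 1        # accumulate evidence for an obstacle

        def is_blocked(self, x, y, threshold=2):
            col, row = int(x / CELL_SIZE), int(y / CELL_SIZE)
            return self.cells[row][col] >= threshold

    grid = TerrainGrid(width=20, height=20)
    for point in [(3.2, 4.1), (3.3, 4.0), (8.7, 1.2)]:   # obstacle points from edge detection
        grid.mark_obstacle(*point)
    print(grid.is_blocked(3.25, 4.05))   # True: two detections fell in the same cell
    ```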

  18. The role of the Internet in cancer patients' engagement with complementary and alternative treatments.

    PubMed

    Broom, Alex; Tovey, Philip

    2008-04-01

    This article draws on a study of 80 National Health Service cancer patients and their experiences of using the Internet within disease and treatment processes. It focuses on the role the Internet plays in the context of potential or actual engagement with complementary and alternative medicine (CAM). The results depart from previous conceptualizations of the Internet as, first, a major source of CAM knowledge and, second, a major pathway to patient CAM usage. Moreover, the results highlight significant anxiety as patients attempt to process vast amounts of complex biomedical diagnostic and prognostic information online. For patients attempting to embrace alternative therapeutic models of cancer care, exposure to prognostic data may pose considerable risks to individual well-being and engagement with healing practices. On the basis of these results we problematize social theorizations of the Internet as contributing to such things as the democratization of knowledge, the deprofessionalization of medicine, and patient empowerment. We emphasize, instead, the potential role of the Internet in reinforcing biomedicine's paradigmatic dominance in cancer care.

  19. Automated Ontology Generation Using Spatial Reasoning

    NASA Astrophysics Data System (ADS)

    Coalter, Alton; Leopold, Jennifer L.

    Recently there has been much interest in using ontologies to facilitate knowledge representation, integration, and reasoning. Correspondingly, the extent of the information embodied by an ontology is increasing beyond the conventional is_a and part_of relationships. To address these requirements, a vast amount of digitally available information may need to be considered when building ontologies, prompting a desire for software tools to automate at least part of the process. The main efforts in this direction have involved textual information retrieval and extraction methods. For some domains, extension of the basic relationships could be enhanced further by the analysis of 2D and/or 3D images. For this type of media, image processing algorithms are more appropriate than textual analysis methods. Herein we present an algorithm that, given a collection of 3D image files, utilizes Qualitative Spatial Reasoning (QSR) to automate the creation of an ontology for the objects represented by the images, relating the objects in terms of is_a and part_of relationships and also through unambiguous Region Connection Calculus (RCC) relations.
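
    A rough sketch of how spatial reasoning over 3D geometry can suggest part_of candidates — classifying object pairs by whether their bounding volumes are disjoint, overlapping, or contained — is given below. The axis-aligned bounding boxes and the three-way classification are simplifying assumptions; the paper's QSR/RCC machinery operates on the actual image geometry with a richer relation set.

    ```python
    # Hedged sketch: coarse spatial relation between two objects from their axis-aligned
    # 3D bounding boxes, given as ((xmin, ymin, zmin), (xmax, ymax, zmax)).
    def relation(box_a, box_b):
        (a_lo, a_hi), (b_lo, b_hi) = box_a, box_b
        if any(a_hi[i] < b_lo[i] or b_hi[i] < a_lo[i] for i in range(3)):
            return "disconnected"
        if all(b_lo[i] <= a_lo[i] and a_hi[i] <= b_hi[i] for i in range(3)):
            return "part_of_candidate"   # A contained in B suggests a part_of edge A -> B
        return "overlapping"

    skull = ((0, 0, 0), (10, 8, 9))
    mandible = ((2, 0, 1), (8, 3, 4))
    print(relation(mandible, skull))     # part_of_candidate
    ```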

  20. Experimental evolution in biofilm populations

    PubMed Central

    Steenackers, Hans P.; Parijs, Ilse; Foster, Kevin R.; Vanderleyden, Jozef

    2016-01-01

    Biofilms are a major form of microbial life in which cells form dense surface associated communities that can persist for many generations. The long-life of biofilm communities means that they can be strongly shaped by evolutionary processes. Here, we review the experimental study of evolution in biofilm communities. We first provide an overview of the different experimental models used to study biofilm evolution and their associated advantages and disadvantages. We then illustrate the vast amount of diversification observed during biofilm evolution, and we discuss (i) potential ecological and evolutionary processes behind the observed diversification, (ii) recent insights into the genetics of adaptive diversification, (iii) the striking degree of parallelism between evolution experiments and real-life biofilms and (iv) potential consequences of diversification. In the second part, we discuss the insights provided by evolution experiments in how biofilm growth and structure can promote cooperative phenotypes. Overall, our analysis points to an important role of biofilm diversification and cooperation in bacterial survival and productivity. Deeper understanding of both processes is of key importance to design improved antimicrobial strategies and diagnostic techniques. PMID:26895713

  1. Experimental evolution in biofilm populations.

    PubMed

    Steenackers, Hans P; Parijs, Ilse; Dubey, Akanksha; Foster, Kevin R; Vanderleyden, Jozef

    2016-05-01

    Biofilms are a major form of microbial life in which cells form dense surface associated communities that can persist for many generations. The long-life of biofilm communities means that they can be strongly shaped by evolutionary processes. Here, we review the experimental study of evolution in biofilm communities. We first provide an overview of the different experimental models used to study biofilm evolution and their associated advantages and disadvantages. We then illustrate the vast amount of diversification observed during biofilm evolution, and we discuss (i) potential ecological and evolutionary processes behind the observed diversification, (ii) recent insights into the genetics of adaptive diversification, (iii) the striking degree of parallelism between evolution experiments and real-life biofilms and (iv) potential consequences of diversification. In the second part, we discuss the insights provided by evolution experiments in how biofilm growth and structure can promote cooperative phenotypes. Overall, our analysis points to an important role of biofilm diversification and cooperation in bacterial survival and productivity. Deeper understanding of both processes is of key importance to design improved antimicrobial strategies and diagnostic techniques. © FEMS 2016.

  2. A Common Approach for the Certifying of International Space Station (ISS) Basic Hardware for Ground Safety

    NASA Technical Reports Server (NTRS)

    Kirkpatrick, Paul D.; Trinchero, Jean-Pierre

    2005-01-01

    In order to support the International Space Station, as well as any future long-term human missions, vast amounts of logistical-type hardware are required to be processed through the various launch sites. This category consists of such hardware as spare parts, replacement items, and upgraded hardware. The category also includes samples for experiments and consumables. One attribute that all these items share is that they are generally non-hazardous, at least to ground personnel. Even though the items are non-hazardous, launch site ground safety has a responsibility for the protection of personnel, the flight hardware, and launch site resources. In order to fulfill this responsibility, the safety organization must have knowledge of the hardware and its operations. Conversely, the hardware providers are entitled to a process that is commensurate with the hazard. Additionally, a common system should be in place that is flexible enough to account for the requirements at all launch sites, so that the hardware provider need only complete one process for ground safety regardless of the launch site.

  3. Surface Operations Systems Improve Airport Efficiency

    NASA Technical Reports Server (NTRS)

    2009-01-01

    With Small Business Innovation Research (SBIR) contracts from Ames Research Center, Mosaic ATM of Leesburg, Virginia created software to analyze surface operations at airports. Surface surveillance systems, which report locations every second for thousands of air and ground vehicles, generate massive amounts of data, making gathering and analyzing this information difficult. Mosaic's Surface Operations Data Analysis and Adaptation (SODAA) tool is an off-line support tool that can analyze how well the airport surface operation is working and can help redesign procedures to improve operations. SODAA helps researchers pinpoint trends and correlations in vast amounts of recorded airport operations data.

  4. An Internet of Things platform architecture for supporting ambient assisted living environments.

    PubMed

    Tsirmpas, Charalampos; Kouris, Ioannis; Anastasiou, Athanasios; Giokas, Kostas; Iliopoulou, Dimitra; Koutsouris, Dimitris

    2017-01-01

    Internet of Things (IoT) is the logical further development of today's Internet, enabling a huge number of devices to communicate, compute, sense and act. IoT sensors placed in Ambient Assisted Living (AAL) environments enable context awareness and allow the support of the elderly in their daily routines, ultimately allowing an independent and safe lifestyle. The vast amount of data that is generated and exchanged between the IoT nodes requires innovative context modeling approaches that go beyond currently used models. The current paper presents and evaluates an open, interoperable platform architecture that utilizes the technical characteristics of IoT and handles the large amount of generated data, as a solution to the technical requirements of AAL applications.

  5. A global "imaging'' view on systems approaches in immunology.

    PubMed

    Ludewig, Burkhard; Stein, Jens V; Sharpe, James; Cervantes-Barragan, Luisa; Thiel, Volker; Bocharov, Gennady

    2012-12-01

    The immune system exhibits an enormous complexity. High throughput methods such as the "-omic'' technologies generate vast amounts of data that facilitate dissection of immunological processes at ever finer resolution. Using high-resolution data-driven systems analysis, causal relationships between complex molecular processes and particular immunological phenotypes can be constructed. However, processes in tissues, organs, and the organism itself (so-called higher level processes) also control and regulate the molecular (lower level) processes. Reverse systems engineering approaches, which focus on the examination of the structure, dynamics and control of the immune system, can help to understand the construction principles of the immune system. Such integrative mechanistic models can properly describe, explain, and predict the behavior of the immune system in health and disease by combining both higher and lower level processes. Moving from molecular and cellular levels to a multiscale systems understanding requires the development of methodologies that integrate data from different biological levels into multiscale mechanistic models. In particular, 3D imaging techniques and 4D modeling of the spatiotemporal dynamics of immune processes within lymphoid tissues are central for such integrative approaches. Both dynamic and global organ imaging technologies will be instrumental in facilitating comprehensive multiscale systems immunology analyses as discussed in this review. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Knowledge bases, clinical decision support systems, and rapid learning in oncology.

    PubMed

    Yu, Peter Paul

    2015-03-01

    One of the most important benefits of health information technology is to assist the cognitive process of the human mind in the face of vast amounts of health data, limited time for decision making, and the complexity of the patient with cancer. Clinical decision support tools are frequently cited as a technologic solution to this problem, but to date useful clinical decision support systems (CDSS) have been limited in utility and implementation. This article describes three unique sources of health data that underlie fundamentally different types of knowledge bases which feed into CDSS. CDSS themselves comprise a variety of models which are discussed. The relationship of knowledge bases and CDSS to rapid learning health systems design is critical as CDSS are essential drivers of rapid learning in clinical care. Copyright © 2015 by American Society of Clinical Oncology.

  7. [Advances in microbial solar cells--A review].

    PubMed

    Guo, Xiaoyun; Yu, Changping; Zheng, Tianling

    2015-08-04

    The energy crisis has become one of the major problems hindering the development of the world. The emergence of microbial fuel cells provides a new solution to the energy crisis. Microbial solar cells, which integrate photosynthetic organisms such as plants and microalgae into microbial fuel cells, can convert solar energy into electrical energy. Microbial solar cells deliver steady electrical energy and have broad application prospects in wastewater treatment, biodiesel processing and intermediate metabolite production. Here we reviewed recent progress of microbial solar cells from the perspective of the role of photosynthetic organisms in microbial fuel cells, based on a vast amount of literature, and discussed their advantages and deficiencies. Finally, we briefly analyze the remaining problems and research needs of microbial fuel cells. This work is expected to be beneficial for the application of microbial solar cell technology.

  8. Use of computers in dysmorphology.

    PubMed Central

    Diliberti, J H

    1988-01-01

    As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly make the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092

  9. What to Do with Years of Data on Student Engagement

    ERIC Educational Resources Information Center

    Lipka, Sara

    2007-01-01

    George D. Kuh has directed the National Survey of Student Engagement since it began. Over the past eight years, the survey has collected a vast amount of data on how students learn and grow. It has promoted successful practices like learning communities, in which students take courses that focus on a theme, and culminating projects like theses and…

  10. Earthworms in tropical tree plantations: effects of management and relations with soil carbon and nutrient use efficiency

    Treesearch

    X Zou; Grizelle Gonzalez

    2001-01-01

    With the vast amount of abandoned tropical land due to non-sustainable farming practices, tropical tree plantations have become an effective means of restoring soil productivity and preserving ecosystem biodiversity. Because earthworms are the dominant soil fauna in moist tropical regions and play an important role in improving soil fertility, understanding the mechanisms...

  11. A Probabilistic Approach to Data Integration in Biomedical Research: The IsBIG Experiments

    ERIC Educational Resources Information Center

    Anand, Vibha

    2010-01-01

    Biomedical research has produced vast amounts of new information in the last decade but has been slow to find its use in clinical applications. Data from disparate sources such as genetic studies and summary data from published literature have been amassed, but there is a significant gap, primarily due to a lack of normative methods, in combining…

  12. Pathways to Labor Market Success: The Literacy Proficiency of U.S. Adults. Policy Information Report

    ERIC Educational Resources Information Center

    Sum, Andrew; Kirsch, Irwin; Yamamoto, Kentaro

    2004-01-01

    This is the fourth in a series of reports that draws upon the vast amount of background and assessment data and information that have been collected from the National Adult Literacy Survey (NALS) and the International Adult Literacy Survey (IALS). In this report, the authors find connections between the literacy skills of adults and their success…

  13. Where Have All the Scientific Data Gone? LIS Perspective on the Data-at-Risk Predicament

    ERIC Educational Resources Information Center

    Thompson, Cheryl A.; Robertson, W. Davenport; Greenberg, Jane

    2014-01-01

    Scientists produce vast amounts of data that often are not preserved properly or do not have inventories, placing them at risk. As part of an effort to more fully understand the data-at-risk predicament, researchers who were engaged in the DARI project at UNC's Metadata Research Center surveyed information custodians working in a range of…

  14. Improving Reading Comprehension Strategies Using Student Produced CD's Combined with More Traditional Activities.

    ERIC Educational Resources Information Center

    Goss, Gail

    Readers possess vast amounts of knowledge gained from their prior experiences and exposures. The more they are helped to use that knowledge for connecting new ideas to known subjects as they read, the better their comprehension will be. Discussions before reading have been a traditional way to activate students' schema for stories, but a new…

  15. A Selective and Evaluative Bibliographic Essay on Mormonism: For Use in Public, Academic, and Special Libraries.

    ERIC Educational Resources Information Center

    Laughlin, David L.

    The Church of Jesus Christ of Latter-day Saints (LDS) was established in 1830 by six men led by Joseph Smith. Today this group, commonly called Mormons, numbers approximately seven million members worldwide. Mormonism has sometimes been the object of public, political, and ecclesiastical animosity and misinformation. There is now a vast amount of…

  16. The Moving Image in Education Research: Reassembling the Body in Classroom Video Data

    ERIC Educational Resources Information Center

    de Freitas, Elizabeth

    2016-01-01

    While audio recordings and observation might have dominated past decades of classroom research, video data is now the dominant form of data in the field. Ubiquitous videography is standard practice today in archiving the body of both the teacher and the student, and vast amounts of classroom and experiment clips are stored in online archives. Yet…

  17. Flickr's Potential as an Academic Image Resource: An Exploratory Study

    ERIC Educational Resources Information Center

    Angus, Emma; Stuart, David; Thelwall, Mike

    2010-01-01

    Many web 2.0 sites are extremely popular and contain vast amounts of content, but how much of this content is useful in academia? This exploratory paper investigates the potential use of the popular web 2.0 image site Flickr as an academic image resource. The study identified images tagged with any one of 12 subject names derived from recognized…

  18. Violence or Nonviolence: Which do We Choose?

    ERIC Educational Resources Information Center

    Schafer, John

    2005-01-01

    A large mass of research on violence now exists, yet the utilitarian value of this vast amount of scientific endeavor may be rated as low when compared with the levels of violence that still abound in the world today. The author calls for centralizing funding and work on violence at the national level in the United States, perhaps forming a…

  19. A New Way of Making Cultural Information Resources Visible on the Web: Museums and the Open Archive Initiative.

    ERIC Educational Resources Information Center

    Perkins, John

    Museums hold enormous amounts of information in collections management systems and publish academic and scholarly research in print journals, exhibition catalogs, virtual museum presentations, and community publications. Much of this rich content is unavailable to web search engines or otherwise gets lost in the vastness of the World Wide Web. The…

  20. Understanding Brains: Details, Intuition, and Big Data

    PubMed Central

    Marder, Eve

    2015-01-01

    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important. PMID:25965068

  1. Generation and Recovery of Solid Wood Waste in the U.S.

    Treesearch

    Bob Falk; David McKeever

    2012-01-01

    North America has a vast system of hardwood and softwood forests, and the wood harvested from this resource is widely used in many applications. These include lumber and other building materials, furniture, crating, containers, pallets and other consumer goods. This wide array of wood products generates not only a large amount of industrial wood by-product during the...

  2. Regional biomass stores and dynamics in forests of coastal Alaska

    Treesearch

    Mikhaill A. Yatskov; Mark E. Harmon; Olga N. Krankina; Tara M. Barrett; Kevin R. Dobelbower; Andrew N. Gray; Becky Fasth; Lori Trummer; Toni L. Hoyman; Chana M. Dudoit

    2015-01-01

    Coastal Alaska is a vast forested region (6.2 million ha) with the potential to store large amounts of carbon in live and dead biomass thus influencing continental and global carbon dynamics. The main objectives of this study were to assess regional biomass stores, examine the biomass partitioning between live and dead pools, and evaluate the effect of disturbance on...

  3. "It Should at Least Seem Scientific!" Textual Features of "Scientificness" and Their Impact on Lay Assessments of Online Information

    ERIC Educational Resources Information Center

    Thomm, Eva; Bromme, Rainer

    2012-01-01

    The Internet is a convenient source of information about science-based topics (e.g., health matters). Whereas experts are familiar with the conventions of "true" scientific discourse and the assessment of scientific information, laypeople may have great difficulty choosing among, evaluating, and deciding on the vast amount of information…

  4. What Is Gravity?

    ERIC Educational Resources Information Center

    Nelson, George

    2004-01-01

    Gravity is the name given to the phenomenon that any two masses, like you and the Earth, attract each other. One pulls on the Earth and the Earth pulls back on one by the same amount. And the two do not have to be touching. Gravity acts over vast distances, like the 150 million kilometers (93 million miles) between the Earth and the Sun or the billions of…

  5. EPSS Needs Assessment: Oops, I Forgot How to Do That!

    ERIC Educational Resources Information Center

    Nguyen, Frank

    2005-01-01

    How many times have you attended a marathon training class only to return to your job and promptly forget what you learned? How many times have you programmed your VCR, only to find yourself searching for the manual six months later? The reality is that humans accumulate a vast amount of life experience and knowledge. Adult learning theory…

  6. Completion in Vocational and Academic Upper Secondary School: The Importance of School Motivation, Self-Efficacy, and Individual Characteristics

    ERIC Educational Resources Information Center

    Daehlen, Marianne

    2017-01-01

    A vast amount of research is devoted to identifying factors that predict early school leaving. However, there is no simple explanation because the results show that young people leave education prematurely for various reasons, such as their level of school involvement, their background characteristics and different school systems. This article…

  7. Ultra-low gossypol cottonseed: gene-silencing opens up a vast, but underutilized protein resource for human nutrition

    USDA-ARS?s Scientific Manuscript database

    Cotton, grown mainly for its fiber, is a major crop in several developing and developed countries across the globe. In 2012, 48.8 million metric tons (MMT) of cottonseed was produced worldwide as a by-product of the 25.9 MMT of cotton lint production (FAO Production Statistics). This amount of cot...

  8. Ultra-low gossypol cottonseed: gene silencing opens up a vast, but underutilized protein resource for humanity

    USDA-ARS?s Scientific Manuscript database

    Cotton, grown mainly for its fiber, is a major crop in several developing and developed countries all across the globe. In 2011, 48.8 million metric tons (MMT) of cottonseed was produced worldwide as a by-product of the 26.1 MMT of cotton lint production (FAO Statistics). This amount of cottonseed, ...

  9. Recommending images of user interests from the biomedical literature

    NASA Astrophysics Data System (ADS)

    Clukey, Steven; Xu, Songhua

    2013-03-01

    Every year hundreds of thousands of biomedical images are published in journals and conferences. Consequently, finding images relevant to one's interests becomes an ever more daunting task. This vast amount of literature creates a need for intelligent and easy-to-use tools that can help researchers effectively navigate through the content corpus and conveniently locate materials of interest. Traditionally, literature search tools allow users to query content using topic keywords. However, manual query composition is often time and energy consuming. A better system would be one that can automatically deliver relevant content to a researcher without requiring the end user to manually express their search intent and interests via search queries. Such computer-aided assistance for information access can be provided by a system that first determines a researcher's interests automatically and then recommends images relevant to those interests accordingly. The technology can greatly improve a researcher's ability to stay up to date in their fields of study by allowing them to efficiently browse images and documents matching their needs and interests among the vast amount of biomedical literature. A prototype system implementation of the technology can be accessed via http://www.smartdataware.com.
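
    A minimal sketch of the interest-matching idea described above, under simplifying assumptions: the researcher's interests are represented by a TF-IDF profile built from texts they have authored or viewed, and candidate figure captions are ranked by cosine similarity to that profile. The documents, captions, and function name are illustrative placeholders rather than parts of the published system.

        # Illustrative sketch: rank figure captions by similarity to a researcher's
        # text profile. All inputs below are placeholder examples.
        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        def rank_images_by_interest(user_documents, image_captions, top_k=5):
            """Return (caption index, score) pairs, most relevant first."""
            vectorizer = TfidfVectorizer(stop_words="english")
            # One shared vocabulary for the user's documents and the captions.
            matrix = vectorizer.fit_transform(list(user_documents) + list(image_captions))
            profile = np.asarray(matrix[:len(user_documents)].mean(axis=0))  # interest profile
            caption_vecs = matrix[len(user_documents):]
            scores = cosine_similarity(profile, caption_vecs).ravel()
            order = np.argsort(-scores)[:top_k]
            return [(int(i), float(scores[i])) for i in order]

        papers = ["deep learning for tumor segmentation in brain MRI",
                  "convolutional networks for radiology image analysis"]
        captions = ["Figure 2: segmentation results on brain MRI",
                    "Figure 1: phylogenetic tree of sampled bacteria"]
        print(rank_images_by_interest(papers, captions, top_k=1))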

  10. Selection platforms for directed evolution in synthetic biology

    PubMed Central

    Tizei, Pedro A.G.; Csibra, Eszter; Torres, Leticia; Pinheiro, Vitor B.

    2016-01-01

    Life on Earth is incredibly diverse. Yet, underneath that diversity, there are a number of constants and highly conserved processes: all life is based on DNA and RNA; the genetic code is universal; biology is limited to a small subset of potential chemistries. A vast amount of knowledge has been accrued through describing and characterizing enzymes, biological processes and organisms. Nevertheless, much remains to be understood about the natural world. One of the goals in Synthetic Biology is to recapitulate biological complexity from simple systems made from biological molecules–gaining a deeper understanding of life in the process. Directed evolution is a powerful tool in Synthetic Biology, able to bypass gaps in knowledge and capable of engineering even the most highly conserved biological processes. It encompasses a range of methodologies to create variation in a population and to select individual variants with the desired function–be it a ligand, enzyme, pathway or even whole organisms. Here, we present some of the basic frameworks that underpin all evolution platforms and review some of the recent contributions from directed evolution to synthetic biology, in particular methods that have been used to engineer the Central Dogma and the genetic code. PMID:27528765

  11. Selection platforms for directed evolution in synthetic biology.

    PubMed

    Tizei, Pedro A G; Csibra, Eszter; Torres, Leticia; Pinheiro, Vitor B

    2016-08-15

    Life on Earth is incredibly diverse. Yet, underneath that diversity, there are a number of constants and highly conserved processes: all life is based on DNA and RNA; the genetic code is universal; biology is limited to a small subset of potential chemistries. A vast amount of knowledge has been accrued through describing and characterizing enzymes, biological processes and organisms. Nevertheless, much remains to be understood about the natural world. One of the goals in Synthetic Biology is to recapitulate biological complexity from simple systems made from biological molecules-gaining a deeper understanding of life in the process. Directed evolution is a powerful tool in Synthetic Biology, able to bypass gaps in knowledge and capable of engineering even the most highly conserved biological processes. It encompasses a range of methodologies to create variation in a population and to select individual variants with the desired function-be it a ligand, enzyme, pathway or even whole organisms. Here, we present some of the basic frameworks that underpin all evolution platforms and review some of the recent contributions from directed evolution to synthetic biology, in particular methods that have been used to engineer the Central Dogma and the genetic code. © 2016 The Author(s).

  12. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction: Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804
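
    As a hedged illustration of the kind of integration described above (not any specific method from the review), the sketch below combines synthetic quantitative imaging features with synthetic clinical variables in a single supervised model and reports cross-validated discrimination. The feature names, the gradient-boosting classifier, and the data are all placeholder assumptions.

        # Generic stand-in: fuse imaging-derived and clinical features in one model.
        # All data here are synthetic; no clinical conclusion can be drawn from them.
        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_patients = 500
        imaging = rng.normal(size=(n_patients, 20))   # e.g., perfusion/volumetric metrics
        clinical = rng.normal(size=(n_patients, 5))   # e.g., age, risk factors
        X = np.hstack([imaging, clinical])
        y = rng.integers(0, 2, size=n_patients)       # synthetic outcome label

        model = GradientBoostingClassifier(random_state=0)
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print("cross-validated AUC on synthetic data:", auc.mean())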

  13. Intelligent Decisions Need Intelligent Choice of Models and Data - a Bayesian Justifiability Analysis for Models with Vastly Different Complexity

    NASA Astrophysics Data System (ADS)

    Nowak, W.; Schöniger, A.; Wöhling, T.; Illman, W. A.

    2016-12-01

    Model-based decision support requires justifiable models with good predictive capabilities. This, in turn, calls for a fine adjustment between predictive accuracy (small systematic model bias that can be achieved with rather complex models), and predictive precision (small predictive uncertainties that can be achieved with simpler models with fewer parameters). The implied complexity/simplicity trade-off depends on the availability of informative data for calibration. If not available, additional data collection can be planned through optimal experimental design. We present a model justifiability analysis that can compare models of vastly different complexity. It rests on Bayesian model averaging (BMA) to investigate the complexity/performance trade-off dependent on data availability. Then, we disentangle the complexity component from the performance component. We achieve this by replacing actually observed data by realizations of synthetic data predicted by the models. This results in a "model confusion matrix". Based on this matrix, the modeler can identify the maximum model complexity that can be justified by the available (or planned) amount and type of data. As a side product, the matrix quantifies model (dis-)similarity. We apply this analysis to aquifer characterization via hydraulic tomography, comparing four models with a vastly different number of parameters (from a homogeneous model to geostatistical random fields). As a testing scenario, we consider hydraulic tomography data. Using subsets of these data, we determine model justifiability as a function of data set size. The test case shows that geostatistical parameterization requires a substantial amount of hydraulic tomography data to be justified, while a zonation-based model can be justified with more limited data set sizes. The actual model performance (as opposed to model justifiability), however, depends strongly on the quality of prior geological information.
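
    The bookkeeping behind the "model confusion matrix" can be sketched as follows, under strong simplifying assumptions: each candidate model generates noisy synthetic data sets, each synthetic data set is scored against all candidates, and the resulting posterior weights are averaged row by row. The toy polynomial models, the Gaussian error likelihood with equal priors, and the fixed (uncalibrated) model predictions are placeholders for the full Bayesian model averaging workflow described in this record.

        # Sketch of a model confusion matrix: row = model that generated the synthetic
        # data, column = average posterior weight assigned to each candidate model.
        # Gaussian errors, equal priors, and fixed predictions are simplifications.
        import numpy as np

        class PolynomialModel:
            """Toy 'model': fixed-coefficient polynomial evaluated at shared x locations."""
            def __init__(self, coeffs, x):
                self.coeffs, self.x = np.asarray(coeffs, float), np.asarray(x, float)
            def predict(self):
                return np.polyval(self.coeffs, self.x)

        def posterior_weights(data, models, sigma=1.0):
            """Posterior model probabilities under iid Gaussian errors and equal priors."""
            log_liks = np.array([-0.5 * np.sum((data - m.predict()) ** 2) / sigma**2
                                 for m in models])
            w = np.exp(log_liks - log_liks.max())
            return w / w.sum()

        def model_confusion_matrix(models, n_realizations=200, sigma=1.0, seed=0):
            """Row i: average posterior weights when model i generated the synthetic data."""
            rng = np.random.default_rng(seed)
            k = len(models)
            cm = np.zeros((k, k))
            for i, generator in enumerate(models):
                truth = generator.predict()
                for _ in range(n_realizations):
                    synthetic = truth + rng.normal(0.0, sigma, size=truth.shape)
                    cm[i] += posterior_weights(synthetic, models, sigma)
            return cm / n_realizations

        x = np.linspace(0.0, 1.0, 25)
        candidates = [PolynomialModel([0.5], x),             # constant: simplest
                      PolynomialModel([1.0, 0.5], x),        # linear
                      PolynomialModel([2.0, 1.0, 0.5], x)]   # quadratic: most complex
        print(model_confusion_matrix(candidates, sigma=0.5))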

  14. Mining manufacturing data for discovery of high productivity process characteristics.

    PubMed

    Charaniya, Salim; Le, Huong; Rangwala, Huzefa; Mills, Keri; Johnson, Kevin; Karypis, George; Hu, Wei-Shou

    2010-06-01

    Modern manufacturing facilities for bioproducts are highly automated with advanced process monitoring and data archiving systems. The time dynamics of hundreds of process parameters and outcome variables over a large number of production runs are archived in the data warehouse. This vast amount of data is a vital resource to comprehend the complex characteristics of bioprocesses and enhance production robustness. Cell culture process data from 108 'trains' comprising production as well as inoculum bioreactors from Genentech's manufacturing facility were investigated. Each run comprises over one hundred on-line and off-line temporal parameters. A kernel-based approach combined with a maximum margin-based support vector regression algorithm was used to integrate all the process parameters and develop predictive models for a key cell culture performance parameter. The model was also used to identify and rank process parameters according to their relevance in predicting process outcome. Evaluation of cell culture stage-specific models indicates that production performance can be reliably predicted days prior to harvest. Strong associations between several temporal parameters at various manufacturing stages and final process outcome were uncovered. This model-based data mining represents an important step forward in establishing process data-driven knowledge discovery in bioprocesses. Implementation of this methodology on the manufacturing floor can facilitate a real-time decision-making process and thereby improve the robustness of large-scale bioprocesses. 2010 Elsevier B.V. All rights reserved.
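
    A rough stand-in for the prediction-and-ranking step described above (not the kernel construction used in the study): a support vector regression model with an RBF kernel is fitted to per-run summary features, and parameters are then ranked by permutation importance. The run count, features, and response below are synthetic placeholders.

        # Illustrative workflow: SVR on per-run process features, then rank the
        # features by permutation importance. Data are synthetic placeholders.
        import numpy as np
        from sklearn.svm import SVR
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.inspection import permutation_importance
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n_runs, n_params = 108, 30                 # one row per production run
        X = rng.normal(size=(n_runs, n_params))    # summarized temporal parameters
        y = 2.0 * X[:, 0] + X[:, 3] - 0.5 * X[:, 7] + rng.normal(scale=0.3, size=n_runs)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
        model.fit(X_tr, y_tr)
        print("held-out R^2:", model.score(X_te, y_te))

        # Rank parameters by how much shuffling each one degrades the prediction.
        imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
        print("most relevant parameter indices:", np.argsort(-imp.importances_mean)[:5])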

  15. Science For Sendai - Bridging the gap between research and application

    NASA Astrophysics Data System (ADS)

    Rees, J.

    2015-12-01

    Disasters have an enormous cost in lives and livelihoods, but the use of rigorous evidence-based scientific approaches to minimise their impact remains poor. Vast amounts of science that could be readily applied to disaster risk reduction (DRR) are under-utilised, if used at all. Previous international agreements have failed to change this picture, but there is a clear call from the international community that the 2015 Sendai framework should make a difference; it is thus re-appraising how to bridge the chasm that exists between DRR-relevant scientists and potential users of their research. There is widespread recognition of the need for risk-affected countries and communities to engage in science-based decision-making, but several barriers, such as a lack of infrastructure or of the necessary skills, institutions, and enforcement of science-based policies, require significant attention. There are now incentives for governments to respond: the framework has science embedded throughout and it sets out national targets against which science uptake can be monitored; similarly, widening access to insurance also demands sound science. Advances such as open data and models, increasing computational capacity, expanding networks, evolving diverse mobile technologies and the other multiple facets of the big data agenda should also drive change. So, how does the scientific community need to adapt? Whilst vast amounts of 'DRR-relevant' science have been produced, too little of it can be readily used in DRR science. Much remains highly disciplinary and focused on analysis of limited distributions or single processes with a small number of agents; by contrast, real-world DRR problems are commonly complex, with multiple drivers and uncertainties. There is a major need for a trans-disciplinary, DRR-focused risk research agenda to evolve. Not only do research funders need to develop and resource risk research, but researchers themselves need to recognise that focusing on the bigger risk picture is commonly more important than addressing the traditional disciplinary topics with which they have commonly engaged.

  16. The Need for Deeper Hydrology

    NASA Astrophysics Data System (ADS)

    Fogg, G. E.

    2016-12-01

    Hydrologists often compartmentalize subsurface fluid systems into soil, vadose zone, and groundwater even though such entities are all part of a dynamic continuum. Similarly, hydrogeologists mainly study the fresh groundwater that is essential to water resources upon which humans and ecosystems depend. While vast amounts of these fresh groundwater resources are in sedimentary basins, many of those basins contain vast amounts of saline groundwater and petroleum underneath the freshwater. Contrary to popular assumptions in the hydrogeology and petroleum communities, the saline groundwater and petroleum resources are not stagnant, but migrate in response to Tothian, topographically driven flow as well as other driving forces controlled by thermal, density and geomechanical processes. Importantly, the transition between fresh and saline groundwater does not necessarily represent a boundary between deep, stagnant groundwater and shallower, circulating groundwater. The deep groundwater is part of the subsurface fluid continuum, and exploitation of saline aquifer systems for conventional and unconventional (e.g., fracking) petroleum production or for injection of waste fluids should be done with some knowledge of the integrated fresh and saline water hydrogeologic system. Without sufficient knowledge of the deep and shallow hydrogeology, there will be significant uncertainty about the possible impacts of injection and petroleum extraction activities on overlying fresh groundwater quality and quantity. When significant uncertainty like this exists in science, public and scientific perceptions of consequences swing wildly from one extreme to another. Accordingly, professional and lay opinions on fracking range from predictions of doom to predictions of zero impact. This erratic range of opinions stems directly from the scientific uncertainty about hydrogeologic interactions between shallow and deep hydrogeologic systems. To responsibly manage both the fresh and saline, petroliferous groundwater resources, a new era of whole-system characterization is needed that integrates deep and shallow geologic and hydrogeologic models and data, including aquifer-aquitard frameworks, head and pressure in space and time, and hydrogeochemistry.

  17. A method to predict equilibrium conditions of gas hydrate formation in porous media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clarke, M.A.; Pooladi-Darvish, M.; Bishnoi, P.R.

    1999-06-01

    In the petroleum industry, it is desirable to avoid the formation of gas hydrates. When gas hydrates form, they tend to agglomerate and block pipelines and process equipment. However, naturally occurring gas hydrates that form in the permafrost region or in deep oceans represent a vast untouched natural gas reserve. Although the exact amount of gas in the hydrate form is not known, it is believed to be comparable to the known amount of gas in the free state. Numerous methods for the recovery of natural gas from hydrate fields have been proposed. These techniques include thermal decomposition, depressurization, and chemical injection. To fully exploit hydrate reserves, it will be necessary to know the decomposition/formation conditions of the gas hydrate in porous media. A predictive model has been developed to determine the incipient hydrate formation conditions in porous media. The only additional information that is needed to determine the incipient hydrate formation conditions is the pore radius, surface energy per unit area, and wetting angle. It was found that the model performed well in predicting the experimental data of Handa and Stupin.
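
    For orientation, a commonly used Gibbs-Thomson-type estimate (a generic capillary correction in the same spirit as, but not necessarily identical to, the model in this record) relates the depression of the hydrate equilibrium temperature in a cylindrical pore of radius r to exactly the quantities listed above:

        \Delta T \;=\; T_{\mathrm{bulk}} - T_{\mathrm{pore}} \;\approx\; \frac{2\,\sigma\,\cos\theta\, T_{\mathrm{bulk}}}{r\,\rho_h\,\Delta H_d}

    where \sigma is the hydrate-water surface energy per unit area, \theta the wetting angle, \rho_h the hydrate density, and \Delta H_d the specific enthalpy of dissociation; smaller pores therefore require lower temperatures (or higher pressures) for incipient hydrate formation.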

  18. Scaling-up Sustainable Land Management Practices through the Concept of the Rural Resource Centre: Reconciling Farmers' Interests with Research Agendas

    ERIC Educational Resources Information Center

    Takoutsing, Bertin; Tchoundjeu, Zacharie; Degrande, Ann; Asaah, Ebenezar; Tsobeng, Alain

    2014-01-01

    Purpose: Formal agricultural research has generated vast amounts of knowledge and fundamental insights on land management, but their low adoption has been attributed to the use of the public extension approach. This research aims to address whether and how full participation of farmers through the concept of the Rural Resource Centre (RRC) provides new…

  19. Blog, Chat, Edit, Text, or Tweet? Using Online Tools to Advance Adult Civic Engagement

    ERIC Educational Resources Information Center

    Black, Laura W.

    2012-01-01

    The rapid growth of the Internet over the past 20 years has brought with it a vast amount of communication about public issues of concern to citizens. Over time, the Internet has become increasingly social and interactive. This move from "Web 1.0" to "Web 2.0" involves a great deal of user-generated content and interaction. This transition has…

  20. The effect of water table fluctuation on soil respiration in a lower coastal plain forested wetland in the southeastern U.S.

    Treesearch

    Guofang Miao; Asko Noormets; Jean-Christophe Domec; Carl C. Trettin; Steve G. McNulty; Ge Sun; John S. King

    2013-01-01

    Anthropogenic and environmental pressures on wetland hydrology may trigger changes in carbon (C) cycling, potentially exposing vast amounts of soil C to rapid decomposition. We measured soil CO2 efflux (Rs) continuously from 2009 to 2010 in a lower coastal plain forested wetland in North Carolina, U.S., to characterize its...

  1. Voices from Denali: "it's bigger than wilderness"

    Treesearch

    Alan E. Watson; Katie Knotek; Neal Christensen

    2005-01-01

    Denali National Park and Preserve, at over 6 million acres (2.5 million ha), contains the highest point in North America. Mount McKinley, at more than 20,000 feet (more than 6,000 m) above sea level, watches over thousands of caribou, moose, packs of wolves, grizzly bears, and Dall sheep, as well as many other mountains and a vast amount of rare plant life. Research was...

  2. "I Guess It Was Pretty Fun": Using WebQuests in the Middle School Classroom.

    ERIC Educational Resources Information Center

    Lipscomb, George

    2003-01-01

    Notes that the WebQuest helps students harness the vast amount of on-line resources available. Presents a list of 10 suggestions that may help teachers unfamiliar with WebQuests, especially those in the history classroom, to use them more effectively. Concludes that students learned a great deal about the Civil War by doing the WebQuest, and the…

  3. Bringing 21st Century Learning to the High School Classroom: Program Evaluation on Pedagogical Change

    ERIC Educational Resources Information Center

    Travis, Michael G.

    2013-01-01

    Children today are born into a world with endless amounts of information at their fingertips, the ability to instantly connect with others, and smartphones with an app for virtually everything. It is a world that is vastly different than that of their parents or grandparents. As these students sit in classrooms all over the world, their teachers…

  4. Wildland fire emissions, carbon, and climate: Emission factors

    Treesearch

    Shawn Urbanski

    2014-01-01

    While the vast majority of carbon emitted by wildland fires is released as CO2, CO, and CH4, wildland fire smoke is nonetheless a rich and complex mixture of gases and aerosols. Primary emissions include significant amounts of CH4 and aerosol (organic aerosol and black carbon), which are short-lived climate forcers. In addition to CO2 and short-lived climate forcers,...

  5. Education's Data Management Initiative: Significant Progress Made, but Better Planning Needed to Accomplish Project Goals. Report to Congressional Committees. GAO-06-6

    ERIC Educational Resources Information Center

    Bellis, David

    2005-01-01

    As a condition of receiving federal funding for elementary and secondary education programs, states each year provide vast amounts of data to Education. While the need for information that informs evaluation is important (particularly with the No Child Left Behind Act), Education's data gathering has heretofore presented some problems. It has been…

  6. Putting an End to the Battle over Homework.

    ERIC Educational Resources Information Center

    Lacina-Gifford, Lorna J.; Gifford, Russell B.

    2004-01-01

    You would think that the sound of school bells ringing out in dismissal would be a happy one to the ears of students anxious to go home after a long day in classes, but many are finding the work is just beginning after the final bell. Homework may be nothing new, but lately the vast amounts coming home in the book bags and backpacks of students of…

  7. The National Educational Panel Study (NEPS) in Germany: An Overview of Design, Research Options and Access, with a Focus on Lower-Secondary School

    ERIC Educational Resources Information Center

    Strietholt, Rolf; Naujokat, Kerstin; Mai, Tobias; Kretschmer, Sara; Jarsinski, Stephan; Goy, Martin; Frahm, Sarah; Kanders, Michael; Bos, Wilfried; Blatt, Inge

    2013-01-01

    This article introduces the National Educational Panel Study (NEPS). This German longitudinal study produces a vast amount of data for the scientific community, and researchers all around Europe are invited to use the data to address various research questions empirically. Therefore, the authors provide information about the purpose as well as the…

  8. Computer Security: Improvements Needed to Reduce Risk to Critical Federal Operations and Assets

    DTIC Science & Technology

    2001-11-09

    COMPUTER SECURITY: Improvements Needed to Reduce Risk to Critical Federal Operations and Assets. Statement of Robert F. Dacey, Director, Information... The benefits have been enormous. Vast amounts of information are now literally at our fingertips, facilitating research on virtually every topic

  9. Internationalizing the Business School: Constructing Partnership between the Humanities and the Professions during an NEH Grant Project. Marketing Component.

    ERIC Educational Resources Information Center

    Marco, Gayle

    The addition of international concepts in the business school curriculum has been a major thrust of accrediting agencies and the profession at large. While marketers working within the United States have a vast amount of knowledge of their customers, many marketers are "fooled" by the notion that consumers in other countries are the same…

  10. Active Learning in the Classroom: A Muscle Identification Game in a Kinesiology Course

    ERIC Educational Resources Information Center

    McCarroll, Michele L.; Pohle-Krauza, Rachael J.; Martin, Jennifer L.

    2009-01-01

    It is often difficult for educators to teach a kinesiology and applied anatomy (KAA) course due to the vast amount of information that students are required to learn. In this study, a convenient sample of students ("class A") from one section of a KAA course played the speed muscle introduction and matching game, which is loosely based on the…

  11. The Role of Parenting in the Prediction of Criminal Involvement: Findings from a Nationally Representative Sample of Youth and a Sample of Adopted Youth

    ERIC Educational Resources Information Center

    Beaver, Kevin M.; Schwartz, Joseph A.; Connolly, Eric J.; Al-Ghamdi, Mohammed Said; Kobeisy, Ahmed Nezar

    2015-01-01

    The role of parenting in the development of criminal behavior has been the source of a vast amount of research, with the majority of studies detecting statistically significant associations between dimensions of parenting and measures of criminal involvement. An emerging group of scholars, however, has drawn attention to the methodological…

  12. Analysis of Service Records Management Systems for Rescue and Retention of Cultural Resource Documents

    DTIC Science & Technology

    2009-06-01

    this information was not migrated to the new database. The responsible offices were told to destroy the old cards, and thus, vast amounts of... then necessary to examine the online service-specific records management systems, namely the Army Records Information Management System (ARIMS), the Air Force Records Information Management System (AFRIMS), and the Navy Records Management System. Each system

  13. Dead-wood addition promotes non-saproxylic epigeal arthropods but effects are mediated by canopy openness

    Treesearch

    Sebastian Seibold; Claus Bässler; Petr Baldrian; Lena Reinhard; Simon Thorn; Michael D. Ulyshen; Ingmar Weiß; Jörg Müller

    2016-01-01

    Restoring dead-wood amounts in forests is an increasingly and successfully applied conservation measure to counteract negative effects of intensive logging on biodiversity of saproxylic taxa. By contrast, if and how dead-wood addition benefits the vast number of non-saproxylic forest taxa, and how this varies with contextual factors like canopy openness, remains poorly...

  14. How Choice, Co-Creation, and Culture Are Changing What It Means to Be Net Savvy

    ERIC Educational Resources Information Center

    Lorenzo, George; Oblinger, Diana; Dziuban, Charles

    2007-01-01

    The vast amount of readily available information is just one reason for transforming the way research is conducted and knowledge is acquired. The nature of information itself has changed. In text and other formats, information is not just created by experts--it is created and co-created by amateurs. More than ever before, people can choose what,…

  15. Making Sense of Learner and Learning Big Data: Reviewing Five Years of Data Wrangling at the Open University UK

    ERIC Educational Resources Information Center

    Rienties, Bart; Cross, Simon; Marsh, Vicky; Ullmann, Thomas

    2017-01-01

    Most distance learning institutions collect vast amounts of learning data. Making sense of this 'Big Data' can be a challenge, in particular when data are stored at different data warehouses and require advanced statistical skills to interpret complex patterns of data. As a leading institute on learning analytics, the Open University UK instigated…

  16. Managing Personal and Group Collections of Information

    NASA Technical Reports Server (NTRS)

    Wolfe, Shawn R.; Wragg, Stephen D.; Chen, James R.; Koga, Dennis (Technical Monitor)

    1999-01-01

    The internet revolution has dramatically increased the amount of information available to users. Various tools such as search engines have been developed to help users find the information they need from this vast repository. Users often also need tools to help manipulate the growing amount of useful information they have discovered. Current tools available for this purpose are typically local components of web browsers designed to manage URL bookmarks. They provide limited functionality for handling high information complexity. To tackle this, we have created DIAMS, an agent-based tool to help users or groups manage their information collections and share their collections with others. The main features of DIAMS are described here.

  17. The VAST Challenge: History, Scope, and Outcomes: An introduction to the Special Issue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Grinstein, Georges; Whiting, Mark A.

    2014-10-01

    Visual analytics aims to facilitate human insight from complex data via a combination of visual representations, interaction techniques, and supporting algorithms. To create new tools and techniques that achieve this goal requires that researchers have an understanding of the analytical questions to be addressed, data that illustrate the complexities and ambiguities found in realistic analytic settings, and methods for evaluating whether plausible insights are gained through use of the new methods. However, researchers do not, generally speaking, have access to analysts who can articulate their problems or to operational data that is used for analysis. To fill this gap, the Visual Analytics Science and Technology (VAST) Challenge has been held annually since 2006. The VAST Challenge provides an opportunity for researchers to experiment with realistic but not real problems, using realistic synthetic data with known events embedded. Since its inception, the VAST Challenge has evolved along with the visual analytics research community to pose more complex challenges, ranging from text analysis to video analysis to large-scale network log analysis. The seven years of the VAST Challenge have seen advancements in research and development, education, evaluation, and in the challenge process itself. This special issue of Information Visualization highlights some of the noteworthy advancements in each of these areas. Some of these papers focus on important research questions related to the challenge itself, and other papers focus on innovative research that has been shaped by participation in the challenge. This paper describes the VAST Challenge process and benefits in detail. It also provides an introduction to and context for the remaining papers in the issue.

  18. Hybrid coexpression link similarity graph clustering for mining biological modules from multiple gene expression datasets.

    PubMed

    Salem, Saeed; Ozcaglar, Cagri

    2014-01-01

    Advances in genomic technologies have enabled the accumulation of vast amounts of genomic data, including gene expression data for multiple species under various biological and environmental conditions. Integration of these gene expression datasets is a promising strategy to alleviate the challenges of protein functional annotation and biological module discovery based on a single gene expression dataset, which suffers from spurious coexpression. We propose a joint mining algorithm that constructs a weighted hybrid similarity graph whose nodes are the coexpression links. The weight of an edge between two coexpression links in this hybrid graph is a linear combination of the topological similarities and co-appearance similarities of the corresponding two coexpression links. Clustering the weighted hybrid similarity graph yields recurrent coexpression link clusters (modules). Experimental results on human gene expression datasets show that the reported modules are functionally homogeneous as evidenced by their enrichment with biological process GO terms and KEGG pathways.
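
    A schematic rendering of the hybrid graph construction, under simplifying assumptions: a coexpression link is a gene pair whose absolute correlation exceeds a threshold in a dataset, topological similarity of two links is approximated by their gene overlap, and co-appearance similarity is the Jaccard overlap of the datasets in which both links occur. The thresholds, the mixing weight alpha, and the final clustering step (connected components here) are placeholders for the algorithmic choices made in the paper.

        # Sketch of the hybrid link-similarity graph. Thresholds, the mixing weight
        # and the clustering step are illustrative choices, not the paper's exact ones.
        from itertools import combinations
        import numpy as np
        import networkx as nx

        def coexpression_links(expr, corr_threshold=0.8):
            """Gene pairs with |Pearson correlation| above threshold (expr: genes x samples)."""
            corr = np.corrcoef(expr)
            return {(i, j) for i, j in combinations(range(expr.shape[0]), 2)
                    if abs(corr[i, j]) >= corr_threshold}

        def jaccard(a, b):
            return len(a & b) / len(a | b) if a | b else 0.0

        def hybrid_link_graph(datasets, alpha=0.5, edge_threshold=0.3):
            per_dataset = [coexpression_links(d) for d in datasets]
            all_links = set.union(*per_dataset)
            appears_in = {l: {k for k, s in enumerate(per_dataset) if l in s} for l in all_links}
            g = nx.Graph()
            g.add_nodes_from(all_links)
            for l1, l2 in combinations(all_links, 2):
                topo = jaccard(set(l1), set(l2))                 # simplified: shared genes
                coapp = jaccard(appears_in[l1], appears_in[l2])  # shared datasets
                w = alpha * topo + (1 - alpha) * coapp
                if w >= edge_threshold:
                    g.add_edge(l1, l2, weight=w)
            # Stand-in for the clustering step: connected components as modules.
            return [sorted(c) for c in nx.connected_components(g)]

        rng = np.random.default_rng(0)
        datasets = [rng.normal(size=(30, 12)) for _ in range(4)]   # 4 synthetic studies
        print(hybrid_link_graph(datasets, alpha=0.6))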

  19. Large-scale additive manufacturing with bioinspired cellulosic materials.

    PubMed

    Sanandiya, Naresh D; Vijay, Yadunund; Dimopoulou, Marina; Dritsas, Stylianos; Fernandez, Javier G

    2018-06-05

    Cellulose is the most abundant and broadly distributed organic compound and industrial by-product on Earth. However, despite decades of extensive research, the bottom-up use of cellulose to fabricate 3D objects is still plagued with problems that restrict its practical applications: derivatives with vast polluting effects, use in combination with plastics, lack of scalability and high production cost. Here we demonstrate the general use of cellulose to manufacture large 3D objects. Our approach diverges from the common association of cellulose with green plants and it is inspired by the wall of the fungus-like oomycetes, which is reproduced introducing small amounts of chitin between cellulose fibers. The resulting fungal-like adhesive material(s) (FLAM) are strong, lightweight and inexpensive, and can be molded or processed using woodworking techniques. We believe this first large-scale additive manufacture with ubiquitous biological polymers will be the catalyst for the transition to environmentally benign and circular manufacturing models.

  20. Methane for Power Generation in Muaro Jambi: A Green Prosperity Model Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moriarty, K.; Elchinger, M.; Hill, G.

    2014-07-01

    NREL conducted eight model projects for the Millennium Challenge Corporation's (MCC) Compact with Indonesia. Green Prosperity, the largest project of the Compact, seeks to address critical constraints to economic growth while supporting the Government of Indonesia's commitment to a more sustainable, less carbon-intensive future. This study evaluates electricity generation from the organic content of wastewater at a palm oil mill in Muaro Jambi, Sumatra. Palm mills use vast amounts of water in the production process, resulting in problematic wastewater called palm oil mill effluent (POME). The POME releases methane to the atmosphere in open ponds, which could be covered to capture the methane and produce renewable electricity for rural villages. The study uses average Indonesia data to determine the economic viability of methane capture at a palm oil mill and also evaluates the technology as well as social and environmental impacts of the project.

  1. The science, technology and research network (STARNET) a searchable thematic compilation of web resources

    USGS Publications Warehouse

    Blados, W.R.; Cotter, G.A.; Hermann, T.

    2007-01-01

    International alliances in space efforts have resulted in a more rapid diffusion of space technology. This, in turn, increases pressure on organizations to push forward with technological developments and to take steps to maximize their inclusion in the research and development (R&D) process and the overall advancement and enhancement of space technology. To cope with the vast and rapidly growing amount of data and information that is vital to the success of innovation, the Information Management Committee (IMC) of the Research Technology Agency (RTA) developed the science, technology and research network (STARNET). The purpose of this network is to facilitate access to worldwide information elements in terms of science, technology and overall research. It provides a virtual library with special emphasis on international security; a "one stop" information resource for policy makers, program managers, scientists, engineers, researchers and others. © 2007 IEEE.

  2. Teaching artificial intelligence to read electropherograms.

    PubMed

    Taylor, Duncan; Powers, David

    2016-11-01

    Electropherograms are produced in great numbers in forensic DNA laboratories as part of everyday criminal casework. Before the results of these electropherograms can be used, they must be scrutinised by analysts to determine what the identified data tell us about the underlying DNA sequences and what is purely an artefact of the DNA profiling process. A technique that lends itself well to such a classification task in the face of vast amounts of data is the use of artificial neural networks. These networks, inspired by the workings of the human brain, have been increasingly successful in analysing large datasets, performing medical diagnoses, identifying handwriting, playing games, or recognising images. In this work we demonstrate the use of an artificial neural network which we train to 'read' electropherograms and show that it can generalise to unseen profiles. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
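
    A toy version of the classification task described above, using invented peak-level features (height, width, and a stutter ratio) and a synthetic labelling rule: a small feedforward network labels each detected peak as allele or artefact. The feature set, the architecture, and the data are assumptions made for illustration only; the published network and its training data are far richer.

        # Toy stand-in: classify electropherogram peaks as allele vs artefact using a
        # small feedforward network. Features, labels and data are synthetic.
        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(42)
        n_peaks = 2000
        height = rng.uniform(50, 4000, n_peaks)          # peak height in RFU (synthetic)
        width = rng.uniform(0.5, 2.0, n_peaks)           # peak width in bases (synthetic)
        stutter_ratio = rng.uniform(0.0, 0.4, n_peaks)   # height relative to parent peak
        X = np.column_stack([height, width, stutter_ratio])
        # Synthetic labelling rule purely for the demo: tall, low-stutter peaks = allele.
        y = ((height > 300) & (stutter_ratio < 0.15)).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        net = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
        net.fit(X_tr, y_tr)
        print("held-out accuracy on synthetic data:", net.score(X_te, y_te))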

  3. Subclinical magnesium deficiency: a principal driver of cardiovascular disease and a public health crisis

    PubMed Central

    DiNicolantonio, James J; Wilson, William

    2018-01-01

    Because serum magnesium does not reflect intracellular magnesium, the latter making up more than 99% of total body magnesium, most cases of magnesium deficiency are undiagnosed. Furthermore, because of chronic diseases, medications, decreases in food crop magnesium contents, and the availability of refined and processed foods, the vast majority of people in modern societies are at risk for magnesium deficiency. Certain individuals will need to supplement with magnesium in order to prevent suboptimal magnesium deficiency, especially if trying to obtain an optimal magnesium status to prevent chronic disease. Subclinical magnesium deficiency increases the risk of numerous types of cardiovascular disease, costs nations around the world an incalculable amount of healthcare costs and suffering, and should be considered a public health crisis. That an easy, cost-effective strategy exists to prevent and treat subclinical magnesium deficiency should provide an urgent call to action. PMID:29387426

  4. Current status of accreditation for drug testing in hair.

    PubMed

    Cooper, Gail; Moeller, Manfred; Kronstrand, Robert

    2008-03-21

    At the annual meeting of the Society of Hair Testing in Vadstena, Sweden in 2006, a committee was appointed to address the issue of guidelines for hair testing and to assess the current status of accreditation amongst laboratories offering drug testing in hair. A short questionnaire was circulated amongst the membership and interested parties. Fifty-two responses were received from hair testing laboratories providing details on the amount and type of hair tests they offered and the status of accreditation within their facilities. Although the vast majority of laboratories follow current guidelines (83%), only nine laboratories were accredited to ISO/IEC 17025 for hair testing. A significant number of laboratories reported that they were in the process of developing quality systems with a view to accrediting their methods within 2-3 years. This study provides an insight into the status of accreditation in hair testing laboratories and supports the need for guidelines to encourage best practice.

  5. Understanding mutagenesis through delineation of mutational signatures in human cancer

    DOE PAGES

    Petljak, Mia; Alexandrov, Ludmil B.

    2016-05-04

    Each individual cell within a human body acquires a certain number of somatic mutations during the course of its lifetime. These mutations originate from a wide spectrum of both endogenous and exogenous mutational processes that leave distinct patterns of mutations, termed mutational signatures, embedded within the genomes of all cells. In recent years, the vast amount of data produced by sequencing of cancer genomes was coupled with novel mathematical models and computational tools to generate the first comprehensive map of mutational signatures in human cancer. To date, >30 distinct mutational signatures have been identified, and etiologies have been proposed for many of them. This paper provides a brief historical background on the examination of mutational patterns in human cancer, summarizes the knowledge accumulated since the introduction of the concept of mutational signatures and discusses their future potential applications and perspectives within the field.
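
    The decomposition step behind mutational-signature analysis is commonly framed as non-negative matrix factorization of a mutation-catalogue matrix (96 trinucleotide mutation contexts by tumours) into signatures and per-tumour exposures. The sketch below shows that generic framing on a random synthetic catalogue; it illustrates the shapes involved, not the authors' actual pipeline or data.

        # Generic NMF framing of signature extraction: catalogue ~ signatures x exposures.
        # The catalogue here is random synthetic data used only to show the shapes.
        import numpy as np
        from sklearn.decomposition import NMF

        rng = np.random.default_rng(7)
        n_contexts, n_tumours, n_signatures = 96, 50, 5
        catalogue = rng.poisson(lam=5.0, size=(n_contexts, n_tumours)).astype(float)

        model = NMF(n_components=n_signatures, init="nndsvda", max_iter=1000, random_state=0)
        signatures = model.fit_transform(catalogue)   # 96 x n_signatures
        exposures = model.components_                 # n_signatures x n_tumours

        # Normalize each signature so its 96 context weights sum to one
        # (exposures would be rescaled by the same factors for an exact reconstruction).
        signatures /= signatures.sum(axis=0, keepdims=True)
        print("reconstruction error:", model.reconstruction_err_)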

  6. Plasticity of gastro-intestinal vagal afferent endings.

    PubMed

    Kentish, Stephen J; Page, Amanda J

    2014-09-01

    Vagal afferents are a vital link between the peripheral tissue and the central nervous system (CNS). There is an abundance of vagal afferents within the proximal gastrointestinal tract which are responsible for monitoring and controlling gastrointestinal function. Whilst vagal afferents are essential for maintaining homeostasis, a vast amount of emerging literature describes their remarkable plasticity in response to endogenous as well as exogenous stimuli. This plasticity is for the most part vital in maintaining healthy processes; however, there are increasing reports of vagal plasticity being disrupted in pathological states, such as obesity. Many of the disruptions observed in obesity have the potential to reduce vagal afferent satiety signalling, which could ultimately perpetuate the obese state. Understanding how plasticity occurs within vagal afferents will open up a whole new understanding of gut function as well as identify new treatment options for obesity. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. SEMCARE: Multilingual Semantic Search in Semi-Structured Clinical Data.

    PubMed

    López-García, Pablo; Kreuzthaler, Markus; Schulz, Stefan; Scherr, Daniel; Daumke, Philipp; Markó, Kornél; Kors, Jan A; van Mulligen, Erik M; Wang, Xinkai; Gonna, Hanney; Behr, Elijah; Honrado, Ángel

    2016-01-01

    The vast amount of clinical data in electronic health records constitutes a great potential for secondary use. However, most of this content consists of unstructured or semi-structured texts, which are difficult to process. Several challenges are still pending: medical language idiosyncrasies in different natural languages, and the large variety of medical terminology systems. In this paper we present SEMCARE, a European initiative designed to minimize these problems by providing a multilingual platform (English, German, and Dutch) that allows users to express complex queries and obtain relevant search results from clinical texts. SEMCARE is based on a selection of adapted biomedical terminologies, together with Apache UIMA and Apache Solr as open-source, state-of-the-art natural language pipeline and indexing technologies. SEMCARE has been deployed and is currently being tested at three medical institutions in the UK, Austria, and the Netherlands, showing promising results in a cardiology use case.

  8. Ground Truth Creation for Complex Clinical NLP Tasks - an Iterative Vetting Approach and Lessons Learned.

    PubMed

    Liang, Jennifer J; Tsou, Ching-Huei; Devarakonda, Murthy V

    2017-01-01

    Natural language processing (NLP) holds the promise of effectively analyzing patient record data to reduce the cognitive load on physicians and clinicians in patient care, clinical research, and hospital operations management. A critical need in developing such methods is the "ground truth" dataset needed for training and testing the algorithms. Beyond localizable, relatively simple tasks, ground truth creation is a significant challenge because medical experts, just as physicians in patient care, have to assimilate vast amounts of data in EHR systems. To mitigate the potential inaccuracies arising from these cognitive challenges, we present an iterative vetting approach for creating the ground truth for complex NLP tasks. In this paper, we present the methodology, and report on its use for an automated problem list generation task, its effect on ground truth quality and system accuracy, and lessons learned from the effort.

  9. Semantic Analysis of Email Using Domain Ontologies and WordNet

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Keller, Richard M.

    2005-01-01

    The problem of capturing and accessing knowledge in paper form has been supplanted by a problem of providing structure to vast amounts of electronic information. Systems that can construct semantic links for natural language documents like email messages automatically will be a crucial element of semantic email tools. We have designed an information extraction process that can leverage the knowledge already contained in an existing semantic web, recognizing references in email to existing nodes in a network of ontology instances by using linguistic knowledge and knowledge of the structure of the semantic web. We developed a heuristic score that uses several forms of evidence to detect references in email to existing nodes in the Semanticorganizer repository's network. While these scores cannot directly support automated probabilistic inference, they can be used to rank nodes by relevance and link those deemed most relevant to email messages.
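
    An illustrative sketch of the evidence-combination idea described above: several independent signals that an email references an ontology node are combined into a single heuristic score used only for ranking. The evidence types, weights, and node structure below are hypothetical stand-ins, not the authors' actual scoring function.

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        aliases: list = field(default_factory=list)

    def evidence_scores(email_text: str, node: Node) -> dict:
        """Compute several simple forms of evidence linking an email to a node."""
        text = email_text.lower()
        return {
            "exact_name_match": 1.0 if node.name.lower() in text else 0.0,
            "alias_match": 1.0 if any(a.lower() in text for a in node.aliases) else 0.0,
            "token_overlap": len(set(node.name.lower().split()) & set(text.split()))
                             / max(len(node.name.split()), 1),
        }

    WEIGHTS = {"exact_name_match": 3.0, "alias_match": 2.0, "token_overlap": 1.0}  # illustrative

    def rank_nodes(email_text: str, nodes: list) -> list:
        """Order candidate nodes by combined heuristic score (not a probability)."""
        scored = [(sum(WEIGHTS[k] * v for k, v in evidence_scores(email_text, n).items()), n)
                  for n in nodes]
        return [n for score, n in sorted(scored, key=lambda t: t[0], reverse=True)]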

  10. Industrial use of land observation satellite systems

    NASA Technical Reports Server (NTRS)

    Henderson, F. B., III

    1984-01-01

    The principal industrial users of land observation satellite systems are the geological industries: oil/gas, mining, and engineering/environmental companies. The primary system used is LANDSAT/MSS. Currently, use is also being made of the limited amounts of SKYLAB photography, SEASAT and SIR-A radar, and the new LANDSAT/TM data available. Although considered experimental, LANDSAT data is now used operationally by several hundred exploration and engineering companies worldwide as a vastly improved geological mapping tool to help direct more expensive geophysical and drilling phases, leading to more efficient decision-making and results. Future needs include global LANDSAT/TM; higher spatial resolution; stereo and radar; improved data handling, processing, distribution, and archiving systems; and integrated geographical information systems (GIS). For a promising future, governments must provide overall continuity (government and/or private sector) of such systems, ensure continued government R and D, and commit to operating internationally under the civil Open Skies policy.

  11. Extraction of gold (Au) particles from sea water by Delftia Acidovorans microbes

    NASA Astrophysics Data System (ADS)

    Yusoff, A. H. M.; Nading, M. E.; Salimi, M. N.

    2017-10-01

    Gold-mining activities have long been an issue, particularly because they involve contamination by chemicals such as arsenic and mercury. Despite these hazards, gold mining continues because gold remains valuable regardless of the problems. Mining requires a very large area of land or plant site, together with vast amounts of labor, mechanical power and funding to keep the process running. The high demand for gold has made gold mining one of the most profitable industries in the world, and this has encouraged the search for alternative ways to extract gold. At mining sites, researchers have found that biomineralization of gold by Delftia acidovorans can occur, although the mechanism is not yet fully understood; the bacterium is thought to secrete a secondary metabolite, delftibactin, as a defensive mechanism against the toxicity of soluble gold. Researchers are also looking for sources of elemental gold other than the Earth's core, the main options being volcanic lava and the ocean; here, the focus is seawater. A difficulty with seawater is that its composition has not yet been firmly established. Dissolved gold exists as gold chloride in seawater, but only in very small amounts, so the separation of gold must be the focus if the process is to succeed. Factors such as depth, climate, region and temperature need to be considered, and if these differences affect the separation process, a standardized seawater composition will have to be proposed.

  12. Analytics to Better Interpret and Use Large Amounts of Heterogeneous Data

    NASA Astrophysics Data System (ADS)

    Mathews, T. J.; Baskin, W. E.; Rinsland, P. L.

    2014-12-01

    Data scientists at NASA's Atmospheric Science Data Center (ASDC) are seasoned software application developers who have worked with the creation, archival, and distribution of large datasets (multiple terabytes and larger). In order for ASDC data scientists to effectively implement the most efficient processes for cataloging and organizing data access applications, they must be intimately familiar with the data contained in the datasets with which they are working. Key technologies that are critical components of the background of ASDC data scientists include: large RDBMSs (relational database management systems) and NoSQL databases; web services; service-oriented architectures; structured and unstructured data access; as well as processing algorithms. However, as prices of data storage and processing decrease, sources of data increase, and technologies advance - granting more people access to data in real or near-real time - data scientists are being pressured to accelerate their ability to identify and analyze vast amounts of data. With existing tools this is becoming increasingly challenging to accomplish. For example, the NASA Earth Science Data and Information System (ESDIS) alone grew from just over 4 PB of data in 2009 to nearly 6 PB of data in 2011. This amount then increased to roughly 10 PB of data in 2013. With data from at least ten new missions to be added to the ESDIS holdings by 2017, the current volume will continue to grow exponentially and drive the need to be able to analyze more data even faster. Though there are many highly efficient, off-the-shelf analytics tools available, these tools mainly cater towards business data, which is predominantly unstructured. However, very few known analytics tools interface well with archived Earth science data, which is predominantly heterogeneous and structured. This presentation will identify use cases for data analytics from an Earth science perspective in order to begin to identify specific tools that may be able to address those challenges.

  13. The Serious Joy and the Joyful Work of Play: Children becoming Agentive Actors in Co-Authoring Themselves and Their World through Play

    ERIC Educational Resources Information Center

    Stetsenko, Anna; Ho, Pi-Chun Grace

    2015-01-01

    In most cultures, play seems to matter a great deal to young children. This is evidenced by the vast amount of time children spent playing and the combination of often unsurpassed passion, imagination, and energy which they invest in this activity. This paper explores why play matters through the lens of Bakhtin's dialogic approach combined with…

  14. Natural gas hydrates; vast resource, uncertain future

    USGS Publications Warehouse

    Collett, T.S.

    2001-01-01

    Gas hydrates are naturally occurring icelike solids in which water molecules trap gas molecules in a cagelike structure known as a clathrate. Although many gases form hydrates in nature, methane hydrate is by far the most common; methane is the most abundant natural gas. The volume of carbon contained in methane hydrates worldwide is estimated to be twice the amount contained in all fossil fuels on Earth, including coal.

  15. Iran: Politics, Gulf Security, and U.S. Policy

    DTIC Science & Technology

    2016-03-30

    to produce the added benefit of improving U.S.-Iran relations. However, since the deal was finalized, Iran has tested ballistic missiles and...political and social freedoms and reducing Iran’s international isolation, continues to back state intervention in the economy to benefit workers and lower...is able to generate profits from its business affiliates, which enjoy vast tax and regulatory benefits, and can spend significant amounts of

  16. Biological Features of the Soil: Advanced Crop and Soil Science. A Course of Study.

    ERIC Educational Resources Information Center

    Miller, Larry E.

    The course of study represents the third of six modules in advanced crop and soil science and introduces the agriculture student to biological features of soil. Upon completing the two day lesson, the student will: (1) realize the vast amount of life present in the soil, (2) be able to list representative animal and plant life in the soil by size,…

  17. Information Literacy in the Study of American Politics: Using New Media to Teach Information Literacy in the Political Science Classroom

    ERIC Educational Resources Information Center

    Cope, Jonathan; Flanagan, Richard

    2013-01-01

    Students have access to a vast amount of information about American politics through new media outlets (e.g., the Internet). We survey the perils and promise of this new landscape through a case study of a political science class at the College of Staten Island, City University of New York (CUNY), that examined congressional races in the 2010…

  18. An Integrated Literature Review of the Knowledge Needs of Parents with Children with Special Health Care Needs and of Instruments to Assess These Needs

    ERIC Educational Resources Information Center

    Adler, Kristin; Salanterä, Sanna; Leino­-Kilpi, Helena; Grädel, Barbara

    2015-01-01

    The purpose of this integrative (including both quantitative and qualitative studies) literature review was to identify knowledge needs of parents of a child with special health care needs and to evaluate instruments to assess these needs. The content analysis of 48 publications revealed a vast amount of knowledge needs that were categorized into…

  19. New Initiatives for Electronic Scholarly Publishing: Academic Information Sources on the Internet

    DTIC Science & Technology

    2004-12-01

    parallel with the changing economics of publishing. A strong movement, among researchers and academics ( user community), seeks to free scientific...interface between the user and a vast amount of published and unpublished information (Oppenheim 1997: 398), which was made available in hard copy, via...have implemented facilities that enable the user to exercise clear options for selectively retrieving material (OpCit), to discuss and rank the articles

  20. The Army’s Operational Energy Challenge

    DTIC Science & Technology

    2011-05-01

    battery chargers. Solar Hybrid—a system capable of providing up to 10 kilowatts of power continuously while reducing generator running time by 20...granted. Army vehicles consume unprecedented amounts of fuel for mobility and onboard power. Average fuel demand per soldier has increased from about 1...electric power. This dependence translates to a vulnerability as fuel and water compose the vast majority of resupply volume, which, in turn

  1. Services and the National Information Infrastructure. Report of the Information Infrastructure Task Force Committee on Applications and Technology, Technology Policy Working Group. Draft for Public Comment.

    ERIC Educational Resources Information Center

    Office of Science and Technology Policy, Washington, DC.

    In this report, the National Information Infrastructure (NII) services issue is addressed, and activities to advance the development of NII services are recommended. The NII is envisioned to grow into a seamless web of communications networks, computers, databases, and consumer electronics that will put vast amounts of information at users'…

  2. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  3. Photocatalytic production and processing of conjugated linoleic acid-rich soy oil.

    PubMed

    Jain, Vishal P; Proctor, Andrew

    2006-07-26

    Daily intake of conjugated linoleic acid (CLA), an anticarcinogenic, antiatherosclerotic, antimutagenic agent, and antioxidant, from dairy and meat products is substantially less than estimated required values. The objective of this study was to obtain CLA-rich soybean oil by a customized photochemical reaction system with an iodine catalyst and evaluate the effect of processing on iodine and iodo compounds after adsorption. After 144 h of irradiation, a total CLA yield of 24% (w/w) total oil was obtained with 0.15% (w/w) iodine. Trans,trans isomers (17.5%) formed the majority of the total yield and are also associated with health benefits. The isomers cis-9,trans-11 and trans-10,cis-12 CLA, associated with maximum health benefits, formed approximately 3.5% of the total oil. This amount is quite significant considering that total CLA obtained from dairy sources is only 0.6%. ATR-FTIR, 1H NMR, and GC-MS analyses indicated the absence of peroxide and aldehyde protons, providing evidence that secondary lipid oxidation products were not formed during the photochemical reaction. Adsorption processing vastly reduced the iodine and iodocompounds without CLA loss. Photocatalysis significantly increased the levels of CLA in soybean oil.

  4. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  5. Predicting intensity ranks of peptide fragment ions.

    PubMed

    Frank, Ari M

    2009-05-01

    Accurate modeling of peptide fragmentation is necessary for the development of robust scoring functions for peptide-spectrum matches, which are the cornerstone of MS/MS-based identification algorithms. Unfortunately, peptide fragmentation is a complex process that can involve several competing chemical pathways, which makes it difficult to develop generative probabilistic models that describe it accurately. However, the vast amounts of MS/MS data being generated now make it possible to use data-driven machine learning methods to develop discriminative ranking-based models that predict the intensity ranks of a peptide's fragment ions. We use simple sequence-based features that get combined by a boosting algorithm into models that make peak rank predictions with high accuracy. In an accompanying manuscript, we demonstrate how these prediction models are used to significantly improve the performance of peptide identification algorithms. The models can also be useful in the design of optimal multiple reaction monitoring (MRM) transitions, in cases where there is insufficient experimental data to guide the peak selection process. The prediction algorithm can also be run independently through PepNovo+, which is available for download from http://bix.ucsd.edu/Software/PepNovo.html.
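
    A rough sketch of rank prediction with a boosting model trained on simple sequence-based features, in the spirit of the approach above. The features, toy training data, and model choice (scikit-learn's GradientBoostingRegressor) are illustrative stand-ins, not the paper's actual boosting algorithm or feature set.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    BASIC = set("RKH")

    def fragment_features(peptide: str, cleavage_pos: int) -> list:
        """Very simple features for the fragment pair produced at one cleavage site."""
        n_term, c_term = peptide[:cleavage_pos], peptide[cleavage_pos:]
        return [
            cleavage_pos / len(peptide),            # relative cleavage position
            sum(aa in BASIC for aa in n_term),      # basic residues, N-terminal side
            sum(aa in BASIC for aa in c_term),      # basic residues, C-terminal side
            1.0 if peptide[cleavage_pos - 1] == "P" else 0.0,  # proline effect
        ]

    # Hypothetical training data: cleavage sites of one peptide -> observed intensity ranks.
    X = np.array([fragment_features("SAMPLERK", i) for i in range(1, 8)])
    y = np.array([5, 3, 1, 2, 4, 6, 7], dtype=float)  # placeholder observed ranks

    model = GradientBoostingRegressor(n_estimators=200, max_depth=3).fit(X, y)
    predicted = model.predict(X)
    ranking = np.argsort(predicted)  # fragments ordered by predicted rank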

  6. Predicting Intensity Ranks of Peptide Fragment Ions

    PubMed Central

    Frank, Ari M.

    2009-01-01

    Accurate modeling of peptide fragmentation is necessary for the development of robust scoring functions for peptide-spectrum matches, which are the cornerstone of MS/MS-based identification algorithms. Unfortunately, peptide fragmentation is a complex process that can involve several competing chemical pathways, which makes it difficult to develop generative probabilistic models that describe it accurately. However, the vast amounts of MS/MS data being generated now make it possible to use data-driven machine learning methods to develop discriminative ranking-based models that predict the intensity ranks of a peptide's fragment ions. We use simple sequence-based features that get combined by a boosting algorithm into models that make peak rank predictions with high accuracy. In an accompanying manuscript, we demonstrate how these prediction models are used to significantly improve the performance of peptide identification algorithms. The models can also be useful in the design of optimal MRM transitions, in cases where there is insufficient experimental data to guide the peak selection process. The prediction algorithm can also be run independently through PepNovo+, which is available for download from http://bix.ucsd.edu/Software/PepNovo.html. PMID:19256476

  7. Physical environment virtualization for human activities recognition

    NASA Astrophysics Data System (ADS)

    Poshtkar, Azin; Elangovan, Vinayak; Shirkhodaie, Amir; Chan, Alex; Hu, Shuowen

    2015-05-01

    Human activity recognition research relies heavily on extensive datasets to verify and validate the performance of activity recognition algorithms. However, obtaining real datasets is expensive and highly time-consuming. A physics-based virtual simulation can accelerate the development of context-based human activity recognition algorithms and techniques by generating relevant training and testing videos simulating diverse operational scenarios. In this paper, we discuss in detail the requisite capabilities of a virtual environment to serve as a test bed for evaluating and enhancing activity recognition algorithms. To demonstrate the numerous advantages of virtual environment development, a newly developed virtual environment simulation modeling (VESM) environment is presented here to generate calibrated multisource imagery datasets suitable for development and testing of recognition algorithms for context-based human activities. The VESM environment serves as a versatile test bed to generate a vast amount of realistic data for training and testing of sensor processing algorithms. To demonstrate the effectiveness of the VESM environment, we present various simulated scenarios and processed results to infer proper semantic annotations from the high-fidelity imagery data for human-vehicle activity recognition under different operational contexts.

  8. DELP Symposium: Tectonics of eastern Asia and western Pacific Continental Margin

    NASA Astrophysics Data System (ADS)

    Eastern Asia and the western Pacific make up a broad region of active plate tectonic interaction. The area is a natural laboratory for studying the processes involved in the origin and evolution of volcanic island arcs, marginal basins, accretionary prisms, oceanic trenches, accreted terranes, ophiolite emplacement, and intracontinental deformation. Many of our working concepts of plate tectonics and intraplate deformation were developed in this region, even though details of the geology and geophysics there must be considered of a reconnaissance nature. During the past few years researchers have accumulated a vast amount of new and detailed information and have developed a better understanding of the processes that have shaped the tectonic elements in this region. To bring together scientists from many disciplines and to present the wide range of new data and ideas that offer a broader perspective on the interrelations of geological, geochemical, geophysical and geodetic studies, the symposium Tectonics of Eastern Asia and Western Pacific Continental Margin was held December 13-16, 1988, at the Tokyo Institute of Technology in Japan, under the auspices of DELP (Dynamics and Evolution of the Lithosphere Project).

  9. Automated daily processing of more than 1000 ground-based GPS receivers for studying intense ionospheric storms

    NASA Technical Reports Server (NTRS)

    Komjathy, Attila; Sparks, Lawrence; Wilson, Brian D.; Mannucci, Anthony J.

    2005-01-01

    To take advantage of the vast amount of GPS data, researchers use a number of techniques to estimate satellite and receiver interfrequency biases and the total electron content (TEC) of the ionosphere. Most techniques estimate vertical ionospheric structure and, simultaneously, hardware-related biases treated as nuisance parameters. These methods often are limited to 200 GPS receivers and use a sequential least squares or Kalman filter approach. The biases are later removed from the measurements to obtain unbiased TEC. In our approach to calibrating GPS receiver and transmitter interfrequency biases we take advantage of all available GPS receivers using a new processing algorithm based on the Global Ionospheric Mapping (GIM) software developed at the Jet Propulsion Laboratory. This new capability is designed to estimate receiver biases for all stations. We solve for the instrumental biases by modeling the ionospheric delay and removing it from the observation equation using precomputed GIM maps. The precomputed GIM maps rely on 200 globally distributed GPS receivers to establish the "background" used to model the ionosphere at the remaining 800 GPS sites.
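
    A hedged sketch of the bias-calibration idea described above: if the ionospheric delay is modeled from precomputed maps and removed from the observations, what remains (apart from noise) is a constant receiver bias that can be estimated by least squares. The thin-shell mapping function and all numbers are illustrative only, not the GIM algorithm itself.

    import numpy as np

    def mapping_function(elevation_deg, shell_height_km=450.0, earth_radius_km=6371.0):
        """Thin-shell obliquity factor converting vertical TEC to slant TEC."""
        e = np.radians(elevation_deg)
        sin_z = earth_radius_km / (earth_radius_km + shell_height_km) * np.cos(e)
        return 1.0 / np.sqrt(1.0 - sin_z ** 2)

    def estimate_receiver_bias(stec_obs, vtec_model, elevation_deg):
        """Least-squares estimate of a constant receiver bias (in TEC units)."""
        residuals = (np.asarray(stec_obs)
                     - mapping_function(np.asarray(elevation_deg)) * np.asarray(vtec_model))
        return residuals.mean()  # LS solution when the design column is all ones

    # Toy usage with synthetic numbers (not real GPS data)
    elev = np.array([30.0, 45.0, 60.0, 80.0])
    vtec = np.array([20.0, 22.0, 25.0, 24.0])          # modeled vertical TEC from maps
    stec = mapping_function(elev) * vtec + 3.2          # "observed" slant TEC with a 3.2 TECU bias
    print(estimate_receiver_bias(stec, vtec, elev))     # recovers ~3.2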

  10. Image simulation for automatic license plate recognition

    NASA Astrophysics Data System (ADS)

    Bala, Raja; Zhao, Yonghui; Burry, Aaron; Kozitsky, Vladimir; Fillion, Claude; Saunders, Craig; Rodríguez-Serrano, José

    2012-01-01

    Automatic license plate recognition (ALPR) is an important capability for traffic surveillance applications, including toll monitoring and detection of different types of traffic violations. ALPR is a multi-stage process comprising plate localization, character segmentation, optical character recognition (OCR), and identification of originating jurisdiction (i.e. state or province). Training of an ALPR system for a new jurisdiction typically involves gathering vast amounts of license plate images and associated ground truth data, followed by iterative tuning and optimization of the ALPR algorithms. The substantial time and effort required to train and optimize the ALPR system can result in excessive operational cost and overhead. In this paper we propose a framework to create an artificial set of license plate images for accelerated training and optimization of ALPR algorithms. The framework comprises two steps: the synthesis of license plate images according to the design and layout for a jurisdiction of interest; and the modeling of imaging transformations and distortions typically encountered in the image capture process. Distortion parameters are estimated by measurements of real plate images. The simulation methodology is successfully demonstrated for training of OCR.
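
    A minimal sketch of the second step described above: taking a synthetic plate image and applying imaging distortions (perspective, blur, sensor noise) whose parameters would, in practice, be estimated from real plate images. The plate layout and all parameter values here are illustrative placeholders, not measured distortion statistics.

    import cv2
    import numpy as np

    def synthesize_plate(text="ABC 1234", size=(120, 360)):
        """Render a crude synthetic plate: dark text on a white background."""
        plate = np.full((size[0], size[1], 3), 255, dtype=np.uint8)
        cv2.putText(plate, text, (20, 80), cv2.FONT_HERSHEY_SIMPLEX, 2.0, (0, 0, 0), 5)
        return plate

    def distort(plate, tilt_px=15, blur_ksize=5, noise_sigma=8.0):
        """Apply a perspective warp, optical blur, and additive sensor noise."""
        h, w = plate.shape[:2]
        src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
        dst = np.float32([[tilt_px, 0], [w, tilt_px], [w - tilt_px, h], [0, h - tilt_px]])
        warped = cv2.warpPerspective(plate, cv2.getPerspectiveTransform(src, dst), (w, h))
        blurred = cv2.GaussianBlur(warped, (blur_ksize, blur_ksize), 0)
        noisy = blurred.astype(np.float32) + np.random.normal(0, noise_sigma, blurred.shape)
        return np.clip(noisy, 0, 255).astype(np.uint8)

    training_image = distort(synthesize_plate())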

  11. Opening toward life: experiences of basic body awareness therapy in persons with major depression.

    PubMed

    Danielsson, Louise; Rosberg, Susanne

    2015-01-01

    Although there is a vast amount of research on different strategies to alleviate depression, knowledge of movement-based treatments focusing on body awareness is sparse. This study explores the experiences of basic body awareness therapy (BBAT) in 15 persons diagnosed with major depression who participated in the treatment in a randomized clinical trial. Hermeneutic phenomenological methodology inspired the approach to interviews and data analysis. The participants' experiences were essentially grasped as a process of enhanced existential openness, opening toward life, exceeding the tangible corporeal dimension to also involve emotional, temporal, and relational aspects of life. Five constituents of this meaning were described: vitality springing forth, grounding oneself, recognizing patterns in one's body, being acknowledged and allowed to be oneself, and grasping the vagueness. The process of enhanced perceptual openness challenges the numbness experienced in depression, which can provide hope for change, but it is connected to hard work and can be emotionally difficult to bear. Inspired by a phenomenological framework, the results of this study illuminate novel clinical and theoretical insight into the meaning of BBAT as an adjunctive approach in the treatment of depression.

  12. HydroClim: a Continental-Scale Database of Contemporary and Future Streamflow and Stream Temperature Estimates for Aquatic Ecosystem Studies

    NASA Astrophysics Data System (ADS)

    Knouft, J.; Ficklin, D. L.; Bart, H. L.; Rios, N. E.

    2017-12-01

    Streamflow and water temperature are primary factors influencing the traits, distribution, and diversity of freshwater species. Ongoing changes in climate are causing directional alteration of these environmental conditions, which can impact local ecological processes. Accurate estimation of these variables is critical for predicting the responses of species to ongoing changes in freshwater habitat, yet ecologically relevant high-resolution data describing variation in streamflow and water temperature across North America are not available. Considering the vast amount of web-accessible freshwater biodiversity data, development and application of appropriate hydrologic data are critical to the advancement of our understanding of freshwater systems. To address this issue, we are developing the "HydroClim" database, which will provide web-accessible (www.hydroclim.org) historical and projected monthly streamflow and water temperature data for stream sections in all major watersheds across the United States and Canada from 1950-2099. These data will also be integrated with FishNet 2 (www.fishnet2.net), an online biodiversity database that provides open access to over 2 million localities of freshwater fish species in the United States and Canada, thus allowing for the characterization of the habitat requirements of freshwater species across this region. HydroClim should provide a vast array of opportunities for a greater understanding of water resources as well as information for the conservation of freshwater biodiversity in the United States and Canada in the coming century.

  13. Developing a Hadoop-based Middleware for Handling Multi-dimensional NetCDF

    NASA Astrophysics Data System (ADS)

    Li, Z.; Yang, C. P.; Schnase, J. L.; Duffy, D.; Lee, T. J.

    2014-12-01

    Climate observations and model simulations are generating vast amounts of climate data, and these data are ever-increasing and accumulating at a rapid pace. Effectively managing and analyzing these data is essential for climate change studies. Hadoop, a distributed storage and processing framework for large data sets, has attracted increasing attention in dealing with the Big Data challenge. The maturity of Infrastructure as a Service (IaaS) in cloud computing further accelerates the adoption of Hadoop in solving Big Data problems. However, Hadoop is designed to process unstructured data such as texts, documents and web pages, and cannot effectively handle scientific data formats such as array-based NetCDF files and other binary formats. In this paper, we propose to build a Hadoop-based middleware for transparently handling big NetCDF data by 1) designing a distributed climate data storage mechanism based on a POSIX-enabled parallel file system to enable parallel big data processing with MapReduce, as well as to support data access by other systems; 2) modifying the Hadoop framework to transparently process NetCDF data in parallel without sequencing or converting the data into other file formats, or loading them into HDFS; and 3) seamlessly integrating Hadoop, cloud computing and climate data in a highly scalable and fault-tolerant framework.
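
    A conceptual sketch of the map/reduce idea above, outside Hadoop itself: workers read time slices of an array variable straight from a NetCDF file on a shared (POSIX) file system and emit partial results, which a reduce step combines. The file name and variable name ("t2m", assumed to be a 3-D time/lat/lon array) are hypothetical, and this is not the paper's middleware.

    from multiprocessing import Pool
    import numpy as np
    from netCDF4 import Dataset

    PATH, VAR = "climate.nc", "t2m"  # hypothetical file and variable

    def map_chunk(time_range):
        """Map step: partial sum and count over one slice of the time dimension."""
        with Dataset(PATH) as ds:
            block = np.asarray(ds.variables[VAR][time_range[0]:time_range[1], :, :])
        return block.sum(dtype=np.float64), block.size

    def reduce_partials(partials):
        """Reduce step: combine partial sums into a global mean."""
        total = sum(s for s, _ in partials)
        count = sum(n for _, n in partials)
        return total / count

    if __name__ == "__main__":
        with Dataset(PATH) as ds:
            n_time = ds.variables[VAR].shape[0]
        chunks = [(i, min(i + 100, n_time)) for i in range(0, n_time, 100)]
        with Pool(4) as pool:
            print("global mean:", reduce_partials(pool.map(map_chunk, chunks)))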

  14. Can serpentinization induce fracturing? Fluid pathway development and the volume increase enigma

    NASA Astrophysics Data System (ADS)

    Plümper, Oliver; Jamtveit, Bjørn; Røyne, Anja

    2013-04-01

    Serpentinization of ultramafic rocks has first-order effects on global element cycles and the rheology of the oceanic lithosphere, plays a key role in plate tectonics by lubricating subduction zones, and has been linked to the origin of life through the creation of abiogenic hydrocarbons. In addition, the capability of ultramafic rocks to safely store enormous amounts of carbon dioxide through mineral reactions may provide a unique solution to fight global warming. However, all the aforementioned processes rely on the creation and maintenance of fluid pathways to alter an originally impermeable rock. Although the forces that move tectonic plates can produce these fluid pathways by mechanical fracturing, there is ample evidence that serpentinization reactions can 'eat' their way through a rock. This process is facilitated by solid volume changes during mineral reactions that cause expansion, fracturing the rock to generate fluid pathways. Natural observations of serpentinization/carbonation in ultramafic rocks indicate that the associated positive solid volume change alone exerts enough stress on the surrounding rock to build up a fracture network and that the influence of external tectonic forces is not necessary. Through various feedbacks these systems can either become self-sustaining, when an interconnected fracture network is formed, or self-limiting due to fluid pathway obstruction. However, extensively serpentinized outcrops suggest that although crystal growth in newly opened spaces would reduce permeability, serpentinization is not always self-limiting, as porosity generation can occur concomitantly, maintaining or even increasing permeability. This is consistent with theory and demonstrates that fluids transported through fracture networks can alter vast amounts of originally impermeable rock. Nevertheless, whether serpentinization can actually generate these fracture networks is still a matter of debate, and only a few scientific investigations have focused on this topic so far. Here, we investigate the feasibility of reaction-induced fracturing and pore space evolution during serpentinization by combining microstructural investigations, using scanning/transmission electron microscopy and synchrotron micro-tomography of natural samples, with theoretical considerations of the forces exerted during solid-volume-increasing reactions. We particularly focus on the interface-scale mechanism of reaction-induced fracturing (Plümper et al. 2012) and the establishment of microstructural markers (e.g., inert exsolutions in olivine) to identify volume changes and estimate crystallization pressures (Kelemen and Hirth 2012). Our investigations suggest that reaction-induced fracturing during serpentinization is possible and that under certain physico-chemical circumstances a positive feedback to alter vast amounts of originally impermeable rock is established. Plümper O., Røyne A., Magraso A., Jamtveit B. (2012) The interface-scale mechanism of reaction-induced fracturing during serpentinization. Geology 40, 1103-1106. Kelemen, P. B. & Hirth, G. (2012) Reaction-driven cracking during retrograde metamorphism: Olivine hydration and carbonation. Earth and Planetary Science Letters 345, 81-89.

  15. Cloud-based NEXRAD Data Processing and Analysis for Hydrologic Applications

    NASA Astrophysics Data System (ADS)

    Seo, B. C.; Demir, I.; Keem, M.; Goska, R.; Weber, J.; Krajewski, W. F.

    2016-12-01

    The real-time and full historical archive of NEXRAD Level II data, covering the entire United States from 1991 to present, recently became available on the Amazon S3 cloud. This provides a new opportunity to rebuild the Hydro-NEXRAD software system that enabled users to access vast amounts of NEXRAD radar data in support of a wide range of research. The system processes basic radar data (Level II) and delivers radar-rainfall products based on the user's custom selection of features such as space and time domain, river basin, rainfall product space and time resolution, and rainfall estimation algorithms. The new cloud-based system can eliminate challenges previously faced by Hydro-NEXRAD in data acquisition and processing: (1) temporal and spatial limitations arising from limited data storage; (2) archive (past) data ingestion and format conversion; and (3) separate data processing flows for past and real-time Level II data. To enhance massive data processing and computational efficiency, the new system is implemented and tested for the Iowa domain. This pilot study begins by ingesting rainfall metadata and implementing Hydro-NEXRAD capabilities on the cloud using the new polarimetric features, as well as the existing algorithm modules and scripts. The authors address the reliability and feasibility of cloud computation and processing, followed by an assessment of response times from an interactive web-based system.
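
    A sketch of pulling NEXRAD Level II volume files from the public Amazon S3 archive mentioned above, using anonymous (unsigned) access. The bucket name and key layout ("noaa-nexrad-level2", YYYY/MM/DD/SITE/...) follow the commonly documented public-archive convention and should be treated as an assumption to verify; this is not the authors' processing system.

    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))  # anonymous access

    def list_volumes(site="KDVN", date="2016/06/15", max_keys=20):
        """List Level II volume scans for one radar site and day."""
        prefix = f"{date}/{site}/"
        resp = s3.list_objects_v2(Bucket="noaa-nexrad-level2", Prefix=prefix, MaxKeys=max_keys)
        return [obj["Key"] for obj in resp.get("Contents", [])]

    def download(key, dest):
        """Fetch one volume file to local storage for processing."""
        s3.download_file("noaa-nexrad-level2", key, dest)

    for key in list_volumes():
        print(key)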

  16. Software and the future of programming languages.

    PubMed

    Aho, Alfred V

    2004-02-27

    Although software is the key enabler of the global information infrastructure, the amount and extent of software in use in the world today are not widely understood, nor are the programming languages and paradigms that have been used to create the software. The vast size of the embedded base of existing software and the increasing costs of software maintenance, poor security, and limited functionality are posing significant challenges for the software R&D community.

  17. The U.S.-Saudi Partnership: Is This Marriage Headed for Divorce?

    DTIC Science & Technology

    2008-09-01

    water exploration, Crane discovered vast amounts of oil deposits and alerted his engineers to exploit this further. Aramco would later make trillions...British also utilized the Suez Canal as a major trading route to India, and created coaling stations along this route in various ports. More importantly...and Indonesian oil production ramped up output to fill the gap, and by early September 1967, the Arab producers gave up the embargo.101

  18. [The need for education and regulation regarding the use of digital media].

    PubMed

    Gautellier, Christian

    2015-01-01

    Children and teenagers spend vast amounts of time in front of screens. Faced with this reality, it is essential that they receive media education to help them get a proper grasp of information and image cultures. It is designed to offer global support for their cognitive, emotional and social construction and requires the participation of all those who play a role in their education: their family, teachers and extracurricular activity leaders.

  19. Factors Influencing Material Removal And Surface Finish Of The Polishing Of Silica Glasses

    DTIC Science & Technology

    2006-01-01

    Mechanical Properties of Quartz and Zerodur®... TABLE 4.2: Results from variable load and lap velocity experiments...of glass and glass-ceramic substrates which are used in a vast amount of applications, from optics for lithographic machines to mirrors and lenses...(SiO2) glass polishing with metal oxide abrasive particles. This scheme will mirror the experimentation in this thesis, and hopefully provide a better

  20. Science and data science.

    PubMed

    Blei, David M; Smyth, Padhraic

    2017-08-07

    Data science has attracted a lot of attention, promising to turn vast amounts of data into useful predictions and insights. In this article, we ask why scientists should care about data science. To answer, we discuss data science from three perspectives: statistical, computational, and human. Although each of the three is a critical component of data science, we argue that the effective combination of all three components is the essence of what data science is about.

  1. Neonatal Informatics: Transforming Neonatal Care Through Translational Bioinformatics

    PubMed Central

    Palma, Jonathan P.; Benitz, William E.; Tarczy-Hornoch, Peter; Butte, Atul J.; Longhurst, Christopher A.

    2012-01-01

    The future of neonatal informatics will be driven by the availability of increasingly vast amounts of clinical and genetic data. The field of translational bioinformatics is concerned with linking and learning from these data and applying new findings to clinical care to transform the data into proactive, predictive, preventive, and participatory health. As a result of advances in translational informatics, the care of neonates will become more data driven, evidence based, and personalized. PMID:22924023

  2. Towards SDS (Strategic Defense System) Testing and Evaluation: A collection of Relevant Topics

    DTIC Science & Technology

    1989-07-01

    the proof of the next. The Piton project is the first instance of stacking two verified components. In 1985 Warren...Accelerated? In the long term, a vast amount of work needs to be done. Below are some miscellaneous, fairly near-term projects which would seem to provide...and predictions for the current project. It provides a quantitative analysis of the environment and a model of the

  3. Drought-induced carbon loss in peatlands

    NASA Astrophysics Data System (ADS)

    Fenner, Nathalie; Freeman, Chris

    2011-12-01

    Peatlands store vast amounts of organic carbon, amounting to approximately 455 Pg. Carbon builds up in these water-saturated environments owing to the presence of phenolic compounds--which inhibit microbial activity and therefore prevent the breakdown of organic matter. Anoxic conditions limit the activity of phenol oxidase, the enzyme responsible for the breakdown of phenolic compounds. Droughts introduce oxygen into these systems, and the frequency of these events is rising. Here, we combine in vitro manipulations, mesocosm experiments and field observations to examine the impact of drought on peatland carbon loss. We show that drought stimulates bacterial growth and phenol oxidase activity, resulting in a reduction in the concentration of phenolic compounds in peat. This further stimulates microbial growth, causing the breakdown of organic matter and the release of carbon dioxide in a biogeochemical cascade. We further show that re-wetting the peat accelerates carbon losses to the atmosphere and receiving waters, owing to drought-induced increases in nutrient and labile carbon levels, which raise pH and stimulate anaerobic decomposition. We suggest that severe drought, and subsequent re-wetting, could destabilize peatland carbon stocks; understanding this process could aid understanding of interactions between peatlands and other environmental trends, and lead to the development of strategies for increasing carbon stocks.

  4. Challenges in Small Screening Laboratories: SaaS to the rescue

    PubMed Central

    Lemmon, Vance P.; Jia, Yuanyuan; Shi, Yan; Holbrook, S. Douglas; Bixby, John L; Buchser, William

    2012-01-01

    The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signalling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA screening of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Managing experimental workflow and library data, along with the extensive amount of experimental results is challenging. For academic laboratories generating large data sets from experiments using thousands of perturbagens, a laboratory information management system (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with a Software As A Service (SAAS) LIMS to ensure the quality of its experiments and workflows. The article discusses this application in detail, and how the system was selected and integrated into the laboratory. The advantages of SaaS are described. PMID:21631415

  5. Genetic risk prediction using a spatial autoregressive model with adaptive lasso.

    PubMed

    Wen, Yalu; Shen, Xiaoxi; Lu, Qing

    2018-05-31

    With rapidly evolving high-throughput technologies, studies are being initiated to accelerate the process toward precision medicine. The collection of the vast amounts of sequencing data provides us with great opportunities to systematically study the role of a deep catalog of sequencing variants in risk prediction. Nevertheless, the massive amount of noise signals and low frequencies of rare variants in sequencing data pose great analytical challenges on risk prediction modeling. Motivated by the development in spatial statistics, we propose a spatial autoregressive model with adaptive lasso (SARAL) for risk prediction modeling using high-dimensional sequencing data. The SARAL is a set-based approach, and thus, it reduces the data dimension and accumulates genetic effects within a single-nucleotide variant (SNV) set. Moreover, it allows different SNV sets having various magnitudes and directions of effect sizes, which reflects the nature of complex diseases. With the adaptive lasso implemented, SARAL can shrink the effects of noise SNV sets to be zero and, thus, further improve prediction accuracy. Through simulation studies, we demonstrate that, overall, SARAL is comparable to, if not better than, the genomic best linear unbiased prediction method. The method is further illustrated by an application to the sequencing data from the Alzheimer's Disease Neuroimaging Initiative. Copyright © 2018 John Wiley & Sons, Ltd.
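
    A sketch of the adaptive-lasso ingredient of an approach like SARAL, omitting the spatial autoregressive structure and SNV-set aggregation: an initial ridge fit supplies penalty weights, the lasso is run on rescaled features, and coefficients are mapped back to the original scale. This is a generic adaptive lasso on simulated data, not the authors' full model.

    import numpy as np
    from sklearn.linear_model import Lasso, Ridge

    def adaptive_lasso(X, y, gamma=1.0, alpha=0.05):
        init = Ridge(alpha=1.0).fit(X, y).coef_
        weights = 1.0 / (np.abs(init) ** gamma + 1e-8)   # heavier penalty on weak signals
        X_scaled = X / weights                            # absorb weights into the design
        lasso = Lasso(alpha=alpha, max_iter=10000).fit(X_scaled, y)
        return lasso.coef_ / weights                      # back to the original scale

    # Simulated example: a few true signals among many noise predictors
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))
    beta = np.zeros(50)
    beta[:3] = [1.5, -1.0, 0.8]
    y = X @ beta + rng.normal(scale=0.5, size=200)
    print(np.nonzero(adaptive_lasso(X, y))[0])            # mostly the first three indices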

  6. Transgenic multivitamin corn through biofortification of endosperm with three vitamins representing three distinct metabolic pathways

    PubMed Central

    Naqvi, Shaista; Zhu, Changfu; Farre, Gemma; Ramessar, Koreen; Bassie, Ludovic; Breitenbach, Jürgen; Perez Conesa, Dario; Ros, Gaspar; Sandmann, Gerhard; Capell, Teresa; Christou, Paul

    2009-01-01

    Vitamin deficiency affects up to 50% of the world's population, disproportionately impacting on developing countries where populations endure monotonous, cereal-rich diets. Transgenic plants offer an effective way to increase the vitamin content of staple crops, but thus far it has only been possible to enhance individual vitamins. We created elite inbred South African transgenic corn plants in which the levels of 3 vitamins were increased specifically in the endosperm through the simultaneous modification of 3 separate metabolic pathways. The transgenic kernels contained 169-fold the normal amount of β-carotene, 6-fold the normal amount of ascorbate, and double the normal amount of folate. Levels of engineered vitamins remained stable at least through to the T3 homozygous generation. This achievement, which vastly exceeds any realized thus far by conventional breeding alone, opens the way for the development of nutritionally complete cereals to benefit the world's poorest people. PMID:19416835

  7. Radiolysis of hexavalent plutonium in solutions of uranyl nitrate containing fission product simulants

    NASA Astrophysics Data System (ADS)

    Rance, Peter J. W.; Zilberman, B. Ya.; Akopov, G. A.

    2000-07-01

    The effect of the inherent radioactivity on the chemical state of plutonium ions in solution was recognized very shortly after the first macroscopic amounts of plutonium became available and early studies were conducted as part of the Manhattan Project. However, the behavior of plutonium ions, in nitric acid especially, has been found to be somewhat complex, so much so that a relatively modern summary paper included the comment that, "The vast amount of work carried out in nitric acid solutions can not be adequately summarized. Suffice it to say results in these solutions are plagued with irreproducibility and induction periods…" Needless to say, the presence of other ions in solution, as occurs when irradiated nuclear fuel is dissolved, further complicates matters. The purpose of the work described below was to add to the rather small amount of qualitative data available relating to the radiolytic behavior of plutonium in solutions of irradiated nuclear fuel.

  8. Numerical and experimental studies on effects of moisture content on combustion characteristics of simulated municipal solid wastes in a fixed bed.

    PubMed

    Sun, Rui; Ismail, Tamer M; Ren, Xiaohan; Abd El-Salam, M

    2015-05-01

    In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady-state model and an experimental study were employed to investigate the combustion process of municipal solid waste (MSW) in a fixed bed reactor. Conservation equations of the waste bed were implemented to describe the incineration process. The gas phase turbulence was modeled using the k-ε turbulence model and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization, and char burnout were calculated according to the waste properties. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulated solid temperature, gas species and process rates in the bed agree with experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and the evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The overall bed combustion process slows greatly as MSW moisture content increases. The experimental and simulation results provide direction for the design and optimization of fixed beds for MSW. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. A Software Developer’s Guide to Informal Evaluation of Visual Analytics Environments Using VAST Challenge Information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kristin A.; Scholtz, Jean; Whiting, Mark A.

    The VAST Challenge has been a popular venue for academic and industry participants for over ten years. Many participants comment that the majority of their time in preparing VAST Challenge entries is spent discovering elements in their software environments that need to be redesigned in order to solve the given task. Fortunately, there is no need to wait until the VAST Challenge is announced to test out software systems. The Visual Analytics Benchmark Repository contains all past VAST Challenge tasks, data, solutions and submissions. This paper details the various types of evaluations that may be conducted using the Repository information. In this paper we describe how developers can do informal evaluations of various aspects of their visual analytics environments using VAST Challenge information. Aspects that can be evaluated include the appropriateness of the software for various tasks, the various data types and formats that can be accommodated, the effectiveness and efficiency of the process supported by the software, and the intuitiveness of the visualizations and interactions. Researchers can compare their visualizations and interactions to those submitted to determine novelty. In addition, the paper provides pointers to various guidelines that software teams can use to evaluate the usability of their software. While these evaluations are not a replacement for formal evaluation methods, this information can be extremely useful during the development of visual analytics environments.

  10. PLOCAN glider portal: a gateway for useful data management and visualization system

    NASA Astrophysics Data System (ADS)

    Morales, Tania; Lorenzo, Alvaro; Viera, Josue; Barrera, Carlos; José Rueda, María

    2014-05-01

    Nowadays, monitoring ocean behavior and its characteristics involves a wide range of sources able to gather and provide vast amounts of data across spatio-temporal scales. Multiplatform infrastructures like PLOCAN operate a variety of autonomous Lagrangian and Eulerian devices that collect information which is then transferred to land in near-real time. Managing all this data collection efficiently is a major issue. Advances in ocean observation technologies, in which underwater autonomous gliders play a key role, have improved spatio-temporal resolution, offering a deeper understanding of the ocean but requiring greater effort in the data management process. There are general data management requirements in such environments, such as processing raw data at different levels to obtain valuable information, storing data coherently, and providing accurate products to end users according to their specific needs. Managing large amounts of data can certainly be tedious and complex without the right tools and operational procedures; hence automating these tasks through software applications saves time and reduces errors. Moreover, data distribution is highly relevant since scientists tend to assimilate different sources for comparison and validation, and the use of web applications has boosted the necessary scientific dissemination. In this context, PLOCAN has implemented a set of independent but compatible applications to process, store and disseminate information gathered through different oceanographic platforms. These applications have been implemented using open standards, such as HTML and CSS, and open source software, with Python as the programming language and Django as the web framework. More specifically, a glider application has been developed within the framework of the FP7-GROOM project. Regarding data management, this project focuses on collecting and making available consistent, quality-controlled datasets as well as fostering open access to glider data.
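
    A hypothetical sketch of how glider observations might be modeled in a Django application of the kind described above (this would live in an app's models.py). Field names and structure are illustrative assumptions, not PLOCAN's actual schema.

    from django.db import models

    class GliderMission(models.Model):
        name = models.CharField(max_length=100)
        platform = models.CharField(max_length=50)           # e.g. glider model/serial
        start_date = models.DateTimeField()

    class Profile(models.Model):
        """One observation record transmitted in near-real time during a mission."""
        mission = models.ForeignKey(GliderMission, on_delete=models.CASCADE,
                                    related_name="profiles")
        timestamp = models.DateTimeField(db_index=True)
        latitude = models.FloatField()
        longitude = models.FloatField()
        depth_m = models.FloatField()
        temperature_c = models.FloatField(null=True)
        salinity_psu = models.FloatField(null=True)
        qc_flag = models.PositiveSmallIntegerField(default=0)  # quality-control level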

  11. Synthesizing parallel imaging applications using the CAP (computer-aided parallelization) tool

    NASA Astrophysics Data System (ADS)

    Gennart, Benoit A.; Mazzariol, Marc; Messerli, Vincent; Hersch, Roger D.

    1997-12-01

    Imaging applications such as filtering, image transforms and compression/decompression require vast amounts of computing power when applied to large data sets. These applications would potentially benefit from the use of parallel processing. However, dedicated parallel computers are expensive and their processing power per node lags behind that of the most recent commodity components. Furthermore, developing parallel applications remains a difficult task: writing and debugging the application is difficult (deadlocks), programs may not be portable from one parallel architecture to the other, and performance often comes short of expectations. In order to facilitate the development of parallel applications, we propose the CAP computer-aided parallelization tool which enables application programmers to specify at a high-level of abstraction the flow of data between pipelined-parallel operations. In addition, the CAP tool supports the programmer in developing parallel imaging and storage operations. CAP enables combining efficiently parallel storage access routines and image processing sequential operations. This paper shows how processing and I/O intensive imaging applications must be implemented to take advantage of parallelism and pipelining between data access and processing. This paper's contribution is (1) to show how such implementations can be compactly specified in CAP, and (2) to demonstrate that CAP specified applications achieve the performance of custom parallel code. The paper analyzes theoretically the performance of CAP specified applications and demonstrates the accuracy of the theoretical analysis through experimental measurements.
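
    A generic illustration of the pipelining idea that the CAP tool expresses at a higher level of abstraction: while one image tile is being processed, the next is already being read, so storage access and computation overlap. This is plain Python, not CAP's declarative notation, and read_tile/filter_tile are hypothetical stand-ins for storage and processing operations.

    from concurrent.futures import ThreadPoolExecutor

    def read_tile(tile_id):
        # stand-in for a parallel storage access routine
        return f"raw-data-for-tile-{tile_id}"

    def filter_tile(raw):
        # stand-in for a sequential image-processing operation
        return raw.upper()

    def pipeline(tile_ids, prefetch=4):
        results = []
        with ThreadPoolExecutor(max_workers=prefetch) as io_pool:
            # issue reads ahead of time so processing rarely waits on storage
            futures = [io_pool.submit(read_tile, t) for t in tile_ids]
            for fut in futures:
                results.append(filter_tile(fut.result()))
        return results

    print(pipeline(range(8)))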

  12. Techniques A: continuous waves

    NASA Astrophysics Data System (ADS)

    Beuthan, J.

    1993-08-01

    In a vast number of medical diseases, the biochemical and physiological changes of soft tissues are hardly detectable by conventional diagnostic imaging techniques (x-ray, ultrasound, computed tomography, and MRI). The detectivity is low and the technical effort is tremendous. On the other hand, these pathologic variations induce significant changes in the optical tissue parameters, which can be detected. The corresponding variations of the scattered light can most easily be detected and evaluated by infrared diaphanoscopy, even in optically thick tissue slices.

  13. Distributing and storing data efficiently by means of special datasets in the ATLAS collaboration

    NASA Astrophysics Data System (ADS)

    Köneke, Karsten; ATLAS Collaboration

    2011-12-01

    With the start of the LHC physics program, the ATLAS experiment started to record vast amounts of data. This data has to be distributed and stored on the world-wide computing grid in a smart way in order to enable an effective and efficient analysis by physicists. This article describes how the ATLAS collaboration chose to create specialized reduced datasets in order to efficiently use computing resources and facilitate physics analyses.

  14. Summary and Review of the Tectonic Structure of Eurasia. Part 1

    DTIC Science & Technology

    1980-12-05

    INTRODUCTION An extensive search of the available geologic and geophysical literature dealing...with the crust and upper mantle properties of the U.S.S.R. and Eurasia has been conducted. During the past 25 years a vast amount of deep seismic...boundaries for these provinces were drawn after considering geologic evolution. Seismic activity, heat flow, Moho properties, crustal properties

  15. Everglades: The Catalyst to Combat the World Water Crisis

    DTIC Science & Technology

    2009-02-27

    the current energy crisis of the 21st Century is centered on oil. However, unlike water, there are numerous alternatives to energy other than oil. A...sources for fresh water. However, desalination is expensive and requires significant amounts of energy. Given the reliance of the United States on oil...Everglades is a river, but also, a rich ecosystem that supports a multitude of life to include vast flora and algae, mangroves, wading birds, shrimp and

  16. The Logistics of Waging War 1982-1993

    DTIC Science & Technology

    1993-09-01

    January 1984). 5. Walker, Captain Carol A. "DMES: A Giant Step Toward Increased Airlift Capability," Airlift, pages 10-11 (Spring 1984). ...personnel ate three meals a day, seven days a week, amounting to 1,200,000 meals per day, or 8.4 million meals per week. While the Saudi government supplied vast quantities of soft drinks, fresh fruit, and potable water, the

  17. Chemical and Physical Properties of Individual Aerosol Particles Characterized in Sacramento, CA during CARES Field Campaign

    NASA Astrophysics Data System (ADS)

    Zelenyuk, A.; Beranek, J.; Vaden, T.; Imre, D. G.; Zaveri, R. A.

    2011-12-01

    We present results of measurements conducted by our Single Particle Mass Spectrometer, SPLAT II, in Sacramento, CA over the month of June 2010. SPLAT II measured the size of 195 million particles, and the compositions of 10 million particles. In addition to size and composition, SPLAT II simultaneously measured the size, density and composition of 121,000 individual particles. These measurements were conducted 2-3 times per day, depending on conditions. The data show that throughout the day particles were relatively small (<200 nm), and the vast majority were composed of oxygenated organics mixed with various amounts of sulfate. In addition, we characterized fresh and processed soot, biomass burning aerosol, organic amines, fresh and processed sea salt, and a few dust particles. The data show a reproducible diurnal pattern in aerosol size distributions, number concentrations, and compositions. Early in the day, number concentrations were low, particles were very small, and the size distributions peaked at ~70 nm. At this time of the day, 80 nm particles had a density of 1.3 g cm-3, while the density of 200 nm particles was 1.6 g cm-3, consistent with our mass spectra showing that smaller particles were composed of organics mixed with ~10% sulfate, while larger particles were composed mostly of sulfate mixed with a small amount of organics. Later in the day, secondary organic aerosol (SOA) formation led to a number of nucleation events that significantly increased the number concentrations of very small particles. By mid-afternoon, as more SOA formed and condensed, particles increased in size, the number concentrations of particles larger than 70 nm increased, and the densities of 80 to 200 nm particles were ~1.3 g cm-3. The vast majority of these particles were composed of oxygenated organics mixed with ~10% sulfate; in other words, they were SOA particles mixed with a small amount of sulfate. The mass spectra of these particles show that there were two types of SOA particles, which we labeled Type 43 and Type 44, to indicate which of the two mass-spectral peaks carries the higher intensity. We were also able to conduct room temperature evaporation studies of these particles on four separate occasions and found the evaporation kinetics to be reproducible. The data show that after 4 hours of evaporation, in an organic-vapor-free environment, particles lose only ~20% of their volume. Moreover, evaporation starts with a relatively fast phase and proceeds with a much slower stage about 2 hours after evaporation starts. It is important to keep in mind that these slow-evaporating SOA particles were relatively fresh. Based on these studies and similar studies conducted in our laboratory, we conclude that these atmospheric SOA particles are quasi-solids. Moreover, the data indicate that to first order it is reasonable to approximate SOA particles as being non-volatile. Interestingly, we find that in both SOA particle types a large fraction of the intensity in peaks 44 and 73 was related to a small amount of surface compounds that evaporated within a few minutes.

  18. Constraining geostatistical models with hydrological data to improve prediction realism

    NASA Astrophysics Data System (ADS)

    Demyanov, V.; Rojas, T.; Christie, M.; Arnold, D.

    2012-04-01

    Geostatistical models reproduce spatial correlation based on the available on-site data and more general concepts about the modelled patterns, e.g. training images. One of the problems in modelling natural systems with geostatistics is maintaining realistic spatial features so that they agree with the physical processes in nature. Tuning the model parameters to the data may lead to geostatistical realisations with unrealistic spatial patterns, which would still honour the data. Such a model would result in poor predictions, even though it fits the available data well. Conditioning the model to a wider range of relevant data provides a remedy that avoids producing unrealistic features in spatial models. For instance, there are vast amounts of information about the geometries of river channels that can be used in describing fluvial environments. Relations between the geometrical channel characteristics (width, depth, wave length, amplitude, etc.) are complex and non-parametric and exhibit a great deal of uncertainty, which is important to propagate rigorously into the predictive model. These relations can be described within a Bayesian approach as multi-dimensional prior probability distributions. We propose a way to constrain multi-point statistics models with intelligent priors obtained from analysing a vast collection of contemporary river patterns based on previously published works. We applied machine learning techniques, namely neural networks and support vector machines, to extract multivariate non-parametric relations between geometrical characteristics of fluvial channels from the available data. An example demonstrates how ensuring geological realism helps to deliver more reliable prediction of a subsurface oil reservoir in a fluvial depositional environment.
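    As a rough illustration (not the authors' code), the sketch below fits a support vector machine, one of the techniques named in the abstract, to a synthetic relation between channel width and meander wavelength; the data, the assumed power law and the noise level are placeholders for a real compilation of channel geometries.

      # Illustrative sketch: learning a non-parametric width-wavelength relation
      # with an SVM; the synthetic data stand in for published channel geometries.
      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(0)
      width = rng.uniform(10.0, 500.0, 200)                             # channel width (m), synthetic
      wavelength = 11.0 * width**1.01 * rng.lognormal(0.0, 0.3, 200)    # assumed noisy power law

      model = SVR(kernel="rbf", C=100.0, epsilon=0.1)
      model.fit(np.log(width).reshape(-1, 1), np.log(wavelength))

      # Predictions (with their residual scatter) could then serve as a prior on
      # wavelength given width when constraining a multi-point statistics model.
      w_new = np.log(np.array([[50.0], [200.0]]))
      print(np.exp(model.predict(w_new)))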

  19. Recent advances in cross-cultural measurement in psychiatric epidemiology: utilizing 'what matters most' to identify culture-specific aspects of stigma.

    PubMed

    Yang, Lawrence Hsin; Thornicroft, Graham; Alvarado, Ruben; Vega, Eduardo; Link, Bruce George

    2014-04-01

    While stigma measurement across cultures has assumed growing importance in psychiatric epidemiology, it is unknown to what extent concepts arising from culture have been incorporated. We utilize a formulation of culture, as the everyday interactions that 'matter most' to individuals within a cultural group, to identify culturally-specific stigma dynamics relevant to measurement. A systematic literature review from January 1990 to September 2012 was conducted using PsycINFO, Medline and Google Scholar to identify articles studying: (i) mental health stigma-related concepts; (ii) ≥ 1 non-Western European cultural group. From 5292 abstracts, 196 empirical articles were located. The vast majority of studies (77%) utilized adaptations of existing Western-developed stigma measures for new cultural groups. Extremely few studies (2.0%) featured quantitative stigma measures derived within a non-Western European cultural group. A sizeable proportion (16.8%) of studies employed qualitative methods to identify culture-specific stigma processes. The 'what matters most' perspective identified cultural ideals of the everyday activities that comprise 'personhood': 'preserving lineage' among specific Asian groups, 'fighting hard to overcome problems and taking advantage of immigration opportunities' among specific Latino-American groups, and 'establishing trust in religious institutions due to institutional discrimination' among African-American groups. These essential cultural interactions shaped culture-specific stigma manifestations. Mixed-method studies (3.6%) corroborated these qualitative results. Quantitatively derived, culturally-specific stigma measures were lacking. Further, the vast majority of qualitative studies on stigma were conducted without using stigma-specific frameworks. We propose the 'what matters most' approach to address this key issue in future research.

  20. Surface Crystallization of Cloud Droplets: Implications for Climate Change and Ozone Depletion

    NASA Technical Reports Server (NTRS)

    Tabazadeh, A.; Djikaev, Y. S.; Reiss, H.; Gore, Warren J. (Technical Monitor)

    2002-01-01

    The process of supercooled liquid water crystallization into ice is still not well understood. Current experimental data on homogeneous freezing rates of ice nucleation in supercooled water droplets show considerable scatter. For example, at -33 C, the reported freezing nucleation rates vary by as much as 5 orders of magnitude, which is well outside the range of measurement uncertainties. Until now, experimental data on the freezing of supercooled water has been analyzed under the assumption that nucleation of ice took place in the interior volume of a water droplet. Here, the same data is reanalyzed assuming that the nucleation occurred "pseudoheterogeneously" at the air (or oil)-liquid water interface of the droplet. Our analysis suggests that the scatter in the nucleation data can be explained by two main factors. First, the current assumption that nucleation occurs solely inside the volume of a water droplet is incorrect. Second, because the nucleation process most likely occurs on the surface, the rates of nuclei formation could differ vastly when oil or air interfaces are involved. Our results suggest that ice freezing in clouds may initiate on droplet surfaces and such a process can allow for low amounts of liquid water (approx. 0.002 g per cubic meter) to remain supercooled down to -40 C as observed in the atmosphere.
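    The difference between the two assumptions is essentially geometric: a volume-based nucleation rate scales with droplet radius cubed, a surface-based rate with radius squared. The back-of-the-envelope sketch below only illustrates that scaling; the rate constants used are arbitrary placeholders, not values from the paper.

      # Volume- vs surface-based per-droplet nucleation rates for a droplet of
      # radius r: J_v * (4/3) pi r^3 versus J_s * 4 pi r^2. J_v and J_s below
      # are arbitrary placeholder values chosen only to show the scaling.
      import math

      def per_droplet_rates(radius_cm, J_v=1e6, J_s=1e2):
          volume = 4.0 / 3.0 * math.pi * radius_cm**3    # cm^3
          surface = 4.0 * math.pi * radius_cm**2         # cm^2
          return J_v * volume, J_s * surface

      for r_um in (1.0, 10.0, 100.0):
          r_cm = r_um * 1e-4
          rv, rs = per_droplet_rates(r_cm)
          print(f"r = {r_um:6.1f} um  volume-based: {rv:.3e} s^-1  surface-based: {rs:.3e} s^-1")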

  1. Non-Equilibrium Thermodynamics of Transcriptional Bursts

    NASA Astrophysics Data System (ADS)

    Hernández-Lemus, Enrique

    Gene transcription or Gene Expression (GE) is the process which transforms the information encoded in DNA into a functional RNA message. It is known that GE can occur in bursts or pulses. Transcription is irregular, with strong periods of activity interspersed by long periods of inactivity. If we consider the average behavior over millions of cells, this process appears to be continuous. But at the individual cell level, there is considerable variability, and for most genes, very little activity at any one time. Some have claimed that GE bursting can account for the high variability in gene expression occurring between cells in isogenic populations. This variability has a big impact on cell behavior and thus on phenotypic conditions and disease. In view of these facts, the development of a thermodynamic framework to study gene expression and transcriptional regulation, integrating the vast amount of molecular biophysical GE data, is appealing. Application of such a thermodynamic formalism is useful to observe various dissipative phenomena in GE regulatory dynamics. In this chapter we will examine in some detail the complex phenomena of transcriptional bursts (especially of a certain class of anomalous bursts) in the context of a non-equilibrium thermodynamics formalism and will make some initial comments on the relevance of some irreversible processes that may be connected to anomalous transcriptional bursts.

  2. The changing face of informed surgical consent.

    PubMed

    Oosthuizen, J C; Burns, P; Timon, C

    2012-03-01

    To determine whether procedure-specific brochures improve patients' pre-operative knowledge, to determine the amount of information expected by patients during the consenting process, and to determine whether the recently proposed 'Request for Treatment' consenting process is viable on a large scale. A prospective, questionnaire-based study of 100 patients admitted for selected, elective surgical procedures. In total, 99 per cent of patients were satisfied with the information received in the out-patient department, regarding the proposed procedure. However, 38 per cent were unable to correctly state the nature of the surgery or specific procedure they were scheduled to undergo. Although the vast majority of patients were able to state the intended benefits to be gained from the procedure, only 54 per cent were able to list at least one potential complication, and 80 per cent indicated that they wished to be informed about all potential complications, even if these occurred in less than 1 per cent of cases. The introduction of procedure-specific brochures improved patients' pre-operative knowledge. Although the failings of current consenting practice are clear, the Request for Treatment consenting process would not appear to be a viable alternative because of the large number of patients unable to accurately recall the nature of the proposed surgery or potential complications, following consent counselling.

  3. Hybrid coexpression link similarity graph clustering for mining biological modules from multiple gene expression datasets

    PubMed Central

    2014-01-01

    Background Advances in genomic technologies have enabled the accumulation of vast amounts of genomic data, including gene expression data for multiple species under various biological and environmental conditions. Integration of these gene expression datasets is a promising strategy to alleviate the challenges of protein functional annotation and biological module discovery based on a single gene expression dataset, which suffers from spurious coexpression. Results We propose a joint mining algorithm that constructs a weighted hybrid similarity graph whose nodes are the coexpression links. The weight of an edge between two coexpression links in this hybrid graph is a linear combination of the topological similarities and co-appearance similarities of the corresponding two coexpression links. Clustering the weighted hybrid similarity graph yields recurrent coexpression link clusters (modules). Experimental results on Human gene expression datasets show that the reported modules are functionally homogeneous as evident by their enrichment with biological process GO terms and KEGG pathways. PMID:25221624
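    The graph construction described above can be sketched in a few lines; this is not the authors' implementation, and the toy link sets, the Jaccard similarities and the mixing weight alpha are illustrative assumptions.

      # Sketch of the hybrid link-similarity graph: nodes are coexpression links
      # (gene pairs), each annotated with the datasets in which it appears; the
      # edge weight between two links combines a topological similarity (shared
      # genes) and a co-appearance similarity (shared datasets).
      from itertools import combinations

      links = {
          ("A", "B"): {1, 2, 3},   # link -> datasets in which it is coexpressed
          ("B", "C"): {2, 3},
          ("A", "C"): {1, 3},
          ("D", "E"): {4},
      }

      def topological_sim(l1, l2):
          return len(set(l1) & set(l2)) / len(set(l1) | set(l2))        # Jaccard on genes

      def coappearance_sim(d1, d2):
          return len(d1 & d2) / len(d1 | d2) if (d1 | d2) else 0.0      # Jaccard on datasets

      alpha = 0.5                                                        # assumed mixing weight
      edges = {}
      for l1, l2 in combinations(links, 2):
          w = alpha * topological_sim(l1, l2) + (1 - alpha) * coappearance_sim(links[l1], links[l2])
          if w > 0:
              edges[(l1, l2)] = w

      for (l1, l2), w in edges.items():
          print(l1, "--", l2, f"weight={w:.2f}")
      # A graph-clustering algorithm applied to these weighted edges would then
      # return recurrent coexpression link clusters (modules).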

  4. Recognising discourse causality triggers in the biomedical domain.

    PubMed

    Mihăilă, Claudiu; Ananiadou, Sophia

    2013-12-01

    Current domain-specific information extraction systems represent an important resource for biomedical researchers, who need to process vast amounts of knowledge in a short time. Automatic discourse causality recognition can further reduce their workload by suggesting possible causal connections and aiding in the curation of pathway models. We describe here an approach to the automatic identification of discourse causality triggers in the biomedical domain using machine learning. We create several baselines and experiment with and compare various parameter settings for three algorithms, i.e. Conditional Random Fields (CRF), Support Vector Machines (SVM) and Random Forests (RF). We also evaluate the impact of lexical, syntactic, and semantic features on each of the algorithms, showing that semantics improves the performance in all cases. We test our comprehensive feature set on two corpora containing gold standard annotations of causal relations, and demonstrate the need for more gold standard data. The best performance of 79.35% F-score is achieved by CRFs when using all three feature types.
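    A minimal sketch of the classification setup, under stated assumptions: it uses a Random Forest, one of the three algorithms compared in the paper, trained on a tiny, made-up sample of lexical and syntactic token features; the feature set and labels are illustrative only, not the authors' corpora or feature engineering.

      # Toy token classification for causality-trigger recognition (illustrative only).
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.feature_extraction import DictVectorizer

      train_tokens = [
          {"word": "because", "pos": "IN",  "lemma": "because"},
          {"word": "induces", "pos": "VBZ", "lemma": "induce"},
          {"word": "protein", "pos": "NN",  "lemma": "protein"},
          {"word": "binding", "pos": "NN",  "lemma": "binding"},
      ]
      labels = ["TRIGGER", "TRIGGER", "O", "O"]

      vec = DictVectorizer()
      X = vec.fit_transform(train_tokens)            # one-hot encode the features
      clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)

      test = [{"word": "leads", "pos": "VBZ", "lemma": "lead"}]
      print(clf.predict(vec.transform(test)))        # predicted trigger label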

  5. Submarine and deep-sea mine tailing placements: A review of current practices, environmental issues, natural analogs and knowledge gaps in Norway and internationally.

    PubMed

    Ramirez-Llodra, Eva; Trannum, Hilde C; Evenset, Anita; Levin, Lisa A; Andersson, Malin; Finne, Tor Erik; Hilario, Ana; Flem, Belinda; Christensen, Guttorm; Schaanning, Morten; Vanreusel, Ann

    2015-08-15

    The mining sector is growing in parallel with societal demands for minerals. One of the most important environmental issues and economic burdens of industrial mining on land is the safe storage of the vast amounts of waste produced. Traditionally, tailings have been stored in land dams, but the lack of land availability, potential risk of dam failure and topography in coastal areas in certain countries results in increasing disposal of tailings into marine systems. This review describes the different submarine tailing disposal methods used in the world in general and in Norway in particular, their impact on the environment (e.g. hyper-sedimentation, toxicity, processes related to changes in grain shape and size, turbidity), current legislation and need for future research. Understanding these impacts on the habitat and biota is essential to assess potential ecosystem changes and to develop best available techniques and robust management plans. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Minimal-assumption inference from population-genomic data

    NASA Astrophysics Data System (ADS)

    Weissman, Daniel; Hallatschek, Oskar

    Samples of multiple complete genome sequences contain vast amounts of information about the evolutionary history of populations, much of it in the associations among polymorphisms at different loci. Current methods that take advantage of this linkage information rely on models of recombination and coalescence, limiting the sample sizes and populations that they can analyze. We introduce a method, Minimal-Assumption Genomic Inference of Coalescence (MAGIC), that reconstructs key features of the evolutionary history, including the distribution of coalescence times, by integrating information across genomic length scales without using an explicit model of recombination, demography or selection. Using simulated data, we show that MAGIC's performance is comparable to PSMC' on single diploid samples generated with standard coalescent and recombination models. More importantly, MAGIC can also analyze arbitrarily large samples and is robust to changes in the coalescent and recombination processes. Using MAGIC, we show that the inferred coalescence time histories of samples of multiple human genomes exhibit inconsistencies with a description in terms of an effective population size based on single-genome data.

  7. Crop residue stabilization and application to agricultural and degraded soils: A review.

    PubMed

    Medina, Jorge; Monreal, Carlos; Barea, José Miguel; Arriagada, César; Borie, Fernando; Cornejo, Pablo

    2015-08-01

    Agricultural activities produce vast amounts of organic residues including straw, unmarketable or culled fruit and vegetables, post-harvest or post-processing wastes, clippings and residuals from forestry or pruning operations, and animal manure. Improper disposal of these materials may produce undesirable environmental (e.g. odors or insect refuges) and health impacts. On the other hand, agricultural residues are of interest to various industries and sectors of the economy due to their energy content (i.e., for combustion), their potential use as feedstock to produce biofuels and/or fine chemicals, or as soil amendments for polluted or degraded soils when composted. Our objective is to review new biotechnologies that could be used to manage these residues for land application and remediation of contaminated and eroded soils. The bibliographic information is complemented by a comprehensive review of the fundamental physico-chemical mechanisms involved in the transformation and stabilization of organic matter by biotic and abiotic soil components. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. MetaPathways v2.5: quantitative functional, taxonomic and usability improvements.

    PubMed

    Konwar, Kishori M; Hanson, Niels W; Bhatia, Maya P; Kim, Dongjae; Wu, Shang-Ju; Hahn, Aria S; Morgan-Lang, Connor; Cheung, Hiu Kan; Hallam, Steven J

    2015-10-15

    Next-generation sequencing is producing vast amounts of sequence information from natural and engineered ecosystems. Although this data deluge has an enormous potential to transform our lives, knowledge creation and translation need software applications that scale with increasing data processing and analysis requirements. Here, we present improvements to MetaPathways, an annotation and analysis pipeline for environmental sequence information that expedites this transformation. We specifically address pathway prediction hazards through integration of a weighted taxonomic distance and enable quantitative comparison of assembled annotations through a normalized read-mapping measure. Additionally, we improve LAST homology searches through BLAST-equivalent E-values and output formats that are natively compatible with prevailing software applications. Finally, an updated graphical user interface allows for keyword annotation query and projection onto user-defined functional gene hierarchies, including the Carbohydrate-Active Enzyme database. MetaPathways v2.5 is available on GitHub: http://github.com/hallamlab/metapathways2. shallam@mail.ubc.ca Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
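    The exact normalized read-mapping measure is defined in the paper; purely as an illustration of why raw read counts are normalized by feature length and sequencing depth before annotations are compared across samples, a generic RPKM-style normalization looks like this (the numbers are arbitrary):

      # Generic RPKM-style normalization (illustration only; not the MetaPathways measure).
      def rpkm(read_count, orf_length_bp, total_mapped_reads):
          return read_count / (orf_length_bp / 1e3) / (total_mapped_reads / 1e6)

      print(rpkm(read_count=250, orf_length_bp=900, total_mapped_reads=4_500_000))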

  9. Understanding chemically processed solar cells based on quantum dots

    NASA Astrophysics Data System (ADS)

    Malgras, Victor; Nattestad, Andrew; Kim, Jung Ho; Dou, Shi Xue; Yamauchi, Yusuke

    2017-12-01

    Photovoltaic energy conversion is one of the best alternatives to fossil fuel combustion. Petroleum resources are now close to depletion and their combustion is known to be responsible for the release of a considerable amount of greenhouse gases and carcinogenic airborne particles. Novel third-generation solar cells include a vast range of device designs and materials aiming to overcome the factors limiting the current technologies. Among them, quantum dot-based devices showed promising potential both as sensitizers and as colloidal nanoparticle films. A good example is the p-type PbS colloidal quantum dots (CQDs) forming a heterojunction with an n-type wide-band-gap semiconductor such as TiO2 or ZnO. The confinement in these nanostructures is also expected to result in marginal mechanisms, such as the collection of hot carriers and generation of multiple excitons, which would increase the theoretical conversion efficiency limit. Ultimately, this technology could also lead to the assembly of a tandem-type cell with CQD films absorbing in different regions of the solar spectrum.

  10. Triglycerides Revisited to the Serial.

    PubMed

    Viecili, Paulo Ricardo Nazário; da Silva, Brenda; Hirsch, Gabriela E; Porto, Fernando G; Parisi, Mariana M; Castanho, Alison R; Wender, Michele; Klafke, Jonatas Z

    This review discusses the role of triglycerides (TGs) in the normal cardiovascular system as well as in the development and clinical manifestation of cardiovascular diseases. Regulation of TGs at the enzymatic and genetic level, in addition to their possible relevance as preclinical and clinical biomarkers, is discussed, culminating with a description of available and emerging treatments. Due to the high complexity of the subject and the vast amount of material in the literature, the objective of this review was not to exhaust the subject, but rather to compile the information to facilitate and improve the understanding of those interested in this topic. The main publications on the topic were sought out, especially those from the last 5 years. The data in the literature still give reason to believe that there is room for doubt regarding the use of TG as disease biomarkers; however, there is increasing evidence for the role of hypertriglyceridemia on the atherosclerotic inflammatory process, cardiovascular outcomes, and mortality. © 2017 Elsevier Inc. All rights reserved.

  11. Integrated Control of Na Transport along the Nephron

    PubMed Central

    Schnermann, Jürgen

    2015-01-01

    The kidney filters vast quantities of Na at the glomerulus but excretes a very small fraction of this Na in the final urine. Although almost every nephron segment participates in the reabsorption of Na in the normal kidney, the proximal segments (from the glomerulus to the macula densa) and the distal segments (past the macula densa) play different roles. The proximal tubule and the thick ascending limb of the loop of Henle interact with the filtration apparatus to deliver Na to the distal nephron at a rather constant rate. This involves regulation of both filtration and reabsorption through the processes of glomerulotubular balance and tubuloglomerular feedback. The more distal segments, including the distal convoluted tubule (DCT), connecting tubule, and collecting duct, regulate Na reabsorption to match the excretion with dietary intake. The relative amounts of Na reabsorbed in the DCT, which mainly reabsorbs NaCl, and by more downstream segments that exchange Na for K are variable, allowing the simultaneous regulation of both Na and K excretion. PMID:25098598

  12. CNV-WebStore: online CNV analysis, storage and interpretation.

    PubMed

    Vandeweyer, Geert; Reyniers, Edwin; Wuyts, Wim; Rooms, Liesbeth; Kooy, R Frank

    2011-01-05

    Microarray technology allows the analysis of genomic aberrations at an ever increasing resolution, making functional interpretation of these vast amounts of data the main bottleneck in routine implementation of high resolution array platforms, and emphasising the need for a centralised and easy to use CNV data management and interpretation system. We present CNV-WebStore, an online platform to streamline the processing and downstream interpretation of microarray data in a clinical context, tailored towards but not limited to the Illumina BeadArray platform. Provided analysis tools include CNV analysis, parent-of-origin and uniparental disomy detection. Interpretation tools include data visualisation, gene prioritisation, automated PubMed searching, linking data to several genome browsers and annotation of CNVs based on several public databases. Finally, a module is provided for uniform reporting of results. CNV-WebStore is able to present copy number data in an intuitive way to both lab technicians and clinicians, making it a useful tool in daily clinical practice.

  13. [HyperPsych--resources for medicine and psychology on the World Wide Web].

    PubMed

    Laszig, P

    1997-07-01

    Progress in research on interactive communication technology and the acceleration of information processing and transmission have promoted the development of computer networks allowing global access to scientific information and services. The best-known such network at present is the internet. Based on its integrative structure as both a communication-oriented and an information-oriented medium, the internet helps researchers design scientific studies. Medicine and psychology in particular, as information-dependent scientific disciplines, may profit from this technological offering. As a means of coordinating access to the vast amount of medical and psychological data around the globe and of communicating with researchers worldwide, it opens up innovative possibilities for research, diagnosis and therapy. Currently, the World Wide Web is regarded as the most user-friendly and practical of all the internet resources. Based on a systematic introduction to the applications of the WWW, this article discusses relevant resources, points out possibilities and limits of network-supported scientific research and proposes many uses of this new medium.

  14. Unveiling the Low Surface Brightness Stellar Peripheries of Galaxies

    NASA Astrophysics Data System (ADS)

    Ferguson, Annette M. N.

    2018-01-01

    The low surface brightness peripheral regions of galaxies contain a gold mine of information about how minor mergers and accretions have influenced their evolution over cosmic time. Enormous stellar envelopes and copious amounts of faint tidal debris are natural outcomes of the hierarchical assembly process and the search for and study of these features, albeit highly challenging, offers the potential for unrivalled insight into the mechanisms of galaxy growth. Over the last two decades, there has been burgeoning interest in probing galaxy outskirts using resolved stellar populations. Wide-field surveys have uncovered vast tidal debris features and new populations of very remote globular clusters, while deep Hubble Space Telescope photometry has provided exquisite star formation histories back to the earliest epochs. I will highlight some recent results from studies within and beyond the Local Group and conclude by briefly discussing the great potential of future facilities, such as JWST, Euclid, LSST and WFIRST, for major breakthroughs in low surface brightness galaxy periphery science.

  15. The role of coal in industrialization: A case study of Nigeria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akarakiri, J.B.

    1989-01-01

    Coal is a mineral matter found in layers or beds in sedimentary rocks. It is a highly variable substance. In addition to the variations from lignite to bituminous and anthracite, there are vast differences in its heating value, amount of volatiles, sulfur, moisture and so on. The chemical and physical properties of coal make it an important industrial raw material. There are proven coal reserves of 639 million tonnes in Nigeria. This paper examines the potential and current role of coal in the industrialization of Nigeria. Industries are now dependent on fuel oil as a source of fuel because of its economic and technological advantages over coal. Coal is a source of industrial energy for the future, after the known oil reserves have been exhausted. In the short term, coal can be used as a material for chemicals, iron and steel production as well as a substitute for wood energy in the process of industrialization.

  16. Understanding chemically processed solar cells based on quantum dots.

    PubMed

    Malgras, Victor; Nattestad, Andrew; Kim, Jung Ho; Dou, Shi Xue; Yamauchi, Yusuke

    2017-01-01

    Photovoltaic energy conversion is one of the best alternatives to fossil fuel combustion. Petroleum resources are now close to depletion and their combustion is known to be responsible for the release of a considerable amount of greenhouse gases and carcinogenic airborne particles. Novel third-generation solar cells include a vast range of device designs and materials aiming to overcome the factors limiting the current technologies. Among them, quantum dot-based devices showed promising potential both as sensitizers and as colloidal nanoparticle films. A good example is the p-type PbS colloidal quantum dots (CQDs) forming a heterojunction with an n-type wide-band-gap semiconductor such as TiO2 or ZnO. The confinement in these nanostructures is also expected to result in marginal mechanisms, such as the collection of hot carriers and generation of multiple excitons, which would increase the theoretical conversion efficiency limit. Ultimately, this technology could also lead to the assembly of a tandem-type cell with CQD films absorbing in different regions of the solar spectrum.

  17. Big Data, Big Challenges: Implications for Chief Nurse Executives.

    PubMed

    Clancy, Thomas R; Reed, Laura

    2016-03-01

    As systems evolve over time, their natural tendency is to become increasingly more complex. Studies in the field of complex systems have generated new perspectives on the application of management strategies in health systems. Much of this research appears as a natural extension of the cross-disciplinary field of systems theory. In this edition, I begin a series of articles on the growing challenge faced by nurse administrators of finding value in the vast amounts of information collected by health systems today.

  18. Keeping their attention: innovative strategies for nursing education.

    PubMed

    Herrman, Judith W

    2011-10-01

    Providing nursing education in clinical and other educational settings presents several challenges. Changes in learners, vast amounts of material to be taught, and decreasing educational resources require increased effectiveness of nurse educators and each educational experience. Current teaching strategies may be enhanced to meet learners' expectations and address the reduced attention spans characteristic of today's learners. This article provides 20 strategies and additional helpful hints to increase learner engagement, improve retention of material, and make nursing education more enjoyable for instructors and learners. Copyright 2011, SLACK Incorporated.

  19. Are we there yet?

    PubMed

    Cristianini, Nello

    2010-05-01

    Statistical approaches to Artificial Intelligence are behind most success stories of the field in the past decade. The idea of generating non-trivial behaviour by analysing vast amounts of data has enabled recommendation systems, search engines, spam filters, optical character recognition, machine translation and speech recognition, among other things. As we celebrate the spectacular achievements of this line of research, we need to assess its full potential and its limitations. What are the next steps to take towards machine intelligence? 2010 Elsevier Ltd. All rights reserved.

  20. The -omics Era- Toward a Systems-Level Understanding of Streptomyces

    PubMed Central

    Zhou, Zhan; Gu, Jianying; Du, Yi-Ling; Li, Yong-Quan; Wang, Yufeng

    2011-01-01

    Streptomyces is a group of soil bacteria of medicinal, economic, ecological, and industrial importance. It is renowned for its complex biology in gene regulation, antibiotic production, morphological differentiation, and stress response. In this review, we provide an overview of the recent advances in Streptomyces biology inspired by -omics based high throughput technologies. In this post-genomic era, vast amounts of data have been integrated to provide significant new insights into the fundamental mechanisms of system control and regulation dynamics of Streptomyces. PMID:22379394

  1. MethylMix 2.0: an R package for identifying DNA methylation genes. | Office of Cancer Genomics

    Cancer.gov

    DNA methylation is an important mechanism regulating gene transcription, and its role in carcinogenesis has been extensively studied. Hyper- and hypomethylation of genes is a major mechanism of gene expression deregulation in a wide range of diseases. At the same time, high-throughput DNA methylation assays have been developed, generating vast amounts of genome-wide DNA methylation measurements. We developed MethylMix, an algorithm implemented in R to identify disease-specific hyper- and hypomethylated genes.

  2. A Data Base Management System for Clinical and Epidemiologic Studies In Systemic Lupus Erythematosus: Design and Maintenance

    PubMed Central

    Kosmides, Victoria S.; Hochberg, Marc C.

    1984-01-01

    This report describes the development, design specifications, features and implementation of a data base management system (DBMS) for clinical and epidemiologic studies in SLE. The DBMS is multidimensional with arrays formulated across patients, studies and variables. The major impact of this DBMS has been to increase the efficiency of managing and analyzing vast amounts of clinical and laboratory data and, as a result, to allow for continued growth in research productivity in areas related to SLE.

  3. Constructing paths through social networks for disease surveillance

    NASA Astrophysics Data System (ADS)

    Greene, Marjorie

    2011-06-01

    Global health security needs better information on biological threats such as pandemics and bioterrorism that pose ever-increasing dangers for the health of populations worldwide. A vast amount of real-time information about infectious disease outbreaks is found in various forms of Web-based data streams. There are advantages and disadvantages of Internet-based surveillance and it has been suggested that an important research area will be to evaluate the application of technologies that will provide benefits to outbreak disease control at local, national, and international levels.

  4. User's Guide for Flight Simulation Data Visualization Workstation

    NASA Technical Reports Server (NTRS)

    Kaplan, Joseph A.; Chen, Ronnie; Kenney, Patrick S.; Koval, Christopher M.; Hutchinson, Brian K.

    1996-01-01

    Today's modern flight simulation research produces vast amounts of time sensitive data. The meaning of this data can be difficult to assess while in its raw format. Therefore, a method of breaking the data down and presenting it to the user in a graphical format is necessary. Simulation Graphics (SimGraph) is intended as a data visualization software package that will incorporate simulation data into a variety of animated graphical displays for easy interpretation by the simulation researcher. This document is intended as an end user's guide.

  5. Traces of a cosmic catastrophe

    NASA Astrophysics Data System (ADS)

    Chuianov, V. A.

    1982-03-01

    It is suggested that the ecological crisis which led to the extinction of many animal species approximately 65-million years ago may have been caused by a cosmic phenomenon, the fall of a giant meteorite (approximately 10 km in diameter). The fall of such a meteorite would have released a vast amount of dust into the atmosphere, leading to radical climatic changes and the extinction of the aforementioned species. The so-called iridium anomaly is cited as possible evidence of such an event.

  6. CPTAC Develops LinkedOmics – Public Web Portal to Analyze Multi-Omics Data Within and Across Cancer Types | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    Multi-omics analysis has grown in popularity among biomedical researchers given the comprehensive characterization of thousands of molecular attributes in addition to clinical attributes. Several data portals have been devised to make these datasets directly available to the cancer research community. However, none of the existing data portals allow systematic exploration and interpretation of the complex relationships between the vast amount of clinical and molecular attributes. CPTAC investigator Dr.

  7. The optimisation of low-acceleration interstellar relativistic rocket trajectories using genetic algorithms

    NASA Astrophysics Data System (ADS)

    Fung, Kenneth K. H.; Lewis, Geraint F.; Wu, Xiaofeng

    2017-04-01

    A vast wealth of literature exists on the topic of rocket trajectory optimisation, particularly in the area of interplanetary trajectories due to its relevance today. Studies on optimising interstellar and intergalactic trajectories are usually performed in flat spacetime using an analytical approach, with very little focus on optimising interstellar trajectories in a general relativistic framework. This paper examines the use of low-acceleration rockets to reach galactic destinations in the least possible time, with a genetic algorithm being employed for the optimisation process. The fuel required for each journey was calculated for various types of propulsion systems to determine the viability of low-acceleration rockets to colonise the Milky Way. The results showed that to limit the amount of fuel carried on board, an antimatter propulsion system would likely be the minimum technological requirement to reach star systems tens of thousands of light years away. However, using a low-acceleration rocket would require several hundreds of thousands of years to reach these star systems, with minimal time dilation effects since maximum velocities only reached about 0.2c. Such transit times are clearly impractical, and thus, any kind of colonisation using low acceleration rockets would be difficult. High accelerations, on the order of 1 g, are likely required to complete interstellar journeys within a reasonable time frame, though they may require prohibitively large amounts of fuel. So for now, it appears that humanity's ultimate goal of a galactic empire may only be possible at significantly higher accelerations, though the propulsion technology requirement for a journey that uses realistic amounts of fuel remains to be determined.
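    The fuel-mass argument can be made concrete with the standard relativistic rocket equation, m0/m1 = exp(artanh(v/c) / (v_e/c)), giving the initial-to-final mass ratio needed to accelerate from rest to cruise speed v with effective exhaust velocity v_e. The sketch below is not the paper's genetic-algorithm optimiser, and the exhaust velocities are rough, assumed values used only to show why high-exhaust-velocity (e.g. antimatter) propulsion dominates at 0.2c.

      # Relativistic rocket-equation sketch (one-way acceleration only; braking
      # at the destination would square the mass ratio).
      import math

      def mass_ratio(v_over_c, ve_over_c):
          return math.exp(math.atanh(v_over_c) / ve_over_c)

      for label, ve in [("fusion (assumed ~0.05c exhaust)", 0.05),
                        ("antimatter (assumed ~0.5c exhaust)", 0.5)]:
          print(f"{label}: m0/m1 = {mass_ratio(0.2, ve):.1f} to reach 0.2c")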

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Kris A.; Scholtz, Jean; Whiting, Mark A.

    The VAST Challenge has been a popular venue for academic and industry participants for over ten years. Many participants comment that the majority of their time in preparing VAST Challenge entries is discovering elements in their software environments that need to be redesigned in order to solve the given task. Fortunately, there is no need to wait until the VAST Challenge is announced to test out software systems. The Visual Analytics Benchmark Repository contains all past VAST Challenge tasks, data, solutions and submissions. This paper details the various types of evaluations that may be conducted using the Repository information. In this paper we describe how developers can do informal evaluations of various aspects of their visual analytics environments using VAST Challenge information. Aspects that can be evaluated include the appropriateness of the software for various tasks, the various data types and formats that can be accommodated, the effectiveness and efficiency of the process supported by the software, and the intuitiveness of the visualizations and interactions. Researchers can compare their visualizations and interactions to those submitted to determine novelty. In addition, the paper provides pointers to various guidelines that software teams can use to evaluate the usability of their software. While these evaluations are not a replacement for formal evaluation methods, this information can be extremely useful during the development of visual analytics environments.

  9. Debris Discs: Modeling/theory review

    NASA Astrophysics Data System (ADS)

    Thébault, P.

    2012-03-01

    An impressive amount of photometric, spectroscopic and imaging observations of circumstellar debris discs has been accumulated over the past 3 decades, revealing that they come in all shapes and flavours, from young post-planet-formation systems like Beta Pic to much older ones like Vega. What we see in these systems are small grains, which are probably only the tip of the iceberg of a vast population of larger (undetectable) collisionally-eroding bodies, left over from the planet-formation process. Understanding the spatial structure, physical properties, origin and evolution of this dust is of crucial importance, as it is our only window into what is going on in these systems. Dust can be used as a tracer of the distribution of their collisional progenitors and of possible hidden massive perturbers, but can also allow one to derive valuable information about the disc's total mass, size distribution or chemical composition. I will review the state of the art in numerical models of debris discs, and present some important issues that are explored by current modelling efforts: planet-disc interactions, the link between cold (i.e. Herschel-observed) and hot discs, the effect of binarity, transient versus continuous processes, etc. I will finally present some possible perspectives for the development of future models.

  10. Cloud-based distributed control of unmanned systems

    NASA Astrophysics Data System (ADS)

    Nguyen, Kim B.; Powell, Darren N.; Yetman, Charles; August, Michael; Alderson, Susan L.; Raney, Christopher J.

    2015-05-01

    Enabling warfighters to efficiently and safely execute dangerous missions, unmanned systems have been an increasingly valuable component in modern warfare. The evolving use of unmanned systems leads to vast amounts of data collected from sensors placed on the remote vehicles. As a result, many command and control (C2) systems have been developed to provide the necessary tools to perform one of the following functions: controlling the unmanned vehicle or analyzing and processing the sensory data from unmanned vehicles. These C2 systems are often disparate from one another, limiting the ability to optimally distribute data among different users. The Space and Naval Warfare Systems Center Pacific (SSC Pacific) seeks to address this technology gap through the UxV to the Cloud via Widgets project. The overarching intent of this three year effort is to provide three major capabilities: 1) unmanned vehicle control using an open service oriented architecture; 2) data distribution utilizing cloud technologies; 3) a collection of web-based tools enabling analysts to better view and process data. This paper focuses on how the UxV to the Cloud via Widgets system is designed and implemented by leveraging the following technologies: Data Distribution Service (DDS), Accumulo, Hadoop, and Ozone Widget Framework (OWF).

  11. Lignin from Micro- to Nanosize: Production Methods

    PubMed Central

    Beisl, Stefan; Miltner, Angela; Friedl, Anton

    2017-01-01

    Lignin is the second most abundant biopolymer after cellulose. It has long been obtained as a by-product of cellulose production in pulp and paper making, but has had rather low added-value applications. A changing paper market and the emergence of biorefinery projects should generate vast amounts of lignin with the potential for value addition. Nanomaterials offer unique properties, and the preparation of lignin nanoparticles and other nanostructures has therefore gained interest as a promising technique to obtain value-added lignin products. Due to lignin’s high structural and chemical heterogeneity, methods must be adapted to these different types. This review focuses on the ability of different formation methods to cope with the huge variety of lignin types and points out which particle characteristics can be achieved by which method. The current research’s main focus is on pH- and solvent-shifting methods, where the latter can yield solid and hollow particles. Solvent shifting has also shown the capability to cope with different lignin types and with different solvents and antisolvents, respectively. However, process conditions have to be adapted to every type of lignin, and the reduction of solvent demand or the integration into a biorefinery process chain must be a focus of future work. PMID:28604584

  12. Dams in the Amazon: Belo Monte and Brazil's hydroelectric development of the Xingu River Basin.

    PubMed

    Fearnside, Phillip M

    2006-07-01

    Hydroelectric dams represent major investments and major sources of environmental and social impacts. Powerful forces surround the decision-making process on public investments in the various options for the generation and conservation of electricity. Brazil's proposed Belo Monte Dam (formerly Kararaô) and its upstream counterpart, the Altamira Dam (better known by its former name of Babaquara) are at the center of controversies on the decision-making process for major infrastructure projects in Amazonia. The Belo Monte Dam by itself would have a small reservoir area (440 km2) and large installed capacity (11,181.3 MW), but the Altamira/Babaquara Dam that would regulate the flow of the Xingu River (thereby increasing power generation at Belo Monte) would flood a vast area (6140 km2). The great impact of dams provides a powerful reason for Brazil to reassess its current policies that allocate large amounts of energy in the country's national grid to subsidized aluminum smelting for export. The case of Belo Monte and the five additional dams planned upstream (including the Altamira/Babaquara Dam) indicate the need for Brazil to reform its environmental assessment and licensing system to include the impacts of multiple interdependent projects.

  13. New insights into the earliest stages of colorectal tumorigenesis.

    PubMed

    Sievers, Chelsie K; Grady, William M; Halberg, Richard B; Pickhardt, Perry J

    2017-08-01

    Tumors in the large intestine have been postulated to arise via a stepwise accumulation of mutations, a process that takes up to 20 years. Recent advances in lineage tracing and DNA sequencing, however, are revealing new evolutionary models that better explain the vast amount of heterogeneity observed within and across colorectal tumors. Areas covered: A review of the literature supporting a novel model of colorectal tumor evolution was conducted. The following commentary examines the basic science and clinical evidence supporting a modified view of tumor initiation and progression in the colon. Expert commentary: The proposed 'cancer punctuated equilibrium' model of tumor evolution better explains the variability seen within and across polyps of the colon and rectum. Small colorectal polyps (6-9mm) followed longitudinally by interval imaging with CT colonography have been reported to have multiple fates: some growing, some remaining static in size, and others regressing in size over time. This new model allows for this variability in growth behavior and supports the hypothesis that some tumors can be 'born to be bad' as originally postulated by Sottoriva and colleagues, with very early molecular events impacting tumor fitness and growth behavior in the later stages of the disease process.

  14. One decade of the Data Fusion Information Group (DFIG) model

    NASA Astrophysics Data System (ADS)

    Blasch, Erik

    2015-05-01

    The revision of the Joint Directors of the Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however, the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. One example is the role of the analyst in providing semantic queries (through an ontology) so that the vast amount of available data can be indexed, accessed, retrieved, and processed. The second is reporting, which requires the analyst to collect the data into a condensed and meaningful form through information management. The last example is the interpretation of the resolved information from data, which must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model to bring together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.
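    A toy sketch of the first example, under stated assumptions: a small, made-up ontology expands an analyst's query concept into narrower terms before hitting an inverted index. The ontology, index entries and query are illustrative only and are not part of the DFIG specification.

      # Toy analyst-driven semantic query: expand a concept through an ontology,
      # then retrieve documents from an inverted index (all data illustrative).
      ontology = {
          "vehicle": ["truck", "convoy", "uav"],
          "uav": ["quadcopter"],
      }

      index = {
          "truck":      {"report_12", "report_31"},
          "uav":        {"report_07"},
          "quadcopter": {"report_44"},
      }

      def expand(term):
          terms = {term}
          for child in ontology.get(term, []):
              terms |= expand(child)          # include narrower concepts recursively
          return terms

      def query(term):
          hits = set()
          for t in expand(term):
              hits |= index.get(t, set())
          return hits

      print(sorted(query("vehicle")))          # retrieves reports indexed under narrower terms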

  15. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling.

    PubMed

    Devi, R Suganya; Manjula, D; Siddharth, R K

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web, which then have to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of the Depth First Search algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and their metadata, such as the title, keywords, and description, are extracted. This content is very essential for any type of analyser work to be carried out on the Big Data obtained as a result of Web Crawling.
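    A much simplified sketch of this approach (not the authors' code) is shown below: depth-first traversal from a seed URL, collecting each page's title, keywords and description meta tags with only the Python standard library. Error handling, politeness (robots.txt, rate limiting) and URL normalisation are deliberately omitted.

      # Depth-first crawl with metadata extraction (illustrative sketch only).
      from html.parser import HTMLParser
      from urllib.parse import urljoin
      from urllib.request import urlopen

      class PageParser(HTMLParser):
          def __init__(self):
              super().__init__()
              self.links, self.meta, self._in_title, self.title = [], {}, False, ""

          def handle_starttag(self, tag, attrs):
              attrs = dict(attrs)
              if tag == "a" and "href" in attrs:
                  self.links.append(attrs["href"])
              elif tag == "meta" and attrs.get("name") in ("keywords", "description"):
                  self.meta[attrs["name"]] = attrs.get("content", "")
              elif tag == "title":
                  self._in_title = True

          def handle_endtag(self, tag):
              if tag == "title":
                  self._in_title = False

          def handle_data(self, data):
              if self._in_title:
                  self.title += data

      def crawl(url, depth, visited, results):
          if depth == 0 or url in visited:
              return
          visited.add(url)
          parser = PageParser()
          parser.feed(urlopen(url, timeout=10).read().decode("utf-8", "ignore"))
          results[url] = {"title": parser.title.strip(), **parser.meta}
          for link in parser.links:            # depth-first: recurse before siblings
              crawl(urljoin(url, link), depth - 1, visited, results)

      results = {}
      crawl("https://example.com", depth=2, visited=set(), results=results)
      print(results)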

  16. PipeCraft: Flexible open-source toolkit for bioinformatics analysis of custom high-throughput amplicon sequencing data.

    PubMed

    Anslan, Sten; Bahram, Mohammad; Hiiesalu, Indrek; Tedersoo, Leho

    2017-11-01

    High-throughput sequencing methods have become a routine analysis tool in environmental sciences as well as in the public and private sectors. These methods provide vast amounts of data, which need to be analysed in several steps. Although the bioinformatics may be applied using several public tools, many analytical pipelines allow too few options for the optimal analysis of more complicated or customized designs. Here, we introduce PipeCraft, a flexible and handy bioinformatics pipeline with a user-friendly graphical interface that links several public tools for analysing amplicon sequencing data. Users are able to customize the pipeline by selecting the most suitable tools and options to process raw sequences from Illumina, Pacific Biosciences, Ion Torrent and Roche 454 sequencing platforms. We describe the design and options of PipeCraft and evaluate its performance by analysing data sets from three different sequencing platforms. We demonstrate that PipeCraft is able to process large data sets within 24 hr. The graphical user interface and the automated links between various bioinformatics tools enable easy customization of the workflow. All analytical steps and options are recorded in log files and are easily traceable. © 2017 John Wiley & Sons Ltd.

  17. An Efficient Approach for Web Indexing of Big Data through Hyperlinks in Web Crawling

    PubMed Central

    Devi, R. Suganya; Manjula, D.; Siddharth, R. K.

    2015-01-01

    Web Crawling has acquired tremendous significance in recent times and it is aptly associated with the substantial development of the World Wide Web. Web Search Engines face new challenges due to the availability of vast amounts of web documents, thus making the retrieved results less applicable to the analysers. However, recently, Web Crawling solely focuses on obtaining the links of the corresponding documents. Today, there exist various algorithms and software which are used to crawl links from the web, which then have to be further processed for future use, thereby increasing the overload of the analyser. This paper concentrates on crawling the links and retrieving all information associated with them to facilitate easy processing for other uses. In this paper, firstly the links are crawled from the specified uniform resource locator (URL) using a modified version of the Depth First Search algorithm which allows for complete hierarchical scanning of corresponding web links. The links are then accessed via the source code and their metadata, such as the title, keywords, and description, are extracted. This content is very essential for any type of analyser work to be carried out on the Big Data obtained as a result of Web Crawling. PMID:26137592

  18. Coding and decoding with dendrites.

    PubMed

    Papoutsi, Athanasia; Kastellakis, George; Psarrou, Maria; Anastasakis, Stelios; Poirazi, Panayiota

    2014-02-01

    Since the discovery of complex, voltage dependent mechanisms in the dendrites of multiple neuron types, great effort has been devoted in search of a direct link between dendritic properties and specific neuronal functions. Over the last few years, new experimental techniques have allowed the visualization and probing of dendritic anatomy, plasticity and integrative schemes with unprecedented detail. This vast amount of information has caused a paradigm shift in the study of memory, one of the most important pursuits in Neuroscience, and calls for the development of novel theories and models that will unify the available data according to some basic principles. Traditional models of memory considered neural cells as the fundamental processing units in the brain. Recent studies however are proposing new theories in which memory is not only formed by modifying the synaptic connections between neurons, but also by modifications of intrinsic and anatomical dendritic properties as well as fine tuning of the wiring diagram. In this review paper we present previous studies along with recent findings from our group that support a key role of dendrites in information processing, including the encoding and decoding of new memories, both at the single cell and the network level. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Automation and quality assurance of the production cycle

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Didenko, L.; Lauret, J.

    2010-04-01

    Processing datasets on the order of tens of terabytes is an onerous task, faced by production coordinators everywhere. Users request data productions and, especially for simulation data, the vast number of parameters (and sometimes incomplete requests) points to the need for tracking, controlling and archiving all requests made, so that the production team can handle them in a coordinated way. With the advent of grid computing, parallel processing power has increased, but traceability has also become increasingly problematic due to the heterogeneous nature of Grids. Any one of a number of components may fail, invalidating the job or execution flow at various stages of completion, and re-submitting a few of the multitude of jobs (while keeping the entire dataset production consistent) is a difficult and tedious process. From the definition of the workflow to its execution, there is a strong need for validation, tracking, monitoring and reporting of problems. To ease the process of requesting production workflows, STAR has implemented several components addressing full workflow consistency. A Web-based online submission request module, implemented using Drupal's Content Management System API, enforces that all parameters are described in advance in a uniform fashion. Upon submission, all jobs are independently tracked and (sometimes experiment-specific) discrepancies are detected and recorded, providing detailed information on where/how/when a job failed. Aggregate information on success and failure is also provided in near real-time.
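    The per-job tracking and aggregate reporting described above can be reduced to a very small data model; the sketch below is not the STAR implementation, and the job names, states and failure fields are illustrative assumptions.

      # Minimal per-job tracking for a dataset production (illustrative sketch).
      from collections import Counter
      from dataclasses import dataclass, field

      @dataclass
      class Job:
          job_id: str
          state: str = "submitted"          # submitted -> running -> done / failed
          failure: dict = field(default_factory=dict)

      class ProductionRequest:
          def __init__(self, jobs):
              self.jobs = {j: Job(j) for j in jobs}

          def update(self, job_id, state, **failure_info):
              job = self.jobs[job_id]
              job.state = state
              if state == "failed":
                  job.failure = failure_info     # record where/how/when the job failed

          def summary(self):
              return Counter(j.state for j in self.jobs.values())

      req = ProductionRequest([f"job_{i:03d}" for i in range(5)])
      req.update("job_000", "done")
      req.update("job_001", "failed", site="grid_site_A", stage="staging", reason="file not found")
      print(req.summary())                       # aggregate success/failure counts
      print([j.job_id for j in req.jobs.values() if j.state == "failed"])  # resubmission candidates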

  20. Developing seventh grade students' systems thinking skills in the context of the human circulatory system.

    PubMed

    Raved, Lena; Yarden, Anat

    2014-01-01

    Developing systems thinking skills in school can provide useful tools to deal with a vast amount of medical and health information that may help learners in decision making in their future lives as citizens. Thus, there is a need to develop effective tools that will allow learners to analyze biological systems and organize their knowledge. Here, we examine junior high school students' systems thinking skills in the context of the human circulatory system. A model was formulated for developing teaching and learning materials and for characterizing students' systems thinking skills. Specifically, we asked whether seventh grade students, who studied the human circulatory system, acquired systems thinking skills, and what are the characteristics of those skills? Concept maps were used to characterize students' systems thinking components and examine possible changes in the students' knowledge structure. These maps were composed by the students before and following the learning process. The study findings indicate a significant improvement in the students' ability to recognize the system components and the processes that occur within the system, as well as the relationships between different levels of organization of the system, following the learning process. Thus, following learning, students were able to organize the system's components and its processes within a framework of relationships; namely, the students' systems thinking skills were improved in the course of learning using the teaching and learning materials.

  1. Developing Seventh Grade Students’ Systems Thinking Skills in the Context of the Human Circulatory System

    PubMed Central

    Raved, Lena; Yarden, Anat

    2014-01-01

    Developing systems thinking skills in school can provide useful tools to deal with a vast amount of medical and health information that may help learners in decision making in their future lives as citizens. Thus, there is a need to develop effective tools that will allow learners to analyze biological systems and organize their knowledge. Here, we examine junior high school students’ systems thinking skills in the context of the human circulatory system. A model was formulated for developing teaching and learning materials and for characterizing students’ systems thinking skills. Specifically, we asked whether seventh grade students, who studied the human circulatory system, acquired systems thinking skills, and what are the characteristics of those skills? Concept maps were used to characterize students’ systems thinking components and examine possible changes in the students’ knowledge structure. These maps were composed by the students before and following the learning process. The study findings indicate a significant improvement in the students’ ability to recognize the system components and the processes that occur within the system, as well as the relationships between different levels of organization of the system, following the learning process. Thus, following learning, students were able to organize the system’s components and its processes within a framework of relationships; namely, the students’ systems thinking skills were improved in the course of learning using the teaching and learning materials. PMID:25520948

  2. Thought Leaders during Crises in Massive Social Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Corley, Courtney D.; Farber, Robert M.; Reynolds, William

    The vast amount of social media data that can be gathered from the internet, coupled with workflows that utilize both commodity systems and massively parallel supercomputers such as the Cray XMT, opens new vistas for research to support health, defense, and national security. Computer technology now enables the analysis of graph structures containing more than 4 billion vertices joined by 34 billion edges, along with metrics and massively parallel algorithms that exhibit near-linear scalability with the number of processors. The challenge lies in making this massive data and analysis comprehensible to analysts and end-users who require actionable knowledge to carry out their duties. Simply stated, we have developed language- and content-agnostic techniques to reduce large graphs built from vast media corpora into forms people can understand. Specifically, our tools and metrics act as a survey tool to identify "thought leaders" -- those members that lead or reflect the thoughts and opinions of an online community, independent of the source language.

  3. Collaboration and the Collective-Bargaining Process in Public Education

    ERIC Educational Resources Information Center

    Noggle, Matthew

    2010-01-01

    In the vast majority of school districts, the collective-bargaining process has evolved little during the past few decades. Teachers unions have successfully represented teachers' economic and job security interests by linking them to collective bargaining and procedural due process rights, but district administrators continue to make the…

  4. Quality assurance and quality control for autonomously collected geoscience data

    NASA Astrophysics Data System (ADS)

    Versteeg, R. J.; Richardson, A.; Labrecque, D.

    2006-12-01

    The growing interest in processes, coupled with the reduction in the cost and complexity of sensors that allow continuous data collection and transmission, is giving rise to vast amounts of semi-autonomously collected data. Such data is typically collected from a range of physical and chemical sensors and transmitted - either at the time of collection, or periodically as a collection of measurements - to a central server. Such setups can collect vast amounts of data. In cases where power is not an issue, one datapoint can be collected every minute, resulting in tens of thousands of data points per month per sensor. Especially when multiple sensors are deployed, it is infeasible to examine each individual datapoint for each individual sensor, and users typically look at aggregates of such data on a periodic (once a week to once every few months) basis. Such aggregates (and the time lag between data collection and data evaluation) will impact the ability to rapidly identify and resolve data issues. Thus, there is a need to integrate data qa/qc rules and procedures into the data collection process. These should be implemented such that data is analyzed for compliance the moment it arrives at the server, and that any issues with this data result in notification of cognizant personnel. Typical issues encountered in the field range from complete system failure (resulting in no data arriving at all), to complete sensor failure (data is collected but is meaningless), to partial sensor failure (a sensor gives erratic readings or starts to exhibit a bias), to partial power loss (the system collects and transmits data only intermittently). We have implemented a suite of such rules and tests as part of the INL-developed performance monitoring system. These rules are invoked as part of a data qa/qc workflow and result in quality indicators for each datapoint as well as user alerts in case of issues. Tests applied to the data include tests on individual datapoints, tests on suites of datapoints, and tests applied over the whole dataset. Examples of tests include: Did data arrive on time? Is received data in a valid format? Are all measurements present? Is data within valid range? Is data collected at appropriate time intervals? Are the statistics of the data changing over time? Is the data collected within an appropriate instrument calibration window? This approach, which is executed automatically on all data, provides data end users with confidence and auditability regarding the quality and usability of autonomously collected data.
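
    The kinds of automated checks listed above (arrival time, format, range, sampling interval, drifting statistics) can be expressed as simple per-point and per-batch rules. The sketch below is illustrative only; field names, thresholds and the alert step are assumptions, not the INL system.

```python
# Hedged sketch of per-point and per-batch QA/QC rules for autonomously
# collected sensor data. Field names, thresholds and the alert step are
# illustrative assumptions, not the INL performance monitoring system.
from datetime import datetime, timedelta
from statistics import stdev

VALID_RANGE = (-10.0, 60.0)              # plausible sensor range (assumed units)
EXPECTED_INTERVAL = timedelta(minutes=1) # assumed sampling interval

def check_point(point, previous_time):
    """Return quality flags for one measurement dict {'time': datetime, 'value': float}."""
    flags = []
    if not isinstance(point.get("value"), (int, float)):
        flags.append("invalid_format")
    elif not VALID_RANGE[0] <= point["value"] <= VALID_RANGE[1]:
        flags.append("out_of_range")
    if previous_time is not None and point["time"] - previous_time > 1.5 * EXPECTED_INTERVAL:
        flags.append("late_or_missing_data")
    return flags

def check_batch(points, window=60):
    """Flag individual points, then test whether the recent statistics look suspicious."""
    results, prev = [], None
    for p in points:
        results.append((p, check_point(p, prev)))
        prev = p["time"]
    recent = [p["value"] for p in points[-window:] if isinstance(p.get("value"), (int, float))]
    if len(recent) >= 2 and stdev(recent) == 0:
        results.append((points[-1], ["sensor_stuck"]))   # constant readings: possible sensor failure
    return results

# Run on every upload; in a real deployment a flagged point would trigger a notification.
batch = [{"time": datetime(2006, 7, 1, 0, i), "value": 20 + 0.1 * i} for i in range(10)]
batch[5]["value"] = 999.0                                # inject an out-of-range reading
for point, flags in check_batch(batch):
    if flags:
        print("ALERT", point["time"], flags)
```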

  5. LinkFinder: An expert system that constructs phylogenic trees

    NASA Technical Reports Server (NTRS)

    Inglehart, James; Nelson, Peter C.

    1991-01-01

    An expert system has been developed using the C Language Integrated Production System (CLIPS) that automates the process of constructing DNA sequence based phylogenies (trees or lineages) that indicate evolutionary relationships. LinkFinder takes as input homologous DNA sequences from distinct individual organisms. It measures variations between the sequences, selects appropriate proportionality constants, and estimates the time that has passed since each pair of organisms diverged from a common ancestor. It then designs and outputs a phylogenic map summarizing these results. LinkFinder can find genetic relationships between different species, and between individuals of the same species, including humans. It was designed to take advantage of the vast amount of sequence data being produced by the Genome Project, and should be of value to evolution theorists who wish to utilize this data, but who have no formal training in molecular genetics. Evolutionary theory holds that distinct organisms carrying a common gene inherited that gene from a common ancestor. Homologous genes vary from individual to individual and species to species, and the amount of variation is now believed to be directly proportional to the time that has passed since divergence from a common ancestor. The proportionality constant must be determined experimentally; it varies considerably with the types of organisms and DNA molecules under study. Given an appropriate constant, and the variation between two DNA sequences, a simple linear equation gives the divergence time.
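
    The last step the abstract describes, converting a measured sequence variation into a divergence-time estimate with a linear relation, can be written in a few lines. The rate constant below is a placeholder; as the abstract notes, it must be determined experimentally for the organisms and molecules under study.

```python
# Hedged sketch: fraction of differing sites between two aligned homologous
# sequences, converted to a divergence-time estimate with an experimentally
# determined proportionality constant (the value used here is a placeholder).

def sequence_variation(seq_a, seq_b):
    """Fraction of aligned positions that differ (sequences assumed pre-aligned)."""
    pairs = list(zip(seq_a, seq_b))
    return sum(a != b for a, b in pairs) / len(pairs)

def divergence_time(variation, rate_per_myr):
    """Linear model: time since the common ancestor = variation / substitution rate."""
    return variation / rate_per_myr

# Example with made-up sequences and a placeholder rate (substitutions per site per Myr).
v = sequence_variation("ACGTACGTAC", "ACGTTCGTAA")
print(round(divergence_time(v, rate_per_myr=0.002), 1), "million years (illustrative only)")
```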

  6. BioC: a minimalist approach to interoperability for biomedical text processing

    PubMed Central

    Comeau, Donald C.; Islamaj Doğan, Rezarta; Ciccarese, Paolo; Cohen, Kevin Bretonnel; Krallinger, Martin; Leitner, Florian; Lu, Zhiyong; Peng, Yifan; Rinaldi, Fabio; Torii, Manabu; Valencia, Alfonso; Verspoor, Karin; Wiegers, Thomas C.; Wu, Cathy H.; Wilbur, W. John

    2013-01-01

    A vast amount of scientific information is encoded in natural language text, and the quantity of such text has become so great that it is no longer economically feasible to have a human as the first step in the search process. Natural language processing and text mining tools have become essential to facilitate the search for and extraction of information from text. This has led to vigorous research efforts to create useful tools and to create humanly labeled text corpora, which can be used to improve such tools. To encourage combining these efforts into larger, more powerful and more capable systems, a common interchange format to represent, store and exchange the data in a simple manner between different language processing systems and text mining tools is highly desirable. Here we propose a simple extensible mark-up language format to share text documents and annotations. The proposed annotation approach allows a large number of different annotations to be represented including sentences, tokens, parts of speech, named entities such as genes or diseases and relationships between named entities. In addition, we provide simple code to hold this data, read it from and write it back to extensible mark-up language files and perform some sample processing. We also describe completed as well as ongoing work to apply the approach in several directions. Code and data are available at http://bioc.sourceforge.net/. Database URL: http://bioc.sourceforge.net/ PMID:24048470
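
    As a rough illustration of the kind of interchange the paper proposes, the following builds a tiny BioC-style XML collection with one document, one passage and one gene annotation. Element names follow the commonly described BioC layout but should be checked against the DTD at http://bioc.sourceforge.net/ before use.

```python
# Hedged sketch: building a tiny BioC-style XML document with the standard library.
# Element names (collection > document > passage > annotation) follow the BioC layout
# as commonly described; consult the published DTD before relying on them.
import xml.etree.ElementTree as ET

collection = ET.Element("collection")
ET.SubElement(collection, "source").text = "example"

doc = ET.SubElement(collection, "document")
ET.SubElement(doc, "id").text = "doc-1"

passage = ET.SubElement(doc, "passage")
ET.SubElement(passage, "offset").text = "0"
ET.SubElement(passage, "text").text = "BRCA1 is associated with breast cancer."

ann = ET.SubElement(passage, "annotation", {"id": "T1"})
ET.SubElement(ann, "infon", {"key": "type"}).text = "gene"
ET.SubElement(ann, "location", {"offset": "0", "length": "5"})
ET.SubElement(ann, "text").text = "BRCA1"

print(ET.tostring(collection, encoding="unicode"))
```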

  7. Evaluation of the efficiency of continuous wavelet transform as processing and preprocessing algorithm for resolution of overlapped signals in univariate and multivariate regression analyses; an application to ternary and quaternary mixtures

    NASA Astrophysics Data System (ADS)

    Hegazy, Maha A.; Lotfy, Hayam M.; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-07-01

    Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study on the efficiency of the continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and as a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS) was conducted. These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). The univariate CWT, in contrast, failed to simultaneously determine the quaternary mixture components and was able to determine only PAR and PAP, and the ternary mixtures of DRO, CAF, and PAR and of CAF, PAR, and PAP. During the calculations of CWT, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design and the absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and concentration matrices, and validation was performed by both cross validation and external validation sets. Both methods were successfully applied for the determination of the studied drugs in pharmaceutical formulations.
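
    A hedged sketch of the CWT-PLS idea follows: each spectrum is transformed with a continuous wavelet transform and the flattened coefficients are regressed against component concentrations with partial least squares. The wavelet family, scales and synthetic data are arbitrary choices for illustration, not the settings reported in the paper.

```python
# Hedged sketch of the CWT-PLS idea: transform each absorption spectrum with a
# continuous wavelet transform, then regress the flattened coefficients against
# concentrations with partial least squares. Wavelet, scales and data are illustrative.
import numpy as np
import pywt
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_points = 20, 200
spectra = rng.random((n_samples, n_points))      # stand-in for recorded spectra
concentrations = rng.random((n_samples, 4))      # four components (e.g. DRO/CAF/PAR/PAP)

scales = np.arange(1, 31)

def cwt_features(spectrum):
    coefs, _ = pywt.cwt(spectrum, scales, "mexh")   # 'mexh' chosen arbitrarily here
    return coefs.ravel()

X = np.array([cwt_features(s) for s in spectra])

pls = PLSRegression(n_components=5)
pls.fit(X, concentrations)
print("Predicted concentrations of first sample:", pls.predict(X[:1]).round(3))
```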

  8. Numerical and experimental studies on effects of moisture content on combustion characteristics of simulated municipal solid wastes in a fixed bed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Rui, E-mail: Sunsr@hit.edu.cn; Ismail, Tamer M., E-mail: temoil@aucegypt.edu; Ren, Xiaohan

    Highlights: • The effects of moisture content on the burning process of MSW are investigated. • A two-dimensional mathematical model was built to simulate the combustion process. • Temperature distributions, process rates, gas species were measured and simulated. • The conversion ratios of C/CO and N/NO in MSW are inverse to moisture content. - Abstract: In order to reveal the features of the combustion process in the porous bed of a waste incinerator, a two-dimensional unsteady state model and experimental study were employed to investigate the combustion process of municipal solid waste (MSW) in a fixed bed reactor. Conservation equations of the waste bed were implemented to describe the incineration process. The gas phase turbulence was modeled using the k–ε turbulent model and the particle phase was modeled using the kinetic theory of granular flow. The rates of moisture evaporation, devolatilization, and char burnout were calculated according to the waste property characteristics. The simulation results were then compared with experimental data for different moisture contents of MSW, which shows that the incineration process of waste in the fixed bed is reasonably simulated. The simulation results of solid temperature, gas species and process rate in the bed agree with the experimental data. Due to the high moisture content of the fuel, moisture evaporation consumes a vast amount of heat, and the evaporation takes up most of the combustion time (about 2/3 of the whole combustion process). The whole bed combustion process slows greatly as MSW moisture content increases. The experimental and simulation results provide direction for design and optimization of the fixed bed of MSW.

  9. Toward a Dynamic, Multidimensional Research Framework for Strategic Processing

    ERIC Educational Resources Information Center

    Dinsmore, Daniel L.

    2017-01-01

    While the empirical literature on strategic processing is vast, understanding how and why certain strategies work for certain learners is far from clear. The purpose of this review is to systematically examine the theoretical and empirical literature on strategic process to parse out current conceptual and methodological progress to inform new…

  10. A Big Data Approach to Analyzing Market Volatility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Kesheng; Bethel, E. Wes; Gu, Ming

    2013-06-05

    Understanding the microstructure of the financial market requires the processing of a vast amount of data related to individual trades, and sometimes even multiple levels of quotes. Analyzing such a large volume of data requires tremendous computing power that is not easily available to financial academics and regulators. Fortunately, publicly funded High Performance Computing (HPC) power is widely available at the National Laboratories in the US. In this paper we demonstrate that HPC resources and the techniques of data-intensive science can be used to greatly accelerate the computation of an early warning indicator called Volume-synchronized Probability of Informed Trading (VPIN). The test data used in this study contains five and a half years' worth of trading data for about 100 of the most liquid futures contracts, includes about 3 billion trades, and takes 140 GB as text files. By using (1) a more efficient file format for storing the trading records, (2) more effective data structures and algorithms, and (3) parallelized computations, we are able to explore 16,000 different ways of computing VPIN in less than 20 hours on a 32-core IBM DataPlex machine. Our test demonstrates that a modest computer is sufficient to monitor a vast number of trading activities in real time – an ability that could be valuable to regulators. Our test results also confirm that VPIN is a strong predictor of liquidity-induced volatility. With appropriate parameter choices, the false positive rates are about 7% averaged over all the futures contracts in the test data set. More specifically, when VPIN values rise above a threshold (CDF > 0.99), the volatility in the subsequent time windows is higher than the average in 93% of the cases.
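
    For readers unfamiliar with the indicator, the sketch below shows one common formulation of VPIN: trades are grouped into equal-volume buckets, each bucket's volume is split into buy and sell pressure by bulk volume classification, and VPIN is the mean order-flow imbalance over a rolling window of buckets. Parameters, the classification rule and the synthetic data are assumptions here, not the paper's optimized implementation.

```python
# Hedged sketch of a volume-synchronized PIN (VPIN) estimate under one common
# formulation; parameters and data are illustrative, not the paper's code.
import numpy as np
from scipy.stats import norm

def vpin(prices, volumes, bucket_volume, window=50):
    sigma = np.std(np.diff(prices)) or 1.0          # price-change scale for classification
    buckets = []                                    # (buy_volume, sell_volume) per full bucket
    buy = sell = filled = 0.0
    for i in range(1, len(prices)):
        buy_frac = norm.cdf((prices[i] - prices[i - 1]) / sigma)   # bulk volume classification
        remaining = volumes[i]
        while remaining > 0:
            take = min(remaining, bucket_volume - filled)
            buy += take * buy_frac
            sell += take * (1 - buy_frac)
            filled += take
            remaining -= take
            if filled >= bucket_volume:             # bucket complete: store and reset
                buckets.append((buy, sell))
                buy = sell = filled = 0.0
    if not buckets:
        return float("nan")
    imbalance = np.array([abs(b - s) for b, s in buckets[-window:]])
    return float(imbalance.mean() / bucket_volume)

rng = np.random.default_rng(1)
prices = 100 + np.cumsum(rng.normal(0, 0.1, 5000))
volumes = rng.integers(1, 100, 5000).astype(float)
print("VPIN estimate:", round(vpin(prices, volumes, bucket_volume=2000), 3))
```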

  11. Review of aragonite and calcite crystal morphogenesis in thermal spring systems

    NASA Astrophysics Data System (ADS)

    Jones, Brian

    2017-06-01

    Aragonite and calcite crystals are the fundamental building blocks of calcareous thermal spring deposits. The diverse array of crystal morphologies found in these deposits, which includes monocrystals, mesocrystals, skeletal crystals, dendrites, and spherulites, is commonly precipitated under far-from-equilibrium conditions. Such crystals form through both abiotic and biotic processes. Many crystals develop through non-classical crystal growth models that involve the arrangement of nanocrystals in a precisely controlled crystallographic register. Calcite crystal morphogenesis has commonly been linked to a "driving force", which is a conceptual measure of the distance of the growth conditions from equilibrium conditions. Essentially, this scheme indicates that increasing levels of supersaturation and various other parameters produce a progressive change from monocrystals and mesocrystals to skeletal crystals, to crystallographic and non-crystallographic dendrites, to dumbbells, to spherulites. Despite the vast amount of information available from laboratory experiments and natural spring systems, the precise factors that control the driving force are open to debate. The fact that calcite crystal morphogenesis is still poorly understood is largely a reflection of the complexity of the factors that influence aragonite and calcite precipitation. Available information indicates that variations in calcite crystal morphogenesis can be attributed to physical and chemical parameters of the parent water, the presence of impurities, the addition of organic or inorganic additives to the water, the rate of crystal growth, and/or the presence of microbes and their associated biofilms. The problems in trying to relate crystal morphogenesis to specific environmental parameters arise because it is generally impossible to disentangle the controlling factor(s) from the vast array of potential parameters that may act alone or in unison with each other.

  12. A vast amount of various invariant tori in the Nosé-Hoover oscillator.

    PubMed

    Wang, Lei; Yang, Xiao-Song

    2015-12-01

    This letter restudies the Nosé-Hoover oscillator. Some new averagely conservative regions are found, each of which is filled with different sequences of nested tori with various knot types. Especially, the dynamical behaviors near the border of "chaotic region" and conservative regions are studied showing that there exist more complicated and thinner invariant tori around the boundaries of conservative regions bounded by tori. Our results suggest an infinite number of island chains in a "chaotic sea" for the Nosé-Hoover oscillator.

  13. Open Source Software Tool Skyline Reaches Key Agreement with Mass Spectrometer Vendors | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The full proteomics analysis of a small tumor sample (similar in mass to a few grains of rice) produces well over 500 megabytes of unprocessed "raw" data when analyzed on a mass spectrometer (MS). Thus, for every proteomics experiment there is a vast amount of raw data that must be analyzed and interrogated in order to extract biological information. Moreover, the raw data output from different MS vendors are generally in different formats inhibiting the ability of labs to productively work together.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansen, Frank; Popp, Till; Wieczorek, Klaus

    The purposes of this paper are to review the vast amount of knowledge concerning crushed salt reconsolidation and its attendant hydraulic properties (i.e., its capability for fluid or gas transport) and to provide a sufficient basis to understand reconsolidation and healing rates under repository conditions. Topics covered include: deformation mechanisms and hydro-mechanical interactions during reconsolidation; the experimental database pertaining to crushed salt reconsolidation; transport properties of consolidating granulated salt, with quantitative substantiation of their evolution toward characteristics emulating undisturbed rock salt; and extension of microscopic and laboratory observations and data to the applicable field scale.

  15. Assessing Atmospheric Water Injection from Oceanic Impacts

    NASA Technical Reports Server (NTRS)

    Pierazzo, E.

    2005-01-01

    Collisions of asteroids and comets with the Earth's surface are rare events that punctuate the geologic record. Due to the vastness of Earth's oceans, oceanic impacts of asteroids or comets are expected to be about 4 times more frequent than land impacts. The resulting injections of oceanic water into the upper atmosphere can have important repercussions on Earth's climate and atmospheric circulation. However, the duration and overall effect of these large injections are still unconstrained. This work addresses atmospheric injections of large amounts of water in oceanic impacts.

  16. Army Research Office’s ARO in Review 2014.The Annual Historical Record of the Army Research Laboratory’s Army Research Office (ARO) Programs and Funding Activities

    DTIC Science & Technology

    2015-07-01

    TEM image of 1T-TaS2 showing CDW discommensuration network. (Main panel) Nonlinear resistivity and current slip at large bias of device shown in lower...the same species. As most pollen is generally dispersed by either wind or insects, the male plants must produce pollen in vast amounts (up to...for Massive and Messy Data • Professor Yuri Bazilevs, University of California - San Diego; Fluid-Structure Interaction Simulation of Gas Turbine

  17. Navy Construction Contract Regulations versus the Board of Contract Appeals.

    DTIC Science & Technology

    1987-12-01

    This is unfortunate because the amount of useful knowledge in the cases is vast, but the access to it is time-consuming. The selection of the cases is...A/E) was out of town. This figure cannot be used blindly. Other factors such as Government behavior, contractor behavior, or nature of the submittal...job. To complicate matters, the Contracting Officer may not be aware of the contractor's home office overhead behavior. The alternative is

  18. KSC-2013-3251

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – Inside the Payload Hazardous Servicing Facility at NASA's Kennedy Space Center in Florida, technicians prepare a thermal blanket for installation on the MAVEN spacecraft's parabolic high gain antenna. MAVEN stands for Mars Atmosphere and Volatile Evolution. The antenna will communicate vast amounts of data to Earth during the mission. MAVEN is being prepared inside the facility for its scheduled November launch aboard a United Launch Alliance Atlas V rocket to Mars. Positioned in an orbit above the Red Planet, MAVEN will study the upper atmosphere of Mars in unprecedented detail. Photo credit: NASA/Jim Grossmann

  19. KSC-2013-3255

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – Inside the Payload Hazardous Servicing Facility at NASA's Kennedy Space Center in Florida, technicians install a thermal blanket on the parabolic high gain antenna of the Mars Atmosphere and Volatile Evolution, or MAVEN spacecraft. The antenna will communicate vast amounts of data to Earth during the mission. MAVEN is being prepared inside the facility for its scheduled November launch aboard a United Launch Alliance Atlas V rocket to Mars. Positioned in an orbit above the Red Planet, MAVEN will study the upper atmosphere of Mars in unprecedented detail. Photo credit: NASA/Jim Grossmann

  20. KSC-2013-3252

    NASA Image and Video Library

    2013-08-09

    CAPE CANAVERAL, Fla. – Inside the Payload Hazardous Servicing Facility at NASA's Kennedy Space Center in Florida, technicians apply tape to the thermal blanket for the MAVEN spacecraft's parabolic high gain antenna. MAVEN stands for Mars Atmosphere and Volatile Evolution. The antenna will communicate vast amounts of data to Earth during the mission. MAVEN is being prepared inside the facility for its scheduled November launch aboard a United Launch Alliance Atlas V rocket to Mars. Positioned in an orbit above the Red Planet, MAVEN will study the upper atmosphere of Mars in unprecedented detail. Photo credit: NASA/Jim Grossmann

  1. An open-source job management framework for parameter-space exploration: OACIS

    NASA Astrophysics Data System (ADS)

    Murase, Y.; Uchitane, T.; Ito, N.

    2017-11-01

    We present an open-source software framework for parameter-space exploration, named OACIS, which is useful for managing vast numbers of simulation jobs and results in a systematic way. Recent developments in high-performance computers have enabled us to explore parameter spaces comprehensively; however, in such cases, manual management of the workflow is practically impossible. OACIS was developed with the aim of reducing the cost of these repetitive tasks when conducting simulations by automating job submissions and data management. In this article, an overview of OACIS as well as a getting-started guide are presented.

  2. A framework to explore the knowledge structure of multidisciplinary research fields.

    PubMed

    Uddin, Shahadat; Khan, Arif; Baur, Louise A

    2015-01-01

    Understanding emerging areas of a multidisciplinary research field is crucial for researchers, policymakers and other stakeholders. For them, a knowledge structure based on longitudinal bibliographic data can be an effective instrument. But with the vast amount of information available online, it is often hard to derive such a knowledge structure from the data. In this paper, we present a novel approach for retrieving online bibliographic data and propose a framework for exploring knowledge structure. We also present several longitudinal analyses to interpret and visualize the last 20 years of published obesity research data.

  3. Rehabilitation of landmine victims--the ultimate challenge.

    PubMed Central

    Walsh, Nicolas E.; Walsh, Wendy S.

    2003-01-01

    Antipersonnel landmines are often used indiscriminately and frequently result in injury or death of non-combatants. In the last 65 years, over 110 million mines have been spread throughout the world into an estimated 70 countries. Landmine victims use a disproportionately high amount of medical resources; the vast majority of incidents occur in regions and countries without a sophisticated medical infrastructure and with limited resources, where rehabilitation is difficult in the best of circumstances. It is suggested that only a quarter of the patients with amputation secondary to landmines receive appropriate care. PMID:14710508

  4. Current Status And Trends In Long Haul Fiber Optics Networks

    NASA Astrophysics Data System (ADS)

    Pyykkonen, Martin

    1986-01-01

    There have been many similar opinions expressed in recent months about there being an imminent bandwidth glut in the nation's long haul fiber optics network. These feelings are based largely on the vast magnitude of construction projects which are either in progress or completed by the major carriers, i.e., AT&T-Communications, MCI, NTN and US Sprint. Coupled with this advanced stage of construction and subsequent network operation, is the slowly developing demand for those applications which consume large amounts of bandwidth, namely those which are video-based.

  5. CAI System of Obunsha Co., Ltd. Using CD-ROM

    NASA Astrophysics Data System (ADS)

    Todokoro, Shigeru; Mukai, Yoshihiro

    This paper introduces the present status of R & D on CAI teaching materials at Obunsha Co., Ltd. The characteristics of CAI using CD-ROM, as well as the Culture-in CAI Teaching Materials System for junior high school English, are described. The system consists of the CD-ROM drive XM-2000 and Toshiba Corporation's Pasopia 700, combining the features of CD-ROM and FD. The CD-ROM stores vast amounts of voice data, while the FD stores text and graphics data. It is a frame-oriented system designed to raise learning effectiveness.

  6. A vast amount of various invariant tori in the Nosé-Hoover oscillator

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Yang, Xiao-Song

    2015-12-01

    This letter restudies the Nosé-Hoover oscillator. Some new averagely conservative regions are found, each of which is filled with different sequences of nested tori with various knot types. Especially, the dynamical behaviors near the border of "chaotic region" and conservative regions are studied showing that there exist more complicated and thinner invariant tori around the boundaries of conservative regions bounded by tori. Our results suggest an infinite number of island chains in a "chaotic sea" for the Nosé-Hoover oscillator.

  7. A vast amount of various invariant tori in the Nosé-Hoover oscillator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Lei; Department of Mathematics and Physics, Hefei University, Hefei 230601; Yang, Xiao-Song, E-mail: yangxs@hust.edu.cn

    2015-12-15

    This letter restudies the Nosé-Hoover oscillator. Some new averagely conservative regions are found, each of which is filled with different sequences of nested tori with various knot types. Especially, the dynamical behaviors near the border of “chaotic region” and conservative regions are studied showing that there exist more complicated and thinner invariant tori around the boundaries of conservative regions bounded by tori. Our results suggest an infinite number of island chains in a “chaotic sea” for the Nosé-Hoover oscillator.

  8. Convective and Stratiform Precipitation Processes and their Relationship to Latent Heating

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Lang, Steve; Zeng, Xiping; Shige, Shoichi; Takayabu, Yukari

    2009-01-01

    The global hydrological cycle is central to the Earth's climate system, with rainfall and the physics of its formation acting as the key links in the cycle. Two-thirds of global rainfall occurs in the Tropics. Associated with this rainfall is a vast amount of heat, which is known as latent heat. It arises mainly due to the phase change of water vapor condensing into liquid droplets; three-fourths of the total heat energy available to the Earth's atmosphere comes from tropical rainfall. In addition, fresh water provided by tropical rainfall and its variability exerts a large impact upon the structure and motions of the upper ocean layer. An improved convective-stratiform heating (CSH) algorithm has been developed to obtain the 3D structure of cloud heating over the Tropics based on two sources of information: 1) rainfall information, namely its amount and the fraction due to light rain intensity, observed directly from the Precipitation Radar (PR) on board the TRMM satellite, and 2) synthetic cloud physics information obtained from cloud-resolving model (CRM) simulations of cloud systems. The cloud simulations provide details on cloud processes, specifically latent heating, eddy heat flux convergence and radiative heating/cooling, that are not directly observable by satellite. The new CSH algorithm-derived heating has a noticeably different heating structure over both ocean and land regions compared to the previous CSH algorithm. One of the major differences between the new and old algorithms is that the level of maximum cloud heating occurs 1 to 1.5 km lower in the atmosphere in the new algorithm. This can affect the structure of the implied air currents associated with the general circulation of the atmosphere in the Tropics. The new CSH algorithm will be used to provide retrieved heating data to other heating algorithms to supplement their performance.

  9. Defining Tsunami Magnitude as Measure of Potential Impact

    NASA Astrophysics Data System (ADS)

    Titov, V. V.; Tang, L.

    2016-12-01

    The goal of tsunami forecasting, as a system for predicting the potential impact of a tsunami at coastlines, requires a quick estimate of a tsunami magnitude. This goal has been recognized since the beginning of tsunami research. The work of Kajiura, Soloviev, Abe, Murty, and many others discussed several scales for tsunami magnitude based on estimates of tsunami energy. However, estimating tsunami energy from available tsunami measurements at coastal sea-level stations has carried significant uncertainties and has been virtually impossible in real time, before a tsunami impacts coastlines. The slow process of tsunami magnitude estimation, including the collection of vast amounts of available coastal sea-level data from affected coastlines, made it impractical to use any tsunami magnitude scale in tsunami warning operations. Uncertainties of the estimates made tsunami magnitudes difficult to use as a universal scale for tsunami analysis. Historically, the earthquake magnitude has been used as a proxy for tsunami impact estimates, since real-time seismic data are available for real-time processing and an ample amount of seismic data is available for elaborate post-event analysis. This measure of tsunami impact carries significant uncertainties in quantitative tsunami impact estimates, since the relation between the earthquake and the generated tsunami energy varies from case to case. In this work, we argue that current tsunami measurement capabilities and real-time modeling tools allow for establishing a robust tsunami magnitude that will be useful for tsunami warning as a quick estimate of tsunami impact and for post-event analysis as a universal scale for tsunami inter-comparison. We present a method for estimating the tsunami magnitude based on tsunami energy and present an application of the magnitude analysis for several historical events for inter-comparison with existing methods.

  10. Using LiCSAR as a fast-response system for the detection and the monitoring of volcanic unrest

    NASA Astrophysics Data System (ADS)

    Albino, F.; Biggs, J.; Hatton, E. L.; Spaans, K.; Gaddes, M.; McDougall, A.

    2017-12-01

    Based on the Smithsonian Institution volcano database, a total of 13256 volcanoes exist on Earth, with 1273 having evidence of eruptive or unrest activity during the Holocene. InSAR techniques have proven their ability to detect and to quantify volcanic ground deformation on a case-by-case basis. However, the use of InSAR for the daily monitoring of every active volcano requires the development of automatic processing that can provide information within a couple of hours after a new radar acquisition. The LiCSAR system (http://comet.nerc.ac.uk/COMET-LiCS-portal/) answers this requirement by processing the vast amounts of data generated daily by the EU's Sentinel-1 satellite constellation. It now provides high-resolution deformation data for the entire Alpine-Himalayan seismic belt. The aim of our study is to extend the LiCSAR system to the purpose of volcano monitoring. For each active volcano, the latest Sentinel products calculated (phase, coherence and amplitude) will be available online in the COMET Volcano Deformation Database. To analyse this large amount of InSAR products, we develop an algorithm to automatically detect ground deformation signals, as well as changes in coherence and amplitude, in the time series. This toolbox could be a powerful fast-response system for helping volcanological observatories to manage new or ongoing volcanic crises. Important information regarding the spatial and the temporal evolution of each ground deformation signal will also be added to the COMET database. This will help to better understand the conditions in which volcanic unrest leads to an eruption. Such a worldwide survey enables us to establish a large catalogue of InSAR products, which will also be suitable for further studies (mapping of new lava flows, modelling of magmatic sources, evaluation of stress interactions).
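
    A deliberately simple version of the kind of per-pixel test such an automatic detector might apply is sketched below: fit a linear rate to each displacement time series and flag pixels whose rate exceeds a noise-based threshold. The threshold, the crude rate-uncertainty estimate and the synthetic data are assumptions; the operational LiCSAR algorithm is more sophisticated.

```python
# Hedged sketch of a per-pixel deformation test: fit a linear rate to each
# displacement time series and flag pixels whose rate exceeds a noise-based
# threshold. Threshold and uncertainty estimate are illustrative assumptions.
import numpy as np

def flag_deforming_pixels(times, displacements, k_sigma=3.0):
    """times: (n_epochs,) in days; displacements: (n_epochs, n_pixels) in mm."""
    rates, uncertainties = [], []
    for series in displacements.T:
        coeffs, residuals, *_ = np.polyfit(times, series, 1, full=True)
        rmse = np.sqrt(residuals[0] / times.size) if residuals.size else 0.0
        rates.append(coeffs[0])                                      # mm per day
        uncertainties.append(rmse / (times[-1] - times[0]) + 1e-9)   # crude rate error
    rates = np.array(rates)
    significant = np.abs(rates) > k_sigma * np.array(uncertainties)
    return significant, rates

# Synthetic example: one slowly uplifting pixel among ten noisy ones,
# sampled every 12 days (a Sentinel-1-like revisit) for a year.
t = np.arange(0.0, 365.0, 12.0)
d = np.random.default_rng(0).normal(0.0, 2.0, (t.size, 10))
d[:, 3] += 0.05 * t                                                  # 0.05 mm/day of uplift
mask, rates = flag_deforming_pixels(t, d)
print("Deforming pixels:", np.where(mask)[0], "rates (mm/day):", rates[mask].round(3))
```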

  11. Big Data Application in Biomedical Research and Health Care: A Literature Review.

    PubMed

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care.

  12. Big Data Application in Biomedical Research and Health Care: A Literature Review

    PubMed Central

    Luo, Jake; Wu, Min; Gopukumar, Deepika; Zhao, Yiqing

    2016-01-01

    Big data technologies are increasingly used for biomedical and health-care informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence data per day, and the application of electronic health records (EHRs) is documenting large amounts of patient data. The cost of acquiring and analyzing biomedical data is expected to decrease dramatically with the help of technology upgrades, such as the emergence of new sequencing machines, the development of novel hardware and software for parallel computing, and the extensive expansion of EHRs. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of health care. The application of big data in health care is a fast-growing field, with many new discoveries and methodologies published in the last five years. In this paper, we review and discuss big data application in four major biomedical subdisciplines: (1) bioinformatics, (2) clinical informatics, (3) imaging informatics, and (4) public health informatics. Specifically, in bioinformatics, high-throughput experiments facilitate the research of new genome-wide association studies of diseases, and with clinical informatics, the clinical field benefits from the vast amount of collected patient data for making intelligent decisions. Imaging informatics is now more rapidly integrated with cloud platforms to share medical image data and workflows, and public health informatics leverages big data techniques for predicting and monitoring infectious disease outbreaks, such as Ebola. In this paper, we review the recent progress and breakthroughs of big data applications in these health-care domains and summarize the challenges, gaps, and opportunities to improve and advance big data applications in health care. PMID:26843812

  13. Complex networks in confined comminution

    NASA Astrophysics Data System (ADS)

    Walker, David M.; Tordesillas, Antoinette; Einav, Itai; Small, Michael

    2011-08-01

    The physical process of confined comminution is investigated within the framework of complex networks. We first characterize the topology of the unweighted contact networks as generated by the confined comminution process. We find this process gives rise to an ultimate contact network which exhibits a scale-free degree distribution and small world properties. In particular, if viewed in the context of networks through which information travels along shortest paths, we find that the global average of the node vulnerability decreases as the comminution process continues, with individual node vulnerability correlating with grain size. A possible application to the design of synthetic networks (e.g., sensor networks) is highlighted. Next we turn our attention to the physics of the granular comminution process and examine force transmission with respect to the weighted contact networks, where each link is weighted by the inverse magnitude of the normal force acting at the associated contact. We find that the strong forces (i.e., force chains) are transmitted along pathways in the network which are mainly following shortest-path routing protocols, as typically found, for example, in communication systems. Motivated by our earlier studies of the building blocks for self-organization in dense granular systems, we also explore the properties of the minimal contact cycles. The distribution of the contact strain energy intensity of 4-cycle motifs in the ultimate state of the confined comminution process is shown to be consistent with a scale-free distribution with infinite variance, thereby suggesting that 4-cycle arrangements of grains are capable of storing vast amounts of energy in their contacts without breaking.
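
    Two of the network measures mentioned above can be computed directly with networkx, as sketched below on a toy scale-free graph standing in for a contact network: the degree distribution and a per-node vulnerability, taken here as the relative drop in global efficiency when the node is removed (one common definition; the paper's exact metric may differ).

```python
# Hedged sketch: degree distribution and a per-node "vulnerability" (relative
# loss of global efficiency on node removal) for a toy contact network.
import collections
import networkx as nx

G = nx.barabasi_albert_graph(200, 2, seed=1)      # toy stand-in for a grain contact network

# Degree distribution: a scale-free network shows a heavy tail here.
degree_counts = collections.Counter(d for _, d in G.degree())
print("degree -> count:", dict(sorted(degree_counts.items())))

E = nx.global_efficiency(G)

def vulnerability(node):
    """Relative loss of global efficiency when `node` is removed from the graph."""
    H = G.copy()
    H.remove_node(node)
    return (E - nx.global_efficiency(H)) / E

# Computing this for every node is expensive; sample a few nodes for illustration.
sample = list(G.nodes())[:20]
vuln = {n: vulnerability(n) for n in sample}
print("most vulnerable sampled node:", max(vuln, key=vuln.get))
```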

  14. Design and development of a medical big data processing system based on Hadoop.

    PubMed

    Yao, Qin; Tian, Yu; Li, Peng-Fei; Tian, Li-Li; Qian, Yang-Ming; Li, Jing-Song

    2015-03-01

    Secondary use of medical big data is increasingly popular in healthcare services and clinical research. Understanding the logic behind medical big data demonstrates tendencies in hospital information technology and shows great significance for hospital information systems that are designing and expanding services. Big data has four characteristics--Volume, Variety, Velocity and Value (the 4 Vs)--that make traditional systems incapable of processing these data using standalones. Apache Hadoop MapReduce is a promising software framework for developing applications that process vast amounts of data in parallel with large clusters of commodity hardware in a reliable, fault-tolerant manner. With the Hadoop framework and MapReduce application program interface (API), we can more easily develop our own MapReduce applications to run on a Hadoop framework that can scale up from a single node to thousands of machines. This paper investigates a practical case of a Hadoop-based medical big data processing system. We developed this system to intelligently process medical big data and uncover some features of hospital information system user behaviors. This paper studies user behaviors regarding various data produced by different hospital information systems for daily work. In this paper, we also built a five-node Hadoop cluster to execute distributed MapReduce algorithms. Our distributed algorithms show promise in facilitating efficient data processing with medical big data in healthcare services and clinical research compared with single nodes. Additionally, with medical big data analytics, we can design our hospital information systems to be much more intelligent and easier to use by making personalized recommendations.
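
    A minimal Hadoop Streaming-style job of the sort described, counting how often each hospital-information-system user performs each action, might look like the sketch below. The tab-separated input layout and field names are assumptions for illustration, not the authors' schema.

```python
# Hedged sketch of a Hadoop Streaming job: a mapper and a reducer that count how
# often each hospital-information-system user performs each action. The
# tab-separated input layout (user_id, system, action, timestamp) is assumed.
import sys

def mapper():
    """Emit one ("user|action", 1) pair per log record on stdin."""
    for line in sys.stdin:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 3:
            user_id, _system, action = fields[:3]
            print(f"{user_id}|{action}\t1")

def reducer():
    """Sum the counts for each key; Hadoop delivers keys to the reducer sorted."""
    current_key, count = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current_key and current_key is not None:
            print(f"{current_key}\t{count}")
            count = 0
        current_key = key
        count += int(value)
    if current_key is not None:
        print(f"{current_key}\t{count}")

if __name__ == "__main__":
    # Example invocation (paths and jar name depend on the cluster):
    #   hadoop jar hadoop-streaming.jar -input /his/logs -output /his/out \
    #       -mapper "python3 usage_count.py map" -reducer "python3 usage_count.py reduce"
    mapper() if sys.argv[-1] == "map" else reducer()
```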

  15. Implementation of Web Processing Services (WPS) over IPSL Earth System Grid Federation (ESGF) node

    NASA Astrophysics Data System (ADS)

    Kadygrov, Nikolay; Denvil, Sebastien; Carenton, Nicolas; Levavasseur, Guillaume; Hempelmann, Nils; Ehbrecht, Carsten

    2016-04-01

    The Earth System Grid Federation (ESGF) is aimed at providing access to climate data for the international climate community. ESGF is a system of distributed and federated nodes that dynamically interact with each other. An ESGF user may search and download climate data, geographically distributed over the world, from one common web interface and through a standardized API. With the continuous development of climate models and the beginning of the sixth phase of the Coupled Model Intercomparison Project (CMIP6), the amount of data available from ESGF will continuously increase during the next 5 years. IPSL holds a replica of the different global and regional climate model outputs, observations and reanalysis data (CMIP5, CORDEX, obs4MIPs, etc.) that are available on the IPSL ESGF node. In order to let scientists perform analysis of the models without downloading vast amounts of data, Web Processing Services (WPS) were installed at the IPSL compute node. The work is part of the CONVERGENCE project funded by the French National Research Agency (ANR). The PyWPS implementation of the Web Processing Service standard from the Open Geospatial Consortium (OGC), in the framework of the birdhouse software, is used. The processes can be run by a user remotely through a web-based WPS client or by using a command-line tool. All the calculations are performed on the server side, close to the data. If the models/observations are not available at IPSL, they will be downloaded and cached by a WPS process from the ESGF network using the synda tool. The outputs of the WPS processes are available for download as plots, tar archives or NetCDF files. We present the architecture of WPS at IPSL along with the processes for evaluation of model performance, on-site diagnostics and post-analysis processing of model output, e.g.:
    - regridding/interpolation/aggregation
    - ocgis (OpenClimateGIS) based polygon subsetting of the data
    - average seasonal cycle, multimodel mean, multimodel mean bias
    - calculation of climate indices with the icclim library (CERFACS)
    - atmospheric modes of variability
    In order to evaluate the performance of any new model, once it becomes available in ESGF, we implement a WPS with several model diagnostics and performance metrics calculated using ESMValTool (Eyring et al., GMDD 2015). As a further step, we are developing new WPS processes and core functions to be implemented at the IPSL ESGF compute node following the scientific community's needs.
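
    As a rough illustration of what a server-side process looks like in the PyWPS style used here, the sketch below defines a hypothetical process that returns the mean of a variable in a NetCDF file already present on the compute node. The identifier, inputs and the trivial computation are placeholders; the actual IPSL processes (subsetting, climate indices, ESMValTool metrics) are considerably more elaborate.

```python
# Hedged sketch of a server-side process in the PyWPS style: a hypothetical
# "field_mean" process returning the mean of one variable in a NetCDF file
# already sitting on the compute node. Identifier, inputs and computation are
# placeholders, not one of the IPSL processes.
from pywps import Process, LiteralInput, LiteralOutput

class FieldMean(Process):
    def __init__(self):
        inputs = [
            LiteralInput("dataset", "Path of a NetCDF file on the node", data_type="string"),
            LiteralInput("variable", "Variable name", data_type="string"),
        ]
        outputs = [LiteralOutput("mean", "Mean value of the field", data_type="float")]
        super().__init__(
            self._handler,
            identifier="field_mean",
            title="Mean of a model field",
            version="0.1",
            inputs=inputs,
            outputs=outputs,
        )

    def _handler(self, request, response):
        import netCDF4                                  # read the file close to the data
        path = request.inputs["dataset"][0].data
        variable = request.inputs["variable"][0].data
        with netCDF4.Dataset(path) as nc:
            response.outputs["mean"].data = float(nc.variables[variable][:].mean())
        return response
```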

  16. Variable-Volume Flushing (V-VF) device for water conservation in toilets

    NASA Technical Reports Server (NTRS)

    Jasper, Louis J., Jr.

    1993-01-01

    Thirty five percent of residential indoor water used is flushed down the toilet. Five out of six flushes are for liquid waste only, which requires only a fraction of the water needed for solid waste. Designers of current low-flush toilets (3.5-gal. flush) and ultra-low-flush toilets (1.5-gal. flush) did not consider the vastly reduced amount of water needed to flush liquid waste versus solid waste. Consequently, these toilets are less practical than desired and can be improved upon for water conservation. This paper describes a variable-volume flushing (V-VF) device that is more reliable than the currently used flushing devices (it will not leak), is simple, more economical, and more water conserving (allowing one to choose the amount of water to use for flushing solid and liquid waste).

  17. An emerging role: the nurse content curator.

    PubMed

    Brooks, Beth A

    2015-01-01

    A new phenomenon, the inverted or "flipped" classroom, assumes that students are no longer acquiring knowledge exclusively through textbooks or lectures. Instead, they are seeking out the vast amount of free information available to them online (the very essence of open source) to supplement learning gleaned in textbooks and lectures. With so much open-source content available to nursing faculty, it benefits the faculty to use readily available, technologically advanced content. The nurse content curator supports nursing faculty in its use of such content. Even more importantly, the highly paid, time-strapped faculty is not spending an inordinate amount of effort surfing for and evaluating content. The nurse content curator does that work, while the faculty uses its time more effectively to help students vet the truth, make meaning of the content, and learn to problem-solve. Brooks. © 2014 Wiley Periodicals, Inc.

  18. Deciphering Dynamical Patterns of Growth Processes

    ERIC Educational Resources Information Center

    Kolakowska, A.

    2009-01-01

    Large systems of statistical physics often display properties that are independent of particulars that characterize their microscopic components. Universal dynamical patterns are manifested by the presence of scaling laws, which provides a common insight into governing physics of processes as vastly diverse as, e.g., growth of geological…

  19. Three Case Studies on Business Collaboration and Process Management

    ERIC Educational Resources Information Center

    Fan, Shaokun

    2012-01-01

    The importance of collaboration has been recognized for more than 2000 years. While recent improvement in technology creates vast opportunities for collaboration, effective collaboration remains challenging as ad hoc teams work across time, geographical, language, and technical boundaries, and suffer from process inefficiency. My dissertation…

  20. Components of Conceptual Ecologies

    ERIC Educational Resources Information Center

    Park, Hyun Ju

    2007-01-01

    The theory of conceptual change is criticized because it focuses only on supposed underlying logical structures and rational processes, and lacks attention to affective aspects as well as motivational constructs in students' learning of science. This is a vast underestimation of the complexity and diversity of one's change of conceptions. The…

  1. Impact of anthropogenic heat release on regional climate in three vast urban agglomerations in China

    NASA Astrophysics Data System (ADS)

    Feng, Jinming; Wang, Jun; Yan, Zhongwei

    2014-03-01

    We simulated the impact of anthropogenic heat release (AHR) on the regional climate in three vast city agglomerations in China using the Weather Research and Forecasting model with nested high-resolution modeling. Based on energy consumption and high-quality land use data, we designed two scenarios to represent no-AHR and current-AHR conditions. By comparing the results of the two numerical experiments, changes of surface air temperature and precipitation due to AHR were quantified and analyzed. We concluded that AHR increases the temperature in these urbanized areas by about 0.5°C—1°C, and this increase is more pronounced in winter than in other seasons. The inclusion of AHR enhances the convergence of water vapor over urbanized areas. Together with the warming of the lower troposphere and the enhancement of ascending motions caused by AHR, the average convective available potential energy in urbanized areas is increased. Rainfall amounts in summer over urbanized areas are likely to increase and regional precipitation patterns to be altered to some extent.

  2. Pain as metaphor: metaphor and medicine

    PubMed Central

    Neilson, Shane

    2016-01-01

    Like many other disciplines, medicine often resorts to metaphor in order to explain complicated concepts that are imperfectly understood. But what happens when medicine's metaphors close off thinking, restricting interpretations and opinions to those of the negative kind? This paper considers the deleterious effects of destructive metaphors that cluster around pain. First, the metaphoric basis of all knowledge is introduced. Next, a particular subset of medical metaphors in the domain of neurology (doors/keys/wires) are shown to encourage mechanistic thinking. Because schematics are often used in medical textbooks to simplify the complex, this paper traces the visual metaphors implied in such schematics. Mechanistic-metaphorical thinking results in the accumulation of vast amounts of data through experimentation, but this paper asks what the real value of the information is since patients can generally only expect modest benefits – or none at all – for relief from chronic pain conditions. Elucidation of mechanism through careful experimentation creates an illusion of vast medical knowledge that, to a significant degree, is metaphor-based. This paper argues that for pain outcomes to change, our metaphors must change first. PMID:26253331

  3. A Review of Extra-Terrestrial Mining Robot Concepts

    NASA Technical Reports Server (NTRS)

    Mueller, Robert P.; Van Susante, Paul J.

    2011-01-01

    Outer space contains a vast amount of resources that offer virtually unlimited wealth to the humans that can access and use them for commercial purposes. One of the key technologies for harvesting these resources is robotic mining of regolith, minerals, ices and metals. The harsh environment and vast distances create challenges that are handled best by robotic machines working in collaboration with human explorers. Humans will benefit from the resources that will be mined by robots. They will visit outposts and mining camps as required for exploration, commerce and scientific research, but a continuous presence is most likely to be provided by robotic mining machines that are remotely controlled by humans. There have been a variety of extra-terrestrial robotic mining concepts proposed over the last 100 years and this paper will attempt to summarize and review concepts in the public domain (government, industry and academia) to serve as an informational resource for future mining robot developers and operators. The challenges associated with these concepts will be discussed and feasibility will be assessed. Future needs associated with commercial efforts will also be investigated.

  4. A Review of Extra-Terrestrial Mining Concepts

    NASA Technical Reports Server (NTRS)

    Mueller, R. P.; van Susante, P. J.

    2012-01-01

    Outer space contains a vast amount of resources that offer virtually unlimited wealth to the humans that can access and use them for commercial purposes. One of the key technologies for harvesting these resources is robotic mining of regolith, minerals, ices and metals. The harsh environment and vast distances create challenges that are handled best by robotic machines working in collaboration with human explorers. Humans will benefit from the resources that will be mined by robots. They will visit outposts and mining camps as required for exploration, commerce and scientific research, but a continuous presence is most likely to be provided by robotic mining machines that are remotely controlled by humans. There have been a variety of extra-terrestrial robotic mining concepts proposed over the last 40 years and this paper will attempt to summarize and review concepts in the public domain (government, industry and academia) to serve as an informational resource for future mining robot developers and operators. The challenges associated with these concepts will be discussed and feasibility will be assessed. Future needs associated with commercial efforts will also be investigated.

  5. Optimizing Data Management in Grid Environments

    NASA Astrophysics Data System (ADS)

    Zissimos, Antonis; Doka, Katerina; Chazapis, Antony; Tsoumakos, Dimitrios; Koziris, Nectarios

    Grids currently serve as platforms for numerous scientific as well as business applications that generate and access vast amounts of data. In this paper, we address the need for efficient, scalable and robust data management in Grid environments. We propose a fully decentralized and adaptive mechanism comprising two components: a Distributed Replica Location Service (DRLS) and a data transfer mechanism called GridTorrent. They both adopt Peer-to-Peer techniques in order to overcome performance bottlenecks and single points of failure. On one hand, DRLS ensures resilience by relying on a Byzantine-tolerant protocol and is able to handle massive concurrent requests even during node churn. On the other hand, GridTorrent allows for maximum bandwidth utilization through collaborative sharing among the various data providers and consumers. The proposed integrated architecture is completely backwards-compatible with already deployed Grids. To demonstrate these points, experiments have been conducted in LAN as well as WAN environments under various workloads. The evaluation shows that our scheme vastly outperforms the conventional mechanisms in both efficiency (up to 10 times faster) and robustness in case of failures and flash crowd instances.
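
    A minimal sketch of the decentralized-lookup idea behind a replica location service, using consistent hashing in Python. This illustrates the general Peer-to-Peer technique only; it is not the DRLS or GridTorrent protocol described above, and the node names and logical file name are hypothetical.

    # Minimal sketch of decentralized replica lookup via consistent hashing.
    # Illustrative only: not the DRLS/GridTorrent protocol; node names are hypothetical.
    import bisect
    import hashlib

    def _hash(key: str) -> int:
        return int(hashlib.sha1(key.encode()).hexdigest(), 16)

    class ReplicaRing:
        """Maps logical file names to the peers that hold replica location records."""
        def __init__(self, nodes, replicas=2):
            self.replicas = replicas
            self._ring = sorted((_hash(n), n) for n in nodes)
            self._keys = [h for h, _ in self._ring]

        def locate(self, logical_name: str):
            """Return the peers responsible for this logical file name."""
            start = bisect.bisect(self._keys, _hash(logical_name)) % len(self._ring)
            return [self._ring[(start + i) % len(self._ring)][1]
                    for i in range(self.replicas)]

    ring = ReplicaRing(["grid-node-a", "grid-node-b", "grid-node-c", "grid-node-d"])
    print(ring.locate("lfn://experiment/run42/data.root"))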

  6. Nonverbal Behavior and the Communication Process.

    ERIC Educational Resources Information Center

    Duke, Charles R.

    The effect of nonverbal behavior on communication is apparent, but educators are left with the question of how an awareness of nonverbal behavior can fit into the classroom. In fact the average classroom offers a vast supply of information about nonverbal communication that remains relatively untouched in scientific studies. The processes of…

  7. Sentinel-1 Archive and Processing in the Cloud using the Hybrid Pluggable Processing Pipeline (HyP3) at the ASF DAAC

    NASA Astrophysics Data System (ADS)

    Arko, S. A.; Hogenson, R.; Geiger, A.; Herrmann, J.; Buechler, B.; Hogenson, K.

    2016-12-01

    In the coming years there will be an unprecedented amount of SAR data available on a free and open basis to research and operational users around the globe. The Alaska Satellite Facility (ASF) DAAC hosts, through an international agreement, data from the Sentinel-1 spacecraft and will be hosting data from the upcoming NASA-ISRO SAR (NISAR) mission. To more effectively manage and exploit these vast datasets, ASF DAAC has begun moving portions of the archive to the cloud and utilizing cloud services to provide higher-level processing on the data. The Hybrid Pluggable Processing Pipeline (HyP3) project is designed to support higher-level data processing in the cloud and extend the capabilities of researchers to larger scales. Built upon a set of core Amazon cloud services, the HyP3 system allows users to request data processing using a number of canned algorithms or their own algorithms once they have been uploaded to the cloud. The HyP3 system automatically accesses the ASF cloud-based archive through the DAAC RESTful application programming interface and processes the data on Amazon's Elastic Compute Cloud (EC2). Final products are distributed through Amazon's Simple Storage Service (S3) and are available for user download. This presentation will provide an overview of ASF DAAC's activities moving the Sentinel-1 archive into the cloud and developing the integrated HyP3 system, covering both the benefits and difficulties of working in the cloud. Additionally, we will focus on the utilization of HyP3 for higher-level processing of SAR data. Two example algorithms, for sea-ice tracking and change detection, will be discussed as well as the mechanism for integrating new algorithms into the pipeline for community use.
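
    A hedged sketch, in Python with boto3, of the staging pattern described above: list cloud-hosted granules, pull one down, and push a derived product back to object storage. The bucket and key names are invented for illustration; the real HyP3 system drives processing through the ASF DAAC RESTful API and its own job machinery rather than through direct S3 access like this.

    # Hypothetical sketch of staging a cloud-hosted SAR granule for processing.
    # Bucket and key names are invented; this is not the HyP3 API.
    import boto3

    s3 = boto3.client("s3")
    BUCKET = "example-sar-archive"          # hypothetical bucket
    PREFIX = "sentinel-1/GRD/2016/"         # hypothetical key layout

    # List available granules under a prefix.
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=10)
    granules = [obj["Key"] for obj in resp.get("Contents", [])]

    # Stage one granule locally, run a (placeholder) algorithm, push the product back.
    if granules:
        key = granules[0]
        s3.download_file(BUCKET, key, "/tmp/granule.zip")
        product = "/tmp/granule_processed.tif"   # produced by the user's own algorithm
        s3.upload_file(product, BUCKET, "products/" + key.rsplit("/", 1)[-1] + ".tif")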

  8. Expectations of Sinus Surgery

    MedlinePlus

    ... too fast, you may risk slowing the healing process. With time and good post-surgery care, the vast majority of patients experience significant long-term improvement! For information about possible risks and complications of ...

  9. Gas Atomization of Molten Metal: Part II. Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abu-Lebdeh, Taher M.; Leon, Genaro Perez-de; Hamoush, Sameer A.

    A numerical model was derived to obtain results for two alloys during the Gas Atomization (GA) method. The model equations and governing equations were implemented through the application of Part I data. Aspects such as heat transfer, fluid mechanics, thermodynamics and the laws of motion were taken into account in formulating equations that consider gas dynamics, droplet dynamics and energy balance or conservation. The inputs of the model include processing parameters such as the size of the droplets, characteristics of the metal alloy, initial temperature of the molten metal, properties and fractions of the atomization gas, and the gas pressure. The outputs include velocity and thermal profiles of the droplet and gas. Velocity profiles illustrate the velocity of both droplet and gas, while thermal profiles illustrate the cooling rate and the rate of temperature change of the droplets. The alloys are gamma-Titanium Aluminide (γ-TiAl) and Al-3003-O. These alloys were selected due to the vast range of applications both can have in several industries. Certain processing parameters were held constant, while others were altered. The main focus of this study was to gain insight into which optimal parameters should be utilized within the GA method for these alloys and to provide insight into the behavior of these alloys.
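
    The following Python sketch illustrates the kind of coupled momentum/energy balance the abstract describes, under strong simplifying assumptions: a spherical droplet, a constant drag coefficient, and lumped-capacitance (Newtonian) cooling. All property values are illustrative placeholders, not the paper's inputs or results.

    # Droplet dynamics/cooling sketch: spherical droplet, constant drag coefficient,
    # lumped-capacitance cooling. All values below are illustrative assumptions.
    import math

    d = 50e-6          # droplet diameter, m
    rho_d = 3900.0     # droplet density, kg/m^3 (illustrative gamma-TiAl value)
    cp_d = 700.0       # droplet specific heat, J/(kg K)
    rho_g = 1.2        # gas density, kg/m^3
    Cd = 0.44          # constant drag coefficient (assumption)
    h = 5.0e3          # convective heat-transfer coefficient, W/(m^2 K) (assumption)

    v_g, T_g = 300.0, 400.0      # gas velocity (m/s) and temperature (K)
    v_d, T_d = 0.0, 1800.0       # initial droplet velocity and temperature

    area = math.pi * d**2 / 4.0          # frontal area
    surface = math.pi * d**2             # surface area
    mass = rho_d * math.pi * d**3 / 6.0  # droplet mass

    dt = 1e-6
    for step in range(20000):
        rel = v_g - v_d
        drag = 0.5 * rho_g * Cd * area * rel * abs(rel)           # momentum balance
        v_d += dt * drag / mass
        T_d += dt * (-h * surface * (T_d - T_g)) / (mass * cp_d)  # energy balance
        if step % 5000 == 0:
            print(f"t={step*dt*1e3:.1f} ms  v_d={v_d:7.2f} m/s  T_d={T_d:7.1f} K")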

  10. The Waterviz: The Confluence of Science, Art and Music Illuminates Pattern and Process in Water Cycle Data

    NASA Astrophysics Data System (ADS)

    Rustad, L.; Martin, M.; Cortada, X.; Quinn, M.; Garlick, S.; Casey, M.; Green, M. B.

    2017-12-01

    The WaterViz for Hubbard Brook is a new online tool for creatively communicating water cycle science to a broad audience with real-time hydrologic data. Interfacing between the hydrologic sciences, visual arts, music, education, and graphic design, the WaterViz for Hubbard Brook builds on a new generation of digital environmental sensors and wireless communication devices that are revolutionizing how scientists 'see' the natural world. In a nutshell, hydrologic data are captured from small first-order catchments at the Hubbard Brook Experimental Forest, NH using an array of environmental sensors. These data are transmitted to the internet in real time and are used to drive a computer model that calculates all components of the water cycle for the catchment in real time. These data, in turn, drive an artistic simulation (delivered as a flash animation) and musical sonification (delivered via an internet radio station) of the water cycle, accurately reflecting the hydrologic processes occurring at that moment in time. The WaterViz for Hubbard Brook provides a unique and novel approach to interactively and intuitively engage the viewer with vast amounts of data and information on water cycle science. The WaterViz for Hubbard Brook is available at: https://waterviz.org.

  11. Using the Convergent Cross Mapping method to test causality between Arctic Oscillation / North Atlantic Oscillation and Atlantic Multidecadal Oscillation

    NASA Astrophysics Data System (ADS)

    Gutowska, Dorota; Piskozub, Jacek

    2017-04-01

    There is a vast body of literature on climate indices and the processes they represent. A large part of it deals with "teleconnections", or causal relations between them. Until recently, however, time-lagged correlation was the best available tool for studying causation, and no correlation (even a lagged one) proves causation. We use a recently developed method for studying causal relations between short time series, Convergent Cross Mapping (CCM), to search for causation between the atmospheric (AO and NAO) and oceanic (AMO) indices. The version we have chosen (available as the R language package rEDM) allows time series to be compared with time lags. This work builds on a previous one, which showed with time-lagged correlations that AO/NAO precedes AMO by about 15 years and is at the same time preceded by AMO (but with an inverted sign) by the same amount of time. This behaviour is identical to the relationship between a sine and a cosine with the same period, which may suggest that the multidecadal oscillatory parts of the atmospheric and oceanic indices represent the same global-scale set of processes; in other words, they may be symptoms of the same oscillation. The aim of the present study is to test this hypothesis with a tool created specifically for discovering causal relationships in dynamic systems.
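
    For illustration, a bare-bones Convergent Cross Mapping sketch in NumPy (time-delay embedding plus simplex-style cross-mapping). This is not the rEDM implementation used in the study, and the toy coupled series are invented; it only shows the mechanics of estimating cross-map skill.

    # Minimal CCM sketch: time-delay embedding + nearest-neighbour cross-mapping.
    # Illustrative only; not the rEDM implementation used in the study.
    import numpy as np

    def embed(x, E, tau):
        """Time-delay embedding of a 1-D series."""
        n = len(x) - (E - 1) * tau
        return np.column_stack([x[i * tau : i * tau + n] for i in range(E)])

    def ccm_skill(x, y, E=3, tau=1):
        """Cross-map skill: how well the shadow manifold of x predicts y."""
        Mx = embed(x, E, tau)
        y_t = y[(E - 1) * tau :]
        preds = np.empty_like(y_t)
        for i, point in enumerate(Mx):
            dist = np.linalg.norm(Mx - point, axis=1)
            dist[i] = np.inf                       # exclude the point itself
            nn = np.argsort(dist)[: E + 1]         # E+1 nearest neighbours
            w = np.exp(-dist[nn] / max(dist[nn][0], 1e-12))
            preds[i] = np.sum(w * y_t[nn]) / np.sum(w)
        return np.corrcoef(preds, y_t)[0, 1]

    # Toy coupled system: x is a lagged, noisy copy of y.
    rng = np.random.default_rng(0)
    y = np.sin(np.linspace(0, 20 * np.pi, 600)) + 0.1 * rng.standard_normal(600)
    x = np.roll(y, 5) + 0.1 * rng.standard_normal(600)
    print("x cross-maps y:", round(ccm_skill(x, y), 3))
    print("y cross-maps x:", round(ccm_skill(y, x), 3))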

  12. An optimal baseline selection methodology for data-driven damage detection and temperature compensation in acousto-ultrasonics

    NASA Astrophysics Data System (ADS)

    Torres-Arredondo, M.-A.; Sierra-Pérez, Julián; Cabanes, Guénaël

    2016-05-01

    The process of measuring and analysing the data from a distributed sensor network all over a structural system in order to quantify its condition is known as structural health monitoring (SHM). For the design of a trustworthy health monitoring system, a vast amount of information regarding the inherent physical characteristics of the sources and their propagation and interaction across the structure is crucial. Moreover, any SHM system which is expected to transition to field operation must take into account the influence of environmental and operational changes which cause modifications in the stiffness and damping of the structure and consequently modify its dynamic behaviour. On that account, special attention is paid in this paper to the development of an efficient SHM methodology where robust signal processing and pattern recognition techniques are integrated for the correct interpretation of complex ultrasonic waves within the context of damage detection and identification. The methodology is based on an acousto-ultrasonics technique where the discrete wavelet transform is evaluated for feature extraction and selection, linear principal component analysis for data-driven modelling and self-organising maps for a two-level clustering under the principle of local density. At the end, the methodology is experimentally demonstrated and results show that all the damages were detectable and identifiable.
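
    A minimal Python sketch of the processing chain outlined above, under simplifying assumptions: PyWavelets for the discrete wavelet transform, energy per decomposition band as features, scikit-learn PCA for data-driven modelling, and k-means as a simple stand-in for the self-organising map clustering stage. The ultrasonic records are synthetic.

    # Feature extraction + data-driven modelling sketch (synthetic signals).
    # k-means stands in for the SOM clustering stage described in the abstract.
    import numpy as np
    import pywt
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)

    def wavelet_features(signal, wavelet="db4", level=4):
        """Energy of each wavelet decomposition band as a feature vector."""
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        return np.array([np.sum(c ** 2) for c in coeffs])

    # Synthetic "pristine" vs "damaged" ultrasonic records (illustrative only).
    t = np.linspace(0, 1e-3, 2048)
    pristine = [np.sin(2e5 * t) + 0.05 * rng.standard_normal(t.size) for _ in range(30)]
    damaged = [np.sin(2e5 * t) * np.exp(-3e3 * t) + 0.05 * rng.standard_normal(t.size)
               for _ in range(30)]

    X = np.array([wavelet_features(s) for s in pristine + damaged])

    # Data-driven model: linear PCA followed by clustering of the scores.
    scores = PCA(n_components=2).fit_transform(X)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
    print(labels)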

  13. An Adaptive Insertion and Promotion Policy for Partitioned Shared Caches

    NASA Astrophysics Data System (ADS)

    Mahrom, Norfadila; Liebelt, Michael; Raof, Rafikha Aliana A.; Daud, Shuhaizar; Hafizah Ghazali, Nur

    2018-03-01

    Cache replacement policies in chip multiprocessors (CMP) have been investigated extensively and proven able to enhance shared cache management. However, competition among multiple processors executing different threads that require simultaneous access to a shared memory may cause cache contention and memory coherence problems on the chip. These issues also stem from drawbacks of the commonly used Least Recently Used (LRU) policy employed in multiprocessor systems, which can leave cache lines resident in the cache longer than required. In image-processing analysis of, for example, extrapulmonary tuberculosis (TB), an accurate diagnosis of the tissue specimen is required. Therefore, a fast and reliable shared memory management system is needed to execute algorithms that process vast amounts of specimen images. In this paper, the effects of the cache replacement policy in a partitioned shared cache are investigated. The goal is to quantify whether better performance can be achieved by using less complex replacement strategies. This paper proposes a Middle Insertion 2 Positions Promotion (MI2PP) policy to eliminate cache misses that could adversely affect the access patterns and the throughput of the processors in the system. The policy employs a static predefined insertion point, near distance promotion, and the concept of ownership in the eviction policy to effectively reduce cache thrashing and to avoid resource stealing among the processors.
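
    A toy, single-set recency-stack simulation illustrating one plausible reading of a "middle insertion, promote two positions on hit" policy in Python. It is not the authors' MI2PP implementation (which also involves ownership-aware eviction in a partitioned cache); it only shows how the insertion point and promotion distance shape the stack.

    # Toy single-set cache: middle insertion on miss, promote two positions on hit.
    # Illustrative reading of such a policy, not the authors' implementation.
    class MiddleInsertPromoteCache:
        def __init__(self, ways=8):
            self.ways = ways
            self.stack = []                 # index 0 = most-recently-used end

        def access(self, line):
            if line in self.stack:          # hit: promote two positions toward MRU
                i = self.stack.index(line)
                self.stack.insert(max(i - 2, 0), self.stack.pop(i))
                return True
            if len(self.stack) == self.ways:  # miss: evict from the LRU end
                self.stack.pop()
            self.stack.insert(len(self.stack) // 2, line)  # middle insertion
            return False

    cache = MiddleInsertPromoteCache(ways=4)
    for addr in ["A", "B", "C", "A", "D", "E", "A", "F"]:
        hit = cache.access(addr)
        print(f"{addr}: {'hit ' if hit else 'miss'}  stack={cache.stack}")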

  14. Evaluating the influence of motor control on selective attention through a stochastic model: the paradigm of motor control dysfunction in cerebellar patient.

    PubMed

    Veneri, Giacomo; Federico, Antonio; Rufa, Alessandra

    2014-01-01

    Attention allows us to selectively process the vast amount of information with which we are confronted, prioritizing some aspects of information and ignoring others by focusing on a certain location or aspect of the visual scene. Selective attention is guided by two cognitive mechanisms: saliency of the image (bottom up) and endogenous mechanisms (top down). These two mechanisms interact to direct attention and plan eye movements; then, the movement profile is sent to the motor system, which must constantly update the command needed to produce the desired eye movement. A new approach is described here to study how the eye motor control could influence this selection mechanism in clinical behavior: two groups of patients (SCA2 and late onset cerebellar ataxia LOCA) with well-known problems of motor control were studied; patients performed a cognitively demanding task; the results were compared to a stochastic model based on Monte Carlo simulations and a group of healthy subjects. The analytical procedure evaluated some energy functions for understanding the process. The implemented model suggested that patients performed an optimal visual search, reducing intrinsic noise sources. Our findings theorize a strict correlation between the "optimal motor system" and the "optimal stimulus encoders."

  15. Gas Atomization of Molten Metal: Part II. Applications

    DOE PAGES

    Abu-Lebdeh, Taher M.; Leon, Genaro Perez-de; Hamoush, Sameer A.; ...

    2016-02-01

    A numerical model was derived to obtain results for two alloys during the Gas Atomization (GA) method. The model equations and governing equations were implemented through the application of Part I data. Aspects such as heat transfer, fluid mechanics, thermodynamics and the laws of motion were taken into account in formulating equations that consider gas dynamics, droplet dynamics and energy balance or conservation. The inputs of the model include processing parameters such as the size of the droplets, characteristics of the metal alloy, initial temperature of the molten metal, properties and fractions of the atomization gas, and the gas pressure. The outputs include velocity and thermal profiles of the droplet and gas. Velocity profiles illustrate the velocity of both droplet and gas, while thermal profiles illustrate the cooling rate and the rate of temperature change of the droplets. The alloys are gamma-Titanium Aluminide (γ-TiAl) and Al-3003-O. These alloys were selected due to the vast range of applications both can have in several industries. Certain processing parameters were held constant, while others were altered. The main focus of this study was to gain insight into which optimal parameters should be utilized within the GA method for these alloys and to provide insight into the behavior of these alloys.

  16. CPSF30 at the Interface of Alternative Polyadenylation and Cellular Signaling in Plants

    PubMed Central

    Chakrabarti, Manohar; Hunt, Arthur G.

    2015-01-01

    Post-transcriptional processing, involving cleavage of precursor messenger RNA (pre-mRNA) and subsequent addition of a poly(A) tail at the 3' end, is a key step in the expression of genetic information. Alternative polyadenylation (APA) serves as an important checkpoint for the regulation of gene expression. Recent studies have shown widespread prevalence of APA in diverse systems. A considerable amount of research has been done in characterizing different subunits of the so-called Cleavage and Polyadenylation Specificity Factor (CPSF). In plants, CPSF30, an ortholog of the 30 kD subunit of mammalian CPSF, is a key polyadenylation factor. CPSF30 in the model plant Arabidopsis thaliana was reported to possess unique biochemical properties. It was also demonstrated that poly(A) site choice in a vast majority of genes in Arabidopsis is CPSF30-dependent, suggesting a pivotal role of this gene in APA and subsequent regulation of gene expression. There are also indications of this gene being involved in oxidative stress and defense responses and in cellular signaling, suggesting a role of CPSF30 in connecting physiological processes and APA. This review will summarize the biochemical features of CPSF30, its role in regulating APA, and possible links with cellular signaling and stress response modules. PMID:26061761

  17. The Person-Event Data Environment: leveraging big data for studies of psychological strengths in soldiers

    PubMed Central

    Vie, Loryana L.; Griffith, Kevin N.; Scheier, Lawrence M.; Lester, Paul B.; Seligman, Martin E. P.

    2013-01-01

    The Department of Defense (DoD) strives to efficiently manage the large volumes of administrative data collected and repurpose this information for research and analyses with policy implications. This need is especially present in the United States Army, which maintains numerous electronic databases with information on more than one million Active-Duty, Reserve, and National Guard soldiers, their family members, and Army civilian employees. The accumulation of vast amounts of digitized health, military service, and demographic data thus approaches, and may even exceed, traditional benchmarks for Big Data. Given the challenges of disseminating sensitive personal and health information, the Person-Event Data Environment (PDE) was created to unify disparate Army and DoD databases in a secure cloud-based enclave. This electronic repository serves the ultimate goal of achieving cost efficiencies in psychological and healthcare studies and provides a platform for collaboration among diverse scientists. This paper provides an overview of the uses of the PDE to perform command surveillance and policy analysis for Army leadership. The paper highlights the confluence of both economic and behavioral science perspectives elucidating empirically-based studies examining relations between psychological assets, health, and healthcare utilization. Specific examples explore the role of psychological assets in major cost drivers such as medical expenditures both during deployment and stateside, drug use, attrition from basic training, and low reenlistment rates. Through creation of the PDE, the Army and scientific community can now capitalize on the vast amounts of personnel, financial, medical, training and education, deployment, and security systems that influence Army-wide policies and procedures. PMID:24379795

  18. The Person-Event Data Environment: leveraging big data for studies of psychological strengths in soldiers.

    PubMed

    Vie, Loryana L; Griffith, Kevin N; Scheier, Lawrence M; Lester, Paul B; Seligman, Martin E P

    2013-01-01

    The Department of Defense (DoD) strives to efficiently manage the large volumes of administrative data collected and repurpose this information for research and analyses with policy implications. This need is especially present in the United States Army, which maintains numerous electronic databases with information on more than one million Active-Duty, Reserve, and National Guard soldiers, their family members, and Army civilian employees. The accumulation of vast amounts of digitized health, military service, and demographic data thus approaches, and may even exceed, traditional benchmarks for Big Data. Given the challenges of disseminating sensitive personal and health information, the Person-Event Data Environment (PDE) was created to unify disparate Army and DoD databases in a secure cloud-based enclave. This electronic repository serves the ultimate goal of achieving cost efficiencies in psychological and healthcare studies and provides a platform for collaboration among diverse scientists. This paper provides an overview of the uses of the PDE to perform command surveillance and policy analysis for Army leadership. The paper highlights the confluence of both economic and behavioral science perspectives elucidating empirically-based studies examining relations between psychological assets, health, and healthcare utilization. Specific examples explore the role of psychological assets in major cost drivers such as medical expenditures both during deployment and stateside, drug use, attrition from basic training, and low reenlistment rates. Through creation of the PDE, the Army and scientific community can now capitalize on the vast amounts of personnel, financial, medical, training and education, deployment, and security systems that influence Army-wide policies and procedures.

  19. Neptune: a bioinformatics tool for rapid discovery of genomic variation in bacterial populations

    PubMed Central

    Marinier, Eric; Zaheer, Rahat; Berry, Chrystal; Weedmark, Kelly A.; Domaratzki, Michael; Mabon, Philip; Knox, Natalie C.; Reimer, Aleisha R.; Graham, Morag R.; Chui, Linda; Patterson-Fortin, Laura; Zhang, Jian; Pagotto, Franco; Farber, Jeff; Mahony, Jim; Seyer, Karine; Bekal, Sadjia; Tremblay, Cécile; Isaac-Renton, Judy; Prystajecky, Natalie; Chen, Jessica; Slade, Peter

    2017-01-01

    The ready availability of vast amounts of genomic sequence data has created the need to rethink comparative genomics algorithms using ‘big data’ approaches. Neptune is an efficient system for rapidly locating differentially abundant genomic content in bacterial populations using an exact k-mer matching strategy, while accommodating k-mer mismatches. Neptune’s loci discovery process identifies sequences that are sufficiently common to a group of target sequences and sufficiently absent from non-targets using probabilistic models. Neptune uses parallel computing to efficiently identify and extract these loci from draft genome assemblies without requiring multiple sequence alignments or other computationally expensive comparative sequence analyses. Tests on simulated and real datasets showed that Neptune rapidly identifies regions that are both sensitive and specific. We demonstrate that this system can identify trait-specific loci from different bacterial lineages. Neptune is broadly applicable for comparative bacterial analyses, yet will particularly benefit pathogenomic applications, owing to efficient and sensitive discovery of differentially abundant genomic loci. The software is available for download at: http://github.com/phac-nml/neptune. PMID:29048594
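
    A toy Python illustration of the underlying idea of locating group-specific content by exact k-mer matching: collect k-mers shared by all target sequences and discard those present in non-targets. This is not Neptune's probabilistic model or its parallel implementation, and the sequences are invented.

    # Toy k-mer signature discovery: k-mers shared by targets and absent from non-targets.
    # Illustrative only; not Neptune's probabilistic or parallel implementation.
    def kmers(seq, k=7):
        return (seq[i:i + k] for i in range(len(seq) - k + 1))

    targets = ["ACGTACGTTTGACCTAGGA", "ACGTACGTTTGACCTAGCA"]      # toy genomes
    non_targets = ["TTGACCTAGGAAAAAAAA", "CCCCCCCCCCCCCCCCCC"]

    target_sets = [set(kmers(s)) for s in targets]
    shared = set.intersection(*target_sets)                       # common to all targets
    background = {km for s in non_targets for km in kmers(s)}     # seen in non-targets

    signature = sorted(shared - background)                       # candidate group-specific k-mers
    print(signature)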

  20. The effects of end-of-day picture review and a sensor-based picture capture procedure on autobiographical memory using SenseCam.

    PubMed

    Finley, Jason R; Brewer, William F; Benjamin, Aaron S

    2011-10-01

    Emerging "life-logging" technologies have tremendous potential to augment human autobiographical memory by recording and processing vast amounts of information from an individual's experiences. In this experiment undergraduate participants wore a SenseCam, a small, sensor-equipped digital camera, as they went about their normal daily activities for five consecutive days. Pictures were captured either at fixed intervals or as triggered by SenseCam's sensors. On two of five nights, participants watched an end-of-day review of a random subset of pictures captured that day. Participants were tested with a variety of memory measures at intervals of 1, 3, and 8 weeks. The most fruitful of six measures were recognition rating (on a 1-7 scale) and picture-cued recall length. On these tests, end-of-day review enhanced performance relative to no review, while pictures triggered by SenseCam's sensors showed little difference in performance compared to those taken at fixed time intervals. We discuss the promise of SenseCam as a tool for research and for improving autobiographical memory.

  1. Application of evidence-based dentistry: from research to clinical periodontal practice.

    PubMed

    Kwok, Vivien; Caton, Jack G; Polson, Alan M; Hunter, Paul G

    2012-06-01

    Dentists need to make daily decisions regarding patient care, and these decisions should essentially be scientifically sound. Evidence-based dentistry is meant to empower clinicians to provide the most contemporary treatment. The benefits of applying the evidence-based method in clinical practice include application of the most updated treatment and stronger reasoning to justify the treatment. A vast amount of information is readily accessible with today's digital technology, and a standardized search protocol can be developed to ensure that a literature search is valid, specific and repeatable. It involves developing a preset question (population, intervention, comparison and outcome; PICO) and search protocol. It is usually used academically to perform commissioned reviews, but it can also be applied to answer simple clinical queries. The scientific evidence thus obtained can then be considered along with patient preferences and values, clinical patient circumstances and the practitioner's experience and judgment in order to make the treatment decision. This paper describes how clinicians can incorporate evidence-based methods into patient care and presents a clinical example to illustrate the process. © 2012 John Wiley & Sons A/S.

  2. Cultural Evolutionary Tipping Points in the Storage and Transmission of Information

    PubMed Central

    Bentley, R. Alexander; O’Brien, Michael J.

    2012-01-01

    Human culture has evolved through a series of major tipping points in information storage and communication. The first was the appearance of language, which enabled communication between brains and allowed humans to specialize in what they do and to participate in complex mating games. The second was information storage outside the brain, most obviously expressed in the “Upper Paleolithic Revolution” – the sudden proliferation of cave art, personal adornment, and ritual in Europe some 35,000–45,000 years ago. More recently, this storage has taken the form of writing, mass media, and now the Internet, which is arguably overwhelming humans’ ability to discern relevant information. The third tipping point was the appearance of technology capable of accumulating and manipulating vast amounts of information outside humans, thus removing them as bottlenecks to a seemingly self-perpetuating process of knowledge explosion. Important components of any discussion of cultural evolutionary tipping points are tempo and mode, given that the rate of change, as well as the kind of change, in information storage and transmission has not been constant over the previous million years. PMID:23267338

  3. Cultural evolutionary tipping points in the storage and transmission of information.

    PubMed

    Bentley, R Alexander; O'Brien, Michael J

    2012-01-01

    Human culture has evolved through a series of major tipping points in information storage and communication. The first was the appearance of language, which enabled communication between brains and allowed humans to specialize in what they do and to participate in complex mating games. The second was information storage outside the brain, most obviously expressed in the "Upper Paleolithic Revolution" - the sudden proliferation of cave art, personal adornment, and ritual in Europe some 35,000-45,000 years ago. More recently, this storage has taken the form of writing, mass media, and now the Internet, which is arguably overwhelming humans' ability to discern relevant information. The third tipping point was the appearance of technology capable of accumulating and manipulating vast amounts of information outside humans, thus removing them as bottlenecks to a seemingly self-perpetuating process of knowledge explosion. Important components of any discussion of cultural evolutionary tipping points are tempo and mode, given that the rate of change, as well as the kind of change, in information storage and transmission has not been constant over the previous million years.

  4. Representation learning via Dual-Autoencoder for recommendation.

    PubMed

    Zhuang, Fuzhen; Zhang, Zhiqiang; Qian, Mingda; Shi, Chuan; Xie, Xing; He, Qing

    2017-06-01

    Recommendation has attracted a vast amount of attention and research in recent decades. Most previous works employ matrix factorization techniques to learn the latent factors of users and items, and many subsequent works consider external information, e.g., social relationships of users and items' attributes, to improve recommendation performance under the matrix factorization framework. However, matrix factorization methods may not make full use of the limited information from rating or check-in matrices, and may achieve unsatisfying results. Recently, deep learning has proven able to learn good representations in natural language processing, image classification, and so on. Along this line, we propose a new representation learning framework called Recommendation via Dual-Autoencoder (ReDa). In this framework, we simultaneously learn the new hidden representations of users and items using autoencoders, and minimize the deviations of training data by the learnt representations of users and items. Based on this framework, we develop a gradient descent method to learn hidden representations. Extensive experiments conducted on several real-world data sets demonstrate the effectiveness of our proposed method compared with state-of-the-art matrix factorization based methods. Copyright © 2017 Elsevier Ltd. All rights reserved.
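
    A small NumPy sketch of learning hidden user representations from a rating matrix with a single autoencoder trained by gradient descent. It illustrates the general representation-learning idea only; it is not the ReDa dual-autoencoder formulation, and the rating matrix is synthetic.

    # One-hidden-layer autoencoder on a toy user-item rating matrix (NumPy only).
    # Illustrative sketch of representation learning, not the ReDa model.
    import numpy as np

    rng = np.random.default_rng(0)
    R = rng.integers(0, 6, size=(20, 12)).astype(float)   # users x items ratings
    R /= 5.0                                               # scale to [0, 1]

    n_hidden, lr = 4, 0.1
    W1 = 0.1 * rng.standard_normal((R.shape[1], n_hidden))
    W2 = 0.1 * rng.standard_normal((n_hidden, R.shape[1]))

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    for epoch in range(500):
        H = sigmoid(R @ W1)              # hidden user representations
        R_hat = H @ W2                   # reconstruction of the rating rows
        err = R_hat - R
        # Backpropagate the squared reconstruction error.
        grad_W2 = H.T @ err / len(R)
        grad_H = err @ W2.T * H * (1 - H)
        grad_W1 = R.T @ grad_H / len(R)
        W1 -= lr * grad_W1
        W2 -= lr * grad_W2

    print("final reconstruction MSE:", float(np.mean((sigmoid(R @ W1) @ W2 - R) ** 2)))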

  5. Net-Centric Sensors and Data Sources (N-CSDS) GEODSS Sidecar

    NASA Astrophysics Data System (ADS)

    Richmond, D.

    2012-09-01

    Vast amounts of Space Situational Sensor data are collected each day on closed, legacy systems. Massachusetts Institute of Technology Lincoln Laboratory (MIT/LL) developed a Net-Centric approach to expose this data under the Extended Space Sensors Architecture (ESSA) Advanced Concept Technology Demonstration (ACTD). The Net-Centric Sensors and Data Sources (N-CSDS) Ground-based Electro Optical Deep Space Surveillance (GEODSS) Sidecar is the next generation that moves the ESSA ACTD engineering tools to an operational baseline. The N-CSDS GEODSS sidecar high-level architecture will be presented, highlighting the features that support deployment at multiple diverse sensor sites. Other key items that will be covered include: 1) The Web Browser interface to perform searches of historical data 2) The capabilities of the deployed Web Services and example service request/responses 3) Example data and potential user applications 4) Specifics regarding the process to gain access to the N-CSDS GEODSS sensor data in near real time 5) Current status and future deployment plans (including plans for deployment to the Maui GEODSS Site)

  6. Towards AN Inventory for Archaeological Heritage Management in Israel

    NASA Astrophysics Data System (ADS)

    Alef, Y.

    2017-08-01

    The vast amount of archaeological data and information that is systematically accumulated in the Israel Antiquities Authority database has not yet been transformed into a tool for heritage management, i.e. accessible knowledge of the sites' cultural significance and the risk assessment needed to support wise decision making regarding their future. In response, a pilot project for developing an inventory for archaeological heritage management was launched. A basic ESRI ArcGIS Online system was developed as a prototype, following the categories recommended in international standards for documentation. Five field surveys implementing the GIS system were conducted to examine different aspects and workflows: ancient synagogues in the Galilee, sites at risk, mosaics in Tel Shiqmona, the ancient settlement of Huqoq and sites included in The National Master Plan for Forests and Afforestation. The pilot project revealed the main gaps in knowledge and the critical faults in the working procedures. In spite of the system's technological limitations, the results were convincing enough to promote a multidisciplinary discussion about the need for integration of significance and risk assessment in the working processes of the organization.

  7. Understanding chemically processed solar cells based on quantum dots

    PubMed Central

    Malgras, Victor; Nattestad, Andrew; Kim, Jung Ho; Dou, Shi Xue; Yamauchi, Yusuke

    2017-01-01

    Photovoltaic energy conversion is one of the best alternatives to fossil fuel combustion. Petroleum resources are now close to depletion and their combustion is known to be responsible for the release of a considerable amount of greenhouse gases and carcinogenic airborne particles. Novel third-generation solar cells include a vast range of device designs and materials aiming to overcome the factors limiting the current technologies. Among them, quantum dot-based devices showed promising potential both as sensitizers and as colloidal nanoparticle films. A good example is the p-type PbS colloidal quantum dots (CQDs) forming a heterojunction with an n-type wide-band-gap semiconductor such as TiO2 or ZnO. The confinement in these nanostructures is also expected to result in marginal mechanisms, such as the collection of hot carriers and generation of multiple excitons, which would increase the theoretical conversion efficiency limit. Ultimately, this technology could also lead to the assembly of a tandem-type cell with CQD films absorbing in different regions of the solar spectrum. PMID:28567179

  8. An approach for software-driven and standard-based support of cross-enterprise tumor boards.

    PubMed

    Mangesius, Patrick; Fischer, Bernd; Schabetsberger, Thomas

    2015-01-01

    For tumor boards, the networking of different medical disciplines' expertise continues to gain importance. However, interdisciplinary tumor boards spread across several institutions are rarely supported by information technology tools today. The aim of this paper is to present an approach for a tumor board management system prototype. To analyze the requirements, an incremental process was used. The requirements were surveyed using informal conversational interviews and documented with Use Case Diagrams defined by the Unified Modeling Language (UML). Analyses of current EHR standards were conducted to evaluate technical requirements. Functional and technical requirements of clinical conference applications were evaluated and documented. In several steps, workflows were derived and application mockups were created. Although there is a vast amount of common understanding concerning how clinical conferences should be conducted and how their workflows should be structured, these are hardly standardized on either a functional or a technical level. This results in drawbacks for participants and patients. Using modern EHR technologies based on profiles such as IHE Cross Enterprise Document Sharing (XDS), these deficits could be overcome.

  9. Electromagnetic Field Assessment as a Smart City Service: The SmartSantander Use-Case

    PubMed Central

    Diez, Luis; Agüero, Ramón; Muñoz, Luis

    2017-01-01

    Despite the increasing presence of wireless communications in everyday life, there exist some voices raising concerns about their adverse effects. One particularly relevant example is the potential impact of the electromagnetic field they induce on the population’s health. Traditionally, very specialized methods and devices (dosimetry) have been used to assess the strength of the E-field, with the main objective of checking whether it respects the corresponding regulations. In this paper, we propose a completely novel approach, which exploits the functionality provided by a smart city platform. We deploy a number of measuring probes, integrated as sensing devices, to carry out a characterization embracing large areas, as well as long periods of time. This unique platform has been active for more than one year, generating a vast amount of information. We process such information, and the obtained results validate the whole methodology. In addition, we discuss the variation of the E-field caused by cellular networks, considering additional information, such as usage statistics. Finally, we establish the exposure that can be attributed to the base stations within the scenario under analysis. PMID:28561783

  10. Bayesian network prior: network analysis of biological data using external knowledge

    PubMed Central

    Isci, Senol; Dogan, Haluk; Ozturk, Cengizhan; Otu, Hasan H.

    2014-01-01

    Motivation: Reverse engineering GI networks from experimental data is a challenging task due to the complex nature of the networks and the noise inherent in the data. One way to overcome these hurdles would be incorporating the vast amounts of external biological knowledge when building interaction networks. We propose a framework where GI networks are learned from experimental data using Bayesian networks (BNs) and the incorporation of external knowledge is also done via a BN that we call Bayesian Network Prior (BNP). BNP depicts the relation between various evidence types that contribute to the event ‘gene interaction’ and is used to calculate the probability of a candidate graph (G) in the structure learning process. Results: Our simulation results on synthetic, simulated and real biological data show that the proposed approach can identify the underlying interaction network with high accuracy even when the prior information is distorted and outperforms existing methods. Availability: Accompanying BNP software package is freely available for academic use at http://bioe.bilgi.edu.tr/BNP. Contact: hasan.otu@bilgi.edu.tr Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:24215027

  11. Application of robust face recognition in video surveillance systems

    NASA Astrophysics Data System (ADS)

    Zhang, De-xin; An, Peng; Zhang, Hao-xiang

    2018-03-01

    In this paper, we propose a video searching system that utilizes face recognition as its searching and indexing feature. As the applications of video cameras have greatly increased in recent years, face recognition is a natural fit for searching for targeted individuals within vast amounts of video data. However, the performance of such searching depends on the quality of the face images recorded in the video signals. Since surveillance cameras record video without a fixed posture of the subject, face occlusion is very common in everyday footage. The proposed system builds a model for occluded faces using fuzzy principal component analysis (FPCA), and reconstructs the human faces with the available information. Experimental results show that the system is highly efficient in processing real-life videos and is very robust to various kinds of face occlusion. Hence it can relieve human reviewers from sitting in front of the monitors and greatly enhances efficiency as well. The proposed system has been installed and applied in various environments and has already demonstrated its power by helping solve real cases.
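
    A hedged sketch of subspace-based reconstruction of an occluded image, using plain PCA from scikit-learn as a simple stand-in for the fuzzy PCA (FPCA) described above. The "face" data are random surrogates, so the numbers are meaningless; the point is only the fit / project / inverse-transform mechanics.

    # PCA-based reconstruction of an occluded image (plain PCA as a stand-in for FPCA).
    # Data are random surrogates for face images; illustrative only.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    clean_faces = rng.random((200, 32 * 32))        # 200 flattened 32x32 "faces"

    pca = PCA(n_components=20).fit(clean_faces)     # learn the face subspace

    # Occlude the lower half of one image (e.g. a scarf or mask).
    probe = clean_faces[0].copy()
    probe[512:] = 0.0

    # Project onto the subspace and rebuild the full image from the components.
    reconstructed = pca.inverse_transform(pca.transform(probe.reshape(1, -1)))[0]
    print("reconstruction error vs. original:",
          float(np.linalg.norm(reconstructed - clean_faces[0])))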

  12. Metro Optical Networks for Homeland Security

    NASA Astrophysics Data System (ADS)

    Bechtel, James H.

    Metro optical networks provide an enticing opportunity for strengthening homeland security. Many existing and emerging fiber-optic networks can be adapted for enhanced security applications. Applications include airports, theme parks, sports venues, and border surveillance systems. Here real-time high-quality video and captured images can be collected, transported, processed, and stored for security applications. Video and data collection are important also at correctional facilities, courts, infrastructure (e.g., dams, bridges, railroads, reservoirs, power stations), and at military and other government locations. The scaling of DWDM-based networks allows vast amounts of data to be collected and transported including biometric features of individuals at security check points. Here applications will be discussed along with potential solutions and challenges. Examples of solutions to these problems are given. This includes a discussion of metropolitan aggregation platforms for voice, video, and data that are SONET compliant for use in SONET networks and the use of DWDM technology for scaling and transporting a variety of protocols. Element management software allows not only network status monitoring, but also provides optimized allocation of network resources through the use of optical switches or electrical cross connects.

  13. Irrigation network extraction methodology from LiDAR DTM using Whitebox and ArcGIS

    NASA Astrophysics Data System (ADS)

    Mahor, M. A. P.; De La Cruz, R. M.; Olfindo, N. T.; Perez, A. M. C.

    2016-10-01

    Irrigation networks are important in distributing water resources to areas where rainfall is not enough to sustain agriculture. They are also crucial for redirecting vast amounts of water to decrease the risk of flooding in flat areas, especially near sources of water. Given the lack of studies on extracting irrigation features, which range from wide canals to small ditches, this study aims to present a method of extracting these features from LiDAR-derived digital terrain models (DTMs) using Geographic Information Systems (GIS) tools such as ArcGIS and Whitebox Geospatial Analysis Tools (Whitebox GAT). High-resolution LiDAR DTMs with 1-meter horizontal and 0.25-meter vertical accuracies were processed to generate the gully depth map. This map was then reclassified, converted to vector, and filtered according to segment length and sinuosity to isolate these irrigation features. Initial results in the test area show that the extraction completeness is greater than 80% when compared with data obtained from the National Irrigation Administration (NIA).
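
    A small Python sketch of the final filtering step described above: keep vectorised segments whose length and sinuosity fall within thresholds plausible for engineered canals. The thresholds and coordinates are illustrative, not the values used in the study.

    # Filter vectorised line segments by length and sinuosity (illustrative thresholds).
    import math

    def length(points):
        return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

    def sinuosity(points):
        """Polyline length divided by straight-line distance between its ends."""
        straight = math.dist(points[0], points[-1])
        return length(points) / straight if straight > 0 else float("inf")

    def keep_segment(points, min_length=50.0, max_sinuosity=1.2):
        return length(points) >= min_length and sinuosity(points) <= max_sinuosity

    canal = [(0, 0), (60, 2), (120, 1)]              # long and nearly straight
    meander = [(0, 0), (10, 15), (5, 30), (20, 40)]  # short and winding
    print(keep_segment(canal), keep_segment(meander))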

  14. Effect of ultrasound pre-treatment on the physicochemical composition of Agave durangensis leaves and potential enzyme production.

    PubMed

    Contreras-Hernández, M G; Ochoa-Martínez, L A; Rutiaga-Quiñones, J G; Rocha-Guzmán, N E; Lara-Ceniceros, T E; Contreras-Esquivel, J C; Prado Barragán, L A; Rutiaga-Quiñones, O M

    2018-02-01

    Approximately 1 million tons of agave plants are processed annually by the Mexican tequila and mezcal industry, generating vast amounts of agroindustrial solid waste. This type of lignocellulosic biomass is considered to be agroindustrial residue, which can be used to produce enzymes, giving it added value. However, the structure of lignocellulosic biomass makes it highly recalcitrant, and results in relatively low yield when used in its native form. The aim of this study was to investigate an effective pre-treatment method for the production of commercially important hydrolytic enzymes. In this work, the physical and chemical modification of Agave durangensis leaves was analysed using ultrasound and high temperature as pre-treatments, and production of enzymes was evaluated. The pre-treatments resulted in modification of the lignocellulosic structure and composition; the ultrasound pre-treatment improved the production of inulinase by 4 U/mg and cellulase by 0.297 U/mg, and thermal pre-treatment improved β-fructofuranosidase by 30 U/mg. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Electromagnetic Field Assessment as a Smart City Service: The SmartSantander Use-Case.

    PubMed

    Diez, Luis; Agüero, Ramón; Muñoz, Luis

    2017-05-31

    Despite the increasing presence of wireless communications in everyday life, there exist some voices raising concerns about their adverse effects. One particularly relevant example is the potential impact of the electromagnetic field they induce on the population's health. Traditionally, very specialized methods and devices (dosimetry) have been used to assess the strength of the E-field, with the main objective of checking whether it respects the corresponding regulations. In this paper, we propose a completely novel approach, which exploits the functionality provided by a smart city platform. We deploy a number of measuring probes, integrated as sensing devices, to carry out a characterization embracing large areas, as well as long periods of time. This unique platform has been active for more than one year, generating a vast amount of information. We process such information, and the obtained results validate the whole methodology. In addition, we discuss the variation of the E-field caused by cellular networks, considering additional information, such as usage statistics. Finally, we establish the exposure that can be attributed to the base stations within the scenario under analysis.

  16. Thoughtflow: Standards and Tools for Provenance Capture and Workflow Definition to Support Model‐Informed Drug Discovery and Development

    PubMed Central

    Wilkins, JJ; Chan, PLS; Chard, J; Smith, G; Smith, MK; Beer, M; Dunn, A; Flandorfer, C; Franklin, C; Gomeni, R; Harnisch, L; Kaye, R; Moodie, S; Sardu, ML; Wang, E; Watson, E; Wolstencroft, K

    2017-01-01

    Pharmacometric analyses are complex and multifactorial. It is essential to check, track, and document the vast amounts of data and metadata that are generated during these analyses (and the relationships between them) in order to comply with regulations, support quality control, auditing, and reporting. It is, however, challenging, tedious, error‐prone, and time‐consuming, and diverts pharmacometricians from the more useful business of doing science. Automating this process would save time, reduce transcriptional errors, support the retention and transfer of knowledge, encourage good practice, and help ensure that pharmacometric analyses appropriately impact decisions. The ability to document, communicate, and reconstruct a complete pharmacometric analysis using an open standard would have considerable benefits. In this article, the Innovative Medicines Initiative (IMI) Drug Disease Model Resources (DDMoRe) consortium proposes a set of standards to facilitate the capture, storage, and reporting of knowledge (including assumptions and decisions) in the context of model‐informed drug discovery and development (MID3), as well as to support reproducibility: “Thoughtflow.” A prototype software implementation is provided. PMID:28504472

  17. A Spatiotemporal Indexing Approach for Efficient Processing of Big Array-Based Climate Data with MapReduce

    NASA Technical Reports Server (NTRS)

    Li, Zhenlong; Hu, Fei; Schnase, John L.; Duffy, Daniel Q.; Lee, Tsengdar; Bowen, Michael K.; Yang, Chaowei

    2016-01-01

    Climate observations and model simulations are producing vast amounts of array-based spatiotemporal data. Efficient processing of these data is essential for assessing global challenges such as climate change, natural disasters, and diseases. This is challenging not only because of the large data volume, but also because of the intrinsic high-dimensional nature of geoscience data. To tackle this challenge, we propose a spatiotemporal indexing approach to efficiently manage and process big climate data with MapReduce in a highly scalable environment. Using this approach, big climate data are directly stored in a Hadoop Distributed File System in their original, native file format. A spatiotemporal index is built to bridge the logical array-based data model and the physical data layout, which enables fast data retrieval when performing spatiotemporal queries. Based on the index, a data-partitioning algorithm is applied to enable MapReduce to achieve high data locality, as well as to balance the workload. The proposed indexing approach is evaluated using the National Aeronautics and Space Administration (NASA) Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. The experimental results show that the index can significantly accelerate querying and processing (a 10× speedup compared to the baseline test using the same computing cluster), while keeping the index-to-data ratio small (0.0328). The applicability of the indexing approach is demonstrated by a climate anomaly detection deployed on a NASA Hadoop cluster. This approach is also able to support efficient processing of general array-based spatiotemporal data in various geoscience domains without special configuration on a Hadoop cluster.
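
    A minimal Python sketch of the core idea behind an array-aware spatiotemporal index: map a logical (time, lat, lon) cell to a byte range in the native file so that only that range needs to be read by a map task. The grid shape, element size, and headerless layout are hypothetical, not MERRA's actual format.

    # Map logical (time, lat, lon) cells to byte ranges in a flat binary array file.
    # Grid dimensions and layout below are hypothetical, not MERRA's format.
    NT, NLAT, NLON = 24, 361, 540           # hypothetical grid dimensions
    ITEMSIZE = 4                             # float32
    HEADER_BYTES = 0                         # assume a headerless flat binary file

    def cell_byte_range(t, lat_idx, lon_idx):
        """Byte offset of one grid cell in row-major (time, lat, lon) order."""
        flat = (t * NLAT + lat_idx) * NLON + lon_idx
        start = HEADER_BYTES + flat * ITEMSIZE
        return start, start + ITEMSIZE

    def row_byte_range(t, lat_idx):
        """Contiguous byte range covering one latitude row (all longitudes)."""
        start, _ = cell_byte_range(t, lat_idx, 0)
        return start, start + NLON * ITEMSIZE

    print(cell_byte_range(3, 180, 270))
    print(row_byte_range(3, 180))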

  18. Evaluation of the efficiency of continuous wavelet transform as processing and preprocessing algorithm for resolution of overlapped signals in univariate and multivariate regression analyses; an application to ternary and quaternary mixtures.

    PubMed

    Hegazy, Maha A; Lotfy, Hayam M; Mowaka, Shereen; Mohamed, Ekram Hany

    2016-07-05

    Wavelets have been adapted for a vast number of signal-processing applications due to the amount of information that can be extracted from a signal. In this work, a comparative study on the efficiency of continuous wavelet transform (CWT) as a signal-processing tool in univariate regression and a pre-processing tool in multivariate analysis using partial least squares (CWT-PLS) was conducted. These were applied to complex spectral signals of ternary and quaternary mixtures. The CWT-PLS method succeeded in the simultaneous determination of a quaternary mixture of drotaverine (DRO), caffeine (CAF), paracetamol (PAR) and p-aminophenol (PAP, the major impurity of paracetamol). In contrast, the univariate CWT failed to simultaneously determine the quaternary mixture components; it was able to determine only PAR and PAP, as well as the ternary mixtures of DRO, CAF, and PAR and of CAF, PAR, and PAP. During the CWT calculations, different wavelet families were tested. The univariate CWT method was validated according to the ICH guidelines. For the development of the CWT-PLS model, a calibration set was prepared by means of an orthogonal experimental design, and its absorption spectra were recorded and processed by CWT. The CWT-PLS model was constructed by regression between the wavelet coefficients and concentration matrices, and validation was performed by both cross validation and external validation sets. Both methods were successfully applied for determination of the studied drugs in pharmaceutical formulations. Copyright © 2016 Elsevier B.V. All rights reserved.
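
    An illustrative Python sketch of the CWT-PLS idea: transform each spectrum with a continuous wavelet transform (PyWavelets) and regress concentrations on the coefficients with partial least squares (scikit-learn). The spectra, concentrations, wavelet choice and scales are synthetic stand-ins, not the paper's data or settings.

    # CWT pre-processing of spectra followed by PLS regression (synthetic data).
    import numpy as np
    import pywt
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    wavelengths = np.linspace(200, 400, 256)

    def spectrum(conc):
        """Toy mixture spectrum: overlapping Gaussian bands plus noise."""
        bands = [260.0, 275.0, 300.0, 310.0]
        y = sum(c * np.exp(-((wavelengths - b) / 12.0) ** 2) for c, b in zip(conc, bands))
        return y + 0.01 * rng.standard_normal(wavelengths.size)

    C = rng.uniform(0.1, 1.0, size=(40, 4))               # calibration concentrations
    X = np.array([spectrum(c) for c in C])                # calibration spectra

    scales = np.arange(1, 33)
    def cwt_features(spec):
        coeffs, _ = pywt.cwt(spec, scales, "mexh")        # CWT pre-processing
        return coeffs.ravel()

    Xw = np.array([cwt_features(s) for s in X])
    pls = PLSRegression(n_components=6).fit(Xw, C)

    test_conc = np.array([0.5, 0.3, 0.8, 0.2])
    pred = pls.predict(cwt_features(spectrum(test_conc)).reshape(1, -1))[0]
    print("true:", test_conc, "predicted:", np.round(pred, 2))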

  19. Engine Icing Data - An Analytics Approach

    NASA Technical Reports Server (NTRS)

    Fitzgerald, Brooke A.; Flegel, Ashlie B.

    2017-01-01

    Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible and small mistakes are easily overlooked. Spreadsheet-style analysis is also time inefficient. The same formatting, processing, and plotting procedure has to be repeated for every dataset, which leads to researchers performing the same tedious data munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool written in Python hosted in a Jupyter notebook that vastly simplifies the analysis process. From the file path of any folder containing time series datasets, this tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with a click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across data sets and integrating video data (while extremely difficult with spreadsheet-style programs) is quite simplified in this tool. This tool has also gathered interest outside the engine icing branch, and will be used by researchers across NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
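
    A minimal sketch of the batch-loading step such a tool might use: read every time-series CSV in a folder into one DataFrame keyed by run, ready for interactive column selection and plotting. The folder layout and column names are hypothetical, not the Escort export format.

    # Batch-load every time-series CSV in a folder into one DataFrame keyed by run.
    # Folder and column names below are hypothetical.
    import glob
    import os
    import pandas as pd

    def load_runs(folder):
        """Load all CSV datasets in `folder` and concatenate them with a run label."""
        frames = []
        for path in sorted(glob.glob(os.path.join(folder, "*.csv"))):
            df = pd.read_csv(path)
            df["run"] = os.path.splitext(os.path.basename(path))[0]
            frames.append(df)
        return pd.concat(frames, ignore_index=True)

    # Example usage (hypothetical paths/columns):
    # data = load_runs("psl_escort_exports")
    # data.groupby("run")["fan_speed"].plot(legend=True)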

  20. Bookshelf: a simple curation system for the storage of biomolecular simulation data.

    PubMed

    Vohra, Shabana; Hall, Benjamin A; Holdbrook, Daniel A; Khalid, Syma; Biggin, Philip C

    2010-01-01

    Molecular dynamics simulations can now routinely generate data sets of several hundreds of gigabytes in size. The ability to generate this data has become easier over recent years and the rate of data production is likely to increase rapidly in the near future. One major problem associated with this vast amount of data is how to store it in a way that it can be easily retrieved at a later date. The obvious answer to this problem is a database. However, a key issue in the development and maintenance of such a database is its sustainability, which in turn depends on the ease of the deposition and retrieval process. Encouraging users to care about meta-data is difficult and thus the success of any storage system will ultimately depend on how well used by end-users the system is. In this respect we suggest that even a minimal amount of metadata if stored in a sensible fashion is useful, if only at the level of individual research groups. We discuss here, a simple database system which we call 'Bookshelf', that uses python in conjunction with a mysql database to provide an extremely simple system for curating and keeping track of molecular simulation data. It provides a user-friendly, scriptable solution to the common problem amongst biomolecular simulation laboratories; the storage, logging and subsequent retrieval of large numbers of simulations. Download URL: http://sbcb.bioch.ox.ac.uk/bookshelf/
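
    A minimal sketch of a simulation-curation table in the spirit of Bookshelf, using Python's built-in sqlite3 instead of MySQL so it runs self-contained. The schema and example fields are illustrative, not Bookshelf's actual schema.

    # Minimal simulation-metadata curation table (sqlite3 stand-in for MySQL).
    # Schema and fields are illustrative, not Bookshelf's actual schema.
    import sqlite3

    conn = sqlite3.connect("simulations.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS simulations (
            id INTEGER PRIMARY KEY,
            system TEXT,          -- e.g. protein or membrane name
            force_field TEXT,
            length_ns REAL,
            trajectory_path TEXT, -- where the large files actually live
            notes TEXT
        )
    """)
    conn.execute(
        "INSERT INTO simulations (system, force_field, length_ns, trajectory_path, notes) "
        "VALUES (?, ?, ?, ?, ?)",
        ("OmpA in POPC", "example-ff", 100.0, "/archive/run001/", "equilibration run"),
    )
    conn.commit()

    for row in conn.execute("SELECT system, length_ns, trajectory_path FROM simulations"):
        print(row)
    conn.close()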

  1. Bookshelf: a simple curation system for the storage of biomolecular simulation data

    PubMed Central

    Vohra, Shabana; Hall, Benjamin A.; Holdbrook, Daniel A.; Khalid, Syma; Biggin, Philip C.

    2010-01-01

    Molecular dynamics simulations can now routinely generate data sets of several hundreds of gigabytes in size. The ability to generate this data has become easier over recent years and the rate of data production is likely to increase rapidly in the near future. One major problem associated with this vast amount of data is how to store it in a way that it can be easily retrieved at a later date. The obvious answer to this problem is a database. However, a key issue in the development and maintenance of such a database is its sustainability, which in turn depends on the ease of the deposition and retrieval process. Encouraging users to care about meta-data is difficult and thus the success of any storage system will ultimately depend on how well used by end-users the system is. In this respect we suggest that even a minimal amount of metadata if stored in a sensible fashion is useful, if only at the level of individual research groups. We discuss here, a simple database system which we call ‘Bookshelf’, that uses python in conjunction with a mysql database to provide an extremely simple system for curating and keeping track of molecular simulation data. It provides a user-friendly, scriptable solution to the common problem amongst biomolecular simulation laboratories; the storage, logging and subsequent retrieval of large numbers of simulations. Download URL: http://sbcb.bioch.ox.ac.uk/bookshelf/ PMID:21169341

  2. Local area networks, laboratory information management systems, languages, and operating systems in the lab and pilot plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dessy, R.E.

    1983-08-01

    Microprocessors and microcomputers are being incorporated into the instruments and controllers in our laboratory and pilot plant. They enhance both the quality and amount of information that is produced. Yet they simultaneously produce vast amounts of information that must be controlled, or scientists and engineers will become high-priced secretaries. The devices need programs that control them in a time frame relevant to the experiment. Simple, expeditious pathways to the generation of software that will run rapidly are essential, or first-class scientists and engineers become second-class system programmers! This paper attempts to develop the vocabulary by which the people involved in this technological revolution can understand and control it. We will examine the elements that synergistically make up the electronic laboratory and pilot plant. More detailed analyses of each area may be found in a series of articles entitled A/C INTERFACE (1-4). Many factors interact in the final system that we bring into our laboratory. Yet many purchasers only perform a cursory evaluation of the superficial aspects of the hardware. The integrated lab and pilot plant require that microprocessors, which control and collect, be connected in a LAN to larger processors that can provide LIMS support. Statistics and scientific word processing capabilities then complete the armamentarium. The end result is a system that does things for the user, rather than doing things to him.

  3. Data driven innovations in structural health monitoring

    NASA Astrophysics Data System (ADS)

    Rosales, M. J.; Liyanapathirana, R.

    2017-05-01

    At present, substantial investments are being allocated to civil infrastructures, which are also considered valuable assets at a national or global scale. Structural Health Monitoring (SHM) is an indispensable tool required to ensure the performance and safety of these structures based on measured response parameters. The research to date on damage assessment has tended to focus on the utilization of wireless sensor networks (WSN), as they prove to be the best alternative to traditional visual inspections and to tethered or wired counterparts. Over the last decade, the structural health and behaviour of innumerable infrastructures have been measured and evaluated owing to several successful deployments of these sensor networks. Various monitoring systems have the capability to rapidly transmit, measure, and store large volumes of data. The amount of data collected from these networks has become unmanageable, which has raised other relevant issues such as data quality, relevance, re-use, and decision support. There is an increasing need to integrate new technologies in order to automate the evaluation processes as well as to enhance the objectivity of data assessment routines. This paper aims to identify feasible methodologies for the application of time-series analysis techniques to judiciously exploit the vast amount of readily available as well as upcoming data resources. It continues the momentum of a greater effort to collect and archive SHM approaches that will serve as data-driven innovations for the assessment of damage through efficient algorithms and data analytics.

  4. How a phosphorus-acquisition strategy based on carboxylate exudation powers the success and agronomic potential of lupines (Lupinus, Fabaceae).

    PubMed

    Lambers, Hans; Clements, Jon C; Nelson, Matthew N

    2013-02-01

    Lupines (Lupinus species; Fabaceae) are an ancient crop with great potential to be developed further for high-protein feed and food, cover crops, and phytoremediation. Being legumes, they are capable of symbiotically fixing atmospheric nitrogen. However, Lupinus species appear to be nonmycorrhizal or weakly mycorrhizal at most; instead some produce cluster roots, which release vast amounts of phosphate-mobilizing carboxylates (organic anions). Other lupines produce cluster-like roots, which function in a similar manner, and some release large amounts of carboxylates without specialized roots. These traits associated with nutrient acquisition make lupines ideally suited for either impoverished soils or soils with large amounts of phosphorus that is poorly available to most plants, e.g., acidic or alkaline soils. Here we explore how common the nonmycorrhizal phosphorus-acquisition strategy based on exudation of carboxylates is in the genus Lupinus, concluding it is very likely more widespread than generally acknowledged. This trait may partly account for the role of lupines as pioneers or invasive species, but it also makes them suitable crop plants as we approach "peak phosphorus".

  5. Imagery atlas: a structure of expert software designed to improve the accessibility of remote-sensed satellite imagery

    NASA Astrophysics Data System (ADS)

    Genet, Richard P.

    1995-11-01

    Policy changes in the United States and Europe will bring a number of firms into the remote sensing market. More importantly, there will be a vast increase in the amount of data and, potentially, the amount of information that is available for academic, commercial and a variety of public uses. Presently, many of the users of remote sensing data have some understanding of photogrammetric and remote sensing technologies. This is especially true of environmentalist users and academics. As the amount of remote sensing data increases, in order to broaden the user base it will become increasingly important that the information user not be required to have a background in photogrammetry, remote sensing, or even in the basics of geographic information systems. The user must be able to articulate his requirements in view of the existence of new sources of information. This paper provides the framework for expert systems to accomplish this interface. Specific examples of the capabilities which must be developed in order to maximize the utility of specific images and image archives are presented and discussed.

  6. The Leverage of National Board Candidacy: An Exploration of Teacher Learning

    ERIC Educational Resources Information Center

    Hunzicker, Jana

    2008-01-01

    The vast majority of teachers who engage in the process of National Board certification describe it as the best professional development they have ever experienced - even when they do not achieve the certification. Learning leverage, an interactive dynamic characterized by rigor, reward, and risk, is what makes the certification process such a…

  7. The Socialization of Newcomers into Organizations: Integrating Learning and Social Exchange Processes

    ERIC Educational Resources Information Center

    Korte, Russell F.

    2007-01-01

    Traditional views of socialization focus primarily on the passive learning by the newcomer of the expectations of the organization. Theorizing and research on cognitive learning and social exchange indicate that the socialization process is vastly more complex. This paper views socialization through the lenses of cognitive learning and social…

  8. volBrain: An Online MRI Brain Volumetry System

    PubMed Central

    Manjón, José V.; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372
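
    As a rough illustration of the label-fusion idea mentioned above, the sketch below fuses the labels proposed by several registered atlases with a simple per-voxel majority vote. This is only the general concept; volBrain's actual pipeline relies on non-linear warping and patch-based label fusion, not this plain vote, and the toy data are made up.

      # Minimal sketch of multi-atlas label fusion by per-voxel majority vote.
      import numpy as np

      def majority_vote_fusion(atlas_labels):
          """atlas_labels: array of shape (n_atlases, *volume_shape) of integer labels."""
          atlas_labels = np.asarray(atlas_labels)
          n_atlases = atlas_labels.shape[0]
          flat = atlas_labels.reshape(n_atlases, -1)
          n_labels = int(flat.max()) + 1
          # Count votes per label for every voxel, then pick the arg-max label.
          votes = np.zeros((n_labels, flat.shape[1]), dtype=np.int32)
          for lab in range(n_labels):
              votes[lab] = (flat == lab).sum(axis=0)
          fused = votes.argmax(axis=0)
          return fused.reshape(atlas_labels.shape[1:])

      # Toy example: three 2x2 "atlases" voting on two tissue classes.
      atlases = [np.array([[0, 1], [1, 1]]),
                 np.array([[0, 1], [0, 1]]),
                 np.array([[1, 1], [1, 0]])]
      print(majority_vote_fusion(atlases))   # -> [[0 1] [1 1]]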

  9. volBrain: An Online MRI Brain Volumetry System.

    PubMed

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  10. The IBM PC at NASA Ames

    NASA Technical Reports Server (NTRS)

    Peredo, James P.

    1988-01-01

    Like many large companies, Ames relies heavily on its computing power to get work done. And, like many other large companies that have found the IBM PC a reliable tool, Ames uses it for many of the same types of functions as other companies. Presentation and clarification needs demand much of graphics packages. Programming and text editing needs require simpler, more powerful packages. The storage space needed by NASA's scientists and users for the monumental amounts of data that Ames needs to keep demands the best database packages, ones that are large and easy to use. Access to the Micom Switching Network combines the power of the IBM PC with the capabilities of other computers and mainframes and allows users to communicate electronically. These four primary capabilities of the PC are vital to the needs of NASA's users and help to continue and support the vast amounts of work done by NASA employees.

  11. More Than the Sum of the Parts: Satellite Aerosol Remote Sensing, and Its Relationship to Sub-Orbital Measurements and Models

    NASA Technical Reports Server (NTRS)

    Kahn, Ralph

    2016-01-01

    Space-borne instruments are providing increasing amounts of data relating to global aerosol spectral optical depth, horizontal and vertical distribution, and very loose, but spatially and temporally extensive, constraints on particle micro-physical properties. The data sets, and many of the underlying techniques, are evolving rapidly. They represent a vast amount of information, potentially useful to the AAAR community. However, there are also issues, some quite subtle, that scientific users must take into consideration. This tutorial will provide one view of the answers to the following four questions: 1) What satellite-derived aerosol products are available? 2) What are their strengths and limitations? 3) How are they being used now? 4) How might they be used in conjunction with each other, with sub-orbital measurements, and with models to address cutting-edge aerosol questions?

  12. Information collection and processing of dam distortion in digital reservoir system

    NASA Astrophysics Data System (ADS)

    Liang, Yong; Zhang, Chengming; Li, Yanling; Wu, Qiulan; Ge, Pingju

    2007-06-01

    The "digital reservoir" is usually understood as describing the whole reservoir with digital information technology to make it serve the human existence and development furthest. Strictly speaking, the "digital reservoir" is referred to describing vast information of the reservoir in different dimension and space-time by RS, GPS, GIS, telemetry, remote-control and virtual reality technology based on computer, multi-media, large-scale memory and wide-band networks technology for the human existence, development and daily work, life and entertainment. The core of "digital reservoir" is to realize the intelligence and visibility of vast information of the reservoir through computers and networks. The dam is main building of reservoir, whose safety concerns reservoir and people's safety. Safety monitoring is important way guaranteeing the dam's safety, which controls the dam's running through collecting the dam's information concerned and developing trend. Safety monitoring of the dam is the process from collection and processing of initial safety information to forming safety concept in the brain. The paper mainly researches information collection and processing of the dam by digital means.

  13. No hormone to rule them all: Interactions of plant hormones during the responses of plants to pathogens.

    PubMed

    Shigenaga, Alexandra M; Argueso, Cristiana T

    2016-08-01

    Plant hormones are essential regulators of plant growth and immunity. In the last few decades, a vast amount of information has been obtained detailing the role of different plant hormones in immunity, and how they work together to ultimately shape the outcomes of plant pathogen interactions. Here we provide an overview on the roles of the main classes of plant hormones in the regulation of plant immunity, highlighting their metabolic and signaling pathways and how plants and pathogens utilize these pathways to activate or suppress defence. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. The future of sustainable food production.

    PubMed

    Ronald, Pamela; Adamchak, Raoul

    2010-03-01

    By the year 2050, the number of people on Earth is expected to increase from the current 6.7 to 9.2 billion. What is the best way to produce enough food to feed all these people? If we continue with current farming practices, vast amounts of wilderness will be lost, millions of birds and billions of insects will die, farm workers will be at increased risk for disease, and the public will lose billions of dollars as a consequence of environmental degradation. Clearly, there must be a better way to resolve the need for increased food production with the desire to minimize its impact.

  15. The superdeep well of the Kola Peninsula

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozlovsky, Y.A.

    1986-01-01

    The structure of continental crusts is a subject of ever increasing importance in the geological sciences. Over 15 years ago, Soviet scientists began drilling a superdeep well on the Kola Peninsula near Murmansk. The well has reached a depth of 12 km and is thereby the deepest well in the world, yielding a vast amount of information on the structure of the continental crust. The geological, geophysical and technological data from the Kola well were initially published in a monographic account entitled ''Kol'skaja sverchglubokaja''. This English translation makes the results available to non-Soviet scientists as well.

  16. Ethanol production from renewable resources.

    PubMed

    Gong, C S; Cao, N J; Du, J; Tsao, G T

    1999-01-01

    Vast amounts of renewable biomass are available for conversion to liquid fuel, ethanol. In order to convert biomass to ethanol, the efficient utilization of both cellulose-derived and hemicellulose-derived carbohydrates is essential. Six-carbon sugars are readily utilized for this purpose. Pentoses, on the other hand, are more difficult to convert. Several metabolic factors limit the efficient utilization of pentoses (xylose and arabinose). Recent developments in the improvement of microbial cultures provide the versatility of conversion of both hexoses and pentoses to ethanol more efficiently. In addition, novel bioprocess technologies offer a promising prospective for the efficient conversion of biomass and recovery of ethanol.

  17. Understanding genetic variation - the value of systems biology.

    PubMed

    Hütt, Marc-Thorsten

    2014-04-01

    Pharmacology is currently transformed by the vast amounts of genome-associated information available for system-level interpretation. Here I review the potential of systems biology to facilitate this interpretation, thus paving the way for the emerging field of systems pharmacology. In particular, I will show how gene regulatory and metabolic networks can serve as a framework for interpreting high throughput data and as an interface to detailed dynamical models. In addition to the established connectivity analyses of effective networks, I suggest here to also analyze higher order architectural properties of effective networks. © 2013 The British Pharmacological Society.

  18. Interstellar Grains: 50 Years On

    NASA Astrophysics Data System (ADS)

    Wickramasinghe, N. Chandra

    2011-12-01

    Our understanding of the nature of interstellar grains has evolved considerably over the past half century, with the present author and Fred Hoyle being intimately involved at several key stages of progress. The currently fashionable graphite-silicate-organic grain model has all its essential aspects unequivocally traceable to original peer-reviewed publications by the author and/or Fred Hoyle. The prevailing reluctance to accept these clear-cut priorities may be linked to our further work that argued for interstellar grains and organics to have a biological provenance - a position perceived as heretical. The biological model, however, continues to provide a powerful unifying hypothesis for a vast amount of otherwise disconnected and disparate astronomical data.

  19. Perspective on pain management in the 21st century.

    PubMed

    Polomano, Rosemary C; Dunwoody, Colleen J; Krenzischek, Dina A; Rathmell, James P

    2008-02-01

    Pain is a predictable consequence of surgery or trauma. Untreated, it is associated with significant physiological, emotional, mental, and economic consequences. Despite the vast amount of current knowledge, uncontrolled postoperative pain is reported by approximately 50% of patients. Thus, techniques for effective acute pain management (APM) represent unmet educational needs. The significance of these unmet needs is reflected in the number of journal and textbook publications dedicated to disseminating research, evidence-based guidelines, and clinical information. Acknowledging the importance of APM, health care accrediting agencies and professional societies have become increasingly focused on ensuring that patients receive prompt and acceptable pain relief.

  20. Flashline Mars Arctic Research Station (FMARS) 2009 Crew Perspectives

    NASA Technical Reports Server (NTRS)

    Ferrone, Kristine; Cusack, Stacy L.; Garvin, Christy; Kramer, Walter Vernon; Palaia, Joseph E., IV; Shiro, Brian

    2010-01-01

    A crew of six "astronauts" inhabited the Mars Society s Flashline Mars Arctic Research Station (FMARS) for the month of July 2009, conducting a simulated Mars exploration mission. In addition to the various technical achievements during the mission, the crew learned a vast amount about themselves and about human factors relevant to a future mission to Mars. Their experiences, detailed in their own words, show the passion of those with strong commitment to space exploration and detail the human experiences for space explorers including separation from loved ones, interpersonal conflict, dietary considerations, and the exhilaration of surmounting difficult challenges.

  1. Enabling Large-Scale Biomedical Analysis in the Cloud

    PubMed Central

    Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. This planet-size data brings serious challenges to storage and computing technologies. Cloud computing is an attractive way to crack the nut because it gives concurrent consideration to storage and to high-performance computing on large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications can help biomedical research make the vast amount of diverse data meaningful and usable. PMID:24288665

  2. A Framework to Explore the Knowledge Structure of Multidisciplinary Research Fields

    PubMed Central

    Uddin, Shahadat; Khan, Arif; Baur, Louise A.

    2015-01-01

    Understanding emerging areas of a multidisciplinary research field is crucial for researchers, policymakers and other stakeholders. For them, a knowledge structure based on longitudinal bibliographic data can be an effective instrument, but with the vast amount of information available online it is often hard to derive such a knowledge structure from the data. In this paper, we present a novel approach for retrieving online bibliographic data and propose a framework for exploring knowledge structure. We also present several longitudinal analyses to interpret and visualize the last 20 years of published obesity research data. PMID:25915521

  3. UNICOS Kernel Internals Application Development

    NASA Technical Reports Server (NTRS)

    Caredo, Nicholas; Craw, James M. (Technical Monitor)

    1995-01-01

    Having an understanding of UNICOS Kernel Internals is valuable information. However, having the knowledge is only half the value. The second half comes with knowing how to use this information and apply it to the development of tools. The kernel contains vast amounts of useful information that can be utilized. This paper discusses the intricacies of developing utilities that utilize kernel information. In addition, algorithms, logic, and code will be discussed for accessing kernel information. Code segments will be provided that demonstrate how to locate and read kernel structures. Types of applications that can utilize kernel information will also be discussed.

  4. FaceIt: face recognition from static and live video for law enforcement

    NASA Astrophysics Data System (ADS)

    Atick, Joseph J.; Griffin, Paul M.; Redlich, A. N.

    1997-01-01

    Recent advances in image and pattern recognition technology--especially face recognition--are leading to the development of a new generation of information systems of great value to the law enforcement community. With these systems it is now possible to pool and manage vast amounts of biometric intelligence, such as face and fingerprint records, and conduct computerized searches on them. We review one of the enabling technologies underlying these systems: the FaceIt face recognition engine; and discuss three applications that illustrate its benefits as a problem-solving technology and an efficient and cost-effective investigative tool.

  5. Development of an Open Source Based Sensor Platform for an Advanced and Comprehensive in-situ DOC Monitoring

    NASA Astrophysics Data System (ADS)

    Schima, Robert; Goblirsch, Tobias; Paschen, Mathias; Rinke, Karsten; Schelwat, Heinz; Dietrich, Peter; Bumberger, Jan

    2016-04-01

    The impacts of global change, intensive agriculture and complex interactions between humans and the environment show different effects on different scales. The desire to obtain a better understanding of ecosystems and process dynamics in nature accentuates the need for observing these processes at higher temporal and spatial resolution. Especially with regard to the process dynamics and heterogeneity of water catchment areas, comprehensive monitoring of the ongoing processes and effects remains a challenging issue in the field of applied environmental research. Moreover, harsh conditions and a variety of influencing process parameters represent a particular challenge for adaptive in-situ monitoring of vast areas. Today, open source based electronics and cost-effective sensors and sensor components offer a promising approach for investigating new possibilities of smart-phone-based mobile data acquisition and comprehensive ad-hoc monitoring of environmental processes. Accordingly, our project aims at the development of new strategies for mobile data acquisition and real-time processing of user-specific environmental data, based on a holistic and integrated process. To this end, the concept of our monitoring system covers data collection, data processing and data integration as well as data provision within one infrastructure. The whole monitoring system consists of several mobile sensor devices, a smart phone app (Android) and a web service for data processing, data provision and data visualization. The smart phone app allows the configuration of the mobile sensor device and provides built-in functions such as data visualization and data transmission via e-mail. Besides measuring temperature and humidity in air, the mobile sensor device is able to acquire sensor readings for the content of dissolved organic compounds (λ = 254 nm) and turbidity (λ = 860 nm) of surface water, based on the developed optical in-situ sensor probe. Here, the miniaturized optical sensor probe allows the monitoring of even shallow water bodies with a depth of less than 5 cm. Compared to common techniques, the inexpensive sensor parts and robust emitting LEDs allow more widespread and comprehensive monitoring because a larger number of sensor devices can be deployed. Furthermore, the system contains a GPS module, a real-time clock and a GSM unit, which allow space- and time-resolved measurements. On October 6th, 2015 an initial experiment was started at the Bode catchment in the Harz region (Germany). Here, the developed DOC and turbidity sensor probes were installed directly at the riverside next to existing sampling points of a large-scale long-term observation project. The results show a good correspondence between our sensor development and the established instruments already installed. This represents a decisive and cost-effective contribution in the area of environmental research and the monitoring of vast catchment areas.
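
    For orientation, a single-wavelength optical reading of the kind described (attenuation of a 254 nm source through the water) is typically converted to a dissolved-organic-carbon estimate via a Beer-Lambert absorbance and a linear calibration. The sketch below shows only that generic calculation; the calibration constants and raw counts are placeholders, not values from the paper or from the described sensor.

      # Minimal sketch: UV254 absorbance from transmitted vs. reference intensity,
      # then a hypothetical linear calibration to a DOC concentration.
      import math

      def absorbance(i_transmitted, i_reference):
          """Decadic absorbance A = -log10(I / I0)."""
          return -math.log10(i_transmitted / i_reference)

      def doc_estimate(a254, path_length_cm, slope=12.0, intercept=0.1):
          """Hypothetical linear calibration from absorbance per cm to DOC (mg/L)."""
          return slope * (a254 / path_length_cm) + intercept

      a = absorbance(i_transmitted=412.0, i_reference=980.0)   # placeholder sensor counts
      print(f"A254 = {a:.3f}, DOC approx. {doc_estimate(a, path_length_cm=1.0):.1f} mg/L")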

  6. Using information theory to identify redundancy in common laboratory tests in the intensive care unit.

    PubMed

    Lee, Joon; Maslove, David M

    2015-07-31

    Clinical workflow is infused with large quantities of data, particularly in areas with enhanced monitoring such as the Intensive Care Unit (ICU). Information theory can quantify the expected amounts of total and redundant information contained in a given clinical data type, and as such has the potential to inform clinicians on how to manage the vast volumes of data they are required to analyze in their daily practice. The objective of this proof-of-concept study was to quantify the amounts of redundant information associated with common ICU lab tests. We analyzed the information content of 11 laboratory test results from 29,149 adult ICU admissions in the MIMIC II database. Information theory was applied to quantify the expected amount of redundant information both between lab values from the same ICU day and between consecutive ICU days. Most lab values showed a decreasing trend over time in the expected amount of novel information they contained. Platelet, blood urea nitrogen (BUN), and creatinine measurements exhibited the greatest amount of redundant information on days 2 and 3 compared to the previous day. The creatinine-BUN and sodium-chloride pairs had the most redundancy. Information theory can help identify and discourage unnecessary testing and bloodwork, and can in general be a useful data analytic technique for many medical specialties that deal with information overload.
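
    The core quantity behind this kind of analysis is the mutual information between a lab value on one day and the same lab on the next: the larger it is relative to the entropy of the second measurement, the more redundant the repeat test. The sketch below computes it for synthetic, discretized values; the data and binning are illustrative only, not the MIMIC II analysis from the paper.

      # Minimal sketch: mutual information (bits) between two discretized lab-value
      # sequences, using plug-in probability estimates.
      import numpy as np
      from collections import Counter

      def mutual_information(x, y):
          """Mutual information in bits between two discrete sequences of equal length."""
          n = len(x)
          px = Counter(x)
          py = Counter(y)
          pxy = Counter(zip(x, y))
          mi = 0.0
          for (a, b), c in pxy.items():
              p_ab = c / n
              mi += p_ab * np.log2(p_ab / ((px[a] / n) * (py[b] / n)))
          return mi

      rng = np.random.default_rng(0)
      day1 = rng.normal(1.0, 0.3, 5000)                # e.g. a creatinine-like value on day 1
      day2 = 0.9 * day1 + rng.normal(0, 0.1, 5000)     # highly correlated day-2 value
      bins = np.quantile(day1, np.linspace(0, 1, 11))  # decile binning
      d1 = np.digitize(day1, bins)
      d2 = np.digitize(day2, bins)
      print(f"I(day1; day2) = {mutual_information(d1, d2):.2f} bits")
      print(f"H(day2)       = {mutual_information(d2, d2):.2f} bits (upper bound)")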

  7. Parallel In Situ Indexing for Data-intensive Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jinoh; Abbasi, Hasan; Chacon, Luis

    2011-09-09

    As computing power increases exponentially, vast amounts of data are created by many scientific research activities. However, the bandwidth for storing data to disks and reading data from disks has been improving at a much slower pace. These two trends produce an ever-widening data access gap. Our work brings together two distinct technologies to address this data access issue: indexing and in situ processing. From decades of database research literature, we know that indexing is an effective way to address the data access issue, particularly for accessing a relatively small fraction of data records. As data sets increase in size, more and more analysts need selective data access, which makes indexing even more important for improving data access. The challenge is that most implementations of indexing technology are embedded in large database management systems (DBMS), but most scientific datasets are not managed by any DBMS. In this work, we choose to include indexes with the scientific data instead of requiring the data to be loaded into a DBMS. We use compressed bitmap indexes from the FastBit software, which are known to be highly effective for the query-intensive workloads common to scientific data analysis. To use the indexes, we need to build them first. The index building procedure needs to access the whole data set and may also require a significant amount of compute time. In this work, we adapt in situ processing technology to generate the indexes, thus removing the need to read data from disks and allowing the indexes to be built in parallel. The in situ data processing system used is ADIOS, a middleware for high-performance I/O. Our experimental results show that the indexes can improve the data access time up to 200 times depending on the fraction of data selected, and that using an in situ data processing system can effectively reduce the time needed to create the indexes, up to 10 times with our in situ technique when using identical parallel settings.
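
    For readers unfamiliar with bitmap indexing, the sketch below shows the basic idea in miniature: one bitmask per value bin, so a range query becomes a few cheap bitwise ORs instead of a scan of the raw records. It is illustrative only; FastBit adds word-aligned compression and, in the work above, the indexes are built in situ by the I/O middleware rather than in a post-hoc script like this.

      # Minimal sketch of a (uncompressed) bitmap index and a range query over it.
      import numpy as np

      def build_bitmap_index(values, bin_edges):
          """Return one boolean mask per bin: mask[i] marks records falling in bin i."""
          bin_ids = np.digitize(values, bin_edges)
          return [bin_ids == i for i in range(len(bin_edges) + 1)]

      def range_query(bitmaps, lo_bin, hi_bin):
          """Select record indices whose bin lies in [lo_bin, hi_bin] by OR-ing bitmaps."""
          mask = np.zeros_like(bitmaps[0])
          for i in range(lo_bin, hi_bin + 1):
              mask |= bitmaps[i]
          return np.nonzero(mask)[0]

      rng = np.random.default_rng(1)
      temperature = rng.uniform(0.0, 100.0, 1_000_000)   # one variable of a large dataset
      edges = np.linspace(0.0, 100.0, 21)                # 5-degree bins
      index = build_bitmap_index(temperature, edges)
      hot = range_query(index, 19, 21)                   # roughly T >= 90
      print(len(hot), "records selected out of", len(temperature))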

  8. Multiscale Methods, Parallel Computation, and Neural Networks for Real-Time Computer Vision.

    NASA Astrophysics Data System (ADS)

    Battiti, Roberto

    1990-01-01

    This thesis presents new algorithms for low and intermediate level computer vision. The guiding ideas in the presented approach are those of hierarchical and adaptive processing, concurrent computation, and supervised learning. Processing of the visual data at different resolutions is used not only to reduce the amount of computation necessary to reach the fixed point, but also to produce a more accurate estimation of the desired parameters. The presented adaptive multiple scale technique is applied to the problem of motion field estimation. Different parts of the image are analyzed at a resolution that is chosen in order to minimize the error in the coefficients of the differential equations to be solved. Tests with video-acquired images show that velocity estimation is more accurate over a wide range of motion with respect to the homogeneous scheme. In some cases introduction of explicit discontinuities coupled to the continuous variables can be used to avoid propagation of visual information from areas corresponding to objects with different physical and/or kinematic properties. The human visual system uses concurrent computation in order to process the vast amount of visual data in "real-time." Although with different technological constraints, parallel computation can be used efficiently for computer vision. All the presented algorithms have been implemented on medium grain distributed memory multicomputers with a speed-up approximately proportional to the number of processors used. A simple two-dimensional domain decomposition assigns regions of the multiresolution pyramid to the different processors. The inter-processor communication needed during the solution process is proportional to the linear dimension of the assigned domain, so that efficiency is close to 100% if a large region is assigned to each processor. Finally, learning algorithms are shown to be a viable technique to engineer computer vision systems for different applications starting from multiple-purpose modules. In the last part of the thesis a well known optimization method (the Broyden-Fletcher-Goldfarb-Shanno memoryless quasi-Newton method) is applied to simple classification problems and shown to be superior to the "error back-propagation" algorithm for numerical stability, automatic selection of parameters, and convergence properties.
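
    The multiresolution component of such schemes rests on an image pyramid. The sketch below shows only the pyramid construction (2x2 block averaging); the adaptive per-region level-selection rule described in the thesis is not reproduced here, and the names are illustrative.

      # Minimal sketch: build a multiresolution pyramid by repeated 2x2 averaging.
      import numpy as np

      def downsample(img):
          """Halve resolution by averaging 2x2 blocks."""
          h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
          img = img[:h, :w]
          return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

      def build_pyramid(img, levels=4):
          pyramid = [img.astype(float)]
          for _ in range(levels - 1):
              pyramid.append(downsample(pyramid[-1]))
          return pyramid

      frame = np.random.default_rng(0).random((256, 256))   # stand-in for a video frame
      for lvl, im in enumerate(build_pyramid(frame)):
          print(f"level {lvl}: {im.shape}")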

  9. E-Collaboration for Earth Observation (E-CEO) with the example of Contest #3 that focuses on the Atmospheric Correction of Ocean Colour data

    NASA Astrophysics Data System (ADS)

    Lavender, Samantha; Brito, Fabrice; Aas, Christina; Casu, Francesco; Ribeiro, Rita; Farres, Jordi

    2014-05-01

    Data challenges are becoming the new method to promote innovation within data-intensive applications, building or evolving user communities and potentially developing sustainable commercial services. They can utilise the vast amount of information (both in scope and volume) that is available online, and they profit from reduced processing costs. Data challenges are also closely related to the recent paradigm shift towards e-Science, also referred to as 'data-intensive science'. The E-CEO project aims to deliver a collaborative platform that, through Data Challenge Contests, will improve the adoption and outreach of new applications and methods to process Earth Observation (EO) data. Underneath, the backbone must be a common environment where the applications can be developed, deployed and executed. The results then need to be easily published in a common visualization platform for effective validation, evaluation and transparent peer comparison. Contest #3 is based around the atmospheric correction (AC) of ocean colour data, with a particular focus on the use of auxiliary data files for processing Level 1 products (Top of Atmosphere, TOA, calibrated radiances/reflectances) to Level 2 products (Bottom of Atmosphere, BOA, calibrated radiances/reflectances and derived products). Scientific researchers commonly accept the auxiliary inputs that they have been provided with and/or use the climatological data that accompanies the processing software, often because it can be difficult to obtain multiple data sources and convert them into a format the software accepts. Therefore, it is proposed to compare various ocean colour AC approaches and in the process study the uncertainties associated with using different meteorological auxiliary products for the processing of Medium Resolution Imaging Spectrometer (MERIS) data, i.e. the sensitivity to different atmospheric correction input assumptions.

  10. Information Overload?

    ERIC Educational Resources Information Center

    Doring, Allan

    1999-01-01

    Access to information is inadequate without attention to learning processes and the use of information in the production of knowledge. The ability to sift sources, discriminate, and think is far more important than the ability to find vast quantities of information. (SK)

  11. The semantic web and computer vision: old AI meets new AI

    NASA Astrophysics Data System (ADS)

    Mundy, J. L.; Dong, Y.; Gilliam, A.; Wagner, R.

    2018-04-01

    There has been vast progress in linking semantic information across the billions of web pages through the use of ontologies encoded in the Web Ontology Language (OWL) based on the Resource Description Framework (RDF). A prime example is Wikipedia, where the knowledge contained in its more than four million pages is encoded in an ontological database called DBpedia (http://wiki.dbpedia.org/). Web-based query tools can retrieve semantic information from DBpedia encoded in interlinked ontologies that can be accessed using natural language. This paper will show how this vast context can be used to automate the process of querying images and other geospatial data in support of reporting changes in structures and activities. Computer vision algorithms are selected and provided with context based on natural language requests for monitoring and analysis. The resulting reports provide semantically linked observations from images and 3D surface models.
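
    To give a feel for the kind of ontological lookup involved, the sketch below pulls the English abstract and coordinates of a landmark from DBpedia with a SPARQL query, using the SPARQLWrapper package. The specific resource and properties are chosen purely for illustration and may not exist for every entry; this is not the query pipeline of the paper.

      # Minimal sketch: querying DBpedia's public SPARQL endpoint for semantic context.
      from SPARQLWrapper import SPARQLWrapper, JSON

      sparql = SPARQLWrapper("https://dbpedia.org/sparql")
      sparql.setQuery("""
          PREFIX dbr: <http://dbpedia.org/resource/>
          PREFIX dbo: <http://dbpedia.org/ontology/>
          PREFIX geo: <http://www.w3.org/2003/01/geo/wgs84_pos#>
          SELECT ?abstract ?lat ?long WHERE {
            dbr:Hoover_Dam dbo:abstract ?abstract ;
                           geo:lat ?lat ;
                           geo:long ?long .
            FILTER (lang(?abstract) = "en")
          }
      """)
      sparql.setReturnFormat(JSON)
      results = sparql.query().convert()

      for row in results["results"]["bindings"]:
          print(row["lat"]["value"], row["long"]["value"])
          print(row["abstract"]["value"][:200], "...")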

  12. Conceptual and Epistemological Undercurrents of Learning as a Process of Change

    ERIC Educational Resources Information Center

    Montfort, Devlin B.

    2011-01-01

    In the preparation and education of civil engineers it is essential not only to increase student knowledge of the world (conceptual understanding), but also to establish and develop new ways of thinking (epistemology). Both of these processes of change can be considered learning, but they are vastly different in the time, energy and resources they…

  13. Inversion of multicomponent seismic data and rock-physics intepretation for evaluating lithology, fracture and fluid distribution in heterogeneous anisotropic reservoirs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ilya Tsvankin; Kenneth L. Larner

    2004-11-17

    Within the framework of this collaborative project with the Lawrence Livermore National Laboratory (LLNL) and Stanford University, the Colorado School of Mines (CSM) group developed and implemented a new efficient approach to the inversion and processing of multicomponent, multiazimuth seismic data in anisotropic media. To avoid serious difficulties in the processing of mode-converted (PS) waves, we devised a methodology for transforming recorded PP- and PS-wavefields into the corresponding SS-wave reflection data that can be processed by velocity-analysis algorithms designed for pure (unconverted) modes. It should be emphasized that this procedure does not require knowledge of the velocity model and can be applied to data from arbitrarily anisotropic, heterogeneous media. The azimuthally varying reflection moveouts of the PP-waves and constructed SS-waves are then combined in anisotropic stacking-velocity tomography to estimate the velocity field in the depth domain. As illustrated by the case studies discussed in the report, migration of the multicomponent data with the obtained anisotropic velocity model yields a crisp image of the reservoir that is vastly superior to that produced by conventional methods. The scope of this research essentially amounts to building the foundation of 3D multicomponent, anisotropic seismology. We have also worked with the LLNL and Stanford groups on relating the anisotropic parameters obtained from seismic data to stress, lithology, and fluid distribution using a generalized theoretical treatment of fractured, poroelastic rocks.

  14. MGmapper: Reference based mapping and taxonomy annotation of metagenomics sequence reads

    PubMed Central

    Lukjancenko, Oksana; Thomsen, Martin Christen Frølund; Maddalena Sperotto, Maria; Lund, Ole; Møller Aarestrup, Frank; Sicheritz-Pontén, Thomas

    2017-01-01

    An increasing number of species and gene identification studies rely on the use of next generation sequence analysis of either single isolate or metagenomics samples. Several methods are available to perform taxonomic annotations, and a previous metagenomics benchmark study has shown that a vast number of false positive species annotations are a problem unless thresholds or post-processing are applied to differentiate between correct and false annotations. MGmapper is a package to process raw next generation sequence data and perform reference based sequence assignment, followed by a post-processing analysis to produce reliable taxonomy annotation at species and strain level resolution. An in-vitro bacterial mock community sample comprising 8 genera, 11 species and 12 strains was previously used to benchmark metagenomics classification methods. After applying a post-processing filter, we obtained 100% correct taxonomy assignments at species and genus level. A sensitivity and precision of 75% was obtained for strain level annotations. A comparison between MGmapper and Kraken at species level shows that MGmapper assigns taxonomy at species level using 84.8% of the sequence reads, compared to 70.5% for Kraken, and both methods identified all species with no false positives. Extensive read count statistics are provided in plain text and Excel sheets for both rejected and accepted taxonomy annotations. The use of custom databases is possible with the command-line version of MGmapper, and the complete pipeline is freely available as a Bitbucket package (https://bitbucket.org/genomicepidemiology/mgmapper). A web version (https://cge.cbs.dtu.dk/services/MGmapper) provides the basic functionality for analysis of small fastq datasets. PMID:28467460
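
    The sketch below illustrates, in miniature, the kind of post-processing evaluation described above: a simple read-count threshold drops likely false-positive taxa, and the surviving annotations are scored against a known mock community. The threshold, the toy counts and the function names are purely illustrative; MGmapper's actual filter uses additional criteria.

      # Minimal sketch: threshold filter plus precision/sensitivity against a mock truth set.
      def filter_annotations(read_counts, min_reads=50):
          """Keep only taxa supported by at least `min_reads` mapped reads."""
          return {taxon for taxon, n in read_counts.items() if n >= min_reads}

      def precision_sensitivity(predicted, truth):
          tp = len(predicted & truth)
          precision = tp / len(predicted) if predicted else 0.0
          sensitivity = tp / len(truth) if truth else 0.0
          return precision, sensitivity

      # Toy mock community: read counts per annotated species (last two are spurious hits).
      counts = {"Escherichia coli": 12000, "Staphylococcus aureus": 8000,
                "Bacillus subtilis": 3500, "Listeria monocytogenes": 30,
                "Salmonella enterica": 12}
      truth = {"Escherichia coli", "Staphylococcus aureus", "Bacillus subtilis"}

      kept = filter_annotations(counts)
      p, s = precision_sensitivity(kept, truth)
      print(f"kept: {sorted(kept)}")
      print(f"precision = {p:.2f}, sensitivity = {s:.2f}")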

  15. MGmapper: Reference based mapping and taxonomy annotation of metagenomics sequence reads.

    PubMed

    Petersen, Thomas Nordahl; Lukjancenko, Oksana; Thomsen, Martin Christen Frølund; Maddalena Sperotto, Maria; Lund, Ole; Møller Aarestrup, Frank; Sicheritz-Pontén, Thomas

    2017-01-01

    An increasing number of species and gene identification studies rely on the use of next generation sequence analysis of either single isolate or metagenomics samples. Several methods are available to perform taxonomic annotations, and a previous metagenomics benchmark study has shown that a vast number of false positive species annotations are a problem unless thresholds or post-processing are applied to differentiate between correct and false annotations. MGmapper is a package to process raw next generation sequence data and perform reference based sequence assignment, followed by a post-processing analysis to produce reliable taxonomy annotation at species and strain level resolution. An in-vitro bacterial mock community sample comprising 8 genera, 11 species and 12 strains was previously used to benchmark metagenomics classification methods. After applying a post-processing filter, we obtained 100% correct taxonomy assignments at species and genus level. A sensitivity and precision of 75% was obtained for strain level annotations. A comparison between MGmapper and Kraken at species level shows that MGmapper assigns taxonomy at species level using 84.8% of the sequence reads, compared to 70.5% for Kraken, and both methods identified all species with no false positives. Extensive read count statistics are provided in plain text and Excel sheets for both rejected and accepted taxonomy annotations. The use of custom databases is possible with the command-line version of MGmapper, and the complete pipeline is freely available as a Bitbucket package (https://bitbucket.org/genomicepidemiology/mgmapper). A web version (https://cge.cbs.dtu.dk/services/MGmapper) provides the basic functionality for analysis of small fastq datasets.

  16. Physical Analytics: An emerging field with real-world applications and impact

    NASA Astrophysics Data System (ADS)

    Hamann, Hendrik

    2015-03-01

    In the past, most information on the internet has originated from humans or computers. However, with the emergence of cyber-physical systems, vast amounts of data are now being created by sensors on devices, machines, etc., digitizing the physical world. While cyber-physical systems are the subject of active research around the world, the vast amount of actual data generated from the physical world has so far attracted little attention from the engineering and physics community. In this presentation we use examples to highlight the opportunities in this new subject of "Physical Analytics" for highly interdisciplinary research (including physics, engineering and computer science), which aims at understanding real-world physical systems by leveraging cyber-physical technologies. More specifically, the convergence of the physical world with the digital domain allows physical principles to be applied to everyday problems in a much more effective and informed way than was possible in the past. Very much as traditional applied physics and engineering have made enormous advances and changed our lives by making detailed measurements to understand the physics of an engineered device, we can now apply the same rigor and principles to understand large-scale physical systems. In the talk we first present a set of "configurable" enabling technologies for Physical Analytics, including ultralow power sensing and communication technologies, physical big data management technologies, numerical modeling for physical systems, machine learning based physical model blending, and physical analytics based automation and control. Then we discuss in detail several concrete applications of Physical Analytics, ranging from energy management in buildings and data centers, environmental sensing and controls, and precision agriculture to renewable energy forecasting and management.

  17. Data Strategies to Support Automated Multi-Sensor Data Fusion in a Service Oriented Architecture

    DTIC Science & Technology

    2008-06-01

    … and employ vast quantities of content. This dissertation provides two software architectural patterns and an auto-fusion process that guide the development of a distributed … Keywords: Universal Description, Discovery and Integration (UDDI); Simple Object Access Protocol (SOAP); Java; Maritime Domain Awareness (MDA); Business Process Execution Language for Web Services (BPEL4WS).

  18. Collecting, processing, and integrating GPS data into GIS

    DOT National Transportation Integrated Search

    2002-01-01

    A vast storehouse of information exists on nearly every subject of concern to highway administrators and engineers. Much of this information has resulted from both research and the successful application of solutions to the problems faced by prac...

  19. Mapping the global potential for marine aquaculture.

    PubMed

    Gentry, Rebecca R; Froehlich, Halley E; Grimm, Dietmar; Kareiva, Peter; Parke, Michael; Rust, Michael; Gaines, Steven D; Halpern, Benjamin S

    2017-09-01

    Marine aquaculture presents an opportunity for increasing seafood production in the face of growing demand for marine protein and limited scope for expanding wild fishery harvests. However, the global capacity for increased aquaculture production from the ocean and the relative productivity potential across countries are unknown. Here, we map the biological production potential for marine aquaculture across the globe using an innovative approach that draws from physiology, allometry and growth theory. Even after applying substantial constraints based on existing ocean uses and limitations, we find vast areas in nearly every coastal country that are suitable for aquaculture. The development potential far exceeds the space required to meet foreseeable seafood demand; indeed, the current total landings of all wild-capture fisheries could be produced using less than 0.015% of the global ocean area. This analysis demonstrates that suitable space is unlikely to limit marine aquaculture development and highlights the role that other factors, such as economics and governance, play in shaping growth trajectories. We suggest that the vast amount of space suitable for marine aquaculture presents an opportunity for countries to develop aquaculture in a way that aligns with their economic, environmental and social objectives.

  20. Development of capability for microtopography-resolving simulations of hydrologic processes in permafrost affected regions

    NASA Astrophysics Data System (ADS)

    Painter, S.; Moulton, J. D.; Berndt, M.; Coon, E.; Garimella, R.; Lewis, K. C.; Manzini, G.; Mishra, P.; Travis, B. J.; Wilson, C. J.

    2012-12-01

    The frozen soils of the Arctic and subarctic regions contain vast amounts of stored organic carbon. This carbon is vulnerable to release to the atmosphere as temperatures warm and permafrost degrades. Understanding the response of the subsurface and surface hydrologic system to degrading permafrost is key to understanding the rate, timing, and chemical form of potential carbon releases to the atmosphere. Simulating the hydrologic system in degrading permafrost regions is challenging because of the potential for topographic evolution and associated drainage network reorganization as permafrost thaws and massive ground ice melts. The critical process models required for simulating hydrology include subsurface thermal hydrology of freezing/thawing soils, thermal processes within ice wedges, mechanical deformation processes, overland flow, and surface energy balances including snow dynamics. A new simulation tool, the Arctic Terrestrial Simulator (ATS), is being developed to simulate these coupled processes. The computational infrastructure must accommodate fully unstructured grids that track evolving topography, allow accurate solutions on distorted grids, provide robust and efficient solutions on highly parallel computer architectures, and enable flexibility in the strategies for coupling among the various processes. The ATS is based on Amanzi (Moulton et al. 2012), an object-oriented multi-process simulator written in C++ that provides much of the necessary computational infrastructure. Status and plans for the ATS including major hydrologic process models and validation strategies will be presented. Highly parallel simulations of overland flow using high-resolution digital elevation maps of polygonal patterned ground landscapes demonstrate the feasibility of the approach. Simulations coupling three-phase subsurface thermal hydrology with a simple thaw-induced subsidence model illustrate the strong feedbacks among the processes. D. Moulton, M. Berndt, M. Day, J. Meza, et al., High-Level Design of Amanzi, the Multi-Process High Performance Computing Simulator, Technical Report ASCEM-HPC-2011-03-1, DOE Environmental Management, 2012.

  1. Fostering Autonomy through Syllabus Design: A Step-by-Step Guide for Success

    ERIC Educational Resources Information Center

    Ramírez Espinosa, Alexánder

    2016-01-01

    Promoting learner autonomy is relevant in the field of applied linguistics due to the multiple benefits it brings to the process of learning a new language. However, despite the vast array of research on how to foster autonomy in the language classroom, it is difficult to find step-by-step processes to design syllabi and curricula focused on the…

  2. All-memristive neuromorphic computing with level-tuned neurons

    NASA Astrophysics Data System (ADS)

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-01

    In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from the vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture, along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors, represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.
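
    For readers unfamiliar with the spiking-neuron building block, the sketch below shows a leaky integrate-and-fire neuron in which each neuron of a layer is given a different firing threshold, a crude purely-numerical stand-in for the "level-tuned" preferential processing described above. The real system realises the integration with phase-change memristor dynamics; nothing here reproduces the paper's architecture.

      # Minimal sketch: a layer of leaky integrate-and-fire neurons with different thresholds.
      import numpy as np

      class LIFNeuron:
          def __init__(self, threshold, leak=0.9):
              self.threshold = threshold   # each neuron gets its own "tuning level"
              self.leak = leak             # membrane leak per time step
              self.v = 0.0                 # membrane potential

          def step(self, input_current):
              self.v = self.leak * self.v + input_current
              if self.v >= self.threshold:
                  self.v = 0.0             # reset after the spike
                  return 1
              return 0

      rng = np.random.default_rng(0)
      layer = [LIFNeuron(threshold=t) for t in (2.0, 5.0, 10.0)]   # three tuning levels
      inputs = rng.uniform(0.0, 1.5, 200)                          # shared input stream

      spikes = np.array([[n.step(x) for n in layer] for x in inputs])
      print("spike counts per neuron:", spikes.sum(axis=0))        # low threshold fires most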

  3. All-memristive neuromorphic computing with level-tuned neurons.

    PubMed

    Pantazi, Angeliki; Woźniak, Stanisław; Tuma, Tomas; Eleftheriou, Evangelos

    2016-09-02

    In the new era of cognitive computing, systems will be able to learn and interact with the environment in ways that will drastically enhance the capabilities of current processors, especially in extracting knowledge from the vast amounts of data obtained from many sources. Brain-inspired neuromorphic computing systems increasingly attract research interest as an alternative to the classical von Neumann processor architecture, mainly because of the coexistence of memory and processing units. In these systems, the basic components are neurons interconnected by synapses. The neurons, based on their nonlinear dynamics, generate spikes that provide the main communication mechanism. The computational tasks are distributed across the neural network, where synapses implement both the memory and the computational units, by means of learning mechanisms such as spike-timing-dependent plasticity. In this work, we present an all-memristive neuromorphic architecture comprising neurons and synapses realized by using the physical properties and state dynamics of phase-change memristors. The architecture employs a novel concept of interconnecting the neurons in the same layer, resulting in level-tuned neuronal characteristics that preferentially process input information. We demonstrate the proposed architecture in the tasks of unsupervised learning and detection of multiple temporal correlations in parallel input streams. The efficiency of the neuromorphic architecture, along with the homogeneous neuro-synaptic dynamics implemented with nanoscale phase-change memristors, represents a significant step towards the development of ultrahigh-density neuromorphic co-processors.

  4. Cytomics - importance of multimodal analysis of cell function and proliferation in oncology.

    PubMed

    Tárnok, A; Bocsi, J; Brockhoff, G

    2006-12-01

    Cancer is a highly complex and heterogeneous disease involving a succession of genetic changes (frequently caused or accompanied by exogenous trauma), and resulting in a molecular phenotype that in turn results in a malignant specification. The development of malignancy has been described as a multistep process involving self-sufficiency in growth signals, insensitivity to antigrowth signals, evasion of apoptosis, limitless replicative potential, sustained angiogenesis, and finally tissue invasion and metastasis. The quantitative analysis of networking molecules within the cells might be applied to understand native-state tissue signalling biology, complex drug actions and dysfunctional signalling in transformed cells, that is, in cancer cells. High-content and high-throughput single-cell analysis can lead to systems biology and cytomics. The application of cytomics in cancer research and diagnostics is very broad, ranging from the better understanding of the tumour cell biology to the identification of residual tumour cells after treatment, to drug discovery. The ultimate goal is to pinpoint in detail these processes on the molecular, cellular and tissue level. A comprehensive knowledge of these will require tissue analysis, which is multiplex and functional; thus, vast amounts of data are being collected from current genomic and proteomic platforms for integration and interpretation as well as for new varieties of updated cytomics technology. This overview will briefly highlight the most important aspects of this continuously developing field.

  5. A methodological approach to be used in integrated coastal zone management processes: the case of the Catalan Coast (Catalonia, Spain)

    NASA Astrophysics Data System (ADS)

    Sardá, Rafael; Avila, Conxita; Mora, Joan

    2005-02-01

    Since early 1999, we have been working on an environmental information system as a preliminary phase in developing the National Strategy of the Catalan Coast. Using the tourism industry as the main pressure driver and the municipality as the territorial unit, we have compiled a vast amount of information that has been converted into an information platform for the general public, politicians, and public administrators. Working in close co-operation with the planning authorities of the Generalitat of Catalonia, we developed decision support tools as a methodological approach for coastal management. The decision support system is composed of: (a) the development of an environmental indicator-based report; (b) the use of a geographical information system (GIS); and (c) the incorporation of different types of graphical packages. These tools have been applied to the 70 municipalities of the Catalan Coast, and a specific development of the system was carried out in the region of La Selva, in the municipalities of Blanes, Lloret de Mar, and Tossa de Mar (southern Costa Brava, Girona). The system has been designed to help coastal managers in Catalonia, and it is intended to be used in the process of developing the National Strategy for Integrated Coastal Zone Management (ICZM) of the Catalan Coast, following the EC Recommendation (COM/00/545).

  6. Desulfurization of fuel gases in fluidized bed gasification and hot fuel gas cleanup systems

    DOEpatents

    Steinberg, M.; Farber, G.; Pruzansky, J.; Yoo, H.J.; McGauley, P.

    1983-08-26

    A problem with the commercialization of fluidized bed gasification is that vast amounts of spent sorbent are generated if the sorbent is used on a once-through basis, especially if high sulfur coals are burned. The requirements of a sorbent for regenerative service in the FBG process are: (1) it must be capable of reducing the sulfur containing gas concentration of the FBG flue gas to within acceptable environmental standards; (2) it must not lose its reactivity on cyclic sulfidation and regeneration; (3) it must be capable of regeneration with elimination of substantially all of its sulfur content; (4) it must have good attrition resistance; and (5) its cost must not be prohibitive. It has now been discovered that calcium silicate pellets, e.g., Portland cement type III pellets, meet the criteria aforesaid. Calcium silicate removes COS and H2S according to the reactions given to produce calcium sulfide silicate. The sulfur containing product can be regenerated using CO2 as the regenerant. The sulfur dioxide can be conveniently reduced to sulfur with hydrogen or carbon for market or storage. The basic reactions in the process of this invention are the reactions with calcium silicate given in the patent. A convenient and inexpensive source of calcium silicate is Portland cement. Portland cement is a readily available, widely used construction material.

  7. Convergent Time-Varying Regression Models for Data Streams: Tracking Concept Drift by the Recursive Parzen-Based Generalized Regression Neural Networks.

    PubMed

    Duda, Piotr; Jaworski, Maciej; Rutkowski, Leszek

    2018-03-01

    One of the greatest challenges in data mining is related to the processing and analysis of massive data streams. Contrary to traditional static data mining problems, data streams require that each element is processed only once, that the amount of allocated memory remains constant, and that the models incorporate changes in the investigated streams. The vast majority of available methods have been developed for data stream classification, and only a few of them attempt to solve regression problems, using various heuristic approaches. In this paper, we develop mathematically justified regression models working in a time-varying environment. More specifically, we study incremental versions of generalized regression neural networks, called IGRNNs, and we prove their tracking properties: weak (in probability) and strong (with probability one) convergence under various concept drift scenarios. First, we present the IGRNNs, based on the Parzen kernels, for modeling stationary systems under nonstationary noise. Next, we extend our approach to modeling time-varying systems under nonstationary noise. We present several types of concept drift to be handled by our approach in such a way that weak and strong convergence holds under certain conditions. Finally, in a series of simulations, we compare our method with commonly used heuristic approaches based on forgetting mechanisms or sliding windows for dealing with concept drift, and we apply our approach in a real-life scenario, solving the problem of currency exchange rate prediction.
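
    As a rough illustration of the kind of estimator the abstract refers to (not the authors' IGRNN construction or its convergence analysis), the sketch below maintains a Parzen-kernel (Nadaraya-Watson) regression estimate recursively on a fixed grid of query points, so each stream element is processed exactly once and memory stays constant. The kernel choice, bandwidth schedule and names are illustrative assumptions only.

```python
import numpy as np

class RecursiveParzenRegressor:
    """Streaming Parzen-kernel (Nadaraya-Watson) regression estimate kept on a
    fixed grid of query points: constant memory, one pass over the stream.
    Simplified illustration only; it does not reproduce the paper's IGRNN."""

    def __init__(self, grid, h0=0.5, decay=0.25):
        self.grid = np.asarray(grid, dtype=float)  # query points x
        self.num = np.zeros_like(self.grid)        # running sum of y_i * K(.)
        self.den = np.zeros_like(self.grid)        # running sum of K(.)
        self.n = 0
        self.h0 = h0
        self.decay = decay                         # bandwidth shrink rate (illustrative)

    def _kernel(self, u):
        # Gaussian Parzen kernel
        return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

    def update(self, x, y):
        """Incorporate one stream element (x, y), then discard it."""
        self.n += 1
        h = self.h0 * self.n ** (-self.decay)      # slowly decreasing bandwidth
        w = self._kernel((self.grid - x) / h) / h
        self.num += y * w
        self.den += w

    def predict(self):
        """Current regression estimate evaluated on the grid."""
        return np.where(self.den > 0, self.num / np.maximum(self.den, 1e-12), 0.0)


# Toy usage: a noisy stream drawn from y = sin(2*pi*x)
rng = np.random.default_rng(0)
model = RecursiveParzenRegressor(grid=np.linspace(0, 1, 50))
for _ in range(5000):
    x = rng.uniform(0, 1)
    model.update(x, np.sin(2 * np.pi * x) + 0.3 * rng.normal())
print(model.predict()[:5])
```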

  8. A Deep Learning Approach to LIBS Spectroscopy for Planetary Applications

    NASA Astrophysics Data System (ADS)

    Mullen, T. H.; Parente, M.; Gemp, I.; Dyar, M. D.

    2017-12-01

    The ChemCam instrument on the Curiosity rover has collected >440,000 laser-induced breakdown spectra (LIBS) from 1500 different geological targets since 2012. The team is using a pipeline of preprocessing and partial least squares (PLS) techniques to predict compositions of surface materials [1]. Unfortunately, such multivariate techniques are plagued by hard-to-meet assumptions involving constant hyperparameter tuning to specific elements and the amount of training data available; if the whole distribution of data is not seen, the method will overfit to the training data and generalizability will suffer. The rover has only 10 calibration targets on board, representing a small subset of the geochemical samples the rover is expected to investigate. Deep neural networks have been used to bypass these issues in other fields. Semi-supervised techniques allow researchers to utilize small labeled datasets together with vast amounts of unlabeled data. One example is the variational autoencoder, a semi-supervised generative model in the form of a deep neural network. The autoencoder assumes that LIBS spectra are generated from a distribution conditioned on the elemental compositions in the sample and some nuisance variables. The system is broken into two models: one that predicts elemental composition from the spectra and one that generates spectra from compositions that may or may not be seen in the training set. The synthesized spectra show strong agreement with geochemical conventions for expressing specific compositions. The composition predictions show improved generalizability relative to PLS. Deep neural networks have also been used to transfer knowledge from one dataset to another to solve unlabeled data problems. Given that vast amounts of laboratory LIBS spectra have been obtained in the past few years, it is now feasible to train a deep net to predict elemental composition from lab spectra. Transfer learning (manifold alignment or calibration transfer) [2] is then used to fine-tune the model from terrestrial lab data to Martian field data. Neural networks and generative models provide the flexibility needed for elemental composition prediction and unseen spectra synthesis. [1] Clegg S. et al. (2016) Spectrochim. Acta B, 129, 64-85. [2] Boucher T. et al. (2017) J. Chemom., 31, e2877.
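
    A minimal sketch of the general idea described above, assuming a simple fully connected architecture: an encoder maps a spectrum to a composition estimate plus a latent "nuisance" code, and a decoder regenerates a spectrum from both. Layer sizes, channel and oxide counts, and the loss weighting are illustrative assumptions and do not reflect the actual ChemCam model.

```python
import torch
import torch.nn as nn

N_CHANNELS = 6144   # spectrum length (illustrative; real ChemCam spectra differ)
N_OXIDES = 8        # predicted composition vector (e.g., major-oxide weight %)
N_LATENT = 16       # nuisance latent code (laser coupling, distance, etc.)

class SpectrumEncoder(nn.Module):
    """Maps a LIBS spectrum to a composition estimate and a nuisance latent."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(N_CHANNELS, 512), nn.ReLU(),
                                  nn.Linear(512, 128), nn.ReLU())
        self.comp = nn.Linear(128, N_OXIDES)    # composition head
        self.mu = nn.Linear(128, N_LATENT)      # latent mean
        self.logvar = nn.Linear(128, N_LATENT)  # latent log-variance

    def forward(self, x):
        h = self.body(x)
        return self.comp(h), self.mu(h), self.logvar(h)

class SpectrumDecoder(nn.Module):
    """Generates a spectrum from a composition vector and a nuisance latent."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(N_OXIDES + N_LATENT, 128), nn.ReLU(),
                                 nn.Linear(128, 512), nn.ReLU(),
                                 nn.Linear(512, N_CHANNELS))

    def forward(self, comp, z):
        return self.net(torch.cat([comp, z], dim=-1))

def vae_step(enc, dec, spectra, labels=None, beta=1e-3):
    """One semi-supervised step: reconstruction + KL term, plus a supervised
    composition loss only when a labeled (calibration) batch is available."""
    comp, mu, logvar = enc(spectra)
    z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization
    recon = dec(comp, z)
    loss = nn.functional.mse_loss(recon, spectra)
    loss = loss + beta * (-0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).mean())
    if labels is not None:
        loss = loss + nn.functional.mse_loss(comp, labels)
    return loss

# Toy usage on random data
enc, dec = SpectrumEncoder(), SpectrumDecoder()
dummy = torch.randn(4, N_CHANNELS)
print(vae_step(enc, dec, dummy).item())
```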

  9. Starch: chemistry, microstructure, processing and enzymatic degradation

    USDA-ARS?s Scientific Manuscript database

    Starch is recognized as one of the most abundant and important commodities, containing value-added attributes for a vast number of industrial applications. Its chemistry, structure, properties and susceptibility to various chemical, physical and enzymatic modifications offer a high technological value ...

  10. Comparative effectiveness research and big data: balancing potential with legal and ethical considerations.

    PubMed

    Gray, Elizabeth Alexandra; Thorpe, Jane Hyatt

    2015-01-01

    Big data holds big potential for comparative effectiveness research. The ability to quickly synthesize and use vast amounts of health data to compare medical interventions across settings of care, patient populations, payers and time will greatly inform efforts to improve quality, reduce costs and deliver more patient-centered care. However, the use of big data raises significant legal and ethical issues that may present barriers or limitations to the full potential of big data. This paper addresses the scope of some of these legal and ethical issues and how they may be managed effectively to fully realize the potential of big data.

  11. Managing Content in a Matter of Minutes

    NASA Technical Reports Server (NTRS)

    2004-01-01

    NASA software created to help scientists expeditiously search and organize their research documents is now aiding compliance personnel, law enforcement investigators, and the general public in their efforts to search, store, manage, and retrieve documents more efficiently. Developed at Ames Research Center, NETMARK software was designed to manipulate vast amounts of unstructured and semi-structured NASA documents. NETMARK is both a relational and object-oriented technology built on an Oracle enterprise-wide database. To ensure easy user access, Ames constructed NETMARK as a Web-enabled platform utilizing the latest in Internet technology. One of the significant benefits of the program was its ability to store and manage mission-critical data.

  12. The what, where, how and why of gene ontology—a primer for bioinformaticians

    PubMed Central

    du Plessis, Louis; Škunca, Nives

    2011-01-01

    With high-throughput technologies providing vast amounts of data, it has become more important to provide systematic, quality annotations. The Gene Ontology (GO) project is the largest resource for cataloguing gene function. Nonetheless, its use is not yet ubiquitous and is still fraught with pitfalls. In this review, we provide a short primer to the GO for bioinformaticians. We summarize important aspects of the structure of the ontology, describe sources and types of functional annotations, survey measures of GO annotation similarity, review typical uses of GO and discuss other important considerations pertaining to the use of GO in bioinformatics applications. PMID:21330331

  13. PubChem promiscuity: a web resource for gathering compound promiscuity data from PubChem.

    PubMed

    Canny, Stephanie A; Cruz, Yasel; Southern, Mark R; Griffin, Patrick R

    2012-01-01

    Promiscuity counts allow for a better understanding of a compound's assay activity profile and drug potential. Although PubChem contains a vast amount of compound and assay data, it currently does not have a convenient or efficient method to obtain in-depth promiscuity counts for compounds. PubChem promiscuity fills this gap. It is a Java servlet that uses NCBI Entrez (eUtils) web services to interact with PubChem and provide promiscuity counts in a variety of categories along with compound descriptors, including PAINS-based functional group detection. Availability: http://chemutils.florida.scripps.edu/pcpromiscuity Contact: southern@scripps.edu

  14. PubChem promiscuity: a web resource for gathering compound promiscuity data from PubChem

    PubMed Central

    Canny, Stephanie A.; Cruz, Yasel; Southern, Mark R.; Griffin, Patrick R.

    2012-01-01

    Summary: Promiscuity counts allow for a better understanding of a compound's assay activity profile and drug potential. Although PubChem contains a vast amount of compound and assay data, it currently does not have a convenient or efficient method to obtain in-depth promiscuity counts for compounds. PubChem promiscuity fills this gap. It is a Java servlet that uses NCBI Entrez (eUtils) web services to interact with PubChem and provide promiscuity counts in a variety of categories along with compound descriptors, including PAINS-based functional group detection. Availability: http://chemutils.florida.scripps.edu/pcpromiscuity Contact: southern@scripps.edu PMID:22084255
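
    PubChem Promiscuity itself is a Java servlet; purely as an illustration of the kind of Entrez eUtils (eLink) query such a tool builds on, the sketch below counts assay links for a single compound. The linkname values are assumptions that should be checked against the current eLink documentation.

```python
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/elink.fcgi"

def assay_counts(cid):
    """Rough promiscuity-style counts for one PubChem CID via Entrez eLink.

    Note: the linkname values ('pccompound_pcassay', 'pccompound_pcassay_active')
    are assumptions; verify them against the names PubChem currently exposes."""
    counts = {}
    for label, linkname in [("tested", "pccompound_pcassay"),
                            ("active", "pccompound_pcassay_active")]:
        params = {"dbfrom": "pccompound", "db": "pcassay",
                  "id": cid, "linkname": linkname, "retmode": "json"}
        reply = requests.get(EUTILS, params=params, timeout=30).json()
        linksetdbs = reply.get("linksets", [{}])[0].get("linksetdbs", [])
        counts[label] = len(linksetdbs[0]["links"]) if linksetdbs else 0
    return counts

# Example: aspirin (CID 2244)
print(assay_counts(2244))
```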

  15. The Human Genome Project: applications in the diagnosis and treatment of neurologic disease.

    PubMed

    Evans, G A

    1998-10-01

    The Human Genome Project (HGP), an international program to decode the entire DNA sequence of the human genome in 15 years, represents the largest biological experiment ever conducted. This set of information will contain the blueprint for the construction and operation of a human being. While the primary driving force behind the genome project is the potential to vastly expand the amount of genetic information available for biomedical research, the ramifications for other fields of study in biological research, the biotechnology and pharmaceutical industry, our understanding of evolution, effects on agriculture, and implications for bioethics are likely to be profound.

  16. Prediction of operating parameters range for ammonia removal unit in coke making by-products

    NASA Astrophysics Data System (ADS)

    Tiwari, Hari Prakash; Kumar, Rajesh; Bhattacharjee, Arunabh; Lingam, Ravi Kumar; Roy, Abhijit; Tiwary, Shambhu

    2018-02-01

    Coke oven gas treatment plants are well equipped with distributed control systems (DCS) and therefore record vast amounts of operational data efficiently. Analyzing the stored information manually from historians is practically impossible. In this study, a data mining technique was examined for lowering the ammonia concentration in clean coke oven gas. The results confirm that the concentration of ammonia in clean coke oven gas depends on the average PCDC temperature, gas scrubber temperature, stripped liquor flow, stripped liquor concentration and stripped liquor temperature. The optimum operating ranges of these parameters for lowering the concentration of ammonia, obtained using the data mining technique, are described in this paper.
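
    The abstract does not name the specific data mining technique used; purely as an illustration of how operating ranges can be read off DCS-style records, the sketch below fits a shallow regression tree on synthetic stand-in data (column names and values are hypothetical) and prints its splits as candidate parameter ranges.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor, export_text

# Synthetic stand-in for a DCS export; real data and column names would come
# from the plant historian (names here are hypothetical).
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "pcdc_temp": rng.uniform(75, 95, n),
    "scrubber_temp": rng.uniform(25, 45, n),
    "stripped_liquor_flow": rng.uniform(10, 30, n),
})
# Toy response: ammonia slip rises when temperatures run high.
df["nh3_in_clean_gas"] = (0.05 * df["pcdc_temp"] + 0.03 * df["scrubber_temp"]
                          - 0.02 * df["stripped_liquor_flow"]
                          + rng.normal(0, 0.2, n))

features = ["pcdc_temp", "scrubber_temp", "stripped_liquor_flow"]
tree = DecisionTreeRegressor(max_depth=3, min_samples_leaf=100, random_state=0)
tree.fit(df[features], df["nh3_in_clean_gas"])

# Splits such as "pcdc_temp <= 82.5" can be read as candidate operating
# ranges associated with lower predicted ammonia concentration.
print(export_text(tree, feature_names=features))
```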

  17. Data Mining for Web-Based Support Systems: A Case Study in e-Custom Systems

    NASA Astrophysics Data System (ADS)

    Razmerita, Liana; Kirchner, Kathrin

    This chapter provides an example of a Web-based support system (WSS) used to streamline trade procedures, prevent potential security threats, and reduce tax-related fraud in cross-border trade. The system is built on a service-oriented architecture that includes smart seals and Web services. We discuss the implications and suggest further enhancements to demonstrate how such systems can move toward a Web-based decision support system with the aid of data mining methods. We provide a concrete example of how data mining can help to analyze the vast amount of data collected while monitoring container movements along the supply chain.

  18. Interstellar Grains: 50 Years on

    NASA Astrophysics Data System (ADS)

    Wickramasinghe, N. C.

    Our understanding of the nature of interstellar grains has evolved considerably over the past half century with the present author and Fred Hoyle being intimately involved at several key stages of progress. The currently fashionable graphite-silicate-organic grain model has all its essential aspects unequivocally traceable to original peer-reviewed publications by the author and/or Fred Hoyle. The prevailing reluctance to accept these clear-cut priorities may be linked to our further work that argued for interstellar grains and organics to have a biological provenance -- a position perceived as heretical. The biological model, however, continues to provide a powerful unifying hypothesis for a vast amount of otherwise disconnected and disparate astronomical data.

  19. Human and ape: the legend, the history and the DNA

    PubMed Central

    Diamandopoulos, AA; Goudas, CP

    2007-01-01

    A vast number of papers is published every year about species evolution, the most interesting being those recently published in the journal "Nature" concerning the human-ape relationship. The results and the new theories generated from this research are sometimes astonishing, raising not only biological but also social, religious and cultural questions. One of the new questions concerns the role of species interbreeding as a means of evolution. On the subject of interbreeding between human and ape, we found some interesting historical and mythical information that lends support to this theory of interbreeding from a historical and cultural point of view. PMID:19582186

  20. The deep ocean under climate change

    NASA Astrophysics Data System (ADS)

    Levin, Lisa A.; Le Bris, Nadine

    2015-11-01

    The deep ocean absorbs vast amounts of heat and carbon dioxide, providing a critical buffer to climate change but exposing vulnerable ecosystems to combined stresses of warming, ocean acidification, deoxygenation, and altered food inputs. Resulting changes may threaten biodiversity and compromise key ocean services that maintain a healthy planet and human livelihoods. There exist large gaps in understanding of the physical and ecological feedbacks that will occur. Explicit recognition of deep-ocean climate mitigation and inclusion in adaptation planning by the United Nations Framework Convention on Climate Change (UNFCCC) could help to expand deep-ocean research and observation and to protect the integrity and functions of deep-ocean ecosystems.
