Sample records for large scale access

  1. Community-aware charging station network design for electrified vehicles in urban areas: reducing congestion, emissions, improving accessibility, and promoting walking, bicycling, and use of public transportation.

    DOT National Transportation Integrated Search

    2016-08-31

    A major challenge for achieving large-scale adoption of EVs is an accessible charging infrastructure for communities. The societal benefits of large-scale adoption of EVs cannot be realized without adequate deployment of publicly accessible charging stati...

  2. A Novel Architecture of Large-scale Communication in IOT

    NASA Astrophysics Data System (ADS)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have provided a detailed view of the large-scale communication architecture of the IoT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  3. Contractual Duration and Investment Incentives: Evidence from Large Scale Production Units in China

    NASA Astrophysics Data System (ADS)

    Li, Fang; Feng, Shuyi; D'Haese, Marijke; Lu, Hualiang; Qu, Futian

    2017-04-01

    Large Scale Production Units have become important forces in the supply of agricultural commodities and in agricultural modernization in China. Contractual duration in farmland transfers to Large Scale Production Units can be considered to reflect land tenure security. Theoretically, long-term tenancy contracts can encourage Large Scale Production Units to increase long-term investments by ensuring land-rights stability or favoring access to credit. Using a unique Large Scale Production Unit- and plot-level field survey dataset from Jiangsu and Jiangxi Provinces, this study examines the effect of contractual duration on Large Scale Production Units' soil conservation behaviours. An instrumental variable (IV) method is applied to take into account the endogeneity of contractual duration and unobserved household heterogeneity. Results indicate that farmland transfer contract duration significantly and positively affects land-improving investments. Policies aimed at improving transaction platforms and intermediary organizations in farmland transfer, helping Large Scale Production Units access farmland under long-term tenancy contracts, may therefore play an important role in improving soil quality and land productivity.
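
    As a worked illustration of the instrumental variable approach mentioned above, the following is a minimal two-stage least squares (2SLS) sketch in numpy on synthetic data. The instrument, variable names and coefficients are invented for illustration and are not taken from the study.

    ```python
    import numpy as np

    def two_stage_least_squares(y, X, Z):
        """2SLS: project the regressors onto the instrument space (first
        stage), then regress the outcome on the fitted values (second stage)."""
        Pz = Z @ np.linalg.pinv(Z.T @ Z) @ Z.T   # projection onto span(Z)
        X_hat = Pz @ X
        return np.linalg.pinv(X_hat.T @ X) @ (X_hat.T @ y)

    rng = np.random.default_rng(0)
    n = 500
    z = rng.normal(size=n)                        # hypothetical instrument
    duration = 0.8 * z + rng.normal(size=n)       # endogenous contract duration
    invest = 0.5 * duration + rng.normal(size=n)  # land-improving investment
    X = np.column_stack([np.ones(n), duration])
    Z = np.column_stack([np.ones(n), z])
    print(two_stage_least_squares(invest, X, Z))  # approx [0.0, 0.5]
    ```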

  4. Large-Scale 1:1 Computing Initiatives: An Open Access Database

    ERIC Educational Resources Information Center

    Richardson, Jayson W.; McLeod, Scott; Flora, Kevin; Sauers, Nick J.; Kannan, Sathiamoorthy; Sincar, Mehmet

    2013-01-01

    This article details the spread and scope of large-scale 1:1 computing initiatives around the world. What follows is a review of the existing literature around 1:1 programs followed by a description of the large-scale 1:1 database. Main findings include: 1) the XO and the Classmate PC dominate large-scale 1:1 initiatives; 2) if professional…

  5. Methods and apparatus of analyzing electrical power grid data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, Ryan P.; Critchlow, Terence J.; Gibson, Tara D.

    Apparatus and methods of processing large-scale data regarding an electrical power grid are described. According to one aspect, a method of processing large-scale data regarding an electrical power grid includes accessing a large-scale data set comprising information regarding an electrical power grid; processing data of the large-scale data set to identify a filter which is configured to remove erroneous data from the large-scale data set; using the filter, removing erroneous data from the large-scale data set; and after the removing, processing data of the large-scale data set to identify an event detector which is configured to identify events of interest in the large-scale data set.
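
    The record describes a two-stage flow: first identify a filter that removes erroneous data from the set, then run an event detector over the cleaned data. Below is a minimal Python sketch of such a pipeline on a synthetic grid-frequency trace; the band limits, jump threshold and data layout are invented for illustration, not taken from the patent.

    ```python
    import numpy as np

    def build_filter(lo=59.5, hi=60.5):
        """Stage 1: a filter flagging frequency samples outside a plausible band."""
        return lambda x: (x >= lo) & (x <= hi)

    def detect_events(freq, jump=0.05):
        """Stage 2: flag large sample-to-sample frequency jumps as events."""
        return np.where(np.abs(np.diff(freq)) > jump)[0]

    freq = 60.0 + 0.01 * np.random.randn(10_000)  # simulated frequency trace
    freq[1234] = 61.7                             # one erroneous spike
    freq[5000:] -= 0.08                           # one genuine step event

    keep = build_filter()(freq)                   # remove erroneous data
    events = detect_events(freq[keep])            # find events of interest
    print(int((~keep).sum()), "sample(s) filtered;", len(events), "event(s) found")
    ```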

  6. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provides seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.

  7. Macroscopic characterisations of Web accessibility

    NASA Astrophysics Data System (ADS)

    Lopes, Rui; Carriço, Luis

    2010-12-01

    The Web Science framework poses fundamental questions on the analysis of the Web, by focusing on how microscopic properties (e.g. at the level of a Web page or Web site) emerge into macroscopic properties and phenomena. One research topic in the analysis of the Web is Web accessibility evaluation, which centres on understanding how accessible a Web page is for people with disabilities. However, when framing Web accessibility evaluation within Web Science, we have found that existing research stays at the microscopic level. This article presents an experimental study on framing Web accessibility evaluation into Web Science's goals. This study resulted in novel accessibility properties of the Web not found at microscopic levels, as well as of Web accessibility evaluation processes themselves. We observed at large scale some of the empirical knowledge on how accessibility is perceived by designers and developers, such as the disparity of interpretations of accessibility evaluation tools' warnings. We also found a direct relation between accessibility quality and Web page complexity. We provide a set of guidelines for Web page design and for education on Web accessibility, as well as observations on the computational limits of large-scale Web accessibility evaluations.

  8. A Low Collision and High Throughput Data Collection Mechanism for Large-Scale Super Dense Wireless Sensor Networks.

    PubMed

    Lei, Chunyang; Bie, Hongxia; Fang, Gengfa; Gaura, Elena; Brusey, James; Zhang, Xuekun; Dutkiewicz, Eryk

    2016-07-18

    Super-dense wireless sensor networks (WSNs) have become popular with the development of the Internet of Things (IoT), Machine-to-Machine (M2M) communications and Vehicle-to-Vehicle (V2V) networks. While highly dense wireless networks provide efficient and sustainable solutions to collect precise environmental information, a new channel access scheme is needed to solve the channel collision problem caused by the large number of competing nodes accessing the channel simultaneously. In this paper, we propose a space-time random access method based on a directional data transmission strategy, by which collisions in the wireless channel are significantly decreased and channel utilization efficiency is greatly enhanced. Simulation results show that our proposed method can decrease the packet loss rate to less than 2% in large-scale WSNs and, in comparison with other channel access schemes for WSNs, can double the average network throughput.
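
    A toy illustration of why directional (sectorized) transmission reduces collisions in slotted random access: nodes that pick the same slot collide only if they also transmit into the same sector. This is a hedged sketch of the general idea, not the authors' protocol; the sector, node and slot counts are made up.

    ```python
    import random

    def packet_loss(n_nodes, n_slots, n_sectors=1, trials=200):
        """Fraction of transmissions lost to collisions in one frame."""
        lost = 0
        for _ in range(trials):
            # Each node independently picks a (slot, sector) pair.
            picks = [(random.randrange(n_slots), random.randrange(n_sectors))
                     for _ in range(n_nodes)]
            lost += sum(picks.count(p) > 1 for p in picks)
        return lost / (trials * n_nodes)

    # Omnidirectional vs. 8-sector directional access: 200 nodes, 256 slots.
    print(packet_loss(200, 256, n_sectors=1))  # plain slotted access
    print(packet_loss(200, 256, n_sectors=8))  # directional: far fewer collisions
    ```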

  9. Random access in large-scale DNA data storage.

    PubMed

    Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin

    2018-03-01

    Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data on a large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data) in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.
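
    The random access scheme rests on tagging every oligonucleotide of a file with a file-specific primer pair, so PCR can amplify just that file's strands before sequencing. A minimal sketch of the bookkeeping follows; the primer sequences, file names and payloads are invented for illustration.

    ```python
    # Hypothetical file-to-primer-pair index; "amplification" is simulated by
    # selecting only the oligos tagged with the requested file's primers.
    primer_index = {
        "photo.jpg":  ("ACGTACGTACGTACGTACGT", "TGCATGCATGCATGCATGCA"),
        "report.pdf": ("GGAACCTTGGAACCTTGGAA", "CCTTGGAACCTTGGAACCTT"),
    }

    pool = [
        {"fwd": "ACGTACGTACGTACGTACGT", "rev": "TGCATGCATGCATGCATGCA",
         "payload": "0100...", "index": 0},
        {"fwd": "GGAACCTTGGAACCTTGGAA", "rev": "CCTTGGAACCTTGGAACCTT",
         "payload": "1110...", "index": 0},
    ]

    def random_access(filename, pool, primer_index):
        """Recover one file's payloads, in order, without touching the rest."""
        fwd, rev = primer_index[filename]
        hits = [o for o in pool if o["fwd"] == fwd and o["rev"] == rev]
        return [o["payload"] for o in sorted(hits, key=lambda o: o["index"])]

    print(random_access("photo.jpg", pool, primer_index))  # ['0100...']
    ```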

  10. Data management strategies for multinational large-scale systems biology projects.

    PubMed

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects.

  11. Data management strategies for multinational large-scale systems biology projects

    PubMed Central

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R.A.

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don’t Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs), which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects. PMID:23047157

  12. BioPlex Display: An Interactive Suite for Large-Scale AP-MS Protein-Protein Interaction Data.

    PubMed

    Schweppe, Devin K; Huttlin, Edward L; Harper, J Wade; Gygi, Steven P

    2018-01-05

    The development of large-scale data sets requires new means to display and disseminate research studies to large audiences. Knowledge of protein-protein interaction (PPI) networks has become a principal interest of many groups within the field of proteomics. At the confluence of technologies such as cross-linking mass spectrometry, yeast two-hybrid, protein cofractionation, and affinity purification mass spectrometry (AP-MS), detection of PPIs can uncover novel biological inferences at high throughput. Thus, new platforms to provide community access to large data sets are necessary. To this end, we have developed a web application that enables exploration and dissemination of the growing BioPlex interaction network. BioPlex is a large-scale interactome data set based on AP-MS of baits from the human ORFeome. The latest BioPlex data set release (BioPlex 2.0) contains 56,553 interactions from 5,891 AP-MS experiments. To improve community access to this vast compendium of interactions, we developed BioPlex Display, which integrates individual protein querying, access to empirical data, and on-the-fly annotation of networks within an easy-to-use and mobile web application. BioPlex Display enables rapid acquisition of data from BioPlex and development of hypotheses based on protein interactions.
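
    As a sketch of the kind of query such an interaction resource supports, the snippet below builds a tiny protein-protein interaction graph and pulls the first-degree neighborhood of one bait. It uses networkx with invented edges; it is not the BioPlex Display API.

    ```python
    import networkx as nx

    # Toy (bait, prey) edge list; gene symbols are for illustration only.
    edges = [("TP53", "MDM2"), ("TP53", "EP300"), ("MDM2", "USP7"),
             ("EP300", "CREBBP")]

    g = nx.Graph()
    g.add_edges_from(edges)

    def neighborhood(graph, protein):
        """Direct interactors of one protein, as a per-protein query returns."""
        return sorted(graph.neighbors(protein))

    print(neighborhood(g, "TP53"))                # ['EP300', 'MDM2']
    print(list(nx.ego_graph(g, "TP53").edges()))  # subnetwork around the bait
    ```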

  13. Time Discounting and Credit Market Access in a Large-Scale Cash Transfer Programme.

    PubMed

    Handa, Sudhanshu; Martorano, Bruno; Halpern, Carolyn; Pettifor, Audrey; Thirumurthy, Harsha

    2016-06-01

    Time discounting is thought to influence decision-making in almost every sphere of life, including personal finances, diet, exercise and sexual behavior. In this article we provide evidence on whether a national poverty alleviation program in Kenya can affect inter-temporal decisions. We administered a preferences module as part of a large-scale impact evaluation of the Kenyan Government's Cash Transfer for Orphans and Vulnerable Children. Four years into the program, we find that individuals in the treatment group are only marginally more likely to wait for future money, due in part to the erosion of the value of the transfer by inflation. However, among the poorest households, for whom the value of the transfer is still relatively large, we find significant program effects on the propensity to wait. We also find strong program effects among those who have access to credit markets, though the program itself does not improve access to credit.

  14. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  15. A federated capability-based access control mechanism for internet of things (IoTs)

    NASA Astrophysics Data System (ADS)

    Xu, Ronghua; Chen, Yu; Blasch, Erik; Chen, Genshe

    2018-05-01

    The prevalence of the Internet of Things (IoT) allows heterogeneous embedded smart devices to collaboratively provide intelligent services with or without human intervention. While leveraging large-scale IoT-based applications like Smart Grid and Smart Cities, the IoT also incurs more concerns about privacy and security. Among the top security challenges the IoT faces, access authorization is critical in resource and information protection. Traditional access control approaches, like Access Control Lists (ACL), Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC), are not able to provide scalable, manageable and efficient mechanisms that meet the requirements of IoT systems. The extraordinarily large number of nodes, their heterogeneity and their dynamicity necessitate more fine-grained, lightweight mechanisms for IoT devices. In this paper, a federated capability-based access control (FedCAC) framework is proposed to enable effective access control of devices, services and information in large-scale IoT systems. The federated capability delegation mechanism, based on a propagation tree, is illustrated for access permission propagation. An identity-based capability token management strategy is presented, which involves registration, propagation and revocation of access authorization. By delegating centralized authorization decision-making policy to a local domain delegator, the access authorization process is conducted locally on the service provider, integrating situational awareness (SAW) and customized contextual conditions. Implemented and tested on both resource-constrained devices, like smart sensors and Raspberry Pi, and non-resource-constrained devices, like laptops and smartphones, our experimental results demonstrate the feasibility of the proposed FedCAC approach to offer a scalable, lightweight and fine-grained access control solution for IoT systems connected to a system network.
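
    A minimal sketch of an identity-based capability token check of the kind the abstract outlines: a domain delegator signs a token listing a device's permitted actions, and the service provider validates it locally, with expiry standing in for revocation. The field names, HMAC scheme and time-window check are assumptions for illustration, not the FedCAC specification.

    ```python
    import hashlib, hmac, json, time

    SECRET = b"domain-delegator-key"  # shared with the local domain delegator

    def issue_token(device_id, actions, ttl=300):
        """Delegator side: sign a capability token for one device."""
        body = {"sub": device_id, "actions": actions, "exp": time.time() + ttl}
        raw = json.dumps(body, sort_keys=True).encode()
        sig = hmac.new(SECRET, raw, hashlib.sha256).hexdigest()
        return {"body": body, "sig": sig}

    def authorize(token, action):
        """Provider side: validate locally, no central authority in the loop."""
        raw = json.dumps(token["body"], sort_keys=True).encode()
        expected = hmac.new(SECRET, raw, hashlib.sha256).hexdigest()
        if not hmac.compare_digest(expected, token["sig"]):
            return False                      # forged or tampered token
        if time.time() > token["body"]["exp"]:
            return False                      # expired: revocation by timeout
        return action in token["body"]["actions"]

    tok = issue_token("sensor-42", ["read:temperature"])
    print(authorize(tok, "read:temperature"))  # True
    print(authorize(tok, "actuate:valve"))     # False
    ```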

  16. Differences in Access to Care among Students Using School-Based Health Centers

    ERIC Educational Resources Information Center

    Parasuraman, Sarika Rane; Shi, Leiyu

    2015-01-01

    Health care reform has changed the landscape for the nation's health safety net, and school-based health centers (SBHCs) remain an important part of this system. However, few large-scale studies have been conducted to assess their impact on access to care. This study investigated differences in access among a nationally representative sample of…

  17. Time Discounting and Credit Market Access in a Large-Scale Cash Transfer Programme

    PubMed Central

    Handa, Sudhanshu; Martorano, Bruno; Halpern, Carolyn; Pettifor, Audrey; Thirumurthy, Harsha

    2017-01-01

    Time discounting is thought to influence decision-making in almost every sphere of life, including personal finances, diet, exercise and sexual behavior. In this article we provide evidence on whether a national poverty alleviation program in Kenya can affect inter-temporal decisions. We administered a preferences module as part of a large-scale impact evaluation of the Kenyan Government’s Cash Transfer for Orphans and Vulnerable Children. Four years into the program, we find that individuals in the treatment group are only marginally more likely to wait for future money, due in part to the erosion of the value of the transfer by inflation. However, among the poorest households, for whom the value of the transfer is still relatively large, we find significant program effects on the propensity to wait. We also find strong program effects among those who have access to credit markets, though the program itself does not improve access to credit. PMID:28260842

  18. OMERO and Bio-Formats 5: flexible access to large bioimaging datasets at scale

    NASA Astrophysics Data System (ADS)

    Moore, Josh; Linkert, Melissa; Blackburn, Colin; Carroll, Mark; Ferguson, Richard K.; Flynn, Helen; Gillen, Kenneth; Leigh, Roger; Li, Simon; Lindner, Dominik; Moore, William J.; Patterson, Andrew J.; Pindelski, Blazej; Ramalingam, Balaji; Rozbicki, Emil; Tarkowska, Aleksandra; Walczysko, Petr; Allan, Chris; Burel, Jean-Marie; Swedlow, Jason

    2015-03-01

    The Open Microscopy Environment (OME) has built and released Bio-Formats, a Java-based proprietary-file-format conversion tool, and OMERO, an enterprise data management platform, under open source licenses. In this report, we describe new versions of Bio-Formats and OMERO that are specifically designed to support large, multi-gigabyte or terabyte scale datasets that are routinely collected across most domains of biological and biomedical research. Bio-Formats reads image data directly from native proprietary formats, bypassing the need for conversion into a standard format. It implements the concept of a file set, a container that defines the contents of multi-dimensional data comprised of many files. OMERO uses Bio-Formats to read files natively, and provides a flexible access mechanism that supports several different storage and access strategies. These new capabilities of OMERO and Bio-Formats make them especially useful in imaging applications like digital pathology, high content screening and light sheet microscopy that routinely create large datasets that must be managed and analyzed.
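
    The 'file set' idea described above is a container grouping the many on-disk files that make up one multi-dimensional image. The data structure below mirrors that concept for illustration only; it is not the Bio-Formats Java API, and the file names are invented.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FileSet:
        """Groups the files of one multi-dimensional acquisition."""
        name: str
        master: str                       # entry-point file a reader opens first
        members: List[str] = field(default_factory=list)

        def contains(self, path: str) -> bool:
            return path == self.master or path in self.members

    # One light-sheet acquisition spread over per-timepoint files.
    fs = FileSet(name="embryo-01",
                 master="embryo-01.xml",
                 members=[f"embryo-01_t{t:03d}.tif" for t in range(400)])
    print(len(fs.members), fs.contains("embryo-01_t007.tif"))  # 400 True
    ```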

  19. Planning Alternative Organizational Frameworks For a Large Scale Educational Telecommunications System Served by Fixed/Broadcast Satellites. Memorandum Number 73/3.

    ERIC Educational Resources Information Center

    Walkmeyer, John

    Considerations relating to the design of organizational structures for development and control of large scale educational telecommunications systems using satellites are explored. The first part of the document deals with four issues of system-wide concern. The first is user accessibility to the system, including proximity to entry points, ability…

  20. Facilitating Internet-Scale Code Retrieval

    ERIC Educational Resources Information Center

    Bajracharya, Sushil Krishna

    2010-01-01

    Internet-Scale code retrieval deals with the representation, storage, and access of relevant source code from a large amount of source code available on the Internet. Internet-Scale code retrieval systems support common emerging practices among software developers related to finding and reusing source code. In this dissertation we focus on some…

  1. TheCellMap.org: A Web-Accessible Database for Visualizing and Mining the Global Yeast Genetic Interaction Network

    PubMed Central

    Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L.; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2017-01-01

    Providing access to quantitative genomic data is key to ensuring large-scale data validation and promoting new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner. PMID:28325812

  2. TheCellMap.org: A Web-Accessible Database for Visualizing and Mining the Global Yeast Genetic Interaction Network.

    PubMed

    Usaj, Matej; Tan, Yizhao; Wang, Wen; VanderSluis, Benjamin; Zou, Albert; Myers, Chad L; Costanzo, Michael; Andrews, Brenda; Boone, Charles

    2017-05-05

    Providing access to quantitative genomic data is key to ensuring large-scale data validation and promoting new discoveries. TheCellMap.org serves as a central repository for storing and analyzing quantitative genetic interaction data produced by genome-scale Synthetic Genetic Array (SGA) experiments with the budding yeast Saccharomyces cerevisiae. In particular, TheCellMap.org allows users to easily access, visualize, explore, and functionally annotate genetic interactions, or to extract and reorganize subnetworks, using data-driven network layouts in an intuitive and interactive manner.

  3. The need for the use of XACML access control policy in a distributed EHR and some performance considerations.

    PubMed

    Sucurovic, Snezana; Milutinovic, Veljko

    2008-01-01

    Internet-based distributed large-scale information systems implement attribute-based access control (ABAC) rather than role-based access control (RBAC). The reason is that the Internet is identity-less and that ABAC scales better. The eXtensible Access Control Markup Language (XACML) is a standardized language for writing access control policies, access control requests and access control responses in ABAC. XACML can provide decentralized administration and credentials distribution. In the 2002 version of CEN ENV 13606, attributes were attached to EHCR components, and in such a system ABAC and XACML are easy to implement. This paper presents the writing of XACML policies in the case when attributes are in a hierarchical structure. Two possible solutions for writing an XACML policy in that case are presented, and the solution using set functions is shown to be more compact and to provide 10% better performance.
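
    To make the hierarchical-attribute point concrete, here is a small Python sketch (not XACML syntax) contrasting the two styles: matching hierarchy levels one by one, versus a single set-membership test over the subject's whole attribute path, which is the more compact form the paper reports as faster. The attribute names are invented.

    ```python
    # A subject's position in an attribute hierarchy, from root to leaf.
    subject_path = ["hospital", "cardiology", "ward-3", "nurse"]

    # Level-by-level matching: verbose, mirrors one nested rule per level.
    def permit_nested(path):
        return (len(path) >= 2
                and path[0] == "hospital"
                and path[1] == "cardiology")

    # Set-function style: one membership test over the whole path.
    def permit_set(path):
        return {"hospital", "cardiology"}.issubset(path)

    print(permit_nested(subject_path), permit_set(subject_path))  # True True
    ```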

  4. Accessibility Principles for Reading Assessments

    ERIC Educational Resources Information Center

    Thurlow, Martha L.; Laitusis, Cara Cahalan; Dillon, Deborah R.; Cook, Linda L.; Moen, Ross E.; Abedi, Jamal; O'Brien, David G.

    2009-01-01

    Within the context of standards-based educational systems, states are using large scale reading assessments to help ensure that all children have the opportunity to learn essential knowledge and skills. The challenge for developers of accessible reading assessments is to develop assessments that measure only those student characteristics that are…

  5. Test Design Considerations for Students with Significant Cognitive Disabilities

    ERIC Educational Resources Information Center

    Anderson, Daniel; Farley, Dan; Tindal, Gerald

    2015-01-01

    Students with significant cognitive disabilities present an assessment dilemma that centers on access and validity in large-scale testing programs. Typically, access is improved by eliminating construct-irrelevant barriers, while validity is improved, in part, through test standardization. In this article, one state's alternate assessment data…

  6. Design Sketches For Optical Crossbar Switches Intended For Large-Scale Parallel Processing Applications

    NASA Astrophysics Data System (ADS)

    Hartmann, Alfred; Redfield, Steve

    1989-04-01

    This paper discusses the design of large-scale (1000x1000) optical crossbar switching networks for use in parallel processing supercomputers. Alternative design sketches for an optical crossbar switching network are presented using free-space optical transmission with either a beam spreading/masking model or a beam steering model for internodal communications. The performances of alternative multiple-access channel communications protocols (unslotted and slotted ALOHA, and carrier sense multiple access (CSMA)) are compared with the performance of the classic arbitrated-bus crossbar of conventional electronic parallel computing. These comparisons indicate an almost inverse relationship between ease of implementation and speed of operation. Practical issues of optical system design are addressed, and an optically addressed, composite spatial light modulator design is presented for fabrication at arbitrarily large scale. The wide range of switch architecture, communications protocol, optical systems design, device fabrication, and system performance problems presented by these design sketches poses a serious challenge to practical exploitation of highly parallel optical interconnects in advanced computer designs.

  7. Can International Large-Scale Assessments Inform a Global Learning Goal? Insights from the Learning Metrics Task Force

    ERIC Educational Resources Information Center

    Winthrop, Rebecca; Simons, Kate Anderson

    2013-01-01

    In recent years, the global community has developed a range of initiatives to inform the post-2015 global development agenda. In the education community, International Large-Scale Assessments (ILSAs) have an important role to play in advancing a global shift in focus to access plus learning. However, there are a number of other assessment tools…

  8. Helping Students Interpret Large-Scale Data Tables

    ERIC Educational Resources Information Center

    Prodromou, Theodosia

    2016-01-01

    New technologies have completely altered the ways that citizens can access data. Indeed, emerging online data sources give citizens access to an enormous amount of numerical information that provides new sorts of evidence used to influence public opinion. In this new environment, two trends have had a significant impact on our increasingly…

  9. Architectural and Mobility Management Designs in Internet-Based Infrastructure Wireless Mesh Networks

    ERIC Educational Resources Information Center

    Zhao, Weiyi

    2011-01-01

    Wireless mesh networks (WMNs) have recently emerged to be a cost-effective solution to support large-scale wireless Internet access. They have numerous applications, such as broadband Internet access, building automation, and intelligent transportation systems. One research challenge for Internet-based WMNs is to design efficient mobility…

  10. Test Review: ACCESS for ELLs[R]

    ERIC Educational Resources Information Center

    Fox, Janna; Fairbairn, Shelley

    2011-01-01

    This article reviews Assessing Comprehension and Communication in English State-to-State for English Language Learners ("ACCESS for ELLs"[R]), which is a large-scale, high-stakes, standards-based, and criterion-referenced English language proficiency test administered in the USA annually to more than 840,000 English Language Learners (ELLs), in…

  11. HIV-1 genetic diversity and primary drug resistance mutations before large-scale access to antiretroviral therapy, Republic of Congo.

    PubMed

    Niama, Fabien Roch; Vidal, Nicole; Diop-Ndiaye, Halimatou; Nguimbi, Etienne; Ahombo, Gabriel; Diakabana, Philippe; Bayonne Kombo, Édith Sophie; Mayengue, Pembe Issamou; Kobawila, Simon-Charles; Parra, Henri Joseph; Toure-Kane, Coumba

    2017-07-05

    In this work, we investigated the genetic diversity of HIV-1 and the presence of mutations conferring antiretroviral drug resistance in 50 drug-naïve infected persons in the Republic of Congo (RoC). Samples were obtained in 2002 and 2004, before large-scale access to HAART. To assess HIV-1 genetic recombination, sequencing of the pol gene encoding the protease and partial reverse transcriptase was performed and analyzed with updated references, including newly characterized CRFs. The assessment of drug resistance was conducted according to the WHO protocol. Among the 50 samples analyzed for the pol gene, 50% were classified as intersubtype recombinants, carrying complex structures inside the pol fragment. Five samples could not be classified (noted U). The most prevalent subtypes were G, with 10 isolates, and D, with 11 isolates. One isolate each of A, J, H, CRF05, CRF18 and CRF37 was also found. Two samples (4%) harboring the mutations M230L and Y181C, associated with the TAMs M41L and T215Y, respectively, were found. This first study in the RoC based on the WHO classification shows that the threshold of transmitted drug resistance before large-scale access to antiretroviral therapy is 4%.

  12. DNA sequence variation of wild barley Hordeum spontaneum (L.) across environmental gradients in Israel

    PubMed Central

    Bedada, G; Westerbergh, A; Nevo, E; Korol, A; Schmid, K J

    2014-01-01

    Wild barley Hordeum spontaneum (L.) shows a wide geographic distribution and ecological diversity. A key question concerns the spatial scale at which genetic differentiation occurs and to what extent it is driven by natural selection. The Levant region exhibits a strong ecological gradient along the North–South axis, with numerous small canyons in an East–West direction and with small-scale environmental gradients on the opposing North- and South-facing slopes. We sequenced 34 short genomic regions in 54 accessions of wild barley collected throughout Israel and from the opposing slopes of two canyons. The nucleotide diversity of the total sample is 0.0042, which is about two-thirds of that of a sample from the whole species range (0.0060). Thirty accessions collected at 'Evolution Canyon' (EC) at Nahal Oren, close to Haifa, have a nucleotide diversity of 0.0036, and therefore harbor a large proportion of the genetic diversity. There is a high level of genetic clustering throughout Israel and within EC, which roughly differentiates the slopes. Accessions from the hot and dry South-facing slope have significantly reduced genetic diversity and are genetically more distinct from accessions from the North-facing slope, which are more similar to accessions from other regions in Northern Israel. Statistical population models indicate that wild barley within the EC consists of three separate genetic clusters with substantial gene flow. The data indicate a high level of population structure at large and small geographic scales that shows isolation-by-distance, and is also consistent with ongoing natural selection contributing to genetic differentiation at a small geographic scale. PMID:24619177

  13. Data Integration: Charting a Path Forward to 2035

    DTIC Science & Technology

    2011-02-14

  14. Reduced representation approaches to interrogate genome diversity in large repetitive plant genomes.

    PubMed

    Hirsch, Cory D; Evans, Joseph; Buell, C Robin; Hirsch, Candice N

    2014-07-01

    Technology and software improvements in the last decade now provide methodologies to access the genome sequence of not only a single accession, but also multiple accessions of plant species. This provides a means to interrogate species diversity at the genome level. Ample diversity among accessions in a collection of species can be found, including single-nucleotide polymorphisms, insertions and deletions, copy number variation and presence/absence variation. For species with small, non-repetitive-rich genomes, re-sequencing of query accessions is robust, highly informative, and economically feasible. However, for species with moderate to large sized, repetitive-rich genomes, technical and economic barriers prevent en masse genome re-sequencing of accessions. Multiple approaches to access a focused subset of loci in species with larger genomes have been developed, including reduced representation sequencing, exome capture and transcriptome sequencing. Collectively, these approaches have enabled interrogation of diversity on a genome scale for large plant genomes, including crop species important to worldwide food security.

  15. Secure access control and large scale robust representation for online multimedia event detection.

    PubMed

    Liu, Changyu; Lu, Bin; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large-scale robust representation issue when integrating traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches.
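
    A hedged sketch of the spatial bag-of-words tiling described above: per-pixel detector responses are pooled within each tile of a grid, and the per-tile vectors are concatenated so the descriptor keeps coarse layout information. The grid size, detector count and max-pooling rule are illustrative choices, not the paper's exact configuration.

    ```python
    import numpy as np

    def spatial_bow(response_map, n_tiles=2):
        """response_map: (H, W, D) responses of D object detectors.
        Max-pool each detector inside each tile of an n_tiles x n_tiles grid,
        then concatenate the tile vectors into one event descriptor."""
        h, w, d = response_map.shape
        hs, ws = h // n_tiles, w // n_tiles
        tiles = []
        for i in range(n_tiles):
            for j in range(n_tiles):
                tile = response_map[i*hs:(i+1)*hs, j*ws:(j+1)*ws, :]
                tiles.append(tile.max(axis=(0, 1)))  # (D,) pooled responses
        return np.concatenate(tiles)                 # (n_tiles**2 * D,)

    frame = np.random.rand(24, 32, 1000)  # toy responses of 1000 detectors
    print(spatial_bow(frame).shape)       # (4000,)
    ```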

  16. Epigenome data release: a participant-centered approach to privacy protection.

    PubMed

    Dyke, Stephanie O M; Cheung, Warren A; Joly, Yann; Ammerpohl, Ole; Lutsik, Pavlo; Rothstein, Mark A; Caron, Maxime; Busche, Stephan; Bourque, Guillaume; Rönnblom, Lars; Flicek, Paul; Beck, Stephan; Hirst, Martin; Stunnenberg, Henk; Siebert, Reiner; Walter, Jörn; Pastinen, Tomi

    2015-07-17

    Large-scale epigenome mapping by the NIH Roadmap Epigenomics Project, the ENCODE Consortium and the International Human Epigenome Consortium (IHEC) produces genome-wide DNA methylation data at one base-pair resolution. We examine how such data can be made open-access while balancing appropriate interpretation and genomic privacy. We propose guidelines for data release that both reduce ambiguity in the interpretation of open-access data and limit immediate access to genetic variation data that are made available through controlled access.

  17. Quality Assurance in Asian Open and Distance Learning: Policies and Implementation

    ERIC Educational Resources Information Center

    Darojat, Ojat; Nilson, Michelle; Kaufman, David

    2015-01-01

    Open universities have emerged as an innovative pillar in the expansion of access to higher education participation, with single-mode distance education providers broadening access in many countries through economies of scale supported by large enrolments. These models raise questions about the quality of education provided. This paper reports on…

  18. Flow chemistry kinetic studies reveal reaction conditions for ready access to unsymmetrical trehalose analogues.

    PubMed

    Patel, Mitul K; Davis, Benjamin G

    2010-10-07

    Monofunctionalization of trehalose, a widely found symmetric plant disaccharide, was studied in a microreactor, giving valuable kinetic insights that have allowed improvements in desymmetrization yields and the development of a reaction sequence for large-scale monofunctionalizations that allow access to probes of trehalose's biological function.

  19. Using Computing and Data Grids for Large-Scale Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2001-01-01

    We use the term "Grid" to refer to a software system that provides uniform and location independent access to geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. These emerging data and computing Grids promise to provide a highly capable and scalable environment for addressing large-scale science problems. We describe the requirements for science Grids, the resulting services and architecture of NASA's Information Power Grid (IPG) and DOE's Science Grid, and some of the scaling issues that have come up in their implementation.

  20. Remote visual analysis of large turbulence databases at multiple scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pulido, Jesus; Livescu, Daniel; Kanov, Kalin

    The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
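
    As a sketch of the wavelet-based multi-resolution idea, the snippet below decomposes a 1-D signal standing in for a line cut of a DNS field with PyWavelets, then reconstructs a coarse preview by zeroing the detail coefficients; coarse levels are cheap to ship to a remote viewer while details stream in on demand. The wavelet and level count are illustrative, not the paper's settings.

    ```python
    import numpy as np
    import pywt

    # Toy 1-D "velocity" signal with a large-scale and a fine-scale component.
    x = np.linspace(0, 8 * np.pi, 4096)
    signal = np.sin(x) + 0.3 * np.sin(17 * x) + 0.05 * np.random.randn(x.size)

    # Multi-level discrete wavelet decomposition.
    coeffs = pywt.wavedec(signal, "db4", level=5)

    # Keep only the coarsest approximation: a low-resolution preview.
    preview = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    low_res = pywt.waverec(preview, "db4")[:signal.size]

    err = np.linalg.norm(signal - low_res) / np.linalg.norm(signal)
    print(f"relative error of coarse preview: {err:.3f}")
    ```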

  1. Remote visual analysis of large turbulence databases at multiple scales

    DOE PAGES

    Pulido, Jesus; Livescu, Daniel; Kanov, Kalin; ...

    2018-06-15

    The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.

  2. A practical overview and comparison of certain commercial forensic software tools for processing large-scale digital investigations

    NASA Astrophysics Data System (ADS)

    Kröger, Knut; Creutzburg, Reiner

    2013-05-01

    The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9 and Guidance EnCase Forensic 7 regarding performance, functionality, usability and capability. We show how these software tools work with large forensic images and how capable they are of examining complex and big data scenarios.

  3. Future of applied watershed science at regional scales

    Treesearch

    Lee Benda; Daniel Miller; Steve Lanigan; Gordon Reeves

    2009-01-01

    Resource managers must deal increasingly with land use and conservation plans applied at large spatial scales (watersheds, landscapes, states, regions) involving multiple interacting federal agencies and stakeholders. Access to a geographically focused and application-oriented database would allow users in different locations and with different concerns to quickly...

  4. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale, computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of the integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies that indicate that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technologies infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will be in the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  5. We Are Lost: Measuring the Accessibility of Signage in Public General Hospitals

    ERIC Educational Resources Information Center

    Schuster, Michal; Elroy, Irit; Elmakais, Ido

    2017-01-01

    Hospital signage is a critical element in patients' and visitors' understanding of directions, instructions and warnings in the facility. In multilingual environments, organizations need to make sure that the information is accessible in the languages of the people who consume their services. As part of a large-scale study that examined the…

  6. Latest COBE results, large-scale data, and predictions of inflation

    NASA Technical Reports Server (NTRS)

    Kashlinsky, A.

    1992-01-01

    One of the predictions of the inflationary scenario of cosmology is that the initial spectrum of primordial density fluctuations (PDFs) must have the Harrison-Zeldovich (HZ) form. Here, in order to test the inflationary scenario, predictions of the microwave background radiation (MBR) anisotropies measured by COBE are computed based on large-scale data for the universe and assuming Omega = 1 and the HZ spectrum on large scales. The minimal scale where the spectrum can first enter the HZ regime is found, constraining the power spectrum of the mass distribution to within the bias factor b. This factor is determined and used to predict parameters of the MBR anisotropy field. For the spectrum of PDFs that reaches the HZ regime immediately after the scale accessible to the APM catalog, the numbers on MBR anisotropies are consistent with the COBE detections, and thus standard inflation can indeed be considered a viable theory for the origin of the large-scale structure in the universe.

  7. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    PubMed

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production would imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face, and their lack of interest and motivation for going large-scale, suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  8. Secure Access Control and Large Scale Robust Representation for Online Multimedia Event Detection

    PubMed Central

    Liu, Changyu; Li, Huiling

    2014-01-01

    We developed an online multimedia event detection (MED) system. However, there are a secure access control issue and a large-scale robust representation issue when integrating traditional event detection algorithms into the online environment. For the first issue, we proposed a tree proxy-based and service-oriented access control (TPSAC) model based on the traditional role-based access control model. Verification experiments were conducted on the CloudSim simulation platform, and the results showed that the TPSAC model is suitable for the access control of dynamic online environments. For the second issue, inspired by the object-bank scene descriptor, we proposed a 1000-object-bank (1000OBK) event descriptor. Feature vectors of the 1000OBK were extracted from response pyramids of 1000 generic object detectors which were trained on standard annotated image datasets, such as the ImageNet dataset. A spatial bag-of-words tiling approach was then adopted to encode these feature vectors, bridging the gap between objects and events. Furthermore, we performed experiments in the context of event classification on the challenging TRECVID MED 2012 dataset, and the results showed that the robust 1000OBK event descriptor outperforms state-of-the-art approaches. PMID:25147840

  9. Access control and privacy in large distributed systems

    NASA Technical Reports Server (NTRS)

    Leiner, B. M.; Bishop, M.

    1986-01-01

    Large-scale distributed systems consist of workstations, mainframe computers, supercomputers and other types of servers, all connected by a computer network. These systems are being used in a variety of applications, including the support of collaborative scientific research. In such an environment, issues of access control and privacy arise. Access control is required for several reasons, including the protection of sensitive resources and cost control. Privacy is also required for similar reasons, including the protection of a researcher's proprietary results. A possible architecture for integrating available computer and communications security technologies into a system that meets these requirements is described. This architecture is meant as a starting point for discussion, rather than the final answer.

  10. Heterogeneity and scale of sustainable development in cities.

    PubMed

    Brelsford, Christa; Lobo, José; Hand, Joe; Bettencourt, Luís M A

    2017-08-22

    Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals.

  11. Heterogeneity and scale of sustainable development in cities

    PubMed Central

    Brelsford, Christa; Lobo, José; Hand, Joe

    2017-01-01

    Rapid worldwide urbanization is at once the main cause and, potentially, the main solution to global sustainable development challenges. The growth of cities is typically associated with increases in socioeconomic productivity, but it also creates strong inequalities. Despite a growing body of evidence characterizing these heterogeneities in developed urban areas, not much is known systematically about their most extreme forms in developing cities and their consequences for sustainability. Here, we characterize the general patterns of income and access to services in a large number of developing cities, with an emphasis on an extensive, high-resolution analysis of the urban areas of Brazil and South Africa. We use detailed census data to construct sustainable development indices in hundreds of thousands of neighborhoods and show that their statistics are scale-dependent and point to the critical role of large cities in creating higher average incomes and greater access to services within their national context. We then quantify the general statistical trajectory toward universal basic service provision at different scales to show that it is characterized by varying levels of inequality, with initial increases in access being typically accompanied by growing disparities over characteristic spatial scales. These results demonstrate how extensions of these methods to other goals and data can be used over time and space to produce a simple but general quantitative assessment of progress toward internationally agreed sustainable development goals. PMID:28461489

  12. Large scale 70mm photography for range resources analysis in the Western United States. [Casa Grande, Arizona, Mercury, Nevada, and Mojave Desert]

    NASA Technical Reports Server (NTRS)

    Tueller, P. T.

    1977-01-01

    Large-scale 70mm aerial photography is a valuable supplementary tool for rangeland studies. A wide assortment of applications was developed, varying from vegetation mapping to assessing environmental impact on rangelands. Color and color-infrared stereo pairs are useful for effectively sampling sites limited by ground accessibility. They allow an increased sample size at similar or lower cost than ground sampling techniques and provide a permanent record.

  13. Changing global essential medicines norms to improve access to AIDS treatment: lessons from Brazil.

    PubMed

    Nunn, A; Fonseca, E Da; Gruskin, S

    2009-01-01

    Brazil's large-scale, successful HIV/AIDS treatment programme is considered by many to be a model for other developing countries aiming to improve access to AIDS treatment. Far less is known about Brazil's important role in changing global norms related to international pharmaceutical policy, particularly international human rights, health and trade policies governing access to essential medicines. Prompted by Brazil's interest in preserving its national AIDS treatment policies during World Trade Organisation trade disputes with the USA, these efforts to change global essential medicines norms have had important implications for other countries, particularly those scaling up AIDS treatment. This paper analyses Brazil's contributions to global essential medicines policy and explains the relevance of Brazil's contributions to global health policy today.

  14. Resolving Properties of Polymers and Nanoparticle Assembly through Coarse-Grained Computational Studies.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grest, Gary S.

    2017-09-01

    Coupled length and time scales determine the dynamic behavior of polymers and polymer nanocomposites and underlie their unique properties. To resolve the properties over large time and length scales it is imperative to develop coarse-grained models which retain the atomistic specificity. Here we probe the degree of coarse graining required to simultaneously retain significant atomistic details and access large length and time scales. The degree of coarse graining in turn sets the minimum length scale instrumental in defining polymer properties and dynamics. Using polyethylene as a model system, we probe how the coarse-graining scale affects the measured dynamics with different numbers of methylene groups per coarse-grained bead. Using these models we simulate polyethylene melts for times over 500 ms to study the viscoelastic properties of well-entangled polymer melts and large nanoparticle assembly as the nanoparticles are driven close enough to form nanostructures.

  15. Cosmology on ultralarge scales with intensity mapping of the neutral hydrogen 21 cm emission: limits on primordial non-Gaussianity.

    PubMed

    Camera, Stefano; Santos, Mário G; Ferreira, Pedro G; Ferramacho, Luís

    2013-10-25

    The large-scale structure of the Universe supplies crucial information about the physical processes at play at early times. Unresolved maps of the intensity of 21 cm emission from neutral hydrogen HI at redshifts z ≈ 1-5 are the best hope of accessing the ultralarge-scale information, directly related to the early Universe. A purpose-built HI intensity experiment may be used to detect the large-scale effects of primordial non-Gaussianity, placing stringent bounds on different models of inflation. We argue that it may be possible to place tight constraints on the non-Gaussianity parameter f_NL, with an error close to σ(f_NL) ≈ 1.

  16. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    A temporary team is assigned to prepare layers and frameworks for future feature teams (Presentation Layer, Domain Layer, Data Access Layer).

  17. Oscillatory mechanisms of process binding in memory.

    PubMed

    Klimesch, Wolfgang; Freunberger, Roman; Sauseng, Paul

    2010-06-01

    A central topic in cognitive neuroscience is the question of which processes underlie large-scale communication within and between different neural networks. The basic assumption is that oscillatory phase synchronization plays an important role for process binding--the transient linking of different cognitive processes--which may be considered a special type of large-scale communication. We investigate this question for memory processes on the basis of different types of oscillatory synchronization mechanisms. The reviewed findings suggest that theta and alpha phase coupling (and phase reorganization) reflect control processes in two large memory systems: a working memory system and a complex knowledge system that comprises semantic long-term memory. It is suggested that alpha phase synchronization may be interpreted in terms of processes that coordinate top-down control (a process guided by expectancy to focus on relevant search areas) and access to memory traces (a process leading to the activation of a memory trace). An analogous interpretation is suggested for theta oscillations and the controlled access to episodic memories. Copyright (c) 2009 Elsevier Ltd. All rights reserved.
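
    As an illustration of the kind of measure that underlies such findings, phase coupling between two signals is commonly quantified with the phase-locking value (PLV). The sketch below is a generic textbook computation, not the analysis pipeline of the reviewed studies; the 6 Hz theta-band example signals are invented for demonstration.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """Phase-locking value between two equally long signals.

    PLV = |mean(exp(i * (phi_x - phi_y)))|; 1 means perfect phase coupling,
    0 means no consistent phase relation.
    """
    phi_x = np.angle(hilbert(x))  # instantaneous phase via the analytic signal
    phi_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phi_x - phi_y))))

# Example: two theta-band (6 Hz) signals with a fixed phase lag plus noise.
t = np.linspace(0, 2, 1000)
x = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)
y = np.sin(2 * np.pi * 6 * t + 0.5) + 0.3 * np.random.randn(t.size)
print(phase_locking_value(x, y))  # close to 1 for these coupled signals
```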

  18. Access Control Management for SCADA Systems

    NASA Astrophysics Data System (ADS)

    Hong, Seng-Phil; Ahn, Gail-Joon; Xu, Wenjuan

    The information technology revolution has transformed all aspects of our society including critical infrastructures and led a significant shift from their old and disparate business models based on proprietary and legacy environments to more open and consolidated ones. Supervisory Control and Data Acquisition (SCADA) systems have been widely used not only for industrial processes but also for some experimental facilities. Due to the nature of open environments, managing SCADA systems should meet various security requirements since system administrators need to deal with a large number of entities and functions involved in critical infrastructures. In this paper, we identify necessary access control requirements in SCADA systems and articulate access control policies for the simulated SCADA systems. We also attempt to analyze and realize those requirements and policies in the context of role-based access control that is suitable for simplifying administrative tasks in large scale enterprises.
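
    To make the role-based approach concrete, the following minimal sketch shows how permissions attached to roles, rather than to individual users, simplify administration. The roles, users and permissions are hypothetical and are not taken from the paper's simulated SCADA systems.

```python
# Minimal role-based access control (RBAC) check: permissions attach to
# roles, and users acquire them only through role assignment.
ROLE_PERMS = {
    "operator": {("read", "sensor"), ("write", "setpoint")},
    "engineer": {("read", "sensor"), ("write", "setpoint"), ("write", "plc_config")},
    "auditor":  {("read", "sensor"), ("read", "audit_log")},
}
USER_ROLES = {"alice": {"engineer"}, "bob": {"operator", "auditor"}}

def check_access(user, action, resource):
    """Grant if any of the user's roles carries the (action, resource) permission."""
    return any((action, resource) in ROLE_PERMS[r] for r in USER_ROLES.get(user, ()))

print(check_access("bob", "write", "plc_config"))    # False
print(check_access("alice", "write", "plc_config"))  # True
```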

  19. Orthographic and Phonological Neighborhood Databases across Multiple Languages.

    PubMed

    Marian, Viorica

    2017-01-01

    The increased globalization of science and technology and the growing number of bilinguals and multilinguals in the world have made research with multiple languages a mainstay for scholars who study human function and especially those who focus on language, cognition, and the brain. Such research can benefit from large-scale databases and online resources that describe and measure lexical, phonological, orthographic, and semantic information. The present paper discusses currently-available resources and underscores the need for tools that enable measurements both within and across multiple languages. A general review of language databases is followed by a targeted introduction to databases of orthographic and phonological neighborhoods. A specific focus on CLEARPOND illustrates how databases can be used to assess and compare neighborhood information across languages, to develop research materials, and to provide insight into broad questions about language. As an example of how using large-scale databases can answer questions about language, a closer look at neighborhood effects on lexical access reveals that not only orthographic, but also phonological neighborhoods can influence visual lexical access both within and across languages. We conclude that capitalizing upon large-scale linguistic databases can advance, refine, and accelerate scientific discoveries about the human linguistic capacity.
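
    A common neighborhood measure of the kind such databases report is Coltheart's N, the number of words obtainable by substituting a single letter. The sketch below is a simplified illustration of that metric, not CLEARPOND's implementation; the toy lexicon is invented.

```python
def coltheart_n(word, lexicon):
    """Count orthographic neighbors: words of the same length that differ
    from `word` by exactly one substituted letter (Coltheart's N)."""
    def is_neighbor(w):
        return (len(w) == len(word)
                and sum(a != b for a, b in zip(w, word)) == 1)
    return sum(1 for w in lexicon if is_neighbor(w))

lexicon = {"cat", "hat", "cot", "car", "cast", "dog"}
print(coltheart_n("cat", lexicon))  # 3: hat, cot, car
```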

  20. Large scale germplasm screening for identification of novel rice blast resistance sources

    PubMed Central

    Vasudevan, Kumar; Vera Cruz, Casiana M.; Gruissem, Wilhelm; Bhullar, Navreet K.

    2014-01-01

    Rice is a major cereal crop that contributes significantly to global food security. Biotic stresses, including the rice blast fungus, cause severe yield losses that significantly impair rice production worldwide. The rapid genetic evolution of the fungus often overcomes the resistance conferred by major genes after a few years of intensive agricultural use. Therefore, resistance breeding requires continuous efforts to enrich the reservoir of resistance genes/alleles to effectively tackle the disease. Seed banks represent a rich stock of genetic diversity; however, they are still under-explored for identifying novel genes and/or their functional alleles. We conducted a large-scale screen for new rice blast resistance sources in 4246 geographically diverse rice accessions originating from 13 major rice-growing countries. The accessions were selected from a total collection of over 120,000 accessions based on their annotated rice blast resistance information in the International Rice Genebank. A two-step resistance screening protocol was used, involving natural infection in a rice uniform blast nursery and subsequent artificial infections with five single rice blast isolates. The nursery-resistant accessions showed varied disease responses when infected with single isolates, suggesting the presence of diverse resistance genes/alleles in this accession collection. In addition, 289 accessions showed broad-spectrum resistance against all five single rice blast isolates. The selected resistant accessions were genotyped for the presence of the Pi2 resistance gene, thereby identifying potential accessions for the isolation of allelic variants of this blast resistance gene. Together, the accessions with broad-spectrum and isolate-specific blast resistance represent the core material for the isolation of previously unknown blast resistance genes and/or their allelic variants that can be deployed in rice breeding programs. PMID:25324853

  1. A cross-sectional ecological study of spatial scale and geographic inequality in access to drinking-water and sanitation.

    PubMed

    Yu, Weiyu; Bain, Robert E S; Mansour, Shawky; Wright, Jim A

    2014-11-26

    Measuring inequality in access to safe drinking-water and sanitation is proposed as a component of international monitoring following the expiry of the Millennium Development Goals. This study aims to evaluate the utility of census data in measuring geographic inequality in access to drinking-water and sanitation. Spatially referenced census data were acquired for Colombia, South Africa, Egypt, and Uganda, whilst non-spatially referenced census data were acquired for Kenya. Four variants of the dissimilarity index were used to estimate geographic inequality in access to both services using large and small area units in each country through a cross-sectional, ecological study. Inequality was greatest for piped water in South Africa in 2001 (based on 53 areas (N) with a median population (MP) of 657,015; D = 0.5599) and lowest for access to an improved water source in Uganda in 2008 (N = 56; MP = 419,399; D = 0.2801). For sanitation, inequality was greatest for those lacking any facility in Kenya in 2009 (N = 158; MP = 216,992; D = 0.6981), and lowest for access to an improved facility in Uganda in 2002 (N = 56; MP = 341,954; D = 0.3403). Although dissimilarity index values were greater for smaller areal units, when study countries were ranked in terms of inequality, these ranks remained unaffected by the choice of large or small areal units. International comparability was limited due to definitional and temporal differences between censuses. This five-country study suggests that patterns of inequality for broad regional units do often reflect inequality in service access at a more local scale. This implies household surveys designed to estimate province-level service coverage can provide valuable insights into geographic inequality at lower levels. In comparison with household surveys, censuses facilitate inequality assessment at different spatial scales, but pose challenges in harmonising water and sanitation typologies across countries.
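
    The study used four variants of the dissimilarity index; the sketch below computes the classic form, D = 0.5 * sum_i |a_i/A - b_i/B|, which may differ in detail from the variants applied in the paper. The four-area counts are invented for illustration.

```python
import numpy as np

def dissimilarity_index(with_access, without_access):
    """Classic index of dissimilarity over areal units:
    D = 0.5 * sum_i |a_i/A - b_i/B|, where a_i (b_i) is the count of
    households with (without) access in unit i and A (B) are the totals.
    D ranges from 0 (even distribution) to 1 (complete segregation)."""
    a = np.asarray(with_access, dtype=float)
    b = np.asarray(without_access, dtype=float)
    return 0.5 * np.abs(a / a.sum() - b / b.sum()).sum()

# Example: piped-water counts for four hypothetical census areas.
print(dissimilarity_index([900, 500, 100, 20], [100, 500, 900, 980]))
```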

  2. The Power of Technology to Transform Adult Learning: Expanding Access to Adult Education & Workforce Skills through Distance Learning

    ERIC Educational Resources Information Center

    McCain, Mary L.

    2009-01-01

    There is a highly compelling case for using technology on a large scale to increase access to and improve America's adult education and workforce skills enterprise. By the reckoning of the National Commission on Adult Literacy and others, the nation must reach many more millions of adults with effective college- and job-readiness skills programs…

  3. Correlations, Trends and Potential Biases among Publicly Accessible Web-Based Student Evaluations of Teaching: A Large-Scale Study of RateMyProfessors.com Data

    ERIC Educational Resources Information Center

    Rosen, Andrew S.

    2018-01-01

    Student evaluations of teaching are widely adopted across academic institutions, but there are many underlying trends and biases that can influence their interpretation. Publicly accessible web-based student evaluations of teaching are of particular relevance, due to their widespread use by students in the course selection process and the quantity…

  4. Remote visualization and scale analysis of large turbulence datasets

    NASA Astrophysics Data System (ADS)

    Livescu, D.; Pulido, J.; Burns, R.; Canada, C.; Ahrens, J.; Hamann, B.

    2015-12-01

    Accurate simulations of turbulent flows require solving all the dynamically relevant scales of motions. This technique, called Direct Numerical Simulation, has been successfully applied to a variety of simple flows; however, the large-scale flows encountered in Geophysical Fluid Dynamics (GFD) would require meshes outside the range of the most powerful supercomputers for the foreseeable future. Nevertheless, the current generation of petascale computers has enabled unprecedented simulations of many types of turbulent flows which focus on various GFD aspects, from the idealized configurations extensively studied in the past to more complex flows closer to the practical applications. The pace at which such simulations are performed only continues to increase; however, the simulations themselves are restricted to a small number of groups with access to large computational platforms. Yet the petabytes of turbulence data offer almost limitless information on many different aspects of the flow, from the hierarchy of turbulence moments, spectra and correlations, to structure-functions, geometrical properties, etc. The ability to share such datasets with other groups can significantly reduce the time to analyze the data, help the creative process and increase the pace of discovery. Using the largest DOE supercomputing platforms, we have performed some of the biggest turbulence simulations to date, in various configurations, addressing specific aspects of turbulence production and mixing mechanisms. Until recently, the visualization and analysis of such datasets was restricted by access to large supercomputers. The public Johns Hopkins Turbulence database simplifies the access to multi-Terabyte turbulence datasets and facilitates turbulence analysis through the use of commodity hardware. First, one of our datasets, which is part of the database, will be described and then a framework that adds high-speed visualization and wavelet support for multi-resolution analysis of turbulence will be highlighted. The addition of wavelet support reduces the latency and bandwidth requirements for visualization, allowing for many concurrent users, and enables new types of analyses, including scale decomposition and coherent feature extraction.
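
    The wavelet support described above enables scale decomposition of turbulence fields. As a much-reduced illustration of the idea, the sketch below separates the large-scale content of a synthetic 1-D signal with PyWavelets; the database's actual framework operates on 3-D, multi-terabyte fields and is not reproduced here.

```python
import numpy as np
import pywt  # PyWavelets

# Synthetic 1-D "velocity" signal mixing large- and small-scale content.
x = np.linspace(0, 1, 1024)
signal = np.sin(2 * np.pi * 4 * x) + 0.2 * np.sin(2 * np.pi * 64 * x)

# Multi-level discrete wavelet decomposition separates scales.
coeffs = pywt.wavedec(signal, "db4", level=4)

# Reconstruct only the coarse (large-scale) part by zeroing the detail bands.
coarse = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
large_scale = pywt.waverec(coarse, "db4")
print(large_scale.shape)  # same length as the input signal
```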

  5. Achieving Agility and Stability in Large-Scale Software Development

    DTIC Science & Technology

    2013-01-16

    A temporary team is assigned to prepare layers and frameworks for future feature teams (Presentation Layer, Domain Layer, Data Access Layer, Framework).

  6. Changing global essential medicines norms to improve access to AIDS treatment: Lessons from Brazil

    PubMed Central

    Nunn, A.; Fonseca, E. Da; Gruskin, S.

    2009-01-01

    Brazil's large-scale, successful HIV/AIDS treatment programme is considered by many to be a model for other developing countries aiming to improve access to AIDS treatment. Far less is known about Brazil's important role in changing global norms related to international pharmaceutical policy, particularly international human rights, health and trade policies governing access to essential medicines. Prompted by Brazil's interest in preserving its national AIDS treatment policies during World Trade Organisation trade disputes with the USA, these efforts to change global essential medicines norms have had important implications for other countries, particularly those scaling up AIDS treatment. This paper analyses Brazil's contributions to global essential medicines policy and explains the relevance of Brazil's contributions to global health policy today. PMID:19333805

  7. Wall Modeled Large Eddy Simulation of Airfoil Trailing Edge Noise

    NASA Astrophysics Data System (ADS)

    Kocheemoolayil, Joseph; Lele, Sanjiva

    2014-11-01

    Large eddy simulation (LES) of airfoil trailing edge noise has largely been restricted to low Reynolds numbers due to prohibitive computational cost. Wall modeled LES (WMLES) is a computationally cheaper alternative that makes full-scale Reynolds numbers relevant to large wind turbines accessible. A systematic investigation of trailing edge noise prediction using WMLES is conducted. Detailed comparisons are made with experimental data. The stress boundary condition from a wall model does not constrain the fluctuating velocity to vanish at the wall. This limitation has profound implications for trailing edge noise prediction. The simulation over-predicts the intensity of fluctuating wall pressure and far-field noise. An improved wall model formulation that minimizes the over-prediction of fluctuating wall pressure is proposed and carefully validated. The flow configurations chosen for the study are from the workshop on benchmark problems for airframe noise computations. The large eddy simulation database is used to examine the adequacy of scaling laws that quantify the dependence of trailing edge noise on Mach number, Reynolds number and angle of attack. Simplifying assumptions invoked in engineering approaches towards predicting trailing edge noise are critically evaluated. We gratefully acknowledge financial support from GE Global Research and thank Cascade Technologies Inc. for providing access to their massively-parallel large eddy simulation framework.

  8. SureChEMBL: a large-scale, chemically annotated patent document database.

    PubMed

    Papadatos, George; Davies, Mark; Dedman, Nathan; Chambers, Jon; Gaulton, Anna; Siddle, James; Koks, Richard; Irvine, Sean A; Pettersson, Joe; Goncharoff, Nicko; Hersey, Anne; Overington, John P

    2016-01-04

    SureChEMBL is a publicly available large-scale resource containing compounds extracted from the full text, images and attachments of patent documents. The data are extracted from the patent literature according to an automated text and image-mining pipeline on a daily basis. SureChEMBL provides access to a previously unavailable, open and timely set of annotated compound-patent associations, complemented with sophisticated combined structure and keyword-based search capabilities against the compound repository and patent document corpus; given the wealth of knowledge hidden in patent documents, analysis of SureChEMBL data has immediate applications in drug discovery, medicinal chemistry and other commercial areas of chemical science. Currently, the database contains 17 million compounds extracted from 14 million patent documents. Access is available through a dedicated web-based interface and data downloads at: https://www.surechembl.org/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  9. Cloud-based solution to identify statistically significant MS peaks differentiating sample categories.

    PubMed

    Ji, Jun; Ling, Jeffrey; Jiang, Helen; Wen, Qiaojun; Whitin, John C; Tian, Lu; Cohen, Harvey J; Ling, Xuefeng B

    2013-03-23

    Mass spectrometry (MS) has evolved to become the primary high-throughput tool for proteomics-based biomarker discovery. Multiple challenges in protein MS data analysis remain: large-scale and complex data-set management; MS peak identification and indexing; and high-dimensional differential peak analysis with concurrent statistical tests based on the false discovery rate (FDR). "Turnkey" solutions are needed for biomarker investigations to rapidly process MS data sets and identify statistically significant peaks for subsequent validation. Here we present an efficient and effective solution, which provides experimental biologists easy access to "cloud" computing capabilities to analyze MS data. The web portal can be accessed at http://transmed.stanford.edu/ssa/. The web application supports large-scale online uploading and analysis of MS data through a simple user interface. This bioinformatic tool will facilitate the discovery of potential protein biomarkers using MS.
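
    The FDR-controlled differential analysis mentioned above is typically implemented with the Benjamini-Hochberg step-up procedure. The sketch below shows that procedure on invented per-peak p-values; it is a generic implementation, not the portal's actual code.

```python
import numpy as np

def benjamini_hochberg(pvals, alpha=0.05):
    """Boolean mask of p-values significant at FDR level alpha
    (Benjamini-Hochberg step-up procedure)."""
    p = np.asarray(pvals, dtype=float)
    n = p.size
    order = np.argsort(p)
    thresholds = alpha * np.arange(1, n + 1) / n
    below = p[order] <= thresholds
    keep = np.zeros(n, dtype=bool)
    if below.any():
        cutoff = np.nonzero(below)[0].max()  # largest rank passing its threshold
        keep[order[:cutoff + 1]] = True
    return keep

# Invented per-peak p-values from a differential test across sample groups.
print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.2, 0.8]))
```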

  10. SureChEMBL: a large-scale, chemically annotated patent document database

    PubMed Central

    Papadatos, George; Davies, Mark; Dedman, Nathan; Chambers, Jon; Gaulton, Anna; Siddle, James; Koks, Richard; Irvine, Sean A.; Pettersson, Joe; Goncharoff, Nicko; Hersey, Anne; Overington, John P.

    2016-01-01

    SureChEMBL is a publicly available large-scale resource containing compounds extracted from the full text, images and attachments of patent documents. The data are extracted from the patent literature according to an automated text and image-mining pipeline on a daily basis. SureChEMBL provides access to a previously unavailable, open and timely set of annotated compound-patent associations, complemented with sophisticated combined structure and keyword-based search capabilities against the compound repository and patent document corpus; given the wealth of knowledge hidden in patent documents, analysis of SureChEMBL data has immediate applications in drug discovery, medicinal chemistry and other commercial areas of chemical science. Currently, the database contains 17 million compounds extracted from 14 million patent documents. Access is available through a dedicated web-based interface and data downloads at: https://www.surechembl.org/. PMID:26582922

  11. Changing Patterns of Access to Education in Anglophone and Francophone Countries in Sub Saharan Africa: Is Education for All Pro-Poor? CREATE Pathways to Access. Research Monograph No. 52

    ERIC Educational Resources Information Center

    Lewin, Keith M.; Sabates, Ricardo

    2011-01-01

    This paper explores patterns of growth in participation in six Anglophone and seven Francophone countries in SSA. The countries are Kenya, Malawi, Nigeria, Tanzania, Uganda, Zambia, Benin, Burkina Faso, Cameroon, Madagascar, Mali, Niger and Senegal. These countries all have large scale Universal Primary Education programmes and all have…

  12. Possibility of the market expansion of large capacity optical cold archive

    NASA Astrophysics Data System (ADS)

    Matsumoto, Ikuo; Sakata, Emiko

    2017-08-01

    The IoT and Big Data field, activated by the ICT revolution, has caused a rapid increase in the data distributed by various business applications. As a result, data with low access frequency has been rapidly growing to a scale that has never been experienced before. Such data is called "cold data", and the storage for cold data is called "cold storage". In this situation, the storage specifications, including access frequency, response speed and cost, are determined by the application's requirements.

  13. User-Centered Indexing for Adaptive Information Access

    NASA Technical Reports Server (NTRS)

    Chen, James R.; Mathe, Nathalie

    1996-01-01

    We are focusing on information access tasks characterized by a large volume of hypermedia-connected technical documents, a need for rapid and effective access to familiar information, and long-term interaction with evolving information. The problem for technical users is to build and maintain a personalized, task-oriented model of the information to quickly access relevant information. We propose a solution which provides user-centered adaptive information retrieval and navigation. This solution supports users in customizing information access over time. It is complementary to information discovery methods which provide access to new information, since it lets users customize future access to previously found information. It relies on a technique, called the Adaptive Relevance Network, which creates and maintains a complex indexing structure to represent personal users' information access maps organized by concepts. This technique is integrated within the Adaptive HyperMan system, which helps NASA Space Shuttle flight controllers organize and access large amounts of information. It allows users to select and mark any part of a document as interesting, and to index that part with user-defined concepts. Users can then retrieve marked portions of documents. This functionality allows users to define and access personal collections of information, which are dynamically computed. The system also supports collaborative review by letting users share group access maps. The adaptive relevance network provides long-term adaptation based both on usage and on explicit user input. The indexing structure is dynamic and evolves over time. Learning and generalization support flexible retrieval of information under similar concepts. The network is geared towards more recent information access, and automatically manages its size in order to maintain rapid access when scaling up to a large hypermedia space. We present results of simulated learning experiments.
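
    The following toy class sketches the flavor of a usage-adaptive concept index: links between concepts and document parts are created at indexing time and reinforced or decayed by user feedback. It is a loose reconstruction for illustration only, not the original Adaptive Relevance Network design; all names and the learning-rate scheme are invented.

```python
from collections import defaultdict

class RelevanceNetwork:
    """Toy concept-to-document index with usage-based adaptation."""
    def __init__(self, lr=0.2):
        self.w = defaultdict(float)  # (concept, doc_part) -> relevance weight
        self.lr = lr                 # adaptation rate (invented scheme)

    def index(self, concept, doc_part):
        """Create a link with a neutral starting weight."""
        self.w[(concept, doc_part)] = max(self.w[(concept, doc_part)], 0.5)

    def feedback(self, concept, doc_part, useful):
        """Reinforce or decay a link after the user accepts/rejects a result."""
        key = (concept, doc_part)
        target = 1.0 if useful else 0.0
        self.w[key] += self.lr * (target - self.w[key])

    def retrieve(self, concept, k=5):
        """Return the k most relevant document parts for a concept."""
        hits = [(d, w) for (c, d), w in self.w.items() if c == concept]
        return sorted(hits, key=lambda t: -t[1])[:k]

net = RelevanceNetwork()
net.index("abort procedures", "ops-manual:sec-4.2")   # hypothetical names
net.feedback("abort procedures", "ops-manual:sec-4.2", useful=True)
print(net.retrieve("abort procedures"))
```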

  14. ATLAS and LHC computing on CRAY

    NASA Astrophysics Data System (ADS)

    Sciacca, F. G.; Haug, S.; ATLAS Collaboration

    2017-10-01

    Access and exploitation of large-scale computing resources, such as those offered by general-purpose HPC centres, is one important measure for ATLAS and the other Large Hadron Collider experiments in order to meet the challenge posed by the full exploitation of the future data within the constraints of flat budgets. We report on the effort of moving the Swiss WLCG T2 computing, serving ATLAS, CMS and LHCb, from a dedicated cluster to the large Cray systems at the Swiss National Supercomputing Centre CSCS. These systems not only offer very efficient hardware, cooling and highly competent operators, but also have large backfill potential due to their size and multidisciplinary usage, and potential gains due to economies of scale. Technical solutions, performance, expected return and future plans are discussed.

  15. CERN data services for LHC computing

    NASA Astrophysics Data System (ADS)

    Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.

    2017-10-01

    Dependability, resilience, adaptability and efficiency: growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution, while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent complex production workloads. In parallel, our systems provide the platform for continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR as large-scale storage; CERNBox for end-user access and sharing; Ceph as data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; AFS for legacy distributed-file-system services. In this paper we will summarise the experience in supporting LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment with pluggable protocols, tuneable QoS, sharing capabilities and fine-grained ACL management, while continuing to guarantee dependable and robust services.

  16. St. Louis Initiative for Integrated Care Excellence (SLI(2)CE): integrated-collaborative care on a large scale model.

    PubMed

    Brawer, Peter A; Martielli, Richard; Pye, Patrice L; Manwaring, Jamie; Tierney, Anna

    2010-06-01

    The primary care health setting is in crisis. Increasing demand for services, with dwindling numbers of providers, has resulted in decreased access and decreased satisfaction for both patients and providers. Moreover, the overwhelming majority of primary care visits are for behavioral and mental health concerns rather than issues of a purely medical etiology. Integrated-collaborative models of health care delivery offer possible solutions to this crisis. The purpose of this article is to review the existing data available after 2 years of the St. Louis Initiative for Integrated Care Excellence; an example of integrated-collaborative care on a large scale model within a regional Veterans Affairs Health Care System. There is clear evidence that the SLI(2)CE initiative rather dramatically increased access to health care, and modified primary care practitioners' willingness to address mental health issues within the primary care setting. In addition, data suggests strong fidelity to a model of integrated-collaborative care which has been successful in the past. Integrated-collaborative care offers unique advantages to the traditional view and practice of medical care. Through careful implementation and practice, success is possible on a large scale model. PsycINFO Database Record (c) 2010 APA, all rights reserved.

  17. Bringing Policy and Practice to the Table: Young Women's Nutritional Experiences in an Ontario Secondary School

    ERIC Educational Resources Information Center

    Gray, Sarah K.

    2015-01-01

    In recent years, media, health organizations and researchers have raised concern over the health of Canadian children and adolescents. Stakeholders have called on the government to confront the problem. Schools are seen as an ideal location for developing and implementing large-scale interventions because of the ease of access to large groups of…

  18. Implementing 15 Essential Elements for High Quality: A State and Local Policy Scan

    ERIC Educational Resources Information Center

    Barnett, W. Steven; Weisenfeld, G. G.; Brown, Kirsty; Squires, Jim; Horowitz, Michelle

    2016-01-01

    This report explores the extent to which states (and several large cities) are positioned to provide high quality preschool education on a large scale. States and cities that are already doing so or that could do so with modest improvements offer opportunities for advocacy to advance access to high quality early education as well as for rigorous…

  19. A transportable Paul-trap for levitation and accurate positioning of micron-scale particles in vacuum for laser-plasma experiments

    NASA Astrophysics Data System (ADS)

    Ostermayr, T. M.; Gebhard, J.; Haffa, D.; Kiefer, D.; Kreuzer, C.; Allinger, K.; Bömer, C.; Braenzel, J.; Schnürer, M.; Cermak, I.; Schreiber, J.; Hilz, P.

    2018-01-01

    We report on a Paul-trap system with large access angles that allows positioning of fully isolated micrometer-scale particles with micrometer precision as targets in high-intensity laser-plasma interactions. This paper summarizes theoretical and experimental concepts of the apparatus as well as supporting measurements that were performed for the trapping process of single particles.

  20. Decentralized Opportunistic Spectrum Resources Access Model and Algorithm toward Cooperative Ad-Hoc Networks.

    PubMed

    Liu, Ming; Xu, Yang; Mohammed, Abdul-Wahid

    2016-01-01

    Limited communication resources have gradually become a critical factor in the efficiency of decentralized large-scale multi-agent coordination as systems scale up and tasks become more complex. In current research, due to an agent's limited communication and observational capability, an agent in a decentralized setting can only choose a subset of channels to access, and cannot perceive or share global information. Each agent's cooperative decision is based on a partial observation of the system state, and as such, uncertainty in the communication network is unavoidable. In this situation, working out cooperative decision-making under uncertainty with only a partial observation of the environment is a major challenge. In this paper, we propose a decentralized approach that allows agents to cooperatively search and independently choose channels. The key to our design is to build an up-to-date observation of each agent's view so that a local decision model is achievable in large-scale team coordination. We simplify the Dec-POMDP model so that each agent can jointly work out its communication policy in order to improve its local decision utilities for the choice of communication resources. Finally, we discuss an implicit resource-competition game and show that an approximate resource-access tradeoff balance exists between agents. Based on this discovery, the tradeoff between real-time decision-making and the efficiency of cooperation using these channels can be well improved.
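
    As a much-simplified stand-in for the paper's Dec-POMDP formulation, the sketch below lets agents learn channel choices independently with an epsilon-greedy rule, where reward falls with congestion. All parameters are invented and no inter-agent communication policy is modeled.

```python
import numpy as np

rng = np.random.default_rng(0)
n_agents, n_channels, n_steps, eps = 20, 5, 2000, 0.1
q = np.zeros((n_agents, n_channels))      # each agent's local value estimates
counts = np.ones((n_agents, n_channels))  # visit counts for running averages

for _ in range(n_steps):
    # Each agent independently picks a channel from its partial view.
    greedy = q.argmax(axis=1)
    explore = rng.random(n_agents) < eps
    choice = np.where(explore, rng.integers(0, n_channels, n_agents), greedy)
    # Reward decreases with congestion: agents sharing a channel interfere.
    load = np.bincount(choice, minlength=n_channels)
    reward = 1.0 / load[choice]
    # Incremental running-average update of each agent's estimate.
    idx = (np.arange(n_agents), choice)
    counts[idx] += 1
    q[idx] += (reward - q[idx]) / counts[idx]

# Agents end up spread across channels rather than piling onto one.
print(np.bincount(q.argmax(axis=1), minlength=n_channels))
```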

  1. Programmable Direct-Memory-Access Controller

    NASA Technical Reports Server (NTRS)

    Hendry, David F.

    1990-01-01

    Proposed programmable direct-memory-access controller (DMAC) operates with computer systems of 32000 series, which have 32-bit data buses and use addresses of 24 (or potentially 32) bits. Controller functions with or without help of central processing unit (CPU) and starts itself. Includes such advanced features as ability to compare two blocks of memory for equality and to search block of memory for specific value. Made as single very-large-scale integrated-circuit chip.
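
    The two advanced features named above, block comparison and value search, can be sketched in software as follows. This is an illustrative Python analogue of the chip's behavior, not its hardware logic.

```python
def blocks_equal(mem, addr_a, addr_b, length):
    """Compare two memory blocks for equality (DMA-style block compare)."""
    return mem[addr_a:addr_a + length] == mem[addr_b:addr_b + length]

def find_value(mem, start, length, value):
    """Search a block for a specific byte value; return its offset or -1."""
    hit = mem.find(bytes([value]), start, start + length)
    return -1 if hit < 0 else hit - start

mem = bytes([0x10, 0x20, 0x30, 0x10, 0x20, 0x30, 0xFF])
print(blocks_equal(mem, 0, 3, 3))   # True: both blocks hold 10 20 30
print(find_value(mem, 0, 7, 0xFF))  # 6
```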

  2. Patterns and Prevalence of School Access, Transitions and Equity in South Africa: Secondary Analyses of BT20 Large-Scale Data Sources. CREATE Pathways to Access. Research Monograph No. 27

    ERIC Educational Resources Information Center

    Fleisch, Brahm; Shindler, Jennifer

    2009-01-01

    This monograph looks at patterns and prevalence of initial school enrolment, late entry, attainment promotion, and repetition in urban South Africa. The paper pays special attention to the particular gender nature of the patterns of school participation. The study analyses data generated in the genuine representative cohort study, Birth-to-Twenty…

  3. Design of a decentralized reusable research database architecture to support data acquisition in large research projects.

    PubMed

    Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning

    2007-01-01

    The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information which underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A substantial share of the budget of large-scale clinical studies is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use-case scenarios. The architecture contains four layers: data storage and access, decentralized at their production source; a connector acting as a proxy between the CIS and the external world; an information mediator serving as a data access point; and the client side. The proposed design will be implemented inside six clinical centers participating in the @neurIST project as part of a larger system on data integration and reuse for aneurysm treatment.

  4. Large-scale modeling of rain fields from a rain cell deterministic model

    NASA Astrophysics Data System (ADS)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (˜20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (˜150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
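
    Of the steps described, the large-scale binary rain/no-rain field is the most compact to illustrate: a correlated Gaussian field is thresholded so that a prescribed fraction of the domain is raining. The sketch below shows only that step, with invented grid size, anisotropy and occupation rate; the HYCELL cells and the midscale random walk are not reproduced.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(1)
n, rain_fraction = 256, 0.3  # grid size and large-scale rain occupation rate

# Correlated Gaussian field; anisotropic smoothing mimics frontal elongation.
field = gaussian_filter(rng.standard_normal((n, n)), sigma=(20, 5))

# Threshold so that approximately `rain_fraction` of the domain is raining.
threshold = np.quantile(field, 1.0 - rain_fraction)
raining = field > threshold  # binary large-scale rain/no-rain mask
print(raining.mean())        # ~0.3
```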

  5. The role and benefits of accessing primary care patient records during unscheduled care: a systematic review.

    PubMed

    Bowden, Tom; Coiera, Enrico

    2017-09-22

    The purpose of this study was to assess the impact of accessing primary care records on unscheduled care. Unscheduled care is typically delivered in hospital Emergency Departments. Studies published to December 2014 reporting on primary care record access during unscheduled care were retrieved. Twenty-two articles met inclusion criteria from a pool of 192. Many shared electronic health records (SEHRs) were large in scale, servicing many millions of patients. Reported utilization rates by clinicians were variable, with rates >20% amongst health management organizations but much lower in nation-scale systems. No study reported on clinical outcomes or patient safety, and no economic studies of SEHR access during unscheduled care were available. Design factors that may affect utilization included consent and access models, SEHR content, and system usability and reliability. Despite their size and expense, SEHRs designed to support unscheduled care have been poorly evaluated, and it is not possible to draw conclusions about any likely benefits associated with their use. Heterogeneity across the systems and the populations they serve makes generalization about system design or performance difficult. None of the reviewed studies used a theoretical model to guide evaluation. Value of Information models may be a useful theoretical approach to designing evaluation metrics, facilitating comparison across systems in future studies. Well-designed SEHRs should in principle be capable of improving the efficiency, quality and safety of unscheduled care, but at present the evidence for such benefits is weak, largely because it has not been sought.

  6. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increased model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments. At the time this contribution is being written, the proposed testbed represents the first implementation of a distributed large-scale, multi-model experiment in the ESGF/CMIP context, joining together server-side approaches for scientific data analysis, HPDA frameworks, end-to-end workflow management, and cloud computing.

  7. Feasibility of large-scale power plants based on thermoelectric effects

    NASA Astrophysics Data System (ADS)

    Liu, Liping

    2014-12-01

    Heat resources of small temperature difference are easily accessible, free and enormous on the Earth. Thermoelectric effects provide the technology for converting these heat resources directly into electricity. We present designs for electricity generators based on thermoelectric effects that utilize heat resources of small temperature difference, e.g., ocean water at different depths and geothermal resources, and conclude that large-scale power plants based on thermoelectric effects are feasible and economically competitive. The key observation is that the power factor of thermoelectric materials, unlike the figure of merit, can be improved by orders of magnitude upon laminating good conductors and good thermoelectric materials. The predicted large-scale power generators based on thermoelectric effects, if validated, will have the advantages of the scalability, renewability, and free supply of heat resources of small temperature difference on the Earth.
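
    The argument rests on the standard definitions of the thermoelectric power factor and figure of merit, recalled below; the comments paraphrase the paper's key observation.

```latex
% Standard thermoelectric definitions underlying the paper's argument:
\[
  \mathrm{PF} = S^{2}\sigma,
  \qquad
  ZT = \frac{S^{2}\sigma}{\kappa}\,T,
\]
% where $S$ is the Seebeck coefficient, $\sigma$ the electrical conductivity,
% $\kappa$ the thermal conductivity and $T$ the temperature. The paper's key
% observation is that laminating a good conductor (high $\sigma$) with a good
% thermoelectric material (high $S$) can raise the composite's power factor
% $S^{2}\sigma$ by orders of magnitude, even though the figure of merit $ZT$
% cannot be similarly improved.
```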

  8. An objective approach to determining the weight ranges of prey preferred by and accessible to the five large African carnivores.

    PubMed

    Clements, Hayley S; Tambling, Craig J; Hayward, Matt W; Kerley, Graham I H

    2014-01-01

    Broad-scale models describing predator-prey preferences serve as useful departure points for understanding predator-prey interactions at finer scales. Previous analyses used a subjective approach to identify the prey weight preferences of the five large African carnivores; hence their accuracy is questionable. This study uses a segmented model of prey weight versus prey preference to objectively quantify the prey weight preferences of the five large African carnivores. Based on simulations of known predator prey preferences, for prey species sample sizes above 32 the segmented model approach detects up to four known changes in prey weight preference (represented by model break-points) with high rates of detection (75% to 100% of simulations, depending on the number of break-points) and accuracy (within 1.3±4.0 to 2.7±4.4 of the known break-point). When applied to the five large African carnivores, using carnivore diet information from across Africa, the model detected weight ranges of prey that are preferred, killed relative to their abundance, and avoided by each carnivore. Prey in the weight ranges preferred and killed relative to their abundance are together termed "accessible prey". Accessible prey weight ranges were found to be 14-135 kg for cheetah Acinonyx jubatus, 1-45 kg for leopard Panthera pardus, 32-632 kg for lion Panthera leo, 15-1600 kg for spotted hyaena Crocuta crocuta and 10-289 kg for wild dog Lycaon pictus. An assessment of carnivore diets throughout Africa found these accessible prey weight ranges include 88±2% (cheetah), 82±3% (leopard), 81±2% (lion), 97±2% (spotted hyaena) and 96±2% (wild dog) of kills. These descriptions of prey weight preferences therefore contribute to our understanding of the diet spectrum of the five large African carnivores. Where datasets meet the minimum sample size requirements, the segmented model approach provides a means of determining, and comparing, the prey weight range preferences of any carnivore species.
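
    The segmented-model idea can be illustrated with a brute-force search for a single break-point in a two-segment linear fit; the study's model allows up to four break-points, which this invented example does not attempt to reproduce.

```python
import numpy as np

def one_breakpoint_fit(x, y):
    """Brute-force fit of a two-segment linear model; returns the break-point
    that minimizes the total sum of squared residuals."""
    best = (np.inf, None)
    for i in range(2, len(x) - 2):  # candidate break positions
        sse = 0.0
        for xs, ys in ((x[:i], y[:i]), (x[i:], y[i:])):
            coef = np.polyfit(xs, ys, 1)
            sse += np.sum((ys - np.polyval(coef, xs)) ** 2)
        if sse < best[0]:
            best = (sse, x[i])
    return best[1]

# Invented example: preference rises up to ~50 kg, then declines.
x = np.linspace(1, 200, 80)
y = np.where(x < 50, 0.02 * x, 1.0 - 0.004 * (x - 50)) + 0.05 * np.random.randn(80)
print(one_breakpoint_fit(x, y))  # close to 50
```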

  9. Large-Scale Wireless Temperature Monitoring System for Liquefied Petroleum Gas Storage Tanks.

    PubMed

    Fan, Guangwen; Shen, Yu; Hao, Xiaowei; Yuan, Zongming; Zhou, Zhi

    2015-09-18

    Temperature distribution is a critical indicator of the health condition for Liquefied Petroleum Gas (LPG) storage tanks. In this paper, we present a large-scale wireless temperature monitoring system to evaluate the safety of LPG storage tanks. The system includes wireless sensors networks, high temperature fiber-optic sensors, and monitoring software. Finally, a case study on real-world LPG storage tanks proves the feasibility of the system. The unique features of wireless transmission, automatic data acquisition and management, local and remote access make the developed system a good alternative for temperature monitoring of LPG storage tanks in practical applications.

  10. Research Guidelines in the Era of Large-scale Collaborations: An Analysis of Genome-wide Association Study Consortia

    PubMed Central

    Austin, Melissa A.; Hair, Marilyn S.; Fullerton, Stephanie M.

    2012-01-01

    Scientific research has shifted from studies conducted by single investigators to the creation of large consortia. Genetic epidemiologists, for example, now collaborate extensively for genome-wide association studies (GWAS). The effect has been a stream of confirmed disease-gene associations. However, effects on human subjects oversight, data-sharing, publication and authorship practices, research organization and productivity, and intellectual property remain to be examined. The aim of this analysis was to identify all research consortia that had published the results of a GWAS analysis since 2005, characterize them, determine which have publicly accessible guidelines for research practices, and summarize the policies in these guidelines. A review of the National Human Genome Research Institute’s Catalog of Published Genome-Wide Association Studies identified 55 GWAS consortia as of April 1, 2011. These consortia were comprised of individual investigators, research centers, studies, or other consortia and studied 48 different diseases or traits. Only 14 (25%) were found to have publicly accessible research guidelines on consortia websites. The available guidelines provide information on organization, governance, and research protocols; half address institutional review board approval. Details of publication, authorship, data-sharing, and intellectual property vary considerably. Wider access to consortia guidelines is needed to establish appropriate research standards with broad applicability to emerging forms of large-scale collaboration. PMID:22491085

  11. Quantum internet using code division multiple access

    PubMed Central

    Zhang, Jing; Liu, Yu-xi; Özdemir, Şahin Kaya; Wu, Re-Bing; Gao, Feifei; Wang, Xiang-Bin; Yang, Lan; Nori, Franco

    2013-01-01

    A crucial open problem in large-scale quantum networks is how to efficiently transmit quantum data among many pairs of users via a common data-transmission medium. We propose a solution by developing a quantum code division multiple access (q-CDMA) approach in which quantum information is chaotically encoded to spread its spectral content, and then decoded via chaos synchronization to separate different sender-receiver pairs. In comparison to other existing approaches, such as frequency division multiple access (FDMA), the proposed q-CDMA can greatly increase the information rates per channel used, especially for very noisy quantum channels. PMID:23860488
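
    The classical CDMA principle behind the proposal can be sketched with orthogonal spreading codes: two bit streams share one channel by superposition and are separated by correlation. The q-CDMA scheme replaces these codes with chaotic encodings of quantum states, which the classical sketch below does not capture.

```python
import numpy as np

# Orthogonal Walsh codes for two sender-receiver pairs (classical CDMA).
code_a = np.array([1, 1, 1, 1])
code_b = np.array([1, -1, 1, -1])

bits_a = np.array([1, -1, 1])   # +/-1 encoded bit streams
bits_b = np.array([-1, -1, 1])

# Spread each bit over its code and share one channel by superposition.
channel = np.concatenate([a * code_a + b * code_b for a, b in zip(bits_a, bits_b)])

# Each receiver despreads by correlating with its own code.
recovered_a = [np.sign(chunk @ code_a) for chunk in channel.reshape(-1, 4)]
print(recovered_a)  # [1.0, -1.0, 1.0]: pair A's bits, despite the shared channel
```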

  12. Rare b-hadron decays as probe of new physics

    NASA Astrophysics Data System (ADS)

    Lanfranchi, Gaia

    2018-05-01

    The unexpected absence of unambiguous signals of New Physics (NP) at the TeV scale at the Large Hadron Collider (LHC) today puts flavor physics at the forefront. In particular, rare decays of b-hadrons represent a unique probe to challenge the Standard Model (SM) paradigm and test models of NP at a scale much higher than that accessible by direct searches. This article reviews the status of the field.

  13. Genome-wide association implicates numerous genes and pleiotropy underlying ecological trait variation in natural populations of Populus trichocarpa

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKown, Athena; Klapste, Jaroslav; Guy, Robert

    2014-01-01

    To uncover the genetic basis of phenotypic trait variation, we used 448 unrelated wild accessions of black cottonwood (Populus trichocarpa Torr. & Gray) from natural populations throughout western North America. Extensive information from large-scale trait phenotyping (with spatial and temporal replications within a common garden) and genotyping (with a 34K Populus SNP array) of all accessions was used for gene discovery in a genome-wide association study (GWAS).
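
    The statistical core of such a GWAS can be sketched as a per-SNP regression of the phenotype on allele dosage; real pipelines add covariates, population structure and kinship corrections that this invented example omits.

```python
import numpy as np
from scipy import stats

def snp_association(genotypes, phenotype):
    """Per-SNP association p-values from simple linear regression of a
    quantitative trait on allele dosage (0/1/2)."""
    pvals = []
    for g in genotypes.T:  # one column per SNP
        slope, intercept, r, p, se = stats.linregress(g, phenotype)
        pvals.append(p)
    return np.array(pvals)

rng = np.random.default_rng(2)
geno = rng.integers(0, 3, size=(448, 100))           # 448 accessions x 100 SNPs
pheno = 0.5 * geno[:, 0] + rng.standard_normal(448)  # SNP 0 is causal
print(snp_association(geno, pheno)[:3])              # SNP 0 has the smallest p
```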

  14. Concepts for a global resources information system

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.; Urena, J. L.

    1984-01-01

    The objective of the Global Resources Information System (GRIS) is to establish an effective and efficient information management system to meet the data access requirements of NASA and NASA-related scientists conducting large-scale, multi-disciplinary, multi-mission scientific investigations. Using standard interfaces and operating guidelines, diverse data systems can be integrated to provide the capabilities to access and process multiple geographically dispersed data sets and to develop the necessary procedures and algorithms to derive global resource information.

  15. Accessing VA Healthcare During Large-Scale Natural Disasters.

    PubMed

    Der-Martirosian, Claudia; Pinnock, Laura; Dobalian, Aram

    2017-01-01

    Natural disasters can lead to the closure of medical facilities, including those of the Veterans Affairs (VA), thus impacting access to healthcare for U.S. military veteran VA users. We examined the characteristics of VA patients who reported having difficulty accessing care if their usual source of VA care was closed because of a natural disaster. A total of 2,264 veteran VA users living in the U.S. northeast region participated in a 2015 cross-sectional representative survey. The study used VA administrative data in a complex stratified survey design with a multimode approach. A total of 36% of veteran VA users reported that they would have difficulty accessing care elsewhere, with functionally impaired and lower-income VA patients disproportionately affected.

  16. Natural Allelic Diversity, Genetic Structure and Linkage Disequilibrium Pattern in Wild Chickpea

    PubMed Central

    Kujur, Alice; Das, Shouvik; Badoni, Saurabh; Kumar, Vinod; Singh, Mohar; Bansal, Kailash C.; Tyagi, Akhilesh K.; Parida, Swarup K.

    2014-01-01

    Characterization of natural allelic diversity and understanding of the genetic structure and linkage disequilibrium (LD) pattern in wild germplasm accessions by large-scale genotyping of informative microsatellite and single nucleotide polymorphism (SNP) markers is requisite to facilitate chickpea genetic improvement. Large-scale validation and high-throughput genotyping of 478 genome-wide physically mapped genic and genomic microsatellite markers and 380 transcription factor gene-derived SNP markers have been performed using a gel-based assay, a fluorescent dye-labelled automated fragment analyser and matrix-assisted laser desorption ionization-time of flight (MALDI-TOF) mass array. The outcome revealed their high genotyping success rate (97.5%) and the existence of a high level of natural allelic diversity among 94 wild and cultivated Cicer accessions. High intra- and inter-specific polymorphic potential and wider molecular diversity (11–94%), along with a broader genetic base (13–78%) specifically in the functional genic regions of wild accessions, was assayed by the mapped markers. This suggested their utility in monitoring introgression and transferring target trait-specific genomic (gene) regions from the wild to the cultivated gene pool for genetic enhancement. The distinct species/gene pool-wise differentiation, admixed domestication pattern, and differential genome-wide recombination and LD estimates/decay observed in a six-structured population of wild and cultivated accessions using the mapped markers further signify their usefulness in chickpea genetics, genomics and breeding. PMID:25222488

  17. Large-Scale Event Extraction from Literature with Multi-Level Gene Normalization

    PubMed Central

    Wei, Chih-Hsuan; Hakala, Kai; Pyysalo, Sampo; Ananiadou, Sophia; Kao, Hung-Yu; Lu, Zhiyong; Salakoski, Tapio; Van de Peer, Yves; Ginter, Filip

    2013-01-01

    Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from http://evexdb.org/download/, under the Creative Commons – Attribution – Share Alike (CC BY-SA) license. PMID:23613707
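
    Programmatic access through the stated API base URL might look like the following; the endpoint path and query parameters shown are illustrative placeholders, not documented EVEX API calls, so consult the API documentation for the real interface.

```python
import requests

# Base URL as stated in the paper; the endpoint and parameters below are
# illustrative placeholders, not documented API calls.
BASE = "http://www.evexdb.org/api/v001/"

def fetch(endpoint, **params):
    """Generic GET against the API, returning parsed JSON."""
    response = requests.get(BASE + endpoint, params=params, timeout=30)
    response.raise_for_status()
    return response.json()

# Hypothetical query: events mentioning a given gene identifier.
events = fetch("events", gene="TP53")  # placeholder endpoint/parameter names
print(len(events))
```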

  18. Multiple pathways of commodity crop expansion in tropical forest landscapes

    NASA Astrophysics Data System (ADS)

    Meyfroidt, Patrick; Carlson, Kimberly M.; Fagan, Matthew E.; Gutiérrez-Vélez, Victor H.; Macedo, Marcia N.; Curran, Lisa M.; DeFries, Ruth S.; Dyer, George A.; Gibbs, Holly K.; Lambin, Eric F.; Morton, Douglas C.; Robiglio, Valentina

    2014-07-01

    Commodity crop expansion, for both global and domestic urban markets, follows multiple land change pathways entailing direct and indirect deforestation, and results in various social and environmental impacts. Here we compare six published case studies of rapid commodity crop expansion within forested tropical regions. Across cases, between 1.7% and 89.5% of new commodity cropland was sourced from forestlands. Four main factors controlled pathways of commodity crop expansion: (i) the availability of suitable forestland, which is determined by forest area, agroecological or accessibility constraints, and land use policies, (ii) economic and technical characteristics of agricultural systems, (iii) differences in constraints and strategies between small-scale and large-scale actors, and (iv) variable costs and benefits of forest clearing. When remaining forests were unsuitable for agriculture and/or policies restricted forest encroachment, a larger share of commodity crop expansion occurred by conversion of existing agricultural lands, and land use displacement was smaller. Expansion strategies of large-scale actors emerge from context-specific balances between the search for suitable lands; transaction costs or conflicts associated with expanding into forests or other state-owned lands versus smallholder lands; net benefits of forest clearing; and greater access to infrastructure in already-cleared lands. We propose five hypotheses to be tested in further studies: (i) land availability mediates expansion pathways and the likelihood that land use is displaced to distant, rather than to local places; (ii) use of already-cleared lands is favored when commodity crops require access to infrastructure; (iii) in proportion to total agricultural expansion, large-scale actors generate more clearing of mature forests than smallholders; (iv) property rights and land tenure security influence the actors participating in commodity crop expansion, the form of land use displacement, and livelihood outcomes; (v) intensive commodity crops may fail to spare land when inducing displacement. We conclude that understanding pathways of commodity crop expansion is essential to improve land use governance.

  19. Classification of time series patterns from complex dynamic systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.C.; Rao, N.

    1998-07-01

    An increasing availability of high-performance computing and data storage media at decreasing cost is making possible the proliferation of large-scale numerical databases and data warehouses. Numeric warehousing enterprises on the order of hundreds of gigabytes to terabytes are a reality in many fields such as finance, retail sales, process systems monitoring, biomedical monitoring, surveillance and transportation. Large-scale databases are becoming more accessible to larger user communities through the internet, web-based applications and database connectivity. Consequently, most researchers now have access to a variety of massive datasets. This trend will probably only continue to grow over the next several years. Unfortunately, the availability of integrated tools to explore, analyze and understand the data warehoused in these archives is lagging far behind the ability to gain access to the same data. In particular, locating and identifying patterns of interest in numerical time series data is an increasingly important problem for which there are few available techniques. Temporal pattern recognition poses many interesting problems in classification, segmentation, prediction, diagnosis and anomaly detection. This research focuses on the problem of classification or characterization of numerical time series data. Highway vehicles and their drivers are examples of complex dynamic systems (CDS) which are being used by transportation agencies for field testing to generate large-scale time series datasets. Tools for effective analysis of numerical time series in databases generated by highway vehicle systems are not yet available, or have not been adapted to the target problem domain. However, analysis tools from similar domains may be adapted to the problem of classification of numerical time series data.

  20. Diazo compounds in continuous-flow technology.

    PubMed

    Müller, Simon T R; Wirth, Thomas

    2015-01-01

    Diazo compounds are very versatile reagents in organic chemistry and meet the challenge of selective assembly of structurally complex molecules. Their leaving group is dinitrogen; therefore, they are very clean and atom-efficient reagents. However, diazo compounds are potentially explosive and extremely difficult to handle on an industrial scale. In this review, it is discussed how continuous flow technology can help to make these powerful reagents accessible on large scale. Microstructured devices can improve heat transfer greatly and help with the handling of dangerous reagents safely. The in situ formation and subsequent consumption of diazo compounds are discussed along with advances in handling diazomethane and ethyl diazoacetate. The potential large-scale applications of a given methodology is emphasized. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Simulation of FRET dyes allows quantitative comparison against experimental data

    NASA Astrophysics Data System (ADS)

    Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander

    2018-03-01

    Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
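
    The comparison between simulated and measured transfer efficiencies rests on the standard Foerster relation E = 1 / (1 + (r/R0)^6). A minimal sketch of computing an ensemble-averaged efficiency from sampled dye separations follows; the distance values are purely illustrative.

      import numpy as np

      def fret_efficiency(r, r0):
          # Standard Foerster relation for donor-acceptor separation r and
          # Foerster radius r0 (same units for both).
          return 1.0 / (1.0 + (np.asarray(r) / r0) ** 6)

      # Average over an ensemble of distances, as one would over a trajectory
      # when comparing simulated efficiencies against experiment.
      distances_nm = np.array([4.2, 4.8, 5.1, 6.0])  # illustrative values
      print(fret_efficiency(distances_nm, r0=5.4).mean())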

  2. A model for optimizing file access patterns using spatio-temporal parallelism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boonthanome, Nouanesengsy; Patchett, John; Geveci, Berk

    2013-01-01

    For many years now, I/O read time has been recognized as the primary bottleneck for parallel visualization and analysis of large-scale data. In this paper, we introduce a model that can estimate the read time for a file stored in a parallel filesystem when given the file access pattern. Read times ultimately depend on how the file is stored and the access pattern used to read the file. The file access pattern will be dictated by the type of parallel decomposition used. We employ spatio-temporal parallelism, which combines both spatial and temporal parallelism, to provide greater flexibility to possible file access patterns. Using our model, we were able to configure the spatio-temporal parallelism to design optimized read access patterns that resulted in a speedup factor of approximately 400 over traditional file access patterns.
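
    The core idea, that read time is driven jointly by how many requests an access pattern issues and how many bytes it moves, can be illustrated with a toy latency/bandwidth cost model; the functional form and parameters below are illustrative stand-ins, not the model from the paper.

      # Toy cost model: per-request latency plus bytes over bandwidth.
      def estimated_read_time(n_requests, total_bytes,
                              latency_s=1e-3, bandwidth_Bps=1e9):
          return n_requests * latency_s + total_bytes / bandwidth_Bps

      # A spatio-temporal decomposition that merges many small reads into a
      # few large ones cuts n_requests, the dominant term for small requests.
      print(estimated_read_time(n_requests=10_000, total_bytes=1e10))  # ~20 s
      print(estimated_read_time(n_requests=16, total_bytes=1e10))      # ~10 s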

  3. A study of rotor and platform design trade-offs for large-scale floating vertical axis wind turbines

    NASA Astrophysics Data System (ADS)

    Griffith, D. Todd; Paquette, Joshua; Barone, Matthew; Goupee, Andrew J.; Fowler, Matthew J.; Bull, Diana; Owens, Brian

    2016-09-01

    Vertical axis wind turbines are receiving significant attention for offshore siting. In general, offshore wind offers proximity to large population centers, a vast & more consistent wind resource, and a scale-up opportunity, to name a few beneficial characteristics. On the other hand, offshore wind suffers from high levelized cost of energy (LCOE) and in particular high balance of system (BoS) costs owing to accessibility challenges and limited project experience. To address these challenges associated with offshore wind, Sandia National Laboratories is researching large-scale (MW class) offshore floating vertical axis wind turbines (VAWTs). The motivation for this work is that floating VAWTs are a potential transformative technology solution to reduce offshore wind LCOE in deep-water locations. This paper explores performance and cost trade-offs within the design space for floating VAWTs between the configurations for the rotor and platform.

  4. Web based visualization of large climate data sets

    USGS Publications Warehouse

    Alder, Jay R.; Hostetler, Steven W.

    2015-01-01

    We have implemented the USGS National Climate Change Viewer (NCCV), which is an easy-to-use web application that displays future projections from global climate models over the United States at the state, county and watershed scales. We incorporate the NASA NEX-DCP30 statistically downscaled temperature and precipitation for 30 global climate models being used in the Fifth Assessment Report (AR5) of the Intergovernmental Panel on Climate Change (IPCC), and hydrologic variables we simulated using a simple water-balance model. Our application summarizes very large, complex data sets at scales relevant to resource managers and citizens and makes climate-change projection information accessible to users of varying skill levels. Tens of terabytes of high-resolution climate and water-balance data are distilled to compact binary format summary files that are used in the application. To alleviate slow response times under high loads, we developed a map caching technique that reduces the time it takes to generate maps by several orders of magnitude. The reduced access time scales to >500 concurrent users. We provide code examples that demonstrate key aspects of data processing, data exporting/importing and the caching technique used in the NCCV.
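
    A minimal sketch of the map-caching idea, memoizing rendered tiles by their request key so repeated requests under load are served from memory, is shown below; render_tile and its argument list are hypothetical stand-ins for the NCCV's actual renderer.

      import functools

      @functools.lru_cache(maxsize=4096)
      def cached_tile(model, scenario, variable, x, y, zoom):
          # First request for a key pays the rendering cost; repeats are free.
          return render_tile(model, scenario, variable, x, y, zoom)

      def render_tile(model, scenario, variable, x, y, zoom):
          # Placeholder for expensive map generation from the binary summary
          # files described in the abstract.
          return b"...tile bytes..."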

  5. Access to a scale and self-weighing habits among public housing residents.

    PubMed

    Bramante, C T; Clark, J M; Gudzune, K A

    2018-05-31

    Having access to a scale is essential for individuals to engage in self-weighing; however, few studies examine scale access, particularly among low-income individuals. Our objectives were to (i) determine how many public housing residents have access to a scale and (ii) describe their self-weighing habits. We conducted a cross-sectional survey of public housing residents in Baltimore, MD, from August 2014 to August 2015. Participants answered questions about their access to a scale ('yes'/'no') and daily self-weighing habits ('no scale/never or hardly ever' vs. 'some/about half/much of the time/always'). We used t-tests or chi-square tests to examine the association of scale access with respondent characteristics. Overall, 266 adults participated (48% response rate). Mean age was 45 years with 86% women, 95% black and 54% with obesity. Only 32% had access to a scale; however, 78% of those with this access reported engaging in some self-weighing. Residents who lacked access to a scale were younger (P = 0.03), and more likely to be unemployed/disabled (P = 0.01) or food insecure (P < 0.01). While few public housing residents have access to a scale, those who do report daily self-weighing with some regularity. Financial hardship may influence scale access in this population, as potential proxies of this status were associated with no scale access. © 2018 World Obesity Federation.

  6. The Department of Defense and the Power of Cloud Computing: Weighing Acceptable Cost Versus Acceptable Risk

    DTIC Science & Technology

    2016-04-01

    the DOD will put DOD systems and data at a risk level comparable to that of their neighbors in the cloud. Just as a user browses a Web page on the...proxy servers for controlling user access to Web pages, and large-scale storage for data management. Each of these devices allows access to the...user to develop applications. Acunetics.com describes Web applications as “computer programs allowing Website visitors to submit and retrieve data

  7. GIGGLE: a search engine for large-scale integrated genome analysis.

    PubMed

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-02-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation.
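
    For readers who want to try the tool, the sketch below drives the giggle command line from Python; the flags are recalled from the project's README and should be verified against https://github.com/ryanlayer/giggle before use.

      import subprocess

      # Build an index over sorted, bgzipped interval files (the glob is passed
      # unexpanded, as giggle expects), then query it with a BED file.
      subprocess.run(["giggle", "index", "-i", "intervals/*.bed.gz",
                      "-o", "giggle_index", "-s"], check=True)
      result = subprocess.run(["giggle", "search", "-i", "giggle_index",
                               "-q", "query.bed.gz", "-s"],
                              check=True, capture_output=True, text=True)
      print(result.stdout)  # per-file overlap counts and enrichment scores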

  8. GIGGLE: a search engine for large-scale integrated genome analysis

    PubMed Central

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-01-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation. PMID:29309061

  9. A fast time-difference inverse solver for 3D EIT with application to lung imaging.

    PubMed

    Javaherian, Ashkan; Soleimani, Manuchehr; Moeller, Knut

    2016-08-01

    A class of sparse optimization techniques that require solely matrix-vector products, rather than explicit access to the forward matrix and its transpose, has received much attention in the recent decade for dealing with large-scale inverse problems. This study tailors application of the so-called Gradient Projection for Sparse Reconstruction (GPSR) to large-scale time-difference three-dimensional electrical impedance tomography (3D EIT). 3D EIT typically suffers from the need for a large number of voxels to cover the whole domain, so its application to real-time imaging, for example monitoring of lung function, remains scarce since the large number of degrees of freedom of the problem greatly increases storage space and reconstruction time. This study shows the great potential of the GPSR for large-size time-difference 3D EIT. Further studies are needed to improve its accuracy for imaging small-size anomalies.
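
    To make the matrix-vector-products-only structure concrete, the sketch below solves the same l1-regularized least-squares problem with ISTA, a proximal-gradient relative of GPSR; it touches the forward operator only through matvec and rmatvec callbacks, and it illustrates the problem class rather than the GPSR projection itself.

      import numpy as np

      def soft_threshold(x, t):
          return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

      def ista(matvec, rmatvec, y, n, tau, step, iters=200):
          # Minimize 0.5*||y - A x||^2 + tau*||x||_1 using only A x and A^T r.
          x = np.zeros(n)
          for _ in range(iters):
              grad = rmatvec(matvec(x) - y)
              x = soft_threshold(x - step * grad, step * tau)
          return x

      # Dense stand-in for the EIT sensitivity operator; real large-scale
      # solvers keep the operator implicit.
      rng = np.random.default_rng(0)
      A = rng.standard_normal((50, 200))
      x_true = np.zeros(200); x_true[[3, 70, 150]] = 1.0
      y = A @ x_true
      x_hat = ista(lambda v: A @ v, lambda r: A.T @ r, y, 200,
                   tau=0.1, step=1.0 / np.linalg.norm(A, 2) ** 2)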

  10. Large-Scale Wireless Temperature Monitoring System for Liquefied Petroleum Gas Storage Tanks

    PubMed Central

    Fan, Guangwen; Shen, Yu; Hao, Xiaowei; Yuan, Zongming; Zhou, Zhi

    2015-01-01

    Temperature distribution is a critical indicator of the health condition for Liquefied Petroleum Gas (LPG) storage tanks. In this paper, we present a large-scale wireless temperature monitoring system to evaluate the safety of LPG storage tanks. The system includes wireless sensors networks, high temperature fiber-optic sensors, and monitoring software. Finally, a case study on real-world LPG storage tanks proves the feasibility of the system. The unique features of wireless transmission, automatic data acquisition and management, local and remote access make the developed system a good alternative for temperature monitoring of LPG storage tanks in practical applications. PMID:26393596

  11. Smallholder farmers' attitudes toward the provision of drinking water for dairy cows in Kagera, Tanzania.

    PubMed

    Forbes, Barbara; Kepe, Thembela

    2015-02-01

    Agriculture's large share of Tanzanian GDP and the large percentage of rural poor engaged in the sector make it a focus for many development projects that see it as an area of attention for reducing rural poverty. This paper uses a case of the Kamachumu community, where a dairy cow loan project was implemented using the heifer-in-trust (HIT) model. This study finds that productivity is limited by how the cows are being managed, particularly with many animals not having ad lib access to drinking water. The paper explores reasons why farmers do or do not provide their cows with unlimited access to drinking water. The study concludes that there are many barriers farmers face, including water accessibility, education and training, infrastructure, simple negligence, and security. These results suggest the need for increased extension services and for national and local livestock policies that consider the specific realities of small-scale dairy farmers.

  12. Evaluation of Targeted Sequencing for Transcriptional Analysis of Archival Formalin-Fixed Paraffin-Embedded (FFPE) Samples

    EPA Science Inventory

    Next-generation sequencing provides unprecedented access to genomic information in archival FFPE tissue samples. However, costs and technical challenges related to RNA isolation and enrichment limit use of whole-genome RNA-sequencing for large-scale studies of FFPE specimens. Rec...

  13. PeanutBase and other bioinformatic resources for peanut

    USDA-ARS?s Scientific Manuscript database

    Large-scale genomic data for peanut have only become available in the last few years, with the advent of low-cost sequencing technologies. To make the data accessible to researchers and to integrate across diverse types of data, the International Peanut Genomics Consortium funded the development of ...

  14. Measurement-Driven Characterization of the Mobile Environment

    ERIC Educational Resources Information Center

    Soroush, Hamed

    2013-01-01

    The concurrent deployment of high-quality wireless networks and large-scale cloud services offers the promise of secure ubiquitous access to a seemingly limitless amount of content. However, as users' expectations have grown more demanding, the performance and connectivity failures endemic to the existing networking infrastructure have become more…

  15. Learning Deep Representations for Ground to Aerial Geolocalization (Open Access)

    DTIC Science & Technology

    2015-10-15

    proposed approach, Where-CNN, is inspired by deep learning success in face verification and achieves significant improvements over traditional hand...crafted features and existing deep features learned from other large-scale databases. We show the effectiveness of Where-CNN in finding matches

  16. Resolving Dynamic Properties of Polymers through Coarse-Grained Computational Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salerno, K. Michael; Agrawal, Anupriya; Perahia, Dvora

    2016-02-05

    Coupled length and time scales determine the dynamic behavior of polymers and underlie their unique viscoelastic properties. To resolve the long-time dynamics it is imperative to determine which time and length scales must be correctly modeled. In this paper, we probe the degree of coarse graining required to simultaneously retain significant atomistic details and access large length and time scales. The degree of coarse graining in turn sets the minimum length scale instrumental in defining polymer properties and dynamics. Using linear polyethylene as a model system, we probe how the coarse-graining scale affects the measured dynamics. Iterative Boltzmann inversion is used to derive coarse-grained potentials with 2–6 methylene groups per coarse-grained bead from a fully atomistic melt simulation. We show that atomistic detail is critical to capturing large-scale dynamics. Finally, using these models we simulate polyethylene melts for times over 500 μs to study the viscoelastic properties of well-entangled polymer melts.
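
    The iterative Boltzmann inversion referred to here updates the tabulated pair potential by the standard rule V_{i+1}(r) = V_i(r) + kT ln(g_i(r) / g_target(r)), iterating until the simulated radial distribution matches the atomistic target. A minimal sketch follows; the damping factor is common practice rather than part of the definition.

      import numpy as np

      def ibi_update(V, g_current, g_target, kT=1.0, damping=0.2):
          # One IBI step: nudge V(r) so the simulated distribution g_current(r)
          # moves toward the target g_target(r) from the atomistic simulation.
          eps = 1e-12  # guard against log(0) in sparsely sampled bins
          return V + damping * kT * np.log((g_current + eps) / (g_target + eps))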

  17. Transmission Technologies and Operational Characteristic Analysis of Hybrid UHV AC/DC Power Grids in China

    NASA Astrophysics Data System (ADS)

    Tian, Zhang; Yanfeng, Gong

    2017-05-01

    To resolve the mismatch between the demand for and the geographic distribution of primary energy resources, Ultra High Voltage (UHV) power grids must be developed rapidly to support large energy bases and the integration of large-scale renewable energy. This paper reviews recent research on AC/DC transmission technologies and summarizes the characteristics of AC/DC power grids, concluding that China's grid is entering a period of large-scale hybrid UHV AC/DC operation in which the "strong DC, weak AC" character becomes increasingly prominent. Potential operational problems of hybrid AC/DC grids are discussed, and the interactions between the AC and DC systems are studied in depth. To address these problems, a preliminary scheme is outlined: strengthening backbone grid structures, advancing AC/DC transmission technologies, improving protection measures for grid integration of clean energy, and taking measures to resolve voltage and frequency stability problems. These steps help hybrid UHV AC/DC grids adapt to the operating modes of large power grids, safeguarding the security and stability of the power system.

  18. Scaling up HIV viral load - lessons from the large-scale implementation of HIV early infant diagnosis and CD4 testing.

    PubMed

    Peter, Trevor; Zeh, Clement; Katz, Zachary; Elbireer, Ali; Alemayehu, Bereket; Vojnov, Lara; Costa, Alex; Doi, Naoko; Jani, Ilesh

    2017-11-01

    The scale-up of effective HIV viral load (VL) testing is an urgent public health priority. Implementation of testing is supported by the availability of accurate, nucleic acid based laboratory and point-of-care (POC) VL technologies and strong WHO guidance recommending routine testing to identify treatment failure. However, test implementation faces challenges related to the developing health systems in many low-resource countries. The purpose of this commentary is to review the challenges and solutions from the large-scale implementation of other diagnostic tests, namely nucleic-acid based early infant HIV diagnosis (EID) and CD4 testing, and identify key lessons to inform the scale-up of VL. Experience with EID and CD4 testing provides many key lessons to inform VL implementation and may enable more effective and rapid scale-up. The primary lessons from earlier implementation efforts are to strengthen linkage to clinical care after testing, and to improve the efficiency of testing. Opportunities to improve linkage include data systems to support the follow-up of patients through the cascade of care and test delivery, rapid sample referral networks, and POC tests. Opportunities to increase testing efficiency include improvements to procurement and supply chain practices, well connected tiered laboratory networks with rational deployment of test capacity across different levels of health services, routine resource mapping and mobilization to ensure adequate resources for testing programs, and improved operational and quality management of testing services. If applied to VL testing programs, these approaches could help improve the impact of VL on ART failure management and patient outcomes, reduce overall costs, help ensure sustainable access to reduced pricing for test commodities, and improve supporting health systems through more efficient and rigorous quality assurance. These lessons draw from traditional laboratory practices as well as fields such as logistics, operations management and business. The lessons and innovations from large-scale EID and CD4 programs described here can be adapted to inform more effective scale-up approaches for VL. They demonstrate the value of an integrated approach to health system strengthening that focuses on key levers for test access such as data systems, supply efficiencies and network management. They also highlight the challenges with implementation and the need for more innovative approaches and effective partnerships to achieve equitable and cost-effective test access. © 2017 The Authors. Journal of the International AIDS Society published by John Wiley & Sons Ltd on behalf of the International AIDS Society.

  19. "Fast Track" and "Traditional Path" Coaches: Affordances, Agency and Social Capital

    ERIC Educational Resources Information Center

    Rynne, Steven

    2014-01-01

    A recent development in large-scale coach accreditation (certification) structures has been the "fast tracking" of former elite athletes. Former elite athletes are often exempted from entry-level qualifications and are generally granted access to fast track courses that are shortened versions of the accreditation courses undertaken by…

  20. Scaling Online Education: Increasing Access to Higher Education

    ERIC Educational Resources Information Center

    Moloney, Jacqueline F.; Oakley, Burks, II

    2010-01-01

    Over the past decade, online courses and entire online degree programs have been made available, serving millions of students in higher education. These online courses largely have been designed and taught using the theoretical concepts and practical strategies of Asynchronous Learning Networks (ALN). During 2003-04, approximately two million…

  1. 61 FR 41385 - Notice of Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    1996-08-08

    ... PRESSURE VESSEL; filed 24 February 1995; patented 21 November 1995.// Patent 5,468,356: LARGE SCALE...,477,482: ULTRA HIGH DENSITY, NON- VOLATILE FERROMAGNETIC RANDOM ACCESS MEMORY; filed 1 October 1993....// Patent 5,483,017: HIGH TEMPERATURE THERMOSETS AND CERAMICS DERIVED FROM LINEAR CARBORANE-(SILOXANE OR...

  2. Web-based Visual Analytics for Extreme Scale Climate Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Evans, Katherine J; Harney, John F

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  3. Initialization, Prediction and Diagnosis of the Rapid Intensification of Tropical Cyclones using the Australian Community Climate and Earth System Simulator, ACCESS

    DTIC Science & Technology

    2012-10-12

    structure on the evolving storm behaviour. 7. Large scale influences on Rapid Intensification and Extratropical Transition: RI and ET...assimilation techniques to better initialize and validate TC structures (including the intense inner core and storm asymmetries) consistent with the large...Without vortex specification, initial conditions usually contain a weak and misplaced circulation. Based on estimates of central pressure and storm size

  4. Evolving bipartite authentication graph partitions

    DOE PAGES

    Pope, Aaron Scott; Tauritz, Daniel Remy; Kent, Alexander D.

    2017-01-16

    As large scale enterprise computer networks become more ubiquitous, finding the appropriate balance between user convenience and user access control is an increasingly challenging proposition. Suboptimal partitioning of users’ access and available services contributes to the vulnerability of enterprise networks. Previous edge-cut partitioning methods unduly restrict users’ access to network resources. This paper introduces a novel method of network partitioning superior to the current state-of-the-art which minimizes user impact by providing alternate avenues for access that reduce vulnerability. Networks are modeled as bipartite authentication access graphs and a multi-objective evolutionary algorithm is used to simultaneously minimize the size of large connected components while minimizing overall restrictions on network users. Lastly, results are presented on a real world data set that demonstrate the effectiveness of the introduced method compared to previous naive methods.
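
    A minimal sketch of scoring one candidate partition under the paper's two objectives is shown below, assuming networkx and toy user/service names: a candidate is a set of user-service authentication edges to cut, and a multi-objective evolutionary algorithm would search over such sets to trade off the two returned values.

      import networkx as nx

      def evaluate(graph, edges_to_cut):
          # Objectives, both to be minimized: size of the largest connected
          # component after cutting, and the number of user accesses removed.
          pruned = graph.copy()
          pruned.remove_edges_from(edges_to_cut)
          largest = max((len(c) for c in nx.connected_components(pruned)),
                        default=0)
          return largest, len(edges_to_cut)

      # Toy bipartite authentication graph: users on one side, services on the
      # other, edges recording which users authenticate to which services.
      G = nx.Graph([("alice", "svcA"), ("alice", "svcB"),
                    ("bob", "svcB"), ("carol", "svcC")])
      print(evaluate(G, [("alice", "svcB")]))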

  5. Evolving bipartite authentication graph partitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, Aaron Scott; Tauritz, Daniel Remy; Kent, Alexander D.

    As large scale enterprise computer networks become more ubiquitous, finding the appropriate balance between user convenience and user access control is an increasingly challenging proposition. Suboptimal partitioning of users’ access and available services contributes to the vulnerability of enterprise networks. Previous edge-cut partitioning methods unduly restrict users’ access to network resources. This paper introduces a novel method of network partitioning superior to the current state-of-the-art which minimizes user impact by providing alternate avenues for access that reduce vulnerability. Networks are modeled as bipartite authentication access graphs and a multi-objective evolutionary algorithm is used to simultaneously minimize the size of large connected components while minimizing overall restrictions on network users. Lastly, results are presented on a real world data set that demonstrate the effectiveness of the introduced method compared to previous naive methods.

  6. Web tools for large-scale 3D biological images and atlases

    PubMed Central

    2012-01-01

    Background Large-scale volumetric biomedical image data of three or more dimensions are a significant challenge for distributed browsing and visualisation. Many images now exceed 10GB which for most users is too large to handle in terms of computer RAM and network bandwidth. This is aggravated when users need to access tens or hundreds of such images from an archive. Here we solve the problem for 2D section views through archive data delivering compressed tiled images enabling users to browse through very-large volume data in the context of a standard web-browser. The system provides an interactive visualisation for grey-level and colour 3D images including multiple image layers and spatial-data overlay. Results The standard Internet Imaging Protocol (IIP) has been extended to enable arbitrary 2D sectioning of 3D data as well as multi-layered images and indexed overlays. The extended protocol is termed IIP3D and we have implemented a matching server to deliver the protocol and a series of Ajax/Javascript client codes that will run in an Internet browser. We have tested the server software on a low-cost linux-based server for image volumes up to 135GB and 64 simultaneous users. The section views are delivered with response times independent of scale and orientation. The exemplar client provided multi-layer image views with user-controlled colour-filtering and overlays. Conclusions Interactive browsing of arbitrary sections through large biomedical-image volumes is made possible by use of an extended internet protocol and efficient server-based image tiling. The tools open the possibility of enabling fast access to large image archives without the requirement of whole image download and client computers with very large memory configurations. The system was demonstrated using a range of medical and biomedical image data extending up to 135GB for a single image volume. PMID:22676296
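
    The sketch below shows what a client request to an IIP-family tile server can look like; FIF, WID, RGN and CVT are standard Internet Imaging Protocol parameters, while the server URL and image path are invented, and the extra sectioning parameters that IIP3D adds are omitted because their exact names are defined by the extended protocol.

      import requests

      params = {
          "FIF": "/data/embryo_volume.tif",  # hypothetical image path
          "WID": 512,                        # requested width in pixels
          "RGN": "0,0,0.5,0.5",              # region: x, y, w, h (relative)
          "CVT": "jpeg",                     # return a compressed JPEG
      }
      r = requests.get("http://example.org/fcgi-bin/iipsrv.fcgi", params=params)
      r.raise_for_status()
      open("section.jpg", "wb").write(r.content)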

  7. Large-scale impacts of herbivores on the structural diversity of African savannas

    PubMed Central

    Asner, Gregory P.; Levick, Shaun R.; Kennedy-Bowdoin, Ty; Knapp, David E.; Emerson, Ruth; Jacobson, James; Colgan, Matthew S.; Martin, Roberta E.

    2009-01-01

    African savannas are undergoing management intensification, and decision makers are increasingly challenged to balance the needs of large herbivore populations with the maintenance of vegetation and ecosystem diversity. Ensuring the sustainability of Africa's natural protected areas requires information on the efficacy of management decisions at large spatial scales, but often neither experimental treatments nor large-scale responses are available for analysis. Using a new airborne remote sensing system, we mapped the three-dimensional (3-D) structure of vegetation at a spatial resolution of 56 cm throughout 1640 ha of savanna after 6-, 22-, 35-, and 41-year exclusions of herbivores, as well as in unprotected areas, across Kruger National Park in South Africa. Areas in which herbivores were excluded over the short term (6 years) contained 38%–80% less bare ground compared with those that were exposed to mammalian herbivory. In the longer-term (> 22 years), the 3-D structure of woody vegetation differed significantly between protected and accessible landscapes, with up to 11-fold greater woody canopy cover in the areas without herbivores. Our maps revealed 2 scales of ecosystem response to herbivore consumption, one broadly mediated by geologic substrate and the other mediated by hillslope-scale variation in soil nutrient availability and moisture conditions. Our results are the first to quantitatively illustrate the extent to which herbivores can affect the 3-D structural diversity of vegetation across large savanna landscapes. PMID:19258457

  8. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    With the coming deluge of genome data, the need to store and process large-scale datasets, provide easy access to biomedical analysis tools, and enable efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  9. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    With the coming deluge of genome data, the need to store and process large-scale datasets, provide easy access to biomedical analysis tools, and enable efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  10. An Objective Approach to Determining the Weight Ranges of Prey Preferred by and Accessible to the Five Large African Carnivores

    PubMed Central

    Clements, Hayley S.; Tambling, Craig J.; Hayward, Matt W.; Kerley, Graham I. H.

    2014-01-01

    Broad-scale models describing predator prey preferences serve as useful departure points for understanding predator-prey interactions at finer scales. Previous analyses used a subjective approach to identify prey weight preferences of the five large African carnivores, hence their accuracy is questionable. This study uses a segmented model of prey weight versus prey preference to objectively quantify the prey weight preferences of the five large African carnivores. Based on simulations of known predator prey preference, for prey species sample sizes above 32 the segmented model approach detects up to four known changes in prey weight preference (represented by model break-points) with high rates of detection (75% to 100% of simulations, depending on number of break-points) and accuracy (within 1.3±4.0 to 2.7±4.4 of known break-point). When applied to the five large African carnivores, using carnivore diet information from across Africa, the model detected weight ranges of prey that are preferred, killed relative to their abundance, and avoided by each carnivore. Prey in the weight ranges preferred and killed relative to their abundance are together termed “accessible prey”. Accessible prey weight ranges were found to be 14–135 kg for cheetah Acinonyx jubatus, 1–45 kg for leopard Panthera pardus, 32–632 kg for lion Panthera leo, 15–1600 kg for spotted hyaena Crocuta crocuta and 10–289 kg for wild dog Lycaon pictus. An assessment of carnivore diets throughout Africa found these accessible prey weight ranges include 88±2% (cheetah), 82±3% (leopard), 81±2% (lion), 97±2% (spotted hyaena) and 96±2% (wild dog) of kills. These descriptions of prey weight preferences therefore contribute to our understanding of the diet spectrum of the five large African carnivores. Where datasets meet the minimum sample size requirements, the segmented model approach provides a means of determining, and comparing, the prey weight range preferences of any carnivore species. PMID:24988433
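
    As an illustration of the segmented-model approach, the sketch below fits a single-break-point piecewise-linear relation between log prey weight and preference with scipy; the paper's procedure handles several break-points and formal detection, and all data values here are invented.

      import numpy as np
      from scipy.optimize import curve_fit

      def one_break(x, b, m1, m2, c):
          # Piecewise-linear model with slopes m1/m2 meeting at break-point b.
          return np.where(x < b, c + m1 * (x - b), c + m2 * (x - b))

      weight_kg = np.array([5, 10, 20, 40, 80, 160, 320, 640.0])
      preference = np.array([-0.8, -0.4, 0.1, 0.6, 0.7, 0.5, -0.2, -0.9])
      x = np.log(weight_kg)
      (b, m1, m2, c), _ = curve_fit(one_break, x, preference,
                                    p0=[x.mean(), 1.0, -1.0, 0.0])
      print("estimated break-point near %.0f kg" % np.exp(b))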

  11. Defining equity in physical access to clinical services using geographical information systems as part of malaria planning and monitoring in Kenya

    PubMed Central

    Noor, A. M.; Zurovac, D.; Hay, S. I.; Ochola, S. A.; Snow, R. W.

    2010-01-01

    Summary Distance is a crucial feature of health service use and yet its application and utility to health care planning have not been well explored, particularly in the light of large-scale international and national efforts such as Roll Back Malaria. We have developed a high-resolution map of population-to-service access in four districts of Kenya. Theoretical physical access, based upon national targets, developed as part of the Kenyan health sector reform agenda, was compared with actual health service usage data among 1668 paediatric patients attending 81 sampled government health facilities. Actual and theoretical use were highly correlated. Patients in the larger districts of Kwale and Makueni, where access to government health facilities was relatively poor, travelled greater mean distances than those in Greater Kisii and Bondo. More than 60% of the patients in the four districts attended health facilities within a 5-km range. Interpolated physical access surfaces across districts highlighted areas of poor access and large differences between urban and rural settings. Users from rural communities travelled greater distances to health facilities than those in urban communities. The implications of planning and monitoring equitable delivery of clinical services at national and international levels are discussed. PMID:14516303

  12. Non-volatile, high density, high speed, Micromagnet-Hall effect Random Access Memory (MHRAM)

    NASA Technical Reports Server (NTRS)

    Wu, Jiin C.; Katti, Romney R.; Stadler, Henry L.

    1991-01-01

    The micromagnetic Hall effect random access memory (MHRAM) has the potential of replacing ROMs, EPROMs, EEPROMs, and SRAMs because of its ability to achieve non-volatility, radiation hardness, high density, and fast access times, simultaneously. Information is stored magnetically in small magnetic elements (micromagnets), allowing unlimited data retention time, unlimited numbers of rewrite cycles, and inherent radiation hardness and SEU immunity, making the MHRAM suitable for ground based as well as spaceflight applications. The MHRAM device design is not affected by areal property fluctuations in the micromagnet, so high operating margins and high yield can be achieved in large scale integrated circuit (IC) fabrication. The MHRAM has short access times (less than 100 nsec). Write access time is short because on-chip transistors are used to gate current quickly, and magnetization reversal in the micromagnet can occur in a matter of a few nanoseconds. Read access time is short because the high electron mobility sensor (InAs or InSb) produces a large signal voltage in response to the fringing magnetic field from the micromagnet. High storage density is achieved since a unit cell consists only of two transistors and one micromagnet Hall effect element. By comparison, a DRAM unit cell has one transistor and one capacitor, and a SRAM unit cell has six transistors.
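
    The read signal of such a cell follows the standard Hall relation V_H = I*B / (n*q*t), which is why high-mobility, low-carrier-density sensors such as InAs or InSb produce large voltages; a small worked example with purely illustrative numbers follows.

      # Hall voltage for sense current I (A), field B (T), carrier density
      # n (1/m^3) and sensor thickness t (m); Q is the electron charge.
      Q = 1.602e-19

      def hall_voltage(current_A, field_T, carrier_density_m3, thickness_m):
          return current_A * field_T / (carrier_density_m3 * Q * thickness_m)

      # Roughly 31 mV for these illustrative values.
      print(hall_voltage(1e-3, 0.05, 1e22, 1e-6))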

  13. Performance model-directed data sieving for high-performance I/O

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yong; Lu, Yin; Amritkar, Prathamesh

    2014-09-10

    Many scientific computing applications and engineering simulations exhibit noncontiguous I/O access patterns. Data sieving is an important technique to improve the performance of noncontiguous I/O accesses by combining small and noncontiguous requests into a large and contiguous request. It has been proven effective even though more data are potentially accessed than demanded. In this study, we propose a new data sieving approach namely performance model-directed data sieving, or PMD data sieving in short. It improves the existing data sieving approach from two aspects: (1) dynamically determines when it is beneficial to perform data sieving; and (2) dynamically determines how to perform data sieving if beneficial. It improves the performance of the existing data sieving approach considerably and reduces the memory consumption as verified by both theoretical analysis and experimental results. Given the importance of supporting noncontiguous accesses effectively and reducing the memory pressure in a large-scale system, the proposed PMD data sieving approach in this research holds a great promise and will have an impact on high-performance I/O systems.
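
    A toy version of the "when to sieve" decision is sketched below: one contiguous read of the covering extent wins whenever the latency saved by issuing fewer requests outweighs the extra, undemanded bytes moved. The cost model and parameters are illustrative, not the performance model from the paper.

      def choose_sieving(reqs, latency_s=1e-3, bandwidth_Bps=1e9):
          # reqs: list of (offset, length) noncontiguous read requests.
          reqs = sorted(reqs)
          demanded = sum(length for _, length in reqs)
          extent = reqs[-1][0] + reqs[-1][1] - reqs[0][0]  # covering span
          t_separate = len(reqs) * latency_s + demanded / bandwidth_Bps
          t_sieved = latency_s + extent / bandwidth_Bps    # one big read
          return "sieve" if t_sieved < t_separate else "separate"

      print(choose_sieving([(0, 4096), (1 << 16, 4096), (2 << 16, 4096)]))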

  14. Flexible services for the support of research.

    PubMed

    Turilli, Matteo; Wallom, David; Williams, Chris; Gough, Steve; Curran, Neal; Tarrant, Richard; Bretherton, Dan; Powell, Andy; Johnson, Matt; Harmer, Terry; Wright, Peter; Gordon, John

    2013-01-28

    Cloud computing has been increasingly adopted by users and providers to promote flexible, scalable and tailored access to computing resources. Nonetheless, the consolidation of this paradigm has uncovered some of its limitations. Initially devised by corporations with direct control over large amounts of computational resources, cloud computing is now being endorsed by organizations with limited resources or with a more articulated, less direct control over these resources. The challenge for these organizations is to leverage the benefits of cloud computing while dealing with limited and often widely distributed computing resources. This study focuses on the adoption of cloud computing by higher education institutions and addresses two main issues: flexible and on-demand access to a large amount of storage resources, and scalability across a heterogeneous set of cloud infrastructures. The proposed solutions leverage a federated approach to cloud resources in which users access multiple and largely independent cloud infrastructures through a highly customizable broker layer. This approach allows for a uniform authentication and authorization infrastructure, a fine-grained policy specification and the aggregation of accounting and monitoring. Within a loosely coupled federation of cloud infrastructures, users can access vast amounts of data without copying them across cloud infrastructures and can scale their resource provisions when the local cloud resources become insufficient.

  15. The Eczema Education Programme: intervention development and model feasibility.

    PubMed

    Jackson, K; Ersser, S J; Dennis, H; Farasat, H; More, A

    2014-07-01

    The systematic support of parents of children with eczema is essential to their effective management; however, we have few models of support. This study examines the rationale, evidence base and development of a large-scale, structured, theory-based, nurse-led intervention, the 'Eczema Education Programme' (EEP), for parents of children with eczema. To outline development of the EEP, model of delivery, determine its feasibility and evaluate this based on service access and parental satisfaction data. Parent-child dyads meeting EEP referral criteria were recruited and demographic information recorded. A questionnaire survey of parental satisfaction was conducted 4 weeks post EEP; parental focus groups at 6 weeks provided comparative qualitative data. Descriptive statistics were derived from the questionnaire data using Predictive Analytics Software (PASW); content analysis was applied to focus group data. A total of 356 parents attended the EEP during the evaluation period. Service access was achieved for those in a challenging population. Both survey data (n = 146 parents, 57%) and focus group data (n = 21) revealed a significant level of parental satisfaction with the programme. It was feasible to provide the EEP as an adjunct to normal clinical care on a large scale, achieving a high level of patient/parent satisfaction and access within an urban area of multiple deprivation and high mobility. The intervention is transferable and the results are generalizable to other ethnically diverse child eczema populations within metropolitan areas in Britain. A multicentre RCT is required to test the effectiveness of this intervention on a larger scale. © 2013 European Academy of Dermatology and Venereology.

  16. Fundamental tests of galaxy formation theory

    NASA Technical Reports Server (NTRS)

    Silk, J.

    1982-01-01

    The structure of the universe is studied as an environment that preserves traces of the seed fluctuations from which galaxies formed. The evolution of the density fluctuation modes that led to the eventual formation of matter inhomogeneities is reviewed, as is how the resulting clumps developed into galaxies and galaxy clusters, acquiring characteristic masses, velocity dispersions, and metallicities. Tests are described that utilize the large scale structure of the universe, including the dynamics of the local supercluster, the large scale matter distribution, and the anisotropy of the cosmic background radiation, to probe the earliest accessible stages of evolution. Finally, the role of particle physics is described with regard to its observable implications for galaxy formation.

  17. Information Power Grid (IPG) Tutorial 2003

    NASA Technical Reports Server (NTRS)

    Meyers, George

    2003-01-01

    For NASA and the general community today, Grid middleware: a) provides tools to access and use data sources (databases, instruments, ...); b) provides tools to access computing resources (unique and generic); and c) enables large-scale collaboration. Dynamically responding to needs is a key selling point of a grid: independent resources can be joined as appropriate to solve a problem. The IPG provides tools for building application frameworks and offers value-added services that let the NASA user base employ resources on the grid in new and more efficient ways.

  18. A distributed parallel storage architecture and its potential application within EOSDIS

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Tierney, Brian; Feuquay, Jay; Butzer, Tony

    1994-01-01

    We describe the architecture, implementation, and use of a scalable, high-performance, distributed-parallel data storage system developed in the ARPA-funded MAGIC gigabit testbed. A collection of wide area distributed disk servers operate in parallel to provide logical block level access to large data sets. Operated primarily as a network-based cache, the architecture supports cooperation among independently owned resources to provide fast, large-scale, on-demand storage to support data handling, simulation, and computation.

  19. Flexible, High-Speed CdSe Nanocrystal Integrated Circuits.

    PubMed

    Stinner, F Scott; Lai, Yuming; Straus, Daniel B; Diroll, Benjamin T; Kim, David K; Murray, Christopher B; Kagan, Cherie R

    2015-10-14

    We report large-area, flexible, high-speed analog and digital colloidal CdSe nanocrystal integrated circuits operating at low voltages. Using photolithography and a newly developed process to fabricate vertical interconnect access holes, we scale down device dimensions, reducing parasitic capacitances and increasing the frequency of circuit operation, and scale up device fabrication over 4 in. flexible substrates. We demonstrate amplifiers with ∼7 kHz bandwidth, ring oscillators with <10 μs stage delays, and NAND and NOR logic gates.

  20. Telehealth and Indian healthcare: moving to scale and sustainability.

    PubMed

    Carroll, Mark; Horton, Mark B

    2013-05-01

    Telehealth innovation has brought important improvements in access to quality healthcare for American Indian and Alaska Native communities. Despite these improvements, substantive work remains before telehealth capability can be more available and sustainable across Indian healthcare. Some of this work will rely on system change guided by new care model development. Such care model development depends on expansion of telehealth reimbursement. The U.S. Indian healthcare system is an ideal framework for implementing and evaluating large-scale change in U.S. telehealth reimbursement policy.

  1. Towards Scalable Deep Learning via I/O Analysis and Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pumma, Sarunya; Si, Min; Feng, Wu-Chun

    Deep learning systems have been growing in prominence as a way to automatically characterize objects, trends, and anomalies. Given the importance of deep learning systems, researchers have been investigating techniques to optimize such systems. An area of particular interest has been using large supercomputing systems to quickly generate effective deep learning networks: a phase often referred to as “training” of the deep learning neural network. As we scale existing deep learning frameworks—such as Caffe—on these large supercomputing systems, we notice that the parallelism can help improve the computation tremendously, leaving data I/O as the major bottleneck limiting the overall system scalability. In this paper, we first present a detailed analysis of the performance bottlenecks of Caffe on large supercomputing systems. Our analysis shows that the I/O subsystem of Caffe—LMDB—relies on memory-mapped I/O to access its database, which can be highly inefficient on large-scale systems because of its interaction with the process scheduling system and the network-based parallel filesystem. Based on this analysis, we then present LMDBIO, our optimized I/O plugin for Caffe that takes into account the data access pattern of Caffe in order to vastly improve I/O performance. Our experimental results show that LMDBIO can improve the overall execution time of Caffe by nearly 20-fold in some cases.
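
    The access-pattern point can be made concrete with the lmdb Python binding: a sequential cursor scan, sketched below with a hypothetical database path, reads pages in order and avoids the random page faults that make naive memory-mapped access expensive on a network-based parallel filesystem.

      import lmdb

      # Open the (hypothetical) training database read-only and stream records
      # in key order rather than faulting pages in at random.
      env = lmdb.open("train_db", readonly=True, lock=False)
      with env.begin() as txn:
          for key, value in txn.cursor():
              pass  # hand each serialized record to the training pipeline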

  2. Panoptes: web-based exploration of large scale genome variation data.

    PubMed

    Vauterin, Paul; Jeffery, Ben; Miles, Alistair; Amato, Roberto; Hart, Lee; Wright, Ian; Kwiatkowski, Dominic

    2017-10-15

    The size and complexity of modern large-scale genome variation studies demand novel approaches for exploring and sharing the data. In order to unlock the potential of these data for a broad audience of scientists with various areas of expertise, a unified exploration framework is required that is accessible, coherent and user-friendly. Panoptes is an open-source software framework for collaborative visual exploration of large-scale genome variation data and associated metadata in a web browser. It relies on technology choices that allow it to operate in near real-time on very large datasets. It can be used to browse rich, hybrid content in a coherent way, and offers interactive visual analytics approaches to assist the exploration. We illustrate its application using genome variation data of Anopheles gambiae, Plasmodium falciparum and Plasmodium vivax. Freely available at https://github.com/cggh/panoptes, under the GNU Affero General Public License. paul.vauterin@gmail.com. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  3. Bringing Abstract Academic Integrity and Ethical Concepts into Real-Life Situations

    ERIC Educational Resources Information Center

    Kwong, Theresa; Wong, Eva; Yue, Kevin

    2017-01-01

    This paper reports the learning analytics on the initial stages of a large-scale, government-funded project which inducts university students in Hong Kong into consideration of academic integrity and ethics through mobile Augmented Reality (AR) learning trails--Trails of Integrity and Ethics (TIEs)--accessed on smart devices. The trails immerse…

  4. Development of a high-throughput SNP resource to advance genomic, genetic and breeding research in carrot (Daucus carota L.)

    USDA-ARS?s Scientific Manuscript database

    The rapid advancement in high-throughput SNP genotyping technologies along with next generation sequencing (NGS) platforms has decreased the cost, improved the quality of large-scale genome surveys, and allowed specialty crops with limited genomic resources such as carrot (Daucus carota) to access t...

  5. Software Tools | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer from large-scale proteogenomic datasets, and to advance them toward precision medicine. Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.

  6. A Survey and Analysis of Access Control Architectures for XML Data

    DTIC Science & Technology

    2006-03-01

    4. XML Query Engines ...castle and the drawbridge over the moat. Extending beyond the visual analogy, there are many key components to the protection of information and...technology. While XML’s original intent was to enable large-scale electronic publishing over the internet, its functionality is firmly rooted in its

  7. Design Research on Personalized Problem Posing in Algebra

    ERIC Educational Resources Information Center

    Walkington, Candace

    2017-01-01

    Algebra is an area of pressing national concern around issues of equity and access in education. Recent theories and research suggest that personalization of instruction can allow students to activate their funds of knowledge and can elicit interest in the content to be learned. This paper examines the results of a large-scale teaching experiment…

  8. Informal Nature Experience on the School Playground

    ERIC Educational Resources Information Center

    Raith, Andreas

    2015-01-01

    In Germany, all-day care and all-day schooling are currently increasing on a large-scale. The extended time children spend in educational institutions could potentially result in limited access to nature experience for children. On the other hand, it could equally create opportunities for informal nature experience if school playgrounds have a…

  9. Aspiration Growth, Talent Development, and Self-Fulfillment in a Context of Democratic Erosion

    ERIC Educational Resources Information Center

    Ambrose, Don

    2005-01-01

    More comprehensive understanding of giftedness and talent growth will be accessible once people explore the large-scale contexts that surround and shape the development of high ability individuals. Many analyses of close-proximity contexts, such as classrooms and schools, currently enrich the gifted education literature. More in-depth explorations…

  10. Network Access to Visual Information: A Study of Costs and Uses.

    ERIC Educational Resources Information Center

    Besser, Howard

    This paper summarizes a subset of the findings of a study of digital image distribution that focused on the Museum Educational Site Licensing (MESL) project--the first large-scale multi-institutional project to explore digital delivery of art images and accompanying text/metadata from disparate sources. This Mellon Foundation-sponsored study…

  11. The Dissemination and Implementation of Evidence-Based Psychological Treatments: A Review of Current Efforts

    ERIC Educational Resources Information Center

    McHugh, R. Kathryn; Barlow, David H.

    2010-01-01

    Recognizing an urgent need for increased access to evidence-based psychological treatments, public health authorities have recently allocated over $2 billion to better disseminate these interventions. In response, implementation of these programs has begun, some of it on a very large scale, with substantial implications for the science and…

  12. Implementation Blueprint and Self-Assessment: Positive Behavioral Interventions and Supports

    ERIC Educational Resources Information Center

    Technical Assistance Center on Positive Behavioral Interventions and Supports, 2010

    2010-01-01

    A "blueprint" is a guide designed to improve large-scale implementations of a specific systems or organizational approach, like School-Wide Positive Behavior Support (SWPBS). This blueprint is intended to make the conceptual theory, organizational models, and practices of SWPBS more accessible for those involved in enhancing how schools,…

  13. Contextual Compression of Large-Scale Wind Turbine Array Simulations: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue, particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while providing the user control over where data loss, and thus reduction in accuracy, in the analysis occurs. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.

  14. Contextual Compression of Large-Scale Wind Turbine Array Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gruchalla, Kenny M; Brunhart-Lupo, Nicholas J; Potter, Kristin C

    Data sizes are becoming a critical issue, particularly for HPC applications. We have developed a user-driven lossy wavelet-based storage model to facilitate the analysis and visualization of large-scale wind turbine array simulations. The model stores data as heterogeneous blocks of wavelet coefficients, providing high-fidelity access to user-defined data regions believed the most salient, while providing lower-fidelity access to less salient regions on a block-by-block basis. In practice, by retaining the wavelet coefficients as a function of feature saliency, we have seen data reductions in excess of 94 percent, while retaining lossless information in the turbine-wake regions most critical to analysis and providing enough (low-fidelity) contextual information in the upper atmosphere to track incoming coherent turbulent structures. Our contextual wavelet compression approach has allowed us to deliver interactive visual analysis while providing the user control over where data loss, and thus reduction in accuracy, in the analysis occurs. We argue this reduced but contextualized representation is a valid approach and encourages contextual data management.
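
    The saliency-driven storage model described in the two records above can be sketched compactly. Below is a minimal illustration, not the authors' implementation: per-block variance stands in for their feature-saliency measure, PyWavelets supplies the transform, and the block size and retention fractions are illustrative assumptions.

        import numpy as np
        import pywt

        def compress_block(block, keep_frac, wavelet="db4", level=2):
            # Transform, zero all but the largest `keep_frac` of coefficients, invert.
            coeffs = pywt.wavedec2(block, wavelet, mode="periodization", level=level)
            arr, slices = pywt.coeffs_to_array(coeffs)
            k = max(1, int(keep_frac * arr.size))
            thresh = np.partition(np.abs(arr).ravel(), -k)[-k]
            arr[np.abs(arr) < thresh] = 0.0
            coeffs = pywt.array_to_coeffs(arr, slices, output_format="wavedec2")
            return pywt.waverec2(coeffs, wavelet, mode="periodization")

        def contextual_compress(field, bs=64, hi_frac=0.5, lo_frac=0.02):
            # Keep many coefficients in "salient" blocks, few elsewhere.
            out = np.empty_like(field)
            var = np.array([[field[i:i+bs, j:j+bs].var()
                             for j in range(0, field.shape[1], bs)]
                            for i in range(0, field.shape[0], bs)])
            cut = np.percentile(var, 75)  # top quartile treated as salient
            for bi, i in enumerate(range(0, field.shape[0], bs)):
                for bj, j in enumerate(range(0, field.shape[1], bs)):
                    frac = hi_frac if var[bi, bj] >= cut else lo_frac
                    out[i:i+bs, j:j+bs] = compress_block(field[i:i+bs, j:j+bs], frac)
            return out

        field = np.random.default_rng(0).standard_normal((256, 256))
        recon = contextual_compress(field)  # lossy, with loss concentrated in low-saliency blocks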

  15. INFN, IT the GENIUS grid portal and the robot certificates to perform phylogenetic analysis on large scale: a success story from the International LIBI project

    NASA Astrophysics Data System (ADS)

    Barbera, Roberto; Donvito, Giacinto; Falzone, Alberto; Rocca, Giuseppe La; Maggi, Giorgio Pietro; Milanesi, Luciano; Vicario, Saverio

    This paper describes the solution proposed by INFN to allow users who do not own a personal digital certificate, and therefore do not belong to any specific Virtual Organization (VO), to access Grid infrastructures via the GENIUS Grid portal enabled with robot certificates. Robot certificates, also known as portal certificates, are associated with a specific application that the user wants to share with the whole Grid community and have recently been introduced by the EUGridPMA (European Policy Management Authority for Grid Authentication) to perform automated tasks on Grids on behalf of users. They have proven extremely useful for automating grid service monitoring, data processing production, distributed data collection systems, etc. In this paper, robot certificates have been used to allow bioinformaticians involved in the Italian LIBI project to perform large-scale phylogenetic analyses. The distributed environment set up in this work strongly simplifies grid access for occasional users and represents a valuable step forward in widening the community of users.

  16. The Plant Genome Integrative Explorer Resource: PlantGenIE.org.

    PubMed

    Sundell, David; Mannapperuma, Chanaka; Netotea, Sergiu; Delhomme, Nicolas; Lin, Yao-Cheng; Sjödin, Andreas; Van de Peer, Yves; Jansson, Stefan; Hvidsten, Torgeir R; Street, Nathaniel R

    2015-12-01

    Accessing and exploring large-scale genomics data sets remains a significant challenge to researchers without specialist bioinformatics training. We present the integrated PlantGenIE.org platform for exploration of Populus, conifer and Arabidopsis genomics data, which includes expression networks and associated visualization tools. Standard features of a model organism database are provided, including genome browsers, gene list annotation, Blast homology searches and gene information pages. Community annotation updating is supported via integration of WebApollo. We have produced an RNA-sequencing (RNA-Seq) expression atlas for Populus tremula and have integrated these data within the expression tools. An updated version of the ComPlEx resource for performing comparative plant expression analyses of gene coexpression network conservation between species has also been integrated. The PlantGenIE.org platform provides intuitive access to large-scale and genome-wide genomics data from model forest tree species, facilitating both community contributions to annotation improvement and tools supporting use of the included data resources to inform biological insight. © 2015 The Authors. New Phytologist © 2015 New Phytologist Trust.

  17. Resonant soft X-ray scattering for polymer materials

    DOE PAGES

    Liu, Feng; Brady, Michael A.; Wang, Cheng

    2016-04-16

    Resonant Soft X-ray Scattering (RSoXS) was developed within the last few years, and the first dedicated resonant soft X-ray scattering beamline for soft materials was constructed at the Advanced Light Source, LBNL. RSoXS combines soft X-ray spectroscopy with X-ray scattering and thus offers statistical information for 3D chemical morphology over a large length scale range from nanometers to micrometers. Using RSoXS to characterize multi-length scale soft materials with heterogeneous chemical structures, we have demonstrated that soft X-ray scattering is a unique complementary technique to conventional hard X-ray and neutron scattering. Its unique chemical sensitivity, large accessible size scale, molecular bond orientation sensitivity with polarized X-rays, and high coherence have shown great potential for chemically specific structural characterization for many classes of materials.

  18. Inquiry-Based Educational Design for Large-Scale High School Astronomy Projects Using Real Telescopes

    NASA Astrophysics Data System (ADS)

    Fitzgerald, Michael; McKinnon, David H.; Danaia, Lena

    2015-12-01

    In this paper, we outline the theory behind the educational design used to implement a large-scale high school astronomy education project. This design was created in response to the realization of ineffective educational design in the initial early stages of the project. The new design follows an iterative improvement model in which the materials and general approach can evolve in response to solicited feedback. The improvement cycle concentrates on avoiding overly positive self-evaluation, addresses relevant external school and community factors, and focuses on backward mapping from clearly set goals. Limiting factors, including time, resources, support, and the potential for failure in the classroom, are dealt with as much as possible in the large-scale design, allowing teachers the best chance of successful implementation in their real-world classrooms. The actual approach adopted following the principles of this design, which has seen success in bringing real astronomical data and access to telescopes into the high school classroom, is also outlined.

  19. XLinkDB 2.0: integrated, large-scale structural analysis of protein crosslinking data

    PubMed Central

    Schweppe, Devin K.; Zheng, Chunxiang; Chavez, Juan D.; Navare, Arti T.; Wu, Xia; Eng, Jimmy K.; Bruce, James E.

    2016-01-01

    Motivation: Large-scale chemical cross-linking with mass spectrometry (XL-MS) analyses are quickly becoming a powerful means for high-throughput determination of protein structural information and protein–protein interactions. Recent studies have garnered thousands of cross-linked interactions, yet the field lacks an effective tool to compile experimental data or access the network and structural knowledge for these large scale analyses. We present XLinkDB 2.0 which integrates tools for network analysis, Protein Databank queries, modeling of predicted protein structures and modeling of docked protein structures. The novel, integrated approach of XLinkDB 2.0 enables the holistic analysis of XL-MS protein interaction data without limitation to the cross-linker or analytical system used for the analysis. Availability and Implementation: XLinkDB 2.0 can be found here, including documentation and help: http://xlinkdb.gs.washington.edu/. Contact: jimbruce@uw.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153666

  20. Comparative transcriptomics with self-organizing map reveals cryptic photosynthetic differences between two accessions of North American Lake cress.

    PubMed

    Nakayama, Hokuto; Sakamoto, Tomoaki; Okegawa, Yuki; Kaminoyama, Kaori; Fujie, Manabu; Ichihashi, Yasunori; Kurata, Tetsuya; Motohashi, Ken; Al-Shehbaz, Ihsan; Sinha, Neelima; Kimura, Seisuke

    2018-02-19

    Because natural variation in wild species is likely the result of local adaptation, it provides a valuable resource for understanding plant-environmental interactions. Rorippa aquatica (Brassicaceae) is a semi-aquatic North American plant with morphological differences between several accessions, but little information available on any physiological differences. Here, we surveyed the transcriptomes of two R. aquatica accessions and identified cryptic physiological differences between them. We first reconstructed a Rorippa phylogeny to confirm relationships between the accessions. We performed large-scale RNA-seq and de novo assembly; the resulting 87,754 unigenes were then annotated via comparisons to different databases. Between-accession physiological variation was identified with transcriptomes from both accessions. Transcriptome data were analyzed with principal component analysis and self-organizing map. Results of analyses suggested that photosynthetic capability differs between the accessions. Indeed, physiological experiments revealed between-accession variation in electron transport rate and the redox state of the plastoquinone pool. These results indicated that one accession may have adapted to differences in temperature or length of the growing season.

  1. Improving data workflow systems with cloud services and use of open data for bioinformatics research.

    PubMed

    Karim, Md Rezaul; Michel, Audrey; Zappa, Achille; Baranov, Pavel; Sahay, Ratnesh; Rebholz-Schuhmann, Dietrich

    2017-04-16

    Data workflow systems (DWFSs) enable bioinformatics researchers to combine components for data access and data analytics, and to share the final data analytics approach with their collaborators. Increasingly, such systems have to cope with large-scale data, such as full genomes (about 200 GB each), public fact repositories (about 100 TB of data) and 3D imaging data at even larger scales. As moving the data becomes cumbersome, the DWFS needs to embed its processes into a cloud infrastructure, where the data are already hosted. As the standardized public data play an increasingly important role, the DWFS needs to comply with Semantic Web technologies. This advancement to DWFS would reduce overhead costs and accelerate the progress in bioinformatics research based on large-scale data and public resources, as researchers would require less specialized IT knowledge for the implementation. Furthermore, the high data growth rates in bioinformatics research drive the demand for parallel and distributed computing, which then imposes a need for scalability and high-throughput capabilities onto the DWFS. As a result, requirements for data sharing and access to public knowledge bases suggest that compliance of the DWFS with Semantic Web standards is necessary. In this article, we will analyze the existing DWFS with regard to their capabilities toward public open data use as well as large-scale computational and human interface requirements. We untangle the parameters for selecting a preferable solution for bioinformatics research with particular consideration to using cloud services and Semantic Web technologies. Our analysis leads to research guidelines and recommendations toward the development of future DWFS for the bioinformatics research community. © The Author 2017. Published by Oxford University Press.

  2. A spatial picture of the synthetic large-scale motion from dynamic roughness

    NASA Astrophysics Data System (ADS)

    Huynh, David; McKeon, Beverley

    2017-11-01

    Jacobi and McKeon (2011) set up a dynamic roughness apparatus to excite a synthetic, travelling wave-like disturbance in a wind-tunnel boundary layer study. In the present work, this dynamic roughness has been adapted for a flat-plate turbulent boundary layer experiment in a water tunnel. A key advantage of operating in water as opposed to air is the longer flow timescales. This makes accessible higher non-dimensional actuation frequencies and correspondingly shorter synthetic length scales, and is thus more amenable to particle image velocimetry. As a result, this experiment provides a novel spatial picture of the synthetic mode, the coupled small scales, and their streamwise development. It is demonstrated that varying the roughness actuation frequency allows for significant tuning of the streamwise wavelength of the synthetic mode, with a range of 3δ-13δ being achieved. Employing a phase-locked decomposition, spatial snapshots are constructed of the synthetic large scale and used to analyze its streamwise behavior. Direct spatial filtering is used to separate the synthetic large scale and the related small scales, and the results are compared to those obtained by temporal filtering that invokes Taylor's hypothesis. The support of AFOSR (Grant # FA9550-16-1-0361) is gratefully acknowledged.

  3. Tethys – A Python Package for Spatial and Temporal Downscaling of Global Water Withdrawals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.

    Downscaling of water withdrawals from regional/national to local scale is a fundamental step, and also a common problem, when integrating large-scale economic and integrated assessment models with high-resolution, detailed sectoral models. Tethys, open-access software written in Python, implements statistical downscaling algorithms to downscale water withdrawal data spatially and temporally to a finer scale. The spatial resolution is downscaled from region/basin scale to grid (0.5 geographic degree) scale, and the temporal resolution from year to month. Tethys is used to produce monthly global gridded water withdrawal products based on estimates from the Global Change Assessment Model (GCAM).

  4. Tethys – A Python Package for Spatial and Temporal Downscaling of Global Water Withdrawals

    DOE PAGES

    Li, Xinya; Vernon, Chris R.; Hejazi, Mohamad I.; ...

    2018-02-09

    Downscaling of water withdrawals from regional/national to local scale is a fundamental step, and also a common problem, when integrating large-scale economic and integrated assessment models with high-resolution, detailed sectoral models. Tethys, open-access software written in Python, implements statistical downscaling algorithms to downscale water withdrawal data spatially and temporally to a finer scale. The spatial resolution is downscaled from region/basin scale to grid (0.5 geographic degree) scale, and the temporal resolution from year to month. Tethys is used to produce monthly global gridded water withdrawal products based on estimates from the Global Change Assessment Model (GCAM).
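
    The core downscaling idea in the two Tethys records above is proportional allocation. The sketch below is a simplified illustration under stated assumptions (population-style weights for the spatial step, a fixed normalized profile for the temporal step); Tethys's actual sector-specific algorithms are more elaborate, and all inputs here are hypothetical.

        import numpy as np

        def downscale(annual_by_region, region_of_cell, weight_of_cell, monthly_profile):
            # Spatial step: spread each region's annual withdrawal over its grid
            # cells in proportion to a weight (e.g., population).
            cells = np.zeros(weight_of_cell.shape, dtype=float)
            for region, total in annual_by_region.items():
                mask = region_of_cell == region
                w = weight_of_cell[mask]
                cells[mask] = total * w / w.sum()
            # Temporal step: split each cell's annual value across 12 months.
            profile = monthly_profile / monthly_profile.sum()
            return cells[:, None] * profile[None, :]  # shape: (n_cells, 12)

        # Hypothetical toy inputs: six 0.5-degree cells in two regions.
        region_of_cell = np.array([0, 0, 0, 1, 1, 1])
        weight_of_cell = np.array([10., 30., 60., 20., 20., 60.])
        annual = {0: 100.0, 1: 50.0}   # annual totals per region, illustrative
        profile = np.array([1, 1, 1, 2, 3, 4, 4, 3, 2, 1, 1, 1], dtype=float)
        monthly = downscale(annual, region_of_cell, weight_of_cell, profile)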

  5. Large Scale Deformation of the Western U.S. Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2002-01-01

    Over the past couple of years, with support from NASA, we used a large collection of data from GPS, VLBI, SLR, and DORIS networks which span the Western U.S. Cordillera (WUSC) to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work was roughly divided into an analysis of these space geodetic observations to infer the deformation field across and within the entire plate boundary zone, and an investigation of the implications of this deformation field regarding plate boundary dynamics. Following the determination of the first generation WUSC velocity solution, we placed high priority on the dissemination of the velocity estimates. With in-kind support from the Smithsonian Astrophysical Observatory, we constructed a web-site which allows anyone to access the data, and to determine their own velocity reference frame.

  6. Large Scale Deformation of the Western U.S. Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2002-01-01

    Over the past couple of years, with support from NASA, we used a large collection of data from GPS, VLBI, SLR, and DORIS networks which span the Western U.S. Cordillera (WUSC) to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work was roughly divided into an analysis of these space geodetic observations to infer the deformation field across and within the entire plate boundary zone, and an investigation of the implications of this deformation field regarding plate boundary dynamics. Following the determination of the first generation WUSC velocity solution, we placed high priority on the dissemination of the velocity estimates. With in-kind support from the Smithsonian Astrophysical Observatory, we constructed a web-site which allows anyone to access the data, and to determine their own velocity reference frame.

  7. Large scale preparation of high mannose and paucimannose N-glycans from soybean proteins by oxidative release of natural glycans (ORNG).

    PubMed

    Zhu, Yuyang; Yan, Maomao; Lasanajak, Yi; Smith, David F; Song, Xuezheng

    2018-07-15

    Despite the important advances in chemical and chemoenzymatic synthesis of glycans, access to large quantities of complex natural glycans remains a major impediment to progress in Glycoscience. Here we report a large-scale preparation of N-glycans from a kilogram of commercial soy proteins using oxidative release of natural glycans (ORNG). The high mannose and paucimannose N-glycans were labeled with a fluorescent tag and purified by size exclusion and multidimensional preparative HPLC. Side products are identified and potential mechanisms for the oxidative release of natural N-glycans from glycoproteins are proposed. This study demonstrates the potential for using the ORNG approach as a complementary route to synthetic approaches for the preparation of multi-milligram quantities of biomedically relevant complex glycans. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Hierarchical coarse-graining strategy for protein-membrane systems to access mesoscopic scales

    PubMed Central

    Ayton, Gary S.; Lyman, Edward

    2014-01-01

    An overall multiscale simulation strategy for large scale coarse-grain simulations of membrane protein systems is presented. The protein is modeled as a heterogeneous elastic network, while the lipids are modeled using the hybrid analytic-systematic (HAS) methodology, where in both cases atomistic level information obtained from molecular dynamics simulation is used to parameterize the model. A feature of this approach is that from the outset liposome length scales are employed in the simulation (i.e., on the order of ½ a million lipids plus protein). A route to develop highly coarse-grained models from molecular-scale information is proposed and results for N-BAR domain protein remodeling of a liposome are presented. PMID:20158037

  9. Field of genes: using Apache Kafka as a bioinformatic data repository.

    PubMed

    Lawlor, Brendan; Lynch, Richard; Mac Aogáin, Micheál; Walsh, Paul

    2018-04-01

    Bioinformatic research is increasingly dependent on large-scale datasets, accessed either from private or public repositories. An example of a public repository is National Center for Biotechnology Information's (NCBI's) Reference Sequence (RefSeq). These repositories must decide in what form to make their data available. Unstructured data can be put to almost any use but are limited in how access to them can be scaled. Highly structured data offer improved performance for specific algorithms but limit the wider usefulness of the data. We present an alternative: lightly structured data stored in Apache Kafka in a way that is amenable to parallel access and streamed processing, including subsequent transformations into more highly structured representations. We contend that this approach could provide a flexible and powerful nexus of bioinformatic data, bridging the gap between low structure on one hand, and high performance and scale on the other. To demonstrate this, we present a proof-of-concept version of NCBI's RefSeq database using this technology. We measure the performance and scalability characteristics of this alternative with respect to flat files. The proof of concept scales almost linearly as more compute nodes are added, outperforming the standard approach using files. Apache Kafka merits consideration as a fast and more scalable but general-purpose way to store and retrieve bioinformatic data, for public, centralized reference datasets such as RefSeq and for private clinical and experimental data.
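
    The record above stores sequence records as a keyed Kafka stream. A minimal sketch of that pattern with the kafka-python client follows; it assumes a broker at localhost:9092, and the topic name, keying scheme, and record format are illustrative rather than the authors' actual layout.

        from kafka import KafkaProducer, KafkaConsumer

        producer = KafkaProducer(bootstrap_servers="localhost:9092")
        records = [("NC_000913.3", "ATGAAACGCATTAGCACCACC"),
                   ("NC_012920.1", "GATCACAGGTCTATCACCCTA")]
        for accession, seq in records:
            # Keying by accession keeps a sequence's fragments together and
            # lets consumer groups partition the stream for parallel work.
            producer.send("refseq", key=accession.encode(), value=seq.encode())
        producer.flush()

        consumer = KafkaConsumer("refseq",
                                 bootstrap_servers="localhost:9092",
                                 auto_offset_reset="earliest",
                                 consumer_timeout_ms=5000)
        for msg in consumer:
            print(msg.key.decode(), len(msg.value), "bp")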

  10. The Climate-G testbed: towards a large scale data sharing environment for climate change

    NASA Astrophysics Data System (ADS)

    Aloisio, G.; Fiore, S.; Denvil, S.; Petitdidier, M.; Fox, P.; Schwichtenberg, H.; Blower, J.; Barbera, R.

    2009-04-01

    The Climate-G testbed provides an experimental large scale data environment for climate change addressing challenging data and metadata management issues. The main scope of Climate-G is to allow scientists to carry out geographical and cross-institutional climate data discovery, access, visualization and sharing. Climate-G is a multidisciplinary collaboration involving both climate and computer scientists and it currently involves several partners such as: Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), Institut Pierre-Simon Laplace (IPSL), Fraunhofer Institut für Algorithmen und Wissenschaftliches Rechnen (SCAI), National Center for Atmospheric Research (NCAR), University of Reading, University of Catania and University of Salento. To perform distributed metadata search and discovery, we adopted a CMCC metadata solution (which provides a high level of scalability, transparency, fault tolerance and autonomy) leveraging both on P2P and grid technologies (GRelC Data Access and Integration Service). Moreover, data are available through OPeNDAP/THREDDS services, Live Access Server as well as the OGC compliant Web Map Service and they can be downloaded, visualized, accessed into the proposed environment through the Climate-G Data Distribution Centre (DDC), the web gateway to the Climate-G digital library. The DDC is a data-grid portal allowing users to easily, securely and transparently perform search/discovery, metadata management, data access, data visualization, etc. Godiva2 (integrated into the DDC) displays 2D maps (and animations) and also exports maps for display on the Google Earth virtual globe. Presently, Climate-G publishes (through the DDC) about 2TB of data related to the ENSEMBLES project (also including distributed replicas of data) as well as to the IPCC AR4. The main results of the proposed work are: wide data access/sharing environment for climate change; P2P/grid metadata approach; production-level Climate-G DDC; high quality tools for data visualization; metadata search/discovery across several countries/institutions; open environment for climate change data sharing.
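
    Since the record above exposes data through OPeNDAP/THREDDS endpoints, client-side access can be illustrated briefly. The sketch below uses xarray's lazy OPeNDAP support; the URL and variable name are hypothetical placeholders, not actual Climate-G endpoints.

        import xarray as xr

        # Hypothetical THREDDS/OPeNDAP endpoint (placeholder URL).
        url = "http://example.org/thredds/dodsC/ensembles/tas_monthly.nc"
        ds = xr.open_dataset(url)          # lazy: only metadata is fetched here
        subset = ds["tas"].sel(time=slice("1990-01-01", "1999-12-31"))
        decade_mean = subset.mean("time")  # transfers only the selected slice
        print(decade_mean.shape)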

  11. Large scale land acquisitions and REDD+: a synthesis of conflicts and opportunities

    NASA Astrophysics Data System (ADS)

    Carter, Sarah; Manceur, Ameur M.; Seppelt, Ralf; Hermans-Neumann, Kathleen; Herold, Martin; Verchot, Lou

    2017-03-01

    Large-scale land acquisitions (LSLA) and Reducing Emissions from Deforestation and forest Degradation (REDD+) are both land-based phenomena which, when occurring in the same area, can compete with each other for land. A quantitative analysis of country characteristics revealed that land available for agriculture, accessibility, and political stability are key explanatory factors for a country being targeted for LSLA. Surprisingly, LSLA occur in countries with lower accessibility. Countries with good land availability, poor accessibility, and political stability may become future targets if they do not already have LSLA. Countries with high levels of agriculture-driven deforestation and LSLA should develop interventions which reduce forest loss driven either directly or indirectly by LSLA as part of their REDD+ strategies. Both host-country and investor-side policies have been identified which could be used more widely to reduce conflicts between LSLA and REDD+. Findings from this research highlight the need for, and can inform the development of, national and international policies on land acquisitions, including green acquisitions such as REDD+. Land management must be considered with all its objectives—including food security, biodiversity conservation, and climate change mitigation—in a coherent strategy which engages relevant stakeholders. This is not currently occurring and might be a key ingredient in achieving the targets under Sustainable Development Goals 2, 15, and 16 (related to food security and sustainable agriculture, and the protection of forests), among others.

  12. Data Portal for the Library of Integrated Network-based Cellular Signatures (LINCS) program: integrated access to diverse large-scale cellular perturbation response data

    PubMed Central

    Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay

    2018-01-01

    The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to, LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, LINCS Data Registry (LDR). LINCS data served on the LDP contains extensive metadata and curated annotations. We highlight the features of the LDP user interface that are designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462

  13. The role of advanced reactive surface area characterization in improving predictions of mineral reaction rates

    NASA Astrophysics Data System (ADS)

    Beckingham, L. E.; Zhang, S.; Mitnick, E.; Cole, D. R.; Yang, L.; Anovitz, L. M.; Sheets, J.; Swift, A.; Kneafsey, T. J.; Landrot, G.; Mito, S.; Xue, Z.; Steefel, C. I.; DePaolo, D. J.; Ajo Franklin, J. B.

    2014-12-01

    Geologic sequestration of CO2 in deep sedimentary formations is a promising means of mitigating carbon emissions from coal-fired power plants but the long-term fate of injected CO2 is challenging to predict. Reactive transport models are used to gain insight over long times but rely on laboratory determined mineral reaction rates that have been difficult to extrapolate to field systems. This, in part, is due to a lack of understanding of mineral reactive surface area. Many models use an arbitrary approximation of reactive surface area, applying orders of magnitude scaling factors to measured BET or geometric surface areas. Recently, a few more sophisticated approaches have used 2D and 3D image analyses to determine mineral-specific reactive surface areas that account for the accessibility of minerals. However, the ability of these advanced surface area estimates to improve predictions of mineral reaction rates has yet to be determined. In this study, we fuse X-ray microCT, SEM QEMSCAN, XRD, SANS, and SEM-FIB analysis to determine mineral-specific accessible reactive surface areas for a core sample from the Nagaoka pilot CO2 injection site (Japan). This sample is primarily quartz, plagioclase, smectite, K-feldspar, and pyroxene. SEM imaging shows abundant smectite cement and grain coatings that decrease the fluid accessibility of other minerals. However, analysis of FIB-SEM images reveals that smectite nano-pores are well connected such that access to underlying minerals is not occluded by smectite coatings. Mineral-specific accessible surfaces are determined, accounting for the connectivity of the pore space with and without connected smectite nano-pores. The large-scale impact of variations in accessibility and dissolution rates are then determined through continuum scale modeling using grid-cell specific information on accessible surface areas. This approach will be compared with a traditional continuum scale model using mineral abundances and common surface area estimates. Ultimately, the effectiveness of advanced surface area characterization to improve mineral dissolution rates will be evaluated by comparison of model results with dissolution rates measured from a flow-through column experiment.

  14. A contextual role-based access control authorization model for electronic patient record.

    PubMed

    Motta, Gustavo H M B; Furuie, Sergio S

    2003-09-01

    The design of proper models for authorization and access control for the electronic patient record (EPR) is essential to wide-scale use of the EPR in large health organizations. In this paper, we propose a contextual role-based access control authorization model aiming to increase patient privacy and the confidentiality of patient data, while being flexible enough to consider specific cases. This model regulates users' access to the EPR based on organizational roles. It supports a role-tree hierarchy with authorization inheritance; positive and negative authorizations; and static and dynamic separation of duties based on weak and strong role conflicts. Contextual authorizations use environmental information available at access time, such as the user/patient relationship, to decide whether a user is allowed to access an EPR resource. This enables the specification of a more flexible and precise authorization policy, where permission is granted or denied according to the right and the need of the user to carry out a particular job function.
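
    The main ingredients of the model described above (a role hierarchy with inheritance, negative authorizations that override positive ones, and a contextual condition evaluated at access time) can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation; all role names and rules are hypothetical.

        from dataclasses import dataclass, field

        ROLE_PARENT = {"intern": "physician", "physician": "staff", "nurse": "staff"}

        def expand_roles(role):
            # Walk up the role tree so authorizations are inherited.
            while role:
                yield role
                role = ROLE_PARENT.get(role)

        @dataclass
        class Policy:
            allow: set = field(default_factory=set)  # (role, resource) grants
            deny: set = field(default_factory=set)   # negative authorizations win

            def check(self, user_roles, resource, context):
                roles = {r for ur in user_roles for r in expand_roles(ur)}
                if any((r, resource) in self.deny for r in roles):
                    return False
                if not any((r, resource) in self.allow for r in roles):
                    return False
                # Contextual condition, e.g., a treating-physician relationship.
                return context.get("treating_relationship", False)

        policy = Policy(allow={("physician", "epr:read")},
                        deny={("intern", "epr:read")})
        print(policy.check({"physician"}, "epr:read", {"treating_relationship": True}))  # True
        print(policy.check({"intern"}, "epr:read", {"treating_relationship": True}))     # False (deny wins)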

  15. Gender Perspectives on Spatial Tasks in a National Assessment: A Secondary Data Analysis

    ERIC Educational Resources Information Center

    Logan, Tracy; Lowrie, Tom

    2017-01-01

    Most large-scale summative assessments present results in terms of cumulative scores. Although such descriptions can provide insights into general trends over time, they do not provide detail of how students solved the tasks. Less restrictive access to raw data from these summative assessments has occurred in recent years, resulting in…

  16. Technical Assessment: Integrated Photonics

    DTIC Science & Technology

    2015-10-01

    in global internet protocol traffic as a function of time by local access technology. Photonics continues to play a critical role in enabling this...communication networks. This has enabled services like the internet, high performance computing, and power-efficient large-scale data centers. The...signal processing, quantum information science, and optics for free space applications. However major obstacles challenge the implementation of

  17. Estimates of Down Woody Materials in Eastern US Forests

    Treesearch

    David C. Chojnacky; Robert A. Mickler; Linda S. Heath; Christopher W. Woodall

    2004-01-01

    Down woody materials (DWMs) are an important part of forest ecosystems for wildlife habitat, carbon storage, structural diversity, wildfire hazard, and other large-scale ecosystem processes. To better manage forests for DWMs, available and easily accessible data on DWM components are needed. We examined data on DWMs, collected in 2001 by the US Department of...

  18. The State of the Gate: A Description of Instructional Practice in Algebra in Five Urban Districts

    ERIC Educational Resources Information Center

    Litke, Erica G.

    2015-01-01

    Algebra is considered a linchpin for success in secondary mathematics, serving as a gatekeeper to higher-level courses. Access to algebra is also considered an important lever for educational equity. Yet despite its prominence, large-scale examinations of algebra instruction are rare. In my dissertation, I endeavor to better understand what…

  19. Gender Implications in Curriculum and Entrance Exam Grouping: Institutional Factors and Their Effects

    ERIC Educational Resources Information Center

    Hsaieh, Hsiao-Chin; Yang, Chia-Ling

    2014-01-01

    While access to higher education has reached gender parity in Taiwan, the phenomena of gender segregation and stratification by fields of study and by division of labor persist. In this article, we trace the historical evolution of Taiwan's education system and use data from large-scale educational databases to analyze the association of…

  20. Modeling large-scale winter recreation terrain selection with implications for recreation management and wildlife

    Treesearch

    Lucretia E. Olson; John R. Squires; Elizabeth K. Roberts; Aubrey D. Miller; Jacob S. Ivan; Mark Hebblewhite

    2017-01-01

    Winter recreation is a rapidly growing activity, and advances in technology make it possible for increasing numbers of people to access remote backcountry terrain. Increased winter recreation may lead to more frequent conflict between recreationists, as well as greater potential disturbance to wildlife. To better understand the environmental characteristics favored by...

  1. Preservation and Access to Manuscript Collections of the Czech National Library.

    ERIC Educational Resources Information Center

    Karen, Vladimir; Psohlavec, Stanislav

    In 1996, the Czech National Library started a large-scale digitization of its extensive and invaluable collection of historical manuscripts and printed books. Each page of the selected documents is scanned using a high-resolution, full-color digital camera, processed, and archived on a CD-ROM disk. An HTML-coded description is added to the entire…

  2. Building Capacity for Assessment in PISA for Development Countries. PISA for Development Brief 14

    ERIC Educational Resources Information Center

    OECD Publishing, 2017

    2017-01-01

    This article explains how the PISA for Development (PISA-D) initiative aims to make PISA more accessible to middle- and low-income countries. A key component of PISA-D is building capacity in the participating countries for managing large-scale student learning assessments and using the results to support…

  3. Multiresolution persistent homology for excessively large biomolecular datasets

    NASA Astrophysics Data System (ADS)

    Xia, Kelin; Zhao, Zhixiong; Wei, Guo-Wei

    2015-10-01

    Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large scale datasets with appropriate resolution. We utilize flexibility-rigidity index to access the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed which would otherwise be inaccessible to the normal point cloud method and unreliable by using coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to the protein domain classification, which is the first time that persistent homology is used for practical protein domain analysis, to our knowledge. The proposed multiresolution topological method has potential applications in arbitrary data sets, such as social networks, biological networks, and graphs.
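
    The resolution-tuning step in the record above can be made concrete. Below is a minimal sketch of a rigidity-density field built from Gaussian kernels, where the kernel width eta sets the scale of interest (a persistent homology filtration would then be run on sublevel sets of this density); data and parameters are illustrative.

        import numpy as np

        def rigidity_density(grid_points, atoms, eta):
            # mu(x) = sum_j exp(-||x - x_j||^2 / eta^2); a larger eta blurs out
            # atomic detail so the filtration sees only larger-scale topology.
            d2 = ((grid_points[:, None, :] - atoms[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / eta**2).sum(axis=1)

        rng = np.random.default_rng(1)
        atoms = rng.uniform(0, 10, size=(200, 3))        # stand-in point cloud
        xs = np.linspace(0, 10, 20)
        grid = np.stack(np.meshgrid(xs, xs, xs, indexing="ij"), -1).reshape(-1, 3)
        fine = rigidity_density(grid, atoms, eta=0.5)    # resolves individual atoms
        coarse = rigidity_density(grid, atoms, eta=3.0)  # retains only coarse features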

  4. Designing and developing portable large-scale JavaScript web applications within the Experiment Dashboard framework

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Dzhunov, I.; Karavakis, E.; Kokoszkiewicz, L.; Nowotka, M.; Saiz, P.; Tuckett, D.

    2012-12-01

    Improvements in web browser performance and web standards compliance, as well as the availability of comprehensive JavaScript libraries, provides an opportunity to develop functionally rich yet intuitive web applications that allow users to access, render and analyse data in novel ways. However, the development of such large-scale JavaScript web applications presents new challenges, in particular with regard to code sustainability and team-based work. We present an approach that meets the challenges of large-scale JavaScript web application design and development, including client-side model-view-controller architecture, design patterns, and JavaScript libraries. Furthermore, we show how the approach leads naturally to the encapsulation of the data source as a web API, allowing applications to be easily ported to new data sources. The Experiment Dashboard framework is used for the development of applications for monitoring the distributed computing activities of virtual organisations on the Worldwide LHC Computing Grid. We demonstrate the benefits of the approach for large-scale JavaScript web applications in this context by examining the design of several Experiment Dashboard applications for data processing, data transfer and site status monitoring, and by showing how they have been ported for different virtual organisations and technologies.

  5. Investigating large-scale secondary circulations within impact crater topographies in a refractive index-matched facility

    NASA Astrophysics Data System (ADS)

    Blois, Gianluca; Kim, Taehoon; Bristow, Nathan; Day, Mackenzie; Kocurek, Gary; Anderson, William; Christensen, Kenneth

    2017-11-01

    Impact craters, common large-scale topographic features on the surface of Mars, are circular depressions delimited by a sharp ridge. A variety of crater fill morphologies exist, suggesting that complex intracrater circulations affect their evolution. Some large craters (diameter >10 km), particularly at mid latitudes on Mars, exhibit a central mound surrounded by a circular moat. Foremost among these examples is Gale crater, landing site of NASA's Curiosity rover, since large-scale climatic processes early in the history of Mars are preserved in the stratigraphic record of the inner mound. Investigating the intracrater flow produced by large-scale winds aloft over Mars craters is key to a number of important scientific issues, including ongoing research on Mars paleo-environmental reconstruction and the planning of future missions (these results must be viewed in conjunction with the effects of radial katabatic flows, the importance of which is already established in preceding studies). In this work we consider a number of crater shapes inspired by Gale morphology, including idealized craters. Access to the flow field within such geometrically complex topography is achieved herein using a refractive-index-matched approach. Instantaneous velocity maps, using both planar and volumetric PIV techniques, are presented to elucidate complex three-dimensional flow within the crater. In addition, first- and second-order statistics will be discussed in the context of wind-driven (aeolian) excavation of crater fill.

  6. Transferability of STS markers in studying genetic relationships of marvel grass (Dichanthium annulatum).

    PubMed

    Saxena, Raghvendra; Chandra, Amaresh

    2011-11-01

    The transferability of sequence-tagged-site (STS) markers was assessed for studying genetic relationships among accessions of marvel grass (Dichanthium annulatum Forsk.). In total, 17 STS primers of Stylosanthes origin were tested for their reactivity with thirty accessions of Dichanthium annulatum. Of these, 14 (82.4%) reacted, and a total of 106 bands (84 polymorphic) were scored. The number of bands generated by individual primer pairs ranged from 4 to 11 with an average of 7.57 bands, whereas polymorphic bands ranged from 4 to 9 with an average of 6.0 bands, corresponding to an average polymorphism of 80.1%. Polymorphic information content (PIC) ranged from 0.222 to 0.499 and the marker index (MI) from 1.33 to 4.49. Using the Dice coefficient of genetic similarity, a dendrogram was generated through the unweighted pair-group method with arithmetic mean (UPGMA) algorithm. Further, clustering through the sequential agglomerative hierarchical and nested (SAHN) method resulted in three main clusters comprising all accessions except IGBANG-D-2. Although a few accessions from one agro-climatic region intermixed with another, accessions largely grouped with their regions of collection. Bootstrap analysis (1000 replicates) also showed a large number of nodes (11 to 17) with strong clustering support (>50). Thus, the results demonstrate the utility of STS markers of Stylosanthes for studying genetic relationships among accessions of Dichanthium.
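
    The PIC and MI ranges reported above (PIC up to 0.499, MI of 4.49 with 9 polymorphic bands) are consistent with the common dominant-marker formulas PIC = 2f(1-f) for band frequency f (maximum 0.5) and MI = mean PIC x number of polymorphic bands, though the record does not state which formulas were used. A sketch under that assumption, with a hypothetical 0/1 band-presence matrix:

        import numpy as np

        def primer_stats(bands):
            # bands: accessions x scored bands, coded 0/1 for band absence/presence.
            f = bands.mean(axis=0)                 # band frequency per locus
            poly = (f > 0) & (f < 1)               # polymorphic loci only
            pic = 2 * f[poly] * (1 - f[poly])      # assumed dominant-marker PIC
            return {"bands": bands.shape[1],
                    "polymorphic": int(poly.sum()),
                    "mean_PIC": float(pic.mean()),
                    "MI": float(pic.mean() * poly.sum())}

        rng = np.random.default_rng(7)
        bands = (rng.random((30, 8)) < rng.uniform(0.2, 0.8, 8)).astype(int)
        print(primer_stats(bands))  # mean PIC is bounded by 0.5, as in the record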

  7. Fabrication of a 3D micro/nano dual-scale carbon array and its demonstration as the microelectrodes for supercapacitors

    NASA Astrophysics Data System (ADS)

    Jiang, Shulan; Shi, Tielin; Gao, Yang; Long, Hu; Xi, Shuang; Tang, Zirong

    2014-04-01

    An easily accessible method is proposed for the fabrication of a 3D micro/nano dual-scale carbon array with a large surface area. The process mainly consists of three critical steps. First, a hemispherical photoresist micro-array was obtained by a cost-effective nanoimprint lithography process. Then the micro-array was transformed into hierarchical structures with longitudinal nanowires on the microstructure surface by oxygen plasma etching. Finally, the micro/nano dual-scale carbon array was fabricated by carbonizing these hierarchical photoresist structures. It has also been demonstrated that the micro/nano dual-scale carbon array can be used as microelectrodes for supercapacitors by electrodepositing a manganese dioxide (MnO2) film onto the hierarchical carbon structures, with greatly enhanced electrochemical performance. The specific gravimetric capacitance of the deposited micro/nano dual-scale microelectrodes is estimated to be 337 F g-1 at a scan rate of 5 mV s-1. This proposed approach to fabricating a micro/nano dual-scale carbon array provides a facile route to large-scale manufacture of microstructures for a wide variety of applications, including sensors and on-chip energy storage devices.

  8. Selecting habitat to survive: the impact of road density on survival in a large carnivore.

    PubMed

    Basille, Mathieu; Van Moorter, Bram; Herfindal, Ivar; Martin, Jodie; Linnell, John D C; Odden, John; Andersen, Reidar; Gaillard, Jean-Michel

    2013-01-01

    Habitat selection studies generally assume that animals select habitat and food resources at multiple scales to maximise their fitness. However, animals sometimes prefer habitats of apparently low quality, especially when considering the costs associated with spatially heterogeneous human disturbance. We used spatial variation in human disturbance, and its consequences on lynx survival, a direct fitness component, to test the Hierarchical Habitat Selection hypothesis from a population of Eurasian lynx Lynx lynx in southern Norway. Data from 46 lynx monitored with telemetry indicated that a high proportion of forest strongly reduced the risk of mortality from legal hunting at the home range scale, while increasing road density strongly increased such risk at the finer scale within the home range. We found hierarchical effects of the impact of human disturbance, with a higher road density at a large scale reinforcing its negative impact at a fine scale. Conversely, we demonstrated that lynx shifted their habitat selection to avoid areas with the highest road densities within their home ranges, thus supporting a compensatory mechanism at fine scale enabling lynx to mitigate the impact of large-scale disturbance. Human impact, positively associated with high road accessibility, was thus a stronger driver of lynx space use at a finer scale, with home range characteristics nevertheless constraining habitat selection. Our study demonstrates the truly hierarchical nature of habitat selection, which aims at maximising fitness by selecting against limiting factors at multiple spatial scales, and indicates that scale-specific heterogeneity of the environment is driving individual spatial behaviour, by means of trade-offs across spatial scales.

  9. Multi-static networked 3D ladar for surveillance and access control

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Ogirala, S. S. R.; Hu, B.; Le, Han Q.

    2007-04-01

    A theoretical design and simulation of a 3D ladar system concept for surveillance, intrusion detection, and access control is described. It is a non-conventional system architecture that consists of: i) a multi-static configuration with an arbitrarily scalable number of transmitters (Tx's) and receivers (Rx's) that form an optical wireless code-division-multiple-access (CDMA) network, and ii) a flexible system architecture with modular plug-and-play components that can be deployed for any facility with arbitrary topology. Affordability is a driving consideration, and a key feature for low cost is an asymmetric use of many inexpensive Rx's in conjunction with fewer Tx's, which are generally more expensive. The Rx's are spatially distributed close to the surveyed area for large coverage, and are capable of receiving signals from multiple Tx's with moderate laser power. The system produces sensing information that scales as N×M, where N, M are the numbers of Tx's and Rx's, as opposed to the linear scaling ~N of a non-networked system. Also, for target positioning, besides laser pointing direction and time-of-flight, the algorithm includes multiple point-of-view image fusion and triangulation for enhanced accuracy, which is not applicable to non-networked monostatic ladars. Simulation and scaled model experiments on some aspects of this concept are discussed.
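
    The record above combines pointing directions from multiple viewpoints to position a target. As a simple, hedged illustration of that triangulation step (not the paper's algorithm, which also fuses time-of-flight), the least-squares point closest to several pointing rays has a closed form:

        import numpy as np

        def triangulate(origins, directions):
            # Minimize sum_i ||(I - d_i d_i^T)(p - o_i)||^2 over p, i.e., the
            # squared distances from p to each ray (o_i, d_i).
            A = np.zeros((3, 3))
            b = np.zeros(3)
            for o, d in zip(origins, directions):
                d = d / np.linalg.norm(d)
                P = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
                A += P
                b += P @ o
            return np.linalg.solve(A, b)

        target = np.array([4.0, 2.0, 1.0])
        origins = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0]], dtype=float)
        directions = target - origins            # noise-free pointing, for the demo
        print(triangulate(origins, directions))  # recovers ~[4, 2, 1]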

  10. Scaling the Drosophila Wing: TOR-Dependent Target Gene Access by the Hippo Pathway Transducer Yorkie

    PubMed Central

    Parker, Joseph; Struhl, Gary

    2015-01-01

    Organ growth is controlled by patterning signals that operate locally (e.g., Wingless/Ints [Wnts], Bone Morphogenetic Proteins [BMPs], and Hedgehogs [Hhs]) and scaled by nutrient-dependent signals that act systemically (e.g., Insulin-like peptides [ILPs] transduced by the Target of Rapamycin [TOR] pathway). How cells integrate these distinct inputs to generate organs of the appropriate size and shape is largely unknown. The transcriptional coactivator Yorkie (Yki, a YES-Associated Protein, or YAP) acts downstream of patterning morphogens and other tissue-intrinsic signals to promote organ growth. Yki activity is regulated primarily by the Warts/Hippo (Wts/Hpo) tumour suppressor pathway, which impedes nuclear access of Yki by a cytoplasmic tethering mechanism. Here, we show that the TOR pathway regulates Yki by a separate and novel mechanism in the Drosophila wing. Instead of controlling Yki nuclear access, TOR signaling governs Yki action after it reaches the nucleus by allowing it to gain access to its target genes. When TOR activity is inhibited, Yki accumulates in the nucleus but is sequestered from its normal growth-promoting target genes—a phenomenon we term “nuclear seclusion.” Hence, we posit that in addition to its well-known role in stimulating cellular metabolism in response to nutrients, TOR also promotes wing growth by liberating Yki from nuclear seclusion, a parallel pathway that we propose contributes to the scaling of wing size with nutrient availability. PMID:26474042

  11. Large-scale high-throughput computer-aided discovery of advanced materials using cloud computing

    NASA Astrophysics Data System (ADS)

    Bazhirov, Timur; Mohammadi, Mohammad; Ding, Kevin; Barabash, Sergey

    Recent advances in cloud computing made it possible to access large-scale computational resources completely on-demand in a rapid and efficient manner. When combined with high-fidelity simulations, they serve as an alternative pathway to enable computational discovery and design of new materials through large-scale high-throughput screening. Here, we present a case study for a cloud platform implemented at Exabyte Inc. We perform calculations to screen lightweight ternary alloys for thermodynamic stability. Due to the lack of experimental data for most such systems, we rely on theoretical approaches based on first-principles pseudopotential density functional theory. We calculate the formation energies for a set of ternary compounds approximated by special quasirandom structures. During an example run we were able to scale to 10,656 CPUs within 7 minutes from the start, and obtain results for 296 compounds within 38 hours. The results indicate that the ultimate formation enthalpy of ternary systems can be negative for some lightweight alloys, including Li and Mg compounds. We conclude that, compared to the traditional capital-intensive approach that requires on-premises hardware resources, cloud computing is agile and cost-effective, yet scalable and delivers similar performance.
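
    The screening workflow in the record above reduces to evaluating a formation enthalpy per candidate and fanning the expensive energy evaluations out to many workers. A minimal sketch follows; `total_energy` is a hypothetical stand-in for a first-principles calculation dispatched to cloud nodes, and the reference energies and compositions are illustrative.

        from concurrent.futures import ProcessPoolExecutor

        REF_ENERGY = {"Li": -1.90, "Mg": -1.50, "Al": -3.74}  # eV/atom, illustrative

        def total_energy(composition):
            # Placeholder for an expensive DFT run on a remote worker.
            return sum(n * (REF_ENERGY[el] - 0.05) for el, n in composition.items())

        def formation_enthalpy(composition):
            # dH = (E_compound - sum_i n_i * E_ref,i) / n_atoms; negative
            # values indicate stability against the elemental references.
            n_atoms = sum(composition.values())
            e_refs = sum(n * REF_ENERGY[el] for el, n in composition.items())
            return (total_energy(composition) - e_refs) / n_atoms

        candidates = [{"Li": 1, "Mg": 1, "Al": 2}, {"Li": 2, "Mg": 1, "Al": 1}]
        if __name__ == "__main__":
            with ProcessPoolExecutor() as pool:
                for comp, dh in zip(candidates, pool.map(formation_enthalpy, candidates)):
                    print(comp, round(dh, 3), "eV/atom")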

  12. Silicone elastomers capable of large isotropic dimensional change

    DOEpatents

    Lewicki, James; Worsley, Marcus A.

    2017-07-18

    Described herein is a highly effective route towards the controlled and isotropic reduction in size-scale of complex 3D structures using silicone network polymer chemistry. In particular, a class of silicone structures was developed that, once patterned and cured, can `shrink` micron-scale additively manufactured and lithographically patterned structures by as much as one order of magnitude while preserving the dimensions and integrity of these parts. This class of silicone materials is compatible with existing additive manufacture and soft lithographic fabrication processes and will allow access to a hitherto unobtainable dimensionality of fabrication.

  13. Accessibility to primary health care in Belgium: an evaluation of policies awarding financial assistance in shortage areas.

    PubMed

    Dewulf, Bart; Neutens, Tijs; De Weerdt, Yves; Van de Weghe, Nico

    2013-08-22

    In many countries, financial assistance is awarded to physicians who settle in an area that is designated as a shortage area to prevent unequal accessibility to primary health care. Today, however, policy makers use fairly simple methods to define health care accessibility, with physician-to-population ratios (PPRs) within predefined administrative boundaries being overwhelmingly favoured. Our purpose is to verify whether these simple methods are accurate enough for adequately designating medical shortage areas and explore how these perform relative to more advanced GIS-based methods. Using a geographical information system (GIS), we conduct a nation-wide study of accessibility to primary care physicians in Belgium using four different methods: PPR, distance to closest physician, cumulative opportunity, and floating catchment area (FCA) methods. The official method used by policy makers in Belgium (calculating PPR per physician zone) offers only a crude representation of health care accessibility, especially because large contiguous areas (physician zones) are considered. We found substantial differences in the number and spatial distribution of medical shortage areas when applying different methods. The assessment of spatial health care accessibility and concomitant policy initiatives are affected by and dependent on the methodology used. The major disadvantage of PPR methods is its aggregated approach, masking subtle local variations. Some simple GIS methods overcome this issue, but have limitations in terms of conceptualisation of physician interaction and distance decay. Conceptually, the enhanced 2-step floating catchment area (E2SFCA) method, an advanced FCA method, was found to be most appropriate for supporting areal health care policies, since this method is able to calculate accessibility at a small scale (e.g., census tracts), takes interaction between physicians into account, and considers distance decay. While at present in health care research methodological differences and modifiable areal unit problems have remained largely overlooked, this manuscript shows that these aspects have a significant influence on the insights obtained. Hence, it is important for policy makers to ascertain to what extent their policy evaluations hold under different scales of analysis and when different methods are used.

  14. Accessibility to primary health care in Belgium: an evaluation of policies awarding financial assistance in shortage areas

    PubMed Central

    2013-01-01

    Background In many countries, financial assistance is awarded to physicians who settle in an area that is designated as a shortage area to prevent unequal accessibility to primary health care. Today, however, policy makers use fairly simple methods to define health care accessibility, with physician-to-population ratios (PPRs) within predefined administrative boundaries being overwhelmingly favoured. Our purpose is to verify whether these simple methods are accurate enough for adequately designating medical shortage areas and to explore how they perform relative to more advanced GIS-based methods. Methods Using a geographical information system (GIS), we conduct a nation-wide study of accessibility to primary care physicians in Belgium using four different methods: PPR, distance to closest physician, cumulative opportunity, and floating catchment area (FCA) methods. Results The official method used by policy makers in Belgium (calculating PPR per physician zone) offers only a crude representation of health care accessibility, especially because large contiguous areas (physician zones) are considered. We found substantial differences in the number and spatial distribution of medical shortage areas when applying different methods. Conclusions The assessment of spatial health care accessibility and concomitant policy initiatives are affected by and dependent on the methodology used. The major disadvantage of PPR methods is their aggregated approach, which masks subtle local variations. Some simple GIS methods overcome this issue, but have limitations in terms of conceptualisation of physician interaction and distance decay. Conceptually, the enhanced 2-step floating catchment area (E2SFCA) method, an advanced FCA method, was found to be most appropriate for supporting areal health care policies, since this method is able to calculate accessibility at a small scale (e.g. census tracts), takes interaction between physicians into account, and considers distance decay. While methodological differences and modifiable areal unit problems have so far remained largely overlooked in health care research, this manuscript shows that these aspects have a significant influence on the insights obtained. Hence, it is important for policy makers to ascertain to what extent their policy evaluations hold under different scales of analysis and when different methods are used. PMID:23964751

  15. Ocean Research Enabled by Underwater Gliders.

    PubMed

    Rudnick, Daniel L

    2016-01-01

    Underwater gliders are autonomous underwater vehicles that profile vertically by changing their buoyancy and use wings to move horizontally. Gliders are useful for sustained observation at relatively fine horizontal scales, especially to connect the coastal and open ocean. In this review, research topics are grouped by time and length scales. Large-scale topics addressed include the eastern and western boundary currents and the regional effects of climate variability. The accessibility of horizontal length scales of order 1 km allows investigation of mesoscale and submesoscale features such as fronts and eddies. Because the submesoscales dominate vertical fluxes in the ocean, gliders have found application in studies of biogeochemical processes. At the finest scales, gliders have been used to measure internal waves and turbulent dissipation. The review summarizes gliders' achievements to date and assesses their future in ocean observation.

  16. A public health perspective to environmental barriers and accessibility problems for senior citizens living in ordinary housing.

    PubMed

    Granbom, Marianne; Iwarsson, Susanne; Kylberg, Marianne; Pettersson, Cecilia; Slaug, Björn

    2016-08-11

    Housing environments that hinder the performance of daily activities and impede participation in social life have negative health consequences, particularly for the older segment of the population. From a public health perspective, accessible housing that supports active and healthy ageing is therefore crucial. The objective of the present study was to make an inventory of environmental barriers and investigate accessibility problems in the ordinary housing stock in Sweden as related to the functional capacity of senior citizens. Particular attention was paid to differences between housing types and building periods and to identifying environmental barriers generating the most accessibility problems for sub-groups of senior citizens. Data on environmental barriers in dwellings from three databases on housing and health in old age were analysed (N = 1021). Four functional profiles representing large groups of senior citizens were used in analyses of the magnitude and severity of potential accessibility problems. Differences in terms of type of housing and building period were examined. High proportions of one-family houses as well as multi-dwellings had substantial numbers of environmental barriers, with significantly lower numbers in later building periods. Accessibility problems occurred already for senior citizens with few functional limitations, but were more profound for those dependent on mobility devices. The most problematic housing sections were entrances in one-family houses and kitchens of multi-dwellings. Despite the high housing standard of the ordinary Swedish housing stock, the results show substantial accessibility problems for senior citizens with functional limitations. To make housing accessible, large-scale and systematic efforts are required.

  17. Ventilator-dependent children and the health services system. Unmet needs and coordination of care.

    PubMed

    Hefner, Jennifer L; Tsai, Wan Chong

    2013-10-01

    Children dependent on mechanical ventilation are a vulnerable population by virtue of their chronic disability and are therefore at increased risk for health disparities and access barriers. The present study is the first, to our knowledge, to conduct a large-scale survey of caregivers of ventilator-dependent children to develop a comprehensive socio-demographic profile. To describe the demographic and health status profile of ventilator-dependent children, to identify the types of unmet needs families caring for a child on a ventilator face, and to determine the correlates of access to care coordination. A survey was administered to 122 parents whose children attended a pediatric home ventilator clinic at a large tertiary Midwestern medical center (84% of the clinic population). Half of the patient population had severe functional limitations, and 70% had one or more comorbidities. One quarter of caregivers reported current financial struggles, and 16% screened positive for a probable depressive disorder. More than half of families reported unmet needs for care, most frequently therapeutic services and skilled nursing care. Of those reporting an unmet need for skilled nursing care, lack of adequate staffing was the main barrier (71.1%). Financial struggles and a probable caregiver depressive disorder were significantly associated with an unmet need for care coordination. This is the first large-scale quantitative study to investigate the themes of unmet need and care coordination within this vulnerable population. The results suggest these families face barriers accessing therapeutic and skilled nursing services, and caregiver mental health and financial struggles may be important points of intervention for service providers through the inclusion of multidisciplinary care teams and the strengthening of social services referral networks.

  18. Accessing Secondary Markets as a Capital Source for Energy Efficiency Finance Programs: Program Design Considerations for Policymakers and Administrators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kramer, C.; Martin, E. Fadrhonc; Thompson, P.

    Estimates of the total opportunity for investment in cost-effective energy efficiency in the United States are typically in the range of several hundred billion dollars (Choi Granade et al., 2009; Fulton & Brandenburg, 2012). To access this potential, many state policymakers and utility regulators have established aggressive energy efficiency savings targets. Current levels of taxpayer and utility bill-payer funding for energy efficiency are only a small fraction of the total investment needed to meet these targets (SEE Action Financing Solutions Working Group, 2013). Given this challenge, some energy efficiency program administrators are working to access private capital sources with the aim of amplifying the funds available for investment. In this context, efficient access to secondary market capital has been advanced as one important enabler of the energy efficiency industry “at scale.” The question of what role secondary markets can play in bringing energy efficiency to scale is largely untested despite extensive attention from media, technical publications, advocates, and others. Only a handful of transactions of energy efficiency loan products have been executed to date, and it is too soon to draw robust conclusions from these deals. At the same time, energy efficiency program administrators and policymakers face very real decisions regarding whether and how to access secondary markets as part of their energy efficiency deployment strategy.

  19. Fault Tolerant Frequent Pattern Mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shohdy, Sameh; Vishnu, Abhinav; Agrawal, Gagan

    FP-Growth is a Frequent Pattern Mining (FPM) algorithm that has been extensively used to study correlations and patterns in large-scale datasets. While several researchers have designed distributed-memory FP-Growth algorithms, it is pivotal to consider fault-tolerant FP-Growth, which can address the increasing fault rates in large-scale systems. In this work, we propose a novel parallel, algorithm-level fault-tolerant FP-Growth algorithm. We leverage algorithmic properties and advanced MPI features to guarantee an O(1) space complexity, achieved by using the dataset memory space itself for checkpointing. We also propose a recovery algorithm that can use in-memory and disk-based checkpointing, though in many cases the recovery can be completed without any disk access, and incurring no memory overhead for checkpointing. We evaluate our fault-tolerant algorithm on a large-scale InfiniBand cluster with several large datasets using up to 2K cores. Our evaluation demonstrates excellent efficiency for checkpointing and recovery in comparison to the disk-based approach. We have also observed a 20x average speed-up in comparison to Spark, establishing that a well-designed algorithm can easily outperform a solution based on a general fault-tolerant programming model.
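
    For context, frequent pattern mining asks which itemsets occur in at least a minimum number of transactions. The brute-force sketch below illustrates the problem definition only; the paper's contribution is a parallel, fault-tolerant FP-Growth, which avoids this exponential enumeration by compressing the transactions into an FP-tree. The toy transactions and threshold are assumptions.

    ```python
    # Brute-force frequent-itemset counting (illustrative only). FP-Growth
    # avoids this exponential enumeration by building a compact FP-tree.

    from itertools import combinations
    from collections import Counter

    transactions = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c"}, {"a", "b", "c"}]
    min_support = 3  # absolute support threshold (assumed)

    counts = Counter()
    for t in transactions:
        for k in range(1, len(t) + 1):
            for itemset in combinations(sorted(t), k):
                counts[itemset] += 1

    frequent = {s: c for s, c in counts.items() if c >= min_support}
    print(frequent)   # ('a',): 4, ('c',): 4, ('b',): 3, ('a','c'): 3, ('b','c'): 3
    ```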

  20. Modeling near-wall turbulent flows

    NASA Astrophysics Data System (ADS)

    Marusic, Ivan; Mathis, Romain; Hutchins, Nicholas

    2010-11-01

    The near-wall region of turbulent boundary layers is a crucial region for turbulence production, but it is also a region that becomes increasingly difficult to access and make measurements in as the Reynolds number becomes very high. Consequently, it is desirable to model the turbulence in this region. Recent studies have shown that the classical description, with inner (wall) scaling alone, is insufficient to explain the behaviour of the streamwise turbulence intensities with increasing Reynolds number. Here we review our recent near-wall model (Marusic et al., Science 329, 2010), in which the near-wall turbulence is predicted given information from only the large-scale signature at a single measurement point in the logarithmic layer, considerably far from the wall. The model is consistent with the Townsend attached eddy hypothesis in that the large-scale structures associated with the log region are felt all the way down to the wall, but it also includes a non-linear amplitude modulation effect of the large structures on the near-wall turbulence. Detailed predicted spectra across the entire near-wall region will be presented, together with other higher-order statistics over a large range of Reynolds numbers, varying from laboratory to atmospheric flows.
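
    Schematically, the cited model predicts the near-wall streamwise fluctuation from a statistically universal small-scale signal plus the measured large-scale log-region signal, via superposition and amplitude modulation. The sketch below shows only that functional form; the signals and the constants alpha and beta are placeholders, not the calibrated values of Marusic et al.

    ```python
    # Schematic form of an inner-outer predictive model:
    #   u_predicted = u_universal * (1 + beta * u_OL) + alpha * u_OL
    # u_OL is the large-scale (log-region) signal; u_universal is a small-scale
    # signal measured once at low Reynolds number. alpha (superposition) and
    # beta (modulation) below are made-up placeholders, not calibrated values.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    u_universal = rng.standard_normal(n)          # stand-in small-scale signal
    u_OL = np.sin(np.linspace(0, 20 * np.pi, n))  # stand-in large-scale signal

    alpha, beta = 0.1, 0.3                        # placeholder constants
    u_pred = u_universal * (1.0 + beta * u_OL) + alpha * u_OL

    print(f"predicted turbulence intensity: {u_pred.var():.3f}")
    ```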

  1. Pynamic: the Python Dynamic Benchmark

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, G L; Ahn, D H; de Supinksi, B R

    2007-07-10

    Python is widely used in scientific computing to facilitate application development and to support features such as computational steering. Making full use of some of Python's popular features, which improve programmer productivity, leads to applications that access extremely high numbers of dynamically linked libraries (DLLs). As a result, some important Python-based applications severely stress a system's dynamic linking and loading capabilities and also cause significant difficulties for most development environment tools, such as debuggers. Furthermore, using the Python paradigm for large-scale MPI-based applications can create significant file I/O and further stress tools and operating systems. In this paper, we present Pynamic, the first benchmark program to support configurable emulation of a wide range of the DLL usage of Python-based applications for large-scale systems. Pynamic has already accurately reproduced system software and tool issues encountered by important large Python-based scientific applications on our supercomputers. Pynamic provided insight for our system software and tool vendors, and our application developers, into the impact of several design decisions. As we describe the Pynamic benchmark, we highlight some of the issues discovered in our large-scale system software and tools using Pynamic.
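
    Pynamic itself generates and links large numbers of synthetic shared libraries; as a rough illustration of the access pattern being stressed, the sketch below merely times dynamic imports of a handful of standard-library extension modules. The module list is an arbitrary stand-in.

    ```python
    # Minimal illustration of the access pattern Pynamic stresses: importing
    # many dynamically linked extension modules. Pynamic generates synthetic
    # shared libraries; here we only time imports of stdlib modules.

    import importlib
    import time

    modules = ["json", "csv", "sqlite3", "decimal", "ssl", "lzma", "bz2"]  # stand-ins

    t0 = time.perf_counter()
    for name in modules:
        importlib.import_module(name)   # each triggers dynamic linking work
    elapsed = time.perf_counter() - t0
    print(f"imported {len(modules)} modules in {elapsed * 1e3:.1f} ms")
    ```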

  2. TomoMiner and TomoMinerCloud: A software platform for large-scale subtomogram structural analysis

    PubMed Central

    Frazier, Zachary; Xu, Min; Alber, Frank

    2017-01-01

    Cryo-electron tomography (cryoET) captures the 3D electron density distribution of macromolecular complexes in a close-to-native state. With the rapid advance of cryoET acquisition technologies, it is possible to generate large numbers (>100,000) of subtomograms, each containing a macromolecular complex. Often, these subtomograms represent a heterogeneous sample, due to variations in the structure and composition of a complex in its in situ form or because the particles are a mixture of different complexes. In this case subtomograms must be classified. However, classification of large numbers of subtomograms is a time-intensive task and often a limiting bottleneck. This paper introduces an open-source software platform, TomoMiner, for large-scale subtomogram classification, template matching, subtomogram averaging, and alignment. Its scalable and robust parallel processing allows efficient classification of tens to hundreds of thousands of subtomograms. Additionally, TomoMiner provides a pre-configured TomoMinerCloud computing service permitting users without sufficient computing resources instant access to TomoMiner's high-performance features. PMID:28552576

  3. An Innovative, Multidisciplinary Educational Program in Interactive Information Storage and Retrieval. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Gallagher, Mary C.

    1985-01-01

    There exists a large number of large-scale bibliographic Information Storage and Retrieval Systems containing large amounts of valuable data of interest in a wide variety of research applications. These systems are not used to capacity because the end users, i.e., the researchers, have not been trained in the techniques of accessing such systems. This thesis describes the development of a transportable, university-level course in methods of querying on-line interactive Information Storage and Retrieval systems as a solution to this problem. This course was designed to instruct upper division science and engineering students to enable these end users to directly access such systems. The course is designed to be taught by instructors who are not specialists in either computer science or research skills. It is independent of any particular IS and R system or computer hardware. The project is sponsored by NASA and conducted by the University of Southwestern Louisiana and Southern University.

  4. Unified Access Architecture for Large-Scale Scientific Datasets

    NASA Astrophysics Data System (ADS)

    Karna, Risav

    2014-05-01

    Data-intensive sciences have to deploy diverse large-scale database technologies for data analytics, as scientists now deal with much larger data volumes than ever before. While array databases have bridged many gaps between the needs of data-intensive research fields and DBMS technologies (Zhang 2011), invocation of the other big data tools accompanying these databases is still manual and separate from the database management interface. We identify this as an architectural challenge that will increasingly complicate the user's workflow owing to the growing number of useful but isolated and niche database tools. Such use of data analysis tools in effect leaves the burden on the user's end to synchronize the results from other data manipulation and analysis tools with the database management system. To this end, we propose a unified access interface for using big data tools within a large-scale scientific array database, using the database queries themselves to embed foreign routines belonging to the big data tools. Such an invocation of foreign data manipulation routines inside a database query can be made possible through a user-defined function (UDF). UDFs that allow such levels of freedom as to call modules from another language and interface back and forth between the query body and the side-loaded functions would be needed for this purpose. For the purpose of this research we attempt coupling of four widely used tools, Hadoop (hadoop1), Matlab (matlab1), R (r1) and ScaLAPACK (scalapack1), with the UDF feature of rasdaman (Baumann 98), an array-based data manager, to investigate this concept. The native array data model used by an array-based data manager provides compact data storage and high-performance operations on ordered data such as spatial data, temporal data, and matrix-based data for linear algebra operations (scidbusr1). Performance issues arising due to the coupling of tools with different paradigms, niche functionalities, separate processes and output data formats have been anticipated and considered during the design of the unified architecture. The research focuses on the feasibility of the designed coupling mechanism and the evaluation of the efficiency and benefits of our proposed unified access architecture. Zhang 2011: Zhang, Ying and Kersten, Martin and Ivanova, Milena and Nes, Niels, SciQL: Bridging the Gap Between Science and Relational DBMS, Proceedings of the 15th Symposium on International Database Engineering Applications, 2011. Baumann 98: Baumann, P., Dehmel, A., Furtado, P., Ritsch, R., Widmann, N., "The Multidimensional Database System RasDaMan", SIGMOD 1998, Proceedings ACM SIGMOD International Conference on Management of Data, June 2-4, 1998, Seattle, Washington, 1998. hadoop1: hadoop.apache.org, "Hadoop", http://hadoop.apache.org/, [Online; accessed 12-Jan-2014]. scalapack1: netlib.org/scalapack, "ScaLAPACK", http://www.netlib.org/scalapack, [Online; accessed 12-Jan-2014]. r1: r-project.org, "R", http://www.r-project.org/, [Online; accessed 12-Jan-2014]. matlab1: mathworks.com, "Matlab Documentation", http://www.mathworks.de/de/help/matlab/, [Online; accessed 12-Jan-2014]. scidbusr1: scidb.org, "SciDB User's Guide", http://scidb.org/HTMLmanual/13.6/scidb_ug, [Online; accessed 01-Dec-2013].

  5. Citizen journalism in a time of crisis: lessons from a large-scale California wildfire

    Treesearch

    S. Gillette; J. Taylor; D.J. Chavez; R. Hodgson; J. Downing

    2007-01-01

    The accessibility of news production tools through consumer communication technology has made it possible for media consumers to become media producers. The evolution of media consumer to media producer has important implications for the shape of public discourse during a time of crisis. Citizen journalists cover crisis events using camera cell phones and digital...

  6. Using Systematic Item Selection Methods to Improve Universal Design of Assessments. Policy Directions. Number 18

    ERIC Educational Resources Information Center

    Johnstone, Christopher; Thurlow, Martha; Moore, Michael; Altman, Jason

    2006-01-01

    The No Child Left Behind Act of 2001 (NCLB) and other recent changes in federal legislation have placed greater emphasis on accountability in large-scale testing. Included in this emphasis are regulations that require assessments to be accessible. States are accountable for the success of all students, and tests should be designed in a way that…

  7. eIFL (Electronic Information for Libraries): A Global Initiative of the Soros Foundations Network.

    ERIC Educational Resources Information Center

    Feret, Blazej; Kay, Michael

    This paper presents the history, current status, and future development of eIFL (Electronic Information for Libraries Direct)--a large-scale project run by the Soros Foundations Network and the Open Society Institute. The project aims to provide libraries in developing countries with access to a menu of electronic information resources. In 1999,…

  8. The Practical Impact of Recent Computer Advances on the Analysis and Design of Large Scale Networks

    DTIC Science & Technology

    1974-12-01

    [Abstract not recoverable from the digitized record; the surviving fragments are bibliography entries: Gitman, I., R. M. Van Slyke and H. Frank, "On Splitting Random Access Broadcast Channels," Proc. ICC-74, Minneapolis, Minnesota, June 17-19, 1974; Gitman, I., "On the Capacity of Slotted ALOHA Networks and Some Design Problems," IEEE Transactions on Communications, March 1975.]

  9. Detecting Potentially Compromised Credentials in a Large-Scale Production Single-Signon System

    DTIC Science & Technology

    2014-06-01

  10. Social Support as a Factor Inhibiting Teenage Risk-Taking: Views of Students, Parents and Professionals

    ERIC Educational Resources Information Center

    Abbott-Chapman, Joan; Denholm, Carey; Wyld, Colin

    2008-01-01

    A large-scale study conducted in Tasmania, Australia, of teenage risk-taking across 26 potentially harmful risk activities has examined a range of factors that encourage or inhibit risk-taking. Among these factors, the degree of social and professional support the teenage students say they would access for personal, study or health problems has…

  11. Scaling predictive modeling in drug development with cloud computing.

    PubMed

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and increasing analysis times are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Compute Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  12. MouseNet database: digital management of a large-scale mutagenesis project.

    PubMed

    Pargent, W; Heffner, S; Schäble, K F; Soewarto, D; Fuchs, H; Hrabé de Angelis, M

    2000-07-01

    The Munich ENU Mouse Mutagenesis Screen is a large-scale mutant production, phenotyping, and mapping project. It encompasses two animal breeding facilities and a number of screening groups located in the general area of Munich. A central database is required to manage and process the immense amount of data generated by the mutagenesis project. This database, which we named MouseNet(c), runs on a Sybase platform and will finally store and process all data from the entire project. In addition, the system comprises a portfolio of functions needed to support the workflow management of the core facility and the screening groups. MouseNet(c) will make all of the data available to the participating screening groups, and later to the international scientific community. MouseNet(c) will consist of three major software components:

    * Animal Management System (AMS)
    * Sample Tracking System (STS)
    * Result Documentation System (RDS)

    MouseNet(c) provides the following major advantages:

    * being accessible from different client platforms via the Internet
    * being a full-featured multi-user system (including access restriction and data locking mechanisms)
    * relying on a professional RDBMS (relational database management system) which runs on a UNIX server platform
    * supplying workflow functions and a variety of plausibility checks

  13. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    NASA Technical Reports Server (NTRS)

    O'Neil, Daniel A.

    2008-01-01

    Large-scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Discipline specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycle studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.

  14. LSST Resources for the Community

    NASA Astrophysics Data System (ADS)

    Jones, R. Lynne

    2011-01-01

    LSST will generate 100 petabytes of images and 20 petabytes of catalogs, covering 18,000-20,000 square degrees sampled every few days over a total of ten years -- all publicly available and exquisitely calibrated. The primary access to these data will be through Data Access Centers (DACs). DACs will provide access to catalogs of sources (single detections from individual images) and objects (associations of sources from multiple images). Simple user interfaces or direct SQL queries at the DAC can return user-specified portions of data from catalogs or images. More complex manipulations of the data, such as calculating multi-point correlation functions or creating alternative photo-z measurements on terabyte-scale data, can be completed with the DAC's own resources. Even more data-intensive computations requiring access to large numbers of image pixels at petabyte scale could also be conducted at the DAC, using compute resources allocated in a similar manner to a TAC. DAC resources will be available to all individuals in member countries or institutes and LSST science collaborations. DACs will also assist investigators with requests for allocations at national facilities such as the Petascale Computing Facility, TeraGrid, and Open Science Grid. Using data on this scale requires new approaches to accessibility and analysis, which are being developed through interactions with the LSST Science Collaborations. We are producing simulated images (as might be acquired by LSST) based on models of the universe and generating catalogs from these images (as well as from the base model) using the LSST data management framework in a series of data challenges. The resulting images and catalogs are being made available to the science collaborations to verify the algorithms and develop user interfaces. All LSST software is open source and available online, including preliminary catalog formats. We encourage feedback from the community.

  15. Using AberOWL for fast and scalable reasoning over BioPortal ontologies.

    PubMed

    Slater, Luke; Gkoutos, Georgios V; Schofield, Paul N; Hoehndorf, Robert

    2016-08-08

    Reasoning over biomedical ontologies using their OWL semantics has traditionally been a challenging task due to the high theoretical complexity of OWL-based automated reasoning. As a consequence, ontology repositories, as well as most other tools utilizing ontologies, either provide access to ontologies without use of automated reasoning, or limit the number of ontologies for which automated reasoning-based access is provided. We apply the AberOWL infrastructure to provide automated reasoning-based access to all accessible and consistent ontologies in BioPortal (368 ontologies). We perform an extensive performance evaluation to determine query times, both for queries of different complexity and for queries that are performed in parallel over the ontologies. We demonstrate that, with the exception of a few ontologies, even complex and parallel queries can now be answered in milliseconds, therefore allowing automated reasoning to be used on a large scale, to run in parallel, and with rapid response times.

  16. The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles.

    PubMed

    Piwowar, Heather; Priem, Jason; Larivière, Vincent; Alperin, Juan Pablo; Matthias, Lisa; Norlander, Bree; Farley, Ashley; West, Jevin; Haustein, Stefanie

    2018-01-01

    Despite growing interest in Open Access (OA) to scholarly literature, there is an unmet need for large-scale, up-to-date, and reproducible studies assessing the prevalence and characteristics of OA. We address this need using oaDOI, an open online service that determines OA status for 67 million articles. We use three samples, each of 100,000 articles, to investigate OA in three populations: (1) all journal articles assigned a Crossref DOI, (2) recent journal articles indexed in Web of Science, and (3) articles viewed by users of Unpaywall, an open-source browser extension that lets users find OA articles using oaDOI. We estimate that at least 28% of the scholarly literature is OA (19M in total) and that this proportion is growing, driven particularly by growth in Gold and Hybrid. The most recent year analyzed (2015) also has the highest percentage of OA (45%). Because of this growth, and the fact that readers disproportionately access newer articles, we find that Unpaywall users encounter OA quite frequently: 47% of articles they view are OA. Notably, the most common mechanism for OA is not Gold, Green, or Hybrid OA, but rather an under-discussed category we dub Bronze: articles made free-to-read on the publisher website, without an explicit Open license. We also examine the citation impact of OA articles, corroborating the so-called open-access citation advantage: accounting for age and discipline, OA articles receive 18% more citations than average, an effect driven primarily by Green and Hybrid OA. We encourage further research using the free oaDOI service, as a way to inform OA policy and practice.

  17. The state of OA: a large-scale analysis of the prevalence and impact of Open Access articles

    PubMed Central

    Larivière, Vincent; Alperin, Juan Pablo; Matthias, Lisa; Norlander, Bree; Farley, Ashley; West, Jevin; Haustein, Stefanie

    2018-01-01

    Despite growing interest in Open Access (OA) to scholarly literature, there is an unmet need for large-scale, up-to-date, and reproducible studies assessing the prevalence and characteristics of OA. We address this need using oaDOI, an open online service that determines OA status for 67 million articles. We use three samples, each of 100,000 articles, to investigate OA in three populations: (1) all journal articles assigned a Crossref DOI, (2) recent journal articles indexed in Web of Science, and (3) articles viewed by users of Unpaywall, an open-source browser extension that lets users find OA articles using oaDOI. We estimate that at least 28% of the scholarly literature is OA (19M in total) and that this proportion is growing, driven particularly by growth in Gold and Hybrid. The most recent year analyzed (2015) also has the highest percentage of OA (45%). Because of this growth, and the fact that readers disproportionately access newer articles, we find that Unpaywall users encounter OA quite frequently: 47% of articles they view are OA. Notably, the most common mechanism for OA is not Gold, Green, or Hybrid OA, but rather an under-discussed category we dub Bronze: articles made free-to-read on the publisher website, without an explicit Open license. We also examine the citation impact of OA articles, corroborating the so-called open-access citation advantage: accounting for age and discipline, OA articles receive 18% more citations than average, an effect driven primarily by Green and Hybrid OA. We encourage further research using the free oaDOI service, as a way to inform OA policy and practice. PMID:29456894

  18. Scaling Irregular Applications through Data Aggregation and Software Multithreading

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morari, Alessandro; Tumeo, Antonino; Chavarría-Miranda, Daniel

    Bioinformatics, data analytics, semantic databases, and knowledge discovery are emerging high-performance application areas that exploit dynamic, linked data structures such as graphs, unbalanced trees or unstructured grids. These data structures usually are very large, requiring significantly more memory than available on single shared memory systems. Additionally, these data structures are difficult to partition on distributed memory systems. They also present poor spatial and temporal locality, thus generating unpredictable memory and network accesses. The Partitioned Global Address Space (PGAS) programming model seems suitable for these applications, because it allows using a shared memory abstraction across distributed-memory clusters. However, current PGAS languages and libraries are built to target regular remote data accesses and block transfers. Furthermore, they usually rely on the Single Program Multiple Data (SPMD) parallel control model, which is not well suited to the fine-grained, dynamic and unbalanced parallelism of irregular applications. In this paper we present GMT (Global Memory and Threading library), a custom runtime library that enables efficient execution of irregular applications on commodity clusters. GMT integrates a PGAS data substrate with simple fork/join parallelism and provides automatic load balancing on a per-node basis. It implements multi-level aggregation and lightweight multithreading to maximize memory and network bandwidth with fine-grained data accesses and to tolerate long data access latencies. A key innovation in the GMT runtime is its thread specialization (workers, helpers and communication threads) that realizes the overall functionality. We compare our approach with other PGAS models, such as UPC running on GASNet, and with hand-optimized MPI code on a set of typical large-scale irregular applications, demonstrating speedups of an order of magnitude.

  19. The USA-NPN Information Management System: A tool in support of phenological assessments

    NASA Astrophysics Data System (ADS)

    Rosemartin, A.; Vazquez, R.; Wilson, B. E.; Denny, E. G.

    2009-12-01

    The USA National Phenology Network (USA-NPN) serves science and society by promoting a broad understanding of plant and animal phenology and the relationships among phenological patterns and all aspects of environmental change. Data management and information sharing are central to the USA-NPN mission. The USA-NPN develops, implements, and maintains a comprehensive Information Management System (IMS) to serve the needs of the network, including the collection, storage and dissemination of phenology data, access to phenology-related information, tools for data interpretation, and communication among partners of the USA-NPN. The IMS includes components for data storage, such as the National Phenology Database (NPD), and several online user interfaces to accommodate data entry, data download, data visualization and catalog searches for phenology-related information. The IMS is governed by a set of standards to ensure security, privacy, data access, and data quality. The National Phenology Database is designed to efficiently accommodate large quantities of phenology data, to be flexible to the changing needs of the network, and to provide for quality control. The database stores phenology data from multiple sources (e.g., partner organizations, researchers and citizen observers), and provides for integration with legacy datasets. Several services will be created to provide access to the data, including reports, visualization interfaces, and web services. These services will provide integrated access to phenology and related information for scientists, decision-makers and general audiences. Phenological assessments at any scale will rely on secure and flexible information management systems for the organization and analysis of phenology data. The USA-NPN’s IMS can serve phenology assessments directly, through data management and indirectly as a model for large-scale integrated data management.

  20. The Role of Genome Accessibility in Transcription Factor Binding in Bacteria.

    PubMed

    Gomes, Antonio L C; Wang, Harris H

    2016-04-01

    ChIP-seq enables genome-scale identification of regulatory regions that govern gene expression. However, the biological insights generated from ChIP-seq analysis have been limited to predictions of binding sites and cooperative interactions. Furthermore, ChIP-seq data often poorly correlate with in vitro measurements or predicted motifs, highlighting that binding affinity alone is insufficient to explain transcription factor (TF)-binding in vivo. One possibility is that binding sites are not equally accessible across the genome. A more comprehensive biophysical representation of TF-binding is required to improve our ability to understand, predict, and alter gene expression. Here, we show that genome accessibility is a key parameter that impacts TF-binding in bacteria. We developed a thermodynamic model that parameterizes ChIP-seq coverage in terms of genome accessibility and binding affinity. The role of genome accessibility is validated using a large-scale ChIP-seq dataset of the M. tuberculosis regulatory network. We find that accounting for genome accessibility led to a model that explains 63% of the ChIP-seq profile variance, while a model based on motif score alone explains only 35% of the variance. Moreover, our framework enables de novo ChIP-seq peak prediction and is useful for inferring TF-binding peaks in new experimental conditions by reducing the need for additional experiments. We observe that the genome is more accessible in intergenic regions, and that increased accessibility is positively correlated with gene expression and anti-correlated with distance to the origin of replication. Our biophysically motivated model provides a more comprehensive description of TF-binding in vivo from first principles towards a better representation of gene regulation in silico, with promising applications in systems biology.
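
    Schematically, a model of this kind predicts expected coverage at a site as an accessibility factor multiplied by a thermodynamic occupancy term. The sketch below shows one generic functional form; the parameters and scores are invented, and this is not the authors' fitted parameterization.

    ```python
    # Generic thermodynamic sketch: expected ChIP-seq coverage at a site as
    # accessibility * occupancy, with occupancy given by a Boltzmann-weighted
    # two-state binding term. An illustration of the idea, not the paper's model.

    import math

    def expected_coverage(motif_score, accessibility, mu=0.0, scale=1.0):
        """motif_score ~ binding-energy proxy; accessibility in [0, 1];
        mu is a chemical-potential-like offset (assumed)."""
        k = math.exp(motif_score - mu)             # Boltzmann weight
        occupancy = k / (1.0 + k)                  # two-state binding isotherm
        return scale * accessibility * occupancy

    sites = [(2.5, 0.9), (2.5, 0.2), (-1.0, 0.9)]  # (motif score, accessibility)
    for score, acc in sites:
        print(score, acc, f"-> coverage ~ {expected_coverage(score, acc):.3f}")
    ```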

  1. The seismo-hydromechanical behavior during deep geothermal reservoir stimulations: open questions tackled in a decameter-scale in situ stimulation experiment

    NASA Astrophysics Data System (ADS)

    Amann, Florian; Gischig, Valentin; Evans, Keith; Doetsch, Joseph; Jalali, Reza; Valley, Benoît; Krietsch, Hannes; Dutler, Nathan; Villiger, Linus; Brixel, Bernard; Klepikova, Maria; Kittilä, Anniina; Madonna, Claudio; Wiemer, Stefan; Saar, Martin O.; Loew, Simon; Driesner, Thomas; Maurer, Hansruedi; Giardini, Domenico

    2018-02-01

    In this contribution, we present a review of scientific research results that address seismo-hydromechanically coupled processes relevant for the development of a sustainable heat exchanger in low-permeability crystalline rock and introduce the design of the In situ Stimulation and Circulation (ISC) experiment at the Grimsel Test Site dedicated to studying such processes under controlled conditions. The review shows that research on reservoir stimulation for deep geothermal energy exploitation has been largely based on laboratory observations, large-scale projects and numerical models. Observations of full-scale reservoir stimulations have yielded important results. However, limited access to the reservoir and limited control over the experimental conditions during deep reservoir stimulations are insufficient to resolve the details of the hydromechanical processes that would enhance process understanding in a way that aids future stimulation design. Small-scale laboratory experiments provide fundamental insights into various processes relevant for enhanced geothermal energy, but suffer from (1) difficulties and uncertainties in upscaling the results to the field scale and (2) relatively homogeneous material and stress conditions that lead to an oversimplified fracture flow and/or hydraulic fracture propagation behavior that is not representative of a heterogeneous reservoir. Thus, there is a need for intermediate-scale hydraulic stimulation experiments with high experimental control that bridge the various scales and for which access to the target rock mass with a comprehensive monitoring system is possible. The ISC experiment is designed to address open research questions in a naturally fractured and faulted crystalline rock mass at the Grimsel Test Site (Switzerland). Two hydraulic injection phases were executed to enhance the permeability of the rock mass. During the injection phases the rock mass deformation across fractures and within intact rock, the pore pressure distribution and propagation, and the microseismic response were monitored at a high spatial and temporal resolution.

  2. geoknife: Reproducible web-processing of large gridded datasets

    USGS Publications Warehouse

    Read, Jordan S.; Walker, Jordan I.; Appling, Alison P.; Blodgett, David L.; Read, Emily K.; Winslow, Luke A.

    2016-01-01

    Geoprocessing of large gridded data according to overlap with irregular landscape features is common to many large-scale ecological analyses. The geoknife R package was created to facilitate reproducible analyses of gridded datasets found on the U.S. Geological Survey Geo Data Portal web application or elsewhere, using a web-enabled workflow that eliminates the need to download and store large datasets that are reliably hosted on the Internet. The package provides access to several data subset and summarization algorithms that are available on remote web processing servers. Outputs from geoknife include spatial and temporal data subsets, spatially-averaged time series values filtered by user-specified areas of interest, and categorical coverage fractions for various land-use types.

  3. Limited accessibility to designs and results of Japanese large-scale clinical trials for cardiovascular diseases.

    PubMed

    Sawata, Hiroshi; Ueshima, Kenji; Tsutani, Kiichiro

    2011-04-14

    Clinical evidence is important for improving the treatment of patients by health care providers. In the study of cardiovascular diseases, large-scale clinical trials involving thousands of participants are required to evaluate the risks of cardiac events and/or death. The problems encountered in conducting the Japanese Acute Myocardial Infarction Prospective (JAMP) study highlighted the difficulties involved in obtaining the financial and infrastructural resources necessary for conducting large-scale clinical trials. The objectives of the current study were: 1) to clarify the current funding and infrastructural environment surrounding large-scale clinical trials in cardiovascular and metabolic diseases in Japan, and 2) to find ways to improve the environment surrounding clinical trials in Japan more generally. We examined clinical trials examining cardiovascular diseases that evaluated true endpoints and involved 300 or more participants using PubMed, Ichushi (by the Japan Medical Abstracts Society, a non-profit organization), websites of related medical societies, the University Hospital Medical Information Network (UMIN) Clinical Trials Registry, and clinicaltrials.gov at three points in time: 30 November, 2004, 25 February, 2007 and 25 July, 2009. We found a total of 152 trials that met our criteria for 'large-scale clinical trials' examining cardiovascular diseases in Japan. Of these, 72.4% were randomized controlled trials (RCTs). Of the 152 trials, 9.2% examined more than 10,000 participants, and 42.8% examined between 1,000 and 10,000 participants. The number of large-scale clinical trials markedly increased from 2001 to 2004, but suddenly decreased in 2007, then began to increase again. Ischemic heart disease (39.5%) was the most common target disease. Most of the larger-scale trials were funded by private organizations such as pharmaceutical companies. The designs and results of 13 trials were not disclosed. To improve the quality of clinical trials, all sponsors should register trials and disclose the funding sources before the enrolment of participants, and publish their results after the completion of each study.

  4. Field of genes: using Apache Kafka as a bioinformatic data repository

    PubMed Central

    Lynch, Richard; Walsh, Paul

    2018-01-01

    Background Bioinformatic research is increasingly dependent on large-scale datasets, accessed either from private or public repositories. An example of a public repository is National Center for Biotechnology Information's (NCBI’s) Reference Sequence (RefSeq). These repositories must decide in what form to make their data available. Unstructured data can be put to almost any use but are limited in how access to them can be scaled. Highly structured data offer improved performance for specific algorithms but limit the wider usefulness of the data. We present an alternative: lightly structured data stored in Apache Kafka in a way that is amenable to parallel access and streamed processing, including subsequent transformations into more highly structured representations. We contend that this approach could provide a flexible and powerful nexus of bioinformatic data, bridging the gap between low structure on one hand, and high performance and scale on the other. To demonstrate this, we present a proof-of-concept version of NCBI’s RefSeq database using this technology. We measure the performance and scalability characteristics of this alternative with respect to flat files. Results The proof of concept scales almost linearly as more compute nodes are added, outperforming the standard approach using files. Conclusions Apache Kafka merits consideration as a fast and more scalable but general-purpose way to store and retrieve bioinformatic data, for public, centralized reference datasets such as RefSeq and for private clinical and experimental data. PMID:29635394
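
    As a minimal sketch of the pattern described (lightly structured records published to and streamed from Kafka), the snippet below uses the kafka-python client; the topic name, record schema, and broker address are assumptions, and a running broker is required.

    ```python
    # Minimal publish/consume sketch of lightly structured sequence records in
    # Kafka (illustrative only; requires kafka-python and a running broker).

    import json
    from kafka import KafkaProducer, KafkaConsumer

    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",
        value_serializer=lambda v: json.dumps(v).encode(),
    )
    record = {"accession": "NC_000913.3", "organism": "Escherichia coli",
              "seq": "AGCTTTTCATTCTGACTGCA"}          # toy fragment
    producer.send("refseq-records", record)           # topic name is assumed
    producer.flush()

    consumer = KafkaConsumer(
        "refseq-records",
        bootstrap_servers="localhost:9092",
        auto_offset_reset="earliest",
        consumer_timeout_ms=5000,
        value_deserializer=lambda b: json.loads(b.decode()),
    )
    for msg in consumer:
        print(msg.value["accession"], len(msg.value["seq"]), "bp")
    ```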

  5. Numerical method for accessing the universal scaling function for a multiparticle discrete time asymmetric exclusion process

    NASA Astrophysics Data System (ADS)

    Chia, Nicholas; Bundschuh, Ralf

    2005-11-01

    In the universality class of the one-dimensional Kardar-Parisi-Zhang (KPZ) surface growth, Derrida and Lebowitz conjectured the universality of not only the scaling exponents, but of an entire scaling function. Since Derrida and Lebowitz's original publication [Phys. Rev. Lett. 80, 209 (1998)] this universality has been verified for a variety of continuous-time, periodic-boundary systems in the KPZ universality class. Here, we present a numerical method for directly examining the entire particle flux of the asymmetric exclusion process (ASEP), thus providing an alternative to more difficult cumulant ratios studies. Using this method, we find that the Derrida-Lebowitz scaling function (DLSF) properly characterizes the large-system-size limit (N→∞) of a single-particle discrete time system, even in the case of very small system sizes (N⩽22). This fact allows us to not only verify that the DLSF properly characterizes multiple-particle discrete-time asymmetric exclusion processes, but also provides a way to numerically solve for quantities of interest, such as the particle hopping flux. This method can thus serve to further increase the ease and accessibility of studies involving even more challenging dynamics, such as the open-boundary ASEP.
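
    For orientation, the sketch below simulates a small discrete-time totally asymmetric exclusion process on a ring with parallel updates and measures the mean particle hopping flux; it is a toy illustration of the dynamics studied, not the paper's numerical method for extracting the Derrida-Lebowitz scaling function. System size, density, and hopping probability are arbitrary choices.

    ```python
    # Toy discrete-time TASEP on a ring with parallel updates, tracking the
    # particle hopping flux. Illustrative only.

    import random

    N, n_particles, p_hop, steps = 22, 11, 0.5, 20_000
    sites = [1] * n_particles + [0] * (N - n_particles)
    random.shuffle(sites)

    hops = 0
    for _ in range(steps):
        new = sites[:]
        for i in range(N):                      # parallel update on old config
            j = (i + 1) % N
            if sites[i] == 1 and sites[j] == 0 and random.random() < p_hop:
                new[i], new[j] = 0, 1
                hops += 1
        sites = new
    print(f"flux per site per step ~ {hops / (steps * N):.4f}")
    ```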

  6. Achievable Rate Estimation of IEEE 802.11ad Visual Big-Data Uplink Access in Cloud-Enabled Surveillance Applications.

    PubMed

    Kim, Joongheon; Kim, Jong-Kook

    2016-01-01

    This paper addresses the computation procedures for estimating the impact of interference in 60 GHz IEEE 802.11ad uplink access in order to construct a visual big-data database from randomly deployed surveillance camera sensing devices. The acquired large-scale massive visual information from surveillance camera devices will be used to organize a big-data database, i.e., this estimation is essential for constructing a centralized cloud-enabled surveillance database. This performance estimation study captures interference impacts on the target cloud access points from multiple interference components generated by the 60 GHz wireless transmissions of nearby surveillance camera devices to their associated cloud access points. With this uplink interference scenario, the interference impacts on the main wireless transmission from a target surveillance camera device to its associated target cloud access point are measured and estimated for a number of settings, under consideration of 60 GHz radiation characteristics and antenna radiation pattern models.
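
    As a back-of-the-envelope illustration of this kind of uplink estimate, the sketch below computes an SINR for one 60 GHz link with two off-boresight interferers, using free-space path loss and idealized antenna gains; all geometry, gains, and powers are assumptions, not the paper's settings.

    ```python
    # Rough SINR estimate for one 60 GHz uplink with co-channel interferers.
    # All geometry, gains, and powers are invented for illustration.

    import math

    def fspl_db(d_m, f_hz=60e9):
        """Free-space path loss in dB."""
        c = 3e8
        return 20 * math.log10(4 * math.pi * d_m * f_hz / c)

    def rx_dbm(tx_dbm, d_m, g_tx_db, g_rx_db):
        return tx_dbm + g_tx_db + g_rx_db - fspl_db(d_m)

    noise_dbm = -174 + 10 * math.log10(2.16e9) + 10   # 2.16 GHz channel, NF 10 dB
    signal = rx_dbm(10, 20, 24, 24)                   # target camera, boresight
    interf = [rx_dbm(10, 35, 24, -10), rx_dbm(10, 50, 24, -10)]  # off-boresight

    lin = lambda dbm: 10 ** (dbm / 10)
    sinr = lin(signal) / (lin(noise_dbm) + sum(lin(x) for x in interf))
    print(f"SINR ~ {10 * math.log10(sinr):.1f} dB")
    ```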

  7. Geographic Accessibility Of Food Outlets Not Associated With Body Mass Index Change Among Veterans, 2009-14.

    PubMed

    Zenk, Shannon N; Tarlov, Elizabeth; Wing, Coady; Matthews, Stephen A; Jones, Kelly; Tong, Hao; Powell, Lisa M

    2017-08-01

    In recent years, various levels of government in the United States have adopted or discussed subsidies, tax breaks, zoning laws, and other public policies that promote geographic access to healthy food. However, there is little evidence from large-scale longitudinal or quasi-experimental research to suggest that the local mix of food outlets actually affects body mass index (BMI). We used a longitudinal design to examine whether the proximity of food outlets, by type, was associated with BMI changes between 2009 and 2014 among 1.7 million veterans in 382 metropolitan areas. We found no evidence that either absolute or relative geographic accessibility of supermarkets, fast-food restaurants, or mass merchandisers was associated with changes in an individual's BMI over time. While policies that alter only geographic access to food outlets may promote equitable access to healthy food and improve nutrition, our findings suggest they will do little to combat obesity in adults. Project HOPE—The People-to-People Health Foundation, Inc.

  8. Atom-Role-Based Access Control Model

    NASA Astrophysics Data System (ADS)

    Cai, Weihong; Huang, Richeng; Hou, Xiaoli; Wei, Gang; Xiao, Shui; Chen, Yindong

    Role-based access control (RBAC) has been widely recognized as an efficient access control model and is currently a hot research topic in information security. However, in large-scale enterprise application environments, the traditional RBAC model based on the role hierarchy has the following deficiencies: Firstly, it cannot effectively represent role relationships in complicated cases, which does not accord with practical applications. Secondly, the senior role unconditionally inherits all permissions of the junior role; thus, if a user is under the supervisor role, he may accumulate all permissions, and this easily causes the abuse of permission and violates the least privilege principle, which is one of the main security principles. To deal with these problems, after analyzing permission types and role relationships, we propose the concept of the atom role and build an atom-role-based access control model, called ATRBAC, by dividing the permission set of each regular role according to inheritance path relationships. Through application-specific analysis, this model can well meet the access control requirements.
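
    A minimal sketch of the atom-role idea follows: a regular role's permission set is divided into smaller atom roles, and a senior role composes only the atoms it needs instead of unconditionally inheriting everything, supporting least privilege. The class design and permission names are invented for illustration and are not the paper's formal model.

    ```python
    # Illustrative atom-role composition: a senior role picks specific atoms
    # rather than inheriting a junior role's entire permission set.

    class AtomRole:
        def __init__(self, name, permissions):
            self.name, self.permissions = name, set(permissions)

    class Role:
        def __init__(self, name, atoms=(), inherits=()):
            self.name, self.atoms, self.inherits = name, list(atoms), list(inherits)

        def permissions(self):
            perms = set()
            for atom in self.atoms:              # directly assigned atoms
                perms |= atom.permissions
            for role in self.inherits:           # optional whole-role inheritance
                perms |= role.permissions()
            return perms

    read_tickets = AtomRole("read_tickets", {"ticket:read"})
    edit_tickets = AtomRole("edit_tickets", {"ticket:update", "ticket:close"})
    payroll      = AtomRole("payroll", {"salary:read"})

    clerk = Role("clerk", atoms=[read_tickets, edit_tickets])
    # Supervisor composes only the read atom of the clerk's duties, plus payroll:
    supervisor = Role("supervisor", atoms=[read_tickets, payroll])

    print(sorted(supervisor.permissions()))   # no ticket:update -> least privilege
    ```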

  9. Development of a database system for mapping insertional mutations onto the mouse genome with large-scale experimental data

    PubMed Central

    2009-01-01

    Background Insertional mutagenesis is an effective method for functional genomic studies in various organisms. It can rapidly generate easily tractable mutations. A large-scale insertional mutagenesis with the piggyBac (PB) transposon is currently performed in mice at the Institute of Developmental Biology and Molecular Medicine (IDM), Fudan University in Shanghai, China. This project is carried out via collaborations among multiple groups overseeing interconnected experimental steps and generates a large volume of experimental data continuously. Therefore, the project calls for an efficient database system for recording, management, statistical analysis, and information exchange. Results This paper presents a database application called MP-PBmice (insertional mutation mapping system of PB Mutagenesis Information Center), which was developed to serve the ongoing large-scale PB insertional mutagenesis project. The lightweight enterprise-level development framework Struts-Spring-Hibernate is used here to ensure constructive and flexible support to the application. The MP-PBmice database system has three major features: strict access control, efficient workflow control, and good expandability. It supports collaboration among different groups that enter data and exchange information on a daily basis, and is capable of providing real-time progress reports for the whole project. MP-PBmice can be easily adapted for other large-scale insertional mutation mapping projects, and the source code of this software is freely available at http://www.idmshanghai.cn/PBmice. Conclusion MP-PBmice is a web-based application for large-scale insertional mutation mapping onto the mouse genome, implemented with the widely used framework Struts-Spring-Hibernate. This system is already in use by the ongoing genome-wide PB insertional mutation mapping project at IDM, Fudan University. PMID:19958505

  10. The global palm oil sector must change to save biodiversity and improve food security in the tropics.

    PubMed

    Azhar, Badrul; Saadun, Norzanalia; Prideaux, Margi; Lindenmayer, David B

    2017-12-01

    Most palm oil currently available in global markets is sourced from certified large-scale plantations. Comparatively little is sourced from (typically uncertified) smallholders. We argue that sourcing sustainable palm oil should not be determined by commercial certification alone and that the certification process should be revisited. There are so-far unrecognized benefits of sourcing palm oil from smallholders that should be considered if genuine biodiversity conservation is to be a foundation of 'environmentally sustainable' palm oil production. Despite a lack of certification, smallholder production is often more biodiversity-friendly than certified production from large-scale plantations. Sourcing palm oil from smallholders also alleviates poverty among rural farmers, promoting better conservation outcomes. Yet, certification schemes - the current measure of 'sustainability' - are financially accessible only for large-scale plantations that operate as profit-driven monocultures. Industrial palm oil is expanding rapidly in regions with weak environmental laws and enforcement. This warrants the development of an alternative certification scheme for smallholders. Greater attention should be directed to deforestation-free palm oil production in smallholdings, where production is less likely to cause large scale biodiversity loss. These small-scale farmlands in which palm oil is mixed with other crops should be considered by retailers and consumers who are interested in promoting sustainable palm oil production. Simultaneously, plantation companies should be required to make their existing production landscapes more compatible with enhanced biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Large-Scale Distributed Computational Fluid Dynamics on the Information Power Grid Using Globus

    NASA Technical Reports Server (NTRS)

    Barnard, Stephen; Biswas, Rupak; Saini, Subhash; VanderWijngaart, Robertus; Yarrow, Maurice; Zechtzer, Lou; Foster, Ian; Larsson, Olle

    1999-01-01

    This paper describes an experiment in which a large-scale scientific application developed for tightly-coupled parallel machines is adapted to the distributed execution environment of the Information Power Grid (IPG). A brief overview of the IPG and a description of the computational fluid dynamics (CFD) algorithm are given. The Globus metacomputing toolkit is used as the enabling device for the geographically-distributed computation. Modifications related to latency hiding and load balancing were required for an efficient implementation of the CFD application in the IPG environment. Performance results on a pair of SGI Origin 2000 machines indicate that real scientific applications can be effectively implemented on the IPG; however, a significant amount of continued effort is required to make such an environment useful and accessible to scientists and engineers.

  12. Methane hydrates and the future of natural gas

    USGS Publications Warehouse

    Ruppel, Carolyn

    2011-01-01

    For decades, gas hydrates have been discussed as a potential resource, particularly for countries with limited access to conventional hydrocarbons or a strategic interest in establishing alternative, unconventional gas reserves. Methane has never been produced from gas hydrates at a commercial scale and, barring major changes in the economics of natural gas supply and demand, commercial production at a large scale is considered unlikely to commence within the next 15 years. Given the overall uncertainty still associated with gas hydrates as a potential resource, they have not been included in the EPPA model in MITEI’s Future of Natural Gas report. Still, gas hydrates remain a potentially large methane resource and must necessarily be included in any consideration of the natural gas supply beyond two decades from now.

  13. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems with gigabyte-scale databases by distributing the computation across multiple platforms. Until now, in developing bioinformatics grid applications, it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two bioinformatics benchmark applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.

  14. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Novel Route to Fabrication of Metal-Sandwiched Nanoscale Tapered Structures

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Yu, Da-Peng

    2009-08-01

    Tapered dielectric structures in metal have exhibited extraordinary performance in both surface plasmon polariton (SPP) waveguiding and SPP focusing, which is crucial to plasmonic research and industrial plasmonic device integration. We present a method that facilitates easy fabrication of smooth-surfaced sub-micron tapered structures at large scale using only electron beam lithography (EBL). When a PMMA layer is spin-coated on previously-EBL-defined PMMA structures, steep edges can be transformed into a declining slope to form tapered PMMA structures, scaled from 10 nm to 1000 nm. Despite the simplicity of the method, patterns with PMMA surface smoothness can be well-positioned and replicated in large numbers, giving scientists easy access to research on the properties of tapered structures.

  15. Strategy for large-scale isolation of enantiomers in drug discovery.

    PubMed

    Leek, Hanna; Thunberg, Linda; Jonson, Anna C; Öhlén, Kristina; Klarqvist, Magnus

    2017-01-01

    A strategy for large-scale chiral resolution is illustrated by the isolation of a pure enantiomer from a 5 kg batch. Results from supercritical fluid chromatography are presented and compared with normal-phase liquid chromatography. Solubility of the compound in the supercritical mobile phase was shown to be the limiting factor. To circumvent this, extraction injection was used but proved not to be efficient for this compound. Finally, a method for chiral resolution by crystallization was developed and applied to give a diastereomeric salt with an enantiomeric excess of 99% at a 91% yield. Direct access to a diverse separation toolbox is shown to be essential for solving separation problems in the most cost- and time-efficient way. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. On the linearity of tracer bias around voids

    NASA Astrophysics Data System (ADS)

    Pollina, Giorgia; Hamaus, Nico; Dolag, Klaus; Weller, Jochen; Baldi, Marco; Moscardini, Lauro

    2017-07-01

    The large-scale structure of the Universe can be observed only via luminous tracers of the dark matter. However, the clustering statistics of tracers are biased and depend on various properties, such as their host-halo mass and assembly history. On very large scales, this tracer bias results in a constant offset in the clustering amplitude, known as linear bias. Towards smaller non-linear scales, this is no longer the case and tracer bias becomes a complicated function of scale and time. We focus on tracer bias centred on cosmic voids, i.e. depressions of the density field that spatially dominate the Universe. We consider three types of tracers: galaxies, galaxy clusters and active galactic nuclei, extracted from the hydrodynamical simulation Magneticum Pathfinder. In contrast to common clustering statistics that focus on auto-correlations of tracers, we find that void-tracer cross-correlations are successfully described by a linear bias relation. The tracer-density profile of voids can thus be related to their matter-density profile by a single number. We show that it coincides with the linear tracer bias extracted from the large-scale auto-correlation function and expectations from theory, if sufficiently large voids are considered. For smaller voids we observe a shift towards higher values. This has important consequences for cosmological parameter inference, as the problem of unknown tracer bias is alleviated up to a constant number. The smallest scales in existing data sets become accessible to simpler models, providing numerous modes of the density field that have been disregarded so far, but may help to further reduce statistical errors in constraining cosmology.
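
    To make the linear bias relation described above concrete, a sketch in our own notation (not taken from the record): the claim is that the void-tracer cross-correlation is proportional to the void-matter cross-correlation with a single scale-independent coefficient.

```latex
% Hedged sketch of the linear bias relation around voids
% (notation ours, not the record's):
\xi_{\mathrm{vt}}(r) \simeq b_t\, \xi_{\mathrm{vm}}(r)
\quad\Longleftrightarrow\quad
\delta_t(r) \simeq b_t\, \delta_m(r),
% where \delta_t and \delta_m are the tracer- and matter-density
% profiles around void centres, and b_t is a single number that,
% for sufficiently large voids, matches the linear bias from the
% large-scale tracer auto-correlation function.
```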

  17. Influence of Distributed Residential Energy Storage on Voltage in Rural Distribution Network and Capacity Configuration

    NASA Astrophysics Data System (ADS)

    Liu, Lu; Tong, Yibin; Zhao, Zhigang; Zhang, Xuefen

    2018-03-01

    Large-scale access of distributed residential photovoltaics (PV) in rural areas has solved the voltage problem to a certain extent. However, due to the intermittency of PV and the particularity of rural residents' power load, the problem of low voltage during the evening peak remains to be resolved. This paper proposes to solve the problem by connecting residential energy storage. Firstly, the influence of the access location and capacity of energy storage on the voltage distribution in a rural distribution network is analyzed. Secondly, the relation between storage capacity and load capacity is derived for four typical load and energy storage cases in which the voltage deviation meets the demand. Finally, the optimal storage position and capacity are obtained using particle swarm optimization (PSO) and power flow simulation.
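
    The record names the optimization method (particle swarm optimization wrapped around a power-flow evaluation) but gives no formulation. As a rough, hedged illustration only, here is a minimal PSO loop in Python; the objective function is a toy stand-in for the authors' power-flow-based voltage-deviation criterion, and all parameter names and values are our placeholders.

```python
import numpy as np

# Toy stand-in for a power-flow run: returns a voltage-deviation score
# for a candidate (bus_location, storage_capacity). A real study would
# invoke a distribution-network power-flow solver here.
def voltage_deviation(position):
    bus, capacity = position
    return abs(np.sin(bus)) * 0.05 + 0.01 * abs(capacity - 50.0) / 50.0

def pso(n_particles=20, n_iters=100, bounds=((0.0, 10.0), (0.0, 100.0))):
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    rng = np.random.default_rng(0)
    x = rng.uniform(lo, hi, size=(n_particles, 2))      # particle positions
    v = np.zeros_like(x)                                # particle velocities
    pbest = x.copy()
    pbest_f = np.array([voltage_deviation(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia, cognitive and social weights
    for _ in range(n_iters):
        r1, r2 = rng.random((2, n_particles, 2))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([voltage_deviation(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

best, score = pso()
print(f"best (bus, capacity): {best}, deviation score: {score:.4f}")
```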

  18. Large-Scale ATP-Independent Nucleosome Unfolding by a Histone Chaperone

    PubMed Central

    Valieva, Maria E.; Armeev, Grigoriy A.; Kudryashova, Kseniya S.; Gerasimova, Nadezhda S.; Shaytan, Alexey K.; Kulaeva, Olga I.; McCullough, Laura L.; Formosa, Tim; Georgiev, Pavel G.; Kirpichnikov, Mikhail P.; Studitsky, Vasily M.; Feofanov, Alexey V.

    2017-01-01

    DNA accessibility to regulatory proteins is significantly affected by nucleosome structure and dynamics. FACT (facilitates chromatin transcription) increases the accessibility of nucleosomal DNA but the mechanism and extent of this nucleosome reorganization are unknown. We report here the effects of FACT on single nucleosomes revealed with spFRET microscopy. FACT binding results in a dramatic, ATP-independent, and reversible uncoiling of DNA that affects at least 70% of the DNA in a nucleosome. A mutated version of FACT is defective in this uncoiling, and a histone mutation that suppresses phenotypes caused by this FACT mutation in vivo restores the uncoiling activity in vitro. Thus FACT-dependent nucleosome unfolding modulates the accessibility of nucleosomal DNA, and this is an important function of FACT in vivo. PMID:27820806

  19. Robotic percutaneous access to the kidney: comparison with standard manual access.

    PubMed

    Su, Li-Ming; Stoianovici, Dan; Jarrett, Thomas W; Patriciu, Alexandru; Roberts, William W; Cadeddu, Jeffrey A; Ramakumar, Sanjay; Solomon, Stephen B; Kavoussi, Louis R

    2002-09-01

    To evaluate the efficiency, accuracy, and safety of robotic percutaneous access to the kidney (PAKY) for percutaneous nephrolithotomy in comparison with conventional manual techniques. We compared the intraoperative access variables (number of access attempts, time to successful access, estimated blood loss, complications) of 23 patients who underwent robotic PAKY with the remote center of motion device (PAKY-RCM) with the same data from a contemporaneous series of 23 patients who underwent conventional manual percutaneous access to the kidney. The PAKY-RCM incorporates a robotic arm and a friction transmission with axial loading system to accurately position and insert a standard 18-gauge needle percutaneously into the kidney. The blood loss during percutaneous access was estimated on a four-point scale (1 = minimal to 4 = large). The color of effluent urine was graded on a four-point scale (1 = clear to 4 = red). The mean target calix width was 13.5 +/- 9.2 mm in the robotic group and 12.2 +/- 4.5 mm in the manual group (P = 0.57). When comparing PAKY-RCM with standard manual techniques, the mean number of attempts was 2.2 +/- 1.6 v 3.2 +/- 2.5 (P = 0.14), time to access was 10.4 +/- 6.5 minutes v 15.1 +/- 8.8 minutes (P = 0.06), estimated blood loss score was 1.3 +/- 0.49 v 1.7 +/- 0.66 (P = 0.14), and color of effluent urine following access was 2.0 +/- 0.90 v 2.1 +/- 0.7 (P = 0.82). The PAKY-RCM was successful in obtaining access in 87% (20 of 23) of cases. The other three patients (13%) required conversion to manual techniques. There were no major intraoperative complications in either group. Robotic PAKY is a feasible, safe, and efficacious method of obtaining renal access for nephrolithotomy. The number of attempts and time to access were comparable to those of standard manual percutaneous access techniques. These findings provide the groundwork for the development of a completely automated robot-assisted percutaneous renal access device.

  20. Data-based discharge extrapolation: estimating annual discharge for a partially gauged large river basin from its small sub-basins

    NASA Astrophysics Data System (ADS)

    Gong, L.

    2013-12-01

    Large-scale hydrological models and land surface models are by far the only tools for assessing future water resources in climate change impact studies. Those models estimate discharge with large uncertainties, due to the complex interaction between climate and hydrology, the limited quality and availability of data, as well as model uncertainties. A new, purely data-based scale-extrapolation method is proposed to estimate water resources for a large basin solely from selected small sub-basins, which are typically two orders of magnitude smaller than the large basin. Those small sub-basins contain sufficient information, not only on climate and land surface, but also on hydrological characteristics for the large basin. In the Baltic Sea drainage basin, the best discharge estimation for the gauged area was achieved with sub-basins that cover 2-4% of the gauged area. There exist multiple sets of sub-basins that resemble the climate and hydrology of the basin equally well. Those multiple sets estimate annual discharge for the gauged area consistently well, with 5% average error. The scale-extrapolation method is completely data-based; therefore it does not force any modelling error into the prediction. The multiple predictions are expected to bracket the inherent variations and uncertainties of the climate and hydrology of the basin. The method can be applied in both un-gauged basins and un-gauged periods with uncertainty estimation.
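
    The arithmetic at the heart of such a scale extrapolation is simple area scaling of specific runoff; the sketch below is our own illustration under that assumption (the paper's sub-basin selection criteria are more elaborate than a plain average, and all numbers are invented).

```python
import numpy as np

# Illustrative sub-basin data (not from the Baltic Sea study):
# annual mean discharge Q (m^3/s) and drainage area A (km^2).
sub_Q = np.array([12.0, 8.5, 15.2, 6.1])
sub_A = np.array([900.0, 650.0, 1100.0, 480.0])
large_A = 1.6e6  # gauged area of the large basin (km^2), placeholder

# Specific runoff (discharge per unit area) for each small sub-basin.
specific_runoff = sub_Q / sub_A

# If the selected sub-basins resemble the climate and hydrology of the
# whole basin, its discharge is roughly mean specific runoff times area.
Q_large = specific_runoff.mean() * large_A
print(f"extrapolated annual discharge: {Q_large:.0f} m^3/s")

# Repeating this over multiple equally plausible sub-basin sets yields
# an ensemble of estimates that brackets the uncertainty.
```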

  1. Large-scale geographic variation in distribution and abundance of Australian deep-water kelp forests.

    PubMed

    Marzinelli, Ezequiel M; Williams, Stefan B; Babcock, Russell C; Barrett, Neville S; Johnson, Craig R; Jordan, Alan; Kendrick, Gary A; Pizarro, Oscar R; Smale, Dan A; Steinberg, Peter D

    2015-01-01

    Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are main habitat-formers in temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution along the continent of these kelp forests, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia's Integrated Marine Observing System (IMOS) to survey 157,000 m2 of seabed, of which ca 13,000 m2 were used to quantify kelp covers at multiple spatial scales (10-100 m to 100-1,000 km) and depths (15-60 m) across several regions ca 2-6° latitude apart along the East and West coasts of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. The maximum depth of kelp occurrence was 40-50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves.

  2. The Ensembl REST API: Ensembl Data for Any Language.

    PubMed

    Yates, Andrew; Beal, Kathryn; Keenan, Stephen; McLaren, William; Pignatelli, Miguel; Ritchie, Graham R S; Ruffier, Magali; Taylor, Kieron; Vullo, Alessandro; Flicek, Paul

    2015-01-01

    We present a Web service to access Ensembl data using Representational State Transfer (REST). The Ensembl REST server enables the easy retrieval of a wide range of Ensembl data by most programming languages, using standard formats such as JSON and FASTA while minimizing client work. We also introduce bindings to the popular Ensembl Variant Effect Predictor tool permitting large-scale programmatic variant analysis independent of any specific programming language. The Ensembl REST API can be accessed at http://rest.ensembl.org and source code is freely available under an Apache 2.0 license from http://github.com/Ensembl/ensembl-rest. © The Author 2014. Published by Oxford University Press.
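
    For illustration, a minimal Python client for the service described above; the lookup route shown follows the Ensembl REST documentation at rest.ensembl.org, but endpoint paths, rate limits, and response fields should be checked against the current docs before relying on this sketch.

```python
import requests

SERVER = "https://rest.ensembl.org"

def lookup_symbol(species: str, symbol: str) -> dict:
    """Resolve a gene symbol to an Ensembl record, returned as JSON."""
    resp = requests.get(
        f"{SERVER}/lookup/symbol/{species}/{symbol}",
        headers={"Content-Type": "application/json"},
    )
    resp.raise_for_status()
    return resp.json()

gene = lookup_symbol("homo_sapiens", "BRCA2")
print(gene["id"], gene["seq_region_name"], gene["start"], gene["end"])
```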

  3. [Status of libraries and databases for natural products at abroad].

    PubMed

    Zhao, Li-Mei; Tan, Ning-Hua

    2015-01-01

    Because natural products are one of the important sources for drug discovery, libraries and databases of natural products are significant for the development and research of natural products. At present, most compound libraries abroad consist of synthetic or combinatorially synthesized molecules, making access to natural products difficult; and because information on natural products is scattered across different standards, it is difficult to construct convenient, comprehensive and large-scale databases for natural products. This paper reviews the status of currently accessible libraries and databases for natural products abroad and provides some important information for the development of natural product libraries and databases.

  4. Achieving online consent to participation in large-scale gene-environment studies: a tangible destination.

    PubMed

    Wood, Fiona; Kowalczuk, Jenny; Elwyn, Glyn; Mitchell, Clive; Gallacher, John

    2011-08-01

    Population based genetics studies are dependent on large numbers of individuals in the pursuit of small effect sizes. Recruiting and consenting a large number of participants is both costly and time consuming. We explored whether an online consent process for large-scale genetics studies is acceptable for prospective participants using an example online genetics study. We conducted semi-structured interviews with 42 members of the public stratified by age group, gender and newspaper readership (a measure of social status). Respondents were asked to use a website designed to recruit for a large-scale genetic study. After using the website a semi-structured interview was conducted to explore opinions and any issues they would have. Responses were analysed using thematic content analysis. The majority of respondents said they would take part in the research (32/42). Those who said they would decline to participate saw fewer benefits from the research, wanted more information and expressed a greater number of concerns about the study. Younger respondents had concerns over time commitment. Middle aged respondents were concerned about privacy and security. Older respondents were more altruistic in their motivation to participate. Common themes included trust in the authenticity of the website, security of personal data, curiosity about their own genetic profile, operational concerns and a desire for more information about the research. Online consent to large-scale genetic studies is likely to be acceptable to the public. The online consent process must establish trust quickly and effectively by asserting authenticity and credentials, and provide access to a range of information to suit different information preferences.

  5. Vaccinium meridionale Swartz Supercritical CO₂ Extraction: Effect of Process Conditions and Scaling Up.

    PubMed

    López-Padilla, Alexis; Ruiz-Rodriguez, Alejandro; Restrepo Flórez, Claudia Estela; Rivero Barrios, Diana Marsela; Reglero, Guillermo; Fornari, Tiziana

    2016-06-25

    Vaccinium meridionale Swartz (Mortiño or Colombian blueberry) is one of the Vaccinium species abundantly found across the Colombian mountains, which are characterized by high contents of polyphenolic compounds (anthocyanins and flavonoids). The supercritical fluid extraction (SFE) of Vaccinium species has mainly focused on the study of V. myrtillus L. (blueberry). In this work, the SFE of Mortiño fruit from Colombia was studied in a small-scale extraction cell (273 cm³) and different extraction pressures (20 and 30 MPa) and temperatures (313 and 343 K) were investigated. Then, process scaling-up to a larger extraction cell (1350 cm³) was analyzed using well-known semi-empirical engineering approaches. The Broken and Intact Cell (BIC) model was adjusted to represent the kinetic behavior of the low-scale extraction and to simulate the large-scale conditions. Extraction yields obtained were in the range 0.1%-3.2%. Most of the Mortiño solutes are readily accessible and, thus, 92% of the extractable material was recovered in around 30 min. The constant CO₂ residence time criterion produced excellent results regarding the small-scale kinetic curve according to the BIC model, and this conclusion was experimentally validated in large-scale kinetic experiments.
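
    The record refers to the Broken and Intact Cell (BIC) model without reproducing its equations. As a deliberately simplified caricature of the two-stage kinetics it implies (fast washing of readily accessible solute, then slow diffusion-limited extraction from intact cells), one might write, in our own notation rather than the paper's:

```latex
% Two-compartment caricature of a BIC-type extraction curve
% (our simplification, not the model actually fitted in the paper):
E(t) = x_b\left(1 - e^{-k_b t}\right) + x_i\left(1 - e^{-k_i t}\right),
% x_b, x_i: extractable solute held in broken and intact cells;
% k_b >> k_i: the corresponding rate constants. The dominant fast
% term is consistent with ~92% of the extractable material being
% recovered in around 30 min.
```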

  6. Vaccinium meridionale Swartz Supercritical CO2 Extraction: Effect of Process Conditions and Scaling Up

    PubMed Central

    López-Padilla, Alexis; Ruiz-Rodriguez, Alejandro; Restrepo Flórez, Claudia Estela; Rivero Barrios, Diana Marsela; Reglero, Guillermo; Fornari, Tiziana

    2016-01-01

    Vaccinium meridionale Swartz (Mortiño or Colombian blueberry) is one of the Vaccinium species abundantly found across the Colombian mountains, which are characterized by high contents of polyphenolic compounds (anthocyanins and flavonoids). The supercritical fluid extraction (SFE) of Vaccinium species has mainly focused on the study of V. myrtillus L. (blueberry). In this work, the SFE of Mortiño fruit from Colombia was studied in a small-scale extraction cell (273 cm3) and different extraction pressures (20 and 30 MPa) and temperatures (313 and 343 K) were investigated. Then, process scaling-up to a larger extraction cell (1350 cm3) was analyzed using well-known semi-empirical engineering approaches. The Broken and Intact Cell (BIC) model was adjusted to represent the kinetic behavior of the low-scale extraction and to simulate the large-scale conditions. Extraction yields obtained were in the range 0.1%–3.2%. Most of the Mortiño solutes are readily accessible and, thus, 92% of the extractable material was recovered in around 30 min. The constant CO2 residence time criterion produced excellent results regarding the small-scale kinetic curve according to the BIC model, and this conclusion was experimentally validated in large-scale kinetic experiments. PMID:28773640

  7. WikiPEATia - a web based platform for assembling peatland data through ‘crowd sourcing’

    NASA Astrophysics Data System (ADS)

    Wisser, D.; Glidden, S.; Fieseher, C.; Treat, C. C.; Routhier, M.; Frolking, S. E.

    2009-12-01

    The Earth System Science community is realizing that peatlands are an important and unique terrestrial ecosystem that has not yet been well-integrated into large-scale earth system analyses. A major hurdle is the lack of accessible, geospatial data on peatland distribution, coupled with data on peatland properties (e.g., vegetation composition, peat depth, basal dates, soil chemistry, peatland class) at the global scale. These data, however, are available at the local scale. Although a comprehensive global database on peatlands probably lags similar data on more economically important ecosystems such as forests, grasslands, and croplands, a large amount of field data has been collected over the past several decades. A few efforts have been made to map peatlands at large scales, but existing data have not been assembled into a single geospatial database that is publicly accessible, or do not depict data with the level of detail needed by the Earth System Science community. A global peatland database would contribute to advances in a number of research fields such as hydrology, vegetation and ecosystem modeling, permafrost modeling, and earth system modeling. We present a Web 2.0 approach that uses state-of-the-art webserver and innovative online mapping technologies and is designed to create such a global database through ‘crowd-sourcing’. Primary functions of the online system include form-driven textual user input of peatland research metadata, spatial data input of peatland areas via a mapping interface, database editing and querying capabilities, as well as advanced visualization and data analysis tools. WikiPEATia provides an integrated information technology platform for assembling, integrating, and posting peatland-related geospatial datasets that facilitates and encourages research community involvement. A successful effort will make existing peatland data much more useful to the research community, and will help to identify significant data gaps.

  8. Establishing the Role and Impact of Academic Librarians in Supporting Open Research: A Case Study at Leeds Beckett University, UK

    ERIC Educational Resources Information Center

    Bower, Kirsty; Sheppard, Nick; Bayjoo, Jennifer; Pease, Adele

    2017-01-01

    This practical article presents findings of a small scale study undertaken at a large U.K. University. The purpose of the study was to encourage academic engagement with Open Access (OA) and the Higher Education Funding Council for England (HEFCE) mandate with the measurable impact being increased engagement with the Repository and dissemination…

  9. A Holistic Redundancy- and Incentive-Based Framework to Improve Content Availability in Peer-to-Peer Networks

    ERIC Educational Resources Information Center

    Herrera-Ruiz, Octavio

    2012-01-01

    Peer-to-Peer (P2P) technology has emerged as an important alternative to the traditional client-server communication paradigm to build large-scale distributed systems. P2P enables the creation, dissemination and access to information at low cost and without the need of dedicated coordinating entities. However, existing P2P systems fail to provide…

  10. Genetic diversity and genetic structure of Persian walnut (Juglans regia) accessions from 14 European, African, and Asian countries using SSR markers

    Treesearch

    Aziz Ebrahimi; Abdolkarim Zarei; Shaneka Lawson; Keith E. Woeste; M. J. M. Smulders

    2016-01-01

    Persian walnut (Juglans regia L.) is the world's most widely grown nut crop, but large-scale assessments and comparisons of the genetic diversity of the crop are notably lacking. To guide the conservation and utilization of Persian walnut genetic resources, genotypes (n = 189) from 25 different regions in 14 countries on...

  11. Soldier Data Tag Study Effort.

    DTIC Science & Technology

    1985-06-10

    ...interested in protecting it. The tag itself is difficult--though not impossible--to counterfeit. Also, it potentially improves the data... The excerpt lists threats considered by the study: attacks during the design, manufacture, and distribution processes; counterfeiting; unauthorized access/alteration of tag data; use of the tag to...; hijacking of SDT system shipments or large-scale counterfeiting of SDT systems; and unauthorized alteration...

  12. Post Graduations in Technologies and Computing Applied to Education: From F2F Classes to Multimedia Online Open Courses

    ERIC Educational Resources Information Center

    Marques, Bertil P.; Carvalho, Piedade; Escudeiro, Paula; Barata, Ana; Silva, Ana; Queiros, Sandra

    2017-01-01

    Promoted by the significant increase of large-scale internet access, many audiences have turned to the web and to its resources for learning and inspiration, with diverse sets of skills and intents. In this context, Multimedia Online Open Courses (MOOC) consist of learning models supported on user-friendly web tools that allow anyone with minimum…

  13. Statistical Literacy in Data Revolution Era: Building Blocks and Instructional Dilemmas

    ERIC Educational Resources Information Center

    Prodromou, Theodosia; Dunne, Tim

    2017-01-01

    The data revolution has given citizens access to enormous large-scale open databases. In order to take into account the full complexity of data, we have to change the way we think in terms of the nature of data and its availability, the ways in which it is displayed and used, and the skills that are required for its interpretation. Substantial…

  14. A user-defined data type for the storage of time series data allowing efficient similarity screening.

    PubMed

    Sorokin, Anatoly; Selkov, Gene; Goryanin, Igor

    2012-07-16

    The volume of the experimentally measured time series data is rapidly growing, while storage solutions offering better data types than simple arrays of numbers or opaque blobs for keeping series data are sorely lacking. A number of indexing methods have been proposed to provide efficient access to time series data, but none has so far been integrated into a tried-and-proven database system. To explore the possibility of such integration, we have developed a data type for time series storage in PostgreSQL, an object-relational database system, and equipped it with an access method based on SAX (Symbolic Aggregate approXimation). This new data type has been successfully tested in a database supporting a large-scale plant gene expression experiment, and it was additionally tested on a very large set of simulated time series data. Copyright © 2011 Elsevier B.V. All rights reserved.
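
    To make the SAX indexing idea concrete, here is a small self-contained Python sketch of the published SAX algorithm in general (our illustration, not the PostgreSQL extension's code): z-normalize the series, compress it by piecewise aggregate approximation (PAA), then map each segment mean to a symbol using breakpoints that split the standard normal distribution into equiprobable regions.

```python
import numpy as np

# Breakpoints dividing N(0,1) into four equiprobable regions
# (i.e., an alphabet of size 4).
BREAKPOINTS = np.array([-0.6745, 0.0, 0.6745])
ALPHABET = "abcd"

def sax(series, n_segments):
    """Symbolic Aggregate approXimation of a 1-D time series."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / x.std()                 # z-normalize
    segments = np.array_split(x, n_segments)     # PAA segmentation
    paa = np.array([seg.mean() for seg in segments])
    # Each segment mean falls into one equiprobable region -> one symbol.
    return "".join(ALPHABET[i] for i in np.searchsorted(BREAKPOINTS, paa))

# Similar series map to similar words, enabling cheap similarity screening.
print(sax([1, 2, 3, 4, 10, 12, 11, 9, 3, 2, 1, 0], n_segments=4))
```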

  15. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in the life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889

  16. OASIS: A Data Fusion System Optimized for Access to Distributed Archives

    NASA Astrophysics Data System (ADS)

    Berriman, G. B.; Kong, M.; Good, J. C.

    2002-05-01

    The On-Line Archive Science Information Services (OASIS) is accessible as a java applet through the NASA/IPAC Infrared Science Archive home page. It uses Geographical Information System (GIS) technology to provide data fusion and interaction services for astronomers. These services include the ability to process and display arbitrarily large image files, and user-controlled contouring, overlay regeneration and multi-table/image interactions. OASIS has been optimized for access to distributed archives and data sets. Its second release (June 2002) provides a mechanism that enables access to OASIS from "third-party" services and data providers. That is, any data provider who creates a query form to an archive containing a collection of data (images, catalogs, spectra) can direct the result files from the query into OASIS. Similarly, data providers who serve links to datasets or remote services on a web page can access all of these data with one instance of OASIS. In this way, any data or service provider is given access to the full suite of capabilities of OASIS. We illustrate the "third-party" access feature with two examples: queries to the high-energy image datasets accessible from GSFC SkyView, and links to data that are returned from a target-based query to the NASA Extragalactic Database (NED). The second release of OASIS also includes a file-transfer manager that reports the status of multiple data downloads from remote sources to the client machine. It is a prototype for a request management system that will ultimately control and manage compute-intensive jobs submitted through OASIS to computing grids, such as requests for large-scale image mosaics and bulk statistical analysis.

  17. Genome-scale approaches to the epigenetics of common human disease

    PubMed Central

    2011-01-01

    Traditionally, the pathology of human disease has been focused on microscopic examination of affected tissues, chemical and biochemical analysis of biopsy samples, other available samples of convenience, such as blood, and noninvasive or invasive imaging of varying complexity, in order to classify disease and illuminate its mechanistic basis. The molecular age has complemented this armamentarium with gene expression arrays and selective analysis of individual genes. However, we are entering a new era of epigenomic profiling, i.e., genome-scale analysis of cell-heritable nonsequence genetic change, such as DNA methylation. The epigenome offers access to stable measurements of cellular state and to biobanked material for large-scale epidemiological studies. Some of these genome-scale technologies are beginning to be applied to create the new field of epigenetic epidemiology. PMID:19844740

  18. Experiment-scale molecular simulation study of liquid crystal thin films

    NASA Astrophysics Data System (ADS)

    Nguyen, Trung Dac; Carrillo, Jan-Michael Y.; Matheson, Michael A.; Brown, W. Michael

    2014-03-01

    Supercomputers have now reached a performance level adequate for studying thin films with molecular detail at the relevant scales. By exploiting the power of GPU accelerators on Titan, we have been able to perform simulations of characteristic liquid crystal films that provide remarkable qualitative agreement with experimental images. We have demonstrated that key features of spinodal instability can only be observed with sufficiently large system sizes, which were not accessible with previous simulation studies. Our study emphasizes the capability and significance of petascale simulations in providing molecular-level insights in thin film systems as well as other interfacial phenomena.

  19. Analysis and optimization of gyrokinetic toroidal simulations on homogenous and heterogenous platforms

    DOE PAGES

    Ibrahim, Khaled Z.; Madduri, Kamesh; Williams, Samuel; ...

    2013-07-18

    The Gyrokinetic Toroidal Code (GTC) uses the particle-in-cell method to efficiently simulate plasma microturbulence. This paper presents novel analysis and optimization techniques to enhance the performance of GTC on large-scale machines. We introduce cell access analysis to better manage locality vs. synchronization tradeoffs on CPU- and GPU-based architectures. Our optimized hybrid parallel implementation of GTC uses MPI, OpenMP, and NVIDIA CUDA; achieves up to a 2× speedup over the reference Fortran version on multiple parallel systems; and scales efficiently to tens of thousands of cores.

  20. Lagrangian velocity and acceleration correlations of large inertial particles in a closed turbulent flow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Machicoane, Nathanaël; Volk, Romain

    We investigate the response of large inertial particles to turbulent fluctuations in an inhomogeneous and anisotropic flow. We conduct a Lagrangian study using particles both heavier and lighter than the surrounding fluid, and whose diameters are comparable to the flow integral scale. Both velocity and acceleration correlation functions are analyzed to compute the Lagrangian integral time and the acceleration time scale of such particles. Knowing how size and density affect these time scales is crucial for understanding particle dynamics and may permit modeling with two-time stochastic processes (for instance, Sawford's). As particles are tracked over long times in the quasi-totality of a closed flow, the mean flow influences their behaviour and also biases the velocity time statistics, in particular the velocity correlation functions. By using a method that allows for the computation of turbulent velocity trajectories, we can obtain unbiased Lagrangian integral times. This is particularly useful in accessing the scale separation for such particles and comparing it to the case of fluid particles in a similar configuration.

  1. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures, to provide efficient Web-based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data volume, higher-order data products, and the user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus those that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high-performance disk storage (SSD) for the hot areas and less expensive, slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud-based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute-intensive workloads like parallel computation of hydrologic routing on high-resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user community come new requests for algorithms and processing capabilities. To address this demand, OT is developing an extensible service-based architecture for integrating community-developed software. This "pluggable" approach to Web service deployment will enable new processing and analysis tools to run collocated with OT-hosted data.

  2. Influence of socio-economic status on access to different components of SCI management across Indian population.

    PubMed

    Chhabra, H S; Bhalla, A M

    2015-11-01

    To assess the influence of financial constraints on access to different components of spinal cord injury (SCI) management in various socio-economic strata of the Indian population. Indian Spinal Injuries Centre (ISIC). One hundred fifty SCI individuals who came for follow-up at ISIC between March 2009 and March 2013 with at least 1 year of community exposure after discharge were included in the study. Socio-economic classification was carried out according to the Kuppuswamy scale, a standard scale for the Indian population. A self-designed questionnaire was administered. No sample was available from the lower group. There was a statistically significant difference (P<0.05) for the levels of difficulty perceived by different socio-economic groups in accessing different components of SCI management. Aided upper lower group was dependent on welfare schemes for in-hospital treatment but could not access other components of management once discharged. Unaided upper lower group either faced severe difficulty or could not access management. Majority of lower middle group faced severe difficulty. Upper middle group was equally divided into facing severe, moderate or no difficulty. Most patients in the upper group faced no difficulty, whereas some faced moderate and a small number of severe difficulty. Financial constraints affected all components of SCI management in all except the upper group. The results of the survey suggest that a very large percentage of the Indian population would find it difficult to access comprehensive SCI management and advocate extension of essential medical coverage to unaided upper lower, lower middle and upper middle groups.

  3. Moving contact lines on vibrating surfaces

    NASA Astrophysics Data System (ADS)

    Solomenko, Zlatko; Spelt, Peter; Scott, Julian

    2017-11-01

    Large-scale simulation of flows with moving contact lines under realistic conditions generally requires a subgrid-scale model (based on matched-asymptotics analyses) to account for the unresolved part of the flow, given the large range of length scales involved near contact lines. Existing models for the interface shape in the contact-line region are primarily for steady flows on homogeneous substrates, with encouraging results in 3D simulations. Introduction of complexities would, however, require further investigation of the contact-line region. Here we study flows with moving contact lines on planar substrates subject to vibrations, with applications in controlling wetting/dewetting. The challenge here is to determine the change in interface shape near contact lines due to vibrations. To develop further insight, 2D direct numerical simulations (wherein the flow is resolved down to an imposed slip length) have been performed to enable comparison with asymptotic theory, which is also developed further. Perspectives will also be presented on the final objective of the work, which is to develop a subgrid-scale model that can be utilized in large-scale simulations. The authors gratefully acknowledge the ANR for financial support (ANR-15-CE08-0031) and the meso-centre FLMSN for use of computational resources. This work was granted access to the HPC resources of CINES under the allocation A0012B06893 made by GENCI.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Kelin; Zhao, Zhixiong; Wei, Guo-Wei, E-mail: wei@math.msu.edu

    Although persistent homology has emerged as a promising tool for the topological simplification of complex data, it is computationally intractable for large datasets. We introduce multiresolution persistent homology to handle excessively large datasets. We match the resolution with the scale of interest so as to represent large scale datasets with appropriate resolution. We utilize flexibility-rigidity index to access the topological connectivity of the data set and define a rigidity density for the filtration analysis. By appropriately tuning the resolution of the rigidity density, we are able to focus the topological lens on the scale of interest. The proposed multiresolution topological analysis is validated by a hexagonal fractal image which has three distinct scales. We further demonstrate the proposed method for extracting topological fingerprints from DNA molecules. In particular, the topological persistence of a virus capsid with 273 780 atoms is successfully analyzed, which would otherwise be inaccessible to the normal point cloud method and unreliable by using coarse-grained multiscale persistent homology. The proposed method has also been successfully applied to protein domain classification, which is, to our knowledge, the first time persistent homology has been used for practical protein domain analysis. The proposed multiresolution topological method has potential applications in arbitrary data sets, such as social networks, biological networks, and graphs.
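
    As a loose illustration of the resolution-tuning idea (our sketch, not the paper's flexibility-rigidity index), the resolution can be identified with the bandwidth of a kernel density over a point cloud, with persistence computed on the resulting grid as a cubical complex; the gudhi package used below is an assumption, not the authors' software.

```python
import numpy as np
import gudhi  # assumed available: pip install gudhi

def rigidity_like_density(points, eta, grid_size=64):
    """Gaussian-kernel density on a grid; eta plays the resolution role."""
    xs = np.linspace(-1.0, 1.0, grid_size)
    gx, gy = np.meshgrid(xs, xs)
    grid = np.stack([gx.ravel(), gy.ravel()], axis=1)
    d2 = ((grid[:, None, :] - points[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / eta**2).sum(axis=1).reshape(grid_size, grid_size)

rng = np.random.default_rng(1)
points = rng.uniform(-1.0, 1.0, size=(200, 2))

for eta in (0.05, 0.2, 0.8):  # fine -> coarse resolution
    density = rigidity_like_density(points, eta)
    # Sublevel-set filtration of -density reveals dense regions first.
    cc = gudhi.CubicalComplex(top_dimensional_cells=-density)
    diagram = cc.persistence()
    n_loops = sum(1 for dim, _ in diagram if dim == 1)
    print(f"eta={eta}: {n_loops} one-dimensional features")
```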

  5. Community health worker programs in India: a rights-based review.

    PubMed

    Bhatia, Kavita

    2014-09-01

    This article presents a historical review of national community health worker (CHW) programs in India using a gender- and rights-based lens. The aim is to derive policy implications relevant to stemming attrition and sustaining large-scale CHW programs. For the literature review, relevant government policies, minutes of meetings, reports, newspaper articles and statistics were accessed through official websites, and a hand search was conducted for studies on the rights-based aspects of large-scale CHW programs. The analysis shows that the CHWs in three successive Indian national CHW programs have consistently asked for reforms in their service conditions, including increased remuneration. Despite an evolution in stakeholder perspectives regarding the rights of CHWs, service reforms have been slow. Performance-based payments do not provide the financial security expected by CHWs, as demonstrated in the recent Accredited Social Health Activist (ASHA) program. In most countries, CHWs, who are largely women, have never been integrated into the established, salaried team of health system workers. The two hallmark characteristics of CHWs, namely their volunteer status and the flexibility of their tasks and timings, impede their rights. The consequences of initiating or neglecting standardization should be considered by all countries with large-scale CHW programs like the ASHA program. © Royal Society for Public Health 2014.

  6. Optimal distribution of medical backpacks and health surveillance assistants in Malawi.

    PubMed

    Kunkel, Amber G; Van Itallie, Elizabeth S; Wu, Duo

    2014-09-01

    Despite recent progress, Malawi continues to perform poorly on key health indicators such as child mortality and life expectancy. These problems are exacerbated by a severe lack of access to health care. Health Surveillance Assistants (HSAs) help bridge this gap by providing community-level access to basic health care services. However, the success of these HSAs is limited by a lack of supplies and long distances between HSAs and patients. To address this issue, we used large-scale weighted p-median and capacitated facility location problems to create a scalable, three-tiered plan for optimal allocation of HSAs, HSA designated medical backpacks, and backpack resupply centers. Our analysis uses real data on the location and characteristics of hospitals, health centers, and the general population. In addition to offering specific recommendations for HSA, backpack, and resupply center locations, it provides general insights into the scope of the proposed HSA backpack program scale-up. In particular, it demonstrates the importance of local health centers to the resupply network. The proposed assignments are robust to changes in the underlying population structure, and could significantly improve access to medical supplies for both HSAs and patients.
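
    For readers unfamiliar with the weighted p-median problem underlying the study, a minimal formulation in Python with PuLP is sketched below; the distances, weights, and instance size are toy placeholders, and the study's actual models additionally handle capacities and three allocation tiers.

```python
import pulp

# Toy instance: 3 candidate HSA sites, 4 population points.
dist = [[2, 5, 9, 4], [6, 3, 2, 7], [8, 1, 4, 3]]   # dist[site][point]
weight = [30, 10, 25, 15]                           # population demand
p = 2                                               # HSAs to place

prob = pulp.LpProblem("weighted_p_median", pulp.LpMinimize)
open_ = pulp.LpVariable.dicts("open", range(3), cat="Binary")
assign = pulp.LpVariable.dicts(
    "assign", [(i, j) for i in range(3) for j in range(4)], cat="Binary")

# Objective: minimize total population-weighted travel distance.
prob += pulp.lpSum(weight[j] * dist[i][j] * assign[(i, j)]
                   for i in range(3) for j in range(4))
for j in range(4):      # every population point is served exactly once
    prob += pulp.lpSum(assign[(i, j)] for i in range(3)) == 1
for i in range(3):      # ... and only by an opened site
    for j in range(4):
        prob += assign[(i, j)] <= open_[i]
prob += pulp.lpSum(open_[i] for i in range(3)) == p   # place exactly p HSAs

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("open sites:", [i for i in range(3) if open_[i].value() == 1])
```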

  7. Translational bioinformatics in the cloud: an affordable alternative

    PubMed Central

    2010-01-01

    With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073

  8. Advanced Multidimensional Separations in Mass Spectrometry: Navigating the Big Data Deluge

    PubMed Central

    May, Jody C.; McLean, John A.

    2017-01-01

    Hybrid analytical instruments constructed around mass spectrometry (MS) are becoming the preferred techniques for addressing many grand challenges in science and medicine. From the omics sciences to drug discovery and synthetic biology, multidimensional separations based on MS provide the high peak capacity and high measurement throughput necessary to obtain large-scale measurements that are used to infer systems-level information. In this review, we describe multidimensional MS configurations as technologies that are big data drivers and discuss some new and emerging strategies for mining information from large-scale datasets. A discussion is included on the information content that can be obtained from individual dimensions, as well as the unique information that can be derived by comparing different levels of data. Finally, we discuss some emerging data visualization strategies that seek to make highly dimensional datasets both accessible and comprehensible. PMID:27306312

  9. The Large Scale Distribution of Water Ice in the Polar Regions of the Moon

    NASA Astrophysics Data System (ADS)

    Jordan, A.; Wilson, J. K.; Schwadron, N.; Spence, H. E.

    2017-12-01

    For in situ resource utilization, one must know where water ice is on the Moon. Many datasets have revealed both surface deposits of water ice and subsurface deposits of hydrogen near the lunar poles, but it has proved difficult to resolve the differences among the locations of these deposits. Despite these datasets disagreeing on how deposits are distributed on small scales, we show that most of these datasets do agree on the large scale distribution of water ice. We present data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), LRO's Lunar Exploration Neutron Detector (LEND), the Neutron Spectrometer on Lunar Prospector (LPNS), LRO's Lyman Alpha Mapping Project (LAMP), LRO's Lunar Orbiter Laser Altimeter (LOLA), and Chandrayaan-1's Moon Mineralogy Mapper (M3). All, including those that show clear evidence for water ice, reveal surprisingly similar trends with latitude, suggesting that both surface and subsurface datasets are measuring ice. All show that water ice increases towards the poles, and most demonstrate that its signature appears at about ±70° latitude and increases poleward. This is consistent with simulations of how surface and subsurface cold traps are distributed with latitude. This large scale agreement constrains the origin of the ice, suggesting that an ancient cometary impact (or impacts) created a large scale deposit that has been rendered locally heterogeneous by subsequent impacts. Furthermore, it also shows that water ice may be available down to ±70°—latitudes that are more accessible than the poles for landing.

  10. Performance of an MPI-only semiconductor device simulator on a quad socket/quad core InfiniBand platform.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shadid, John Nicolas; Lin, Paul Tinphone

    2009-01-01

    This preliminary study considers the scaling and performance of a finite element (FE) semiconductor device simulator on a capacity cluster with 272 compute nodes based on a homogeneous multicore node architecture utilizing 16 cores per node. The inter-node communication backbone for this Tri-Lab Linux Capacity Cluster (TLCC) machine is comprised of an InfiniBand interconnect. The nonuniform memory access (NUMA) nodes consist of 2.2 GHz quad socket/quad core AMD Opteron processors. The performance results for this study are obtained with a FE semiconductor device simulation code (Charon) that is based on a fully-coupled Newton-Krylov solver with domain decomposition and multilevel preconditioners. Scaling and multicore performance results are presented for large-scale problems of 100+ million unknowns on up to 4096 cores. A parallel scaling comparison is also presented with the Cray XT3/4 Red Storm capability platform. The results indicate that an MPI-only programming model for utilizing the multicore nodes is reasonably efficient on all 16 cores per compute node. However, the results also indicate that the multilevel preconditioner, which is critical for large-scale capability-type simulations, scales better on the Red Storm machine than on the TLCC machine.

  11. Integrating complexity into data-driven multi-hazard supply chain network strategies

    USGS Publications Warehouse

    Long, Suzanna K.; Shoberg, Thomas G.; Ramachandran, Varun; Corns, Steven M.; Carlo, Hector J.

    2013-01-01

    Major strategies in the wake of a large-scale disaster have focused on short-term emergency response solutions. Few consider medium-to-long-term restoration strategies that reconnect urban areas to the national supply chain networks (SCN) and their supporting infrastructure. To re-establish this connectivity, the relationships within the SCN must be defined and formulated as a model of a complex adaptive system (CAS). A CAS model is a representation of a system that consists of large numbers of inter-connections, demonstrates non-linear behaviors and emergent properties, and responds to stimulus from its environment. CAS modeling is an effective method of managing the complexities associated with SCN restoration after large-scale disasters. In order to populate the data space, large data sets are required; currently, access to these data is hampered by proprietary restrictions. The aim of this paper is to identify the data required to build a SCN restoration model, examine the inherent problems associated with these data, and understand the complexity that arises from integrating these data.

  12. Small water and wastewater systems: pathways to sustainable development?

    PubMed

    Ho, G

    2003-01-01

    Globally we are faced with billions of people without access to safe water and adequate sanitation. These are generally located in developing communities. Even in developed communities the current large scale systems for supplying water, collecting wastewater and treating it are not environmentally sustainable, because it is difficult to close the cycle of water and nutrients. This paper discusses the advantages of small scale water and wastewater systems in overcoming the difficulties in providing water and wastewater systems in developing communities and in achieving sustainability in both developed and developing communities. Particular attention is given to technology and technology choice, even though technology alone does not provide the complete answer. Disadvantages of small scale systems and how they may be overcome are discussed.

  13. Living in a network of scaling cities and finite resources.

    PubMed

    Qubbaj, Murad R; Shutters, Shade T; Muneepeerakul, Rachata

    2015-02-01

    Many urban phenomena exhibit remarkable regularity in the form of nonlinear scaling behaviors, but their implications for a system of networked cities have never been investigated. Such knowledge is crucial for our ability to harness the complexity of urban processes to further sustainability science. In this paper, we develop a dynamical modeling framework that embeds population-resource dynamics (a generalized Lotka-Volterra system with modifications to incorporate the urban scaling behaviors) in complex networks in which cities may be linked to the resources of other cities and people may migrate in pursuit of higher welfare. We find that isolated cities (i.e., no migration) are susceptible to collapse if they do not have access to adequate resources. Links to other cities may help cities that would otherwise collapse due to insufficient resources. The effects of inter-city links, however, can vary due to the interplay between the nonlinear scaling behaviors and network structure. The long-term population level of a city is, in many settings, largely a function of the city's access to resources over which the city has little or no competition. Nonetheless, careful investigation of dynamics is required to gain mechanistic understanding of a particular city-resource network, because cities and resources may collapse and the scaling behaviors may influence the effects of inter-city links, thereby distorting what topological metrics really measure.
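
    The record summarizes the model only verbally; a hedged, one-city caricature of the kind of dynamics described (our notation and functional forms, not the authors' equations) may help fix ideas:

```latex
% One-city caricature of a networked population-resource model with
% urban scaling (our notation, not the paper's exact system):
\frac{dN_i}{dt} = N_i\!\left(a\,R_i^{\,\gamma} - b\,N_i^{\,\beta-1}\right)
  \;+\; \sum_j m_{ij}\,(W_j - W_i),
% N_i: population of city i; R_i: resources city i can access (its own
% plus those reachable over network links); beta, gamma: scaling
% exponents standing in for the nonlinear urban scaling behaviors;
% the last term is welfare-driven migration over links m_{ij}.
```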

  14. A multi-scale framework to link remotely sensed metrics with socioeconomic data

    NASA Astrophysics Data System (ADS)

    Watmough, Gary; Svenning, Jens-Christian; Palm, Cheryl; Sullivan, Clare; Danylo, Olha; McCallum, Ian

    2017-04-01

    There is increasing interest in the use of remotely sensed satellite data for estimating human poverty, as it can bridge data gaps that prevent fine-scale monitoring of development goals across large areas. The ways in which metrics derived from satellite imagery are linked with socioeconomic data are crucial for accurate estimation of poverty. Yet, to date, the approaches in the literature linking satellite metrics with socioeconomic data are poorly characterized. Typically, studies use a GIS construct such as a circular buffer zone around a village or household, or an administrative boundary such as a district or census enumeration area. These polygons are then used to extract environmental data from satellite imagery and are related to the socioeconomic data in statistical analyses. The use of a single polygon to link environmental and socioeconomic data is inappropriate in coupled human-natural systems, because processes operate over multiple scales. Human interactions with the environment occur at multiple levels, from individual (household) access to agricultural plots adjacent to homes, to communal access to common pool resources (CPR) such as forests at the village level. Here, we present a multi-scale framework that explicitly considers how people use the landscape. The framework is presented along with a case study example in Kenya. The multi-scale approach could enhance the modelling of human-environment interactions, which will have important consequences for monitoring the sustainable development goals for human livelihoods and biodiversity conservation.

  15. Developing enterprise tools and capacities for large-scale natural resource monitoring: A visioning workshop

    USGS Publications Warehouse

    Bayer, Jennifer M.; Weltzin, Jake F.; Scully, Rebecca A.

    2017-01-01

    Objectives of the workshop were to: 1) identify resources that support natural resource monitoring programs working across the data life cycle; 2) prioritize desired capacities and tools to facilitate monitoring design and implementation; 3) identify standards and best practices that improve discovery, accessibility, and interoperability of data across programs and jurisdictions; and 4) contribute to an emerging community of practice focused on natural resource monitoring.

  16. Compiling Planning into Quantum Optimization Problems: A Comparative Study

    DTIC Science & Technology

    2015-06-07

    and Sipser, M. 2000. Quantum computation by adiabatic evolution. arXiv:quant-ph/0001106. Fikes, R. E., and Nilsson, N. J. 1972. STRIPS: A new...become available: quantum annealing. Quantum annealing is one of the most accessible quantum algorithms for a computer science audience not versed...in quantum computing because of its close ties to classical optimization algorithms such as simulated annealing. While large-scale universal quantum

  17. Fault-Tolerant Sequencer Using FPGA-Based Logic Designs for Space Applications

    DTIC Science & Technology

    2013-12-01

    Prototype Board SBU single bit upset SDK software development kit SDRAM synchronous dynamic random-access memory SEB single-event burnout...current VHDL VHSIC hardware description language VHSIC very-high-speed integrated circuits VLSI very-large-scale integration VQFP very...transient pulse, called a single-event transient (SET), or even cause permanent damage to the device in the form of a burnout or gate rupture. The SEE

  18. Behind the Scenes with OpenLearn: The Challenges of Researching the Provision of Open Educational Resources

    ERIC Educational Resources Information Center

    Godwin, Stephen; McAndrew, Patrick; Santos, Andreia

    2008-01-01

    Web-enabled technology is now being applied on a large scale. In this paper we look at open-access provision of teaching and learning, which leads to many users with varying patterns of, and motivations for, use. This has presented us with a research challenge: to find methods that help us understand and explain such initiatives. We describe ways to model the…

  19. A Large-Scale Internet/Computer-Based, Training Module: Dissemination of Evidence-Based Management of Postpartum Hemorrhage to Front-Line Health Care Workers

    ERIC Educational Resources Information Center

    Abawi, Karim; Gertiser, Lynn; Idris, Raqibat; Villar, José; Langer, Ana; Chatfield, Alison; Campana, Aldo

    2017-01-01

    Postpartum hemorrhage (PPH) is the leading cause of maternal mortality in most developing and low-income countries and the cause of one-quarter of maternal deaths worldwide. With appropriate and prompt care, these deaths can be prevented. With the current and rapidly developing research and worldwide access to information, a lack of knowledge of…

  20. An Examination of the Impact of Teacher Quality and "Opportunity Gap" on Student Science Achievement in China

    ERIC Educational Resources Information Center

    Zhang, Danhui; Campbell, Todd

    2015-01-01

    This study aims to better understand questions related to the impact of teacher quality and access to qualified teachers in China. A large-scale data set collected in 2010 in China was used along with concurrently collected teacher questionnaires. In total, surveys from 9,943 8th grade students from 343 middle schools in 6 provinces were used,…

  1. Large-size porous ZnO flakes with superior gas-sensing performance

    NASA Astrophysics Data System (ADS)

    Wen, Wei; Wu, Jin-Ming; Wang, Yu-De

    2012-06-01

    A simple top-down route is developed to fabricate large-size porous ZnO flakes via solution combustion synthesis followed by calcination in air; the route is template-free and can easily be scaled up to industrial levels. The resulting porous ZnO flakes, which are tens to hundreds of micrometers across and tens of nanometers thick, exhibit a high response for detecting acetone and ethanol, because the unique two-dimensional architecture effectively shortens the gas diffusion distance and provides highly accessible open channels and active surfaces for the target gas.

  2. Clinical benchmarking enabled by the digital health record.

    PubMed

    Ricciardi, T N; Masarie, F E; Middleton, B

    2001-01-01

    Office-based physicians are often ill-equipped to report aggregate information about their patients and practice of medicine, since their practices have relied upon paper records for the management of clinical information. Physicians who do not have access to large-scale information technology support can now benefit from low-cost clinical documentation and reporting tools. We developed a hosted clinical data mart for users of a web-enabled charting tool, targeting the solo or small-group practice. The system uses secure Java Server Pages with a dashboard-like menu to provide point-and-click access to simple reports such as case mix, medications, utilization, productivity, and patient demographics in its first release. The system automatically normalizes user-entered clinical terms to enhance the quality of structured data. Individual providers benefit from rapid patient identification for disease management, quality-of-care self-assessments, drug recalls, and compliance with clinical guidelines. The system provides knowledge integration by linking to trusted sources of online medical information in context. Information derived from the clinical record is clinically more accurate than billing data. Provider self-assessment and benchmarking empower physicians, who may resent "being profiled" by external entities. In contrast to large-scale data warehouse projects, the current system delivers immediate value to the individual physicians who choose an electronic clinical documentation tool.

  3. Applications of species accumulation curves in large-scale biological data analysis.

    PubMed

    Deng, Chao; Daley, Timothy; Smith, Andrew D

    2015-09-01

    The species accumulation curve, or collector's curve, of a population gives the expected number of observed species or distinct classes as a function of sampling effort. Species accumulation curves allow researchers to assess and compare diversity across populations or to evaluate the benefits of additional sampling. Traditional applications have focused on ecological populations but emerging large-scale applications, for example in DNA sequencing, are orders of magnitude larger and present new challenges. We developed a method to estimate accumulation curves for predicting the complexity of DNA sequencing libraries. This method uses rational function approximations to a classical non-parametric empirical Bayes estimator due to Good and Toulmin [Biometrika, 1956, 43, 45-63]. Here we demonstrate how the same approach can be highly effective in other large-scale applications involving biological data sets. These include estimating microbial species richness, immune repertoire size, and k-mer diversity for genome assembly applications. We show how the method can be modified to address populations containing an effectively infinite number of species where saturation cannot practically be attained. We also introduce a flexible suite of tools implemented as an R package that make these methods broadly accessible.
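
    As a concrete illustration of the estimator underlying these methods, here is a minimal sketch of the classical Good-Toulmin series, assuming simple per-species observation counts as input. The plain alternating series below is known to diverge when extrapolating beyond a doubling of effort (t > 1), which is precisely the instability that the rational function approximations described above are designed to tame.

    ```python
    # Good-Toulmin estimate of the number of NEW species expected if sampling
    # effort is increased by a factor t, computed from counts-of-counts n_j
    # (the number of species observed exactly j times in the current sample).
    from collections import Counter

    def good_toulmin(species_counts, t):
        """species_counts: per-species observation counts; t: extra effort fraction."""
        n_j = Counter(species_counts)  # j -> number of species seen exactly j times
        return sum((-1) ** (j + 1) * t ** j * c for j, c in n_j.items())

    # Toy sample: four species seen once, two seen twice, one seen five times.
    counts = [1, 1, 1, 1, 2, 2, 5]
    print(good_toulmin(counts, 0.5))  # expected new species with 50% more sampling
    ```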

  4. Applications of species accumulation curves in large-scale biological data analysis

    PubMed Central

    Deng, Chao; Daley, Timothy; Smith, Andrew D

    2016-01-01

    The species accumulation curve, or collector’s curve, of a population gives the expected number of observed species or distinct classes as a function of sampling effort. Species accumulation curves allow researchers to assess and compare diversity across populations or to evaluate the benefits of additional sampling. Traditional applications have focused on ecological populations but emerging large-scale applications, for example in DNA sequencing, are orders of magnitude larger and present new challenges. We developed a method to estimate accumulation curves for predicting the complexity of DNA sequencing libraries. This method uses rational function approximations to a classical non-parametric empirical Bayes estimator due to Good and Toulmin [Biometrika, 1956, 43, 45–63]. Here we demonstrate how the same approach can be highly effective in other large-scale applications involving biological data sets. These include estimating microbial species richness, immune repertoire size, and k-mer diversity for genome assembly applications. We show how the method can be modified to address populations containing an effectively infinite number of species where saturation cannot practically be attained. We also introduce a flexible suite of tools implemented as an R package that make these methods broadly accessible. PMID:27252899

  5. Polymer Dynamics from Synthetic to Biological Macromolecules

    NASA Astrophysics Data System (ADS)

    Richter, D.; Niedzwiedz, K.; Monkenbusch, M.; Wischnewski, A.; Biehl, R.; Hoffmann, B.; Merkel, R.

    2008-02-01

    High-resolution neutron scattering, together with a meticulous choice of the contrast conditions, allows access to the large-scale dynamics of soft materials, including biological molecules, in space and time. In this contribution we present two examples: one from the world of synthetic polymers, the other from biomolecules. First, we address the peculiar dynamics of miscible polymer blends with very different component glass transition temperatures. Poly(methyl methacrylate) (PMMA) and poly(ethylene oxide) (PEO) are perfectly miscible but differ in glass transition temperature by 200 K. We present quasielastic neutron scattering investigations on the dynamics of the fast component in the range from ångströms to nanometers over a time frame of five orders of magnitude. All data may be consistently described in terms of a Rouse model with random friction, reflecting the random environment imposed by the nearly frozen PMMA matrix on the fast, mobile PEO. In the second part we touch on new developments relating to the large-scale internal dynamics of proteins probed by neutron spin echo. We report results of pioneering studies that demonstrate the feasibility of such experiments on large-scale protein motion and will most likely stimulate further studies in the future.

  6. Web-based visualization of very large scientific astronomy imagery

    NASA Astrophysics Data System (ADS)

    Bertin, E.; Pillay, R.; Marmo, C.

    2015-04-01

    Visualizing and navigating through large astronomy images from a remote location with current astronomy display tools can be a frustrating experience in terms of speed and ergonomics, especially on mobile devices. In this paper, we present a high-performance, versatile and robust client-server system for remote visualization and analysis of extremely large scientific images. Applications of this work include survey image quality control, interactive data query and exploration, citizen science, as well as public outreach. The proposed software is entirely open source and is designed to be generic and applicable to a variety of datasets. It provides access to floating-point data at terabyte scales, with the ability to precisely adjust image settings in real time. The proposed clients are lightweight, platform-independent web applications built on standard HTML5 web technologies and compatible with both touch- and mouse-based devices. We put the system to the test, assess its performance, and show that a single server can comfortably handle more than a hundred simultaneous users accessing full-precision 32-bit astronomy data.

  7. SEED Servers: High-Performance Access to the SEED Genomes, Annotations, and Metabolic Models

    PubMed Central

    Aziz, Ramy K.; Devoid, Scott; Disz, Terrence; Edwards, Robert A.; Henry, Christopher S.; Olsen, Gary J.; Olson, Robert; Overbeek, Ross; Parrello, Bruce; Pusch, Gordon D.; Stevens, Rick L.; Vonstein, Veronika; Xia, Fangfang

    2012-01-01

    The remarkable advance in sequencing technology and the rising interest in medical and environmental microbiology, biotechnology, and synthetic biology resulted in a deluge of published microbial genomes. Yet, genome annotation, comparison, and modeling remain a major bottleneck to the translation of sequence information into biological knowledge, hence computational analysis tools are continuously being developed for rapid genome annotation and interpretation. Among the earliest, most comprehensive resources for prokaryotic genome analysis, the SEED project, initiated in 2003 as an integration of genomic data and analysis tools, now contains >5,000 complete genomes, a constantly updated set of curated annotations embodied in a large and growing collection of encoded subsystems, a derived set of protein families, and hundreds of genome-scale metabolic models. Until recently, however, maintaining current copies of the SEED code and data at remote locations has been a pressing issue. To allow high-performance remote access to the SEED database, we developed the SEED Servers (http://www.theseed.org/servers): four network-based servers intended to expose the data in the underlying relational database, support basic annotation services, offer programmatic access to the capabilities of the RAST annotation server, and provide access to a growing collection of metabolic models that support flux balance analysis. The SEED servers offer open access to regularly updated data, the ability to annotate prokaryotic genomes, the ability to create metabolic reconstructions and detailed models of metabolism, and access to hundreds of existing metabolic models. This work offers and supports a framework upon which other groups can build independent research efforts. Large integrations of genomic data represent one of the major intellectual resources driving research in biology, and programmatic access to the SEED data will provide significant utility to a broad collection of potential users. PMID:23110173

  8. A simple method for the production of large volume 3D macroporous hydrogels for advanced biotechnological, medical and environmental applications

    NASA Astrophysics Data System (ADS)

    Savina, Irina N.; Ingavle, Ganesh C.; Cundy, Andrew B.; Mikhalovsky, Sergey V.

    2016-02-01

    The development of bulk, three-dimensional (3D), macroporous polymers with high permeability, large surface area and large volume is highly desirable for a range of applications in the biomedical, biotechnological and environmental areas. The experimental techniques currently used are limited to the production of cryogel materials of small size and volume. In this work we propose a novel, versatile, simple and reproducible method for the synthesis of large-volume porous polymer hydrogels by cryogelation. By controlling the freezing process of the reagent/polymer solution, large-scale 3D macroporous gels with wide interconnected pores (up to 200 μm in diameter) and large accessible surface area have been synthesized. For the first time, macroporous gels (of up to 400 ml bulk volume) with controlled porous structure were manufactured, with potential for scale-up to much larger gel dimensions. This method can be used for the production of novel 3D multi-component macroporous composite materials with a uniform distribution of embedded particles. The proposed method provides better control of freezing conditions and thus overcomes existing drawbacks limiting the production of large gel-based devices and matrices. It could serve as a new design concept for the preparation of functional 3D macroporous gels and composites for biomedical, biotechnological and environmental applications.

  9. Access to and use of health services among undocumented Mexican immigrants in a US urban area.

    PubMed

    Nandi, Arijit; Galea, Sandro; Lopez, Gerald; Nandi, Vijay; Strongarone, Stacey; Ompad, Danielle C

    2008-11-01

    We assessed access to and use of health services among Mexican-born undocumented immigrants living in New York City in 2004. We used venue-based sampling to recruit participants from locations where undocumented immigrants were likely to congregate. Participants were 18 years or older, born in Mexico, and current residents of New York City. The main outcome measures were health insurance coverage, access to a regular health care provider, and emergency department care. In multivariable models, living in a residence with fewer other adults, linguistic acculturation, higher levels of formal income, higher levels of social support, and poor health were associated with health insurance coverage. Female gender, fewer children, arrival before 1997, higher levels of formal income, health insurance coverage, greater social support, and not reporting discrimination were associated with access to a regular health care provider. Higher levels of education, higher levels of formal income, and poor health were associated with emergency department care. Absent large-scale political solutions to the challenges of undocumented immigrants, policies that address factors shown to limit access to care may improve health among this growing population.

  10. Implementation of a Cross-Layer Sensing Medium-Access Control Scheme.

    PubMed

    Su, Yishan; Fu, Xiaomei; Han, Guangyao; Xu, Naishen; Jin, Zhigang

    2017-04-10

    In this paper, compressed sensing (CS) theory is utilized in a medium-access control (MAC) scheme for wireless sensor networks (WSNs). We propose a new cross-layer compressed sensing medium-access control (CL CS-MAC) scheme, combining the physical layer and the data link layer, in which the wireless transmission in the physical layer is treated as a compression of the requested packets from the data link layer, following CS theory. We first introduce compressive complex requests to identify exactly which sensor nodes are active, which makes the scheme more efficient. Moreover, because the reconstruction process is executed in the complex field of the physical layer, where no bit or frame synchronization is needed, an asynchronous, random-request scheme can be implemented without synchronization payload. We set up a testbed based on software-defined radio (SDR) to implement the proposed CL CS-MAC scheme and demonstrate its validity. For large-scale WSNs, the simulation results show that the proposed CL CS-MAC scheme provides higher throughput and robustness than the carrier sense multiple access (CSMA) and compressed sensing medium-access control (CS-MAC) schemes.
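
    The core CS step, recovering a small set of active nodes from far fewer aggregate measurements than there are nodes, can be sketched as follows. This is a minimal real-valued stand-in using orthogonal matching pursuit; the paper's scheme performs the reconstruction in the complex field at the physical layer, so the matrix, dimensions, and solver here are illustrative assumptions.

    ```python
    # Recover which k of N sensor nodes are active from M << N aggregated
    # measurements y = A @ x, where x is the k-sparse activity vector.
    import numpy as np

    def omp(A, y, k):
        """Orthogonal matching pursuit for a k-sparse solution of y = A @ x."""
        residual, support = y.copy(), []
        coef = np.zeros(0)
        for _ in range(k):
            support.append(int(np.argmax(np.abs(A.T @ residual))))  # best new column
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef                     # re-fit residual
        x = np.zeros(A.shape[1])
        x[support] = coef
        return x

    rng = np.random.default_rng(0)
    N, M, k = 100, 25, 3                          # nodes, measurements, active nodes
    A = rng.standard_normal((M, N)) / np.sqrt(M)  # random sensing matrix
    x_true = np.zeros(N)
    x_true[rng.choice(N, size=k, replace=False)] = 1.0
    x_hat = omp(A, A @ x_true, k)
    print("recovered active nodes:", np.flatnonzero(x_hat > 0.5))
    ```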

  11. Accessing and Visualizing Satellite Data for Fisheries Managers in the Northeast Large Marine Ecosystem

    NASA Astrophysics Data System (ADS)

    Young Morse, R.; Mecray, E. L.; Pershing, A. J.

    2015-12-01

    As interest in global changes in temperature and precipitation patterns grows, federal, state, and local agencies are turning to the delivery of 'actionable science and information', or 'information for decision-makers'. NOAA/National Centers for Environmental Information's Regional Climate Services program builds these bridges between the users of information and its producers. Drawing on the Climate Data Records program, this study presents the extraction and use of sea-surface temperature datasets specifically for access and use by fisheries managers in the north Atlantic. The work demonstrates the staged approach of accessing the records, converting their initial data formats into maps and charts, and delivering the data as a value-added information dashboard for use by managers. The questions reviewed include the ease of access, the delivery of open-source software for visualizing the information, and the roles of government and the private sector in the provision of climate information at different scales.

  12. FunRich proteomics software analysis, let the fun begin!

    PubMed

    Benito-Martin, Alberto; Peinado, Héctor

    2015-08-01

    Protein MS analysis is the preferred method for unbiased protein identification. It is applied in a large number of both small-scale and high-throughput studies. However, user-friendly computational tools for protein analysis are still needed. In this issue, Mathivanan and colleagues (Proteomics 2015, 15, 2597-2601) report the development of FunRich, an open-access software package that facilitates the analysis of proteomics data, providing tools for functional enrichment and interaction network analysis of genes and proteins. FunRich is a reinterpretation of proteomics software: a standalone tool combining ease of use with customizable databases, free access, and graphical representations.

  13. In the absence of a "landscape of fear": How lions, hyenas, and cheetahs coexist.

    PubMed

    Swanson, Alexandra; Arnold, Todd; Kosmala, Margaret; Forester, James; Packer, Craig

    2016-12-01

    Aggression by top predators can create a "landscape of fear" in which subordinate predators restrict their activity to low-risk areas or times of day. At large spatial or temporal scales, this can result in the costly loss of access to resources. However, fine-scale reactive avoidance may minimize the risk of aggressive encounters for subordinate predators while maintaining access to resources, thereby providing a mechanism for coexistence. We investigated fine-scale spatiotemporal avoidance in a guild of African predators characterized by intense interference competition. Vulnerable to food stealing and direct killing, cheetahs are expected to avoid both larger predators; hyenas are expected to avoid lions. We deployed a grid of 225 camera traps across 1,125 km² in Serengeti National Park, Tanzania, to evaluate concurrent patterns of habitat use by lions, hyenas, cheetahs, and their primary prey. We used hurdle models to evaluate whether smaller species avoided areas preferred by larger species, and we used time-to-event models to evaluate fine-scale temporal avoidance in the hours immediately surrounding top-predator activity. We found no evidence of long-term displacement of subordinate species, even at fine spatial scales. Instead, hyenas and cheetahs were positively associated with lions except in areas with exceptionally high lion use. Hyenas and lions appeared to actively track each other, while cheetahs appeared to maintain long-term access to sites with high lion use by avoiding those areas just in the hours immediately following lion activity. Our results suggest that cheetahs are able to use patches of preferred habitat by avoiding lions on a moment-to-moment basis. Such fine-scale temporal avoidance is likely to be less costly than long-term avoidance of preferred areas: this may help explain why cheetahs are able to coexist with lions despite high rates of lion-inflicted mortality, and it highlights reactive avoidance as a general mechanism for predator coexistence.

  14. Characterizing parallel file-access patterns on a large-scale multiprocessor

    NASA Technical Reports Server (NTRS)

    Purakayastha, Apratim; Ellis, Carla Schlatter; Kotz, David; Nieuwejaar, Nils; Best, Michael

    1994-01-01

    Rapid increases in the computational speeds of multiprocessors have not been matched by corresponding performance enhancements in the I/O subsystem. To satisfy the large and growing I/O requirements of some parallel scientific applications, we need parallel file systems that can provide high-bandwidth and high-volume data transfer between the I/O subsystem and thousands of processors. Design of such high-performance parallel file systems depends on a thorough grasp of the expected workload. So far there have been no comprehensive usage studies of multiprocessor file systems. Our CHARISMA project intends to fill this void. The first results from our study involve an iPSC/860 at NASA Ames. This paper presents results from a different platform, the CM-5 at the National Center for Supercomputing Applications. The CHARISMA studies are unique because we collect information about every individual read and write request and about the entire mix of applications running on the machines. The results of our trace analysis lead to recommendations for parallel file system design. First, the file system should support efficient concurrent access to many files and I/O requests from many jobs under varying load conditions. Second, it must efficiently manage large files kept open for long periods. Third, it should expect to see small requests, predominantly sequential access patterns, application-wide synchronous access, no concurrent file-sharing between jobs, appreciable byte and block sharing between processes within jobs, and strong interprocess locality. Finally, the trace data suggest that node-level write caches and collective I/O request interfaces may be useful in certain environments.

  15. Two Non Linear Dynamics Plasma Astrophysics Experiments At LANL

    NASA Astrophysics Data System (ADS)

    Intrator, T.; Weber, T.; Feng, Y.; Sears, J.; Smith, R. J.; Swan, H.; Hutchinson, T.; Boguski, J.; Gao, K.; Chapdelaine, L.; Dunn, J. P.

    2013-12-01

    Two laboratory experiments at Los Alamos National Laboratory (LANL) have been built to gain access to a wide range of fundamental plasma physics issues germane to astrophysical, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, including currents, MHD forces and instabilities, sheared flows and shocks, along with the creation and annihilation of magnetic field. The Relaxation Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics and are observed to kink, bounce, merge, reconnect, shred, and re-form in complicated ways. We show recent movies from a large, detailed data set that describe the 3D magnetic structure and helicity budget of a driven and dissipative system that spontaneously self-saturates a kink instability. The Magnetized Shock Experiment (MSX) uses a field-reversed configuration (FRC) that is ejected at high speed and then stagnated against a stopping mirror field, driving a collisionless magnetized shock. A plasmoid accelerator will also give access to supercritical shocks at much larger Alfven Mach numbers. Unique features include access to parallel, oblique, and perpendicular shocks in regions much larger than the ion gyroradius and inertial length, large magnetic and fluid Reynolds numbers, and ample volume for turbulence.

  16. Confidentiality, electronic health records, and the clinician.

    PubMed

    Graves, Stuart

    2013-01-01

    The advent of electronic health records (EHRs) to improve access and enable research in the everyday clinical world has simultaneously made medical information much more vulnerable to illicit, non-beneficent uses. This wealth of identified, aggregated data has attracted and will attract attacks by domestic governments for surveillance and protection, foreign governments for espionage and sabotage, organized crime for illegal profits, and large corporations for "legal" profits. Against these powers, with their almost unlimited resources, no security scheme is likely to prevail, so the design of such systems should include appropriate security measures. Unlike paper records, where the person maintaining and controlling the existence of the records also controls access to them, these two functions can be separated for EHRs. By giving physical control over access to individual records to their individual owners, the aggregate is dismantled, thereby protecting the nation's identified health information from large-scale data mining or tampering. Control over the existence and integrity of all the records, yet without the ability to examine their contents, would be left with larger institutions. This article discusses the implications of all of the above for the role of the clinician in assuring confidentiality (a cornerstone of clinical practice), for research and everyday practice, and for current security designs.

  17. ENGINES: exploring single nucleotide variation in entire human genomes.

    PubMed

    Amigo, Jorge; Salas, Antonio; Phillips, Christopher

    2011-04-19

    Next-generation ultra-sequencing technologies are starting to produce extensive quantities of data from entire human genome or exome sequences, and therefore new software is needed to present and analyse this vast amount of information. The 1000 Genomes project has recently released raw data for 629 complete genomes representing several human populations through their Phase I interim analysis and, although certain public tools allow exploration of these genomes, to date there is no tool that permits comprehensive population analysis of the variation catalogued by such data. We have developed a genetic variant site explorer able to retrieve data for Single Nucleotide Variation (SNVs), population by population, from entire genomes without compromising future scalability and agility. ENGINES (ENtire Genome INterface for Exploring SNVs) uses data from the 1000 Genomes Phase I to demonstrate its capacity to handle large amounts of genetic variation (>7.3 billion genotypes and 28 million SNVs), as well as to derive summary statistics of interest for medical and population genetics applications. The whole dataset is pre-processed and summarized into a data mart accessible through a web interface. The query system allows the combination and comparison of each available population sample, while searching by rs-number list, chromosome region, or genes of interest. Frequency and FST filters are available to further refine queries, while results can be visually compared with other large-scale Single Nucleotide Polymorphism (SNP) repositories such as HapMap or Perlegen. ENGINES is capable of accessing large-scale variation data repositories in a fast and comprehensive manner. It allows quick browsing of whole-genome variation, while providing statistical information for each variant site, such as allele frequency, heterozygosity, or FST values for genetic differentiation. Access to the data mart generating scripts and to the web interface is granted from http://spsmart.cesga.es/engines.php.
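
    The per-site summary statistics mentioned above are straightforward to compute once genotypes are in hand. The sketch below, using hypothetical 0/1/2 alt-allele genotype arrays and an unweighted two-population version of Wright's FST, is illustrative only and is not the ENGINES code path.

    ```python
    # Per-site allele frequency, expected heterozygosity, and a simple FST
    # from diploid genotypes coded as 0/1/2 copies of the alternate allele.
    import numpy as np

    def site_stats(pops):
        """pops: list of 1-D genotype arrays (one per population) for one SNV."""
        freqs = [g.sum() / (2 * g.size) for g in pops]    # alt-allele frequencies
        h_s = np.mean([2 * p * (1 - p) for p in freqs])   # mean within-pop heterozygosity
        p_t = np.mean(freqs)                              # pooled frequency (equal weights)
        h_t = 2 * p_t * (1 - p_t)                         # total expected heterozygosity
        fst = 0.0 if h_t == 0 else (h_t - h_s) / h_t      # Wright's FST
        return freqs, h_t, fst

    pop_a = np.array([0, 1, 1, 2, 0, 1])  # hypothetical population sample A
    pop_b = np.array([1, 2, 2, 1, 2, 2])  # hypothetical population sample B
    freqs, h_t, fst = site_stats([pop_a, pop_b])
    print(f"freqs={freqs}, Ht={h_t:.3f}, FST={fst:.3f}")
    ```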

  18. A research agenda for a people-centred approach to energy access in the urbanizing global south

    NASA Astrophysics Data System (ADS)

    Broto, Vanesa Castán; Stevens, Lucy; Ackom, Emmanuel; Tomei, Julia; Parikh, Priti; Bisaga, Iwona; To, Long Seng; Kirshner, Joshua; Mulugetta, Yacob

    2017-10-01

    Energy access is typically viewed as a problem for rural areas, but people living in urban settings also face energy challenges that have not received sufficient attention. A revised agenda in research and practice that puts the user and local planning complexities centre stage is needed to change the way we look at energy access in urban areas, to understand the implications of the concentration of vulnerable people in slums and to identify opportunities for planned management and innovation that can deliver urban energy transitions while leaving no one behind. Here, we propose a research agenda focused on three key issues: understanding the needs of urban energy users; enabling the use of context-specific, disaggregated data; and engaging with effective modes of energy and urban governance. This agenda requires interdisciplinary scholarship across the social and physical sciences to support local action and deliver large-scale, inclusive transformations.

  19. Supporting Clean Energy Development in Swaziland

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2016-04-01

    Swaziland, a country largely dependent on regional fossil fuel imports to meet power needs, is vulnerable to supply changes and price shocks. To address this challenge, the country's National Energy Policy and Implementation Strategy prioritizes actions to enhance energy independence through scaling up renewable energy and energy efficiency. With approximately 70 percent of the country lacking electricity, Swaziland is also strongly committed to expanding energy access to support key economic and social development goals. Within this context, energy security and energy access are two foundational objectives for clean energy development in Swaziland. The partnership between the Swaziland Energy Regulatory Authority and the Clean Energy Solutions Center led to concrete outcomes to support clean energy development in Swaziland. Improving renewable energy project licensing processes will enable Swaziland to achieve key national objectives to expand clean energy access and transition to greater energy independence.

  20. Health financing to promote access in low income settings-how much do we know?

    PubMed

    Palmer, Natasha; Mueller, Dirk H; Gilson, Lucy; Mills, Anne; Haines, Andy

    In this article we outline research since 1995 on the impact of various financing strategies on access to health services or health outcomes in low income countries. The limited evidence available suggests, in general, that user fees deterred utilisation. Prepayment or insurance schemes offered potential for improving access, but are very limited in scope. Conditional cash payments showed promise for improving uptake of interventions, but could also create a perverse incentive. The largely African origin of the reports of user fees, and the evidence from Latin America on conditional cash transfers, demonstrate the importance of the context in which studies are done. There is a need for improved quality of research in this area. Larger scale, upfront funding for evaluation of health financing initiatives is necessary to ensure an evidence base that corresponds to the importance of this issue for achieving development goals.

  1. GenBank

    PubMed Central

    Benson, Dennis A.; Karsch-Mizrachi, Ilene; Lipman, David J.; Ostell, James; Wheeler, David L.

    2007-01-01

    GenBank® is a comprehensive database that contains publicly available nucleotide sequences for more than 240 000 named organisms, obtained primarily through submissions from individual laboratories and batch submissions from large-scale sequencing projects. Most submissions are made using the web-based BankIt or standalone Sequin programs, and accession numbers are assigned by GenBank staff upon receipt. Daily data exchange with the EMBL Data Library in Europe and the DNA Data Bank of Japan ensures worldwide coverage. GenBank is accessible through NCBI's retrieval system, Entrez, which integrates data from the major DNA and protein sequence databases along with taxonomy, genome, mapping, protein structure and domain information, and the biomedical journal literature via PubMed. BLAST provides sequence similarity searches of GenBank and other sequence databases. Complete bimonthly releases and daily updates of the GenBank database are available by FTP. To access GenBank and its related retrieval and analysis services, begin at the NCBI homepage. PMID:17202161
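
    For script-level access, records can also be pulled through the Entrez E-utilities. The sketch below uses Biopython's Entrez client, which is one common route rather than anything prescribed by the record above; the accession number and e-mail address are placeholders.

    ```python
    # Fetch one GenBank flat-file record through NCBI Entrez and parse it.
    from Bio import Entrez, SeqIO

    Entrez.email = "you@example.org"  # NCBI requires a contact address

    handle = Entrez.efetch(db="nucleotide", id="U49845",
                           rettype="gb", retmode="text")
    record = SeqIO.read(handle, "genbank")  # parse the flat-file record
    handle.close()

    print(record.id, len(record.seq), record.description)
    ```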

  2. Bringing modeling to the masses: A web based system to predict potential species distributions

    USGS Publications Warehouse

    Graham, Jim; Newman, Greg; Kumar, Sunil; Jarnevich, Catherine S.; Young, Nick; Crall, Alycia W.; Stohlgren, Thomas J.; Evangelista, Paul

    2010-01-01

    Predicting current and potential species distributions and abundance is critical for managing invasive species, preserving threatened and endangered species, and conserving native species and habitats. Accurate predictive models are needed at local, regional, and national scales to guide field surveys, improve monitoring, and set priorities for conservation and restoration. Modeling capabilities, however, are often limited by access to software and environmental data required for predictions. To address these needs, we built a comprehensive web-based system that: (1) maintains a large database of field data; (2) provides access to field data and a wealth of environmental data; (3) accesses values in rasters representing environmental characteristics; (4) runs statistical spatial models; and (5) creates maps that predict the potential species distribution. The system is available online at www.niiss.org, and provides web-based tools for stakeholders to create potential species distribution models and maps under current and future climate scenarios.
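
    Capability (3) above, reading raster values of environmental characteristics at field-data coordinates, is the step that links observations to predictors. A minimal offline sketch with the rasterio library follows; the file name and coordinates are hypothetical, and the actual system is a web service rather than a local script.

    ```python
    # Sample an environmental raster at species-occurrence coordinates to build
    # the predictor table for a distribution model.
    import rasterio

    occurrences = [(-105.10, 40.60), (-104.90, 40.40)]  # (lon, lat) field records

    with rasterio.open("annual_mean_temperature.tif") as src:
        values = [v[0] for v in src.sample(occurrences)]  # first band per point

    print(values)  # predictor values aligned with the occurrence records
    ```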

  3. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    PubMed

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large-scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature-set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost-effectively evolve such applications over a long lifetime.
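
    The essence of the approach, workflows calling named decision points whose policies are attached at runtime, can be sketched in a few lines. The names and the all-must-consent composition rule below are illustrative assumptions, not the PDD specification.

    ```python
    # A registry of named decision points; stakeholder policies are injected at
    # runtime and composed, so workflow code never hard-codes the decisions.
    from typing import Callable, Dict, List

    _decision_points: Dict[str, List[Callable[[dict], bool]]] = {}

    def inject_policy(point: str, policy: Callable[[dict], bool]) -> None:
        """Attach one stakeholder's policy to a named decision point."""
        _decision_points.setdefault(point, []).append(policy)

    def decide(point: str, context: dict) -> bool:
        """Evaluate a decision point; here, every injected policy must consent."""
        return all(policy(context) for policy in _decision_points.get(point, []))

    # Two oblivious stakeholder groups contribute access-control policies.
    inject_policy("dataset.read", lambda c: c.get("role") in ("clinician", "admin"))
    inject_policy("dataset.read", lambda c: not c.get("embargoed", False))

    # Inside an existing workflow, unchanged apart from the decision point:
    print(decide("dataset.read", {"role": "clinician", "embargoed": False}))  # True
    print(decide("dataset.read", {"role": "guest"}))                          # False
    ```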

  4. Low Pressure Seeder Development for PIV in Large Scale Open Loop Wind Tunnels

    NASA Astrophysics Data System (ADS)

    Schmit, Ryan

    2010-11-01

    A low-pressure seeding technique for Particle Image Velocimetry (PIV) in large-scale wind tunnel facilities was developed at the Subsonic Aerodynamic Research Laboratory (SARL) at Wright-Patterson Air Force Base. The SARL facility is an open-loop tunnel with a 7-by-10-foot octagonal test section that has 56% optical access; the Mach number varies from 0.2 to 0.5. A low-pressure seeder sprayer was designed and tested in the inlet of the wind tunnel. The sprayer was designed to produce an even, uniform distribution of seed while reducing the seeder's influence on the test section. A ViCount Compact 5000 with Smoke Oil 180 was used as the seeding source. The results show that this low-pressure seeder produces streaky seeding, but excellent PIV images are nevertheless obtained.

  5. Differentiation in Access to, and the Use and Sharing of (Open) Educational Resources among Students and Lecturers at Kenyan Universities

    ERIC Educational Resources Information Center

    Pete, Judith; Mulder, Fred; Neto, Jose Dutra Oliveira

    2017-01-01

    In order to obtain a fair "OER picture" for the Global South, a large-scale study has been carried out for a series of countries, including Kenya. In this paper we report on the Kenya study, run at four selected universities with randomly sampled students and lecturers. Empirical data have been generated by the use of a…

  6. Sirocco Storage Server v. pre-alpha 0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curry, Matthew L.; Danielson, Geoffrey; Ward, H. Lee

    Sirocco is a parallel storage system under development, designed for write-intensive workloads on large-scale HPC platforms. It implements a key-value object store on top of a set of loosely federated storage servers that cooperate to ensure data integrity and performance. It includes support for a range of different types of storage transactions. This software release constitutes a conformant storage server, along with the client-side libraries to access the storage over a network.

  7. Cosmology: A research briefing

    NASA Technical Reports Server (NTRS)

    1995-01-01

    As part of its effort to update topics dealt with in the 1986 decadal physics survey, the Board on Physics and Astronomy of the National Research Council (NRC) formed a Panel on Cosmology. The Panel produced this report, intended to be accessible to science policymakers and nonscientists. The chapters include an overview ('What Is Cosmology?'), a discussion of cosmic microwave background radiation, the large-scale structure of the universe, the distant universe, and physics of the early universe.

  8. Cloud-Based Distributed Control of Unmanned Systems

    DTIC Science & Technology

    2015-04-01

    during mission execution. At best, the data is saved onto hard drives and is accessible only by the local team. Data history in a form available and...following open-source technologies: GeoServer, OpenLayers, PostgreSQL, and PostGIS are chosen to implement the back-end database and server. A brief...geospatial map data. 3. PostgreSQL: An SQL-compliant object-relational database that easily scales to accommodate large amounts of data - upwards to

  9. Explore, Visualize, and Analyze Functional Cancer Proteomic Data Using the Cancer Proteome Atlas. | Office of Cancer Genomics

    Cancer.gov

    Reverse-phase protein arrays (RPPA) represent a powerful functional proteomic approach to elucidate cancer-related molecular mechanisms and to develop novel cancer therapies. To facilitate community-based investigation of the large-scale protein expression data generated by this platform, we have developed a user-friendly, open-access bioinformatic resource, The Cancer Proteome Atlas (TCPA, http://tcpaportal.org), which contains two separate web applications.

  10. An All-Optical Access Metro Interface for Hybrid WDM/TDM PON Based on OBS

    NASA Astrophysics Data System (ADS)

    Segarra, Josep; Sales, Vicent; Prat, Josep

    2007-04-01

    A new all-optical access metro network interface based on optical burst switching (OBS) is proposed. A hybrid wavelength-division multiplexing/time-division multiplexing (WDM/TDM) access architecture with reflective optical network units (ONUs), an arrayed-waveguide-grating outside plant, and a tunable laser stack at the optical line terminal (OLT) is presented as a solution for the passive optical network. The available access bandwidth is managed by means of OBS and a dynamic bandwidth allocation (DBA) protocol that polls the ONUs. All the network intelligence and costly equipment is located at the OLT, where the DBA module is centrally implemented, providing quality of service (QoS). To scale this access network, an optical cross connect (OXC) is then used to serve a large number of ONUs from the same OLT. The hybrid WDM/TDM structure is also extended toward the metropolitan area network (MAN) by introducing the concept of the OBS multiplexer (OBS-M). The OBS-M network element bridges the MAN and access networks by offering all-optical cross connection, wavelength conversion, and data signaling. The proposed OBS-M node yields a fully optical data network, interfacing access and metro with geographically distributed access control. The resulting access metro architectures are nonblocking and, with improved signaling, provide QoS, scalability, and very low latency. Finally, numerical analysis and simulations demonstrate the traffic performance of the proposed access scheme and of the all-optical access metro interface and architectures.

  11. The Amma-Sat Database

    NASA Astrophysics Data System (ADS)

    Ramage, K.; Desbois, M.; Eymard, L.

    2004-12-01

    The African Monsoon Multidisciplinary Analysis (AMMA) project is a French initiative that aims at identifying and analysing in detail the multidisciplinary, multi-scale processes that lead to a better understanding of the physical mechanisms linked to the African Monsoon. The main components of the African Monsoon are atmospheric dynamics, the continental water cycle, atmospheric chemistry, and oceanic and continental surface conditions. Satellites contribute to various objectives of the project, both for process analysis and for large-scale, long-term studies: some series of satellites (METEOSAT, NOAA, etc.) have been flown for more than 20 years, ensuring good-quality monitoring of some characteristics of the West African atmosphere and surface. Moreover, several recent missions and projects will strongly improve and complement this survey. The AMMA project offers an opportunity to develop the exploitation of satellite data and to foster collaboration between specialist and non-specialist users. To this end, databases are being developed to collect all past and future satellite data related to the African Monsoon, making it possible to compare different types of data at different resolutions and to validate satellite data against in situ measurements or numerical simulations. The main goal of the AMMA-SAT database is to offer the AMMA scientific community easy access to satellite data. The database contains geophysical products estimated from operational or research algorithms and covering the different components of the AMMA project. Nevertheless, the choice has been made to group data by pertinent scale rather than by theme. To this end, five regions of interest were defined for data extraction: an area covering the tropical Atlantic and Africa for large-scale studies, an area covering West Africa for mesoscale studies, and three local areas surrounding sites of in situ observations. Within each of these regions, satellite data are projected onto a regular grid with a spatial resolution compatible with the spatial variability of the geophysical parameter. Data are stored in NetCDF files to facilitate their use. Satellite products can be selected using several spatial and temporal criteria and ordered through a web interface developed in PHP-MySQL. More common means of access are also available, such as direct FTP or NFS access for identified users. A Live Access Server allows quick visualization of the data. A metadata catalogue based on the Directory Interchange Format manages the documentation of each satellite product. The database is currently under development, but some products are already available; it will be complete by the end of 2005.
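
    Because the products are served as NetCDF files on a regular grid, a user-side script can subset them directly. The sketch below uses the netCDF4 library; the file name, variable names, and grid layout are hypothetical stand-ins for whichever AMMA-SAT product is downloaded.

    ```python
    # Open a gridded satellite product, average over time, and read the value
    # nearest a point of interest (15N, 0E) in West Africa.
    import numpy as np
    from netCDF4 import Dataset

    with Dataset("amma_sat_sst_west_africa_200408.nc") as nc:
        lats = nc.variables["latitude"][:]
        lons = nc.variables["longitude"][:]
        sst = nc.variables["sst"][:]       # assumed shape: (time, lat, lon)

    mean_sst = sst.mean(axis=0)            # time average on the regular grid
    i = np.abs(lats - 15.0).argmin()
    j = np.abs(lons - 0.0).argmin()
    print(f"mean value near 15N, 0E: {mean_sst[i, j]:.2f}")
    ```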

  12. Accessing Data Federations with CVMFS

    DOE PAGES

    Weitzel, Derek; Bockelman, Brian; Dykstra, Dave; ...

    2017-11-23

    Data federations have become an increasingly common tool for large collaborations such as CMS and ATLAS to efficiently distribute large data files. Unfortunately, these typically are implemented with weak namespace semantics and a non-POSIX API. On the other hand, CVMFS has provided a POSIX-compliant read-only interface for use cases with a small working set size (such as software distribution). The metadata required for the CVMFS POSIX interface is distributed through a caching hierarchy, allowing it to scale to the level of about a hundred thousand hosts. In this paper, we will describe our contributions to CVMFS that merge the data scalability of XRootD-based data federations (such as AAA) with the metadata scalability and POSIX interface of CVMFS. We modified CVMFS so it can serve unmodified files without copying them to the repository server. CVMFS 2.2.0 is also able to redirect requests for data files to servers outside of the CVMFS content distribution network. Finally, we added the ability to manage authorization and authentication using security credentials such as X509 proxy certificates. We combined these modifications with the OSG's StashCache regional XRootD caching infrastructure to create a cached data distribution network. Here, we will show performance metrics for accessing the data federation through CVMFS compared to direct data federation access. Additionally, we will discuss the improved user experience of providing access to a data federation through a POSIX filesystem.

  13. Accessing Data Federations with CVMFS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weitzel, Derek; Bockelman, Brian; Dykstra, Dave

    Data federations have become an increasingly common tool for large collaborations such as CMS and ATLAS to efficiently distribute large data files. Unfortunately, these typically are implemented with weak namespace semantics and a non-POSIX API. On the other hand, CVMFS has provided a POSIX-compliant read-only interface for use cases with a small working set size (such as software distribution). The metadata required for the CVMFS POSIX interface is distributed through a caching hierarchy, allowing it to scale to the level of about a hundred thousand hosts. In this paper, we will describe our contributions to CVMFS that merge the data scalability of XRootD-based data federations (such as AAA) with the metadata scalability and POSIX interface of CVMFS. We modified CVMFS so it can serve unmodified files without copying them to the repository server. CVMFS 2.2.0 is also able to redirect requests for data files to servers outside of the CVMFS content distribution network. Finally, we added the ability to manage authorization and authentication using security credentials such as X509 proxy certificates. We combined these modifications with the OSG's StashCache regional XRootD caching infrastructure to create a cached data distribution network. Here, we will show performance metrics for accessing the data federation through CVMFS compared to direct data federation access. Additionally, we will discuss the improved user experience of providing access to a data federation through a POSIX filesystem.

  14. Accessing Data Federations with CVMFS

    NASA Astrophysics Data System (ADS)

    Weitzel, Derek; Bockelman, Brian; Dykstra, Dave; Blomer, Jakob; Meusel, René

    2017-10-01

    Data federations have become an increasingly common tool for large collaborations such as CMS and ATLAS to efficiently distribute large data files. Unfortunately, these typically are implemented with weak namespace semantics and a non-POSIX API. On the other hand, CVMFS has provided a POSIX-compliant read-only interface for use cases with a small working set size (such as software distribution). The metadata required for the CVMFS POSIX interface is distributed through a caching hierarchy, allowing it to scale to the level of about a hundred thousand hosts. In this paper, we will describe our contributions to CVMFS that merge the data scalability of XRootD-based data federations (such as AAA) with the metadata scalability and POSIX interface of CVMFS. We modified CVMFS so it can serve unmodified files without copying them to the repository server. CVMFS 2.2.0 is also able to redirect requests for data files to servers outside of the CVMFS content distribution network. Finally, we added the ability to manage authorization and authentication using security credentials such as X509 proxy certificates. We combined these modifications with the OSG's StashCache regional XRootD caching infrastructure to create a cached data distribution network. We will show performance metrics for accessing the data federation through CVMFS compared to direct data federation access. Additionally, we will discuss the improved user experience of providing access to a data federation through a POSIX filesystem.

  15. Raccoon spatial requirements and multi-scale habitat selection within an intensively managed central Appalachian forest

    USGS Publications Warehouse

    Owen, Sheldon F.; Berl, Jacob L.; Edwards, John W.; Ford, W. Mark; Wood, Petra Bohall

    2015-01-01

    We studied a raccoon (Procyon lotor) population within a managed central Appalachian hardwood forest in West Virginia to investigate the effects of intensive forest management on raccoon spatial requirements and habitat selection. Raccoon home-range (95% utilization distribution) and core-area (50% utilization distribution) sizes differed between the sexes, with males maintaining home ranges and core areas twice as large as those of females. Home-range and core-area size did not differ between seasons for either sex. We used compositional analysis to quantify raccoon selection of six different habitat types at multiple spatial scales. Raccoons selected riparian corridors (riparian management zones [RMZs]) and intact forests (>70 y old) at the core-area spatial scale. RMZs likely were used by raccoons because they provided abundant denning resources (i.e., large-diameter trees) as well as access to water. Habitat composition associated with raccoon foraging locations indicated selection for intact forests, riparian areas, and regenerating harvests (stands <10 y old). Although raccoons were able to utilize multiple habitat types for foraging, their selection of intact forest and RMZs at multiple spatial scales indicates the need for mature forest (with large-diameter trees) for this species in managed forests in the central Appalachians.

  16. Accessibility of medical and psychosocial services following disasters and other traumatic events: experiences of Deaf and hard-of-hearing individuals in Denmark.

    PubMed

    Skøt, Lotte; Jeppesen, Tina; Mellentin, Angelina Isabella; Elklit, Ask

    2017-12-01

    This descriptive study sought to explore barriers faced by Deaf and hard-of-hearing (D/HH) individuals in Denmark when accessing medical and psychosocial services following large-scale disasters and individual traumatic experiences. Semi-structured interviews were conducted with nine D/HH individuals who had experienced at least one disaster or other traumatic event. Difficulties were encountered during interactions with first response and healthcare services, which centered on: (1) lack of Deaf awareness among professionals, (2) problems accessing interpreter services, (3) professionals relying on hearing relatives to disseminate information, and (4) professionals who were unwilling to adjust their speech or try different forms of communication. Barriers reported in relation to accessing psychosocial services included: (1) lack of all-Deaf or hard-of-hearing support groups, and (2) limited availability of crisis psychologists who are trained to service the needs of the hearing impaired. Suggestions for improvements to service provision were provided, including a list of practical recommendations for professionals. This study has identified significant gaps in post-disaster service provision for D/HH individuals. Results can inform policy makers and other authorities in the position to enhance existing services and/or develop new services for this vulnerable target population. Implications for Rehabilitation Being Deaf or hard-of-hearing compromises a person's ability to obtain and share vital information during times of disaster. Medical and psychosocial services are expected to play critical response roles in times of disaster, and, should be properly equipped to assist Deaf and hard-of-hearing (D/HH) individuals. In a relatively small sample, this study highlights barriers faced by D/HH individuals in Denmark when accessing first response, healthcare, and psychosocial services following large-scale disasters and individual traumatic events, all of which centered on communication problems and resulted in suboptimal care. Regarding rehabilitation after disasters, evidence-based information about how to service the heterogeneous communication needs of D/HH populations should be disseminated to professionals, and preferably incorporated into training programs.

  17. Spatio-temporal trends in crop damage inform recent climate-mediated expansion of a large boreal herbivore into an agro-ecosystem.

    PubMed

    Laforge, Michel P; Michel, Nicole L; Brook, Ryan K

    2017-11-09

    Large-scale climatic fluctuations have caused species range shifts. Moose (Alces alces) have expanded their range southward into agricultural areas previously not considered moose habitat. We found that moose expansion into agro-ecosystems is mediated by broad-scale climatic factors and access to high-quality forage (i.e., crops). We used crop damage records to quantify moose presence across the Canadian Prairies. We regressed latitude of crop damage against the North Atlantic Oscillation (NAO) and crop area to test the hypotheses that NAO-mediated wetland recharge and the occurrence of more nutritious crop types would result in more frequent occurrences of crop damage by moose at southerly latitudes. We examined local-scale land use by generating a habitat selection model to test our hypothesis that moose selected for areas of high crop cover in agro-ecosystems. We found that crop damage by moose occurred farther south during dry winters and in years with greater coverage of oilseeds. The results of our analyses support our hypothesis that moose movement into cropland is mediated by high-protein crops, but not by thermoregulatory habitat at the scale examined. We conclude that broad-scale climate, combined with changing land-use regimes, is a causal factor in species' range shifts and an important consideration when studying changing animal distributions.

  18. The beaming of subhalo accretion

    NASA Astrophysics Data System (ADS)

    Libeskind, Noam I.

    2016-10-01

    We examine the infall pattern of subhaloes onto hosts in the context of the large-scale structure. We find that the infall pattern is essentially driven by the shear tensor of the ambient velocity field. Dark matter subhaloes are preferentially accreted along the principal axis of the shear tensor which corresponds to the direction of weakest collapse. We examine the dependence of this preferential infall on subhalo mass, host halo mass and redshift. Although strongest for the most massive hosts and the most massive subhaloes at high redshift, the preferential infall of subhaloes is effectively universal in the sense that it is always aligned with the axis of weakest collapse of the velocity shear tensor. It is the same shear tensor that dictates the structure of the cosmic web, and hence the shear field emerges as the key factor that governs the local anisotropic pattern of structure formation. Since the small (sub-Mpc) scale is strongly correlated with the mid-range (~10 Mpc) scale - a scale accessible by current surveys of peculiar velocities - it follows that the findings presented here open a new window into the relation between the observed large-scale structure unveiled by current surveys of peculiar velocities and the preferential infall direction of the Local Group. This may shed light on the unexpected alignments of dwarf galaxies seen in the Local Group.

  19. Neutron Scattering Studies on Large Length Scale Sample Structures

    NASA Astrophysics Data System (ADS)

    Feng, Hao

    Neutron scattering can be used to study the structure of matter. Depending on the sample properties of interest, different scattering techniques can be chosen. Neutron reflectivity is more often used to detect the in-depth profile of layered structures and the interfacial roughness, while transmission is more sensitive to sample bulk properties. The Neutron Reflectometry (NR) technique, one neutron reflectivity technique, is discussed first in this thesis. Both the specular reflectivity and the first-order Bragg intensity were measured in the NR experiment with a diffraction grating in order to study the in-depth and lateral structure of a sample (polymer) deposited on the grating. However, the first-order Bragg intensity alone is sometimes inadequate to determine the lateral structure, and higher-order Bragg intensities are difficult to measure using traditional neutron scattering techniques due to the low brightness of current neutron sources. The Spin Echo Small Angle Neutron Scattering (SESANS) technique overcomes this resolution problem by measuring the Fourier transform of all the Bragg intensities, thereby measuring the real-space density correlations of samples and extending the accessible length scale from a few tens of nanometers to several microns. SESANS can be implemented using two pairs of magnetic Wollaston prisms (WPs), and the accessible length scale is proportional to the magnetic field intensity in the WPs. To increase the magnetic field and thus the accessible length scale, an apparatus named the Superconducting Wollaston Prism (SWP), which has a series of strong, well-defined, shaped magnetic fields created by superconducting coils, was developed at Indiana University in 2016. Since then, various kinds of optimization have been implemented, which are addressed in this thesis. Finally, applications of SWPs in other neutron scattering techniques such as Neutron Larmor Diffraction (NLD) are discussed.

  20. Large-Scale Geographic Variation in Distribution and Abundance of Australian Deep-Water Kelp Forests

    PubMed Central

    Marzinelli, Ezequiel M.; Williams, Stefan B.; Babcock, Russell C.; Barrett, Neville S.; Johnson, Craig R.; Jordan, Alan; Kendrick, Gary A.; Pizarro, Oscar R.; Smale, Dan A.; Steinberg, Peter D.

    2015-01-01

    Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are main habitat-formers in temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution along the continent of these kelp forests, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia’s Integrated Marine Observing System (IMOS) to survey 157,000 m2 of seabed, of which ca 13,000 m2 were used to quantify kelp covers at multiple spatial scales (10–100 m to 100–1,000 km) and depths (15–60 m) across several regions ca 2–6° latitude apart along the East and West coasts of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. Maximum depth of kelp occurrence was 40–50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves. PMID:25693066

  1. Collective motion of macroscopic spheres floating on capillary ripples: Dynamic heterogeneity and dynamic criticality

    NASA Astrophysics Data System (ADS)

    Sanlı, Ceyda; Saitoh, Kuniyasu; Luding, Stefan; van der Meer, Devaraj

    2014-09-01

    When a densely packed monolayer of macroscopic spheres floats on chaotic capillary Faraday waves, a coexistence of large scale convective motion and caging dynamics typical for glassy systems is observed. We subtract the convective mean flow using a coarse graining (homogenization) method and reveal subdiffusion for the caging time scales followed by a diffusive regime at later times. We apply the methods developed to study dynamic heterogeneity and show that the typical time and length scales of the fluctuations due to rearrangements of observed particle groups significantly increase when the system approaches its largest experimentally accessible packing concentration. To connect the system to the dynamic criticality literature, we fit power laws to our results. The resultant critical exponents are consistent with those found in densely packed suspensions of colloids.
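
    As a rough illustration of the coarse-graining step described above (not the authors' code; the Gaussian kernel and its width sigma are assumptions), the convective mean flow can be estimated as a locally weighted average of neighbouring particle velocities and subtracted to leave the fluctuations:

      import numpy as np

      def velocity_fluctuations(pos, vel, sigma):
          """Remove a coarse-grained (Gaussian-weighted) mean flow.

          pos, vel : (N, 2) particle positions and velocities.
          sigma    : coarse-graining length (an assumed free parameter).
          """
          # Pairwise squared distances between all particles (O(N^2); fine
          # for the few hundred floaters tracked in such experiments).
          d2 = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(-1)
          w = np.exp(-d2 / (2.0 * sigma ** 2))
          w /= w.sum(axis=1, keepdims=True)      # normalize each row
          mean_flow = w @ vel                    # local convective flow
          return vel - mean_flow                 # caging/diffusive part

      # Demo: 500 particles in a unit box carrying a shear flow plus noise.
      rng = np.random.default_rng(0)
      pos = rng.random((500, 2))
      vel = np.stack([pos[:, 1], np.zeros(500)], 1) + 0.05 * rng.standard_normal((500, 2))
      dv = velocity_fluctuations(pos, vel, sigma=0.1)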

  2. Collective motion of macroscopic spheres floating on capillary ripples: dynamic heterogeneity and dynamic criticality.

    PubMed

    Sanlı, Ceyda; Saitoh, Kuniyasu; Luding, Stefan; van der Meer, Devaraj

    2014-09-01

    When a densely packed monolayer of macroscopic spheres floats on chaotic capillary Faraday waves, a coexistence of large scale convective motion and caging dynamics typical for glassy systems is observed. We subtract the convective mean flow using a coarse graining (homogenization) method and reveal subdiffusion for the caging time scales followed by a diffusive regime at later times. We apply the methods developed to study dynamic heterogeneity and show that the typical time and length scales of the fluctuations due to rearrangements of observed particle groups significantly increase when the system approaches its largest experimentally accessible packing concentration. To connect the system to the dynamic criticality literature, we fit power laws to our results. The resultant critical exponents are consistent with those found in densely packed suspensions of colloids.

  3. Physical Modeling of Flow Over Gale Crater, Mars: Laboratory Measurements of Basin Secondary Circulations

    NASA Astrophysics Data System (ADS)

    Bristow, N.; Blois, G.; Kim, T.; Anderson, W.; Day, M. D.; Kocurek, G.; Christensen, K. T.

    2017-12-01

    Impact craters, common large-scale topographic features on the surface of Mars, are circular depressions delimited by a sharp ridge. A variety of crater fill morphologies exist, suggesting that complex intracrater circulations affect their evolution. Some large craters (diameter > 10 km), particularly at mid latitudes on Mars, exhibit a central mound surrounded by a circular moat. Foremost among these examples is Gale crater, landing site of NASA's Curiosity rover, since large-scale climatic processes early in the history of Mars are preserved in the stratigraphic record of the inner mound. Investigating the intracrater flow produced by large-scale winds aloft over Mars craters is key to a number of important scientific issues, including ongoing research on Mars paleo-environmental reconstruction and the planning of future missions (these results must be viewed in conjunction with the effects of radial katabatic flows, the importance of which is already established in preceding studies). In this work we consider a number of crater shapes inspired by Gale morphology, including idealized craters. Access to the flow field within such geometrically complex topography is achieved herein using a refractive index matched approach. Instantaneous velocity maps, using both planar and volumetric PIV techniques, are presented to elucidate complex three-dimensional flow within the crater. In addition, first- and second-order statistics will be discussed in the context of wind-driven (aeolian) excavation of crater fill.

  4. Large density expansion of a hydrodynamic theory for self-propelled particles

    NASA Astrophysics Data System (ADS)

    Ihle, T.

    2015-07-01

    Recently, an Enskog-type kinetic theory for Vicsek-type models for self-propelled particles has been proposed [T. Ihle, Phys. Rev. E 83, 030901 (2011)]. This theory is based on an exact equation for a Markov chain in phase space and is not limited to small density. Previously, the hydrodynamic equations were derived from this theory and its transport coefficients were given in terms of infinite series. Here, I show that the transport coefficients take a simple form in the large density limit. This allows me to analytically evaluate the well-known density instability of the polarly ordered phase near the flocking threshold at moderate and large densities. The growth rate of a longitudinal perturbation is calculated and several scaling regimes, including three different power laws, are identified. It is shown that at large densities, the restabilization of the ordered phase at smaller noise is analytically accessible within the range of validity of the hydrodynamic theory. Analytical predictions for the width of the unstable band, the maximum growth rate, and for the wave number below which the instability occurs are given. In particular, the system size below which spatial perturbations of the homogeneous ordered state are stable is predicted to scale with √M, where M is the average number of collision partners. The typical time scale until the instability becomes visible is calculated and is proportional to M.

  5. Development of an instrument to measure Faculty's information and communication technology access (FICTA).

    PubMed

    Soomro, Kamal Ahmed; Kale, Ugur; Curtis, Reagan; Akcaoglu, Mete; Bernstein, Malayna

    2018-01-01

    The phenomenon of "digital divide" is complex and multidimensional, extending beyond issues of physical access. The purpose of this study was to develop a scale to measure a range of factors related to digital divide among higher education faculty and to evaluate its reliability and validity. Faculty's Information and Communication Technology Access (FICTA) scale was tested and validated with 322 faculty teaching in public and private sector universities. Principal components analysis with varimax rotation confirmed an 8-factor solution corresponding to various dimensions of ICT access. The 57-item FICTA scale demonstrated good psychometric properties and offers researchers a tool to examine faculty's access to ICT at four levels - motivational, physical, skills, and usage access.
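
    For readers who want to reproduce this style of analysis, the sketch below runs an 8-factor extraction with varimax rotation in scikit-learn on placeholder data; it stands in for, and is not identical to, the principal components analysis reported in the paper:

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      # Placeholder responses: 322 respondents x 57 items (random noise here;
      # real item responses would be Likert-type scores).
      X = np.random.default_rng(0).standard_normal((322, 57))

      # Eight factors with varimax rotation (scikit-learn >= 0.24 exposes
      # varimax through FactorAnalysis, a close cousin of rotated PCA).
      fa = FactorAnalysis(n_components=8, rotation='varimax').fit(X)
      loadings = fa.components_.T                    # 57 items x 8 factors
      dominant = np.abs(loadings).argmax(axis=1)     # factor each item loads on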

  6. Engineering two-photon high-dimensional states through quantum interference

    PubMed Central

    Zhang, Yingwen; Roux, Filippus S.; Konrad, Thomas; Agnew, Megan; Leach, Jonathan; Forbes, Andrew

    2016-01-01

    Many protocols in quantum science, for example, linear optical quantum computing, require access to large-scale entangled quantum states. Such systems can be realized through many-particle qubits, but this approach often suffers from scalability problems. An alternative strategy is to consider a lesser number of particles that exist in high-dimensional states. The spatial modes of light are one such candidate that provides access to high-dimensional quantum states, and thus they increase the storage and processing potential of quantum information systems. We demonstrate the controlled engineering of two-photon high-dimensional states entangled in their orbital angular momentum through Hong-Ou-Mandel interference. We prepare a large range of high-dimensional entangled states and implement precise quantum state filtering. We characterize the full quantum state before and after the filter, and are thus able to determine that only the antisymmetric component of the initial state remains. This work paves the way for high-dimensional processing and communication of multiphoton quantum states, for example, in teleportation beyond qubits. PMID:26933685

  7. Accessing the bottleneck in all-solid state batteries, lithium-ion transport over the solid-electrolyte-electrode interface.

    PubMed

    Yu, Chuang; Ganapathy, Swapna; Eck, Ernst R H van; Wang, Heng; Basak, Shibabrata; Li, Zhaolong; Wagemaker, Marnix

    2017-10-20

    Solid-state batteries potentially offer increased lithium-ion battery energy density and safety as required for large-scale production of electrical vehicles. One of the key challenges toward high-performance solid-state batteries is the large impedance posed by the electrode-electrolyte interface. However, direct assessment of the lithium-ion transport across realistic electrode-electrolyte interfaces is tedious. Here we report two-dimensional lithium-ion exchange NMR accessing the spontaneous lithium-ion transport, providing insight on the influence of electrode preparation and battery cycling on the lithium-ion transport over the interface between an argyrodite solid-electrolyte and a sulfide electrode. Interfacial conductivity is shown to depend strongly on the preparation method and demonstrated to drop dramatically after a few electrochemical (dis)charge cycles due to both losses in interfacial contact and increased diffusional barriers. The reported exchange NMR facilitates non-invasive and selective measurement of lithium-ion interfacial transport, providing insight that can guide the electrolyte-electrode interface design for future all-solid-state batteries.

  8. Mahanaxar: quality of service guarantees in high-bandwidth, real-time streaming data storage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bigelow, David; Bent, John; Chen, Hsing-Bung

    2010-04-05

    Large radio telescopes, cyber-security systems monitoring real-time network traffic, and others have specialized data storage needs: guaranteed capture of an ultra-high-bandwidth data stream, retention of the data long enough to determine what is 'interesting,' retention of interesting data indefinitely, and concurrent read/write access to determine what data is interesting, without interrupting the ongoing capture of incoming data. Mahanaxar addresses this problem. Mahanaxar guarantees streaming real-time data capture at (nearly) the full rate of the raw device, allows concurrent read and write access to the device on a best-effort basis without interrupting the data capture, and retains data as long as possible given the available storage. It has built-in mechanisms for reliability and indexing, can scale to meet arbitrary bandwidth requirements, and handles both small and large data elements equally well. Results from our prototype implementation show that Mahanaxar provides both better guarantees and better performance than traditional file systems.
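
    A minimal sketch of the capture policy described above (class and method names are illustrative, not from Mahanaxar): writes always succeed, evicting the oldest data when the store is full, while readers take best-effort snapshots that never block the capture path:

      import collections
      import threading

      class RingStore:
          """Guaranteed-write, best-effort-read buffer (illustrative only)."""

          def __init__(self, capacity_records):
              # deque(maxlen=...) silently evicts the oldest record when full,
              # so the capture path can never be refused.
              self._buf = collections.deque(maxlen=capacity_records)
              self._lock = threading.Lock()

          def capture(self, record):
              # Guaranteed-rate path: O(1) append under a short-lived lock.
              with self._lock:
                  self._buf.append(record)

          def read_snapshot(self, n):
              # Best-effort path: copy out up to n newest records; the lock
              # is held only for the copy, never while data is analyzed.
              with self._lock:
                  return list(self._buf)[-n:]

      store = RingStore(capacity_records=1_000_000)
      store.capture(b'sample-0')
      print(store.read_snapshot(10))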

  9. Exploring Google Earth Engine platform for big data processing: classification of multi-temporal satellite imagery for crop mapping

    NASA Astrophysics Data System (ADS)

    Shelestov, Andrii; Lavreniuk, Mykola; Kussul, Nataliia; Novikov, Alexei; Skakun, Sergii

    2017-02-01

    Many applied problems arising in agricultural monitoring and food security require reliable crop maps at national or global scale. Large-scale crop mapping requires processing and management of large amounts of heterogeneous satellite imagery acquired by various sensors, which consequently leads to a “Big Data” problem. The main objective of this study is to explore the efficiency of using the Google Earth Engine (GEE) platform when classifying multi-temporal satellite imagery, with potential to apply the platform at a larger scale (e.g. country level) and to multiple sensors (e.g. Landsat-8 and Sentinel-2). In particular, multiple state-of-the-art classifiers available in the GEE platform are compared to produce a high resolution (30 m) crop classification map for a large territory (~28,100 km2, including 1.0 Mha of cropland). Though this study does not involve large volumes of data, it does address the efficiency of the GEE platform in executing the complex workflows of satellite data processing required by large-scale applications such as crop mapping. The study discusses strengths and weaknesses of classifiers, assesses the accuracies that can be achieved with different classifiers for the Ukrainian landscape, and compares them to a benchmark neural network classifier developed in our previous studies. The study is carried out for the Joint Experiment of Crop Assessment and Monitoring (JECAM) test site in Ukraine covering the Kyiv region (North of Ukraine) in 2013. We found that Google Earth Engine (GEE) provides very good performance in terms of enabling access to remote sensing products through the cloud platform and providing pre-processing; however, in terms of classification accuracy, the neural network based approach outperformed the support vector machine (SVM), decision tree and random forest classifiers available in GEE.
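
    A hedged sketch of the kind of GEE workflow the study describes; the area of interest, collection ID, band list and labelled-sample asset are placeholders, and the random forest classifier is named per the current Earth Engine Python API rather than the 2017 one:

      import ee
      ee.Initialize()  # assumes an authenticated Earth Engine account

      region = ee.Geometry.Rectangle([30.0, 50.0, 31.0, 50.7])  # placeholder AOI
      # Growing-season median composite (collection ID and bands assumed).
      composite = (ee.ImageCollection('LANDSAT/LC08/C02/T1_TOA')
                   .filterBounds(region)
                   .filterDate('2013-04-01', '2013-10-31')
                   .median())
      bands = ['B2', 'B3', 'B4', 'B5', 'B6', 'B7']

      # Hypothetical FeatureCollection of labelled ground-truth points.
      points = ee.FeatureCollection('users/example/crop_samples')
      training = composite.select(bands).sampleRegions(
          collection=points, properties=['class'], scale=30)

      classifier = ee.Classifier.smileRandomForest(100).train(
          features=training, classProperty='class', inputProperties=bands)
      crop_map = composite.select(bands).classify(classifier)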

  10. The emergence of retail-based clinics in the United States: early observations.

    PubMed

    Laws, Margaret; Scott, Mary Kate

    2008-01-01

    Retail-based clinics have proliferated rapidly in the past two years, with approximately 1,000 sites in thirty-seven states representing almost three million cumulative visits. Clinic operators have evolved from a dispersed group of privately financed concerns to a concentrated, largely corporate-owned group. A major development has been the move to large-scale acceptance of insurance, deviating from the initial cash-pay model. Consumers' acceptance and the fact that the clinics appear to increase access for both the uninsured and the insured has encouraged providers and policymakers to consider this approach to basic, acute care while seeking a better understanding of these clinics.

  11. NASA Out-of-Autoclave Process Technology Development

    NASA Technical Reports Server (NTRS)

    Johnston, Norman, J.; Clinton, R. G., Jr.; McMahon, William M.

    2000-01-01

    Polymer matrix composites (PMCs) will play a significant role in the construction of large reusable launch vehicles (RLVs), mankind's future major access to low earth orbit and the International Space Station. PMCs are lightweight and offer attractive economies of scale and automated fabrication methodology. Fabrication of large RLV structures will require non-autoclave methods that have yet to mature, including (1) thermoplastic forming: heated head robotic tape placement, sheet extrusion, pultrusion, molding and forming; (2) electron beam curing: bulk and ply-by-ply automated placement; (3) RTM and VARTM. Research sponsored by NASA in industrial and NASA laboratories on automated placement techniques involving the first two categories will be presented.

  12. Opal web services for biomedical applications.

    PubMed

    Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W

    2010-07-01

    Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has since grown to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web forms-based access and programmatic access to all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
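
    A hypothetical sketch of the programmatic SOAP access pattern described above, using the third-party suds client; the WSDL URL, operation names and argument layout follow common Opal 2 usage but are assumptions here, not verified against the live server:

      # Requires the third-party 'suds' SOAP client (pip install suds-community).
      from suds.client import Client

      wsdl = 'http://ws.nbcr.net/opal2/services/MEME?wsdl'  # hypothetical endpoint
      client = Client(wsdl)

      # Submit a job with plain command-line arguments, then poll its status;
      # launchJob/queryStatus mirror Opal 2 conventions but are unverified here.
      resp = client.service.launchJob(argList='sequences.fasta -dna -nmotifs 3')
      status = client.service.queryStatus(resp.jobID)
      print(status.code, status.baseURL)  # job outputs are then served over HTTP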

  13. Anisotropic modulus stabilisation: strings at LHC scales with micron-sized extra dimensions

    NASA Astrophysics Data System (ADS)

    Cicoli, M.; Burgess, C. P.; Quevedo, F.

    2011-10-01

    We construct flux-stabilised Type IIB string compactifications whose extra dimensions have very different sizes, and use these to describe several types of vacua with a TeV string scale. Because we can access regimes where two dimensions are hierarchically larger than the other four, we find examples where two dimensions are micron-sized while the other four are at the weak scale, in addition to more standard examples with all six extra dimensions equally large. Besides providing ultraviolet completeness, the phenomenology of these models is richer than vanilla large-dimensional models in several generic ways: (i) they are supersymmetric, with supersymmetry broken at sub-eV scales in the bulk but only nonlinearly realised in the Standard Model sector, leading to no MSSM superpartners for ordinary particles and many more bulk missing-energy channels, as in supersymmetric large extra dimensions (SLED); (ii) small cycles in the more complicated extra-dimensional geometry allow some KK states to reside at TeV scales even if all six extra dimensions are nominally much larger; (iii) a rich spectrum of string and KK states at TeV scales; and (iv) an equally rich spectrum of very light moduli exist having unusually small (but technically natural) masses, with potentially interesting implications for cosmology and astrophysics that nonetheless evade new-force constraints. The hierarchy problem is solved in these models because the extra-dimensional volume is naturally stabilised at exponentially large values: the extra dimensions are Calabi-Yau geometries with a 4D K3- or T4-fibration over a 2D base, with moduli stabilised within the well-established LARGE-Volume scenario. The new technical step is the use of poly-instanton corrections to the superpotential (which, unlike for simpler models, are likely to be present on K3- or T4-fibered Calabi-Yau compactifications) to obtain a large hierarchy between the sizes of different dimensions. For several scenarios we identify the low-energy spectrum and briefly discuss some of their astrophysical, cosmological and phenomenological implications.

  14. Constraints Imposed by the Membrane Selectively Guide the Alternating Access Dynamics of the Glutamate Transporter GltPh

    PubMed Central

    Lezon, Timothy R.; Bahar, Ivet

    2012-01-01

    Substrate transport in sodium-coupled amino acid symporters involves a large-scale conformational change that shifts the access to the substrate-binding site from one side of the membrane to the other. The structural change is particularly substantial and entails a unique piston-like quaternary rearrangement in glutamate transporters, as evidenced by the difference between the outward-facing and inward-facing structures resolved for the archaeal aspartate transporter GltPh. These structural changes occur over time and length scales that extend beyond the reach of current fully atomic models, but are regularly explored with the use of elastic network models (ENMs). Despite their success with other membrane proteins, ENM-based approaches for exploring the collective dynamics of GltPh have fallen short of providing a plausible mechanism. This deficiency is attributed here to the anisotropic constraints imposed by the membrane, which are not incorporated into conventional ENMs. Here we employ two novel (to our knowledge) ENMs to demonstrate that one can largely capture the experimentally observed structural change using only the few lowest-energy modes of motion that are intrinsically accessible to the transporter, provided that the surrounding lipid molecules are incorporated into the ENM. The presence of the membrane reduces the overall energy of the transition compared with conventional models, showing that the membrane not only guides the selected mechanism but also acts as a facilitator. Finally, we show that the dynamics of GltPh is biased toward transitions of individual subunits of the trimer rather than cooperative transitions of all three subunits simultaneously, suggesting a mechanism of transport that exploits the intrinsic dynamics of individual subunits. Our software is available online at http://www.membranm.csb.pitt.edu. PMID:22455916

  15. Constraints imposed by the membrane selectively guide the alternating access dynamics of the glutamate transporter GltPh.

    PubMed

    Lezon, Timothy R; Bahar, Ivet

    2012-03-21

    Substrate transport in sodium-coupled amino acid symporters involves a large-scale conformational change that shifts the access to the substrate-binding site from one side of the membrane to the other. The structural change is particularly substantial and entails a unique piston-like quaternary rearrangement in glutamate transporters, as evidenced by the difference between the outward-facing and inward-facing structures resolved for the archaeal aspartate transporter Glt(Ph). These structural changes occur over time and length scales that extend beyond the reach of current fully atomic models, but are regularly explored with the use of elastic network models (ENMs). Despite their success with other membrane proteins, ENM-based approaches for exploring the collective dynamics of Glt(Ph) have fallen short of providing a plausible mechanism. This deficiency is attributed here to the anisotropic constraints imposed by the membrane, which are not incorporated into conventional ENMs. Here we employ two novel (to our knowledge) ENMs to demonstrate that one can largely capture the experimentally observed structural change using only the few lowest-energy modes of motion that are intrinsically accessible to the transporter, provided that the surrounding lipid molecules are incorporated into the ENM. The presence of the membrane reduces the overall energy of the transition compared with conventional models, showing that the membrane not only guides the selected mechanism but also acts as a facilitator. Finally, we show that the dynamics of Glt(Ph) is biased toward transitions of individual subunits of the trimer rather than cooperative transitions of all three subunits simultaneously, suggesting a mechanism of transport that exploits the intrinsic dynamics of individual subunits. Our software is available online at http://www.membranm.csb.pitt.edu. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  16. A ground-based radar network to access the 3D structure of MLT winds

    NASA Astrophysics Data System (ADS)

    Stober, G.; Chau, J. L.; Wilhelm, S.; Jacobi, C.

    2016-12-01

    The mesosphere/lower thermosphere (MLT) is a highly variable atmospheric region driven by wave dynamics at various scales, including planetary waves, tides and gravity waves. Some of these propagate through the MLT into the thermosphere/ionosphere, carrying energy and momentum from the middle atmosphere into the upper atmosphere. To improve our understanding of the wave energetics and momentum transfer during their dissipation, it is essential to characterize their space-time properties. During the last two years we developed a new experimental approach to access the horizontal structure of wind fields at the MLT using a meteor radar network in Germany, which we call MMARIA - Multi-static Multi-frequency Agile Radar for Investigation of the Atmosphere. The network combines classical backscatter meteor radars and passive forward-scatter radio links. We present our preliminary results using up to 7 different active and passive radio links to obtain horizontally resolved wind fields by applying a statistical inverse method. The wind fields are retrieved with 15-30 minute temporal resolution on a grid with 30x30 km horizontal spacing. Depending on the number of observed meteors, we are able to apply the wind field inversion at heights between 84-94 km. The horizontally resolved wind fields provide insights into the typical horizontal gravity wavelength and the energy cascade from large scales to small scales. We present first power spectra indicating the transition from the synoptic wave scale to the gravity wave scale.
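
    The full MMARIA retrieval solves a regularized inverse problem on a horizontal grid; the sketch below shows only the core projection geometry for a single grid cell, under the assumed conventions that azimuth is measured clockwise from north and vertical wind is neglected:

      import numpy as np

      def fit_horizontal_wind(vr, azimuth, elevation):
          """Least-squares horizontal wind (u, v) from meteor radial velocities.

          vr        : observed radial velocities (m/s), shape (N,)
          azimuth   : azimuth of each meteor echo (rad, clockwise from north)
          elevation : elevation of each echo (rad); vertical wind neglected
          """
          # Each echo sees the projection of the wind onto its line of sight:
          #   vr = u*sin(az)*cos(el) + v*cos(az)*cos(el)
          A = np.column_stack([np.sin(azimuth) * np.cos(elevation),
                               np.cos(azimuth) * np.cos(elevation)])
          (u, v), *_ = np.linalg.lstsq(A, vr, rcond=None)
          return u, v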

  17. Efficient analysis of large-scale genome-wide data with two R packages: bigstatsr and bigsnpr.

    PubMed

    Privé, Florian; Aschard, Hugues; Ziyatdinov, Andrey; Blum, Michael G B

    2017-03-30

    Genome-wide datasets produced for association studies have dramatically increased in size over the past few years, with modern datasets commonly including millions of variants measured in tens of thousands of individuals. This increase in data size is a major challenge severely slowing down genomic analyses, leading to some software becoming obsolete and researchers having limited access to diverse analysis tools. Here we present two R packages, bigstatsr and bigsnpr, allowing the analysis of large-scale genomic data to be performed within R. To address large data size, the packages use memory-mapping for accessing data matrices stored on disk instead of in RAM. To perform data pre-processing and data analysis, the packages integrate most of the tools that are commonly used, either through transparent system calls to existing software, or through updated or improved implementations of existing methods. In particular, the packages implement fast and accurate computations of principal component analysis and association studies, functions to remove SNPs in linkage disequilibrium, and algorithms to learn polygenic risk scores on millions of SNPs. We illustrate applications of the two R packages by analyzing a case-control genomic dataset for celiac disease, performing an association study and computing polygenic risk scores. Finally, we demonstrate the scalability of the R packages by analyzing a simulated genome-wide dataset including 500,000 individuals and 1 million markers on a single desktop computer. https://privefl.github.io/bigstatsr/ & https://privefl.github.io/bigsnpr/. florian.prive@univ-grenoble-alpes.fr & michael.blum@univ-grenoble-alpes.fr. Supplementary materials are available at Bioinformatics online.
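
    bigstatsr implements this with its own file-backed matrix format in R; the sketch below illustrates the same memory-mapping idea in Python/numpy on small placeholder data, computing per-SNP allele frequencies block by block without ever loading the full matrix:

      import numpy as np

      n, m = 10_000, 50_000  # individuals x SNPs (small demo sizes)
      # Write a genotype matrix to disk once, one byte per dosage (0/1/2).
      G = np.memmap('genotypes.bin', dtype=np.int8, mode='w+', shape=(n, m))
      G[:] = np.random.default_rng(0).integers(0, 3, size=(n, m), dtype=np.int8)
      G.flush()

      # Reopen read-only: only the blocks actually touched are paged into RAM.
      G = np.memmap('genotypes.bin', dtype=np.int8, mode='r', shape=(n, m))
      freq = np.empty(m)
      for j0 in range(0, m, 4096):           # column blocks, never the full matrix
          block = G[:, j0:j0 + 4096]
          freq[j0:j0 + block.shape[1]] = block.mean(axis=0) / 2.0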

  18. "We get by with a little help from our friends": Small-scale informal and large-scale formal peer distribution networks of sterile injecting equipment in Australia.

    PubMed

    Newland, Jamee; Newman, Christy; Treloar, Carla

    2016-08-01

    In Australia, sterile needles and syringes are distributed to people who inject drugs (PWID) through formal services for the purposes of preventing blood borne viruses (BBV). Peer distribution involves people acquiring needles from formal services and redistributing them to others. This paper investigates the dynamics of the distribution of sterile injecting equipment among networks of people who inject drugs in four sites in New South Wales (NSW), Australia. Qualitative data exploring the practice of peer distribution were collected through in-depth, semi-structured interviews and participatory social network mapping. These interviews explored injecting equipment demand, access to services, relationship pathways through which peer distribution occurred, an estimate of the size of the different peer distribution roles and participants' understanding of the illegality of peer distribution in NSW. Data were collected from 32 participants, and 31 (98%) reported participating in peer distribution in the months prior to interview. Of those 31 participants, five reported large-scale formal distribution, with an estimated volume of 34,970 needles and syringes annually. Twenty-two participated in reciprocal exchange, where equipment was distributed and received on an informal basis that appeared dependent on context and circumstance and four participants reported recipient peer distribution as their only access to sterile injecting equipment. Most (n=27) were unaware that it was illegal to distribute injecting equipment to their peers. Peer distribution was almost ubiquitous amongst the PWID participating in the study, and although five participants reported taking part in the highly organised, large-scale distribution of injecting equipment for altruistic reasons, peer distribution was more commonly reported to take place in small networks of friends and/or partners for reasons of convenience. The law regarding the illegality of peer distribution needs to change so that NSPs can capitalise on peer distribution to increase the options available to PWID and to acknowledge PWID as essential harm reduction agents in the prevention of BBVs. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. A Memory-Based Programmable Logic Device Using Look-Up Table Cascade with Synchronous Static Random Access Memories

    NASA Astrophysics Data System (ADS)

    Nakamura, Kazuyuki; Sasao, Tsutomu; Matsuura, Munehiro; Tanaka, Katsumasa; Yoshizumi, Kenichi; Nakahara, Hiroki; Iguchi, Yukihiro

    2006-04-01

    A large-scale memory-technology-based programmable logic device (PLD) using a look-up table (LUT) cascade is developed in the 0.35-μm standard complementary metal oxide semiconductor (CMOS) logic process. Eight 64 K-bit synchronous SRAMs are connected to form an LUT cascade with a few additional circuits. The features of the LUT cascade include: 1) a flexible cascade connection structure, 2) multi-phase pseudo-asynchronous operation with synchronous static random access memory (SRAM) cores, and 3) LUT-bypass redundancy. This chip operates at 33 MHz in 8-LUT cascades at 122 mW. Benchmark results show that it achieves performance comparable to field-programmable gate arrays (FPGAs).
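
    To make the LUT-cascade idea concrete, here is a small software model (a sketch, unrelated to the chip's actual configuration format): each cell looks up its output rails from a table addressed by the incoming rail bits plus a few fresh input bits, and the cells are evaluated in sequence:

      def eval_lut_cascade(cells, inputs_per_cell, x):
          """Evaluate a LUT cascade on the input bit-vector x.

          cells : one lookup table per cell; each table is addressed by the
                  incoming rail bits concatenated with that cell's fresh
                  primary-input bits, and returns the outgoing rail bits.
          """
          rails, pos = 0, 0
          for table in cells:
              bits = 0
              for _ in range(inputs_per_cell):
                  bits = (bits << 1) | x[pos]
                  pos += 1
              rails = table[(rails << inputs_per_cell) | bits]
          return rails

      # 8-input parity with one rail bit and two inputs per cell:
      # rails' = rails XOR b1 XOR b0, tabulated over the 3 address bits.
      parity_cell = [0, 1, 1, 0, 1, 0, 0, 1]
      print(eval_lut_cascade([parity_cell] * 4, 2, [1, 0, 1, 1, 0, 0, 1, 0]))  # 0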

  20. The Ensembl REST API: Ensembl Data for Any Language

    PubMed Central

    Yates, Andrew; Beal, Kathryn; Keenan, Stephen; McLaren, William; Pignatelli, Miguel; Ritchie, Graham R. S.; Ruffier, Magali; Taylor, Kieron; Vullo, Alessandro; Flicek, Paul

    2015-01-01

    Motivation: We present a Web service to access Ensembl data using Representational State Transfer (REST). The Ensembl REST server enables the easy retrieval of a wide range of Ensembl data by most programming languages, using standard formats such as JSON and FASTA while minimizing client work. We also introduce bindings to the popular Ensembl Variant Effect Predictor tool permitting large-scale programmatic variant analysis independent of any specific programming language. Availability and implementation: The Ensembl REST API can be accessed at http://rest.ensembl.org and source code is freely available under an Apache 2.0 license from http://github.com/Ensembl/ensembl-rest. Contact: ayates@ebi.ac.uk or flicek@ebi.ac.uk. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25236461
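
    A minimal Python example of the REST access pattern the paper describes; the endpoint and gene ID follow the public documentation, though the exact response fields printed here are assumptions:

      import requests

      server = 'https://rest.ensembl.org'
      # Look up a gene by stable ID and ask for JSON back.
      r = requests.get(server + '/lookup/id/ENSG00000157764',
                       headers={'Content-Type': 'application/json'})
      r.raise_for_status()
      gene = r.json()
      print(gene['display_name'], gene['seq_region_name'], gene['start'], gene['end'])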

  1. The Australian Computational Earth Systems Simulator

    NASA Astrophysics Data System (ADS)

    Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.

    2001-12-01

    Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.

  2. The development of a multi-dimensional gambling accessibility scale.

    PubMed

    Hing, Nerilee; Haw, John

    2009-12-01

    The aim of the current study was to develop a scale of gambling accessibility that would have theoretical significance to exposure theory and also serve to highlight the accessibility risk factors for problem gambling. Scale items were generated from the Productivity Commission's (Australia's Gambling Industries: Report No. 10. AusInfo, Canberra, 1999) recommendations and tested on a group with high exposure to the gambling environment. In total, 533 gaming venue employees (aged 18-70 years; 67% women) completed a questionnaire that included six 13-item scales measuring accessibility across a range of gambling forms (gaming machines, keno, casino table games, lotteries, horse and dog racing, sports betting). Also included in the questionnaire was the Problem Gambling Severity Index (PGSI) along with measures of gambling frequency and expenditure. Principal components analysis indicated that a common three factor structure existed across all forms of gambling and these were labelled social accessibility, physical accessibility and cognitive accessibility. However, convergent validity was not demonstrated with inconsistent correlations between each subscale and measures of gambling behaviour. These results are discussed in light of exposure theory and the further development of a multi-dimensional measure of gambling accessibility.

  3. Communication architecture for large geostationary platforms

    NASA Technical Reports Server (NTRS)

    Bond, F. E.

    1979-01-01

    Large platforms have been proposed for supporting multipurpose communication payloads to exploit economy of scale, reduce congestion in the geostationary orbit, provide interconnectivity between diverse earth stations, and obtain significant frequency reuse with large multibeam antennas. This paper addresses a specific system design, starting with traffic projections for the next two decades and discussing tradeoffs and design approaches for major components including antennas, transponders, and switches. Other issues explored are selection of frequency bands, modulation, multiple access, switching methods, and techniques for servicing areas with nonuniform traffic demands. Three major services are considered: a high-volume trunking system, a direct-to-user system, and a broadcast system for video distribution and similar functions. Estimates of payload weight and d.c. power requirements are presented. Other subjects treated are: considerations of equipment layout for servicing by an orbit transfer vehicle, mechanical stability requirements for the large antennas, and reliability aspects of the large number of transponders employed.

  4. Characterizing stroke lesions using digital templates and lesion quantification tools in a web-based imaging informatics system for a large-scale stroke rehabilitation clinical trial

    NASA Astrophysics Data System (ADS)

    Wang, Ximing; Edwardson, Matthew; Dromerick, Alexander; Winstein, Carolee; Wang, Jing; Liu, Brent

    2015-03-01

    Previously, we presented an Interdisciplinary Comprehensive Arm Rehabilitation Evaluation (ICARE) imaging informatics system that supports a large-scale phase III stroke rehabilitation trial. The ePR system is capable of displaying anonymized patient imaging studies and reports, and the system is accessible to multiple clinical trial sites and users across the United States via the web. However, prior multicenter stroke rehabilitation trials lacked any significant neuroimaging analysis infrastructure. In stroke related clinical trials, identification of the stroke lesion characteristics can be meaningful as recent research shows that lesion characteristics are related to stroke scale and functional recovery after stroke. To facilitate the stroke clinical trials, we hope to gain insight into specific lesion characteristics, such as vascular territory, for patients enrolled into large stroke rehabilitation trials. To enhance the system's capability for data analysis and data reporting, we have integrated new features with the system: a digital brain template display, a lesion quantification tool and a digital case report form. The digital brain templates are compiled from published vascular territory templates at each of 5 angles of incidence. These templates were updated to include territories in the brainstem using a vascular territory atlas and the Medical Image Processing, Analysis and Visualization (MIPAV) tool. The digital templates are displayed for side-by-side comparisons and transparent template overlay onto patients' images in the image viewer. The lesion quantification tool quantifies planimetric lesion area from a user-defined contour. The digital case report form stores user input into a database, then displays contents in the interface to allow for reviewing, editing, and new inputs. In sum, the newly integrated system features provide the user with readily-accessible web-based tools to identify the vascular territory involved, estimate lesion area, and store these results in a web-based digital format.
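
    The paper does not spell out its area algorithm; the standard way to turn a user-defined contour into a planimetric area is the shoelace formula, sketched here with an assumed pixel-to-millimeter calibration:

      import numpy as np

      def contour_area(xy, mm_per_pixel=1.0):
          """Planimetric area enclosed by a user-drawn contour (shoelace formula).

          xy : (N, 2) array of contour vertices in pixel coordinates.
          """
          x, y = xy[:, 0], xy[:, 1]
          area_px = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
          return area_px * mm_per_pixel ** 2

      square = np.array([[0, 0], [10, 0], [10, 10], [0, 10]])
      print(contour_area(square, mm_per_pixel=0.5))  # 100 px^2 -> 25.0 mm^2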

  5. Security and Efficiency Concerns With Distributed Collaborative Networking Environments

    DTIC Science & Technology

    2003-09-01

    have the ability to access Web communications services of the WebEx MediaTone Network from a single login. [24] WebEx provides a range of secure...Web. WebEx services enable secure data, voice and video communications through the browser and are supported by the WebEx MediaTone Network, a global...designed to host large-scale, structured events and conferences, featuring a Q&A Manager that allows multiple moderators to handle questions while

  6. One-pot synthesis of hypervalent iodine reagents for electrophilic trifluoromethylation.

    PubMed

    Matoušek, Václav; Pietrasiak, Ewa; Schwenk, Rino; Togni, Antonio

    2013-07-05

    Simplified syntheses suited for large scale preparations of the two hypervalent iodine reagents 1 and 2 for electrophilic trifluoromethylation are reported. In both cases, the stoichiometric oxidants sodium metaperiodate and tert-butyl hypochlorite have been replaced by trichloroisocyanuric acid. Reagent 1 is accessible in a one-pot procedure from 2-iodobenzoic acid in 72% yield. Reagent 2 was prepared via fluoroiodane 11 in a considerably shorter reaction time and with no need of an accurate temperature control.

  7. What retail clinic growth can teach us about patient demand. Threat or opportunity: retail clinic popularity is about convenience.

    PubMed

    Zamosky, Lisa

    2014-01-10

    Access: It's one word that may ultimately reignite the expansion of retail medicine in 2014 and beyond. CVS Caremark has added 200 new clinics since 2011, with another 850 planned by 2017. While it's still too soon to predict a large-scale national expansion in clinic numbers, some experts believe their calling card--convenience--should be a consideration for every medical practice in the United States.

  8. Demonstration of nanoimprinted hyperlens array for high-throughput sub-diffraction imaging

    NASA Astrophysics Data System (ADS)

    Byun, Minsueop; Lee, Dasol; Kim, Minkyung; Kim, Yangdoo; Kim, Kwan; Ok, Jong G.; Rho, Junsuk; Lee, Heon

    2017-04-01

    Overcoming the resolution limit of conventional optics is regarded as the most important issue in optical imaging science and technology. Although hyperlenses, super-resolution imaging devices based on highly anisotropic dispersion relations that allow access to high-wavevector components, have recently achieved far-field sub-diffraction imaging in real time, the previously demonstrated devices have suffered from the extreme difficulty of both the fabrication process and the placement of non-artificial objects. This results in restrictions on the practical applications of hyperlens devices. While implementing large-scale hyperlens arrays in conventional microscopy is desirable to solve such issues, it has not been feasible to fabricate such large-scale hyperlens arrays with the previously used nanofabrication methods. Here, we suggest a scalable and reliable fabrication process for a large-scale hyperlens device based on direct pattern transfer techniques. We fabricate a 5 cm × 5 cm hyperlens array and experimentally demonstrate that it can resolve sub-diffraction features down to 160 nm under 410 nm wavelength visible light. The array-based hyperlens device will provide a simple solution for much more practical far-field and real-time super-resolution imaging, which can be widely used in optics, biology, medical science, nanotechnology and other closely related interdisciplinary fields.

  9. Accelerating large-scale protein structure alignments with graphics processing units

    PubMed Central

    2012-01-01

    Background: Large-scale protein structure alignment, an indispensable tool to structural bioinformatics, poses a tremendous challenge on computational resources. To ensure structure alignment accuracy and efficiency, efforts have been made to parallelize traditional alignment algorithms in grid environments. However, these solutions are costly and of limited accessibility. Others trade alignment quality for speedup by using high-level characteristics of structure fragments for structure comparisons. Findings: We present ppsAlign, a parallel protein structure alignment framework designed and optimized to exploit the parallelism of Graphics Processing Units (GPUs). As a general-purpose GPU platform, ppsAlign could take many concurrent methods, such as TM-align and Fr-TM-align, into the parallelized algorithm design. We evaluated ppsAlign on an NVIDIA Tesla C2050 GPU card, and compared it with existing software solutions running on an AMD dual-core CPU. We observed a 36-fold speedup over TM-align, a 65-fold speedup over Fr-TM-align, and a 40-fold speedup over MAMMOTH. Conclusions: ppsAlign is a high-performance protein structure alignment tool designed to tackle the computational complexity issues from protein structural data. The solution presented in this paper allows large-scale structure comparisons to be performed using the massive parallel computing power of GPUs. PMID:22357132

  10. Reducing HIV infection among new injecting drug users in the China-Vietnam Cross Border Project.

    PubMed

    Des Jarlais, Don C; Kling, Ryan; Hammett, Theodore M; Ngu, Doan; Liu, Wei; Chen, Yi; Binh, Kieu Thanh; Friedmann, Patricia

    2007-12-01

    To assess an HIV prevention programme for injecting drug users (IDU) in the cross-border area between China and Vietnam. Serial cross-sectional surveys (0, 6, 12, 18, 24 and 36 months) of community-recruited current IDU. The project included peer educator outreach and the large-scale distribution of sterile injection equipment. Serial cross-sectional surveys with HIV testing of community-recruited IDU were conducted at baseline (before implementation) and 6, 12, 18, 24 and 36 months post-baseline. HIV prevalence and estimated HIV incidence among new injectors (individuals injecting drugs for <3 years) in each survey wave were the primary outcome measures. The percentages of new injectors among all subjects declined across survey waves in both Ning Ming and Lang Son. HIV prevalence and estimated incidence fell by approximately half at the 24-month survey and by approximately three quarters at the 36-month survey in both areas (all P < 0.01). The implementation of large-scale outreach and syringe access programmes was followed by substantial reductions in HIV infection among new injectors, with no evidence of any increase in individuals beginning to inject drugs. This project may serve as a model for large-scale HIV prevention programming for IDU in China, Vietnam, and other developing/transitional countries.

  11. An HTML5-Based Pure Website Solution for Rapidly Viewing and Processing Large-Scale 3D Medical Volume Reconstruction on Mobile Internet

    PubMed Central

    Chen, Xin; Zhang, Ye; Zhang, Jingna; Li, Ying; Mo, Xuemei; Chen, Wei

    2017-01-01

    This study aimed to propose a pure web-based solution that lets users access large-scale 3D medical volumes anywhere, with a good user experience and complete details. A novel Master-Slave interaction mode was proposed, combining the advantages of remote volume rendering and surface rendering. On the server side, we designed a message-responding mechanism to listen to interactive requests from clients (Slave model) and to guide Master volume rendering. On the client side, we used HTML5 to normalize user-interactive behaviors on the Slave model and enhance the accuracy of behavior requests and the user-friendliness of the experience. The results showed that more than four independent tasks (each with a data size of 249.4 MB) could be simultaneously carried out with a 100-KBps client bandwidth (extreme test); the first loading time was <12 s, and the response time of each behavior request for the final high-quality image remained at approximately 1 s, while the peak bandwidth value was <50 KBps. Meanwhile, the FPS value for each client was ≥40. This solution could serve users by rapidly accessing the application via one URL hyperlink, without special software and hardware requirements, in a diversified network environment, and could be easily integrated into other telemedical systems seamlessly. PMID:28638406
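
    A minimal sketch of the server-side message-responding idea (the endpoint name, query parameters, and renderer stub are assumptions, not the paper's implementation): the server listens for interaction requests from the lightweight Slave client and answers each with a freshly rendered view of the Master volume:

      from http.server import BaseHTTPRequestHandler, HTTPServer
      from urllib.parse import urlparse, parse_qs

      def render_master_view(azimuth, zoom):
          """Stand-in for the server-side volume renderer (returns image bytes)."""
          return b'...jpeg bytes...'

      class InteractionHandler(BaseHTTPRequestHandler):
          def do_GET(self):
              # Each Slave interaction arrives as a small request, e.g.
              # GET /render?azimuth=45&zoom=1.5, and is answered with a
              # freshly rendered view of the Master volume.
              q = parse_qs(urlparse(self.path).query)
              img = render_master_view(float(q.get('azimuth', ['0'])[0]),
                                       float(q.get('zoom', ['1'])[0]))
              self.send_response(200)
              self.send_header('Content-Type', 'image/jpeg')
              self.end_headers()
              self.wfile.write(img)

      HTTPServer(('', 8080), InteractionHandler).serve_forever()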

  12. An HTML5-Based Pure Website Solution for Rapidly Viewing and Processing Large-Scale 3D Medical Volume Reconstruction on Mobile Internet.

    PubMed

    Qiao, Liang; Chen, Xin; Zhang, Ye; Zhang, Jingna; Wu, Yi; Li, Ying; Mo, Xuemei; Chen, Wei; Xie, Bing; Qiu, Mingguo

    2017-01-01

    This study aimed to propose a pure web-based solution that lets users access large-scale 3D medical volumes anywhere, with a good user experience and complete details. A novel Master-Slave interaction mode was proposed, combining the advantages of remote volume rendering and surface rendering. On the server side, we designed a message-responding mechanism to listen to interactive requests from clients (Slave model) and to guide Master volume rendering. On the client side, we used HTML5 to normalize user-interactive behaviors on the Slave model and enhance the accuracy of behavior requests and the user-friendliness of the experience. The results showed that more than four independent tasks (each with a data size of 249.4 MB) could be simultaneously carried out with a 100-KBps client bandwidth (extreme test); the first loading time was <12 s, and the response time of each behavior request for the final high-quality image remained at approximately 1 s, while the peak bandwidth value was <50 KBps. Meanwhile, the FPS value for each client was ≥40. This solution could serve users by rapidly accessing the application via one URL hyperlink, without special software and hardware requirements, in a diversified network environment, and could be easily integrated into other telemedical systems seamlessly.

  13. A simple method for the production of large volume 3D macroporous hydrogels for advanced biotechnological, medical and environmental applications

    PubMed Central

    Savina, Irina N.; Ingavle, Ganesh C.; Cundy, Andrew B.; Mikhalovsky, Sergey V.

    2016-01-01

    The development of bulk, three-dimensional (3D), macroporous polymers with high permeability, large surface area and large volume is highly desirable for a range of applications in the biomedical, biotechnological and environmental areas. The experimental techniques currently used are limited to the production of cryogel materials of small size and volume. In this work we propose a novel, versatile, simple and reproducible method for the synthesis of large-volume porous polymer hydrogels by cryogelation. By controlling the freezing process of the reagent/polymer solution, large-scale 3D macroporous gels with wide interconnected pores (up to 200 μm in diameter) and large accessible surface area have been synthesized. For the first time, macroporous gels (of up to 400 ml bulk volume) with controlled porous structure were manufactured, with potential for scale-up to much larger gel dimensions. This method can be used for the production of novel 3D multi-component macroporous composite materials with a uniform distribution of embedded particles. The proposed method provides better control of freezing conditions and thus overcomes existing drawbacks limiting the production of large gel-based devices and matrices. The proposed method could serve as a new design concept for the preparation of functional 3D macroporous gels and composites for biomedical, biotechnological and environmental applications. PMID:26883390

  14. 3D granulometry: grain-scale shape and size distribution from point cloud dataset of river environments

    NASA Astrophysics Data System (ADS)

    Steer, Philippe; Lague, Dimitri; Gourdon, Aurélie; Croissant, Thomas; Crave, Alain

    2016-04-01

    The grain-scale morphology of river sediments and their size distribution are important factors controlling the efficiency of fluvial erosion and transport. In turn, constraining the spatial evolution of these two metrics offers deep insight into the dynamics of river erosion and sediment transport from hillslopes to the sea. However, the size distribution of river sediments is generally assessed using statistically biased field measurements, and determining the grain-scale shape of river sediments remains a real challenge in geomorphology. Here we determine the size distribution and grain-scale shape of sediments in river environments with new methodological approaches based on the segmentation and geomorphological fitting of 3D point cloud datasets. Point cloud segmentation is performed using either machine-learning algorithms or geometric criteria, such as local plane fitting or curvature analysis. Once the grains are individualized into sub-clouds, each grain's morphology is determined using a 3D geometrical fitting algorithm applied to its sub-cloud. Although different geometrical models can be conceived and tested, only ellipsoidal models were used in this study. A result-checking phase is then performed to remove grains whose best-fitting model has a low level of confidence. The main benefits of this automatic method are that it provides 1) an unbiased estimate of the grain-size distribution over a large range of scales, from centimeters to tens of meters; 2) access to a very large number of data points, limited only by the number of grains in the point-cloud dataset; and 3) access to the 3D morphology of grains, in turn allowing new metrics characterizing grain size and shape to be developed. The main limitation of this method is that it can only detect grains with a characteristic size greater than the resolution of the point cloud. This new 3D granulometric method is then applied to river terraces in the Poerua catchment in New Zealand and along the Laonong river in Taiwan, whose point clouds were obtained using both terrestrial lidar scanning and structure-from-motion photogrammetry.
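
    As an illustration of the geometrical-fitting step described above, here is a minimal sketch (not the authors' code) that estimates the three semi-axes of a single, already-segmented grain from the eigenvalues of its point covariance; it assumes points sampled throughout the grain volume, for which the variance along each principal axis of a uniform ellipsoid is a_i^2/5.

        import numpy as np


        def fit_ellipsoid_axes(points):
            """Return a >= b >= c semi-axis estimates for one grain's (N, 3) points."""
            centered = points - points.mean(axis=0)
            cov = np.cov(centered, rowvar=False)           # 3x3 covariance matrix
            eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
            # For a uniformly sampled solid ellipsoid, the variance along each
            # principal axis is a_i**2 / 5, hence a_i = sqrt(5 * lambda_i).
            return np.sqrt(5.0 * eigvals)


        # Example: a synthetic grain sampled from an ellipsoid with axes 4, 2, 1 cm.
        rng = np.random.default_rng(0)
        u = rng.normal(size=(5000, 3))
        u /= np.linalg.norm(u, axis=1, keepdims=True)      # directions on unit sphere
        r = rng.uniform(size=(5000, 1)) ** (1 / 3)         # uniform fill of the volume
        points = u * r * np.array([4.0, 2.0, 1.0])
        print(fit_ellipsoid_axes(points))                  # ~ [4, 2, 1]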

  15. Development and analysis of educational technologies for a blended organic chemistry course

    NASA Astrophysics Data System (ADS)

    Evans, Michael James

    Blended courses incorporate elements of both face-to-face and online instruction. The extent to which blended courses are conducted online, and the proper role of the online components of blended courses, have been debated and may vary. What can be said in general, however, is that online tools for blended courses are typically assembled from a variety of sources, are often very large in scale, and may present distractions for students that decrease their utility as teaching tools. Furthermore, large-scale educational technologies may not be amenable to rigorous, detailed study, limiting evaluation of their effectiveness. Small-scale educational technologies run from the instructor's own server have the potential to mitigate many of these issues. Such tools give the instructor or researcher direct access to all available data, facilitating detailed analysis of student use. Code modification is simple and rapid if errors arise, since the code is stored where the instructor can easily access it. Finally, the design of a small-scale tool can target a very specific application. With these ideas in mind, this work describes several projects aimed at exploring the use of small-scale, web-based software in a blended organic chemistry course. A number of activities were developed and evaluated using the Student Assessment of Learning Gains survey, and data from the activities were analyzed using quantitative statistical methods and social network analysis. Findings from this work suggest that small-scale educational technologies provide significant learning benefits for students of organic chemistry, with the important caveat that instructors must offer appropriate levels of technical and pedagogical support for students. Most notably, students reported significant learning gains from activities that included collaborative learning supported by novel online tools. For the particular context of organic chemistry, which has a unique semantic language (Lewis structures), the incorporation of shared video was a novel but important element of these activities. In fields for which mere text would not provide enough information in communications between students, video offers an appealing medium for student-student interaction.

  16. The importance of creating a social business to produce low-cost hearing aids.

    PubMed

    Caccamo, Samantha; Voloshchenko, Anastasia; Dankyi, Nana Yaa

    2014-09-01

    The World Health Organization (WHO) estimates that about 280 million people worldwide have a bilateral hearing loss, most of them living in poor countries. Hearing loss places heavy social burdens on individuals, families, communities and countries. However, owing to a lack of accessibility and affordability, the vast majority of people in the world who need hearing aids do not have access to them. Low-income countries are thus pulled into a disability/poverty spiral. From this standpoint, the production of available, accessible and affordable hearing aids for the poorest populations of our planet should be one of the main issues in global hearing healthcare. Designing and producing a brand-new low-cost hearing aid is the most effective option. Involving a large producer of hearing aids in the creation of a social business to solve the problem of access to affordable hearing aids is an essential step to reduce hearing disability on a large scale globally. Today's technology allows for the creation of a "minimal design" product costing no more than $100-$150, a price that can be lowered further when units are purchased in large quantities and dispensed through alternative models. It is conceivable that, through a sustainable social business, the low-cost product could be sold under a cross-subsidy model in order to recover the overhead costs. Social business is an economic model that has the potential to produce and distribute affordable hearing aids in low- and middle-income countries. Rehabilitation of hearing-impaired children will be carried out in partnership with Sahic (Society of Assistance to Hearing Impaired Children) in Dhaka, Bangladesh, and the ENT Department of Ospedale Burlo di Trieste (Dr. Eva Orzan).

  17. Cross-polar transport and scavenging of Siberian aerosols containing black carbon during the 2012 ACCESS summer campaign

    NASA Astrophysics Data System (ADS)

    Raut, Jean-Christophe; Marelle, Louis; Fast, Jerome D.; Thomas, Jennie L.; Weinzierl, Bernadett; Law, Katharine S.; Berg, Larry K.; Roiger, Anke; Easter, Richard C.; Heimerl, Katharina; Onishi, Tatsuo; Delanoë, Julien; Schlager, Hans

    2017-09-01

    During the ACCESS airborne campaign in July 2012, extensive boreal forest fires resulted in significant aerosol transport to the Arctic. A 10-day episode combining intense biomass burning over Siberia and low-pressure systems over the Arctic Ocean resulted in efficient transport of plumes containing black carbon (BC) towards the Arctic, mostly in the upper troposphere (6-8 km). A combination of in situ observations (DLR Falcon aircraft), satellite analysis and WRF-Chem simulations is used to understand the vertical and horizontal transport mechanisms of BC, with a focus on the role of wet removal. Between the northwestern Norwegian coast and the Svalbard archipelago, the Falcon aircraft sampled plumes with enhanced CO concentrations up to 200 ppbv and BC mixing ratios up to 25 ng kg-1. During transport to the Arctic region, a large fraction of BC particles are scavenged by two wet deposition processes, namely wet removal by large-scale precipitation and removal in wet convective updrafts, with both processes contributing almost equally to the total accumulated deposition of BC. Our results underline that applying a finer horizontal resolution (40 instead of 100 km) improves the model performance, as it significantly reduces the overestimation of BC levels observed at the coarser resolution in the mid-troposphere. According to the simulations at 40 km, the transport efficiency of BC (TEBC) in biomass burning plumes was larger (60 %) because these plumes encountered little accumulated precipitation along their trajectories (~1 mm). In contrast, TEBC was small (<30 %), and accumulated precipitation amounts were larger (5-10 mm), in plumes influenced by urban anthropogenic sources and flaring activities in northern Russia, resulting in transport to lower altitudes. Scavenging by large-scale precipitation is responsible for a sharp meridional gradient in the distribution of BC concentrations. Wet removal in cumulus clouds causes the modeled vertical gradient of TEBC, especially in the mid-latitudes, reflecting the distribution of convective precipitation, but in the Arctic region removal is dominated by the large-scale wet removal associated with the formation of stratocumulus clouds in the planetary boundary layer (PBL) that produce frequent drizzle.
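
    The abstract ties TEBC to accumulated precipitation along the trajectory (APT) but gives no functional form; as a hedged illustration, the sketch below uses an exponential-decay parameterization common in the BC transport literature, calibrated so that TEBC = 60 % at APT = 1 mm, and checks that the 5-10 mm APT of the anthropogenic plumes then implies TEBC well below 30 %.

        import math

        tau = -1.0 / math.log(0.60)     # decay scale (mm) implied by TE = 0.6 at 1 mm
        for apt_mm in (1.0, 5.0, 10.0):
            te = math.exp(-apt_mm / tau)
            print(f"APT = {apt_mm:4.1f} mm  ->  TE_BC ~ {te:5.1%}")
        # APT =  1.0 mm  ->  TE_BC ~ 60.0%
        # APT =  5.0 mm  ->  TE_BC ~  7.8%
        # APT = 10.0 mm  ->  TE_BC ~  0.6%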

  18. A New Stochastic Approach to Predict Peak and Residual Shear Strength of Natural Rock Discontinuities

    NASA Astrophysics Data System (ADS)

    Casagrande, D.; Buzzi, O.; Giacomini, A.; Lambert, C.; Fenton, G.

    2018-01-01

    Natural discontinuities are known to play a key role in the stability of rock masses. However, estimating the shear strength of large discontinuities is a non-trivial task. Because of the inherent difficulty of accessing the full surface of a large in situ discontinuity, researchers and engineers tend to work on small-scale specimens. As a consequence, the results are often plagued by the well-known scale effect. A new approach is proposed here to predict the shear strength of discontinuities; it has the potential to avoid the scale effect. The rationale of the approach is as follows: a major parameter governing the shear strength of a discontinuity within a rock mass is roughness, which can be accounted for by surveying the discontinuity surface. However, this is typically not possible for discontinuities contained within the rock mass, where only traces are visible. For natural surfaces, it can be assumed that traces are, to some extent, representative of the surface. We propose to use the available 2D information (from a visible trace, referred to as a seed trace) and a random field model to create a large number of synthetic surfaces (3D data sets). The shear strength of each synthetic surface can then be estimated using a semi-analytical model. By using a large number of synthetic surfaces and a Monte Carlo strategy, a meaningful shear strength distribution can be obtained. This paper presents the validation of the semi-analytical mechanistic model required to support the new approach for predicting discontinuity shear strength. The model can predict both peak and residual shear strength. The second part of the paper lays the foundation of a random field model to support the creation of synthetic surfaces having statistical properties in line with those of the seed trace. The paper concludes that it is possible to obtain a reasonable estimate of the peak and residual shear strength of the discontinuities tested from the information in a single trace, without access to the whole surface.
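
    The paper's semi-analytical model is not reproduced in the record, so the Monte Carlo idea is illustrated below with Barton's empirical peak strength criterion standing in for it; the JRC distribution "learned" from the seed trace and all stress parameters are assumed values.

        import numpy as np

        rng = np.random.default_rng(42)

        sigma_n = 1.0      # normal stress (MPa), assumed
        jcs = 50.0         # joint wall compressive strength (MPa), assumed
        phi_r = 30.0       # residual friction angle (degrees), assumed

        # Roughness distribution inferred from the seed trace (assumed parameters),
        # one JRC value per synthetic surface realization.
        jrc = rng.normal(loc=10.0, scale=2.0, size=100_000).clip(0, 20)

        # Barton's criterion: tau = sigma_n * tan(phi_r + JRC * log10(JCS / sigma_n)).
        peak = sigma_n * np.tan(np.radians(phi_r + jrc * np.log10(jcs / sigma_n)))
        residual = sigma_n * np.tan(np.radians(phi_r))   # roughness-independent

        print(f"peak strength: mean {peak.mean():.2f} MPa, "
              f"5-95% range {np.percentile(peak, 5):.2f}-{np.percentile(peak, 95):.2f} MPa")
        print(f"residual strength: {residual:.2f} MPa")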

  19. Land grabbing: a preliminary quantification of economic impacts on rural livelihoods.

    PubMed

    Davis, Kyle F; D'Odorico, Paolo; Rulli, Maria Cristina

    2014-01-01

    Global demands on agricultural land are increasing due to population growth, dietary changes and the use of biofuels. Their effect on food security is to reduce humans' ability to cope with the uncertainties of global climate change. In light of the 2008 food crisis, to secure reliable future access to sufficient agricultural land, many nations and corporations have begun purchasing large tracts of land in the global South, a phenomenon deemed "land grabbing" by popular media. Because land investors frequently export crops without providing adequate employment, this represents an effective income loss for local communities. We study 28 countries targeted by large-scale land acquisitions [comprising 87 % of reported cases and 27 million hectares (ha)] and estimate the effects of such investments on local communities' incomes. We find that this phenomenon can potentially affect the incomes of ~12 million people globally with implications for food security, poverty levels and urbanization. While it is important to note that our study incorporates a number of assumptions and limitations, it provides a much needed initial quantification of the economic impacts of large-scale land acquisitions on rural livelihoods.

  20. Preservation of large-scale chromatin structure in FISH experiments

    PubMed Central

    Hepperger, Claudia; Otten, Simone; von Hase, Johann

    2006-01-01

    The nuclear organization of specific endogenous chromatin regions can be investigated only by fluorescence in situ hybridization (FISH). One of two fixation procedures is typically applied: (1) buffered formaldehyde, or (2) hypotonic shock with methanol-acetic acid fixation followed by dropping of nuclei onto glass slides and air drying. In this study, we compared the effects of these two procedures, and some variations of them, on nuclear morphology and on FISH signals. We analyzed mouse erythroleukemia and mouse embryonic stem cells because their clusters of subcentromeric heterochromatin provide an easy means to assess the preservation of chromatin. Qualitative and quantitative analyses revealed that formaldehyde fixation provided good preservation of large-scale chromatin structures, while classical methanol-acetic acid fixation after hypotonic treatment severely impaired nuclear shape and led to disruption of chromosome territories, heterochromatin structures, and large transgene arrays. Our data show that such preparations do not faithfully reflect in vivo nuclear architecture. Electronic supplementary material is available in the online version of this article at http://dx.doi.org/10.1007/s00412-006-0084-2 and is accessible to authorized users. PMID:17119992

  1. Cross-lingual neighborhood effects in generalized lexical decision and natural reading.

    PubMed

    Dirix, Nicolas; Cop, Uschi; Drieghe, Denis; Duyck, Wouter

    2017-06-01

    The present study assessed intra- and cross-lingual neighborhood effects, using both a generalized lexical decision task and an analysis of a large-scale bilingual eye-tracking corpus (Cop, Dirix, Drieghe, & Duyck, 2016). Using new neighborhood density and frequency measures, the general lexical decision task yielded an inhibitory cross-lingual neighborhood density effect on reading times of second language words, replicating van Heuven, Dijkstra, and Grainger (1998). Reaction times for native language words were not influenced by neighborhood density or frequency but error rates showed cross-lingual neighborhood effects depending on target word frequency. The large-scale eye movement corpus confirmed effects of cross-lingual neighborhood on natural reading, even though participants were reading a novel in a unilingual context. Especially second language reading and to a lesser extent native language reading were influenced by lexical candidates from the nontarget language, although these effects in natural reading were largely facilitatory. These results offer strong and direct support for bilingual word recognition models that assume language-independent lexical access. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
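
    The neighborhood measures underlying the study can be illustrated with Coltheart's N, the count of same-length words differing by exactly one letter, evaluated within and across two lexicons; the toy Dutch and English word lists below are placeholders, not the study's materials.

        def is_neighbor(w1, w2):
            """True if w1 and w2 have the same length and differ in exactly one letter."""
            return (len(w1) == len(w2)
                    and sum(a != b for a, b in zip(w1, w2)) == 1)


        def neighborhood_density(word, lexicon):
            """Coltheart's N of `word` with respect to `lexicon`."""
            return sum(is_neighbor(word, w) for w in lexicon)


        dutch = {"boek", "beek", "boer", "boel", "week"}      # toy L1 lexicon
        english = {"book", "back", "beer", "bock", "cook"}    # toy L2 lexicon

        target = "boek"
        print("intra-lingual N:", neighborhood_density(target, dutch))    # 3: beek, boer, boel
        print("cross-lingual N:", neighborhood_density(target, english))  # 2: book, bock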

  2. The Potato Cryobank at The International Potato Center (Cip): A Model for Long Term Conservation of Clonal Plant Genetic Resources Collections of the Future.

    PubMed

    Vollmer, R; Villagaray, R; Egusquiza, V; Espirilla, J; García, M; Torres, A; Rojas, E; Panta, A; Barkley, N A; Ellis, D

    Cryobanks are a secure, efficient and low-cost method for the long-term conservation of plant genetic resources, in theory for centuries or millennia, with minimal maintenance. The present manuscript describes CIP's modified protocol for potato cryopreservation, its large-scale application, and the establishment of quality and operational standards, which included a viability reassessment of material entering the cryobank. In 2013, CIP established stricter quality and operational standards under which 1,028 potato accessions were cryopreserved with an improved PVS2-droplet protocol. In 2014, the viability of 114 accessions cryopreserved in 2013 was reassessed. The average recovery rate (full plant recovery after LN exposure) of the 1,028 cryopreserved Solanum accessions ranged from 34 to 59%, and 70% of the processed accessions showed a minimum recovery rate of ≥20% and were considered successfully cryopreserved. CIP has established a new high-quality management system for cryobanking. Periodic viability reassessment, strict and clear recovery criteria, and the monitoring of the percentage of accessions meeting the criteria, as well as contamination rates, are metrics that need to be considered in cryobanks.

  3. Big Sugar in southern Africa: rural development and the perverted potential of sugar/ethanol exports.

    PubMed

    Richardson, Ben

    2010-01-01

    This paper asks how investment in large-scale sugar cane production has contributed, and will contribute, to rural development in southern Africa. Taking a case study of the South African company Illovo in Zambia, the argument is made that the potential of sugar/ethanol production to deliver greater tax revenue, domestic competition, access to resources and wealth distribution has in each respect been perverted, with relatively little payoff in wage-labour opportunities in return. If the benefits of agro-exports cannot be so easily assumed, then the prospective 'balance sheet' of biofuels needs to be re-examined. In this light, the paper advocates smaller-scale agrarian initiatives.

  4. High-resolution CMOS MEA platform to study neurons at subcellular, cellular, and network levels†

    PubMed Central

    Müller, Jan; Ballini, Marco; Livi, Paolo; Chen, Yihui; Radivojevic, Milos; Shadmani, Amir; Viswam, Vijay; Jones, Ian L.; Fiscella, Michele; Diggelmann, Roland; Stettler, Alexander; Frey, Urs; Bakkum, Douglas J.; Hierlemann, Andreas

    2017-01-01

    Studies on the information processing and learning properties of neuronal networks would benefit from simultaneous and parallel access to the activity of a large fraction of all neurons in such networks. Here, we present a CMOS-based device capable of simultaneously recording the electrical activity of over a thousand cells in in vitro neuronal networks. The device provides sufficiently high spatiotemporal resolution to enable, at the same time, access to neuronal preparations at the subcellular, cellular, and network levels. The key feature is a rapidly reconfigurable array of 26,400 microelectrodes arranged at low pitch (17.5 μm) within a large overall sensing area (3.85 × 2.10 mm²). An arbitrary subset of the electrodes can be simultaneously connected to 1,024 low-noise readout channels as well as 32 stimulation units. Each electrode or electrode subset can be used to electrically stimulate or record the signals of virtually any neuron on the array. We demonstrate the applicability and potential of this device for a variety of experimental paradigms: large-scale recordings from whole networks of neurons as well as investigations of the axonal properties of individual neurons. PMID:25973786

  5. High-resolution CMOS MEA platform to study neurons at subcellular, cellular, and network levels.

    PubMed

    Müller, Jan; Ballini, Marco; Livi, Paolo; Chen, Yihui; Radivojevic, Milos; Shadmani, Amir; Viswam, Vijay; Jones, Ian L; Fiscella, Michele; Diggelmann, Roland; Stettler, Alexander; Frey, Urs; Bakkum, Douglas J; Hierlemann, Andreas

    2015-07-07

    Studies on the information processing and learning properties of neuronal networks would benefit from simultaneous and parallel access to the activity of a large fraction of all neurons in such networks. Here, we present a CMOS-based device capable of simultaneously recording the electrical activity of over a thousand cells in in vitro neuronal networks. The device provides sufficiently high spatiotemporal resolution to enable, at the same time, access to neuronal preparations at the subcellular, cellular, and network levels. The key feature is a rapidly reconfigurable array of 26,400 microelectrodes arranged at low pitch (17.5 μm) within a large overall sensing area (3.85 × 2.10 mm²). An arbitrary subset of the electrodes can be simultaneously connected to 1,024 low-noise readout channels as well as 32 stimulation units. Each electrode or electrode subset can be used to electrically stimulate or record the signals of virtually any neuron on the array. We demonstrate the applicability and potential of this device for a variety of experimental paradigms: large-scale recordings from whole networks of neurons as well as investigations of the axonal properties of individual neurons.
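
    A schematic sketch of the resource constraint both records describe, 26,400 electrodes multiplexed onto 1,024 readout channels, is given below; the row-major array layout and the routing function are illustrative assumptions, not the device's actual switch-matrix logic.

        N_ELECTRODES = 26_400
        N_CHANNELS = 1_024
        N_STIM_UNITS = 32


        def route(requested_electrodes):
            """Map requested electrode ids onto readout channels, or raise if the
            request exceeds the channel budget (a real router would also respect
            the on-chip switch-matrix wiring constraints, ignored here)."""
            requested = sorted(set(requested_electrodes))
            if len(requested) > N_CHANNELS:
                raise ValueError(f"{len(requested)} electrodes > {N_CHANNELS} channels")
            return {elec: chan for chan, elec in enumerate(requested)}


        # Example: a dense block around one neuron plus a sparse whole-array scan
        # (assuming, for illustration, a 220-column row-major electrode layout).
        dense_block = [r * 220 + c for r in range(10) for c in range(10)]  # 100 sites
        sparse_scan = list(range(0, N_ELECTRODES, 50))                     # 528 sites
        config = route(dense_block + sparse_scan)
        print(len(config), "electrodes connected")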

  6. Threshold-voltage modulated phase change heterojunction for application of high density memory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Baihan; Tong, Hao, E-mail: tonghao@hust.edu.cn; Qian, Hang

    2015-09-28

    Phase change random access memory is one of the most important candidates for the next-generation non-volatile memory technology. However, the ability to reduce its memory size is compromised by fundamental limitations inherent in CMOS technology. While the 0T1R configuration, without any additional access transistor, shows great advantages in improving storage density, its leakage current and small operation window limit its application in large-scale arrays. In this work, a phase change heterojunction based on GeTe and n-Si is fabricated to address those problems. The relationship between threshold voltage and doping concentration is investigated, and energy band diagrams and X-ray photoelectron spectroscopy measurements are provided to explain the results. The threshold voltage is modulated to provide a large operational window based on this relationship. The switching performance of the heterojunction is also tested, showing a good reverse characteristic, which could effectively decrease the leakage current. Furthermore, a reliable read-write-erase function is achieved during the tests. The phase change heterojunction is thus proposed for high-density memory, showing notable advantages such as a modulated threshold voltage, a large operational window, and low leakage current.
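
    As a hedged illustration of why doping concentration is an effective knob for the threshold voltage, the textbook homojunction estimate below shows the logarithmic growth of a junction's built-in potential with doping; it is a qualitative stand-in, not the paper's heterojunction model, and all values are assumed.

        import math

        kT_q = 0.02585          # thermal voltage at 300 K (V)
        n_i = 1.0e10            # Si intrinsic carrier density (cm^-3)
        N_A = 1.0e17            # assumed acceptor-like density on the opposite side (cm^-3)

        # V_bi = (kT/q) * ln(N_A * N_D / n_i^2): built-in potential vs. n-Si doping.
        for N_D in (1e15, 1e16, 1e17, 1e18):
            v_bi = kT_q * math.log(N_A * N_D / n_i**2)
            print(f"N_D = {N_D:.0e} cm^-3  ->  V_bi ~ {v_bi:.2f} V")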

  7. Human rights approach to maternal & child health: has India fared well?

    PubMed

    Ram, F; Singh, Abhishek; Ram, Usha

    2013-04-01

    The objectives of the study were to examine the right to access maternal health, the right to access child health, and the right to access improved water and sanitation in India. We used large-scale data sets, namely the District Level Household Survey conducted in 2007-08 and the National Family Health Surveys conducted during 1992-93, 1998-99, and 2005-06, to fulfil these objectives. The selection of the indicator variables was guided by the Human Rights Framework for Health and the Convention on the Rights of the Child (Articles 7, 24 and 27). We used univariate and bivariate analysis, along with the ratio of access among the non-poor to access among the poor, to fulfil the objectives. The evidence clearly suggested gross violation of human rights starting from the birth of an individual. Even after 60 years of independence, significant proportions of women and children do not have access to basic services such as improved drinking water and sanitation. There were enormous socio-economic and residence-related inequalities in the maternal and child health indicators included in the study. These inequalities were mostly to the disadvantage of the poor. The fulfilment of the basic human rights of women and children is likely to pay dividends in many other domains related to overall population and health in India.

  8. Human rights approach to maternal & child health: Has India fared well?

    PubMed Central

    Ram, F.; Singh, Abhishek; Ram, Usha

    2013-01-01

    Background & objectives: The objectives of the study were to examine the right to access maternal health, the right to access child health, and the right to access improved water and sanitation in India. Methods: We used large-scale data sets, namely the District Level Household Survey conducted in 2007-08 and the National Family Health Surveys conducted during 1992-93, 1998-99, and 2005-06, to fulfil these objectives. The selection of the indicator variables was guided by the Human Rights Framework for Health and the Convention on the Rights of the Child (Articles 7, 24 and 27). We used univariate and bivariate analysis, along with the ratio of access among the non-poor to access among the poor, to fulfil the objectives. Results: The evidence clearly suggested gross violation of human rights starting from the birth of an individual. Even after 60 years of independence, significant proportions of women and children do not have access to basic services such as improved drinking water and sanitation. Interpretation & conclusions: There were enormous socio-economic and residence-related inequalities in the maternal and child health indicators included in the study. These inequalities were mostly to the disadvantage of the poor. The fulfilment of the basic human rights of women and children is likely to pay dividends in many other domains related to overall population and health in India. PMID:23703339

  9. Tracing Galactic Outflows to the Source: Spatially Resolved Feedback in M83 with COS

    NASA Astrophysics Data System (ADS)

    Aloisi, Alessandra

    2016-10-01

    Star-formation (SF) feedback plays a vital role in shaping galaxy properties, but there are many open questions about how this feedback is created, propagated, and felt by galaxies. SF-driven feedback can be observationally constrained with rest-frame UV absorption-line spectroscopy that accesses a range of powerful gas density and kinematic diagnostics. Studies at both high and low redshift show clear evidence for large-scale outflows in star-forming galaxies that scale with galaxy SF rate. However, by sampling one sightline or the galaxy as a whole, these studies are not tailored to reveal how the large-scale outflows develop from their ultimate sources at the scale of individual SF regions. We propose the first spatially-resolved COS G130M/G160M (1130-1800 A) study of the ISM in the nearby (4.6 Mpc) face-on spiral starburst M83 using individual young star clusters as background sources. This is the first down-the-barrel study where blueshifted absorptions can be identified directly with outflowing gas in a spatially resolved fashion. The kpc-scale flows sampled by the COS pointings will be anchored to the properties of the large-scale (10-100 kpc) flows thanks to the wealth of multi-wavelength observations of M83 from X-ray to radio. A comparison of COS data with mock spectra from constrained simulations of spiral galaxies with FIRE (Feedback In Realistic Environments; a code with unprecedented 1-100 pc spatial resolution and self-consistent treatments of stellar feedback) will provide an important validation of these simulations and will supply the community with a powerful and well-tested tool for galaxy formation predictions applicable to all redshifts.

  10. Innovative Visualizations Shed Light on Avian Nocturnal Migration

    PubMed Central

    Farnsworth, Andrew; Aelterman, Bart; Alves, Jose A.; Azijn, Kevin; Bernstein, Garrett; Branco, Sérgio; Desmet, Peter; Dokter, Adriaan M.; Horton, Kyle; Kelling, Steve; Kelly, Jeffrey F.; Leijnse, Hidde; Rong, Jingjing; Sheldon, Daniel; Van den Broeck, Wouter; Van Den Meersche, Jan Klaas; Van Doren, Benjamin Mark; van Gasteren, Hans

    2016-01-01

    Globally, billions of flying animals undergo seasonal migrations, many of which occur at night. The temporal and spatial scales at which migrations occur and our inability to directly observe these nocturnal movements make monitoring and characterizing this critical period in migratory animals’ life cycles difficult. Remote sensing, therefore, has played an important role in our understanding of large-scale nocturnal bird migrations. Weather surveillance radar networks in Europe and North America have great potential for long-term low-cost monitoring of bird migration at scales that have previously been impossible to achieve. Such long-term monitoring, however, poses a number of challenges for the ornithological and ecological communities: how does one take advantage of this vast data resource, integrate information across multiple sensors and large spatial and temporal scales, and visually represent the data for interpretation and dissemination, considering the dynamic nature of migration? We assembled an interdisciplinary team of ecologists, meteorologists, computer scientists, and graphic designers to develop two different flow visualizations, which are interactive and open source, in order to create novel representations of broad-front nocturnal bird migration to address a primary impediment to long-term, large-scale nocturnal migration monitoring. We have applied these visualization techniques to mass bird migration events recorded by two different weather surveillance radar networks covering regions in Europe and North America. These applications show the flexibility and portability of such an approach. The visualizations provide an intuitive representation of the scale and dynamics of these complex systems, are easily accessible for a broad interest group, and are biologically insightful. Additionally, they facilitate fundamental ecological research, conservation, mitigation of human–wildlife conflicts, improvement of meteorological products, and public outreach, education, and engagement. PMID:27557096

  11. Innovative Visualizations Shed Light on Avian Nocturnal Migration.

    PubMed

    Shamoun-Baranes, Judy; Farnsworth, Andrew; Aelterman, Bart; Alves, Jose A; Azijn, Kevin; Bernstein, Garrett; Branco, Sérgio; Desmet, Peter; Dokter, Adriaan M; Horton, Kyle; Kelling, Steve; Kelly, Jeffrey F; Leijnse, Hidde; Rong, Jingjing; Sheldon, Daniel; Van den Broeck, Wouter; Van Den Meersche, Jan Klaas; Van Doren, Benjamin Mark; van Gasteren, Hans

    2016-01-01

    Globally, billions of flying animals undergo seasonal migrations, many of which occur at night. The temporal and spatial scales at which migrations occur and our inability to directly observe these nocturnal movements make monitoring and characterizing this critical period in migratory animals' life cycles difficult. Remote sensing, therefore, has played an important role in our understanding of large-scale nocturnal bird migrations. Weather surveillance radar networks in Europe and North America have great potential for long-term low-cost monitoring of bird migration at scales that have previously been impossible to achieve. Such long-term monitoring, however, poses a number of challenges for the ornithological and ecological communities: how does one take advantage of this vast data resource, integrate information across multiple sensors and large spatial and temporal scales, and visually represent the data for interpretation and dissemination, considering the dynamic nature of migration? We assembled an interdisciplinary team of ecologists, meteorologists, computer scientists, and graphic designers to develop two different flow visualizations, which are interactive and open source, in order to create novel representations of broad-front nocturnal bird migration to address a primary impediment to long-term, large-scale nocturnal migration monitoring. We have applied these visualization techniques to mass bird migration events recorded by two different weather surveillance radar networks covering regions in Europe and North America. These applications show the flexibility and portability of such an approach. The visualizations provide an intuitive representation of the scale and dynamics of these complex systems, are easily accessible for a broad interest group, and are biologically insightful. Additionally, they facilitate fundamental ecological research, conservation, mitigation of human-wildlife conflicts, improvement of meteorological products, and public outreach, education, and engagement.

  12. Maglev Launch: Ultra-low Cost, Ultra-high Volume Access to Space for Cargo and Humans

    NASA Astrophysics Data System (ADS)

    Powell, James; Maise, George; Rather, John

    2010-01-01

    Despite decades of effort to reduce rocket launch costs, improvements have been marginal. Launch cost to LEO for cargo is ~$10,000 per kg of payload, and much greater to higher orbits and beyond. Human access to the ISS costs $20 million for a single passenger. Unless launch costs are greatly reduced, large-scale commercial use and human exploration of the solar system will not occur. A new approach for ultra-low-cost access to space, Maglev launch, magnetically accelerates levitated spacecraft to orbital speeds, 8 km/sec or more, in evacuated tunnels on the surface, using Maglev technology like that operating in Japan for high-speed passenger transport. The cost of the electric energy needed to reach orbital speed is less than $1 per kilogram of payload. Two Maglev launch systems are described: the Gen-1 system, for unmanned cargo craft to orbit, and the Gen-2 system, for large-scale human access to space. Magnetically levitated and propelled Gen-1 cargo craft accelerate in a 100-kilometer-long evacuated tunnel, entering the atmosphere at the tunnel exit, located in high-altitude terrain (~5,000 meters), through an electrically powered "MHD Window" that prevents outside air from flowing into the tunnel. The Gen-1 cargo craft then coasts upwards to space, where a small rocket burn, ~0.5 km/sec, establishes the final orbit. The Gen-1 reference design launches a 40-ton, 2-meter-diameter spacecraft with 35 tons of payload. At 12 launches per day, a single Gen-1 facility could launch 150,000 tons annually. Using present costs for tunneling, superconductors, cryogenic equipment, materials, etc., the projected construction cost for the Gen-1 facility is 20 billion dollars. Amortization, spacecraft, and O&M costs total $43 per kg of payload. For polar orbit launches, sites exist in Alaska, Russia, and China. For equatorial orbit launches, sites exist in the Andes and Africa. With funding, the Gen-1 system could operate by 2020 AD. The Gen-2 system requires more advanced technology. Passenger spacecraft enter the atmosphere at 70,000 feet, where deceleration is acceptable. A levitated evacuated launch tube is used, with the levitation force generated by magnetic interaction between superconducting cables on the levitated launch tube and superconducting cables on the ground beneath. The Gen-2 system could launch hundreds of thousands of passengers per year, and could operate by 2030 AD. Maglev launch will enable large-scale human exploration of space, thousands of gigawatts of space solar power satellites beaming power to Earth, a robust defense against asteroids and comets, and many other applications not possible now.
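
    The sub-$1-per-kilogram energy claim can be checked with a one-line calculation; the sketch below assumes lossless acceleration and $0.10 per kWh electricity.

        v = 8_000.0                              # orbital speed (m/s)
        energy_j = 0.5 * v**2                    # kinetic energy per kg: 32 MJ
        energy_kwh = energy_j / 3.6e6            # ~8.9 kWh per kg
        cost = energy_kwh * 0.10                 # ~$0.89 per kg at $0.10/kWh
        print(f"{energy_j/1e6:.0f} MJ/kg = {energy_kwh:.1f} kWh/kg -> ${cost:.2f}/kg")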

  13. Synthesis of underreported small-scale fisheries catch in Pacific island waters

    NASA Astrophysics Data System (ADS)

    Zeller, D.; Harper, S.; Zylich, K.; Pauly, D.

    2015-03-01

    We synthesize fisheries catch reconstruction studies for 25 Pacific island countries, states and territories, which compare estimates of total domestic catches with officially reported catch data. We exclude data for the large-scale tuna fleets, which have largely foreign beneficial ownership, even when flying Pacific flags. However, we recognize the considerable financial contributions derived from foreign access or charter fees for Pacific host countries. The reconstructions for the 25 entities from 1950 to 2010 suggested that total domestic catches were 2.5 times the data reported to FAO. This discrepancy was largest in early periods (1950: 6.4 times), while for 2010, total catches were 1.7 times the reported data. There was a significant difference in trend between reported and reconstructed catches since 2000, with reconstructed catches declining strongly since their peak in 2000. Total catches increased from 110,000 t yr-1 in 1950 (of which 17,400 t were reported) to a peak of over 250,000 t yr-1 in 2000, before declining to around 200,000 t yr-1 by 2010. This decrease is driven by a declining artisanal (small-scale commercial) catch, which was not compensated for by increasing domestic industrial (large-scale commercial) catches. The artisanal fisheries appear to be declining from a peak of 97,000 t yr-1 in 1992 to less than 50,000 t yr-1 by 2010. However, total catches were dominated by subsistence (small-scale, non-commercial) fisheries, which accounted for 69 % of total catches, with the majority missing from the reported data. Artisanal catches accounted for 22 %, while truly domestic industrial fisheries accounted for only 6 % of total catches. The smallest component is the recreational (small-scale, non-commercial and largely for leisure) sector (2 %), which, although small in catch, is likely of economic importance in some areas due to its direct link to tourism income.

  14. Designing for Peta-Scale in the LSST Database

    NASA Astrophysics Data System (ADS)

    Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.

    2007-10-01

    The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow by 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. Doing this in real time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system, with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements; horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables containing many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.
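
    The horizontal-partitioning idea can be illustrated with a toy spatial sharding scheme (not LSST code): rows are assigned to parallel servers by a coarse sky-tile key so that a cone search touches only a few shards.

        N_SHARDS = 64


        def tile_id(ra_deg, dec_deg):
            """Coarse 1-degree equal-angle sky tile for an (RA, Dec) position."""
            i = int(ra_deg % 360.0)          # RA bin, 0..359
            j = int(dec_deg + 90.0)          # Dec bin, 0..179
            return j * 360 + i


        def shard_for(ra_deg, dec_deg):
            # Tiles hash onto parallel servers/disks; a cone search enumerates
            # only the few tiles (hence shards) it overlaps.
            return tile_id(ra_deg, dec_deg) % N_SHARDS


        for ra, dec in [(10.68, 41.27), (83.82, -5.39), (266.42, -29.01)]:
            print((ra, dec), "-> shard", shard_for(ra, dec))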

  15. Industrial biomanufacturing: The future of chemical production.

    PubMed

    Clomburg, James M; Crumbley, Anna M; Gonzalez, Ramon

    2017-01-06

    The current model for industrial chemical manufacturing employs large-scale megafacilities that benefit from economies of unit scale. However, this strategy faces environmental, geographical, political, and economic challenges associated with energy and manufacturing demands. We review how exploiting biological processes for manufacturing (i.e., industrial biomanufacturing) addresses these concerns while also supporting and benefiting from economies of unit number. Key to this approach is the inherent small scale and capital efficiency of bioprocesses and the ability of engineered biocatalysts to produce designer products at high carbon and energy efficiency with adjustable output, at high selectivity, and under mild process conditions. The biological conversion of single-carbon compounds represents a test bed to establish this paradigm, enabling rapid, mobile, and widespread deployment, access to remote and distributed resources, and adaptation to new and changing markets. Copyright © 2017, American Association for the Advancement of Science.
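
    The contrast between economies of unit scale and economies of unit number can be sketched with two standard engineering heuristics, the six-tenths capital-cost rule for one growing megafacility and Wright's-law learning for many replicated small units; both exponents below are conventional illustrative values, not figures from the article.

        import math


        def megafacility_cost(capacity, exp=0.6):
            """Capital cost of one plant under the six-tenths rule (cost ~ capacity**0.6)."""
            return capacity ** exp


        def replicated_cost(n_units, learning=0.15):
            """Total cost of n identical one-unit plants under Wright's-law learning:
            the k-th unit built costs k**b with b = log2(1 - learning)."""
            b = math.log2(1.0 - learning)
            return sum(k ** b for k in range(1, n_units + 1))


        for capacity in (1, 10, 100):
            print(f"capacity {capacity:3d}: one megafacility "
                  f"{megafacility_cost(capacity):6.1f} vs {capacity} small units "
                  f"{replicated_cost(capacity):6.1f}")

    In this toy comparison the megafacility keeps a pure capital advantage; the article's argument is that learning curves, mobile and rapid deployment, and access to remote, distributed feedstocks can tip the overall balance toward unit number.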

  16. Measuring public opinion on alcohol policy: a factor analytic study of a US probability sample.

    PubMed

    Latimer, William W; Harwood, Eileen M; Newcomb, Michael D; Wagenaar, Alexander C

    2003-03-01

    Public opinion has been one factor affecting change in policies designed to reduce underage alcohol use. Extant research, however, has been criticized for using single survey items of unknown reliability to define adult attitudes on alcohol policy issues. The present investigation addresses a critical gap in the literature by deriving scales on public attitudes, knowledge, and concerns pertinent to alcohol policies designed to reduce underage drinking using a US probability sample survey of 7021 adults. Five attitudinal scales were derived from exploratory and confirmatory factor analyses addressing policies to: (1) regulate alcohol marketing, (2) regulate alcohol consumption in public places, (3) regulate alcohol distribution, (4) increase alcohol taxes, and (5) regulate youth access. The scales exhibited acceptable psychometric properties and were largely consistent with a rational framework which guided the survey construction.
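
    The scale-derivation workflow, exploratory factor analysis over survey items followed by grouping items by their dominant loadings, can be sketched as follows; the random item responses are placeholders, not the survey data.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        n_respondents, n_items = 7021, 25
        X = rng.normal(size=(n_respondents, n_items))   # placeholder item responses

        fa = FactorAnalysis(n_components=5, rotation="varimax")
        fa.fit(X)
        loadings = fa.components_.T                     # (items x factors)

        # Assign each item to its highest-loading factor to form candidate scales.
        assignment = np.abs(loadings).argmax(axis=1)
        for f in range(5):
            items = np.flatnonzero(assignment == f)
            print(f"factor {f}: items {items.tolist()}")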

  17. Framework for rapid assessment and adoption of new vector control tools.

    PubMed

    Vontas, John; Moore, Sarah; Kleinschmidt, Immo; Ranson, Hilary; Lindsay, Steve; Lengeler, Christian; Hamon, Nicholas; McLean, Tom; Hemingway, Janet

    2014-04-01

    Evidence-informed health policy making relies on systematic access to, and appraisal of, the best available research evidence. This review suggests a strategy to improve the speed at which evidence is gathered on new vector control tools (VCTs), using a framework based on measurements of the vectorial capacity of an insect population to transmit disease. We explore links between indicators of VCT efficacy measurable in small-scale experiments that are relevant to entomological and epidemiological parameters measurable only in large-scale proof-of-concept randomised control trials (RCTs). We hypothesise that once RCTs establish links between entomological and epidemiological indicators, rapid evaluation of new products within the same product category may be conducted through smaller-scale experiments, without repeating lengthy and expensive RCTs. Copyright © 2014 Elsevier Ltd. All rights reserved.
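
    Vectorial capacity, on which the proposed framework rests, is classically written as C = m a^2 p^n / (-ln p) (Garrett-Jones), where m is vector density per host, a the human-biting rate, p daily survival and n the extrinsic incubation period in days; the sketch below shows how strongly C responds to a tool that reduces p, with all parameter values illustrative.

        import math


        def vectorial_capacity(m, a, p, n):
            """Garrett-Jones vectorial capacity: C = m * a**2 * p**n / (-ln p)."""
            return m * a**2 * p**n / (-math.log(p))


        baseline = vectorial_capacity(m=10.0, a=0.3, p=0.9, n=10.0)
        with_vct = vectorial_capacity(m=10.0, a=0.3, p=0.8, n=10.0)  # tool cuts survival
        print(f"baseline C = {baseline:.2f}, with tool C = {with_vct:.2f} "
              f"({1 - with_vct / baseline:.0%} reduction)")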

  18. Two non linear dynamics plasma astrophysics experiments at LANL

    NASA Astrophysics Data System (ADS)

    Intrator, T. P.; Weber, T. E.; Feng, Y.; Sears, J. A.; Swan, H.; Hutchinson, T.; Boguski, J.; Gao, K.; Chapdelaine, L.; Dunn, J.

    2013-10-01

    Two laboratory experiments at Los Alamos National Laboratory (LANL) have been built to gain access to a wide range of fundamental plasma physics issues germane to astrophysical, space, and fusion plasmas. The overarching theme is magnetized plasma dynamics, including currents, MHD forces and instabilities, sheared flows and shocks, and the creation and annihilation of magnetic fields. The Reconnection Scaling Experiment (RSX) creates current sheets and flux ropes that exhibit fully 3D dynamics and can kink, bounce, merge and reconnect, shred, and re-form in complicated ways. The most recent movies from a large, detailed data set describe the 3D magnetic structure and helicity budget of a driven, dissipative system that spontaneously self-saturates a kink instability. The Magnetized Shock Experiment (MSX) uses a field-reversed configuration (FRC) that is ejected at high speed and then stagnated against a stopping mirror field, which drives a collisionless magnetized shock. A plasmoid accelerator will also access supercritical shocks at much larger Alfvén Mach numbers. Unique features include access to parallel, oblique and perpendicular shocks in regions much larger than the ion gyroradius and inertial length, large magnetic and fluid Reynolds numbers, and ample volume for turbulence. Center for Magnetic Self Organization, NASA Geospace NNHIOA044I-Basic, Department of Energy DE-AC52-06NA25369.

  19. 3D plasmonic nanoantennas integrated with MEA biosensors

    NASA Astrophysics Data System (ADS)

    Dipalo, Michele; Messina, Gabriele C.; Amin, Hayder; La Rocca, Rosanna; Shalabaeva, Victoria; Simi, Alessandro; Maccione, Alessandro; Zilio, Pierfrancesco; Berdondini, Luca; de Angelis, Francesco

    2015-02-01

    Neuronal signaling in brain circuits occurs at multiple scales, ranging from molecules and cells to large neuronal assemblies. However, current sensing neurotechnologies are not designed for parallel access to signals at multiple scales. With the aim of combining nanoscale molecular sensing with electrical recordings of neural activity within large neuronal assemblies, in this work three-dimensional (3D) plasmonic nanoantennas are integrated with multielectrode arrays (MEA). The nanoantennas are fabricated by fast ion beam milling on optical resist; gold is deposited on the nanoantennas in order to connect them electrically to the MEA microelectrodes and to obtain plasmonic behavior. The optical properties of these 3D nanostructures are studied through finite element method (FEM) simulations, which show a high electromagnetic field enhancement. This plasmonic enhancement is confirmed by surface-enhanced Raman spectroscopy of a dye performed in liquid, which shows an enhancement of almost 100 times the incident field amplitude at resonant excitation. Finally, the reported MEA devices are tested on cultured rat hippocampal neurons. Neurons develop by extending branches on the nanostructured electrodes, and extracellular action potentials are recorded over multiple days in vitro. Raman spectra of living neurons cultured on the nanoantennas are also acquired. These results highlight that these nanostructures are potential candidates for combining electrophysiological measurements of large networks with simultaneous spectroscopic investigations at the molecular level. Electronic supplementary information (ESI) available. See DOI: 10.1039/c4nr05578k

  20. Ensembl Genomes 2013: scaling up access to genome-wide data.

    PubMed

    Kersey, Paul Julian; Allen, James E; Christensen, Mikkel; Davis, Paul; Falin, Lee J; Grabmueller, Christoph; Hughes, Daniel Seth Toney; Humphrey, Jay; Kerhornou, Arnaud; Khobova, Julia; Langridge, Nicholas; McDowall, Mark D; Maheswari, Uma; Maslen, Gareth; Nuhn, Michael; Ong, Chuang Kee; Paulini, Michael; Pedro, Helder; Toneva, Iliana; Tuli, Mary Ann; Walts, Brandon; Williams, Gareth; Wilson, Derek; Youens-Clark, Ken; Monaco, Marcela K; Stein, Joshua; Wei, Xuehong; Ware, Doreen; Bolser, Daniel M; Howe, Kevin Lee; Kulesha, Eugene; Lawson, Daniel; Staines, Daniel Michael

    2014-01-01

    Ensembl Genomes (http://www.ensemblgenomes.org) is an integrating resource for genome-scale data from non-vertebrate species. The project exploits and extends technologies for genome annotation, analysis and dissemination, developed in the context of the vertebrate-focused Ensembl project, and provides a complementary set of resources for non-vertebrate species through a consistent set of programmatic and interactive interfaces. These provide access to data including reference sequence, gene models, transcriptional data, polymorphisms and comparative analysis. This article provides an update to the previous publications about the resource, with a focus on recent developments. These include the addition of important new genomes (and related data sets) including crop plants, vectors of human disease and eukaryotic pathogens. In addition, the resource has scaled up its representation of bacterial genomes, and now includes the genomes of over 9000 bacteria. Specific extensions to the web and programmatic interfaces have been developed to support users in navigating these large data sets. Looking forward, analytic tools to allow targeted selection of data for visualization and download are likely to become increasingly important in the future as the number of available genomes increases within all domains of life, and some of the challenges faced in representing bacterial data are likely to become commonplace for eukaryotes as well.
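
    Programmatic access of the kind described is illustrated below against the public Ensembl REST service, which also serves Ensembl Genomes data; the endpoint name follows the public documentation at the time of writing and may evolve.

        import requests

        SERVER = "https://rest.ensembl.org"

        # List the divisions (EnsemblBacteria, EnsemblPlants, ...) covered by
        # Ensembl Genomes, via the documented /info/divisions endpoint.
        r = requests.get(f"{SERVER}/info/divisions",
                         headers={"Accept": "application/json"}, timeout=30)
        r.raise_for_status()
        print(r.json())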

  1. Screensaver: an open source lab information management system (LIMS) for high throughput screening facilities

    PubMed Central

    2010-01-01

    Background Shared-usage high throughput screening (HTS) facilities are becoming more common in academe as large-scale small molecule and genome-scale RNAi screening strategies are adopted for basic research purposes. These shared facilities require a unique informatics infrastructure that must not only provide access to and analysis of screening data, but must also manage the administrative and technical challenges associated with conducting numerous, interleaved screening efforts run by multiple independent research groups. Results We have developed Screensaver, a free, open source, web-based lab information management system (LIMS), to address the informatics needs of our small molecule and RNAi screening facility. Screensaver supports the storage and comparison of screening data sets, as well as the management of information about screens, screeners, libraries, and laboratory work requests. To our knowledge, Screensaver is one of the first applications to support the storage and analysis of data from both genome-scale RNAi screening projects and small molecule screening projects. Conclusions The informatics and administrative needs of an HTS facility may be best managed by a single, integrated, web-accessible application such as Screensaver. Screensaver has proven useful in meeting the requirements of the ICCB-Longwood/NSRB Screening Facility at Harvard Medical School, and has provided similar benefits to other HTS facilities. PMID:20482787

  2. Screensaver: an open source lab information management system (LIMS) for high throughput screening facilities.

    PubMed

    Tolopko, Andrew N; Sullivan, John P; Erickson, Sean D; Wrobel, David; Chiang, Su L; Rudnicki, Katrina; Rudnicki, Stewart; Nale, Jennifer; Selfors, Laura M; Greenhouse, Dara; Muhlich, Jeremy L; Shamu, Caroline E

    2010-05-18

    Shared-usage high throughput screening (HTS) facilities are becoming more common in academe as large-scale small molecule and genome-scale RNAi screening strategies are adopted for basic research purposes. These shared facilities require a unique informatics infrastructure that must not only provide access to and analysis of screening data, but must also manage the administrative and technical challenges associated with conducting numerous, interleaved screening efforts run by multiple independent research groups. We have developed Screensaver, a free, open source, web-based lab information management system (LIMS), to address the informatics needs of our small molecule and RNAi screening facility. Screensaver supports the storage and comparison of screening data sets, as well as the management of information about screens, screeners, libraries, and laboratory work requests. To our knowledge, Screensaver is one of the first applications to support the storage and analysis of data from both genome-scale RNAi screening projects and small molecule screening projects. The informatics and administrative needs of an HTS facility may be best managed by a single, integrated, web-accessible application such as Screensaver. Screensaver has proven useful in meeting the requirements of the ICCB-Longwood/NSRB Screening Facility at Harvard Medical School, and has provided similar benefits to other HTS facilities.

  3. Ongoing Use of Data and Specimens from NCI Sponsored Cancer Prevention Clinical Trials in the Community Clinical Oncology Program

    PubMed Central

    Minasian, Lori; Tangen, Catherine M.; Wickerham, D. Lawrence

    2015-01-01

    Large cancer prevention trials provide opportunities to collect a wide array of data and biospecimens at study entry and longitudinally, for a healthy, aging population without cancer. This provides an opportunity to use pre-diagnostic data and specimens to evaluate hypotheses about the initial development of cancer. This paper reports on strides made by, and future possibilities for, the use of accessible biorepositories developed from precisely annotated samples obtained through large-scale National Cancer Institute (NCI)-sponsored cancer prevention clinical trials conducted by the NCI Cooperative Groups. These large cancer prevention studies, which have enrolled over 80,000 volunteers, continue to contribute to our understanding of cancer development more than 10 years after they were closed. PMID:26433556

  4. The Large Ring Laser G for Continuous Earth Rotation Monitoring

    NASA Astrophysics Data System (ADS)

    Schreiber, K. U.; Klügel, T.; Velikoseltsev, A.; Schlüter, W.; Stedman, G. E.; Wells, J.-P. R.

    2009-09-01

    Ring laser gyroscopes exploit the Sagnac effect and measure rotations absolutely. They do not require an external reference frame and therefore provide an independent method of monitoring Earth rotation. Large-scale versions of these gyroscopes promise to eventually match the high resolution that the established VLBI- and GNSS-based methods achieve in measuring variations in the Earth rotation rate. This would open the door to continuous monitoring of LOD (Length of Day) and polar motion, which is not yet available today. Another advantage is access to the sub-daily frequency regime of Earth rotation. The ring laser “G” (Grossring), located at the Geodetic Observatory Wettzell (Germany), is the most advanced realization of such a large gyroscope. This paper outlines the current sensor design and properties.
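
    The measurement principle behind such instruments is the standard Sagnac relation for an active ring laser (textbook physics, quoted here for orientation rather than taken from the record itself): a ring of area A, perimeter P and operating wavelength λ converts a rotation rate Ω into a beat frequency between its counter-propagating beams,

    ```latex
    \Delta f = \frac{4A}{\lambda P}\,\hat{n}\cdot\vec{\Omega}
    ```

    where n̂ is the normal of the ring plane. The A/P scale factor is why large-scale rings such as G can approach the resolution of the established space-geodetic methods.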

  5. Where and why hyporheic exchange is important: Inferences from a parsimonious, physically-based river network model

    NASA Astrophysics Data System (ADS)

    Gomez-Velez, J. D.; Harvey, J. W.

    2014-12-01

    Hyporheic exchange has been hypothesized to have basin-scale consequences; however, predictions throughout river networks are limited by the available geomorphic and hydrogeologic data, as well as by models that can analyze and aggregate hyporheic exchange flows across large spatial scales. We developed a parsimonious but physically-based model of hyporheic flow for application in large river basins: Networks with EXchange and Subsurface Storage (NEXSS). At the core of NEXSS is a characterization of the channel geometry, geomorphic features, and related hydraulic drivers, based on scaling equations from the literature and readily accessible information such as river discharge, bankfull width, median grain size, sinuosity, channel slope, and regional groundwater gradients. Multi-scale hyporheic flow is computed by combining simple but powerful analytical and numerical expressions that have been previously published. We applied NEXSS across a broad range of geomorphic diversity in river reaches and synthetic river networks. NEXSS demonstrates that vertical exchange beneath submerged bedforms dominates hyporheic fluxes and turnover rates along the river corridor. Moreover, the hyporheic zone's potential for biogeochemical transformations is comparable across stream orders, but the abundance of lower-order channels results in a considerably higher cumulative effect for low-order streams. Thus, vertical exchange beneath submerged bedforms has more potential for biogeochemical transformations than lateral exchange beneath banks, although lateral exchange through meanders may be important in large rivers. These results have implications for predicting outcomes of river and basin management practices.

  6. Stimulus-dependent spiking relationships with the EEG

    PubMed Central

    Snyder, Adam C.

    2015-01-01

    The development and refinement of noninvasive techniques for imaging neural activity is of paramount importance for human neuroscience. Currently, the most accessible and popular technique is electroencephalography (EEG). However, nearly all of what we know about the neural events that underlie EEG signals is based on inference, because of the dearth of studies that have simultaneously paired EEG recordings with direct recordings of single neurons. From the perspective of electrophysiologists there is growing interest in understanding how spiking activity coordinates with large-scale cortical networks. Evidence from recordings at both scales highlights that sensory neurons operate in very distinct states during spontaneous and visually evoked activity, which appear to form extremes in a continuum of coordination in neural networks. We hypothesized that individual neurons have idiosyncratic relationships to large-scale network activity indexed by EEG signals, owing to the neurons' distinct computational roles within the local circuitry. We tested this by recording neuronal populations in visual area V4 of rhesus macaques while we simultaneously recorded EEG. We found substantial heterogeneity in the timing and strength of spike-EEG relationships and that these relationships became more diverse during visual stimulation compared with the spontaneous state. The visual stimulus apparently shifts V4 neurons from a state in which they are relatively uniformly embedded in large-scale network activity to a state in which their distinct roles within the local population are more prominent, suggesting that the specific way in which individual neurons relate to EEG signals may hold clues regarding their computational roles. PMID:26108954

  7. Large-scale chromatin remodeling at the immunoglobulin heavy chain locus: a paradigm for multigene regulation.

    PubMed

    Bolland, Daniel J; Wood, Andrew L; Corcoran, Anne E

    2009-01-01

    V(D)J recombination in lymphocytes is the cutting and pasting together of antigen receptor genes in cis to generate the enormous variety of coding sequences required to produce diverse antigen receptor proteins. It underpins the adaptive immune response, which must potentially combat millions of different foreign antigens. Most antigen receptor loci have evolved to be extremely large and contain multiple individual V, D and J genes. The immunoglobulin heavy chain (Igh) and immunoglobulin kappa light chain (Igk) loci are the largest multigene loci in the mammalian genome, and V(D)J recombination is one of the most complicated genetic processes in the nucleus. The challenge for the appropriate lymphocyte is one of macro-management: to make all of the antigen receptor genes in a particular locus available for recombination at the appropriate developmental time-point. Conversely, these large loci must be kept closed in lymphocytes in which they do not normally recombine, to guard against genomic instability generated by the DNA double-strand breaks inherent to the V(D)J recombination process. To manage all of these demanding criteria, V(D)J recombination is regulated at numerous levels. It is restricted to lymphocytes since the Rag genes, which control the DNA double-strand break step of recombination, are only expressed in these cells. Within the lymphocyte lineage, immunoglobulin recombination is restricted to B-lymphocytes and TCR recombination to T-lymphocytes by regulation of locus accessibility, which occurs at multiple levels. Accessibility of recombination signal sequences (RSSs) flanking individual V, D and J genes at the nucleosomal level is the key micro-management mechanism, which is discussed in greater detail in other chapters. This chapter will explore how the antigen receptor loci are regulated as a whole, focussing on the Igh locus as a paradigm for the mechanisms involved. Numerous recent studies have begun to unravel the complex and complementary processes involved in this large-scale locus organisation. We will examine the structure of the Igh locus and the large-scale and higher-order chromatin remodelling processes associated with V(D)J recombination, at the level of the locus itself, its conformational changes and its dynamic localisation within the nucleus.

  8. Toward server-side, high performance climate change data analytics in the Earth System Grid Federation (ESGF) eco-system

    NASA Astrophysics Data System (ADS)

    Fiore, Sandro; Williams, Dean; Aloisio, Giovanni

    2016-04-01

    In many scientific domains such as climate science, data are often n-dimensional and require tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and eco-systems where petabytes (PB) of data can be available and data can be distributed and/or replicated (e.g., the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5)). Most of the tools currently available for scientific data analysis in the climate domain fail at large scale since they: (1) are desktop based and need the data locally; (2) are sequential, so do not benefit from available multicore/parallel machines; (3) do not provide declarative languages to express scientific data analysis tasks; (4) are domain-specific, which ties their adoption to a specific domain; and (5) do not provide workflow support, to enable the definition of complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes ("datacubes"). The project relies on a strong background in high performance database management and OLAP systems to manage large scientific data sets. It also provides native workflow management support, to define processing chains and workflows with tens to hundreds of data analytics operators to build real scientific use cases. With regard to interoperability, the talk will present the contributions provided to both the RDA Working Group on Array Databases and the Earth System Grid Federation (ESGF) Compute Working Team. Also highlighted will be the results of large-scale climate model intercomparison data analysis experiments, for example: (1) defined in the context of the EU H2020 INDIGO-DataCloud project; (2) implemented in a real geographically distributed environment involving CMCC (Italy) and LLNL (US) sites; (3) exploiting Ophidia as a server-side, parallel analytics engine; and (4) applied to real CMIP5 data sets available through ESGF.

  9. Body identification, biometrics and medicine: ethical and social considerations.

    PubMed

    Mordini, Emilio; Ottolini, Corinna

    2007-01-01

    Identity is important when it is weak. This apparent paradox is the core of the current debate on identity. Traditionally, verification of identity has been based upon authentication of attributed and biographical characteristics. After small-scale societies and large-scale, industrial societies, globalization represents the third period of personal identification. The human body lies at the heart of all strategies for identity management. The tension between the human body and personal identity is critical in the health care sector, which is second only to the financial sector in terms of the number of biometric users. Many hospitals and healthcare organizations are in the process of deploying biometric security architectures. Secure identification is critical in the health care system: to control logical access to centralized archives of digitized patients' data, to limit physical access to buildings and hospital wards, and to authenticate medical and social support personnel. There is also an increasing need to identify patients with a high degree of certainty. Finally, there is the risk that biometric authentication devices may reveal sensitive health information. All these issues require careful ethical and political scrutiny.

  10. Geospatial Data as a Service: Towards planetary scale real-time analytics

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Larraondo, P. R.; Antony, J.; Richards, C. J.

    2017-12-01

    The rapid growth of earth systems, environmental and geophysical datasets poses a challenge to both end-users and infrastructure providers. For infrastructure and data providers, tasks like managing, indexing and storing large collections of geospatial data need to take into consideration the various use cases by which consumers will want to access and use the data. Considerable investment has been made by the Earth Science community to produce suitable real-time analytics platforms for geospatial data, and different interfaces have been defined to provide data services. Unfortunately, there is considerable divergence among the standards, protocols and data models, which have been designed to target specific communities or working groups. The Australian National University's National Computational Infrastructure (NCI) is used for a wide range of activities in the geospatial community. Earth observations, climate and weather forecasting are examples of these communities which generate large amounts of geospatial data. The NCI has been carrying out significant effort to develop a data and services model that enables the cross-disciplinary use of data. Recent developments in cloud and distributed computing provide a publicly accessible platform on which new infrastructures can be built. One of the key capabilities these technologies offer is the possibility of having "limitless" compute power next to where the data is stored. This model is rapidly transforming data delivery from centralised monolithic services towards ubiquitous distributed services that scale up and down, adapting to fluctuations in demand. NCI has developed GSKY, a scalable, distributed server which presents a new approach for geospatial data discovery and delivery based on OGC standards. We will present the architecture and motivating use-cases that drove GSKY's collaborative design, development and production deployment. We show our approach offers the community valuable exploratory analysis capabilities for dealing with petabyte-scale geospatial data collections.
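
    Because GSKY serves data through OGC-standard interfaces, any generic OGC client can talk to such a service. Below is a minimal sketch of a WMS GetMap request using the owslib library; the endpoint URL, layer name and time value are placeholders for illustration, not GSKY's actual identifiers.

    ```python
    from owslib.wms import WebMapService

    # Hypothetical OGC WMS endpoint; substitute the real service URL.
    wms = WebMapService("https://example.org/ows", version="1.3.0")

    img = wms.getmap(
        layers=["landsat8_nbar"],           # placeholder layer name
        srs="EPSG:4326",
        bbox=(110.0, -45.0, 155.0, -10.0),  # lon/lat box, roughly Australia
        size=(1024, 768),
        format="image/png",
        time="2017-01-01",                  # WMS time dimension, if the layer has one
    )
    with open("tile.png", "wb") as f:
        f.write(img.read())                 # save the rendered map tile
    ```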

  11. Host Immunity via Mutable Virtualized Large-Scale Network Containers

    DTIC Science & Technology

    2016-07-25

    and constrain the distributed persistent inside crawlers that have valid credentials to access the web services. The main idea is to add a marker...to each web page URL and use the URL path and user information contained in the marker to help accurately detect crawlers at the earliest stage...more than half of all website traffic, and malicious bots contribute almost one third of the traffic. As one type of bot, web crawlers have been

  12. Challenges to Computational Aerothermodynamic Simulation and Validation for Planetary Entry Vehicle Analysis

    DTIC Science & Technology

    2010-04-01

    than 0.6 metric tons. They have landed at low elevation sites (below 1 km Mars Orbiter Laser Altimeter (MOLA)). All accepted a relatively large...Martian atmosphere, and small scale height of obstacles on the ground limit accessible landing sites to those below -1.0 km MOLA. So far the southern...landing to date is MER-Opportunity at Meridiani Planum (-1 km MOLA). Mars Science Lab (MSL) is attempting to develop an EDL system capable of delivering

  13. OpenSim: A Flexible Distributed Neural Network Simulator with Automatic Interactive Graphics.

    PubMed

    Jarosch, Andreas; Leber, Jean Francois

    1997-06-01

    An object-oriented simulator called OpenSim is presented that achieves a high degree of flexibility by relying on a small set of building blocks. The state variables and algorithms put in this framework can easily be accessed through a command shell. This allows one to distribute a large-scale simulation over several workstations and to generate the interactive graphics automatically. OpenSim opens new possibilities for cooperation among Neural Network researchers. Copyright 1997 Elsevier Science Ltd.

  14. Space Station services and design features for users

    NASA Technical Reports Server (NTRS)

    Kurzhals, Peter R.; Mckinney, Royce L.

    1987-01-01

    The operational design features and services planned for the NASA Space Station will furnish, in addition to novel opportunities and facilities, lower costs through interface standardization and automation and faster access by means of computer-aided integration and control processes. By furnishing a basis for large-scale space exploitation, the Space Station will possess industrial production and operational services capabilities that may be used by the private sector for commercial ventures; it could also ultimately support lunar and planetary exploration spacecraft assembly and launch facilities.

  15. Economics in the age of big data.

    PubMed

    Einav, Liran; Levin, Jonathan

    2014-11-07

    The quality and quantity of data on economic activity are expanding rapidly. Empirical research increasingly relies on newly available large-scale administrative data or private sector data that often is obtained through collaboration with private firms. Here we highlight some challenges in accessing and using these new data. We also discuss how new data sets may change the statistical methods used by economists and the types of questions posed in empirical research. Copyright © 2014, American Association for the Advancement of Science.

  16. Fabrication of the HIAD Large-Scale Demonstration Assembly and Upcoming Mission Applications

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; Dinonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12m Mars Human-Scale Pathfinder HIAD conceptual design that was constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61m cross-section) and six subscale tori (0.25m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner. In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15m HIAD applications will also be discussed.

  17. Fabrication of the HIAD Large-Scale Demonstration Assembly

    NASA Technical Reports Server (NTRS)

    Swanson, G. T.; Johnson, R. K.; Hughes, S. J.; DiNonno, J. M.; Cheatwood, F. M.

    2017-01-01

    Over a decade of work has been conducted in the development of NASA's Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This effort has included multiple ground test campaigns and flight tests culminating in the HIAD project's second-generation (Gen-2) deployable aeroshell system and associated analytical tools. NASA's HIAD project team has developed, fabricated, and tested inflatable structures (IS) integrated with a flexible thermal protection system (F-TPS), ranging in diameter from 3-6m, with cone angles of 60 and 70 deg. In 2015, United Launch Alliance (ULA) announced that they will use a HIAD (10-12m) as part of their Sensible, Modular, Autonomous Return Technology (SMART) for their upcoming Vulcan rocket. ULA expects SMART reusability, coupled with other advancements for Vulcan, will substantially reduce the cost of access to space. The first booster engine recovery via HIAD is scheduled for 2024. To meet this near-term need, as well as future NASA applications, the HIAD team is investigating taking the technology to the 10-15m diameter scale. In the last year, many significant development and fabrication efforts have been accomplished, culminating in the construction of a large-scale inflatable structure demonstration assembly. This assembly incorporated the first three tori for a 12m Mars Human-Scale Pathfinder HIAD conceptual design that was constructed with the current state-of-the-art material set. Numerous design trades and torus fabrication demonstrations preceded this effort. In 2016, three large-scale tori (0.61m cross-section) and six subscale tori (0.25m cross-section) were manufactured to demonstrate fabrication techniques using the newest candidate material sets. These tori were tested to evaluate durability and load capacity. This work led to the selection of the inflatable structure's third-generation (Gen-3) structural liner. In late 2016, the three tori required for the large-scale demonstration assembly were fabricated, and then integrated in early 2017. The design includes provisions to add the remaining four tori necessary to complete the assembly of the 12m Human-Scale Pathfinder HIAD in the event future project funding becomes available. This presentation will discuss the HIAD large-scale demonstration assembly design and fabrication performed in the last year, including the precursor tori development and the partial-stack fabrication. Potential near-term and future 10-15m HIAD applications will also be discussed.

  18. Classifying neighbourhoods by level of access to stores selling fresh fruit and vegetables and groceries: identifying problematic areas in the city of Gatineau, Quebec.

    PubMed

    Gould, Adrian C; Apparicio, Philippe; Cloutier, Marie-Soleil

    2012-11-06

    Physical access to stores selling groceries, fresh fruit and vegetables (FV) is essential for urban dwellers. In Canadian cities where low-density development practices are common, social and material deprivation may be compounded by poor geographic access to healthy food. This case study examines access to food stores selling fresh FV in Gatineau, Quebec, to identify areas where poor access is coincident with high deprivation. Food retailers were identified using two secondary sources and each store was visited to establish the total surface area devoted to the sale of fresh FV. Four population-weighted accessibility measures were then calculated for each dissemination area (DA) using road network distances. A deprivation index was created using variables from the 2006 Statistics Canada census, also at the scale of the DA. Finally, six classes of accessibility to a healthy diet were constructed using a k-means classification procedure. These were mapped and superimposed over high deprivation areas. Overall, deprivation is positively correlated with better accessibility. However, more than 18,000 residents (7.5% of the population) live in high deprivation areas characterized by large distances to the nearest retail food store (means of 1.4 km or greater) and virtually no access to fresh FV within walking distance (radius of 1 km). In this research, we identified areas where poor geographic access may introduce an additional constraint for residents already dealing with the challenges of limited financial and social resources. Our results may help guide local food security policies and initiatives.
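
    The analytical chain described here (per-DA accessibility measures, then k-means into six classes) is easy to mock up. Everything below is an illustrative assumption: the four measures only approximate the kind used in the study, distances are random stand-ins for road-network distances, and all names and thresholds are invented.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    n_da, n_stores = 500, 40
    dist = rng.uniform(100, 8000, size=(n_da, n_stores))   # metres, DA x store
    fv_area = rng.uniform(5, 400, size=n_stores)           # m^2 of fresh FV shelf space

    nearest = dist.min(axis=1)                              # distance to closest store
    within_1km = (dist <= 1000).sum(axis=1)                 # store count in walking range
    fv_within_1km = (fv_area * (dist <= 1000)).sum(axis=1)  # FV surface within 1 km
    mean_d3 = np.sort(dist, axis=1)[:, :3].mean(axis=1)     # mean distance to 3 closest

    X = np.column_stack([nearest, within_1km, fv_within_1km, mean_d3])
    X = (X - X.mean(axis=0)) / X.std(axis=0)                # standardize before k-means

    classes = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)
    print(np.bincount(classes))                             # DAs per accessibility class
    ```

    Superimposing the resulting class labels on a deprivation index, as the study does, is then a simple join on the DA identifier.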

  19. Imitating intrinsic alignments: a bias to the CMB lensing-galaxy shape cross-correlation power spectrum induced by the large-scale structure bispectrum

    NASA Astrophysics Data System (ADS)

    Merkel, Philipp M.; Schäfer, Björn Malte

    2017-10-01

    Cross-correlating the lensing signals of galaxies and cosmic microwave background (CMB) fluctuations is expected to provide valuable cosmological information. In particular, it may help tighten constraints on parameters describing the properties of intrinsically aligned galaxies at high redshift. To access the information conveyed by the cross-correlation signal, an accurate theoretical description of it is required. We compute the bias to CMB lensing-galaxy shape cross-correlation measurements induced by non-linear structure growth. Using tree-level perturbation theory for the large-scale structure bispectrum, we find that the bias is negative on most angular scales, therefore mimicking the signal of intrinsic alignments. Combining Euclid-like galaxy lensing data with a CMB experiment comparable to the Planck satellite mission, the bias becomes significant only on the smallest scales (ℓ ≳ 2500). For improved CMB observations, however, the corrections amount to 10-15 per cent of the CMB lensing-intrinsic alignment signal over a wide multipole range (10 ≲ ℓ ≲ 2000). Accordingly, the power spectrum bias, if uncorrected, translates into 2σ and 3σ errors in the determination of the intrinsic alignment amplitude in the case of CMB stage III and stage IV experiments, respectively.
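
    For reference, the tree-level matter bispectrum such calculations start from is the standard second-order Eulerian perturbation theory result (textbook expressions, not formulas quoted from the record itself):

    ```latex
    B(\mathbf{k}_1,\mathbf{k}_2,\mathbf{k}_3)
      = 2\,F_2(\mathbf{k}_1,\mathbf{k}_2)\,P(k_1)\,P(k_2) + \text{(2 cyclic permutations)},
    \qquad
    F_2(\mathbf{k}_1,\mathbf{k}_2)
      = \frac{5}{7}
      + \frac{1}{2}\,\frac{\mathbf{k}_1\cdot\mathbf{k}_2}{k_1 k_2}
        \left(\frac{k_1}{k_2}+\frac{k_2}{k_1}\right)
      + \frac{2}{7}\left(\frac{\mathbf{k}_1\cdot\mathbf{k}_2}{k_1 k_2}\right)^2
    ```

    with P(k) the linear matter power spectrum.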

  20. pGlyco 2.0 enables precision N-glycoproteomics with comprehensive quality control and one-step mass spectrometry for intact glycopeptide identification.

    PubMed

    Liu, Ming-Qi; Zeng, Wen-Feng; Fang, Pan; Cao, Wei-Qian; Liu, Chao; Yan, Guo-Quan; Zhang, Yang; Peng, Chao; Wu, Jian-Qiang; Zhang, Xiao-Jin; Tu, Hui-Jun; Chi, Hao; Sun, Rui-Xiang; Cao, Yong; Dong, Meng-Qiu; Jiang, Bi-Yun; Huang, Jiang-Ming; Shen, Hua-Li; Wong, Catherine C L; He, Si-Min; Yang, Peng-Yuan

    2017-09-05

    The precise and large-scale identification of intact glycopeptides is a critical step in glycoproteomics. Owing to the complexity of glycosylation, the current overall throughput, data quality and accessibility of intact glycopeptide identification lag behind those of routine proteomic analyses. Here, we propose a workflow for the precise high-throughput identification of intact N-glycopeptides at the proteome scale using stepped-energy fragmentation and a dedicated search engine. pGlyco 2.0 conducts comprehensive quality control, including false discovery rate evaluation at all three levels of matches (glycans, peptides and glycopeptides), improving the current level of accuracy of intact glycopeptide identification. The N-glycoproteomes of samples metabolically labeled with 15N/13C were analyzed quantitatively and utilized to validate the glycopeptide identification, which could be used as a novel benchmark pipeline to compare different search engines. Finally, we report a large-scale glycoproteome dataset consisting of 10,009 distinct site-specific N-glycans on 1988 glycosylation sites from 955 glycoproteins in five mouse tissues. Protein glycosylation is a heterogeneous post-translational modification that generates great proteomic diversity and is difficult to analyze. Here the authors describe pGlyco 2.0, a workflow for the precise one-step identification of intact N-glycopeptides at the proteome scale.

  1. FLARE (Facility for Laboratory Reconnection Experiments): A Major Next-Step for Laboratory Studies of Magnetic Reconnection

    NASA Astrophysics Data System (ADS)

    Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Belova, E.; Ellis, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Que, W.; Ren, Y.; Titus, P.; Yamada, M.; Yoo, J.

    2014-12-01

    A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE, is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to space, solar and astrophysical plasmas. The currently existing small-scale experiments have been focusing on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. These new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) at Princeton (http://mrx.pppl.gov) and is to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to space and solar plasmas. The motivating major physics questions, the construction status, and the planned collaborative research especially with space and solar research communities will be discussed.

  2. FLARE (Facility for Laboratory Reconnection Experiments): A Major Next-Step for Laboratory Studies of Magnetic Reconnection

    NASA Astrophysics Data System (ADS)

    Ji, Hantao; Bhattacharjee, A.; Prager, S.; Daughton, W.; Bale, Stuart D.; Carter, T.; Crocker, N.; Drake, J.; Egedal, J.; Sarff, J.; Fox, W.; Jara-Almonte, J.; Myers, C.; Ren, Y.; Yamada, M.; Yoo, J.

    2015-04-01

    A new intermediate-scale plasma experiment, called the Facility for Laboratory Reconnection Experiments or FLARE (flare.pppl.gov), is under construction at Princeton as a joint project by five universities and two national labs to study magnetic reconnection in regimes directly relevant to heliophysical and astrophysical plasmas. The currently existing small-scale experiments have been focusing on the single X-line reconnection process in plasmas either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. These new regimes involve multiple X-lines as guided by a reconnection "phase diagram", in which different coupling mechanisms from the global system scale to the local dissipation scale are classified into different reconnection phases [H. Ji & W. Daughton, Phys. Plasmas 18, 111207 (2011)]. The design of the FLARE device is based on the existing Magnetic Reconnection Experiment (MRX) (mrx.pppl.gov) and is to provide experimental access to the new phases involving multiple X-lines at large effective sizes and high Lundquist numbers, directly relevant to magnetospheric, solar wind, and solar coronal plasmas. After a brief summary of recent laboratory results on the topic of magnetic reconnection, the motivating major physics questions, the construction status, and the planned collaborative research especially with heliophysics communities will be discussed.

  3. The Spatiotemporal Trend of City Parks in Mainland China between 1981 and 2014: Implications for the Promotion of Leisure Time Physical Activity and Planning.

    PubMed

    Wang, Kai; Liu, Jianjun

    2017-09-29

    City parks, important environments built for physical activity, play critical roles in preventing chronic diseases and promoting public health. We used five commonly used park indicators to investigate the spatiotemporal trend of city parks in mainland China between 1981 and 2014 at three scales: national, provincial and city class. City parks in China increased significantly with a turning point occurring around the year 2000. Up until the end of 2014, there were 13,074 city parks totaling 367,962 ha with 0.29 parks per 10,000 residents, 8.26 m² of park per capita and 2.00% of parkland as a percentage of urban area. However, there is still a large gap compared to the established American and Japanese city park systems, and only 5.4% of people aged above 20 access city parks for physical activity. The low number of parks per 10,000 residents brings up the issue of the accessibility to physical activity areas that public parks provide. The concern of spatial disparity, also apparent for all five city park indicators, differed strongly at provincial and city class scales. The southern and eastern coastal provinces of Guangdong, Fujian, Zhejiang and Shandong have abundant city park resources. At the scale of the city classes, mega-city II had the highest of the three ratio indicators and the large city class had the lowest. On one hand, the leading province Guangdong and its mega-cities Shenzhen and Dongguan had park indicators comparable to the United States and Japan. On the other hand, there were still five cities with no city parks and many cities with extremely low park indicators. In China, few cities have realized the importance of city parks for the promotion of leisure time physical activity. It is urgent that state and city park laws or guidelines are passed that can serve as baselines for planning a park system and determining a minimum standard for city parks with free, accessible and safe physical activity areas and sports facilities.

  4. The Spatiotemporal Trend of City Parks in Mainland China between 1981 and 2014: Implications for the Promotion of Leisure Time Physical Activity and Planning

    PubMed Central

    Wang, Kai; Liu, Jianjun

    2017-01-01

    City parks, important environments built for physical activity, play critical roles in preventing chronic diseases and promoting public health. We used five commonly used park indicators to investigate the spatiotemporal trend of city parks in mainland China between 1981 and 2014 at three scales: national, provincial and city class. City parks in China increased significantly with a turning point occurring around the year 2000. Up until the end of 2014, there were 13,074 city parks totaling 367,962 ha with 0.29 parks per 10,000 residents, 8.26 m2 of park per capita and 2.00% of parkland as a percentage of urban area. However, there is still a large gap compared to the established American and Japanese city park systems, and only 5.4% of people aged above 20 access city parks for physical activity. The low number of parks per 10,000 residents brings up the issue of the accessibility to physical activity areas that public parks provide. The concern of spatial disparity, also apparent for all five city park indicators, differed strongly at provincial and city class scales. The southern and eastern coastal provinces of Guangdong, Fujian, Zhejiang and Shandong have abundant city park resources. At the scale of the city classes, mega-city II had the highest of the three ratio indicators and the large city class had the lowest. On one hand, the leading province Guangdong and its mega-cities Shenzhen and Dongguan had park indicators comparable to the United States and Japan. On the other hand, there were still five cities with no city parks and many cities with extremely low park indicators. In China, few cities have realized the importance of city parks for the promotion of leisure time physical activity. It is urgent that state and city park laws or guidelines are passed that can serve as baselines for planning a park system and determining a minimum standard for city parks with free, accessible and safe physical activity areas and sports facilities. PMID:28961182

  5. Amphibian and reptile road-kills on tertiary roads in relation to landscape structure: using a citizen science approach with open-access land cover data.

    PubMed

    Heigl, Florian; Horvath, Kathrin; Laaha, Gregor; Zaller, Johann G

    2017-06-26

    Amphibians and reptiles are among the most endangered vertebrate species worldwide. However, little is known about how they are affected by road-kills on tertiary roads and whether the surrounding landscape structure can explain road-kill patterns. The aim of our study was to examine the applicability of open-access remote sensing data for a large-scale citizen science approach to describe spatial patterns of road-killed amphibians and reptiles on tertiary roads. Using a citizen science app, we monitored road-kills of amphibians and reptiles along 97.5 km of tertiary roads covering agricultural, municipal and interurban roads as well as cycling paths in eastern Austria over two seasons. The surrounding landscape was assessed using open-access land cover classes for the region (Coordination of Information on the Environment, CORINE). Hotspot analysis was performed using kernel density estimation (KDE+). Relations between land cover classes and amphibian and reptile road-kills were analysed with conditional probabilities and general linear models (GLM). We also estimated the potential cost-efficiency of a large-scale citizen science monitoring project. We recorded 180 amphibian and 72 reptile road-kills comprising eight species, mainly occurring on agricultural roads. KDE+ analyses revealed a significant clustering of road-killed amphibians and reptiles, which is important information for authorities aiming to mitigate road-kills. Overall, hotspots of amphibian and reptile road-kills were next to the land cover classes arable land, suburban areas and vineyards. Conditional probabilities and GLMs identified road-kills especially next to preferred habitats of the green toad, common toad and grass snake, the most frequently found road-killed species. A citizen science approach appeared to be more cost-efficient than monitoring by professional researchers alone once more than 400 km of road are monitored. Our findings show that freely available remote sensing data in combination with a citizen science approach would be a cost-efficient way to identify and monitor road-kill hotspots of amphibians and reptiles at a larger scale.
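
    To give a flavour of the hotspot step: KDE+ tests the statistical significance of clusters on the road network itself, but the underlying idea, kernel density along a road's chainage, can be sketched in a few lines. Everything below (positions, bandwidth, the hotspot cut-off) is an illustrative assumption, not the study's actual procedure.

    ```python
    import numpy as np
    from scipy.stats import gaussian_kde

    # Hypothetical road-kill positions as chainage (km) along one 10 km road.
    kills_km = np.array([0.4, 0.5, 0.55, 2.1, 2.15, 2.2, 2.3, 5.8, 9.0, 9.1])

    density = gaussian_kde(kills_km, bw_method=0.15)  # kernel density estimate
    grid = np.linspace(0, 10, 401)                    # evaluation points along the road
    d = density(grid)

    threshold = d.mean() + 2 * d.std()                # crude cut-off (assumption);
    hot = grid[d > threshold]                         # KDE+ instead tests significance
    print("candidate hotspot chainages (km):", np.round(hot, 2))
    ```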

  6. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    NASA Astrophysics Data System (ADS)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects being explored include how to process these voluminous SLCs and interferograms at global scales and how to keep up with the large daily SAR data volumes. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise in processing SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also present some of our findings from applying machine learning and data analytics to the processed SAR data streams, as well as lessons learned on easing the SAR community onto interfacing with these cloud-based SAR science data systems.

  7. A user-friendly tool to transform large scale administrative data into wide table format using a MapReduce program with a Pig Latin based script.

    PubMed

    Horiguchi, Hiromasa; Yasunaga, Hideo; Hashimoto, Hideki; Ohe, Kazuhiko

    2012-12-22

    Secondary use of large scale administrative data is increasingly popular in health services and clinical research, where a user-friendly tool for data management is in great demand. MapReduce technology such as Hadoop is a promising tool for this purpose, though its use has been limited by the lack of user-friendly functions for transforming large scale data into wide table format, where each subject is represented by one row, for use in health services and clinical research. Since the original specification of Pig provides very few functions for column field management, we have developed a novel system called GroupFilterFormat to handle the definition of field and data content based on a Pig Latin script. We have also developed, as an open-source project, several user-defined functions to transform the table format using GroupFilterFormat and to deal with processing that considers date conditions. Having prepared dummy discharge summary data for 2.3 million inpatients and medical activity log data for 950 million events, we used the Elastic Compute Cloud environment provided by Amazon Inc. to execute processing speed and scaling benchmarks. In the speed benchmark test, the response time was significantly reduced and a linear relationship was observed between the quantity of data and processing time in both a small and a very large dataset. The scaling benchmark test showed clear scalability. In our system, doubling the number of nodes resulted in a 47% decrease in processing time. Our newly developed system is widely accessible as an open resource. This system is very simple and easy to use for researchers who are accustomed to using declarative command syntax for commercial statistical software and Structured Query Language. Although our system needs further sophistication to allow more flexibility in scripts and to improve efficiency in data processing, it shows promise in facilitating the application of MapReduce technology to efficient data processing with large scale administrative data in health services and clinical research.
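
    The wide-table transform at the heart of the system (one row per subject, one column per item) is easiest to see in miniature. The pandas sketch below is only an analogue of what the paper implements as Pig Latin user-defined functions on Hadoop; the column names and values are invented.

    ```python
    import pandas as pd

    # Long-format activity log: one row per (subject, item) event.
    events = pd.DataFrame({
        "patient_id": [1, 1, 1, 2, 2],
        "item":       ["drug_A", "drug_B", "lab_X", "drug_A", "lab_X"],
        "value":      [10.0, 5.0, 1.2, 20.0, 0.9],
    })

    # Wide-table format: one row per subject, one column per item.
    wide = events.pivot_table(index="patient_id", columns="item",
                              values="value", aggfunc="first")
    print(wide)
    ```

    The point of doing this in MapReduce rather than in a single-machine library is scale: grouping by subject distributes naturally across nodes, which is what the reported near-linear speed-up reflects.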

  8. Study of Multiple Scale Physics of Magnetic Reconnection on the FLARE (Facility for Laboratory Reconnection Experiments)

    NASA Astrophysics Data System (ADS)

    Ji, H.; Bhattacharjee, A.; Prager, S.; Daughton, W. S.; Bale, S. D.; Carter, T. A.; Crocker, N.; Drake, J. F.; Egedal, J.; Sarff, J.; Wallace, J.; Chen, Y.; Cutler, R.; Fox, W. R., II; Heitzenroeder, P.; Kalish, M.; Jara-Almonte, J.; Myers, C. E.; Ren, Y.; Yamada, M.; Yoo, J.

    2015-12-01

    The FLARE device (flare.pppl.gov) is a new intermediate-scale plasma experiment under construction at Princeton to study magnetic reconnection in regimes directly relevant to space, solar and astrophysical plasmas. The existing small-scale experiments have been focusing on the single X-line reconnection process either with small effective sizes or at low Lundquist numbers, both of which are typically very large in natural plasmas. The configuration of the FLARE device is designed to provide experimental access to the new regimes involving multiple X-lines, as guided by a reconnection "phase diagram" [Ji & Daughton, PoP (2011)]. Most of the major components of the FLARE device have been designed and are under construction. The device will be assembled and installed in 2016, followed by commissioning and operation in 2017. The planned research on FLARE as a user facility will be discussed, on topics including the multiple-scale nature of magnetic reconnection from global fluid scales to ion and electron kinetic scales. Results from scoping simulations based on particle and fluid codes and possible comparative research with space measurements will be presented.

  9. Transposon facilitated DNA sequencing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, D.E.; Berg, C.M.; Huang, H.V.

    1990-01-01

    The purpose of this research is to investigate and develop methods that exploit the power of bacterial transposable elements for large scale DNA sequencing. Our premise is that the use of transposons to put primer binding sites randomly in target DNAs should provide access to all portions of large DNA fragments, without the inefficiencies of methods involving random subcloning and attendant repetitive sequencing, or of sequential synthesis of many oligonucleotide primers that are used to march systematically along a DNA molecule. Two unrelated bacterial transposons, Tn5 and {gamma}{delta}, are being used because they have both proven useful for molecular analyses, and because they differ sufficiently in mechanism and specificity of transposition to merit parallel development.

  10. EuroPhenome and EMPReSS: online mouse phenotyping resource

    PubMed Central

    Mallon, Ann-Marie; Hancock, John M.

    2008-01-01

    EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC). PMID:17905814

  11. EuroPhenome and EMPReSS: online mouse phenotyping resource.

    PubMed

    Mallon, Ann-Marie; Blake, Andrew; Hancock, John M

    2008-01-01

    EuroPhenome (http://www.europhenome.org) and EMPReSS (http://empress.har.mrc.ac.uk/) form an integrated resource to provide access to data and procedures for mouse phenotyping. EMPReSS describes 96 Standard Operating Procedures for mouse phenotyping. EuroPhenome contains data resulting from carrying out EMPReSS protocols on four inbred laboratory mouse strains. As well as web interfaces, both resources support web services to enable integration with other mouse phenotyping and functional genetics resources, and are committed to initiatives to improve integration of mouse phenotype databases. EuroPhenome will be the repository for a recently initiated effort to carry out large-scale phenotyping on a large number of knockout mouse lines (EUMODIC).

  12. Planning of distributed generation in distribution network based on improved particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Li, Jinze; Qu, Zhi; He, Xiaoyang; Jin, Xiaoming; Li, Tie; Wang, Mingkai; Han, Qiu; Gao, Ziji; Jiang, Feng

    2018-02-01

    Large-scale access of distributed generation can relieve current environmental pressures but, at the same time, increases the complexity and uncertainty of the overall distribution system. Rational planning of distributed generation can effectively improve the system voltage level. To this end, the specific impact on distribution network power quality caused by the access of typical distributed power sources was analyzed and, by improving the learning factor and the inertia weight, an improved particle swarm optimization algorithm (IPSO) was proposed to solve distributed generation planning for the distribution network with better local and global search performance. Results show that the proposed method can substantially reduce the system network loss and improve the economics of system operation with distributed generation.
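
    A minimal sketch of the kind of improvement the abstract describes: a particle swarm whose inertia weight decreases and whose learning factors vary over the run, so early iterations favour global exploration and later ones local refinement. The schedules and the placeholder cost function are assumptions for illustration; the real planning problem would evaluate network loss via a power-flow calculation under siting and sizing constraints.

    ```python
    import numpy as np

    def cost(x):                      # placeholder for system network loss
        return np.sum((x - 0.3) ** 2, axis=1)

    rng = np.random.default_rng(1)
    n, dim, iters = 30, 5, 200
    pos = rng.uniform(0, 1, (n, dim))         # candidate DG capacities (p.u.)
    vel = np.zeros((n, dim))
    pbest, pbest_f = pos.copy(), cost(pos)
    gbest = pbest[pbest_f.argmin()]

    for t in range(iters):
        w = 0.9 - 0.5 * t / iters             # inertia weight: 0.9 -> 0.4
        c1 = 2.5 - 2.0 * t / iters            # cognitive learning factor shrinks
        c2 = 0.5 + 2.0 * t / iters            # social learning factor grows
        r1, r2 = rng.random((n, dim)), rng.random((n, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, 1)        # keep candidates feasible
        f = cost(pos)
        better = f < pbest_f
        pbest[better], pbest_f[better] = pos[better], f[better]
        gbest = pbest[pbest_f.argmin()]

    print("best placement vector:", np.round(gbest, 3))
    ```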

  13. Striking the balance: Privacy and spatial pattern preservation in masked GPS data

    NASA Astrophysics Data System (ADS)

    Seidl, Dara E.

    Volunteered location and trajectory data are increasingly collected and applied in analysis for a variety of academic fields and recreational pursuits. As access to personal location data increases, issues of privacy arise as individuals become identifiable and linked to other repositories of information. While the quality and precision of data are essential to accurate analysis, there is a tradeoff between privacy and access to data. Obfuscation of point data is a solution that aims to protect privacy and maximize preservation of spatial pattern. This study explores two methods of location obfuscation for volunteered GPS data: grid masking and random perturbation. These methods are applied to travel survey GPS data in the greater metropolitan regions of Chicago and Atlanta in the first large-scale GPS masking study of its kind.
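
    The two masking methods compared in the study are simple enough to sketch directly. A minimal illustration on projected (planar) coordinates follows; the cell size and perturbation radius are assumptions, not the study's parameters.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    pts = rng.uniform(0, 10_000, size=(1000, 2))   # toy GPS points, metres

    def grid_mask(xy, cell=500.0):
        """Snap each point to the centre of its grid cell (cell size in metres)."""
        return (np.floor(xy / cell) + 0.5) * cell

    def random_perturb(xy, max_r=500.0):
        """Displace each point by a random angle and radius up to max_r."""
        theta = rng.uniform(0, 2 * np.pi, len(xy))
        r = max_r * np.sqrt(rng.uniform(0, 1, len(xy)))  # uniform over the disc
        return xy + np.column_stack([r * np.cos(theta), r * np.sin(theta)])

    masked_grid = grid_mask(pts)
    masked_rand = random_perturb(pts)
    print(np.abs(masked_grid - pts).max(),
          np.linalg.norm(masked_rand - pts, axis=1).max())
    ```

    Grid masking caps the displacement per point but makes points coincide, while random perturbation preserves point multiplicity but blurs local pattern; that tension is exactly the privacy-versus-pattern tradeoff the study measures.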

  14. The Eukaryotic Pathogen Databases: a functional genomic resource integrating data from human and veterinary parasites.

    PubMed

    Harb, Omar S; Roos, David S

    2015-01-01

    Over the past 20 years, advances in high-throughput biological techniques and the availability of computational resources, including fast Internet access, have resulted in an explosion of large genome-scale data sets ("big data"). While such data are readily available for download, personal use and analysis from a variety of repositories, such analysis often requires seldom-available computational skills. As a result, a number of databases have emerged to provide scientists with online tools enabling the interrogation of data without the need for sophisticated computational skills beyond basic knowledge of Internet browser utility. This chapter focuses on the Eukaryotic Pathogen Databases (EuPathDB: http://eupathdb.org) Bioinformatic Resource Center (BRC) and illustrates some of the available tools and methods.

  15. A pharyngeal jaw evolutionary innovation facilitated extinction in Lake Victoria cichlids.

    PubMed

    McGee, Matthew D; Borstein, Samuel R; Neches, Russell Y; Buescher, Heinz H; Seehausen, Ole; Wainwright, Peter C

    2015-11-27

    Evolutionary innovations, traits that give species access to previously unoccupied niches, may promote speciation and adaptive radiation. Here, we show that such innovations can also result in competitive inferiority and extinction. We present evidence that the modified pharyngeal jaws of cichlid fishes and several marine fish lineages, a classic example of evolutionary innovation, are not universally beneficial. A large-scale analysis of dietary evolution across marine fish lineages reveals that the innovation compromises access to energy-rich predator niches. We show that this competitive inferiority shaped the adaptive radiation of cichlids in Lake Tanganyika and played a pivotal and previously unrecognized role in the mass extinction of cichlid fishes in Lake Victoria after Nile perch invasion. Copyright © 2015, American Association for the Advancement of Science.

  16. Preface to the volume Large Rivers

    NASA Astrophysics Data System (ADS)

    Latrubesse, Edgardo M.; Abad, Jorge D.

    2018-02-01

    The study and knowledge of the geomorphology of large rivers have increased significantly in recent years, and the factors that triggered these advances are multiple. On one hand, modern technologies became more accessible, and their widespread usage allowed the collection of data from large rivers as never seen before. The generalized use of high-tech data collection with geophysics equipment such as acoustic Doppler current profilers (ADCPs) and multibeam echosounders, plus the availability of geospatial and computational tools for morphodynamic, hydrological and hydrosedimentological modeling, have accelerated scientific production on the geomorphology of large rivers at a global scale. Despite the advances, there is still a lot of work ahead. A good part of the large rivers are in the tropics and many are still unexplored. The tropics also hold crucial fluvial basins that concentrate a good part of the gross domestic product of large countries, such as the Parana River in Argentina and Brazil, the Ganges-Brahmaputra in India, the Indus River in Pakistan, and the Mekong River in several countries of Southeast Asia. The environmental importance of tropical rivers is also outstanding. They hold the highest biodiversity of fluvial fauna and alluvial vegetation, and many of them, particularly those in Southeast Asia, are among the most hazardous systems for floods in the entire world. Tropical rivers draining mountain chains such as the Himalaya, the Andes and insular Southeast Asia are also among the most heavily sediment-loaded rivers and play a key role in both the storage of sediment at continental scale and the transfer of sediments from the continent to the ocean at planetary scale (Andermann et al., 2012; Latrubesse and Restrepo, 2014; Milliman and Syvitski, 1992; Milliman and Farnsworth, 2011; Sinha and Friend, 1994).

  17. A High-Resolution InDel (Insertion–Deletion) Markers-Anchored Consensus Genetic Map Identifies Major QTLs Governing Pod Number and Seed Yield in Chickpea

    PubMed Central

    Srivastava, Rishi; Singh, Mohar; Bajaj, Deepak; Parida, Swarup K.

    2016-01-01

    Development and large-scale genotyping of user-friendly informative genome/gene-derived InDel markers in natural and mapping populations is vital for accelerating genomics-assisted breeding applications of chickpea with minimal resource expense. The present investigation employed a high-throughput whole-genome next-generation resequencing strategy in low and high pod number parental accessions and homozygous individuals constituting the bulks from each of two inter-specific mapping populations [(Pusa 1103 × ILWC 46) and (Pusa 256 × ILWC 46)] to develop non-erroneous InDel markers at a genome-wide scale. Comparing these high-quality genomic sequences, 82,360 InDel markers with reference to the kabuli genome and 13,891 InDel markers exhibiting differentiation between the low and high pod number parental accessions and bulks of the aforementioned mapping populations were developed. These informative markers were structurally and functionally annotated in diverse coding and non-coding sequence components of the genome/genes of kabuli chickpea. The functional significance of regulatory and coding (frameshift and large-effect mutation) InDel markers for establishing marker-trait linkages through association/genetic mapping was apparent. The markers showed high amplification (97%) and intra-specific polymorphic potential (58–87%) among a diverse panel of cultivated desi, kabuli, and wild accessions, even using a simple, cost-efficient agarose gel-based assay, implying their utility in large-scale genetic analysis, especially in domesticated chickpea with its narrow genetic base. Two high-density inter-specific genetic linkage maps generated using the aforesaid mapping populations were integrated to construct a consensus 1479 InDel markers-anchored high-resolution (inter-marker distance: 0.66 cM) genetic map for efficient molecular mapping of major QTLs governing pod number and seed yield per plant in chickpea. Utilizing these high-density genetic maps as anchors, three major genomic regions harboring robust QTLs for each of pod number and seed yield (15–28% phenotypic variation explained) were identified on chromosomes 2, 4, and 6. The integration of genetic and physical maps at these QTLs scaled down the long major QTL intervals into high-resolution short physical intervals (0.89–2.94 Mb) for pod number and seed yield, which were validated in multiple genetic backgrounds of the two chickpea mapping populations. The genome-wide InDel markers, including the natural allelic variants and genomic loci/genes delineated at the six major robust pod number and seed yield QTLs (especially the one colocalized novel congruent QTL) mapped on the high-density consensus genetic map, were found most promising in chickpea. These functionally relevant molecular tags can drive marker-assisted genetic enhancement to develop high-yielding cultivars with increased pod/seed number and yield in chickpea. PMID:27695461

  18. A small-gap electrostatic micro-actuator for large deflections

    PubMed Central

    Conrad, Holger; Schenk, Harald; Kaiser, Bert; Langa, Sergiu; Gaudet, Matthieu; Schimmanz, Klaus; Stolz, Michael; Lenz, Miriam

    2015-01-01

    Common quasi-static electrostatic micro actuators have significant limitations in deflection due to electrode separation and unstable drive regions. State-of-the-art electrostatic actuators achieve maximum deflections of approximately one third of the electrode separation. Large electrode separations and high driving voltages are normally required to achieve large actuator movements. Here we report on an electrostatic actuator class, fabricated in a CMOS-compatible process, which allows large deflections with small electrode separations. The concept presented makes the huge electrostatic forces within nanometre-scale electrode separations accessible for large deflections. Electrostatic actuations larger than the electrode separation were measured. An analytical theory is compared with measurement and simulation results and enables a closer understanding of these actuators. The scaling behaviour discussed indicates significant future improvements in actuator deflection. The presented driving concept enables the investigation and development of novel micro systems with a high potential for improved device and system performance. PMID:26655557
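
    The one-third limit and the gap dependence of the force quoted above follow from textbook parallel-plate electrostatics; a hedged sketch of the standard result (not taken from the paper):

    ```latex
    % Parallel-plate electrostatic force for plate area A, initial gap g,
    % deflection x and voltage V (standard textbook result):
    F_{\mathrm{el}} = \frac{\varepsilon_0 A V^2}{2\,(g - x)^2}
    % Balancing against a linear restoring spring, F = kx, the equilibrium
    % becomes unstable (pull-in) at
    x_{\text{pull-in}} = \frac{g}{3}
    % -- the "one third of the electrode separation" limit cited above.
    % The 1/(g-x)^2 scaling is also why nanometre-scale gaps give access
    % to huge electrostatic forces.
    ```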

  19. High-resolution digital brain atlases: a Hubble telescope for the brain.

    PubMed

    Jones, Edward G; Stone, James M; Karten, Harvey J

    2011-05-01

    We describe the implementation of a method for digitizing brain tissue sections containing normal and experimental data at microscopic resolution, and for making the content readily accessible online. Web-accessible brain atlases and virtual microscopes for online examination can be developed using existing computer and internet technologies. Resulting databases, made up of hierarchically organized, multiresolution images, enable rapid, seamless navigation through the vast image datasets generated by high-resolution scanning. Tools for visualization and annotation of virtual microscope slides enable remote and universal data sharing. Interactive visualization of a complete series of brain sections digitized at subneuronal levels of resolution offers fine-grained and large-scale localization and quantification of many aspects of neural organization and structure. The method is straightforward and replicable; it can increase accessibility and facilitate sharing of neuroanatomical data. It provides an opportunity for capturing and preserving irreplaceable, archival neurohistological collections and making them available to all scientists in perpetuity, if resources could be obtained from hitherto uninterested agencies of scientific support. © 2011 New York Academy of Sciences.

  20. GenBank.

    PubMed

    Benson, Dennis A; Karsch-Mizrachi, Ilene; Lipman, David J; Ostell, James; Wheeler, David L

    2007-01-01

    GenBank® is a comprehensive database that contains publicly available nucleotide sequences for more than 240 000 named organisms, obtained primarily through submissions from individual laboratories and batch submissions from large-scale sequencing projects. Most submissions are made using the web-based BankIt or standalone Sequin programs and accession numbers are assigned by GenBank staff upon receipt. Daily data exchange with the EMBL Data Library in Europe and the DNA Data Bank of Japan ensures worldwide coverage. GenBank is accessible through NCBI's retrieval system, Entrez, which integrates data from the major DNA and protein sequence databases along with taxonomy, genome, mapping, protein structure and domain information, and the biomedical journal literature via PubMed. BLAST provides sequence similarity searches of GenBank and other sequence databases. Complete bimonthly releases and daily updates of the GenBank database are available by FTP. To access GenBank and its related retrieval and analysis services, begin at the NCBI Homepage (www.ncbi.nlm.nih.gov).
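
    Programmatic access to GenBank records through Entrez can be illustrated with Biopython's Bio.Entrez wrapper; a minimal sketch, assuming Biopython is installed and using an arbitrary example accession:

    ```python
    # Minimal sketch: fetch one GenBank record via NCBI Entrez using
    # Biopython (assumed installed); the accession is just an example.
    from Bio import Entrez

    Entrez.email = "you@example.org"   # NCBI asks for a contact address

    # Retrieve one nucleotide record in GenBank flat-file format.
    handle = Entrez.efetch(db="nucleotide", id="NM_000518",
                           rettype="gb", retmode="text")
    print(handle.read()[:500])         # first lines of the record
    handle.close()
    ```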

  1. GenBank.

    PubMed

    Benson, Dennis A; Karsch-Mizrachi, Ilene; Lipman, David J; Ostell, James; Wheeler, David L

    2005-01-01

    GenBank is a comprehensive database that contains publicly available DNA sequences for more than 165,000 named organisms, obtained primarily through submissions from individual laboratories and batch submissions from large-scale sequencing projects. Most submissions are made using the web-based BankIt or standalone Sequin programs and accession numbers are assigned by GenBank staff upon receipt. Daily data exchange with the EMBL Data Library in the UK and the DNA Data Bank of Japan helps to ensure worldwide coverage. GenBank is accessible through NCBI's retrieval system, Entrez, which integrates data from the major DNA and protein sequence databases along with taxonomy, genome, mapping, protein structure and domain information, and the biomedical journal literature via PubMed. BLAST provides sequence similarity searches of GenBank and other sequence databases. Complete bimonthly releases and daily updates of the GenBank database are available by FTP. To access GenBank and its related retrieval and analysis services, go to the NCBI Homepage at http://www.ncbi.nlm.nih.gov.

  2. GenBank.

    PubMed

    Benson, Dennis A; Karsch-Mizrachi, Ilene; Lipman, David J; Ostell, James; Wheeler, David L

    2006-01-01

    GenBank® is a comprehensive database that contains publicly available DNA sequences for more than 205 000 named organisms, obtained primarily through submissions from individual laboratories and batch submissions from large-scale sequencing projects. Most submissions are made using the Web-based BankIt or standalone Sequin programs and accession numbers are assigned by GenBank staff upon receipt. Daily data exchange with the EMBL Data Library in Europe and the DNA Data Bank of Japan ensures worldwide coverage. GenBank is accessible through NCBI's retrieval system, Entrez, which integrates data from the major DNA and protein sequence databases along with taxonomy, genome, mapping, protein structure and domain information, and the biomedical journal literature via PubMed. BLAST provides sequence similarity searches of GenBank and other sequence databases. Complete bimonthly releases and daily updates of the GenBank database are available by FTP. To access GenBank and its related retrieval and analysis services, go to the NCBI Homepage at www.ncbi.nlm.nih.gov.

  3. GenBank.

    PubMed

    Benson, Dennis A; Karsch-Mizrachi, Ilene; Lipman, David J; Ostell, James; Wheeler, David L

    2008-01-01

    GenBank® is a comprehensive database that contains publicly available nucleotide sequences for more than 260 000 named organisms, obtained primarily through submissions from individual laboratories and batch submissions from large-scale sequencing projects. Most submissions are made using the web-based BankIt or standalone Sequin programs and accession numbers are assigned by GenBank staff upon receipt. Daily data exchange with the European Molecular Biology Laboratory Nucleotide Sequence Database in Europe and the DNA Data Bank of Japan ensures worldwide coverage. GenBank is accessible through NCBI's retrieval system, Entrez, which integrates data from the major DNA and protein sequence databases along with taxonomy, genome, mapping, protein structure and domain information, and the biomedical journal literature via PubMed. BLAST provides sequence similarity searches of GenBank and other sequence databases. Complete bimonthly releases and daily updates of the GenBank database are available by FTP. To access GenBank and its related retrieval and analysis services, begin at the NCBI Homepage: www.ncbi.nlm.nih.gov.

  4. GenBank

    PubMed Central

    Benson, Dennis A.; Karsch-Mizrachi, Ilene; Lipman, David J.; Ostell, James; Wheeler, David L.

    2008-01-01

    GenBank® is a comprehensive database that contains publicly available nucleotide sequences for more than 260 000 named organisms, obtained primarily through submissions from individual laboratories and batch submissions from large-scale sequencing projects. Most submissions are made using the web-based BankIt or standalone Sequin programs and accession numbers are assigned by GenBank staff upon receipt. Daily data exchange with the European Molecular Biology Laboratory Nucleotide Sequence Database in Europe and the DNA Data Bank of Japan ensures worldwide coverage. GenBank is accessible through NCBI's retrieval system, Entrez, which integrates data from the major DNA and protein sequence databases along with taxonomy, genome, mapping, protein structure and domain information, and the biomedical journal literature via PubMed. BLAST provides sequence similarity searches of GenBank and other sequence databases. Complete bimonthly releases and daily updates of the GenBank database are available by FTP. To access GenBank and its related retrieval and analysis services, begin at the NCBI Homepage: www.ncbi.nlm.nih.gov PMID:18073190

  5. Evaluation of Advanced Reactive Surface Area Estimates for Improved Prediction of Mineral Reaction Rates in Porous Media

    NASA Astrophysics Data System (ADS)

    Beckingham, L. E.; Mitnick, E. H.; Zhang, S.; Voltolini, M.; Yang, L.; Steefel, C. I.; Swift, A.; Cole, D. R.; Sheets, J.; Kneafsey, T. J.; Landrot, G.; Anovitz, L. M.; Mito, S.; Xue, Z.; Ajo Franklin, J. B.; DePaolo, D.

    2015-12-01

    CO2 sequestration in deep sedimentary formations is a promising means of reducing atmospheric CO2 emissions, but the rate and extent of mineral trapping remain difficult to predict. Reactive transport models provide predictions of mineral trapping based on laboratory mineral reaction rates, which have been shown to have large discrepancies with field rates. This, in part, may be due to poor quantification of mineral reactive surface area in natural porous media. Common estimates of mineral reactive surface area are ad hoc and typically based on grain size, adjusted several orders of magnitude to account for surface roughness and reactivity. This results in orders of magnitude discrepancies in estimated surface areas that directly translate into orders of magnitude discrepancies in model predictions. Additionally, natural systems can be highly heterogeneous and contain abundant nano- and micro-porosity, which can limit connected porosity and access to mineral surfaces. In this study, mineral-specific accessible surface areas are computed for a sample from the reservoir formation at the Nagaoka pilot CO2 injection site (Japan). Accessible mineral surface areas are determined from a multi-scale image analysis including X-ray microCT, SEM QEMSCAN, XRD, SANS, and SEM-FIB. Powder and flow-through column laboratory experiments are performed and the evolution of solutes in the aqueous phase is tracked. Continuum-scale reactive transport models are used to evaluate the impact of reactive surface area on predictions of experimental reaction rates. Evaluated reactive surface areas include geometric and specific surface areas (e.g., BET) in addition to their reactive-site weighted counterparts. The most accurate predictions of observed powder mineral dissolution rates were obtained through use of grain-size specific surface areas computed from a BET-based correlation. Effectively, this surface area reflects the grain-fluid contact area, or accessible surface area, in the powder dissolution experiment. In the model of the flow-through column experiment, the accessible mineral surface area, computed from the multi-scale image analysis, is evaluated in addition to the traditional surface area estimates.
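
    To illustrate the scale of the grain-size-based estimates the abstract criticizes, a hedged back-of-envelope sketch of the geometric (smooth-sphere) specific surface area; the density and diameters below are purely illustrative:

    ```python
    # Geometric specific surface area of idealized spherical grains:
    # SSA = surface/mass = (pi d^2) / (rho pi d^3 / 6) = 6 / (rho d).
    # Illustrative values only; not the study's measured surface areas.
    rho = 2650.0                          # grain density, kg/m^3 (quartz-like)
    for d_um in (1.0, 10.0, 100.0):       # grain diameters in micrometres
        d = d_um * 1e-6
        ssa = 6.0 / (rho * d)             # m^2/kg for smooth spheres
        print(f"d = {d_um:6.1f} um -> geometric SSA = {ssa:8.1f} m^2/kg")
    # Ad hoc roughness/reactivity factors of 10-1000x are often applied on
    # top of this, which is where orders-of-magnitude uncertainty enters.
    ```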

  6. Comparison of an algebraic multigrid algorithm to two iterative solvers used for modeling ground water flow and transport

    USGS Publications Warehouse

    Detwiler, R.L.; Mehl, S.; Rajaram, H.; Cheung, W.W.

    2002-01-01

    Numerical solution of large-scale ground water flow and transport problems is often constrained by the convergence behavior of the iterative solvers used to solve the resulting systems of equations. We demonstrate the ability of an algebraic multigrid algorithm (AMG) to efficiently solve the large, sparse systems of equations that result from computational models of ground water flow and transport in large and complex domains. Unlike geometric multigrid methods, this algorithm is applicable to problems in complex flow geometries, such as those encountered in pore-scale modeling of two-phase flow and transport. We integrated AMG into MODFLOW 2000 to compare two- and three-dimensional flow simulations using AMG to simulations using PCG2, a preconditioned conjugate gradient solver that uses the modified incomplete Cholesky preconditioner and is included with MODFLOW 2000. CPU times required for convergence with AMG were up to 140 times faster than those for PCG2. The cost of this increased speed was up to a nine-fold increase in required random access memory (RAM) for the three-dimensional problems and up to a four-fold increase in required RAM for the two-dimensional problems. We also compared two-dimensional numerical simulations of steady-state transport using AMG and the generalized minimum residual method with an incomplete LU-decomposition preconditioner. For these transport simulations, AMG yielded increased speeds of up to 17 times with only a 20% increase in required RAM. The ability of AMG to solve flow and transport problems in large, complex flow systems and its ready availability make it an ideal solver for use in both field-scale and pore-scale modeling.
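
    As a concrete illustration of applying algebraic multigrid to a sparse diffusion-type system, a minimal sketch assuming the pyamg, numpy and scipy packages (a 2-D Poisson model problem, not the MODFLOW/PCG2 configuration used in the study):

    ```python
    # Sketch: AMG vs. an unpreconditioned CG baseline on a 2-D Poisson
    # model problem. Illustrative only.
    import numpy as np
    import pyamg
    from scipy.sparse.linalg import cg

    A = pyamg.gallery.poisson((200, 200), format='csr')  # sparse SPD system
    b = np.random.rand(A.shape[0])

    ml = pyamg.ruge_stuben_solver(A)   # classical (Ruge-Stuben) AMG hierarchy
    x_amg = ml.solve(b, tol=1e-8)      # V-cycles until the tolerance is met
    print("AMG residual:", np.linalg.norm(b - A @ x_amg))

    x_cg, info = cg(A, b)              # conjugate gradient baseline
    print("CG residual: ", np.linalg.norm(b - A @ x_cg))
    ```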

  7. A Review of Control Strategy of the Large-scale of Electric Vehicles Charging and Discharging Behavior

    NASA Astrophysics Data System (ADS)

    Kong, Lingyu; Han, Jiming; Xiong, Wenting; Wang, Hao; Shen, Yaqi; Li, Ying

    2017-05-01

    Large-scale access of electric vehicles will bring huge challenges to the safe operation of the power grid, so it is important to control the charging and discharging of electric vehicles. First, from the perspectives of power quality and network losses, this paper points out the influence of electric-vehicle charging behaviour on the grid. Control strategies for electric vehicle charging and discharging are then reviewed and summarized in terms of direct and indirect control: direct control strategies regulate charging behaviour by controlling the charging and discharging power of the vehicles, while indirect control strategies work by controlling the price of charging and discharging. Finally, for the convenience of the reader, this paper proposes a complete outline of research methods for studying control strategies, taking into account the adaptability of electric vehicle control strategies and the possibility of their failure, and puts forward suggestions on key areas for future research.
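
    The indirect (price-based) control idea can be made concrete with a toy scheduling rule; a hedged sketch in which the tariff, battery deficit and charger rating are all invented for illustration:

    ```python
    # Toy "indirect control": schedule a flexible EV charge into the
    # cheapest hours of a 24-hour tariff. All numbers are invented.
    import math

    hourly_price = [0.30] * 7 + [0.50] * 12 + [0.20] * 5   # currency/kWh
    energy_needed_kwh = 40.0                                # energy deficit
    charger_kw = 8.0                                        # charger rating

    hours_needed = math.ceil(energy_needed_kwh / charger_kw)
    cheapest = sorted(range(24), key=lambda h: hourly_price[h])[:hours_needed]
    schedule_kw = [charger_kw if h in cheapest else 0.0 for h in range(24)]

    print("charging hours:", sorted(cheapest))
    print("energy cost   :", round(sum(hourly_price[h] * charger_kw
                                       for h in cheapest), 2))
    ```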

  8. Predicting viscous-range velocity gradient dynamics in large-eddy simulations of turbulence

    NASA Astrophysics Data System (ADS)

    Johnson, Perry; Meneveau, Charles

    2017-11-01

    The details of small-scale turbulence are not directly accessible in large-eddy simulations (LES), posing a modeling challenge because many important micro-physical processes depend strongly on the dynamics of turbulence in the viscous range. Here, we introduce a method for coupling existing stochastic models for the Lagrangian evolution of the velocity gradient tensor with LES to simulate unresolved dynamics. The proposed approach is implemented in LES of turbulent channel flow and detailed comparisons with DNS are carried out. An application to modeling the fate of deformable, small (sub-Kolmogorov) droplets at negligible Stokes number and low volume fraction with one-way coupling is carried out. These results illustrate the ability of the proposed model to predict the influence of small scale turbulence on droplet micro-physics in the context of LES. This research was made possible by a graduate Fellowship from the National Science Foundation and by a Grant from The Gulf of Mexico Research Initiative.

  9. Hydrodynamics of isotropic and liquid crystalline active polymer solutions.

    PubMed

    Ahmadi, Aphrodite; Marchetti, M C; Liverpool, T B

    2006-12-01

    We describe the large-scale collective behavior of solutions of polar biofilaments and stationary and mobile crosslinkers. Both mobile and stationary crosslinkers induce filament alignment promoting either polar or nematic order. In addition, mobile crosslinkers, such as clusters of motor proteins, exchange forces and torques among the filaments and render the homogeneous states unstable via filament bundling. We start from a Smoluchowski equation for rigid filaments in solution, where pairwise crosslink-mediated interactions among the filaments yield translational and rotational currents. The large-scale properties of the system are described in terms of continuum equations for filament and motor densities, polarization, and alignment tensor obtained by coarse-graining the Smoluchowski equation. The possible homogeneous and inhomogeneous states of the system are obtained as stable solutions of the dynamical equations and are characterized in terms of experimentally accessible parameters. We make contact with work by other authors and show that our model allows for an estimate of the various parameters in the hydrodynamic equations in terms of physical properties of the crosslinkers.
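
    The generic structure of such a Smoluchowski (kinetic) equation for rod-like filaments can be written as follows; a textbook-style sketch, not the paper's specific crosslink-mediated currents:

    ```latex
    % Probability density \psi(\mathbf{r},\hat{\mathbf{u}},t) of filaments
    % with centre-of-mass \mathbf{r} and orientation \hat{\mathbf{u}}:
    \frac{\partial \psi}{\partial t}
      = -\,\nabla \cdot \mathbf{J}_{\mathrm{t}}
        \;-\; \mathcal{R} \cdot \mathbf{J}_{\mathrm{r}},
    \qquad
    \mathcal{R} \equiv \hat{\mathbf{u}} \times \frac{\partial}{\partial \hat{\mathbf{u}}}
    % where J_t and J_r are translational and rotational probability
    % currents; in the paper these contain diffusive terms plus the
    % crosslink-mediated (active) contributions.
    ```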

  10. Graph 500 on OpenSHMEM: Using a Practical Survey of Past Work to Motivate Novel Algorithmic Developments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grossman, Max; Pritchard Jr., Howard Porter; Budimlic, Zoran

    2016-12-22

    Graph500 [14] is an effort to offer a standardized benchmark across large-scale distributed platforms that captures the behavior of common communication-bound graph algorithms. Graph500 differs from other large-scale benchmarking efforts (such as HPL [6] or HPGMG [7]) primarily in the irregularity of its computation and data access patterns. The core computational kernel of Graph500 is a breadth-first search (BFS) implemented on an undirected graph. The output of Graph500 is a spanning tree of the input graph, usually represented by a predecessor mapping for every node in the graph. The Graph500 benchmark defines several standard input sizes for implementers to test against. This report summarizes an investigation into implementing the Graph500 benchmark on OpenSHMEM, and focuses on first building a strong and practical understanding of the strengths and limitations of past work before proposing and developing novel extensions.
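
    The kernel itself is compact; a minimal serial sketch of BFS returning the predecessor (parent) map that represents the spanning tree — plain Python, not the distributed OpenSHMEM implementation the report studies:

    ```python
    # Serial BFS producing a predecessor map, the Graph500-style output.
    from collections import deque

    def bfs_predecessors(adj, root):
        """adj: dict mapping node -> iterable of neighbours (undirected)."""
        parent = {root: root}          # root points to itself by convention
        queue = deque([root])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in parent:    # first visit fixes the predecessor
                    parent[v] = u
                    queue.append(v)
        return parent

    adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
    print(bfs_predecessors(adj, 0))    # {0: 0, 1: 0, 2: 0, 3: 1}
    ```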

  11. Enabling Large-Scale Design, Synthesis and Validation of Small Molecule Protein-Protein Antagonists

    PubMed Central

    Koes, David; Khoury, Kareem; Huang, Yijun; Wang, Wei; Bista, Michal; Popowicz, Grzegorz M.; Wolf, Siglinde; Holak, Tad A.; Dömling, Alexander; Camacho, Carlos J.

    2012-01-01

    Although there is no shortage of potential drug targets, there are only a handful known low-molecular-weight inhibitors of protein-protein interactions (PPIs). One problem is that current efforts are dominated by low-yield high-throughput screening, whose rigid framework is not suitable for the diverse chemotypes present in PPIs. Here, we developed a novel pharmacophore-based interactive screening technology that builds on the role anchor residues, or deeply buried hot spots, have in PPIs, and redesigns these entry points with anchor-biased virtual multicomponent reactions, delivering tens of millions of readily synthesizable novel compounds. Application of this approach to the MDM2/p53 cancer target led to high hit rates, resulting in a large and diverse set of confirmed inhibitors, and co-crystal structures validate the designed compounds. Our unique open-access technology promises to expand chemical space and the exploration of the human interactome by leveraging in-house small-scale assays and user-friendly chemistry to rationally design ligands for PPIs with known structure. PMID:22427896

  12. Rucio, the next-generation Data Management system in ATLAS

    NASA Astrophysics Data System (ADS)

    Serfon, C.; Barisits, M.; Beermann, T.; Garonne, V.; Goossens, L.; Lassnig, M.; Nairz, A.; Vigne, R.; ATLAS Collaboration

    2016-04-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address the scaling requirements of HEP experiments. Rucio is an evolution of the ATLAS DDM system Don Quixote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 160 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio addresses these issues by relying on new technologies to ensure system scalability, covering new user requirements and employing a new automation framework to reduce operational overheads. This paper presents the key concepts of Rucio, details the Rucio design and the technology it employs, the tests that were conducted to validate it, and finally describes the migration steps that were conducted to move from DQ2 to Rucio.

  13. Using adaptive-mesh refinement in SCFT simulations of surfactant adsorption

    NASA Astrophysics Data System (ADS)

    Sides, Scott; Kumar, Rajeev; Jamroz, Ben; Crockett, Robert; Pletzer, Alex

    2013-03-01

    Adsorption of surfactants at interfaces is relevant to many applications such as detergents, adhesives, emulsions and ferrofluids. Atomistic simulations of interface adsorption are challenging due to the difficulty of modeling the wide range of length scales in these problems: the thin interface region in equilibrium with a large bulk region that serves as a reservoir for the adsorbed species. Self-consistent field theory (SCFT) has been extremely useful for studying the morphologies of dense block copolymer melts. Field-theoretic simulations such as these are able to access large length and time scales that are difficult or impossible for particle-based simulations such as molecular dynamics. However, even SCFT methods can be difficult to apply to systems in which small spatial regions might require finer resolution than most of the simulation grid (e.g. interface adsorption and confinement). We will present results on interface adsorption simulations using PolySwift++, an object-oriented polymer SCFT simulation code, aided by the Tech-X Chompst library that enables block-structured AMR calculations with PETSc.

  14. Invisible water, visible impact: groundwater use and Indian agriculture under climate change

    NASA Astrophysics Data System (ADS)

    Zaveri, Esha; Grogan, Danielle S.; Fisher-Vanden, Karen; Frolking, Steve; Lammers, Richard B.; Wrenn, Douglas H.; Prusevich, Alexander; Nicholas, Robert E.

    2016-08-01

    India is one of the world’s largest food producers, making the sustainability of its agricultural system of global significance. Groundwater irrigation underpins India’s agriculture, currently boosting crop production by enough to feed 170 million people. Groundwater overexploitation has led to drastic declines in groundwater levels, threatening to push this vital resource out of reach for millions of small-scale farmers who are the backbone of India’s food security. Historically, losing access to groundwater has decreased agricultural production and increased poverty. We take a multidisciplinary approach to assess climate change challenges facing India’s agricultural system, and to assess the effectiveness of large-scale water infrastructure projects designed to meet these challenges. We find that even in areas that experience climate change induced precipitation increases, expansion of irrigated agriculture will require increasing amounts of unsustainable groundwater. The large proposed national river linking project has limited capacity to alleviate groundwater stress. Thus, without intervention, poverty and food insecurity in rural India are likely to worsen.

  15. Ingestion of bacterially expressed double-stranded RNA inhibits gene expression in planarians.

    PubMed

    Newmark, Phillip A; Reddien, Peter W; Cebrià, Francesc; Sánchez Alvarado, Alejandro

    2003-09-30

    Freshwater planarian flatworms are capable of regenerating complete organisms from tiny fragments of their bodies; the basis for this regenerative prowess is an experimentally accessible stem cell population that is present in the adult planarian. The study of these organisms, classic experimental models for investigating metazoan regeneration, has been revitalized by the application of modern molecular biological approaches. The identification of thousands of unique planarian ESTs, coupled with large-scale whole-mount in situ hybridization screens, and the ability to inhibit planarian gene expression through double-stranded RNA-mediated genetic interference, provide a wealth of tools for studying the molecular mechanisms that regulate tissue regeneration and stem cell biology in these organisms. Here we show that, as in Caenorhabditis elegans, ingestion of bacterially expressed double-stranded RNA can inhibit gene expression in planarians. This inhibition persists throughout the process of regeneration, allowing phenotypes with disrupted regenerative patterning to be identified. These results pave the way for large-scale screens for genes involved in regenerative processes.

  16. News from the proton - recent DIS results from HERA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meier, K.

    1997-01-01

    Recent results from the two large general-purpose detectors H1 and ZEUS at HERA (DESY, Hamburg, Germany) are presented. Emphasis is given to the analysis of deep inelastic scattering defined by the observation of the scattered electron or positron in the main calorimeters. Results on purely inclusive cross sections lead to a determination of the charged (quark) parton distribution F_2(x, Q^2). Access to the electrically neutral parton content (gluons) is obtained indirectly by an analysis of the expected scaling-violation behavior of F_2, or directly from multijet rates originating from well-defined initial parton configurations. Finally, the recently uncovered subclass of large rapidity gap (LRG) events has been analyzed in terms of F_2. The result supports the concept of a color-neutral object (Pomeron, IP) being probed by a hard-scattering electron. Evidence for factorization of the Pomeron radiation process as well as for scaling in the inclusive IP structure functions has been found.

  17. Design and implementation of scalable tape archiver

    NASA Technical Reports Server (NTRS)

    Nemoto, Toshihiro; Kitsuregawa, Masaru; Takagi, Mikio

    1996-01-01

    In order to reduce costs, computer manufacturers try to use commodity parts as much as possible. Mainframes using proprietary processors are being replaced by high performance RISC microprocessor-based workstations, which are further being replaced by the commodity microprocessors used in personal computers. Highly reliable disks for mainframes are also being replaced by disk arrays, which are complexes of disk drives. In this paper we try to clarify the feasibility of a large scale tertiary storage system composed of 8-mm tape archivers utilizing robotics. In the near future, the 8-mm tape archiver will be widely used and become a commodity part, since the recent rapid growth of multimedia applications requires much larger storage than disk drives can provide. We designed a scalable tape archiver which connects as many 8-mm tape archivers (element archivers) as possible. In the scalable archiver, robotics can exchange a cassette tape between two adjacent element archivers mechanically. Thus, we can build a large scalable archiver inexpensively. In addition, a sophisticated migration mechanism distributes frequently accessed tapes (hot tapes) evenly among all of the element archivers, which improves the throughput considerably. Even with the failures of some tape drives, the system dynamically redistributes hot tapes to the other element archivers which have live tape drives. Several kinds of specially tailored huge archivers are on the market; however, the 8-mm tape scalable archiver could replace them. To maintain high performance in spite of high access locality when a large number of archivers are attached to the scalable archiver, it is necessary to scatter frequently accessed cassettes among the element archivers and to use the tape drives efficiently. For this purpose, we introduce two cassette migration algorithms, foreground migration and background migration. Background migration transfers cassettes between element archivers to redistribute frequently accessed cassettes, thus balancing the load of each archiver. Background migration occurs when the robotics are idle. Both migration algorithms are based on the access frequency and space utilization of each element archiver. By normalizing these parameters according to the number of drives in each element archiver, it is possible to maintain high performance even if some tape drives fail. We found that foreground migration is efficient at reducing access response time. Besides foreground migration, background migration makes it possible to track transitions in access locality quickly.
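
    The drive-normalized rebalancing idea can be sketched as a toy algorithm; everything below (names, data structures, the proportional rule) is illustrative rather than the paper's actual design:

    ```python
    # Toy background-migration pass: redistribute "hot" cassettes so each
    # element archiver holds a share proportional to its live tape drives.
    def background_migration(archivers):
        """archivers: list of dicts {'hot': [cassette ids], 'drives': int}."""
        total_hot = sum(len(a['hot']) for a in archivers)
        live = sum(a['drives'] for a in archivers)
        if live == 0:
            return archivers
        targets = [round(total_hot * a['drives'] / live) for a in archivers]
        surplus = []
        for a, t in zip(archivers, targets):
            while len(a['hot']) > t:               # overloaded: shed tapes
                surplus.append(a['hot'].pop())
        for a, t in zip(archivers, targets):
            while len(a['hot']) < t and surplus:   # underloaded: absorb them
                a['hot'].append(surplus.pop())
        return archivers

    # An archiver whose drives all failed ('drives': 0) sheds its hot tapes.
    print(background_migration([{'hot': ['a', 'b', 'c'], 'drives': 0},
                                {'hot': ['d'], 'drives': 2}]))
    ```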

  18. 75 FR 37479 - In the Matter of NuScale Power, Inc. and All Other Persons; Who Seek or Obtain Access to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-29

    ... date of this Order, submit the fingerprints of one individual whom (a) NuScale nominates as the....\\2\\ NuScale may, at the same time or later, submit the fingerprints of other individuals to whom NuScale seeks to grant access to SGI. Fingerprints shall be submitted and reviewed in accordance with the...

  19. The state-led large scale public private partnership 'Chiranjeevi Program' to increase access to institutional delivery among poor women in Gujarat, India: How has it done? What can we learn?

    PubMed

    De Costa, Ayesha; Vora, Kranti S; Ryan, Kayleigh; Sankara Raman, Parvathy; Santacatterina, Michele; Mavalankar, Dileep

    2014-01-01

    Many low-middle income countries have focused on improving access to and quality of obstetric care, as part of promoting a facility based intra-partum care strategy to reduce maternal mortality. The state of Gujarat in India, implements a facility based intra-partum care program through its large for-profit private obstetric sector, under a state-led public-private-partnership, the Chiranjeevi Yojana (CY), under which the state pays accredited private obstetricians to perform deliveries for poor/tribal women. We examine CY performance, its contribution to overall trends in institutional deliveries in Gujarat over the last decade and its effect on private and public sector deliveries there. District level institutional delivery data (public, private, CY), national surveys, poverty estimates, census data were used. Institutional delivery trends in Gujarat 2000-2010 are presented; including contributions of different sectors and CY. Piece-wise regression was used to study the influence of the CY program on public and private sector institutional delivery. Institutional delivery rose from 40.7% (2001) to 89.3% (2010), driven by sharp increases in private sector deliveries. Public sector and CY contributed 25-29% and 13-16% respectively of all deliveries each year. In 2007, 860 of 2000 private obstetricians participated in CY. Since 2007, >600,000 CY deliveries occurred i.e. one-third of births in the target population. Caesareans under CY were 6%, higher than the 2% reported among poor women by the DLHS survey just before CY. CY did not influence the already rising proportion of private sector deliveries in Gujarat. This paper reports a state-led, fully state-funded, large-scale public-private partnership to improve poor women's access to institutional delivery - there have been >600,000 beneficiaries. While caesarean proportions are higher under CY than before, it is uncertain if all beneficiaries who require sections receive these. Other issues to explore include quality of care, provider attrition and the relatively low coverage.

  20. The State-Led Large Scale Public Private Partnership ‘Chiranjeevi Program’ to Increase Access to Institutional Delivery among Poor Women in Gujarat, India: How Has It Done? What Can We Learn?

    PubMed Central

    De Costa, Ayesha; Vora, Kranti S.; Ryan, Kayleigh; Sankara Raman, Parvathy; Santacatterina, Michele; Mavalankar, Dileep

    2014-01-01

    Background Many low-middle income countries have focused on improving access to and quality of obstetric care, as part of promoting a facility based intra-partum care strategy to reduce maternal mortality. The state of Gujarat in India, implements a facility based intra-partum care program through its large for-profit private obstetric sector, under a state-led public-private-partnership, the Chiranjeevi Yojana (CY), under which the state pays accredited private obstetricians to perform deliveries for poor/tribal women. We examine CY performance, its contribution to overall trends in institutional deliveries in Gujarat over the last decade and its effect on private and public sector deliveries there. Methods District level institutional delivery data (public, private, CY), national surveys, poverty estimates, census data were used. Institutional delivery trends in Gujarat 2000–2010 are presented; including contributions of different sectors and CY. Piece-wise regression was used to study the influence of the CY program on public and private sector institutional delivery. Results Institutional delivery rose from 40.7% (2001) to 89.3% (2010), driven by sharp increases in private sector deliveries. Public sector and CY contributed 25–29% and 13–16% respectively of all deliveries each year. In 2007, 860 of 2000 private obstetricians participated in CY. Since 2007, >600,000 CY deliveries occurred i.e. one-third of births in the target population. Caesareans under CY were 6%, higher than the 2% reported among poor women by the DLHS survey just before CY. CY did not influence the already rising proportion of private sector deliveries in Gujarat. Conclusion This paper reports a state-led, fully state-funded, large-scale public-private partnership to improve poor women’s access to institutional delivery - there have been >600,000 beneficiaries. While caesarean proportions are higher under CY than before, it is uncertain if all beneficiaries who require sections receive these. Other issues to explore include quality of care, provider attrition and the relatively low coverage. PMID:24787692

  1. The Earth Microbiome Project and Global Systems Biology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Jack A.; Jansson, Janet K.; Knight, Rob

    Recently, we published the first large-scale analysis of data from the Earth Microbiome Project (1, 2), a truly multidisciplinary research program involving more than 500 scientists and 27,751 samples acquired from 43 countries. These samples represent myriad specimen types and span a wide range of biotic and abiotic factors, geographic locations, and physicochemical properties. The database (https://qiita.ucsd.edu/emp/) is still growing, with over 90,000 amplicon datasets, >500 metagenomic runs, and metabolomics datasets from a similar number of samples. Importantly, the techniques, data and analytical tools are all standardized and publicly accessible, providing a framework to support research at a scale of integration that just 7 years ago seemed impossible.

  2. Air-Sea Interaction

    NASA Astrophysics Data System (ADS)

    Csanady, G. T.

    2001-03-01

    In recent years air-sea interaction has emerged as a subject in its own right, encompassing small-scale and large-scale processes in both air and sea. Air-Sea Interaction: Laws and Mechanisms is a comprehensive account of how the atmosphere and the ocean interact to control the global climate, what physical laws govern this interaction, and its prominent mechanisms. The topics covered range from evaporation in the oceans, to hurricanes, and on to poleward heat transport by the oceans. By developing the subject from basic physical (thermodynamic) principles, the book is accessible to graduate students and research scientists in meteorology, oceanography, and environmental engineering. It will also be of interest to the broader physics community involved in the treatment of transfer laws, and thermodynamics of the atmosphere and ocean.

  3. Distributed clinical data sharing via dynamic access-control policy transformation.

    PubMed

    Rezaeibagha, Fatemeh; Mu, Yi

    2016-05-01

    Data sharing in electronic health record (EHR) systems is important for improving the quality of healthcare delivery. Data sharing, however, has raised some security and privacy concerns because healthcare data could be potentially accessible by a variety of users, which could lead to privacy exposure of patients. Without addressing this issue, large-scale adoption and sharing of EHR data are impractical. The traditional solution to the problem is via encryption. Although encryption can be applied to access control, it is not applicable for complex EHR systems that require multiple domains (e.g. public and private clouds) with various access requirements. This study was carried out to address the security and privacy issues of EHR data sharing with our novel access-control mechanism, which captures the scenario of hybrid clouds and the need for access-control policy transformation, to provide secure and privacy-preserving data sharing among different healthcare enterprises. We introduce an access-control mechanism with some cryptographic building blocks and present a novel approach for secure EHR data sharing and access-control policy transformation in EHR systems for hybrid clouds. We propose a useful data sharing system for healthcare providers to handle various EHR users who have various access privileges in different cloud environments. A systematic study has been conducted on data sharing in EHR systems to provide a solution to the security and privacy issues. In conclusion, we introduce an access-control method for privacy protection of EHRs and EHR policy transformation that allows an EHR access-control policy to be transformed from a private cloud to a public cloud. This method has never been studied previously in the literature. Furthermore, we provide a protocol to demonstrate policy transformation as an application scenario. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  4. Estimating occupancy in large landscapes: evaluation of amphibian monitoring in the greater Yellowstone ecosystem

    USGS Publications Warehouse

    Gould, William R.; Patla, Debra A.; Daley, Rob; Corn, Paul Stephen; Hossack, Blake R.; Bennetts, Robert E.; Peterson, Charles R.

    2012-01-01

    Monitoring of natural resources is crucial to ecosystem conservation, and yet it can pose many challenges. Annual surveys for amphibian breeding occupancy were conducted in Yellowstone and Grand Teton National Parks over a 4-year period (2006–2009) at two scales: catchments (portions of watersheds) and individual wetland sites. Catchments were selected in a stratified random sample with habitat quality and ease of access serving as strata. All known wetland sites with suitable habitat were surveyed within selected catchments. Changes in breeding occurrence of tiger salamanders, boreal chorus frogs, and Columbia-spotted frogs were assessed using multi-season occupancy estimation. Numerous a priori models were considered within an information theoretic framework including those with catchment and site-level covariates. Habitat quality was the most important predictor of occupancy. Boreal chorus frogs demonstrated the greatest increase in breeding occupancy at the catchment level. Larger changes for all 3 species were detected at the finer site-level scale. Connectivity of sites explained occupancy rates more than other covariates, and may improve understanding of the dynamic processes occurring among wetlands within this ecosystem. Our results suggest monitoring occupancy at two spatial scales within large study areas is feasible and informative.
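
    Multi-season occupancy estimation rests on a simple recursion linking occupancy across seasons; a standard textbook form (the study's covariate structure is not reproduced here):

    ```latex
    % Dynamic (multi-season) occupancy model: occupancy probability psi in
    % season t+1 in terms of local extinction (epsilon) and local
    % colonization (gamma) between seasons:
    \psi_{t+1} = \psi_{t}\,(1 - \varepsilon_{t}) + (1 - \psi_{t})\,\gamma_{t}
    % Detection probability p is estimated jointly from repeat surveys, so
    % non-detection is not conflated with true absence.
    ```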

  5. A Ubiquitous Sensor Network Platform for Integrating Smart Devices into the Semantic Sensor Web

    PubMed Central

    de Vera, David Díaz Pardo; Izquierdo, Álvaro Sigüenza; Vercher, Jesús Bernat; Gómez, Luis Alfonso Hernández

    2014-01-01

    Ongoing Sensor Web developments make a growing amount of heterogeneous sensor data available to smart devices. This is generating an increasing demand for homogeneous mechanisms to access, publish and share real-world information. This paper discusses, first, an architectural solution based on Next Generation Networks: a pilot Telco Ubiquitous Sensor Network (USN) Platform that embeds several OGC® Sensor Web services. This platform has already been deployed in large scale projects. Second, the USN-Platform is extended to explore a first approach to Semantic Sensor Web principles and technologies, so that smart devices can access Sensor Web data, allowing them also to share richer (semantically interpreted) information. An experimental scenario is presented: a smart car that consumes and produces real-world information which is integrated into the Semantic Sensor Web through a Telco USN-Platform. Performance tests revealed that observation publishing times with our experimental system were well within limits compatible with the adequate operation of smart safety assistance systems in vehicles. On the other hand, response times for complex queries on large repositories may be inappropriate for rapid reaction needs. PMID:24945678

  6. A ubiquitous sensor network platform for integrating smart devices into the semantic sensor web.

    PubMed

    de Vera, David Díaz Pardo; Izquierdo, Alvaro Sigüenza; Vercher, Jesús Bernat; Hernández Gómez, Luis Alfonso

    2014-06-18

    Ongoing Sensor Web developments make a growing amount of heterogeneous sensor data available to smart devices. This is generating an increasing demand for homogeneous mechanisms to access, publish and share real-world information. This paper discusses, first, an architectural solution based on Next Generation Networks: a pilot Telco Ubiquitous Sensor Network (USN) Platform that embeds several OGC® Sensor Web services. This platform has already been deployed in large scale projects. Second, the USN-Platform is extended to explore a first approach to Semantic Sensor Web principles and technologies, so that smart devices can access Sensor Web data, allowing them also to share richer (semantically interpreted) information. An experimental scenario is presented: a smart car that consumes and produces real-world information which is integrated into the Semantic Sensor Web through a Telco USN-Platform. Performance tests revealed that observation publishing times with our experimental system were well within limits compatible with the adequate operation of smart safety assistance systems in vehicles. On the other hand, response times for complex queries on large repositories may be inappropriate for rapid reaction needs.

  7. Large-scale electrophysiology: acquisition, compression, encryption, and storage of big data.

    PubMed

    Brinkmann, Benjamin H; Bower, Mark R; Stengel, Keith A; Worrell, Gregory A; Stead, Matt

    2009-05-30

    The use of large-scale electrophysiology to obtain high spatiotemporal resolution brain recordings (>100 channels) capable of probing the range of neural activity from local field potential oscillations to single-neuron action potentials presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32kHz per channel with 18-bits of A/D resolution are capable of resolving extracellular voltages spanning single-neuron action potentials, high frequency oscillations, and high amplitude ultra-slow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real-time in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable, electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range-encoded differences, a 32-bit cyclically redundant checksum to ensure data integrity, and 128-bit encryption for protection of patient information.
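
    The quoted data volume follows directly from the acquisition parameters; a quick arithmetic check, assuming all 320 channels are recorded continuously:

    ```python
    # Back-of-envelope check of the ~3 TB/day figure quoted above.
    channels = 320
    sample_rate_hz = 32_000
    bytes_per_sample = 4
    seconds_per_day = 86_400

    bytes_per_day = channels * sample_rate_hz * bytes_per_sample * seconds_per_day
    print(f"{bytes_per_day / 1e12:.2f} TB/day")   # ~3.54 TB/day
    ```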

  8. Large-scale Electrophysiology: Acquisition, Compression, Encryption, and Storage of Big Data

    PubMed Central

    Brinkmann, Benjamin H.; Bower, Mark R.; Stengel, Keith A.; Worrell, Gregory A.; Stead, Matt

    2009-01-01

    The use of large-scale electrophysiology to obtain high spatiotemporal resolution brain recordings (>100 channels) capable of probing the range of neural activity from local field potential oscillations to single neuron action potentials presents new challenges for data acquisition, storage, and analysis. Our group is currently performing continuous, long-term electrophysiological recordings in human subjects undergoing evaluation for epilepsy surgery using hybrid intracranial electrodes composed of up to 320 micro- and clinical macroelectrode arrays. DC-capable amplifiers, sampling at 32 kHz per channel with 18-bits of A/D resolution are capable of resolving extracellular voltages spanning single neuron action potentials, high frequency oscillations, and high amplitude ultraslow activity, but this approach generates 3 terabytes of data per day (at 4 bytes per sample) using current data formats. Data compression can provide several practical benefits, but only if data can be compressed and appended to files in real-time in a format that allows random access to data segments of varying size. Here we describe a state-of-the-art, scalable, electrophysiology platform designed for acquisition, compression, encryption, and storage of large-scale data. Data are stored in a file format that incorporates lossless data compression using range encoded differences, a 32-bit cyclically redundant checksum to ensure data integrity, and 128-bit encryption for protection of patient information. PMID:19427545

  9. mySyntenyPortal: an application package to construct websites for synteny block analysis.

    PubMed

    Lee, Jongin; Lee, Daehwan; Sim, Mikang; Kwon, Daehong; Kim, Juyeon; Ko, Younhee; Kim, Jaebum

    2018-06-05

    Advances in sequencing technologies have facilitated large-scale comparative genomics based on whole genome sequencing. Constructing and investigating conserved genomic regions among multiple species (called synteny blocks) are essential in comparative genomics. However, they require significant amounts of computational resources and time in addition to bioinformatics skills. Many web interfaces have been developed to make such tasks easier. However, these web interfaces cannot be customized for users who want to use their own set of genome sequences or definition of synteny blocks. To resolve this limitation, we present mySyntenyPortal, a stand-alone application package to construct websites for synteny block analyses by using users' own genome data. mySyntenyPortal provides both command line and web-based interfaces to build and manage websites for large-scale comparative genomic analyses. The websites can also be easily published and accessed by other users. To demonstrate the usability of mySyntenyPortal, we present an example study for building websites to compare genomes of three mammalian species (human, mouse, and cow) and show how they can be easily utilized to identify potential genes affected by genome rearrangements. mySyntenyPortal will contribute to extended comparative genomic analyses based on large-scale whole genome sequences by providing unique functionality to support the easy creation of interactive websites for synteny block analyses from users' own genome data.

  10. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    NASA Astrophysics Data System (ADS)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

    When it comes to large scale mapping of limited areas, especially for cultural heritage sites, things become critical. Optical and non-optical sensors have been developed to such sizes and weights that they can be lifted by such platforms, like e.g. LiDAR units. At the same time there is an increasing emphasis on solutions that enable users to get access to 3D information faster and cheaper. Considering the multitude of platforms and cameras and the advancement of algorithms, in conjunction with the increase in available computing power, this challenge should be and indeed is further investigated. In this paper a short review of UAS technologies today is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The on-board cameras available are also compared and evaluated for large scale mapping. Furthermore a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process dense and mostly unordered sequences of digital images, is also conducted and presented. As a test data set, we use a rich optical and thermal data set from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations, using different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  11. The structure of Mediterranean rocky reef ecosystems across environmental and human gradients, and conservation implications

    USGS Publications Warehouse

    Sala, Enric; Ballesteros, Enric; Dendrinos, Panagiotis; Di Franco, Antonio; Ferretti, Francesco; Foley, David; Fraschetti, Simonetta; Friedlander, Alan M.; Garrabou, Joaquim; Guclusoy, Harun; Guidetti, Paolo; Halpern, Benjamin S.; Hereu, Bernat; Karamanlidis, Alexandros A.; Kizilkaya, Zafer; Macpherson, Enrique; Mangialajo, Luisa; Mariani, Simone; Micheli, Fiorenza; Pais, Antonio; Riser, Kristin; Rosenberg, Andrew A.; Sales, Marta; Selkoe, Kimberly A.; Starr, Rick; Tomas, Fiona; Zabala, Mikel

    2012-01-01

    Historical exploitation of the Mediterranean Sea and the absence of rigorous baselines make it difficult to evaluate the current health of the marine ecosystems and the efficacy of conservation actions at the ecosystem level. Here we establish the first current baseline and gradient of ecosystem structure of nearshore rocky reefs at the Mediterranean scale. We conducted underwater surveys in 14 marine protected areas and 18 open access sites across the Mediterranean, and across a 31-fold range of fish biomass (from 3.8 to 118 g m-2). Our data showed remarkable variation in the structure of rocky reef ecosystems. Multivariate analysis showed three alternative community states: (1) large fish biomass and reefs dominated by non-canopy algae, (2) lower fish biomass but abundant native algal canopies and suspension feeders, and (3) low fish biomass and extensive barrens, with areas covered by turf algae. Our results suggest that the healthiest shallow rocky reef ecosystems in the Mediterranean have both large fish and algal biomass. Protection level and primary production were the only variables significantly correlated to community biomass structure. Fish biomass was significantly larger in well-enforced no-take marine reserves, but there were no significant differences between multi-use marine protected areas (which allow some fishing) and open access areas at the regional scale. The gradients reported here represent a trajectory of degradation that can be used to assess the health of any similar habitat in the Mediterranean, and to evaluate the efficacy of marine protected areas.

  12. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and in addition to atlases of the human includes high quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  13. The structure of Mediterranean rocky reef ecosystems across environmental and human gradients, and conservation implications.

    PubMed

    Sala, Enric; Ballesteros, Enric; Dendrinos, Panagiotis; Di Franco, Antonio; Ferretti, Francesco; Foley, David; Fraschetti, Simonetta; Friedlander, Alan; Garrabou, Joaquim; Güçlüsoy, Harun; Guidetti, Paolo; Halpern, Benjamin S; Hereu, Bernat; Karamanlidis, Alexandros A; Kizilkaya, Zafer; Macpherson, Enrique; Mangialajo, Luisa; Mariani, Simone; Micheli, Fiorenza; Pais, Antonio; Riser, Kristin; Rosenberg, Andrew A; Sales, Marta; Selkoe, Kimberly A; Starr, Rick; Tomas, Fiona; Zabala, Mikel

    2012-01-01

    Historical exploitation of the Mediterranean Sea and the absence of rigorous baselines make it difficult to evaluate the current health of the marine ecosystems and the efficacy of conservation actions at the ecosystem level. Here we establish the first current baseline and gradient of ecosystem structure of nearshore rocky reefs at the Mediterranean scale. We conducted underwater surveys in 14 marine protected areas and 18 open access sites across the Mediterranean, and across a 31-fold range of fish biomass (from 3.8 to 118 g m(-2)). Our data showed remarkable variation in the structure of rocky reef ecosystems. Multivariate analysis showed three alternative community states: (1) large fish biomass and reefs dominated by non-canopy algae, (2) lower fish biomass but abundant native algal canopies and suspension feeders, and (3) low fish biomass and extensive barrens, with areas covered by turf algae. Our results suggest that the healthiest shallow rocky reef ecosystems in the Mediterranean have both large fish and algal biomass. Protection level and primary production were the only variables significantly correlated to community biomass structure. Fish biomass was significantly larger in well-enforced no-take marine reserves, but there were no significant differences between multi-use marine protected areas (which allow some fishing) and open access areas at the regional scale. The gradients reported here represent a trajectory of degradation that can be used to assess the health of any similar habitat in the Mediterranean, and to evaluate the efficacy of marine protected areas.

  14. Is scale-up of community mobilisation among sex workers really possible in complex urban environments? The case of Mumbai, India.

    PubMed

    Kongelf, Anine; Bandewar, Sunita V S; Bharat, Shalini; Collumbien, Martine

    2015-01-01

    In the last decade, community mobilisation (CM) interventions targeting female sex workers (FSWs) have been scaled up in India's national response to the HIV epidemic. This included the Bill and Melinda Gates Foundation's Avahan programme, which adopted a business approach to plan and manage implementation at scale. With the focus of evaluation efforts on measuring effectiveness and health impacts, there has been little analysis thus far of the interaction of the CM interventions with the sex work industry in complex urban environments. Between March and July 2012, semi-structured, in-depth interviews and focus group discussions were conducted with 63 HIV intervention implementers to explore challenges of HIV prevention among FSWs in Mumbai. A thematic analysis identified contextual factors that impact CM implementation. Large-scale interventions are not only impacted by, but were shown to shape, the dynamic social context. Registration practices and programme monitoring were experienced as stigmatising, reflected in shifting client preferences towards women not disclosing as 'sex workers'. This, combined with urban redevelopment and gentrification of traditional red-light areas, forced dispersal and more 'hidden' ways of solicitation, further challenging outreach and collectivisation. Participants reported that brothel owners and 'pimps' continued to restrict access to sex workers, and the heterogeneous 'community' of FSWs remains fragmented with high levels of mobility. Stakeholder engagement was poor, and mobilising around HIV prevention was not compelling. Interventions largely failed to respond to community needs, as strong target-orientation skewed activities towards those most easily measured and reported. Large-scale interventions have been impacted by, and contributed to, an increasingly complex sex work environment in Mumbai, challenging outreach and mobilisation efforts. Sex workers remain a vulnerable and disempowered group needing continued support and more comprehensive services.

  15. An explicit GIS-based river basin framework for aquatic ecosystem conservation in the Amazon

    NASA Astrophysics Data System (ADS)

    Venticinque, Eduardo; Forsberg, Bruce; Barthem, Ronaldo; Petry, Paulo; Hess, Laura; Mercado, Armando; Cañas, Carlos; Montoya, Mariana; Durigan, Carlos; Goulding, Michael

    2016-11-01

    Despite large-scale infrastructure development, deforestation, mining and petroleum exploration in the Amazon Basin, relatively little attention has been paid to the management scale required for the protection of wetlands, fisheries and other aspects of aquatic ecosystems. This is due, in part, to the enormous size, multinational composition and interconnected nature of the Amazon River system, as well as to the absence of an adequate spatial model for integrating data across the entire Amazon Basin. In this data article we present a spatially uniform multi-scale GIS framework that was developed especially for the analysis, management and monitoring of various aspects of aquatic systems in the Amazon Basin. The Amazon GIS-Based River Basin Framework is accessible as an ESRI geodatabase at doi:10.5063/F1BG2KX8.

  16. Phenomenology of strongly coupled chiral gauge theories

    DOE PAGES

    Bai, Yang; Berger, Joshua; Osborne, James; ...

    2016-11-25

    A sector with QCD-like strong dynamics is common in models of non-standard physics. Such a model could be accessible in LHC searches if both confinement and big-quarks charged under the confining group are at the TeV scale. Big-quark masses at this scale can be explained if the new fermions are chiral under a new U(1)' gauge symmetry such that their bare masses are related to the U(1)'-breaking and new confinement scales. Here we present a study of a minimal GUT-motivated and gauge anomaly-free model with implications for the LHC Run 2 searches. We find that the first signatures of such models could appear as two gauge boson resonances. The chiral nature of the model could be confirmed by observation of a Z'γ resonance, where the Z' naturally has a large leptonic branching ratio because of its kinetic mixing with the hypercharge gauge boson.

  17. Full Scale Software Support on Mobile Lightweight Devices by Utilization of All Types of Wireless Technologies

    NASA Astrophysics Data System (ADS)

    Krejcar, Ondrej

    New kinds of lightweight mobile devices can run full-scale applications with the same comfort as desktop devices, subject to several limitations. One of these is insufficient wireless transfer speed. The main area of interest is a model of a radio-frequency based enhancement of a mobile information system for locating and tracking its users. The experimental framework prototype uses a wireless network infrastructure to let a lightweight mobile device determine its indoor or outdoor position. User location is used for data prebuffering and for pushing information from the server to the user's PDA. All server data are saved as artifacts along with their position information within a building or larger-area environment. Accessing prebuffered data on a lightweight mobile device can greatly improve the response time needed to view large multimedia data. This can inform the design of new full-scale applications for lightweight mobile devices.
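
    A minimal sketch of the prebuffering idea described above. The artifact records, coordinates, and radius are hypothetical; in the framework itself the user position would come from the radio-frequency locating system rather than being passed in directly.

    ```python
    import math

    # Hypothetical artifact store: each server artifact carries a position.
    ARTIFACTS = [
        {"id": "map_floor1", "x": 10.0, "y": 4.0},
        {"id": "video_lab",  "x": 52.0, "y": 18.0},
        {"id": "doc_office", "x": 11.5, "y": 6.0},
    ]

    def prebuffer_candidates(user_x, user_y, radius=15.0):
        """Return ids of artifacts near the user's estimated position.

        The server would push these to the PDA before they are requested,
        hiding the slow wireless link behind the local cache.
        """
        return [a["id"] for a in ARTIFACTS
                if math.hypot(a["x"] - user_x, a["y"] - user_y) <= radius]

    print(prebuffer_candidates(12.0, 5.0))  # ['map_floor1', 'doc_office']
    ```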

  18. Large arterial occlusive strokes as a medical emergency: need to accurately predict clot location.

    PubMed

    Vanacker, Peter; Faouzi, Mohamed; Eskandari, Ashraf; Maeder, Philippe; Meuli, Reto; Michel, Patrik

    2017-10-01

    Endovascular treatment for acute ischemic stroke with a large intracranial occlusion was recently shown to be effective. Timely knowledge of the presence, site, and extent of arterial occlusions in the ischemic territory has the potential to influence patient selection for endovascular treatment. We aimed to find predictors of large vessel occlusive strokes, on the basis of available demographic, clinical, radiological, and laboratory data in the emergency setting. Patients enrolled in the ASTRAL registry with acute ischemic stroke and computed tomography (CT)-angiography within 12 h of stroke onset were selected and categorized according to occlusion site. Easily accessible variables were used in a multivariate analysis. Of 1645 patients enrolled, a significant proportion (46.2%) had a large vessel occlusion in the ischemic territory. The main clinical predictors of any arterial occlusion were in-hospital stroke [odds ratio (OR) 2.1, 95% confidence interval 1.4-3.1], higher initial National Institutes of Health Stroke Scale (OR 1.1, 1.1-1.2), presence of visual field defects (OR 1.9, 1.3-2.6), dysarthria (OR 1.4, 1.0-1.9), or hemineglect (OR 2.0, 1.4-2.8) at admission and atrial fibrillation (OR 1.7, 1.2-2.3). Further, the following radiological predictors were identified: time-to-imaging (OR 0.9, 0.9-1.0), early ischemic changes (OR 2.3, 1.7-3.2), and silent lesions on CT (OR 0.7, 0.5-1.0). The area under the curve for this analysis was 0.85. Looking at different occlusion sites, National Institutes of Health Stroke Scale and early ischemic changes on CT were independent predictors in all subgroups. Neurological deficits, stroke risk factors, and CT findings accurately identify acute ischemic stroke patients at risk of symptomatic vessel occlusion. Predicting the presence of these occlusions may impact emergency stroke care in regions with limited access to noninvasive vascular imaging.
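
    A hedged illustration of how the reported odds ratios could combine into a bedside probability via a logistic model. The intercept (baseline log-odds) is not given in the abstract and is invented here; the authors' fitted model will differ.

    ```python
    import math

    # Odds ratios reported in the abstract; the intercept is NOT reported
    # and the value below is invented purely for illustration.
    LOG_OR = {
        "in_hospital_stroke":  math.log(2.1),
        "nihss_point":         math.log(1.1),  # per NIHSS point
        "visual_field_defect": math.log(1.9),
        "dysarthria":          math.log(1.4),
        "hemineglect":         math.log(2.0),
        "atrial_fibrillation": math.log(1.7),
    }
    INTERCEPT = -2.5  # hypothetical baseline log-odds

    def occlusion_probability(features):
        """Logistic combination of risk factors -> P(large vessel occlusion)."""
        z = INTERCEPT + sum(LOG_OR[k] * v for k, v in features.items())
        return 1.0 / (1.0 + math.exp(-z))

    # Example: NIHSS 14 with hemineglect and atrial fibrillation.
    print(occlusion_probability({"nihss_point": 14, "hemineglect": 1,
                                 "atrial_fibrillation": 1}))  # ~0.51
    ```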

  19. A practical large scale/high speed data distribution system using 8 mm libraries

    NASA Technical Reports Server (NTRS)

    Howard, Kevin

    1993-01-01

    Eight mm tape libraries are known primarily for their small size, large storage capacity, and low cost. However, many applications require an additional attribute which, heretofore, has been lacking -- high transfer rate. Transfer rate is particularly important in a large scale data distribution environment -- an environment in which 8 mm tape should play a very important role. Data distribution is a natural application for 8 mm for several reasons: most large laboratories have access to 8 mm tape drives, 8 mm tapes are upwardly compatible, 8 mm media are very inexpensive, 8 mm media are light weight (important for shipping purposes), and 8 mm media densely pack data (5 gigabytes now and 15 gigabytes on the horizon). If the transfer rate issue were resolved, 8 mm could offer a good solution to the data distribution problem. To that end Exabyte has analyzed four ways to increase its transfer rate: native drive transfer rate increases, data compression at the drive level, tape striping, and homogeneous drive utilization. Exabyte is actively pursuing native drive transfer rate increases and drive level data compression. However, for non-transmitted bulk data applications (which include data distribution) the other two methods (tape striping and homogeneous drive utilization) hold promise.
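
    Of the four methods listed, tape striping is the most algorithmic: a stream is split into fixed-size blocks written round-robin across several drives, so the aggregate rate scales roughly with drive count. A minimal sketch, with block size and drive count as arbitrary choices:

    ```python
    def stripe(data: bytes, n_drives: int, block: int = 1024):
        """Split a stream into fixed-size blocks written round-robin across
        n_drives tapes; drives writing concurrently multiply throughput."""
        tapes = [bytearray() for _ in range(n_drives)]
        for i in range(0, len(data), block):
            tapes[(i // block) % n_drives].extend(data[i:i + block])
        return tapes

    def unstripe(tapes, block: int = 1024) -> bytes:
        """Reassemble the original stream by re-interleaving the blocks."""
        out, offsets, d = bytearray(), [0] * len(tapes), 0
        while offsets[d] < len(tapes[d]):
            out.extend(tapes[d][offsets[d]:offsets[d] + block])
            offsets[d] += block
            d = (d + 1) % len(tapes)
        return bytes(out)

    payload = bytes(range(256)) * 40      # 10,240 bytes of test data
    assert unstripe(stripe(payload, 4)) == payload
    ```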

  20. Open access, library and publisher competition, and the evolution of general commerce.

    PubMed

    Odlyzko, Andrew M

    2015-02-01

    Discussions of the economics of scholarly communication are usually devoted to Open Access, rising journal prices, publisher profits, and boycotts. That ignores what seems to be a much more important development in this market. Publishers, through the oft-reviled Big Deal packages, are providing much greater and more egalitarian access to the journal literature, an approximation to true Open Access. In the process, they are also marginalizing libraries and obtaining a greater share of the resources going into scholarly communication. This is enabling a continuation of publisher profits as well as of what for decades has been called "unsustainable journal price escalation." It is also inhibiting the spread of Open Access and potentially leading to an oligopoly of publishers controlling distribution through large-scale licensing. The Big Deal practices are worth studying for several general reasons. The degree to which publishers succeed in diminishing the role of libraries may be an indicator of the degree and speed at which universities transform themselves. More importantly, these Big Deals appear to point the way to the future of the whole economy, where progress is characterized by declining privacy, increasing price discrimination, increasing opaqueness in pricing, increasing reliance on low-paid or unpaid work of others for profits, and business models that depend on customer inertia. © The Author(s) 2014.

  1. A Spatial Method to Calculate Small-Scale Fisheries Extent

    NASA Astrophysics Data System (ADS)

    Johnson, A. F.; Moreno-Báez, M.; Giron-Nava, A.; Corominas, J.; Erisman, B.; Ezcurra, E.; Aburto-Oropeza, O.

    2016-02-01

    Despite global catch per unit effort having redoubled since the 1950s, the global fishing fleet is estimated to be twice the size that the oceans can sustainably support. In order to gauge the collateral impacts of fishing intensity, we must be able to estimate the spatial extent and number of fishing vessels in the oceans. The methods that currently exist are built around electronic tracking and logbook systems and generally focus on industrial fisheries. Spatial extent therefore remains elusive for many small-scale fishing fleets, even though these fisheries land the same biomass for human consumption as industrial fisheries. Current methods are data-intensive and require extensive extrapolation when estimated across large spatial scales. We present an accessible, spatial method of calculating the extent of small-scale fisheries based on two simple measures that are available, or at least easily estimable, in even the most data-poor fisheries: the number of boats and the local coastal human population. We demonstrate that this method is fishery-type independent and can be used to quantitatively evaluate the efficacy of growth in small-scale fisheries. This method provides an important first step towards estimating the fishing extent of the small-scale fleet globally.
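
    The published method is not reproduced here; the sketch below only illustrates the idea of estimating fished area from the two stated inputs. The functional form and every coefficient are hypothetical stand-ins, not the authors' fitted values.

    ```python
    def fishing_extent_km2(n_boats, coastal_pop,
                           km2_per_boat=25.0, pop_per_boat_scale=5000.0):
        """Toy estimate of the area fished by a small-scale fleet.

        The two inputs are the ones named in the abstract; the functional
        form and both coefficients are hypothetical stand-ins, not the
        authors' fitted relationship.
        """
        # Crude proxy: more people per boat implies more trips per boat.
        effort = 1.0 + coastal_pop / (pop_per_boat_scale * max(n_boats, 1))
        return n_boats * km2_per_boat * effort

    # A fleet of 40 boats serving a coastal town of 12,000 people.
    print(f"{fishing_extent_km2(40, 12000):.0f} km^2")  # ~1060 km^2
    ```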

  2. Conservation of Pollinators in Traditional Agricultural Landscapes – New Challenges in Transylvania (Romania) Posed by EU Accession and Recommendations for Future Research

    PubMed Central

    Kovács-Hostyánszki, Anikó; Földesi, Rita; Mózes, Edina; Szirák, Ádám; Fischer, Joern; Hanspach, Jan; Báldi, András

    2016-01-01

    Farmland biodiversity is strongly declining in most of Western Europe, but still survives in traditional low-intensity agricultural landscapes in Central and Eastern Europe. Accession to the EU, however, intensifies agriculture, which leads to the vanishing of traditional farming. Our aim was to describe the pollinator assemblages of the last remnants of these landscapes, thus setting the baseline of sustainable farming for pollination, and to highlight potential measures of conservation. In these traditional farmlands in the Transylvanian Basin, Romania (EU accession in 2007), we studied the major pollinator groups: wild bees, hoverflies and butterflies. Landscape-scale effects of semi-natural habitats, land cover diversity, heterogeneity, and woody vegetation cover, as well as on-site flower resources, were tested on pollinator communities in traditionally managed arable fields and grasslands. Our results showed: (i) semi-natural habitats at the landscape scale have a positive effect on most pollinators, especially in the case of low heterogeneity in the direct vicinity of the studied sites; (ii) both arable fields and grasslands hold abundant flower resources, and thus both land use types are important in sustaining pollinator communities; (iii) pollinator conservation can therefore rely even on arable fields under a traditional management regime. This carries the indirect message that the tiny flower margins around large intensive fields in Western Europe may be insufficient conservation measures to restore pollinator communities at the landscape scale, as they are still far from the baseline of necessary flower resources. This hypothesis needs further study, which should include more traditional landscapes providing such a baseline, and exploration of other factors behind the lower-than-baseline biodiversity values of fields under agri-environmental schemes (AES). PMID:27285118

  3. Conservation of Pollinators in Traditional Agricultural Landscapes - New Challenges in Transylvania (Romania) Posed by EU Accession and Recommendations for Future Research.

    PubMed

    Kovács-Hostyánszki, Anikó; Földesi, Rita; Mózes, Edina; Szirák, Ádám; Fischer, Joern; Hanspach, Jan; Báldi, András

    2016-01-01

    Farmland biodiversity is strongly declining in most of Western Europe, but still survives in traditional low-intensity agricultural landscapes in Central and Eastern Europe. Accession to the EU, however, intensifies agriculture, which leads to the vanishing of traditional farming. Our aim was to describe the pollinator assemblages of the last remnants of these landscapes, thus setting the baseline of sustainable farming for pollination, and to highlight potential measures of conservation. In these traditional farmlands in the Transylvanian Basin, Romania (EU accession in 2007), we studied the major pollinator groups: wild bees, hoverflies and butterflies. Landscape-scale effects of semi-natural habitats, land cover diversity, heterogeneity, and woody vegetation cover, as well as on-site flower resources, were tested on pollinator communities in traditionally managed arable fields and grasslands. Our results showed: (i) semi-natural habitats at the landscape scale have a positive effect on most pollinators, especially in the case of low heterogeneity in the direct vicinity of the studied sites; (ii) both arable fields and grasslands hold abundant flower resources, and thus both land use types are important in sustaining pollinator communities; (iii) pollinator conservation can therefore rely even on arable fields under a traditional management regime. This carries the indirect message that the tiny flower margins around large intensive fields in Western Europe may be insufficient conservation measures to restore pollinator communities at the landscape scale, as they are still far from the baseline of necessary flower resources. This hypothesis needs further study, which should include more traditional landscapes providing such a baseline, and exploration of other factors behind the lower-than-baseline biodiversity values of fields under agri-environmental schemes (AES).

  4. A Thermal Technique of Fault Nucleation, Growth, and Slip

    NASA Astrophysics Data System (ADS)

    Garagash, D.; Germanovich, L. N.; Murdoch, L. C.; Martel, S. J.; Reches, Z.; Elsworth, D.; Onstott, T. C.

    2009-12-01

    Fractures and fluids influence virtually all mechanical processes in the crust, but many aspects of these processes remain poorly understood, largely because of a lack of controlled field experiments at appropriate scales. We have developed an in-situ experimental approach to create carefully controlled faults at a scale of ~10 meters, using thermal techniques to modify in situ stresses to the point where the rock fails in shear. This approach extends experiments on fault nucleation and growth to length scales 2-3 orders of magnitude greater than are currently possible in the laboratory. The experiments could be done at depths where the modified in situ stresses are sufficient to drive faulting, obviating the need for unrealistically large loading frames. Such experiments require access to large rock volumes in the deep subsurface in a controlled setting. The Deep Underground Science and Engineering Laboratory (DUSEL), a research facility planned to occupy the workings of the former Homestake gold mine in the northern Black Hills, South Dakota, presents an opportunity for accessing locations with vertical stresses as large as 60 MPa (down to 2400 m depth), which is sufficient to create faults. One of the most promising methods for manipulating stresses to create faults that we have evaluated involves drilling two parallel planar arrays of boreholes and circulating cold fluid (e.g., liquid nitrogen) to chill the region in the vicinity of the boreholes. Cooling a relatively small region around each borehole causes the rock to contract, reducing the normal compressive stress throughout the much larger region between the arrays of boreholes. This scheme was evaluated using both scaling analysis and a finite element code. Our results show that if the boreholes are spaced ~1 m apart, then in several days to weeks the normal compressive stress can be reduced by 10 MPa or more, and it is even possible to create net tension between the borehole arrays. According to the Mohr-Coulomb strength criterion with standard Byerlee parameters, a fault will initiate before the net tension occurs. After a new fault is created, hot fluid can be injected into the boreholes to increase the temperature and reverse the direction of fault slip. This process can be repeated to study the formation of gouge, and how the properties of gouge control fault slip and associated seismicity. Instrumenting the site with arrays of geophones, tiltmeters, strain gauges, and displacement transducers, as well as back mining - an opportunity provided by the DUSEL project - can reveal details of the fault geometry and gouge. We also expect to find small faults (with cm-scale displacement) during construction of DUSEL drifts. The same thermal technique can be used to induce slip on one of them and compare the “man-made” and natural gouges. The thermal technique appears to be a relatively simple way to rapidly change the stress field and either create slip on existing fractures or create new faults at scales up to 10 m or more.
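
    An order-of-magnitude check on the cooling scheme, as a sketch assuming generic crystalline-rock properties and the standard plane-strain thermoelastic relation; it is not a substitute for the scaling analysis and finite element modelling mentioned above.

    ```python
    # Order-of-magnitude check on the borehole-cooling scheme, using the
    # plane-strain thermoelastic relation  d_sigma = E * alpha * dT / (1 - nu).
    # Property values are generic for crystalline rock, not site measurements.
    E     = 60e9     # Young's modulus, Pa
    alpha = 8e-6     # linear thermal expansion coefficient, 1/K
    nu    = 0.25     # Poisson's ratio
    dT    = -30.0    # cooling near the borehole arrays, K

    d_sigma = E * alpha * dT / (1 - nu)
    print(f"local stress change: {d_sigma / 1e6:.1f} MPa")   # about -19 MPa
    ```

    Tens of MPa of local stress relief near the boreholes is broadly consistent with the region-averaged reduction of 10 MPa or more quoted above.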

  5. Global assessment of the status of coral reef herbivorous fishes: evidence for fishing effects.

    PubMed

    Edwards, C B; Friedlander, A M; Green, A G; Hardt, M J; Sala, E; Sweatman, H P; Williams, I D; Zgliczynski, B; Sandin, S A; Smith, J E

    2014-01-07

    On coral reefs, herbivorous fishes consume benthic primary producers and regulate competition between fleshy algae and reef-building corals. Many of these species are also important fishery targets, yet little is known about their global status. Using a large-scale synthesis of peer-reviewed and unpublished data, we examine variability in abundance and biomass of herbivorous reef fishes and explore evidence for fishing impacts globally and within regions. We show that biomass is more than twice as high in locations not accessible to fisheries relative to fisheries-accessible locations. Although there are large biogeographic differences in total biomass, the effects of fishing are consistent in nearly all regions. We also show that exposure to fishing alters the structure of the herbivore community by disproportionately reducing biomass of large-bodied functional groups (scraper/excavators, browsers, grazer/detritivores), while increasing biomass and abundance of territorial algal-farming damselfishes (Pomacentridae). The browser functional group that consumes macroalgae and can help to prevent coral-macroalgal phase shifts appears to be most susceptible to fishing. This 'fishing down' of the herbivore guild probably alters the effectiveness of these fishes in regulating algal abundance on reefs. Finally, data from remote and unfished locations provide important baselines for setting management and conservation targets for this important group of fishes.

  6. Accessibility Measures: Formulation Considerations and Current Applications

    DOT National Transportation Integrated Search

    2000-09-01

    This report examines micro-scale and macro-scale factors for inclusion in an ideal accessibility measure. Their potential influence on the evaluation of mode choice and destination choice is discussed. Availability in Texas' major cities is presented...

  7. “All for some”: water inequity in Zambia and Zimbabwe

    NASA Astrophysics Data System (ADS)

    Robinson, Peter B.

    In southern Africa, gross disparities in access to water are symptomatic of the overall uneven pattern of development. Despite post-independence egalitarian rhetoric, in countries such as Zambia and Zimbabwe, inappropriate models (piped house connections in urban areas, high-technology irrigation schemes in the agricultural sector), combined with weak macro-economies and poorly formulated sectoral policies, have actually exacerbated the disparities. Zero or very low tariffs have played a major role in this. Although justified as being consistent with water's special status, inadequate tariffs in fact serve to undermine any programme of making water accessible to all. This has led to a narrowing of development options, resulting in exclusivist rather than inclusivist development, and stagnation rather than dynamism. A major part of the explanation for the perpetuation of such unsatisfactory outcomes is the existence of political interest groups who benefit from the status quo. The first case study in the paper involves urban water consumers in Zambia, where those with piped water connections seek to continue the culture of low tariffs, which is by now deeply embedded. The result is that the water supply authorities (in this case the newly formed, but still politically constrained 'commercialised utilities') are unable even to maintain adequate supplies to the piped customers, let alone extend service to the peri-urban dwellers, 56% of whom do not have access to safe water. The paper outlines some modest, workable principles to achieve universal, affordable access to water in the urban areas, albeit through a mix of service delivery mechanisms. In a second case study of rural productive water in Zimbabwe, the reasons for only 2% of the rural subsistence farming households being involved in formal small-scale irrigation schemes 20 years after independence are explored. Again, a major part of the explanation lies in government pursuing a water delivery model which is not affordable or sustainable on a wide scale. Its provision, via substantial capital and recurrent subsidies, for a small group has a large opportunity cost for society as a whole. The small-scale irrigators have a vested interest in ensuring that the subsidies are maintained, but in the process continue to absorb a disproportionate amount of resources which could be used for development elsewhere. By choosing simpler, cheaper water technologies, and assisting farmers with growing and marketing high-value crops, the resources could instead be used to benefit a much larger proportion of households. With well-designed programmes aimed at achieving equity, large numbers of subsistence farmers could improve their incomes and start working their way out of poverty.

  8. NFFA-Europe: enhancing European competitiveness in nanoscience research and innovation (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Carsughi, Flavio; Fonseca, Luis

    2017-06-01

    NFFA-EUROPE is a European open-access resource for experimental and theoretical nanoscience that sets out a platform for comprehensive, multidisciplinary research projects at the nanoscale, extending from synthesis to nano-characterization to theory and numerical simulation. Advanced infrastructures specialized in growth, nano-lithography, nano-characterization, theory and simulation, and fine analysis with synchrotron, FEL, and neutron radiation sources are integrated in a multi-site combination to develop frontier research on methods for reproducible nanoscience and to enable European and international researchers from diverse disciplines to carry out advanced proposals impacting science and innovation. NFFA-EUROPE will enable coordinated access to infrastructures covering different aspects of nanoscience research that is not currently available at single specialized facilities, without duplicating their specific scopes. Approved user projects will have access to the best-suited instruments and support competences for performing the research, including access to analytical large-scale facilities, theory and simulation, and high-performance computing facilities. Access is offered free of charge to European users, who will receive a financial contribution towards their travel, accommodation, and subsistence costs. User access will span several "installations" and will be coordinated through a single entry-point portal that activates an advanced user-infrastructure dialogue to build up a personalized access programme with an increasing return on science and innovation production. NFFA-EUROPE's own research activity will address key bottlenecks of nanoscience research: nanostructure traceability, protocol reproducibility, in-operando nano-manipulation and analysis, and open data.

  9. High-Resolution Climate Data Visualization through GIS- and Web-based Data Portals

    NASA Astrophysics Data System (ADS)

    WANG, X.; Huang, G.

    2017-12-01

    Sound decisions on climate change adaptation rely on an in-depth assessment of potential climate change impacts at regional and local scales, which usually requires finer-resolution climate projections at both spatial and temporal scales. However, effective downscaling of global climate projections is practically difficult due to the lack of computational resources and/or long-term reference data. Although a large volume of downscaled climate data has been made available to the public, how to understand and interpret the large-volume climate data and how to make use of the data to drive impact assessment and adaptation studies are still challenging for both impact researchers and decision makers. Such difficulties have become major barriers preventing informed climate change adaptation planning at regional scales. Therefore, this research will explore new GIS- and web-based technologies to help visualize the large-volume regional climate data with high spatiotemporal resolutions. A user-friendly public data portal, named Climate Change Data Portal (CCDP, http://ccdp.network), will be established to allow intuitive and open access to high-resolution regional climate projections at local scales. The CCDP offers functions of visual representation through geospatial maps and data downloading for a variety of climate variables (e.g., temperature, precipitation, relative humidity, solar radiation, and wind) at multiple spatial resolutions (i.e., 25-50 km) and temporal resolutions (i.e., annual, seasonal, monthly, daily, and hourly). The vast amount of information the CCDP encompasses can provide a crucial basis for assessing impacts of climate change on local communities and ecosystems and for supporting better decision making under a changing climate.

  10. HTS-DB: an online resource to publish and query data from functional genomics high-throughput siRNA screening projects.

    PubMed

    Saunders, Rebecca E; Instrell, Rachael; Rispoli, Rossella; Jiang, Ming; Howell, Michael

    2013-01-01

    High-throughput screening (HTS) uses technologies such as RNA interference to generate loss-of-function phenotypes on a genomic scale. As these technologies become more popular, many research institutes have established core facilities of expertise to deal with the challenges of large-scale HTS experiments. As the efforts of core facility screening projects come to fruition, focus has shifted towards managing the results of these experiments and making them available in a useful format that can be further mined for phenotypic discovery. The HTS-DB database provides a public view of data from screening projects undertaken by the HTS core facility at the CRUK London Research Institute. All projects and screens are described with comprehensive assay protocols, and datasets are provided with complete descriptions of analysis techniques. This format allows users to browse and search data from large-scale studies in an informative and intuitive way. It also provides a repository for additional measurements obtained from screens that were not the focus of the project, such as cell viability, and groups these data so that it can provide a gene-centric summary across several different cell lines and conditions. All datasets from our screens that can be made available can be viewed interactively and mined for further hit lists. We believe that in this format, the database provides researchers with rapid access to results of large-scale experiments that might facilitate their understanding of genes/compounds identified in their own research. DATABASE URL: http://hts.cancerresearchuk.org/db/public.

  11. Estimating planktonic diversity through spatial dominance patterns in a model ocean.

    PubMed

    Soccodato, Alice; d'Ovidio, Francesco; Lévy, Marina; Jahn, Oliver; Follows, Michael J; De Monte, Silvia

    2016-10-01

    In the open ocean, the observation and quantification of biodiversity patterns is challenging. Marine ecosystems are indeed largely composed of microbial planktonic communities whose niches are affected by highly dynamic physico-chemical conditions, and whose observation requires advanced methods for morphological and molecular classification. Optical remote sensing offers an appealing complement to these in-situ techniques. Global-scale coverage at high spatiotemporal resolution is, however, achieved at the cost of limited information on the local assemblage. Here, we use a coupled physical and ecological model ocean simulation to explore one possible metric for comparing measures performed on such different scales. We show that a large part of the local diversity of the virtual plankton ecosystem - corresponding to what is accessible by genomic methods - can be inferred from crude, but spatially extended, information - as conveyed by remote sensing. Shannon diversity of the local community is indeed highly correlated to a 'seascape' index, which quantifies the surrounding spatial heterogeneity of the most abundant functional group. The error implied in drastically reducing the resolution of the plankton community is shown to be smaller in frontal regions as well as in regions of intermediate turbulent energy. On the spatial scale of hundreds of kilometres, patterns of virtual plankton diversity are thus largely sustained by mixing communities that occupy adjacent niches. We provide a proof of principle that in the open ocean information on spatial variability of communities can compensate for limited local knowledge, suggesting the possibility of integrating in-situ and satellite observations to monitor biodiversity distribution at the global scale. Copyright © 2016 Elsevier B.V. All rights reserved.
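
    A minimal sketch of the two quantities being compared: local Shannon diversity and a neighbourhood heterogeneity ('seascape') index of the dominant functional group. The index definition below is a stand-in, not the paper's exact formula.

    ```python
    import numpy as np

    def shannon(abundances):
        """Shannon diversity H = -sum(p_i * ln p_i) of one local community."""
        p = np.asarray(abundances, dtype=float)
        p = p[p > 0] / p.sum()
        return float(-(p * np.log(p)).sum())

    def seascape_index(dominant, i, j, r=2):
        """Fraction of neighbouring cells whose dominant functional group
        differs from cell (i, j) -- a stand-in heterogeneity measure, not
        the paper's exact definition."""
        patch = dominant[max(i - r, 0):i + r + 1, max(j - r, 0):j + r + 1]
        return float((patch != dominant[i, j]).mean())

    rng = np.random.default_rng(0)
    dominant = rng.integers(0, 4, size=(20, 20))   # dominant group per cell
    print(shannon([40, 30, 20, 10]))               # ~1.28
    print(seascape_index(dominant, 10, 10))
    ```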

  12. Architectures of fiber optic network in telecommunications

    NASA Astrophysics Data System (ADS)

    Vasile, Irina B.; Vasile, Alexandru; Filip, Luminita E.

    2005-08-01

    The operators of telecommunications have targeted their efforts towards realizing applications using broadband fiber optic systems in the access network. Thus, a new concept related to the implementation of fiber optic transmission systems, named FITL (Fiber In The Loop), has appeared. Fiber optic transmission systems have been extensively used for realizing the transport and interconnection of the public telecommunication network, as well as for assuring access to the telecommunication systems of large corporations. Still, the segment of residential users and small corporations has not benefited on a large scale from this technology. For the purpose of defining fiber optic applications, several types of architectures were conceived: bus, ring, star, and tree. Tree-like networks use passive splitters (hence the name PON, Passive Optical Network), which significantly reduce the cost of fiber optic access by sharing the cost of the optoelectronic components. That is why passive fiber optic architectures (PONs) represent a viable solution for realizing access at the user's loop. The main types of fiber optic architectures included in this work are: FTTC (Fiber To The Curb); FTTB (Fiber To The Building); FTTH (Fiber To The Home).

  13. Efficient traffic grooming with dynamic ONU grouping for multiple-OLT-based access network

    NASA Astrophysics Data System (ADS)

    Zhang, Shizong; Gu, Rentao; Ji, Yuefeng; Wang, Hongxiang

    2015-12-01

    Fast bandwidth growth urges large-scale, high-density access scenarios, where clustered deployment of multiple Passive Optical Network (PON) systems can be adopted as an appropriate solution to fulfill the huge bandwidth demands, especially for future 5G mobile networks. However, the lack of interaction between different optical line terminals (OLTs) results in part of the bandwidth resources being wasted. To increase bandwidth efficiency, as well as reduce bandwidth pressure at the edge of the network, we propose a centralized flexible PON architecture based on Time- and Wavelength-Division Multiplexing PON (TWDM PON). It can provide flexible affiliation between optical network units (ONUs) and different OLTs to support access network traffic localization. Specifically, a dynamic ONU grouping algorithm (DGA) is provided to obtain the minimal OLT outbound traffic. Simulation results show that DGA obtains an average 25.23% traffic gain increment across different OLT counts when the number of ONUs is small, and the traffic gain increases dramatically with the number of ONUs. As the DGA can be deployed easily as an application running on top of the centralized control plane, the proposed architecture can help improve network efficiency for future traffic-intensive access scenarios.
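
    A toy illustration of the grouping objective (keep mutually communicating ONUs on the same OLT so that less traffic leaves via the uplink). This greedy sketch is not the paper's DGA; the names and the hard capacity model are assumptions.

    ```python
    def group_onus(pair_traffic, n_olts, capacity):
        """Greedily attach ONUs to OLTs so mutual traffic stays local.

        pair_traffic[(u, v)] is the offered load between ONUs u and v;
        traffic whose endpoints share an OLT never leaves via the uplink.
        A toy stand-in for the paper's DGA, with a hard per-OLT capacity.
        """
        onus = sorted({u for pair in pair_traffic for u in pair})
        groups = [[] for _ in range(n_olts)]

        def affinity(u, group):
            return sum(pair_traffic.get((u, v), 0) + pair_traffic.get((v, u), 0)
                       for v in group)

        for u in onus:
            open_groups = [g for g in groups if len(g) < capacity]
            best = max(open_groups, key=lambda g: affinity(u, g))
            best.append(u)
        return groups

    demo = {(0, 1): 9, (0, 2): 1, (2, 3): 8, (1, 3): 1}
    print(group_onus(demo, n_olts=2, capacity=2))  # [[0, 1], [2, 3]]
    ```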

  14. 'Time is costly': modelling the macroeconomic impact of scaling-up antiretroviral treatment in sub-Saharan Africa.

    PubMed

    Ventelou, Bruno; Moatti, Jean-Paul; Videau, Yann; Kazatchkine, Michel

    2008-01-02

    Macroeconomic policy requirements may limit the capacity of national and international policy-makers to allocate sufficient resources for scaling up access to HIV care and treatment in developing countries. An endogenous growth model, which takes into account the evolution of society's human capital, was used to assess the macroeconomic impact of policies aimed at scaling up access to HIV/AIDS treatment in six African countries (Angola, Benin, Cameroon, Central African Republic, Ivory Coast and Zimbabwe). The model results showed that scaling up access to treatment in the affected population would limit gross domestic product losses due to AIDS, although to different degrees from country to country. In our simulated scenarios of access to antiretroviral therapy, only 10.3% of the AIDS shock is counterbalanced in Zimbabwe, against 85.2% in Angola and even 100.0% in Benin (a total recovery). For four out of the six countries (Angola, Benin, Cameroon, Ivory Coast), the macro-economic gains of scaling up would potentially become superior to the associated costs by 2010. Despite the variability of HIV prevalence rates between countries, macro-economic estimates strongly suggest that a massive investment in scaling up access to HIV treatment may efficiently counteract the detrimental long-term impact of the HIV pandemic on economic growth, to the extent that the AIDS shock has not already driven the economy beyond an irreversible 'no-development epidemiological trap'.
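
    A toy growth path illustrating the mechanism (an epidemic drag on growth, partly offset by ART coverage), a sketch only; every parameter below is invented and unrelated to the paper's endogenous growth calibration.

    ```python
    def simulate_gdp(years=20, g=0.03, aids_drag=0.02,
                     art_coverage=0.6, art_effectiveness=0.85):
        """Toy growth path with an AIDS shock to human capital.

        GDP grows at baseline rate g; the epidemic subtracts aids_drag from
        growth, and ART offsets art_effectiveness of that drag in proportion
        to coverage. All numbers are illustrative, not the paper's values.
        """
        gdp = 100.0
        for _ in range(years):
            drag = aids_drag * (1 - art_coverage * art_effectiveness)
            gdp *= 1 + g - drag
        return gdp

    no_art = simulate_gdp(art_coverage=0.0)
    scaled = simulate_gdp(art_coverage=0.6)
    print(f"GDP after 20y: no ART {no_art:.1f}, scaled-up ART {scaled:.1f}")
    ```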

  15. Vacuum isostatic micro/macro molding of PTFE materials for laser beam shaping in environmental applications: large scale UV laser water purification

    NASA Astrophysics Data System (ADS)

    Lizotte, Todd; Ohar, Orest

    2009-08-01

    Accessibility to fresh clean water has determined the location and survival of civilizations throughout the ages [1]. The tangible economic value of water is demonstrated by industry's need for water in fields such as semiconductor, food, and pharmaceutical manufacturing. Economic stability for all sectors of industry depends on access to reliable volumes of good quality water. As can be seen on television, a nation's economy is seriously affected by water shortages through drought or mismanagement, and such water resources must therefore be managed both for the public interest and the economic future. For over 50 years, ultraviolet water purification has been a mainstay technology for water treatment, killing potential microbiological agents in applications ranging from water for leisure activities, such as swimming pools, to large-scale wastewater treatment facilities, where the UV light photo-oxidizes various pollutants and contaminants. Well tailored to the task, UV provides a cost-effective way to reduce the use of chemicals in sanitization and anti-biological applications. Systems based predominantly on low-pressure Hg UV discharge lamps are plagued with lifetime issues (~1 year of normal operation); the last ten years have shown that the technology continues to advance, and larger-scale systems are turning to more advanced lamp designs and evaluating solid-state UV light sources and more powerful laser sources. One of the issues facing the treatment of water with UV lasers is an appropriate means of delivering laser light efficiently over larger volumes or cross sections of water. This paper examines the potential advantages of laser beam shaping components made by isostatic micro-molding of microstructured PTFE materials for integration into large-scale water purification and sterilization systems, for both lamp and laser sources. Using a unique patented fabrication method, engineers can form micro- and macro-scale diffractive, holographic, and faceted reflective structures in fused and semi-fused PTFE materials and compounds for use in UV reactors. The material's unique attributes provide an unusual but effective hybrid element by combining Lambertian diffusion and spectral reflective attributes. This paper will provide examples of the applications where this technology could be applied and of typical constructions. An overview of UV sources commonly used in water treatment, including high-power UV lasers and solid-state UV light sources, will be discussed. The paper will summarize how beam shaping elements produced in PTFE materials would provide further benefits to the emerging water disinfection and treatment market.

  16. The Design of PC/MISI, a PC-Based Common User Interface to Remote Information Storage and Retrieval Systems. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Hall, Philip P.

    1985-01-01

    The amount of information contained in the databases of large-scale information storage and retrieval systems is very large and growing at a rapid rate. The methods available for accessing this information have not been successful in making the information easily available to the people who have the greatest need for it. This thesis describes the design of a personal-computer-based system which will provide a means for these individuals to retrieve these data through one standardized interface. The thesis identifies each of the major problems associated with providing access to casual users of IS and R systems and describes the manner in which these problems are to be solved by the utilization of the local processing power of a PC. Additional capabilities, not available with standard access methods, are also provided to improve the user's ability to make use of this information. The design of PC/MISI is intended to facilitate its use as a research vehicle. Evaluation mechanisms and possible areas of future research are described. The PC/MISI development effort is part of a larger research effort directed at improving access to remote IS and R systems. This research effort, supported in part by NASA, is also reviewed.

  17. SIMAP--a comprehensive database of pre-calculated protein sequence similarities, domains, annotations and clusters.

    PubMed

    Rattei, Thomas; Tischler, Patrick; Götz, Stefan; Jehl, Marc-André; Hoser, Jonathan; Arnold, Roland; Conesa, Ana; Mewes, Hans-Werner

    2010-01-01

    The prediction of protein function as well as the reconstruction of evolutionary genesis employing sequence comparison at large is still the most powerful tool in sequence analysis. Due to the exponential growth of the number of known protein sequences and the consequent quadratic growth of the similarity matrix, the computation of the Similarity Matrix of Proteins (SIMAP) becomes a computationally intensive task. The SIMAP database provides a comprehensive and up-to-date pre-calculation of the protein sequence similarity matrix, sequence-based features and sequence clusters. As of September 2009, SIMAP covers 48 million proteins and more than 23 million non-redundant sequences. Novel features of SIMAP include the expansion of the sequence space by including databases such as ENSEMBL as well as the integration of metagenomes based on their consistent processing and annotation. Furthermore, protein function predictions by Blast2GO are pre-calculated for all sequences in SIMAP, and the data access and query functions have been improved. SIMAP assists biologists in querying the up-to-date sequence space systematically and facilitates large-scale downstream projects in computational biology. Access to SIMAP is freely provided through the web portal for individuals (http://mips.gsf.de/simap/) and for programmatic access through DAS (http://webclu.bio.wzw.tum.de/das/) and Web-Service (http://mips.gsf.de/webservices/services/SimapService2.0?wsdl).

  18. Prototype Packaged Databases and Software in Health

    PubMed Central

    Gardenier, Turkan K.

    1980-01-01

    This paper describes the recent demand for packaged databases and software for health applications in light of developments in mini- and micro-computer technology. Specific features for defining prospective user groups are discussed; criticisms of large-scale epidemiological data use as a means of replacing clinical trials and associated controls are posed to the reader. The available collaborative efforts for access and analysis of jointly structured health data are stressed, with recommendations for new analytical techniques specifically geared to monitoring data, such as the CTSS (Cumulative Transitional State Score) generated for tracking ongoing patient status over time in clinical trials. Examples of graphic display are given from the Domestic Information Display System (DIDS), which is a collaborative multi-agency effort to computerize and make accessible user-specified U.S. and local maps relating to health, environment, socio-economic and energy data.

  19. Modeling Social Capital as Dynamic Networks to Promote Access to Oral Healthcare

    PubMed Central

    Northridge, Mary E.; Kunzel, Carol; Zhang, Qiuyi; Kum, Susan S.; Gilbert, Jessica L.; Jin, Zhu; Metcalf, Sara S.

    2016-01-01

    Social capital, as comprised of human connections in social networks and their associated benefits, is closely related to the health of individuals, communities, and societies at large. For disadvantaged population groups such as older adults and racial/ethnic minorities, social capital may play a particularly critical role in mitigating the negative effects and reinforcing the positive effects on health. In this project, we model social capital as both cause and effect by simulating dynamic networks. Informed in part by a community-based health promotion program, an agent-based model is contextualized in a GIS environment to explore the complexity of social disparities in oral and general health as experienced at the individual, interpersonal, and community scales. This study provides the foundation for future work investigating how health and healthcare accessibility may be influenced by social networks. PMID:27668298
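
    A minimal sketch of the 'social capital as both cause and effect' loop using a generic dynamic network. The update rules, rates, and network generator are hypothetical, and the GIS context of the study is not represented.

    ```python
    import random
    import networkx as nx

    random.seed(1)
    G = nx.watts_strogatz_graph(n=50, k=4, p=0.1)      # initial social network
    health = {v: random.uniform(0.3, 0.9) for v in G}  # agent health states

    def step(G, health, benefit=0.02, decay=0.01):
        """One tick: ties confer a health benefit (capital as cause), and
        healthier agents are likelier to form new ties (capital as effect).
        Rules and rates are hypothetical."""
        for v in G:
            capital = sum(health[u] for u in G.neighbors(v))
            health[v] = min(1.0, health[v] + benefit * capital - decay)
        for v in list(G):
            if random.random() < 0.05 * health[v]:
                u = random.choice([w for w in G if w != v])
                G.add_edge(v, u)

    for _ in range(10):
        step(G, health)
    print(G.number_of_edges(), round(sum(health.values()) / len(health), 3))
    ```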

  20. From the Margins to the Spotlight: Diverse Deaf and Hard of Hearing Student Populations and Standardized Assessment Accessibility.

    PubMed

    Cawthon, Stephanie

    2015-01-01

    Designing assessments and tests is one of the more challenging aspects of creating an accessible learning environment for students who are deaf or hard of hearing (DHH), particularly for deaf students with a disability (DWD). Standardized assessments are a key mechanism by which the educational system in the United States measures student progress, teacher effectiveness, and the impact of school reform. The diversity of student characteristics within DHH and DWD populations is only now becoming visible in the research literature relating to standardized assessments and their use in large-scale accountability reforms. The purpose of this article is to explore the theoretical frameworks surrounding assessment policy and practice, current research related to standardized assessment and students who are DHH and DWD, and potential implications for practice within both the assessment and instruction contexts.

  1. Approximation concepts for efficient structural synthesis

    NASA Technical Reports Server (NTRS)

    Schmit, L. A., Jr.; Miura, H.

    1976-01-01

    It is shown that efficient structural synthesis capabilities can be created by using approximation concepts to mesh finite element structural analysis methods with nonlinear mathematical programming techniques. The history of the application of mathematical programming techniques to structural design optimization problems is reviewed. Several rather general approximation concepts are described along with the technical foundations of the ACCESS 1 computer program, which implements several approximation concepts. A substantial collection of structural design problems involving truss and idealized wing structures is presented. It is concluded that since the basic ideas employed in creating the ACCESS 1 program are rather general, its successful development supports the contention that the introduction of approximation concepts will lead to the emergence of a new generation of practical and efficient, large scale, structural synthesis capabilities in which finite element analysis methods and mathematical programming algorithms will play a central role.
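
    One classic approximation concept of this kind is a first-order expansion that is linear in the reciprocal design variables, which is exact for member stresses versus cross-sectional areas in statically determinate trusses. A sketch under that assumption; the constraint function below is a made-up toy, not one of the report's test problems:

    ```python
    def reciprocal_approx(g, grad, x0):
        """First-order approximation of g, linear in the reciprocals 1/x_i:

            g~(x) = g(x0) + sum_i dg/dx_i(x0) * x0_i**2 * (1/x0_i - 1/x_i)

        Cheap, high-quality subproblems when g behaves like 1/x (e.g. truss
        member stress vs. cross-sectional area).
        """
        g0, g0_grad = g(x0), grad(x0)
        def g_tilde(x):
            return g0 + sum(gi * xi0 ** 2 * (1 / xi0 - 1 / xi)
                            for gi, xi0, xi in zip(g0_grad, x0, x))
        return g_tilde

    # Toy stress-ratio constraint sigma/sigma_allow - 1 <= 0 with sigma ~ 1/A.
    g    = lambda a: 2.0 / a[0] + 1.0 / a[1] - 1.0
    grad = lambda a: [-2.0 / a[0] ** 2, -1.0 / a[1] ** 2]

    g_t = reciprocal_approx(g, grad, x0=[3.0, 3.0])
    print(g([4.0, 2.0]))    # exact value: 0.0
    print(g_t([4.0, 2.0]))  # approximation is exact here, since g ~ 1/a
    ```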

  2. Modeling Social Capital as Dynamic Networks to Promote Access to Oral Healthcare.

    PubMed

    Wang, Hua; Northridge, Mary E; Kunzel, Carol; Zhang, Qiuyi; Kum, Susan S; Gilbert, Jessica L; Jin, Zhu; Metcalf, Sara S

    2016-01-01

    Social capital, as comprised of human connections in social networks and their associated benefits, is closely related to the health of individuals, communities, and societies at large. For disadvantaged population groups such as older adults and racial/ethnic minorities, social capital may play a particularly critical role in mitigating the negative effects and reinforcing the positive effects on health. In this project, we model social capital as both cause and effect by simulating dynamic networks. Informed in part by a community-based health promotion program, an agent-based model is contextualized in a GIS environment to explore the complexity of social disparities in oral and general health as experienced at the individual, interpersonal, and community scales. This study provides the foundation for future work investigating how health and healthcare accessibility may be influenced by social networks.

  3. Virtual Observatory Science Applications

    NASA Technical Reports Server (NTRS)

    McGlynn, Tom

    2005-01-01

    Many Virtual-Observatory-based applications are now available to astronomers for use in their research. These span data discovery, access, visualization and analysis. Tools can quickly gather and organize information from sites around the world to help in planning a response to a gamma-ray burst, help users pick filters to isolate a desired feature, make an average template for z=2 AGN, select sources based upon information in many catalogs, or correlate massive distributed databases. Using VO protocols, the reach of existing software tools and packages can be greatly extended, allowing users to find and access remote information almost as conveniently as local data. The talk highlights just a few of the tools available to scientists, describes how both large and small scale projects can use existing tools, and previews some of the new capabilities that will be available in the next few years.

  4. Colloquium: Mechanical formalisms for tissue dynamics.

    PubMed

    Tlili, Sham; Gay, Cyprien; Graner, François; Marcq, Philippe; Molino, François; Saramito, Pierre

    2015-05-01

    The understanding of morphogenesis in living organisms has been renewed by tremendous progress in experimental techniques that provide access to cell scale, quantitative information both on the shapes of cells within tissues and on the genes being expressed. This information suggests that our understanding of the respective contributions of gene expression and mechanics, and of their crucial entanglement, will soon leap forward. Biomechanics increasingly benefits from models, which assist the design and interpretation of experiments, point out the main ingredients and assumptions, and ultimately lead to predictions. The newly accessible local information thus calls for a reflection on how to select suitable classes of mechanical models. We review both mechanical ingredients suggested by the current knowledge of tissue behaviour, and modelling methods that can help generate a rheological diagram or a constitutive equation. We distinguish cell scale ("intra-cell") and tissue scale ("inter-cell") contributions. We recall the mathematical framework developed for continuum materials and explain how to transform a constitutive equation into a set of partial differential equations amenable to numerical resolution. We show that when plastic behaviour is relevant, the dissipation function formalism appears appropriate to generate constitutive equations; its variational nature facilitates numerical implementation, and we discuss adaptations needed in the case of large deformations. The present article gathers theoretical methods that can readily enhance the significance of the data to be extracted from recent or future high throughput biomechanical experiments.
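
    As a concrete instance of the constitutive-equation-to-PDE step, a minimal one-dimensional example; the scalar Maxwell element is my illustrative choice, not a prescription from the review:

    ```latex
    % Illustrative route from a constitutive equation to PDEs, using the
    % simplest visco-elastic (scalar Maxwell) element:
    \begin{align}
      \sigma + \tau\,\partial_t \sigma &= \eta\,\partial_x v
        && \text{(Maxwell constitutive law, relaxation time } \tau)\\
      \partial_x \sigma + f &= 0
        && \text{(momentum balance; inertia neglected for tissues)}
    \end{align}
    % Integrating this closed system gives v(x,t) and sigma(x,t); tensorial
    % versions need an objective time derivative at large deformations.
    ```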

  5. The effects of paid maternity leave: Evidence from Temporary Disability Insurance.

    PubMed

    Stearns, Jenna

    2015-09-01

    This paper investigates the effects of a large-scale paid maternity leave program on birth outcomes in the United States. In 1978, states with Temporary Disability Insurance (TDI) programs were required to start providing wage replacement benefits to pregnant women, substantially increasing access to antenatal and postnatal paid leave for working mothers. Using natality data, I find that TDI paid maternity leave reduces the share of low birth weight births by 3.2 percent, and the estimated treatment-on-the-treated effect is over 10 percent. It also decreases the likelihood of early term birth by 6.6 percent. Paid maternity leave has particularly large impacts on the children of unmarried and black mothers. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Affordable and accurate large-scale hybrid-functional calculations on GPU-accelerated supercomputers

    NASA Astrophysics Data System (ADS)

    Ratcliff, Laura E.; Degomme, A.; Flores-Livas, José A.; Goedecker, Stefan; Genovese, Luigi

    2018-03-01

    Performing high accuracy hybrid functional calculations for condensed matter systems containing a large number of atoms is at present computationally very demanding or even out of reach if high quality basis sets are used. We present a highly optimized multiple graphics processing unit implementation of the exact exchange operator which allows one to perform fast hybrid functional density-functional theory (DFT) calculations with systematic basis sets without additional approximations for up to a thousand atoms. With this method hybrid DFT calculations of high quality become accessible on state-of-the-art supercomputers within a time-to-solution that is of the same order of magnitude as traditional semilocal-GGA functionals. The method is implemented in a portable open-source library.
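
    For reference, the exact-exchange (Fock) operator whose repeated application dominates the cost being optimized; this is the standard definition, independent of the GPU implementation details:

    ```latex
    % Standard exact-exchange (Fock) operator -- the expensive kernel in
    % hybrid-functional DFT; definition independent of the implementation:
    \begin{equation}
      \bigl(\hat{V}_x \psi_i\bigr)(\mathbf{r})
        = -\sum_{j}^{\mathrm{occ}} \psi_j(\mathbf{r})
          \int \frac{\psi_j^*(\mathbf{r}')\,\psi_i(\mathbf{r}')}
                    {\lvert \mathbf{r}-\mathbf{r}' \rvert}\,\mathrm{d}\mathbf{r}'
    \end{equation}
    % One Poisson-like integral per pair of occupied orbitals: the O(N^2)
    % cost that the GPU implementation above is designed to absorb.
    ```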

  7. Role of Correlations in the Collective Behavior of Microswimmer Suspensions

    NASA Astrophysics Data System (ADS)

    Stenhammar, Joakim; Nardini, Cesare; Nash, Rupert W.; Marenduzzo, Davide; Morozov, Alexander

    2017-07-01

    In this Letter, we study the collective behavior of a large number of self-propelled microswimmers immersed in a fluid. Using unprecedentedly large-scale lattice Boltzmann simulations, we reproduce the transition to bacterial turbulence. We show that, even well below the transition, swimmers move in a correlated fashion that cannot be described by a mean-field approach. We develop a novel kinetic theory that captures these correlations and is nonperturbative in the swimmer density. To provide an experimentally accessible measure of correlations, we calculate the diffusivity of passive tracers and reveal its nontrivial density dependence. The theory is in quantitative agreement with the lattice Boltzmann simulations and captures the asymmetry between pusher and puller swimmers below the transition to turbulence.
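
    The tracer diffusivity is conventionally extracted from the mean-squared displacement; the sketch below does this on synthetic Brownian trajectories standing in for simulation output (all numbers arbitrary):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n_tracers, n_steps, dt = 200, 500, 0.01
    D_true = 0.5   # diffusivity used to synthesize the data (arbitrary units)

    # Synthetic Brownian tracer paths stand in for lattice Boltzmann output;
    # per-component step variance 2*D*dt gives MSD(t) = 6*D*t in 3D.
    steps = rng.normal(0.0, np.sqrt(2 * D_true * dt),
                       size=(n_tracers, n_steps, 3))
    paths = np.cumsum(steps, axis=1)              # trajectories from the origin

    msd = (paths ** 2).sum(axis=2).mean(axis=0)   # ensemble-averaged MSD
    t = dt * np.arange(1, n_steps + 1)
    D_est = np.polyfit(t, msd, 1)[0] / 6.0        # slope / 6 -> diffusivity
    print(f"estimated D = {D_est:.3f} (input {D_true})")
    ```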

  8. Progress in long scale length laser plasma interactions

    NASA Astrophysics Data System (ADS)

    Glenzer, S. H.; Arnold, P.; Bardsley, G.; Berger, R. L.; Bonanno, G.; Borger, T.; Bower, D. E.; Bowers, M.; Bryant, R.; Buckman, S.; Burkhart, S. C.; Campbell, K.; Chrisp, M. P.; Cohen, B. I.; Constantin, C.; Cooper, F.; Cox, J.; Dewald, E.; Divol, L.; Dixit, S.; Duncan, J.; Eder, D.; Edwards, J.; Erbert, G.; Felker, B.; Fornes, J.; Frieders, G.; Froula, D. H.; Gardner, S. D.; Gates, C.; Gonzalez, M.; Grace, S.; Gregori, G.; Greenwood, A.; Griffith, R.; Hall, T.; Hammel, B. A.; Haynam, C.; Heestand, G.; Henesian, M.; Hermes, G.; Hinkel, D.; Holder, J.; Holdner, F.; Holtmeier, G.; Hsing, W.; Huber, S.; James, T.; Johnson, S.; Jones, O. S.; Kalantar, D.; Kamperschroer, J. H.; Kauffman, R.; Kelleher, T.; Knight, J.; Kirkwood, R. K.; Kruer, W. L.; Labiak, W.; Landen, O. L.; Langdon, A. B.; Langer, S.; Latray, D.; Lee, A.; Lee, F. D.; Lund, D.; MacGowan, B.; Marshall, S.; McBride, J.; McCarville, T.; McGrew, L.; Mackinnon, A. J.; Mahavandi, S.; Manes, K.; Marshall, C.; Menapace, J.; Mertens, E.; Meezan, N.; Miller, G.; Montelongo, S.; Moody, J. D.; Moses, E.; Munro, D.; Murray, J.; Neumann, J.; Newton, M.; Ng, E.; Niemann, C.; Nikitin, A.; Opsahl, P.; Padilla, E.; Parham, T.; Parrish, G.; Petty, C.; Polk, M.; Powell, C.; Reinbachs, I.; Rekow, V.; Rinnert, R.; Riordan, B.; Rhodes, M.; Roberts, V.; Robey, H.; Ross, G.; Sailors, S.; Saunders, R.; Schmitt, M.; Schneider, M. B.; Shiromizu, S.; Spaeth, M.; Stephens, A.; Still, B.; Suter, L. J.; Tietbohl, G.; Tobin, M.; Tuck, J.; Van Wonterghem, B. M.; Vidal, R.; Voloshin, D.; Wallace, R.; Wegner, P.; Whitman, P.; Williams, E. A.; Williams, K.; Winward, K.; Work, K.; Young, B.; Young, P. E.; Zapata, P.; Bahr, R. E.; Seka, W.; Fernandez, J.; Montgomery, D.; Rose, H.

    2004-12-01

    The first experiments on the National Ignition Facility (NIF) have employed the first four beams to measure propagation and laser backscattering losses in large ignition-size plasmas. Gas-filled targets between 2 and 7 mm in length have been heated from one side by overlapping the focal spots of the four beams from one quad operated at 351 nm (3ω) with a total intensity of 2 × 10¹⁵ W cm⁻². The targets were filled with 1 atm of CO2, producing up to 7 mm long homogeneously heated plasmas with densities of n_e = 6 × 10²⁰ cm⁻³ and temperatures of T_e = 2 keV. The high energy in an NIF quad of beams of 16 kJ, illuminating the target from one direction, creates unique conditions for the study of laser-plasma interactions at scale lengths not previously accessible. The propagation through the large-scale plasma was measured with a gated x-ray imager that was filtered for 3.5 keV x-rays. These data indicate that the beams interact with the full length of this ignition-scale plasma during the last ~1 ns of the experiment. During that time, the full-aperture measurements of the stimulated Brillouin scattering and stimulated Raman scattering show scattering into the four focusing lenses of 3% for the smallest length (~2 mm), increasing to 10-12% for ~7 mm. These results demonstrate the NIF experimental capabilities and further provide a benchmark for three-dimensional modelling of the laser-plasma interactions at ignition-size scale lengths.

  9. Impact of post-Born lensing on the CMB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pratten, Geraint; Lewis, Antony, E-mail: G.Pratten@Sussex.ac.uk, E-mail: antony@cosmologist.info

    Lensing of the CMB is affected by post-Born lensing, producing corrections to the convergence power spectrum and introducing field rotation. We show numerically that the lensing convergence power spectrum is affected at the ≲ 0.2% level on accessible scales, and that this correction and the field rotation are negligible for observations with arcminute beam and noise levels ≳ 1 μK arcmin. The field rotation generates ∼ 2.5% of the total lensing B-mode polarization amplitude (0.2% in power on small scales), but has a blue spectrum on large scales, making it highly subdominant to the convergence B modes on scales where they are a source of confusion for the signal from primordial gravitational waves. Since the post-Born signal is non-linear, it also generates a bispectrum with the convergence. We show that the post-Born contributions to the bispectrum substantially change the shape predicted from large-scale structure non-linearities alone, and hence must be included to estimate the expected total signal and the impact of bispectrum biases on CMB lensing reconstruction quadratic estimators and other observables. The field-rotation power spectrum only becomes potentially detectable for noise levels ≲ 1 μK arcmin, but its bispectrum with the convergence may be observable at ∼ 3σ with Stage IV observations. Rotation-induced and convergence-induced B modes are slightly correlated by the bispectrum, and the bispectrum also produces additional contributions to the lensed BB power spectrum.

  10. Global scale groundwater flow model

    NASA Astrophysics Data System (ADS)

    Sutanudjaja, Edwin; de Graaf, Inge; van Beek, Ludovicus; Bierkens, Marc

    2013-04-01

    As the world's largest accessible source of freshwater, groundwater plays a vital role in satisfying the basic needs of human society. It serves as a primary source of drinking water and supplies water for agricultural and industrial activities. During times of drought, groundwater sustains water flows in streams, rivers, lakes and wetlands, and thus supports ecosystem habitat and biodiversity, while its large natural storage provides a buffer against water shortages. Yet, the current generation of global-scale hydrological models does not include a groundwater flow component, even though it is a crucial part of the hydrological cycle and allows the simulation of groundwater head dynamics. In this study we present a steady-state MODFLOW (McDonald and Harbaugh, 1988) groundwater model on the global scale at 5 arc-minute resolution. Aquifer schematization and properties of this groundwater model were developed from available global lithological models (e.g. Dürr et al., 2005; Gleeson et al., 2010; Hartmann and Moosdorf, in press). We force the groundwater model with the output from the large-scale hydrological model PCR-GLOBWB (van Beek et al., 2011), specifically the long-term net groundwater recharge and average surface water levels derived from routed channel discharge. We validated calculated groundwater heads and depths with available head observations from different regions, including North and South America and Western Europe. Our results show that it is feasible to build a relatively simple global-scale groundwater model using existing information, and to estimate water table depths within acceptable accuracy in many parts of the world.
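
    To make the modelling approach concrete, the sketch below solves a toy version of the steady-state problem MODFLOW discretizes: uniform transmissivity T, net recharge R supplied by a hydrological model, and fixed boundary heads standing in for surface-water levels. All numbers (grid, T, R) are invented; the real model runs globally at 5 arc-minutes.

      import numpy as np

      # Toy cell-centred finite-difference groundwater model:
      #   T * laplacian(h) + R = 0,  h = 0 on the boundary (river level).
      # Rearranged per cell: h = (sum of 4 neighbours + R*dx^2/T) / 4.
      n, T, dx = 50, 100.0, 1000.0     # cells, transmissivity (m2/d), spacing (m)
      R = np.full((n, n), 5e-5)        # net recharge (m/d), roughly 18 mm/yr
      h = np.zeros((n, n))             # boundary rows/columns stay at 0 m

      for _ in range(5000):            # crude Jacobi iteration towards steady state
          h[1:-1, 1:-1] = 0.25 * (h[:-2, 1:-1] + h[2:, 1:-1] +
                                  h[1:-1, :-2] + h[1:-1, 2:] +
                                  R[1:-1, 1:-1] * dx**2 / T)
      print(f"max simulated water-table head: {h.max():.1f} m above river level")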

  11. Large-scale annotation of small-molecule libraries using public databases.

    PubMed

    Zhou, Yingyao; Zhou, Bin; Chen, Kaisheng; Yan, S Frank; King, Frederick J; Jiang, Shumei; Winzeler, Elizabeth A

    2007-01-01

    While many large publicly accessible databases provide excellent annotation for biological macromolecules, the same is not true for small chemical compounds. Commercial data sources likewise fail to provide annotation interfaces for large numbers of compounds and tend to be too costly to be widely available to biomedical researchers. Therefore, using annotation information for the selection of lead compounds from a modern-day high-throughput screening (HTS) campaign presently occurs only on a very limited scale. The recent rapid expansion of the NIH PubChem database provides an opportunity to link existing biological databases with compound catalogs and provides relevant information that could potentially improve the information garnered from large-scale screening efforts. Using the 2.5 million compound collection at the Genomics Institute of the Novartis Research Foundation (GNF) as a model, we determined that approximately 4% of the library contained compounds with potential annotation in databases such as PubChem and the World Drug Index (WDI) as well as related databases such as the Kyoto Encyclopedia of Genes and Genomes (KEGG) and ChemIDplus. Furthermore, exact structure match analysis showed that 32% of GNF compounds can be linked to third-party databases via PubChem. We also showed that annotations such as MeSH (Medical Subject Headings) terms can be applied to in-house HTS databases to identify signature biological inhibition profiles of interest and to expedite the assay validation process. The automated annotation of thousands of screening hits in batch is becoming feasible and has the potential to play an essential role in the hit-to-lead decision-making process.
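
    The kind of exact-structure lookup described here can be scripted against PubChem's public PUG REST interface. The endpoint layout below follows its documentation at the time of writing but should be treated as an assumption and verified; aspirin's InChIKey is used as a stand-in query.

      import requests

      BASE = "https://pubchem.ncbi.nlm.nih.gov/rest/pug"
      inchikey = "BSYNRYMUTXBXSQ-UHFFFAOYSA-N"   # aspirin, stand-in for a library compound

      # 1) Exact structure match: InChIKey -> PubChem compound ID (CID).
      r = requests.get(f"{BASE}/compound/inchikey/{inchikey}/cids/JSON", timeout=30)
      r.raise_for_status()
      cid = r.json()["IdentifierList"]["CID"][0]

      # 2) Annotation: synonyms often carry names linkable to MeSH, WDI or KEGG.
      r = requests.get(f"{BASE}/compound/cid/{cid}/synonyms/JSON", timeout=30)
      r.raise_for_status()
      names = r.json()["InformationList"]["Information"][0]["Synonym"]
      print(cid, names[:5])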

  12. A large-scale cluster randomized trial to determine the effects of community-based dietary sodium reduction--the China Rural Health Initiative Sodium Reduction Study.

    PubMed

    Li, Nicole; Yan, Lijing L; Niu, Wenyi; Labarthe, Darwin; Feng, Xiangxian; Shi, Jingpu; Zhang, Jianxin; Zhang, Ruijuan; Zhang, Yuhong; Chu, Hongling; Neiman, Andrea; Engelgau, Michael; Elliott, Paul; Wu, Yangfeng; Neal, Bruce

    2013-11-01

    Cardiovascular diseases are the leading cause of death and disability in China. High blood pressure caused by excess intake of dietary sodium is widespread, and an effective sodium reduction program has the potential to improve cardiovascular health. This study is a large-scale cluster-randomized trial conducted in five northern Chinese provinces. Two counties were selected from each province and 12 townships in each county, making a total of 120 clusters. Within each township one village was selected for participation, with 1:1 randomization stratified by county. The sodium reduction intervention comprises community health education and a food supply strategy based upon providing access to salt substitute. The price of salt substitute was subsidized in 30 intervention villages selected at random. Control villages continued usual practices. The primary outcome for the study is the dietary sodium intake level estimated from assays of 24-hour urine. The trial recruited and randomized 120 townships in April 2011. The sodium reduction program was commenced in the 60 intervention villages between May and June of that year, with outcome surveys scheduled for October to December 2012. Baseline data collection shows that randomisation achieved good balance across groups. The establishment of the China Rural Health Initiative has enabled the launch of this large-scale trial designed to identify a novel, scalable strategy for the reduction of dietary sodium and control of blood pressure. If proved effective, the intervention could plausibly be implemented at low cost in large parts of China and other countries worldwide.
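
    For illustration, a minimal Python sketch of the allocation scheme described above (1:1 randomization stratified by county, plus random selection of the 30 subsidized villages); names and seed are invented:

      import random

      rng = random.Random(2011)
      # 5 provinces x 2 counties = 10 counties, one village per township, 12 each.
      villages = {f"county_{c}": [f"village_{c}_{v}" for v in range(12)]
                  for c in range(10)}

      assignment = {}
      for county, vlist in villages.items():        # stratify by county
          shuffled = vlist[:]
          rng.shuffle(shuffled)
          half = len(shuffled) // 2                 # 6 intervention, 6 control
          assignment.update({v: "intervention" for v in shuffled[:half]})
          assignment.update({v: "control" for v in shuffled[half:]})

      # Price subsidy in 30 of the 60 intervention villages, chosen at random.
      intervention = [v for v, arm in assignment.items() if arm == "intervention"]
      subsidized = set(rng.sample(intervention, 30))
      print(len(intervention), len(subsidized))     # 60 30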

  13. Challenges in Managing Trustworthy Large-scale Digital Science

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories - model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones used by the broad community, and far exceed the raw instrument outputs in volume. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" in order to be productive in generating new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, through system software stacks and libraries, to the model software itself. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods, rather than on full reproducibility. Furthermore, with large-volume data management it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access updated products, and on confidence that previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway to address these issues.

  14. Packaging of silicon photonic devices: from prototypes to production

    NASA Astrophysics Data System (ADS)

    Morrissey, Padraic E.; Gradkowski, Kamil; Carroll, Lee; O'Brien, Peter

    2018-02-01

    The challenges associated with the packaging of silicon photonic devices are often underestimated, and packaging remains technically demanding. In this paper, we review some key enabling technologies that will allow us to overcome the current bottleneck in silicon photonic packaging, while also describing recent developments in standardisation, including the establishment of PIXAPP as the world's first open-access PIC packaging and assembly Pilot Line. These developments will allow the community to move from low-volume prototypes of packaged photonic devices to large-scale volume manufacturing, where the full commercialisation of PIC technology can be realised.

  15. Mental illness stigma, help seeking, and public health programs.

    PubMed

    Henderson, Claire; Evans-Lacko, Sara; Thornicroft, Graham

    2013-05-01

    Globally, more than 70% of people with mental illness receive no treatment from health care staff. Evidence suggests that factors increasing the likelihood of treatment avoidance or delay before presenting for care include (1) lack of knowledge to identify features of mental illnesses, (2) ignorance about how to access treatment, (3) prejudice against people who have mental illness, and (4) expectation of discrimination against people diagnosed with mental illness. In this article, we reviewed the evidence on whether large-scale anti-stigma campaigns could lead to increased levels of help seeking.

  16. Mental Illness Stigma, Help Seeking, and Public Health Programs

    PubMed Central

    Evans-Lacko, Sara; Thornicroft, Graham

    2013-01-01

    Globally, more than 70% of people with mental illness receive no treatment from health care staff. Evidence suggests that factors increasing the likelihood of treatment avoidance or delay before presenting for care include (1) lack of knowledge to identify features of mental illnesses, (2) ignorance about how to access treatment, (3) prejudice against people who have mental illness, and (4) expectation of discrimination against people diagnosed with mental illness. In this article, we reviewed the evidence on whether large-scale anti-stigma campaigns could lead to increased levels of help seeking. PMID:23488489

  17. Compiling for Application Specific Computational Acceleration in Reconfigurable Architectures Final Report CRADA No. TSB-2033-01

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Supinski, B.; Caliga, D.

    2017-09-28

    The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array ("FPGA") based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory-accessing technology represents an important step towards making reconfigurable symmetric multi-processor (SMP) architectures a cost-effective solution for large-scale scientific computing.

  18. Field Guide to Plant Model Systems

    PubMed Central

    Chang, Caren; Bowman, John L.; Meyerowitz, Elliot M.

    2016-01-01

    For the past several decades, advances in plant development, physiology, cell biology, and genetics have relied heavily on the model (or reference) plant Arabidopsis thaliana. Arabidopsis resembles other plants, including crop plants, in many but by no means all respects. Study of Arabidopsis alone provides little information on the evolutionary history of plants, evolutionary differences between species, plants that survive in different environments, or plants that access nutrients and photosynthesize differently. Empowered by the availability of large-scale sequencing and new technologies for investigating gene function, many new plant models are being proposed and studied. PMID:27716506

  19. NAS (Numerical Aerodynamic Simulation Program) technical summaries, March 1989 - February 1990

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Given here are selected scientific results from the Numerical Aerodynamic Simulation (NAS) Program's third year of operation. During this year, the scientific community was given access to a Cray-2 and a Cray Y-MP supercomputer. Topics covered include flow field analysis of fighter wing configurations, large-scale ocean modeling, the Space Shuttle flow field, advanced computational fluid dynamics (CFD) codes for rotary-wing airloads and performance prediction, turbulence modeling of separated flows, airloads and acoustics of rotorcraft, vortex-induced nonlinearities on submarines, and standing oblique detonation waves.

  20. Cross-polar transport and scavenging of Siberian aerosols containing black carbon during the 2012 ACCESS summer campaign

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raut, Jean -Christophe; Marelle, Louis; Fast, Jerome D.

    During the ACCESS airborne campaign in July 2012, extensive boreal forest fires resulted in significant aerosol transport to the Arctic. A 10-day episode combining intense biomass burning over Siberia and low-pressure systems over the Arctic Ocean resulted in efficient transport of plumes containing black carbon (BC) towards the Arctic, mostly in the upper troposphere (6–8 km). Here, a combination of in situ observations (DLR Falcon aircraft), satellite analysis and WRF-Chem simulations is used to understand the vertical and horizontal transport mechanisms of BC, with a focus on the role of wet removal. Between the northwestern Norwegian coast and the Svalbard archipelago, the Falcon aircraft sampled plumes with enhanced CO concentrations up to 200 ppbv and BC mixing ratios up to 25 ng kg⁻¹. During transport to the Arctic region, a large fraction of BC particles is scavenged by two wet deposition processes, namely wet removal by large-scale precipitation and removal in wet convective updrafts, with both processes contributing almost equally to the total accumulated deposition of BC. Our results underline that applying a finer horizontal resolution (40 instead of 100 km) improves the model performance, as it significantly reduces the overestimation of BC levels observed at the coarser resolution in the mid-troposphere. According to the simulations at 40 km, the transport efficiency of BC (TE_BC) in biomass burning plumes was larger (60%), because these plumes were subject to small accumulated precipitation along the trajectory (1 mm). In contrast, TE_BC was small (< 30%) and accumulated precipitation amounts were larger (5–10 mm) in plumes influenced by urban anthropogenic sources and flaring activities in northern Russia, resulting in transport to lower altitudes. The reduction of TE_BC due to large-scale precipitation is responsible for a sharp meridional gradient in the distribution of BC concentrations. Wet removal in cumulus clouds is the cause of the modeled vertical gradient of TE_BC, especially in the mid-latitudes, reflecting the distribution of convective precipitation, but is dominated in the Arctic region by the large-scale wet removal associated with the formation of stratocumulus clouds in the planetary boundary layer (PBL) that produce frequent drizzle.
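
    The transport efficiency TE_BC used above is commonly diagnosed by comparing the BC/CO enhancement ratio in the sampled plume with that at the source, since CO is nearly insoluble and survives wet removal. A back-of-envelope Python sketch with invented numbers (not campaign data):

      # TE_BC = (dBC/dCO in plume) / (dBC/dCO at emission); a drop below 1
      # is attributed to wet scavenging of BC en route.
      def transport_efficiency(dbc_plume, dco_plume, bc_co_source):
          """Enhancements over background; bc_co_source in ng kg-1 per ppbv."""
          return (dbc_plume / dco_plume) / bc_co_source

      # e.g. 25 ng kg-1 BC and 100 ppbv CO above background in the plume,
      # against an assumed source ratio of 0.7 ng kg-1 per ppbv:
      te = transport_efficiency(25.0, 100.0, 0.7)
      print(f"TE_BC = {te:.0%}")   # ~36% -> substantial scavenging en route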
