Sample records for large-scale resource sharing

  1. Why is data sharing in collaborative natural resource efforts so hard and what can we do to improve it?

    PubMed

    Volk, Carol J; Lucero, Yasmin; Barnas, Katie

    2014-05-01

    Increasingly, research and management in natural resource science rely on very large datasets compiled from multiple sources. While it is generally good to have more data, utilizing large, complex datasets has introduced challenges in data sharing, especially for collaborating researchers in disparate locations ("distributed research teams"). We surveyed natural resource scientists about common data-sharing problems. The major issues identified by our survey respondents (n = 118) when providing data centered on a lack of clarity in the data request (including the format of the data requested). When receiving data, survey respondents reported various insufficiencies in documentation describing the data (e.g., no data collection description or protocol, or data aggregated or summarized without explanation). Since metadata, or "information about the data," is a central obstacle in efficient data handling, we suggest documenting metadata through data dictionaries, protocols, read-me files, explicit null value documentation, and process metadata as essential to any large-scale research program. We advocate for all researchers, but especially those involved in distributed teams, to alleviate these problems with several readily available communication strategies, including organizational charts to define roles, data flow diagrams to outline procedures and timelines, and data update cycles to guide data-handling expectations. In particular, we argue that distributed research teams magnify data-sharing challenges, making data management training even more crucial for natural resource scientists. If natural resource scientists fail to overcome communication and metadata documentation issues, then negative data-sharing experiences will likely continue to undermine the success of many large-scale collaborative projects.

  2. Why is Data Sharing in Collaborative Natural Resource Efforts so Hard and What can We Do to Improve it?

    NASA Astrophysics Data System (ADS)

    Volk, Carol J.; Lucero, Yasmin; Barnas, Katie

    2014-05-01

    Increasingly, research and management in natural resource science rely on very large datasets compiled from multiple sources. While it is generally good to have more data, utilizing large, complex datasets has introduced challenges in data sharing, especially for collaborating researchers in disparate locations ("distributed research teams"). We surveyed natural resource scientists about common data-sharing problems. The major issues identified by our survey respondents (n = 118) when providing data centered on a lack of clarity in the data request (including the format of the data requested). When receiving data, survey respondents reported various insufficiencies in documentation describing the data (e.g., no data collection description or protocol, or data aggregated or summarized without explanation). Since metadata, or "information about the data," is a central obstacle in efficient data handling, we suggest documenting metadata through data dictionaries, protocols, read-me files, explicit null value documentation, and process metadata as essential to any large-scale research program. We advocate for all researchers, but especially those involved in distributed teams, to alleviate these problems with several readily available communication strategies, including organizational charts to define roles, data flow diagrams to outline procedures and timelines, and data update cycles to guide data-handling expectations. In particular, we argue that distributed research teams magnify data-sharing challenges, making data management training even more crucial for natural resource scientists. If natural resource scientists fail to overcome communication and metadata documentation issues, then negative data-sharing experiences will likely continue to undermine the success of many large-scale collaborative projects.
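
    As a concrete illustration of the documentation these authors recommend, the sketch below shows a minimal data-dictionary entry plus process metadata written as plain Python dictionaries. The field names, units, protocol text, and the -9999 null code are hypothetical examples, not taken from the survey.

        # A minimal, hypothetical data-dictionary entry and accompanying process
        # metadata of the kind recommended above. All names and values are made up.
        data_dictionary = {
            "site_id": {"type": "str", "description": "unique monitoring-site code"},
            "temp_c": {
                "type": "float",
                "units": "degrees Celsius",
                "description": "daily mean stream temperature",
                "null_value": -9999,  # explicit null-value documentation
                "protocol": "handheld probe, mid-channel, 30 cm depth",
            },
        }

        process_metadata = {
            "aggregation": "daily mean of 15-minute readings",  # how values were summarized
            "source_files": ["raw_2013.csv"],                   # provenance
            "contact": "data steward for the distributed team",
        }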

  3. Some aspects of control of a large-scale dynamic system

    NASA Technical Reports Server (NTRS)

    Aoki, M.

    1975-01-01

    Techniques of predicting and/or controlling the dynamic behavior of large scale systems are discussed in terms of decentralized decision making. Topics discussed include: (1) control of large scale systems by dynamic team with delayed information sharing; (2) dynamic resource allocation problems by a team (hierarchical structure with a coordinator); and (3) some problems related to the construction of a model of reduced dimension.

  4. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    NASA Technical Reports Server (NTRS)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  5. Resource integration and shared outcomes at the watershed scale

    Treesearch

    Eleanor S. Towns

    2000-01-01

    Shared resources are universal resources that are vital for sustaining communities, enhancing our quality of life and preserving ecosystem health. We have a shared responsibility to conserve shared resources and preserve their integrity for future generations. Resource integration is accomplished through ecosystem management, often at a watershed scale. The shared...

  6. Axiope tools for data management and data sharing.

    PubMed

    Goddard, Nigel H; Cannon, Robert C; Howell, Fred W

    2003-01-01

    Many areas of biological research generate large volumes of very diverse data. Managing this data can be a difficult and time-consuming process, particularly in an academic environment where there are very limited resources for IT support staff such as database administrators. The most economical and efficient solutions are those that enable scientists with minimal IT expertise to control and operate their own desktop systems. Axiope provides one such solution, Catalyzer, which acts as a flexible cataloging system for creating structured records describing digital resources. The user is able to specify both the content and structure of the information included in the catalog. Information and resources can be shared by a variety of means, including automatically generated sets of web pages. Federation and integration of this information, where needed, is handled by Axiope's Mercat server. Where there is a need for standardization or compatibility of the structures used by different researchers, this can be achieved later by applying user-defined mappings in Mercat. In this way, large-scale data sharing can be achieved without imposing unnecessary constraints or interfering with the way in which individual scientists choose to record and catalog their work. We summarize the key technical issues involved in scientific data management and data sharing, describe the main features and functionality of Axiope Catalyzer and Axiope Mercat, and discuss future directions and requirements for an information infrastructure to support large-scale data sharing and scientific collaboration.

  7. Enterprise tools to promote interoperability: MonitoringResources.org supports design and documentation of large-scale, long-term monitoring programs

    NASA Astrophysics Data System (ADS)

    Weltzin, J. F.; Scully, R. A.; Bayer, J.

    2016-12-01

    Individual natural resource monitoring programs have evolved in response to different organizational mandates, jurisdictional needs, issues and questions. We are establishing a collaborative forum for large-scale, long-term monitoring programs to identify opportunities where collaboration could yield efficiency in monitoring design, implementation, analyses, and data sharing. We anticipate these monitoring programs will have similar requirements - e.g. survey design, standardization of protocols and methods, information management and delivery - that could be met by enterprise tools to promote sustainability, efficiency and interoperability of information across geopolitical boundaries or organizational cultures. MonitoringResources.org, a project of the Pacific Northwest Aquatic Monitoring Partnership, provides an on-line suite of enterprise tools focused on aquatic systems in the Pacific Northwest Region of the United States. We will leverage and expand this existing capacity to support continental-scale monitoring of both aquatic and terrestrial systems. The current stakeholder group is focused on programs led by bureaus within the Department of the Interior, but the tools will be readily and freely available to a broad variety of other stakeholders. Here, we report the results of two initial stakeholder workshops focused on (1) establishing a collaborative forum of large-scale monitoring programs, (2) identifying and prioritizing shared needs, (3) evaluating existing enterprise resources, (4) defining priorities for development of enhanced capacity for MonitoringResources.org, and (5) identifying a small number of pilot projects that can be used to define and test development requirements for specific monitoring programs.

  8. Application of cooperative and non-cooperative games in large-scale water quantity and quality management: a case study.

    PubMed

    Mahjouri, Najmeh; Ardestani, Mojtaba

    2011-01-01

    In this paper, two cooperative and non-cooperative methodologies are developed for a large-scale water allocation problem in Southern Iran. The water shares of the water users and their net benefits are determined using optimization models having economic objectives with respect to the physical and environmental constraints of the system. The results of the two methodologies are compared based on the total obtained economic benefit, and the role of cooperation in utilizing a shared water resource is demonstrated. In both cases, the water quality in rivers satisfies the standards. Comparing the results of the two mentioned approaches shows the importance of acting cooperatively to achieve maximum revenue in utilizing a surface water resource while the river water quantity and quality issues are addressed.
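
    For readers unfamiliar with this kind of formulation, a toy cooperative allocation can be written as a small linear program. The sketch below is not the authors' model; the benefit coefficients, pollutant loads, supply limit, and quality cap are hypothetical placeholders.

        # A minimal sketch (not the paper's model): cooperative allocation of a
        # shared water supply as a linear program, maximizing total net benefit
        # subject to a supply limit and a simple aggregate pollutant-load cap.
        import numpy as np
        from scipy.optimize import linprog

        benefit = np.array([3.0, 2.0, 4.0])   # net benefit per unit of water, per user (assumed)
        load = np.array([0.5, 0.2, 0.8])      # pollutant load per unit of water (assumed)
        supply, load_cap = 100.0, 45.0        # shared supply and water-quality cap (assumed)
        demand = [(0, 60), (0, 50), (0, 40)]  # per-user demand bounds (assumed)

        # linprog minimizes, so negate the benefit vector to maximize total benefit.
        res = linprog(c=-benefit,
                      A_ub=np.vstack([np.ones(3), load]),
                      b_ub=[supply, load_cap],
                      bounds=demand, method="highs")
        print("allocations:", res.x, "total net benefit:", -res.fun)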

  9. A distributed computing approach to mission operations support [for spacecraft]

    NASA Technical Reports Server (NTRS)

    Larsen, R. L.

    1975-01-01

    Computing mission operation support includes orbit determination, attitude processing, maneuver computation, resource scheduling, etc. The large-scale third-generation distributed computer network discussed is capable of fulfilling these dynamic requirements. It is shown that distribution of resources and control leads to increased reliability, and exhibits potential for incremental growth. Through functional specialization, a distributed system may be tuned to very specific operational requirements. Fundamental to the approach is the notion of process-to-process communication, which is effected through a high-bandwidth communications network. Both resource-sharing and load-sharing may be realized in the system.

  10. Resource Management for Distributed Parallel Systems

    NASA Technical Reports Server (NTRS)

    Neuman, B. Clifford; Rao, Santosh

    1993-01-01

    Multiprocessor systems should exist in the larger context of distributed systems, allowing multiprocessor resources to be shared by those that need them. Unfortunately, typical multiprocessor resource management techniques do not scale to large networks. The Prospero Resource Manager (PRM) is a scalable resource allocation system that supports the allocation of processing resources in large networks and multiprocessor systems. To manage resources in such distributed parallel systems, PRM employs three types of managers: system managers, job managers, and node managers. There exist multiple independent instances of each type of manager, reducing bottlenecks. The complexity of each manager is further reduced because each is designed to utilize information at an appropriate level of abstraction.

  11. Value-focused framework for defining landscape-scale conservation targets

    USGS Publications Warehouse

    Romañach, Stephanie; Benscoter, Allison M.; Brandt, Laura A.

    2016-01-01

    Conservation of natural resources can be challenging in a rapidly changing world and require collaborative efforts for success. Conservation planning is the process of deciding how to protect, conserve, and enhance or minimize loss of natural and cultural resources. Establishing conservation targets (also called indicators or endpoints), the measurable expressions of desired resource conditions, can help with site-specific up to landscape-scale conservation planning. Using conservation targets and tracking them through time can deliver benefits such as insight into ecosystem health and providing early warnings about undesirable trends. We describe an approach using value-focused thinking to develop statewide conservation targets for Florida. Using such an approach allowed us to first identify stakeholder objectives and then define conservation targets to meet those objectives. Stakeholders were able to see how their shared efforts fit into the broader conservation context, and also anticipate the benefits of multi-agency and -organization collaboration. We developed an iterative process for large-scale conservation planning that included defining a shared framework for the process, defining the conservation targets themselves, as well as developing management and monitoring strategies for evaluation of their effectiveness. The process we describe is applicable to other geographies where multiple parties are seeking to implement collaborative, large-scale biological planning.

  12. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme, which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments, and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high-scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  13. Globus | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.

  14. Is Attentional Resource Allocation Across Sensory Modalities Task-Dependent?

    PubMed

    Wahn, Basil; König, Peter

    2017-01-01

    Human information processing is limited by attentional resources. That is, via attentional mechanisms, humans select a limited amount of sensory input to process while other sensory input is neglected. In multisensory research, a matter of ongoing debate is whether there are distinct pools of attentional resources for each sensory modality or whether attentional resources are shared across sensory modalities. Recent studies have suggested that attentional resource allocation across sensory modalities is in part task-dependent. That is, the recruitment of attentional resources across the sensory modalities depends on whether processing involves object-based attention (e.g., the discrimination of stimulus attributes) or spatial attention (e.g., the localization of stimuli). In the present paper, we review findings in multisensory research related to this view. For the visual and auditory sensory modalities, findings suggest that distinct resources are recruited when humans perform object-based attention tasks, whereas for the visual and tactile sensory modalities, partially shared resources are recruited. If object-based attention tasks are time-critical, shared resources are recruited across the sensory modalities. When humans perform an object-based attention task in combination with a spatial attention task, partly shared resources are recruited across the sensory modalities as well. Conversely, for spatial attention tasks, attentional processing does consistently involve shared attentional resources for the sensory modalities. Generally, findings suggest that the attentional system flexibly allocates attentional resources depending on task demands. We propose that such flexibility reflects a large-scale optimization strategy that minimizes the brain's costly resource expenditures and simultaneously maximizes capability to process currently relevant information.

  15. Science Information System in Japan. NIER Occasional Paper 02/83.

    ERIC Educational Resources Information Center

    Matsumura, Tamiko

    This paper describes the development of a proposed Japanese Science Information System (SIS), a nationwide network of research and academic libraries, large-scale computer centers, national research institutes, and other organizations, to be formed for the purpose of sharing information and resources in the natural sciences, technology, the…

  16. A Case for Data Commons

    PubMed Central

    Grossman, Robert L.; Heath, Allison; Murphy, Mark; Patterson, Maria; Wells, Walt

    2017-01-01

    Data commons collocate data, storage, and computing infrastructure with core services and commonly used tools and applications for managing, analyzing, and sharing data to create an interoperable resource for the research community. An architecture for data commons is described, as well as some lessons learned from operating several large-scale data commons. PMID:29033693

  17. Strategies for Energy Efficient Resource Management of Hybrid Programming Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Dong; Supinski, Bronis de; Schulz, Martin

    2013-01-01

    Many scientific applications are programmed using hybrid programming models that use both message-passing and shared-memory, due to the increasing prevalence of large-scale systems with multicore, multisocket nodes. Previous work has shown that energy efficiency can be improved using software-controlled execution schemes that consider both the programming model and the power-aware execution capabilities of the system. However, such approaches have focused on identifying optimal resource utilization for one programming model, either shared-memory or message-passing, in isolation. The potential solution space, thus the challenge, increases substantially when optimizing hybrid models since the possible resource configurations increase exponentially. Nonetheless, with the accelerating adoption of hybrid programming models, we increasingly need improved energy efficiency in hybrid parallel applications on large-scale systems. In this work, we present new software-controlled execution schemes that consider the effects of dynamic concurrency throttling (DCT) and dynamic voltage and frequency scaling (DVFS) in the context of hybrid programming models. Specifically, we present predictive models and novel algorithms based on statistical analysis that anticipate application power and time requirements under different concurrency and frequency configurations. We apply our models and methods to the NPB MZ benchmarks and selected applications from the ASC Sequoia codes. Overall, we achieve substantial energy savings (8.74% on average and up to 13.8%) with some performance gain (up to 7.5%) or negligible performance loss.
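
    The general shape of such a scheme can be sketched as a search over concurrency and frequency configurations. The snippet below only illustrates the idea and is not the paper's statistical models; predict_time and predict_power are hypothetical stand-ins for fitted predictors.

        # A minimal sketch of DCT+DVFS configuration selection: enumerate
        # (thread count, frequency) pairs, predict run time and power with
        # user-supplied models, and keep the lowest-energy configuration.
        from itertools import product

        def predict_time(threads: int, freq_ghz: float) -> float:
            # assumed toy model: imperfect scaling with threads, inverse in frequency
            return 100.0 / (threads ** 0.8) / freq_ghz

        def predict_power(threads: int, freq_ghz: float) -> float:
            # assumed toy model: dynamic power grows with threads and ~frequency^3
            return 20.0 + 5.0 * threads * freq_ghz ** 3

        def best_config(thread_counts, freqs):
            best = None
            for threads, freq in product(thread_counts, freqs):
                t, p = predict_time(threads, freq), predict_power(threads, freq)
                energy = t * p  # joules = seconds * watts
                if best is None or energy < best[0]:
                    best = (energy, threads, freq)
            return best

        print(best_config([4, 8, 16], [1.2, 1.8, 2.4]))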

  18. Megatux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-25

    The Megatux platform enables the emulation of large scale (multi-million node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows for multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware, but run actual software enabling large scale without sacrificing fidelity.

  19. Bridging Hydroinformatics Services Between HydroShare and SWATShare

    NASA Astrophysics Data System (ADS)

    Merwade, V.; Zhao, L.; Song, C. X.; Tarboton, D. G.; Goodall, J. L.; Stealey, M.; Rajib, A.; Morsy, M. M.; Dash, P. K.; Miles, B.; Kim, I. L.

    2016-12-01

    Many cyberinfrastructure systems in the hydrologic and related domains have emerged in the past decade, with more being developed to address various data management and modeling needs. Although clearly beneficial to the broad user community, it is a challenging task to build interoperability across these systems due to various obstacles including technological, organizational, semantic, and social issues. This work presents our experience in developing interoperability between two hydrologic cyberinfrastructure systems - SWATShare and HydroShare. HydroShare is a large-scale online system aimed at enabling the hydrologic user community to share data, models, and analyses online for solving complex hydrologic research questions. SWATShare, by contrast, is a focused effort to allow SWAT (Soil and Water Assessment Tool) modelers to share, execute, and analyze SWAT models using high-performance computing resources. Making these two systems interoperable required common sign-in through OAuth, sharing of models through common metadata standards, and use of standard web services for implementing key import/export functionalities. As a result, users from either community can leverage the resources and services across these systems without having to manually import, export, or process their models. Overall, this use case can serve as a model for interoperability among other systems, as no one system can provide all the functionality needed to address large interdisciplinary problems.

  20. A Stream Tilling Approach to Surface Area Estimation for Large Scale Spatial Data in a Shared Memory System

    NASA Astrophysics Data System (ADS)

    Liu, Jiping; Kang, Xiaochen; Dong, Chun; Xu, Shenghua

    2017-12-01

    Surface area estimation is a widely used tool for resource evaluation in the physical world. When processing large-scale spatial data, input/output (I/O) can easily become the bottleneck in parallelizing the algorithm due to limited physical memory resources and very slow disk transfer rates. In this paper, we proposed a stream tilling approach to surface area estimation that first decomposed a spatial data set into tiles with topological expansions. With these tiles, the one-to-one mapping relationship between the input and the computing process was broken. Then, we realized a streaming framework for scheduling the I/O processes and computing units. Herein, each computing unit encapsulated an identical copy of the estimation algorithm, and multiple asynchronous computing units could work individually in parallel. Finally, the experiment demonstrated that our stream tilling estimation can efficiently alleviate the heavy pressure from I/O-bound work, and the measured speedup after optimization greatly outperformed the directly parallel versions in shared-memory systems with multi-core processors.
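
    The decoupling of input from computation described here follows the familiar producer/consumer pattern. The sketch below is not the paper's implementation; the tile contents and the area estimator are hypothetical placeholders, but it shows how a bounded queue lets a slow I/O stream overlap with CPU-bound estimation.

        # A minimal sketch: one reader thread streams tiles into a bounded queue
        # while several identical workers estimate each tile independently.
        import queue
        import threading

        tiles = queue.Queue(maxsize=8)          # bounded queue decouples I/O from compute
        total, lock = [0.0], threading.Lock()

        def reader(n_tiles):
            for i in range(n_tiles):
                fake_tile = [[float(i + r + c) for c in range(4)] for r in range(4)]
                tiles.put(fake_tile)            # blocks when the queue is full
            tiles.put(None)                     # sentinel: no more tiles

        def worker():
            while True:
                tile = tiles.get()
                if tile is None:
                    tiles.put(None)             # pass the sentinel on to the other workers
                    return
                area = sum(sum(row) for row in tile) * 0.001  # stand-in estimator
                with lock:
                    total[0] += area

        threads = [threading.Thread(target=reader, args=(32,))]
        threads += [threading.Thread(target=worker) for _ in range(4)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        print("estimated surface area:", total[0])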

  1. GIGGLE: a search engine for large-scale integrated genome analysis.

    PubMed

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-02-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation.

  2. GIGGLE: a search engine for large-scale integrated genome analysis

    PubMed Central

    Layer, Ryan M; Pedersen, Brent S; DiSera, Tonya; Marth, Gabor T; Gertz, Jason; Quinlan, Aaron R

    2018-01-01

    GIGGLE is a genomics search engine that identifies and ranks the significance of genomic loci shared between query features and thousands of genome interval files. GIGGLE (https://github.com/ryanlayer/giggle) scales to billions of intervals and is over three orders of magnitude faster than existing methods. Its speed extends the accessibility and utility of resources such as ENCODE, Roadmap Epigenomics, and GTEx by facilitating data integration and hypothesis generation. PMID:29309061
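
    The core query GIGGLE answers can be illustrated with a naive overlap count. The sketch below is not GIGGLE's indexing scheme (which is what makes it fast and scalable); the file names and intervals are made-up in-memory examples.

        # A minimal illustration of ranking interval files by how many of their
        # intervals overlap a set of query features.
        def overlaps(a, b):
            return a[0] < b[1] and b[0] < a[1]          # half-open interval overlap test

        interval_files = {
            "enhancers.bed": [(100, 200), (400, 500)],  # hypothetical toy data
            "tfbs.bed":      [(150, 160), (800, 900), (410, 420)],
        }
        query = [(140, 450)]

        counts = {
            name: sum(overlaps(q, iv) for q in query for iv in ivs)
            for name, ivs in interval_files.items()
        }
        print(sorted(counts.items(), key=lambda kv: -kv[1]))  # rank files by shared loci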

  3. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    PubMed

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications.

  4. Data Use Agreement | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    CPTAC requests that data users abide by the same principles that were previously established in the Fort Lauderdale and Amsterdam meetings. The recommendations from the Fort Lauderdale meeting (2003) on best practices and principles for sharing large-scale genomic data address the roles and responsibilities of data producers, data users and funders of community resource projects.

  5. Renewable Energy Zone (REZ) Transmission Planning Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Nathan

    A REZ is a geographical area that enables the development of profitable, cost-effective, grid-connected renewable energy (RE). The REZ Transmission Planning Process is a proactive approach to plan, approve, and build transmission infrastructure connecting REZs to the power system. It helps increase the share of solar, wind, and other RE resources in the power system while maintaining reliability and economics, and it focuses on large-scale wind and solar resources that can be developed in sufficient quantities to warrant transmission system expansion and upgrades.

  6. Studying interregional wildland fire engine assignments for large fire suppression

    Treesearch

    Erin J. Belval; Yu Wei; David E. Calkin; Crystal S. Stonesifer; Matthew P. Thompson; John R. Tipton

    2017-01-01

    One crucial component of large fire response in the United States (US) is the sharing of wildland firefighting resources between regions: resources from regions experiencing low fire activity supplement resources in regions experiencing high fire activity. An important step towards improving the efficiency of resource sharing and related policies is to develop a better...

  7. CloudMan as a platform for tool, data, and analysis distribution.

    PubMed

    Afgan, Enis; Chapman, Brad; Taylor, James

    2012-11-27

    Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.

  8. The economics of time shared computing: Congestion, user costs and capacity

    NASA Technical Reports Server (NTRS)

    Agnew, C. E.

    1982-01-01

    Time shared systems permit the fixed costs of computing resources to be spread over large numbers of users. However, bottleneck results in the theory of closed queueing networks can be used to show that this economy of scale will be offset by the increased congestion that results as more users are added to the system. If one considers the total costs, including the congestion cost, there is an optimal number of users for a system which equals the saturation value usually used to define system capacity.
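
    The bottleneck argument the abstract refers to can be written compactly in standard operational-analysis notation (the notation here is a sketch, not the paper's): with N users, think time Z, total service demand D = sum_i D_i across the system's resources, and bottleneck demand D_max = max_i D_i,

        \[
          X(N) \;\le\; \min\!\left(\frac{N}{D + Z},\ \frac{1}{D_{\max}}\right),
          \qquad
          N^{*} \;=\; \frac{D + Z}{D_{\max}} .
        \]

    Below the saturation point N*, the fixed cost is spread over more users with little added delay; beyond it, each extra user mostly adds queueing at the bottleneck, which is the congestion cost that offsets the economy of scale.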

  9. Towards Sustainability and Scalability of Educational Innovations in Hydrology:What is the Value and who is the Customer?

    NASA Astrophysics Data System (ADS)

    Deshotel, M.; Habib, E. H.

    2016-12-01

    There is an increasing desire by the water education community to use emerging research resources and technological advances in order to reform current educational practices. Recent years have witnessed some exemplary developments that tap into emerging hydrologic modeling and data sharing resources, innovative digital and visualization technologies, and field experiences. However, such attempts remain largely at the scale of individual efforts and fall short of delivering scalable and sustainable solutions. This can be attributed to a number of reasons, such as inadequate experience with modeling and data-based educational developments, lack of faculty time to invest in further developments, and lack of resources to further support the project. Another important but often-overlooked reason is the lack of adequate insight into the actual needs of end users of such developments. Such insight is critical for informing how to scale and sustain educational innovations. In this presentation, we share with the hydrologic community experiences gathered from an ongoing experiment where the authors engaged in a hypothesis-driven, customer-discovery process to inform the scalability and sustainability of educational innovations in the field of hydrology and water resources education. The experiment is part of a program called Innovation Corps for Learning (I-Corps L). This program follows a business model approach where a value proposition is initially formulated for the educational innovation. The authors then engaged in a hypothesis-validation process through an intense series of customer interviews with different segments of potential end users, including junior/senior students, student interns, and hydrology professors. The authors also sought insight from engineering firms by interviewing junior engineers and their supervisors to gather feedback on the preparedness of graduating engineers as they enter the workforce in the area of water resources. Exploring the large landscape of potential users is critical in formulating a user-driven approach that can inform the innovation development. The presentation shares the results of this experiment and the insight gained, and discusses how such information can inform the community on sustaining and scaling hydrology educational developments.

  10. Enabling large-scale next-generation sequence assembly with Blacklight

    PubMed Central

    Couger, M. Brian; Pipes, Lenore; Squina, Fabio; Prade, Rolf; Siepel, Adam; Palermo, Robert; Katze, Michael G.; Mason, Christopher E.; Blood, Philip D.

    2014-01-01

    Summary: A variety of extremely challenging biological sequence analyses were conducted on the XSEDE large shared memory resource Blacklight, using current bioinformatics tools and encompassing a wide range of scientific applications. These include genomic sequence assembly, very large metagenomic sequence assembly, transcriptome assembly, and sequencing error correction. The data sets used in these analyses included uncategorized fungal species, reference microbial data, very large soil and human gut microbiome sequence data, and primate transcriptomes, composed of both short-read and long-read sequence data. A new parallel command execution program was developed on the Blacklight resource to handle some of these analyses. These results, initially reported at XSEDE13 and expanded here, represent significant advances for their respective scientific communities. The breadth and depth of the results achieved demonstrate the ease of use, versatility, and unique capabilities of the Blacklight XSEDE resource for scientific analysis of genomic and transcriptomic sequence data, and the power of these resources, together with XSEDE support, in meeting the most challenging scientific problems. PMID:25294974

  11. vPELS: An E-Learning Social Environment for VLSI Design with Content Security Using DRM

    ERIC Educational Resources Information Center

    Dewan, Jahangir; Chowdhury, Morshed; Batten, Lynn

    2014-01-01

    This article provides a proposal for a personal e-learning system (vPELS, where "v" stands for VLSI: very large scale integrated circuit) architecture in the context of a social network environment for VLSI Design. The main objective of vPELS is to develop individual skills on a specific subject--say, VLSI--and share resources with peers.…

  12. Hierarchical Data Distribution Scheme for Peer-to-Peer Networks

    NASA Astrophysics Data System (ADS)

    Bhushan, Shashi; Dave, M.; Patel, R. B.

    2010-11-01

    In the past few years, peer-to-peer (P2P) networks have become an extremely popular mechanism for large-scale content sharing. P2P systems have focused on specific application domains (e.g. music files, video files) or on providing file-system-like capabilities. P2P is a powerful paradigm, which provides a large-scale and cost-effective mechanism for data sharing. A P2P system may also be used for storing data globally, but successful implementation of conventional databases on P2P systems is yet to be reported. In this paper we present a mathematical model for the replication of partitions and a hierarchical data distribution scheme for P2P networks. We also analyze the resource utilization and throughput of the P2P system with respect to availability when a conventional database is implemented over the P2P system with a variable query rate. Simulation results show that database partitions placed on peers with a higher availability factor perform better. Degradation index, throughput, and resource utilization are the parameters evaluated with respect to the availability factor.
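
    The availability-aware placement idea can be sketched very simply. The snippet below is not the paper's model; peer availability factors, the replica count, and the round-robin placement rule are hypothetical.

        # A minimal sketch: place each partition's replicas on the most-available
        # peers and report the probability that at least one replica is reachable.
        import math

        peers = {"p1": 0.99, "p2": 0.60, "p3": 0.95, "p4": 0.75, "p5": 0.90}  # assumed availability
        partitions, replicas = ["part0", "part1", "part2"], 2

        ranked = sorted(peers, key=peers.get, reverse=True)   # most-available peers first
        placement = {p: [ranked[(2 * i + k) % len(ranked)] for k in range(replicas)]
                     for i, p in enumerate(partitions)}

        for part, hosts in placement.items():
            # partition is reachable if at least one replica's peer is up
            avail = 1 - math.prod(1 - peers[h] for h in hosts)
            print(part, hosts, round(avail, 4))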

  13. The semantic web in translational medicine: current applications and future directions

    PubMed Central

    Machado, Catia M.; Rebholz-Schuhmann, Dietrich; Freitas, Ana T.; Couto, Francisco M.

    2015-01-01

    Semantic web technologies offer an approach to data integration and sharing, even for resources developed independently or broadly distributed across the web. This approach is particularly suitable for scientific domains that profit from large amounts of data that reside in the public domain and that have to be exploited in combination. Translational medicine is such a domain, which in addition has to integrate private data from the clinical domain with proprietary data from the pharmaceutical domain. In this survey, we present the results of our analysis of translational medicine solutions that follow a semantic web approach. We assessed these solutions in terms of their target medical use case; the resources covered to achieve their objectives; and their use of existing semantic web resources for the purposes of data sharing, data interoperability and knowledge discovery. The semantic web technologies seem to fulfill their role in facilitating the integration and exploration of data from disparate sources, but it is also clear that simply using them is not enough. It is fundamental to reuse resources, to define mappings between resources, to share data and knowledge. All these aspects allow the instantiation of translational medicine at the semantic web-scale, thus resulting in a network of solutions that can share resources for a faster transfer of new scientific results into the clinical practice. The envisioned network of translational medicine solutions is on its way, but it still requires resolving the challenges of sharing protected data and of integrating semantic-driven technologies into the clinical practice. PMID:24197933

  14. The semantic web in translational medicine: current applications and future directions.

    PubMed

    Machado, Catia M; Rebholz-Schuhmann, Dietrich; Freitas, Ana T; Couto, Francisco M

    2015-01-01

    Semantic web technologies offer an approach to data integration and sharing, even for resources developed independently or broadly distributed across the web. This approach is particularly suitable for scientific domains that profit from large amounts of data that reside in the public domain and that have to be exploited in combination. Translational medicine is such a domain, which in addition has to integrate private data from the clinical domain with proprietary data from the pharmaceutical domain. In this survey, we present the results of our analysis of translational medicine solutions that follow a semantic web approach. We assessed these solutions in terms of their target medical use case; the resources covered to achieve their objectives; and their use of existing semantic web resources for the purposes of data sharing, data interoperability and knowledge discovery. The semantic web technologies seem to fulfill their role in facilitating the integration and exploration of data from disparate sources, but it is also clear that simply using them is not enough. It is fundamental to reuse resources, to define mappings between resources, to share data and knowledge. All these aspects allow the instantiation of translational medicine at the semantic web-scale, thus resulting in a network of solutions that can share resources for a faster transfer of new scientific results into the clinical practice. The envisioned network of translational medicine solutions is on its way, but it still requires resolving the challenges of sharing protected data and of integrating semantic-driven technologies into the clinical practice.

  15. Integrating Cloud-Computing-Specific Model into Aircraft Design

    NASA Astrophysics Data System (ADS)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. In the paper, the new categories of services introduced are expected to slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale, expensive software such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  16. Distributed intrusion detection system based on grid security model

    NASA Astrophysics Data System (ADS)

    Su, Jie; Liu, Yahui

    2008-03-01

    Grid computing has developed rapidly with the development of network technology, and it can solve the problem of large-scale complex computing by sharing large-scale computing resources. In a grid environment, we can realize a distributed and load-balanced intrusion detection system. This paper first discusses the security mechanism in grid computing and the function of PKI/CA in the grid security system, then gives the application of grid computing characteristics in a distributed intrusion detection system (IDS) based on an Artificial Immune System. Finally, it gives a distributed intrusion detection system based on the grid security system that can reduce processing delay and assure detection rates.

  17. Job Management and Task Bundling

    NASA Astrophysics Data System (ADS)

    Berkowitz, Evan; Jansen, Gustav R.; McElvain, Kenneth; Walker-Loud, André

    2018-03-01

    High Performance Computing is often performed on scarce and shared computing resources. To ensure computers are used to their full capacity, administrators often incentivize large workloads that are not possible on smaller systems. Measurements in Lattice QCD frequently do not scale to machine-size workloads. By bundling tasks together we can create large jobs suitable for gigantic partitions. We discuss METAQ and mpi_jm, software developed to dynamically group computational tasks together, that can intelligently backfill to consume idle time without substantial changes to users' current workflows or executables.
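
    The bundling idea described here can be sketched as a simple packing problem. The snippet below is not METAQ or mpi_jm themselves; the task sizes and the greedy largest-first rule are hypothetical placeholders for illustration.

        # A minimal sketch: pack many small, independent tasks into one large
        # allocation of `total_nodes` nodes, backfilling idle nodes with the
        # largest tasks that still fit.
        def bundle(tasks, total_nodes):
            """Return the task sizes packed into one machine-size job and the idle remainder."""
            chosen, free = [], total_nodes
            for size in sorted(tasks, reverse=True):  # largest-first greedy packing
                if size <= free:                      # backfill only if it fits in the idle nodes
                    chosen.append(size)
                    free -= size
            return chosen, free

        tasks = [512, 256, 256, 128, 64, 64, 32, 16]  # node counts of individual measurements (assumed)
        chosen, idle = bundle(tasks, total_nodes=1024)
        print("bundled tasks:", chosen, "idle nodes:", idle)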

  18. Data sharing and intellectual property in a genomic epidemiology network: policies for large-scale research collaboration.

    PubMed Central

    Chokshi, Dave A.; Parker, Michael; Kwiatkowski, Dominic P.

    2006-01-01

    Genomic epidemiology is a field of research that seeks to improve the prevention and management of common diseases through an understanding of their molecular origins. It involves studying thousands of individuals, often from different populations, with exacting techniques. The scale and complexity of such research has required the formation of research consortia. Members of these consortia need to agree on policies for managing shared resources and handling genetic data. Here we consider data-sharing and intellectual property policies for an international research consortium working on the genomic epidemiology of malaria. We outline specific guidelines governing how samples and data are transferred among its members; how results are released into the public domain; when to seek protection for intellectual property; and how intellectual property should be managed. We outline some pragmatic solutions founded on the basic principles of promoting innovation and access. PMID:16710548

  19. CloudMan as a platform for tool, data, and analysis distribution

    PubMed Central

    2012-01-01

    Background: Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results: CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions: With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions. PMID:23181507

  20. Application of the EVEX resource to event extraction and network construction: Shared Task entry and result analysis

    PubMed Central

    2015-01-01

    Background: Modern methods for mining biomolecular interactions from literature typically make predictions based solely on the immediate textual context, in effect a single sentence. No prior work has been published on extending this context to the information automatically gathered from the whole biomedical literature. Thus, our motivation for this study is to explore whether mutually supporting evidence, aggregated across several documents, can be utilized to improve the performance of state-of-the-art event extraction systems. In this paper, we describe our participation in the latest BioNLP Shared Task using the large-scale text mining resource EVEX. We participated in the Genia Event Extraction (GE) and Gene Regulation Network (GRN) tasks with two separate systems. In the GE task, we implemented a re-ranking approach to improve the precision of an existing event extraction system, incorporating features from the EVEX resource. In the GRN task, our system relied solely on the EVEX resource and utilized a rule-based conversion algorithm between the EVEX and GRN formats. Results: In the GE task, our re-ranking approach led to a modest performance increase and resulted in the first rank of the official Shared Task results with 50.97% F-score. Additionally, in this paper we explore and evaluate the usage of distributed vector representations for this challenge. In the GRN task, we ranked fifth in the official results with a strict/relaxed SER score of 0.92/0.81, respectively. To try and improve upon these results, we have implemented a novel machine-learning-based conversion system and benchmarked its performance against the original rule-based system. Conclusions: For the GRN task, we were able to produce a gene regulatory network from the EVEX data, warranting the use of such generic large-scale text mining data in network biology settings. A detailed performance and error analysis provides more insight into the relatively low recall rates. In the GE task we demonstrate that both the re-ranking approach and the word vectors can provide slight performance improvement. A manual evaluation of the re-ranking results pinpoints some of the challenges faced in applying large-scale text mining knowledge to event extraction. PMID:26551766

  1. Being Sticker Rich: Numerical Context Influences Children’s Sharing Behavior

    PubMed Central

    Posid, Tasha; Fazio, Allyse; Cordes, Sara

    2015-01-01

    Young children spontaneously share resources with anonymous recipients, but little is known about the specific circumstances that promote or hinder these prosocial tendencies. Children (ages 3–11) received a small (12) or large (30) number of stickers, and were then given the opportunity to share their windfall with either one or multiple anonymous recipients (Dictator Game). Whether a child chose to share or not varied as a function of age, but was uninfluenced by numerical context. Moreover, children’s giving was consistent with a proportion-based account, such that children typically donated a similar proportion (but different absolute number) of the resources given to them, regardless of whether they originally received a small or large windfall. The proportion of resources donated, however, did vary based on the number of recipients with whom they were allowed to share, such that on average, children shared more when there were more recipients available, particularly when they had more resources, suggesting they take others into consideration when making prosocial decisions. Finally, results indicated that a child’s gender also predicted sharing behavior, with males generally sharing more resources than females. Together, findings suggest that the numerical contexts under which children are asked to share, as well as the quantity of resources that they have to share, may interact to promote (or hinder) altruistic behaviors throughout childhood. PMID:26535900

  2. Decentralized Opportunistic Spectrum Resources Access Model and Algorithm toward Cooperative Ad-Hoc Networks.

    PubMed

    Liu, Ming; Xu, Yang; Mohammed, Abdul-Wahid

    2016-01-01

    Limited communication resources have gradually become a critical factor limiting the efficiency of decentralized large-scale multi-agent coordination as the system scales up and tasks become more complex. In current research, due to an agent's limited communication and observational capability, an agent in a decentralized setting can only choose a subset of channels to access, and cannot perceive or share global information. Each agent's cooperative decision is based on a partial observation of the system state, and as such, uncertainty in the communication network is unavoidable. In this situation, working out cooperative decisions under uncertainty with only a partial observation of the environment is a major challenge. In this paper, we propose a decentralized approach that allows agents to cooperatively search for and independently choose channels. The key to our design is to build an up-to-date observation of each agent's view so that a local decision model is achievable in large-scale team coordination. We simplify the Dec-POMDP model, and each agent can jointly work out its communication policy in order to improve its local decision utilities for the choice of communication resources. Finally, we discuss an implicit resource competition game and show that there exists an approximate resource-access tradeoff balance between agents. Based on this discovery, the tradeoff between real-time decision-making and the efficiency of cooperation using these channels can be improved.

  3. Database of significant deposits of gold, silver, copper, lead, and zinc in the United States

    USGS Publications Warehouse

    Long, Keith R.; DeYoung,, John H.; Ludington, Stephen

    1998-01-01

    It has long been recognized that the largest mineral deposits contain most of the known mineral endowment (Singer and DeYoung, 1980). Sometimes called giant or world-class deposits, these largest deposits account for a very large share of historic and current mineral production and resources in industrial society (Singer, 1995). For example, Singer (1995) shows that the largest 10 percent of the world’s gold deposits contain 86 percent of the gold discovered to date. Many mineral resource issues and investigations are more easily addressed if limited to the relatively small number of deposits that contain most of the known mineral resources. An estimate of known resources using just these deposits would normally be sufficient, because considering smaller deposits would not add significantly to the total estimate. Land-use planning should deal mainly with these deposits due to their relative scarcity, the large share of known resources they contain, and the fact that economies of scale allow minerals to be produced much more cheaply from larger deposits. Investigation of environmental and other hazards that result from mining operations can be limited to these largest deposits because they account for most of past and current production. The National Mineral Resource Assessment project of the U.S. Geological Survey (USGS) has compiled a database on the largest known deposits of gold, silver, copper, lead, and zinc in the United States to complement the 1996 national assessment of undiscovered deposits of these same metals (Ludington and Cox, 1996). The deposits in this database account for approximately 99 percent of domestic production of these metals and probably a similar share of identified resources. These data may be compared with results of the assessment of undiscovered resources to characterize the nation’s total mineral endowment for these metals. This database is a starting point for any national or regional mineral-resource or mineral-environmental investigation.

  4. Large-scale quantum networks based on graphs

    NASA Astrophysics Data System (ADS)

    Epping, Michael; Kampermann, Hermann; Bruß, Dagmar

    2016-05-01

    Society relies and depends increasingly on information exchange and communication. In the quantum world, security and privacy is a built-in feature for information processing. The essential ingredient for exploiting these quantum advantages is the resource of entanglement, which can be shared between two or more parties. The distribution of entanglement over large distances constitutes a key challenge for current research and development. Due to losses of the transmitted quantum particles, which typically scale exponentially with the distance, intermediate quantum repeater stations are needed. Here we show how to generalise the quantum repeater concept to the multipartite case, by describing large-scale quantum networks, i.e. network nodes and their long-distance links, consistently in the language of graphs and graph states. This unifying approach comprises both the distribution of multipartite entanglement across the network, and the protection against errors via encoding. The correspondence to graph states also provides a tool for optimising the architecture of quantum networks.
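
    The loss scaling mentioned here can be made concrete with standard fiber figures (generic numbers, not taken from the paper): with attenuation alpha of roughly 0.2 dB/km, the probability of a photon surviving direct transmission over distance L is

        \[
          \eta(L) \;=\; 10^{-\alpha L/10} \;=\; e^{-L/L_{\mathrm{att}}},
          \qquad L_{\mathrm{att}} \approx 22\ \mathrm{km},
        \]

    so a 1000 km link succeeds with probability of order 10^-20, whereas splitting it into n repeater segments replaces this with n attempts at the much larger per-segment probability e^{-L/(n L_att)}. This is the basic motivation for the repeater stations, and for the graph-state networks that generalize them, discussed above.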

  5. Cross-Jurisdictional Resource Sharing in Changing Public Health Landscape: Contributory Factors and Theoretical Explanations.

    PubMed

    Shah, Gulzar H; Badana, Adrian N S; Robb, Claire; Livingood, William C

    2016-01-01

    Local health departments (LHDs) are striving to meet public health needs within their jurisdictions amidst fiscal constraints and a complex, dynamic environment. Resource sharing across jurisdictions is a critical opportunity for LHDs to continue to enhance effectiveness and increase efficiency. This research examines the extent of cross-jurisdictional resource sharing among LHDs, the programmatic areas and organizational functions for which LHDs share resources, and LHD characteristics associated with resource sharing. Data from the National Association of County & City Health Officials' 2013 National Profile of LHDs were used. Descriptive statistics and multinomial logistic regression were performed for the 5 implementation-oriented outcome variables of interest, with 3 levels of implementation. More than 54% of LHDs shared resources such as funding, staff, or equipment with 1 or more other LHDs on a continuous, recurring basis. Results from the multinomial regression analysis indicate that economies of scale (population size and metropolitan status) had significant positive influences (at P ≤ .05) on resource sharing. Engagement in accreditation, community health assessment, community health improvement planning, quality improvement, and use of the Community Guide were associated with lower levels of engagement in resource sharing. A doctoral degree held by the top executive and having 1 or more local boards of health carried a positive influence on resource sharing. Cross-jurisdictional resource sharing is a viable and commonly used process to overcome the challenges of new and emerging public health problems within the constraints of restricted budgets. LHDs, particularly smaller LHDs with limited resources, should consider increased resource sharing to address emerging challenges.

  6. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of millions of computers on the Internet and use them to run large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable visitors to volunteer their computing resources toward running advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes as small spatial and computational units. A relational database system is used for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
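
    The platform itself is written in JavaScript; purely as a language-neutral illustration, the sketch below (in Python, with assumed table and column names) shows the kind of relational task queue the abstract describes for handing small spatial work units to volunteer nodes and collecting their results.

      # Illustrative relational task queue for volunteer nodes; schema is an assumption.
      import sqlite3

      conn = sqlite3.connect("volunteer_queue.db")
      conn.execute("""CREATE TABLE IF NOT EXISTS tasks (
                        id INTEGER PRIMARY KEY,
                        tile TEXT,                 -- small spatial unit of the model domain
                        status TEXT DEFAULT 'pending',
                        result TEXT)""")

      def next_task():
          """Hand the next pending tile to a volunteer browser and mark it as running."""
          row = conn.execute(
              "SELECT id, tile FROM tasks WHERE status='pending' LIMIT 1").fetchone()
          if row:
              conn.execute("UPDATE tasks SET status='running' WHERE id=?", (row[0],))
              conn.commit()
          return row

      def store_result(task_id, result):
          """Record a result returned by a volunteer node."""
          conn.execute("UPDATE tasks SET status='done', result=? WHERE id=?",
                       (result, task_id))
          conn.commit()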

  7. The natural resources supply indexes study of the pig breeding scale in China

    NASA Astrophysics Data System (ADS)

    Leng, Bi-Bin; Zhang, Qi-Zhen; Ji, Xue-Qiang; Xu, Yue-Feng

    2017-08-01

    To address the pollution problem of large-scale pig breeding, we took three indexes as evaluation criteria: arable land per capita, water resources per capita, and per capita share of grain. SPSS was then used to synthesize the natural resources supply indexes of the pig breeding scale. The results show that, with the fast development of technology and the steady rise of grain production, the natural resources supply indexes of the pig breeding scale are rising constantly.

  8. Principles of cooperation across systems: from human sharing to multicellularity and cancer.

    PubMed

    Aktipis, Athena

    2016-01-01

    From cells to societies, several general principles arise again and again that facilitate cooperation and suppress conflict. In this study, I describe three general principles of cooperation and how they operate across systems including human sharing, cooperation in animal and insect societies and the massively large-scale cooperation that occurs in our multicellular bodies. The first principle is that of Walk Away: that cooperation is enhanced when individuals can leave uncooperative partners. The second principle is that resource sharing is often based on the need of the recipient (i.e., need-based transfers) rather than on strict account-keeping. And the last principle is that effective scaling up of cooperation requires increasingly sophisticated and costly cheater suppression mechanisms. By comparing how these principles operate across systems, we can better understand the constraints on cooperation. This can facilitate the discovery of novel ways to enhance cooperation and suppress cheating in its many forms, from social exploitation to cancer.

  9. Differentiation in Access to, and the Use and Sharing of (Open) Educational Resources among Students and Lecturers at Kenyan Universities

    ERIC Educational Resources Information Center

    Pete, Judith; Mulder, Fred; Neto, Jose Dutra Oliveira

    2017-01-01

    In order to obtain a fair "OER picture" for the Global South a large-scale study has been carried out for a series of countries, including Kenya. In this paper we report on the Kenya study, run at four universities that have been selected with randomly sampled students and lecturers. Empirical data have been generated by the use of a…

  10. Improving and integrating data on invasive species collected by citizen scientists

    USGS Publications Warehouse

    2010-01-01

    Limited resources make it difficult to effectively document, monitor, and control invasive species across large areas, resulting in large gaps in our knowledge of current and future invasion patterns. We surveyed 128 citizen science program coordinators and interviewed 15 of them to evaluate their potential role in filling these gaps. Many programs collect data on invasive species and are willing to contribute these data to public databases. Although resources for education and monitoring are readily available, groups generally lack tools to manage and analyze data. Potential users of these data also retain concerns over data quality. We discuss how to address these concerns about citizen scientist data and programs while preserving the advantages they afford. A unified yet flexible national citizen science program aimed at tracking invasive species location, abundance, and control efforts could be designed using centralized data sharing and management tools. Such a system could meet the needs of multiple stakeholders while allowing efficiencies of scale, greater standardization of methods, and improved data quality testing and sharing. Finally, we present a prototype for such a system (see www.citsci.org).

  11. Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.

    PubMed

    Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y

    2018-02-01

    The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software was at first a utilitarian interest; now it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains also motivates sharing of modeling resources, as blending of models developed by domain experts will be a required step for comprehensive simulation studies as well as the enhancement of their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community on the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate model sharing, and there are corresponding initiatives by the scientific journals. Outside the publishing enterprise, infrastructure to facilitate model sharing in biomechanics exists, and simulation software developers are interested in accommodating the community's needs for sharing of modeling resources. Encouragement for the use of standardized markups, concerns related to quality assurance, acknowledgement of increased burden, and the importance of stewardship of resources are noted. In the short term, it is advisable that the community build upon recent strategies and experiment with new pathways for continued demonstration of model sharing, its promotion, and its utility. Nonetheless, the need for a long-term strategy to unify approaches to sharing computational models and related resources is acknowledged. Development of a sustainable platform supported by a culture of open model sharing will likely evolve through continued and inclusive discussions bringing all stakeholders to the table, e.g., by possibly establishing a consortium.

  12. RabbitQR: fast and flexible big data processing at LSST data rates using existing, shared-use hardware

    NASA Astrophysics Data System (ADS)

    Kotulla, Ralf; Gopu, Arvind; Hayashi, Soichi

    2016-08-01

    Processing astronomical data to science readiness was and remains a challenge, in particular for multi-detector instruments such as wide-field imagers. One such instrument, the WIYN One Degree Imager, is available to the astronomical community at large and, in order to be scientifically useful to its varied user community on a short timescale, provides its users fully calibrated data in addition to the underlying raw data. However, time-efficient re-processing of the often large datasets with improved calibration data and/or software requires more than just a large number of CPU cores and disk space. This is particularly relevant if all computing resources are general purpose and shared with a large number of users in a typical university setup. Our approach to address this challenge is a flexible framework combining the best of both high-performance (large number of nodes, internal communication) and high-throughput (flexible/variable number of nodes, no dedicated hardware) computing. Based on the Advanced Message Queuing Protocol, we developed a Server-Manager-Worker framework. In addition to the server directing the workflow and the workers executing the actual work, the manager maintains a list of available workers, adds and/or removes individual workers from the worker pool, and reassigns workers to different tasks. This provides the flexibility of optimizing the worker pool for the current task and workload, improves load balancing, and makes the most efficient use of the available resources. We present performance benchmarks and scaling tests showing that, today and using existing, commodity shared-use hardware, we can process data with throughputs (including data reduction and calibration) approaching those expected in the early 2020s for future observatories such as the Large Synoptic Survey Telescope.
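
    As an illustration of the message-queue pattern the framework builds on, here is a minimal AMQP worker loop using the pika client (pika >= 1.0). The queue name and message format are assumptions for illustration, not RabbitQR's actual interface.

      # Minimal AMQP worker sketch; queue name and payload format are hypothetical.
      import pika

      connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
      channel = connection.channel()
      channel.queue_declare(queue="reduce_frame")          # tasks published by the server

      def handle_task(ch, method, properties, body):
          # In the real framework this would run one reduction/calibration step on one frame.
          print("processing", body.decode())
          ch.basic_ack(delivery_tag=method.delivery_tag)   # acknowledge completion to the broker

      channel.basic_qos(prefetch_count=1)                  # one task at a time per worker
      channel.basic_consume(queue="reduce_frame", on_message_callback=handle_task)
      channel.start_consuming()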

  13. Moving to stay in place: behavioral mechanisms for coexistence of African large carnivores.

    PubMed

    Vanak, Abi Tamim; Fortin, Daniel; Thaker, Maria; Ogden, Monika; Owen, Cailey; Greatwood, Sophie; Slotow, Rob

    2013-11-01

    Most ecosystems have multiple predator species that not only compete for shared prey, but also pose direct threats to each other. These intraguild interactions are key drivers of carnivore community structure, with ecosystem-wide cascading effects. Yet, behavioral mechanisms for coexistence of multiple carnivore species remain poorly understood. The challenges of studying large, free-ranging carnivores have resulted in mainly coarse-scale examination of behavioral strategies without information about all interacting competitors. We overcame some of these challenges by examining the concurrent fine-scale movement decisions of almost all individuals of four large mammalian carnivore species in a closed terrestrial system. We found that the intensity of intraguild interactions did not follow a simple hierarchical allometric pattern, because spatial and behavioral tactics of subordinate species changed with threat and resource levels across seasons. Lions (Panthera leo) were generally unrestricted and anchored themselves in areas rich in not only their principal prey, but also, during periods of resource limitation (dry season), rich in the main prey for other carnivores. Because of this, the greatest cost (potential intraguild predation) for subordinate carnivores was spatially coupled with the highest potential benefit of resource acquisition (prey-rich areas), especially in the dry season. Leopard (P. pardus) and cheetah (Acinonyx jubatus) overlapped with the home range of lions but minimized their risk using fine-scaled avoidance behaviors and restricted resource acquisition tactics. The cost of intraguild competition was most apparent for cheetahs, especially during the wet season, as areas with energetically rewarding large prey (wildebeest) were avoided when they overlapped highly with the activity areas of lions. Contrary to expectation, the smallest species (African wild dog, Lycaon pictus) did not avoid only lions, but also used multiple tactics to minimize encountering all other competitors. Intraguild competition thus forced wild dogs into areas with the lowest resource availability year round. Coexistence of multiple carnivore species has typically been explained by dietary niche separation, but our multi-scaled movement results suggest that differences in resource acquisition may instead be a consequence of avoiding intraguild competition. We generate a more realistic representation of hierarchical behavioral interactions that may ultimately drive spatially explicit trophic structures of multi-predator communities.

  14. The essential nature of sharing in science.

    PubMed

    Fischer, Beth A; Zigmond, Michael J

    2010-12-01

    Advances in science are the combined result of the efforts of a great many scientists, and in many cases, their willingness to share the products of their research. These products include data sets, both small and large, and unique research resources not commercially available, such as cell lines and software programs. The sharing of these resources enhances both the scope and the depth of research, while making more efficient use of time and money. However, sharing is not without costs, many of which are borne by the individual who develops the research resource. Sharing, for example, reduces the uniqueness of the resources available to a scientist, potentially influencing the originator's perceived productivity and ultimately his or her competitiveness for jobs, promotions, and grants. Nevertheless, for most researchers, particularly those using public funds, sharing is no longer optional but must be considered an obligation to science, the funding agency, and ultimately society at large. Most funding agencies, journals, and professional societies now require a researcher who has published work involving a unique resource to make that resource available to other investigators. Changes could be implemented to mitigate some of the costs. The creator of the resource could explore the possibility of collaborating with those who request it. In addition, institutions that employ and fund researchers could change their policies and practices to make sharing a more attractive and viable option. For example, when evaluating an individual's productivity, institutions could provide credit for the impact a researcher has had on their field through the provision of their unique resources to other investigators, regardless of whether that impact is reflected in the researcher's list of publications. In addition, increased funding for the development and maintenance of user-friendly public repositories for data and research resources would also help to reduce barriers to sharing by minimizing the time, effort, and funding needed by individual investigators to comply with requests for their unique resource. Indeed, sharing is an imperative, but it is also essential to find ways to protect both the original owner of the resource and those wishing to share it.

  15. A FairShare Scheduling Service for OpenNebula

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Vallero, S.; Zaccolo, V.

    2017-10-01

    In the ideal limit of infinite resources, multi-tenant applications are able to scale in/out on a Cloud driven only by their functional requirements. While a large Public Cloud may be a reasonable approximation of this condition, small scientific computing centres usually work in a saturated regime. In this case, an advanced resource allocation policy is needed in order to optimize the use of the data centre. The general topic of advanced resource scheduling is addressed by several components of the EU-funded INDIGO-DataCloud project. In this contribution, we describe the FairShare Scheduler Service (FaSS) for OpenNebula (ONE). The service must satisfy resource requests according to an algorithm which prioritizes tasks according to an initial weight and to the historical resource usage of the project. The software was designed to be as unintrusive as possible in the ONE code. We keep the original ONE scheduler implementation to match requests to available resources, but the queue of pending jobs to be processed is ordered according to the priorities delivered by FaSS. The FaSS implementation is still being finalized; in this contribution we describe the functional and design requirements the module should satisfy, as well as its high-level architecture.
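
    A toy sketch of a fair-share ordering is shown below; the priority formula (an initial project weight discounted by historical usage) is only an illustration of the idea, not FaSS's actual algorithm.

      # Toy fair-share ordering: under-served projects come first. Illustrative only.
      def fair_share_order(requests, weights, usage):
          """requests: list of (request_id, project) tuples
          weights:  dict project -> initial share weight
          usage:    dict project -> historical resource usage (e.g., CPU-hours)"""
          def priority(req):
              _, project = req
              return weights.get(project, 1.0) / (1.0 + usage.get(project, 0.0))
          return sorted(requests, key=priority, reverse=True)

      pending = [(1, "alice-lab"), (2, "bob-lab"), (3, "alice-lab")]
      print(fair_share_order(pending,
                             weights={"alice-lab": 2.0, "bob-lab": 1.0},
                             usage={"alice-lab": 500.0, "bob-lab": 10.0}))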

  16. Preparing Laboratory and Real-World EEG Data for Large-Scale Analysis: A Containerized Approach

    PubMed Central

    Bigdely-Shamlo, Nima; Makeig, Scott; Robbins, Kay A.

    2016-01-01

    Large-scale analysis of EEG and other physiological measures promises new insights into brain processes and more accurate and robust brain–computer interface models. However, the absence of standardized vocabularies for annotating events in a machine-understandable manner, the welter of collection-specific data organizations, the difficulty in moving data across processing platforms, and the unavailability of agreed-upon standards for preprocessing have prevented large-scale analyses of EEG. Here we describe a “containerized” approach and freely available tools we have developed to facilitate the process of annotating, packaging, and preprocessing EEG data collections to enable data sharing, archiving, large-scale machine learning/data mining and (meta-)analysis. The EEG Study Schema (ESS) comprises three data “Levels,” each with its own XML-document schema and file/folder convention, plus a standardized (PREP) pipeline to move raw (Data Level 1) data to a basic preprocessed state (Data Level 2) suitable for application of a large class of EEG analysis methods. Researchers can ship a study as a single unit and operate on its data using a standardized interface. ESS does not require a central database and provides all the metadata necessary to execute a wide variety of EEG processing pipelines. The primary focus of ESS is automated in-depth analysis and meta-analysis of EEG studies. However, ESS can also encapsulate meta-information for other modalities, such as eye tracking, that are increasingly used in both laboratory and real-world neuroimaging. ESS schema and tools are freely available at www.eegstudy.org and a central catalog of over 850 GB of existing data in ESS format is available at studycatalog.org. These tools and resources are part of a larger effort to enable data sharing at sufficient scale for researchers to engage in truly large-scale EEG analysis and data mining (BigEEG.org). PMID:27014048

  17. Multi-dimension and Comprehensive Assessment on the Utilizing and Sharing of Regional Large-Scale Scientific Equipment

    PubMed Central

    Li, Chen; Yongbo, Lv; Chi, Chen

    2015-01-01

    Based on data from 30 provincial regions in China, an assessment and empirical analysis was carried out on the utilization and sharing of large-scale scientific equipment, using a comprehensive assessment model built on three dimensions: equipment, utilization, and sharing. The assessment results were interpreted in light of relevant policies. The results showed that, on the whole, the overall development level in the provincial regions of eastern and central China is higher than that in western China. This is mostly because of the large gap among the different provincial regions with respect to equipment levels. But in terms of utilization and sharing, some western provincial regions, such as Ningxia, perform well, which is worthy of attention. Policy adjustments targeted at regional differentiation, raising the capacity of equipment management personnel, improving the sharing and cooperation platform, and promoting the establishment of open sharing funds are all important measures to promote the utilization and sharing of large-scale scientific equipment and to narrow the gap among regions. PMID:25937850

  18. Attention and Visuospatial Working Memory Share the Same Processing Resources

    PubMed Central

    Feng, Jing; Pratt, Jay; Spence, Ian

    2012-01-01

    Attention and visuospatial working memory (VWM) share very similar characteristics; both have the same upper bound of about four items in capacity and they recruit overlapping brain regions. We examined whether both attention and VWM share the same processing resources using a novel dual-task costs approach based on a load-varying dual-task technique. With sufficiently large loads on attention and VWM, considerable interference between the two processes was observed. A further load increase on either process produced reciprocal increases in interference on both processes, indicating that attention and VWM share common resources. More critically, comparison among four experiments on the reciprocal interference effects, as measured by the dual-task costs, demonstrates no significant contribution from additional processing other than the shared processes. These results support the notion that attention and VWM share the same processing resources. PMID:22529826

  19. CmapTools: A Software Environment for Knowledge Modeling and Sharing

    NASA Technical Reports Server (NTRS)

    Canas, Alberto J.

    2004-01-01

    In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and on support for the creation of large knowledge models with a large number of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software currently allows scientists to cooperate with each other in the construction, sharing, and critique of knowledge models. Scientists collaborating from remote locations, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do this at a special videoconferencing facility.

  20. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses

    PubMed Central

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-01-01

    With the coming deluge of genome data, the need to store and process large-scale genome data, provide easy access to biomedical analysis tools, and support efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic, and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. PMID:24462600

  1. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    PubMed

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    With the coming deluge of genome data, the need to store and process large-scale genome data, provide easy access to biomedical analysis tools, and support efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic, and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential to revolutionize the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing at less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists, and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  3. Perfect quantum multiple-unicast network coding protocol

    NASA Astrophysics Data System (ADS)

    Li, Dan-Dan; Gao, Fei; Qin, Su-Juan; Wen, Qiao-Yan

    2018-01-01

    In order to realize long-distance and large-scale quantum communication, it is natural to utilize quantum repeaters. For a general quantum multiple-unicast network, it is still puzzling how to complete communication tasks perfectly with fewer resources, such as registers. In this paper, we solve this problem. By applying quantum repeaters to the multiple-unicast communication problem, we give encoding-decoding schemes for source nodes, internal ones, and target ones, respectively. Source-target nodes share EPR pairs by using our encoding-decoding schemes over the quantum multiple-unicast network. Furthermore, quantum communication can be accomplished perfectly via teleportation. Compared with existing schemes, our schemes can reduce resource consumption and realize long-distance transmission of quantum information.
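
    As background (standard facts about the quantum resources involved, not this paper's specific encoding-decoding scheme), the maximally entangled EPR pair shared between a source S and a target T can be written as

      \[
        |\Phi^{+}\rangle_{ST} \;=\; \tfrac{1}{\sqrt{2}}\bigl(|00\rangle_{ST} + |11\rangle_{ST}\bigr),
      \]

    and teleporting one unknown qubit from S to T then consumes one such pair plus two bits of classical communication.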

  4. Development of a Scale to Measure Faculty Attitude Towards Open Educational Resources

    ERIC Educational Resources Information Center

    Mishra, Sanjaya; Sharma, Meenu; Sharma, Ramesh Chander; Singh, Alka; Thakur, Atul

    2016-01-01

    This paper describes the entire methodology for the development of a scale to measure Attitude towards Open Educational Resources (ATOER). Traditionally, it is observed that some teachers are more willing to share their work than others, indicating the need to understand teachers' psychological and behavioural determinants that influence use of…

  5. Raising Concerns about Sharing and Reusing Large-Scale Mathematics Classroom Observation Video Data

    ERIC Educational Resources Information Center

    Ing, Marsha; Samkian, Artineh

    2018-01-01

    There are great opportunities and challenges to sharing large-scale mathematics classroom observation data. This Research Commentary describes the methodological opportunities and challenges and provides a specific example from a mathematics education research project to illustrate how the research questions and framework drove observational…

  6. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines, including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe the Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user’s familiar environment without preventing researchers from using traditional tools and methods. Our desire is to share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.

  7. OpenMP parallelization of a gridded SWAT (SWATG)

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Hou, Jinliang; Cao, Yongpan; Gu, Juan; Huang, Chunlin

    2017-12-01

    Large-scale, long-term, high spatial resolution simulation is a common challenge in environmental modeling. A Gridded Hydrologic Response Unit (HRU)-based Soil and Water Assessment Tool (SWATG) that integrates a grid modeling scheme with different spatial representations presents such problems. The resulting computation time limits applications of very high resolution, large-scale watershed modeling. The OpenMP (Open Multi-Processing) parallel application programming interface is integrated with SWATG (called SWATGP) to accelerate grid modeling at the HRU level. Such a parallel implementation takes better advantage of the computational power of a shared-memory computer system. We conducted two experiments at multiple temporal and spatial scales of hydrological modeling using SWATG and SWATGP on a high-end server. At 500-m resolution, SWATGP was found to be up to nine times faster than SWATG in modeling a roughly 2000 km2 watershed on a single CPU with a 15-thread configuration. The study results demonstrate that parallel models save considerable time relative to traditional sequential simulation runs. Parallel computation of environmental models is beneficial for model applications, especially at large spatial and temporal scales and at high resolutions. The proposed SWATGP model is thus a promising tool for large-scale and high-resolution water resources research and management, in addition to offering data fusion and model coupling ability.

  8. A national-scale authentication infrastructure.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, R.; Engert, D.; Foster, I.

    2000-12-01

    Today, individuals and institutions in science and industry are increasingly forming virtual organizations to pool resources and tackle a common goal. Participants in virtual organizations commonly need to share resources such as data archives, computer cycles, and networks - resources usually available only with restrictions based on the requested resource's nature and the user's identity. Thus, any sharing mechanism must have the ability to authenticate the user's identity and determine if the user is authorized to request the resource. Virtual organizations tend to be fluid, however, so authentication mechanisms must be flexible and lightweight, allowing administrators to quickly establish and change resource-sharing arrangements. However, because virtual organizations complement rather than replace existing institutions, sharing mechanisms cannot change local policies and must allow individual institutions to maintain control over their own resources. Our group has created and deployed an authentication and authorization infrastructure that meets these requirements: the Grid Security Infrastructure. GSI offers secure single sign-ons and preserves site control over access policies and local security. It provides its own versions of common applications, such as FTP and remote login, and a programming interface for creating secure applications.

  9. Does a House Divided Stand? Kinship and the Continuity of Shared Living Arrangements

    PubMed Central

    Glick, Jennifer E.; Van Hook, Jennifer

    2011-01-01

    Shared living arrangements can provide housing, economies of scale, and other instrumental support and may become an important resource in times of economic constraint. But the extent to which such living arrangements experience continuity or rapid change in composition is unclear. Previous research on extended-family households tended to focus on factors that trigger the onset of coresidence, including life course events or changes in health status and related economic needs. Relying on longitudinal data from 9,932 households in the Survey of Income and Program Participation (SIPP), the analyses demonstrate that the distribution of economic resources in the household also influences the continuity of shared living arrangements. The results suggest that multigenerational households of parents and adult children experience greater continuity in composition when one individual or couple has a disproportionate share of the economic resources in the household. Other coresidential households, those shared by other kin or nonkin, experience greater continuity when resources are more evenly distributed. PMID:22259218

  10. Can data from disparate long-term fish monitoring programs be used to increase our understanding of regional and continental trends in large river assemblages?

    USGS Publications Warehouse

    Counihan, Timothy D.; Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.

    2018-01-01

    Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric dimensional scaling ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers.
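
    As an illustration of the ordination step, a minimal nonmetric MDS sketch on a site-by-species matrix is given below; the input file, dissimilarity choice, and settings are assumptions for illustration, not the monitoring programs' actual workflow.

      # Hedged sketch of a nonmetric MDS ordination of community count data.
      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from sklearn.manifold import MDS

      counts = np.loadtxt("site_by_species.csv", delimiter=",")   # rows = samples, cols = species
      dissim = squareform(pdist(counts, metric="braycurtis"))     # community dissimilarity matrix

      nmds = MDS(n_components=2, metric=False, dissimilarity="precomputed",
                 random_state=0, n_init=10)
      scores = nmds.fit_transform(dissim)          # 2-D ordination scores per sample
      print("stress:", nmds.stress_)               # lower stress = better 2-D representation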

  11. Can data from disparate long-term fish monitoring programs be used to increase our understanding of regional and continental trends in large river assemblages?

    PubMed Central

    Waite, Ian R.; Casper, Andrew F.; Ward, David L.; Sauer, Jennifer S.; Irwin, Elise R.; Chapman, Colin G.; Ickes, Brian S.; Paukert, Craig P.; Kosovich, John J.; Bayer, Jennifer M.

    2018-01-01

    Understanding trends in the diverse resources provided by large rivers will help balance tradeoffs among stakeholders and inform strategies to mitigate the effects of landscape scale stressors such as climate change and invasive species. Absent a cohesive coordinated effort to assess trends in important large river resources, a logical starting point is to assess our ability to draw inferences from existing efforts. In this paper, we use a common analytical framework to analyze data from five disparate fish monitoring programs to better understand the nature of spatial and temporal trends in large river fish assemblages. We evaluated data from programs that monitor fishes in the Colorado, Columbia, Illinois, Mississippi, and Tallapoosa rivers using non-metric dimensional scaling ordinations and associated tests to evaluate trends in fish assemblage structure and native fish biodiversity. Our results indicate that fish assemblages exhibited significant spatial and temporal trends in all five of the rivers. We also document native species diversity trends that were variable within and between rivers and generally more evident in rivers with higher species richness and programs of longer duration. We discuss shared and basin-specific landscape level stressors. Having a basic understanding of the nature and extent of trends in fish assemblages is a necessary first step towards understanding factors affecting biodiversity and fisheries in large rivers. PMID:29364953

  12. How multiagency partnerships can successfully address large-scale pollution problems: a Hawaii case study.

    PubMed

    Donohue, Mary J

    2003-06-01

    Oceanic circulation patterns deposit significant amounts of marine pollution, including derelict fishing gear from North Pacific Ocean fisheries, in the Hawaiian Archipelago [Mar. Pollut. Bull. 42(12) (2001) 1301]. Management responsibility for these islands and their associated natural resources is shared by several government authorities. Non-governmental organizations (NGOs) and private industry also have interests in the archipelago. Since the marine debris problem in this region is too large for any single agency to manage, a multiagency marine debris working group (group) was established in 1998 to improve marine debris mitigation in Hawaii. To date, 16 federal, state, and local agencies, working with industry and NGOs, have removed 195 tons of derelict fishing gear from the Northwestern Hawaiian Islands. This review details the evolution of the partnership, notes its challenges and rewards, and advocates its continued use as an effective resource management tool.

  13. Reviewing innovative Earth observation solutions for filling science-policy gaps in hydrology

    NASA Astrophysics Data System (ADS)

    Lehmann, Anthony; Giuliani, Gregory; Ray, Nicolas; Rahman, Kazi; Abbaspour, Karim C.; Nativi, Stefano; Craglia, Massimo; Cripe, Douglas; Quevauviller, Philippe; Beniston, Martin

    2014-10-01

    Improved data sharing is needed for hydrological modeling and water management that require better integration of data, information and models. Technological advances in Earth observation and Web technologies have allowed the development of Spatial Data Infrastructures (SDIs) for improved data sharing at various scales. International initiatives catalyze data sharing by promoting interoperability standards to maximize the use of data and by supporting easy access to and utilization of geospatial data. A series of recent European projects is contributing to the promotion of innovative Earth observation solutions and the uptake of scientific outcomes in policy. Several success stories involving different hydrologists' communities can be reported around the world. Gaps still exist in hydrological, agricultural, meteorological and climatological data access because of various issues. While many sources of data exist at all scales, it remains difficult and time-consuming to assemble hydrological information for most projects. Furthermore, data and sharing formats remain very heterogeneous. Improvements require implementing/endorsing some commonly agreed standards and documenting data with adequate metadata. The brokering approach allows binding heterogeneous resources published by different data providers and adapting them to tools and interfaces commonly used by consumers of these resources. The challenge is to provide decision-makers with reliable information, based on integrated data and tools derived from both Earth observations and scientific models. Successful SDIs therefore rely on various factors: a shared vision among all participants, the necessity to solve a common problem, adequate data policies, incentives, and sufficient resources. New data streams from remote sensing or crowd sourcing are also producing valuable information to improve our understanding of the water cycle, while field sensors are developing rapidly and becoming less costly. More recent data standards are enhancing interoperability between hydrology and other scientific disciplines, while solutions exist to communicate uncertainty of data and models, which is an essential pre-requisite for decision-making. Distributed computing infrastructures can handle complex and large hydrological data and models, while Web Processing Services bring the flexibility to develop and execute simple to complex workflows over the Internet. The need for capacity building at human, infrastructure and institutional levels is also a major driver for reinforcing the commitment to SDI concepts.

  14. Agri-Environmental Resource Management by Large-Scale Collective Action: Determining KEY Success Factors

    ERIC Educational Resources Information Center

    Uetake, Tetsuya

    2015-01-01

    Purpose: Large-scale collective action is necessary when managing agricultural natural resources such as biodiversity and water quality. This paper determines the key factors to the success of such action. Design/Methodology/Approach: This paper analyses four large-scale collective actions used to manage agri-environmental resources in Canada and…

  15. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong Geoscience Departments sought to create for departments the same type of shared information base that was supporting individual faculty. The Teach the Earth portal and its underlying web development tools were used by NSF-funded projects in education to disseminate their results. Leveraging these funded efforts, the Climate Literacy Network has expanded this geoscience education community to include individuals broadly interested in fostering climate literacy. Most recently, the InTeGrate project is implementing inter-institutional collaborative authoring, testing and evaluation of curricular materials. While these projects represent only a fraction of the activity in geoscience education, they are important drivers in the development of a large, national, coherent geoscience education network with the ability to collaborate and disseminate information effectively. Importantly, the community is open and defined by active participation. Key mechanisms for engagement have included alignment of project activities with participants' needs and goals; productive face-to-face and virtual workshops, events, and series; stipends for completion of large products; and strong supporting staff to keep projects moving and assist with product production. One measure of its success is the adoption and adaptation of resources and models by emerging projects, which results in the continued growth of the network.

  16. Large scale rainfall diversity and satellite propagation

    NASA Technical Reports Server (NTRS)

    Lin, H. P.; Vogel, W. J.

    1992-01-01

    From the NOAA 15 minute precipitation file for the U.S., data was selected for 128 stations covering a 17 year period and the probability of simultaneous rainfall at several stations was calculated. We assumed that the chosen stations were located in separate beams of a multi-beam communications satellite with shared fade mitigation resources. In order to estimate the demands made on these resources, we determined the number of stations at which rainfall rates exceeded 10 to 40 mm/hr. We found a 1 percent probability that at least 5 of the 128 stations have rain at or over 10 mm/hr in any 15 minute interval. Rain at 2 stations was found to correlate over distances less than about 600 miles.
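
    The exceedance statistic described above can be recomputed on any station-by-interval rain-rate matrix along the following lines; the synthetic data and threshold here are purely illustrative stand-ins, not the actual NOAA dataset.

      # Toy recomputation of the joint-exceedance probability for shared fade mitigation.
      import numpy as np

      # rain[t, s] = rain rate (mm/hr) at station s during 15-minute interval t
      rng = np.random.default_rng(0)
      rain = rng.gamma(shape=0.1, scale=20.0, size=(100_000, 128))   # synthetic stand-in data

      exceed = (rain >= 10.0).sum(axis=1)          # stations at/above 10 mm/hr per interval
      p_at_least_5 = (exceed >= 5).mean()          # fraction of intervals with >= 5 such stations
      print(f"P(>=5 of 128 stations >= 10 mm/hr) ~ {p_at_least_5:.3%}")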

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurlbut, David; Zhou, Ella; Bird, Lori

    A strategically planned transmission network is an important source of flexibility for the integration of large-scale renewable energy (RE). Such a network can offer access to a broad geographic diversity of resources, which can reduce flexibility needs and facilitate sharing between neighboring balancing areas. This report builds on two previous NREL technical reports - Advancing System Flexibility for High Penetration Renewable Integration (Milligan et al. 2015) and 'Renewables-Friendly' Grid Development Strategies (Hurlbut et al. 2015) - which discuss various flexibility options and provide an overview of U.S. market models and grid planning. This report focuses on addressing issues with cross-regional/provincial transmission in China with the aim of integrating renewable resources that are concentrated in remote areas and require inter-regional/provincial power exchange.

  18. The development of generosity and moral cognition across five cultures.

    PubMed

    Cowell, Jason M; Lee, Kang; Malcolm-Smith, Susan; Selcuk, Bilge; Zhou, Xinyue; Decety, Jean

    2017-07-01

    Morality is an evolved aspect of human nature, yet is heavily influenced by cultural environment. This developmental study adopted an integrative approach by combining measures of socioeconomic status (SES), executive function, affective sharing, empathic concern, theory of mind, and moral judgment in predicting sharing behavior in children (N = 999) from the age of 5 to 12 in five large-scale societies: Canada, China, Turkey, South Africa, and the USA. Results demonstrate that age, gender, SES, culture, and social cognitive mechanisms explain over 20% of the variance worldwide in children's resource allocation. These findings are discussed in reference to standard cultural comparisons (individualist/collectivist), as well as the degree of market integration, and highlight continuities and discontinuities in children's generosity across urban contexts. © 2016 John Wiley & Sons Ltd.

  19. Integration of a neuroimaging processing pipeline into a pan-canadian computing grid

    NASA Astrophysics Data System (ADS)

    Lavoie-Courchesne, S.; Rioux, P.; Chouinard-Decorte, F.; Sherif, T.; Rousseau, M.-E.; Das, S.; Adalat, R.; Doyon, J.; Craddock, C.; Margulies, D.; Chu, C.; Lyttelton, O.; Evans, A. C.; Bellec, P.

    2012-02-01

    The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.

  20. Revenue-sharing clubs provide economic insurance and incentives for sustainability in common-pool resource systems.

    PubMed

    Tilman, Andrew R; Levin, Simon; Watson, James R

    2018-06-05

    Harvesting behaviors of natural resource users, such as farmers, fishermen and aquaculturists, are shaped by season-to-season and day-to-day variability, or in other words risk. Here, we explore how risk-mitigation strategies can lead to sustainable use and improved management of common-pool natural resources. Over-exploitation of unmanaged natural resources, which lowers their long-term productivity, is a central challenge facing societies. While effective top-down management is a possible solution, it is not available if the resource is outside the jurisdictional bounds of any management entity, or if existing institutions cannot effectively impose sustainable-use rules. Under these conditions, alternative approaches to natural resource governance are required. Here, we study revenue-sharing clubs as a mechanism by which resource users can mitigate their income volatility and importantly, as a co-benefit, are also incentivized to reduce their effort, leading to reduced over-exploitation and improved resource governance. We use game theoretic analyses and agent-based modeling to determine the conditions in which revenue-sharing can be beneficial for resource management as well as resource users. We find that revenue-sharing agreements can emerge and lead to improvements in resource management when there is large variability in production/revenue and when this variability is uncorrelated across members of the revenue-sharing club. Further, we show that if members of the revenue-sharing collective can sell their product at a price premium, then the range of ecological and economic conditions under which revenue-sharing can be a tool for management greatly expands. These results have implications for the design of bottom-up management, where resource users themselves are incentivized to operate in ecologically sustainable and economically advantageous ways. Copyright © 2018 Elsevier Ltd. All rights reserved.
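
    A minimal simulation of the insurance effect of revenue pooling under uncorrelated shocks is sketched below; the distribution and parameters are arbitrary illustrations, far simpler than the paper's game-theoretic and agent-based models.

      # Illustrative only: equal-share pooling of i.i.d. revenue shocks lowers volatility.
      import numpy as np

      rng = np.random.default_rng(1)
      n_members, n_seasons = 10, 5000
      revenue = rng.lognormal(mean=0.0, sigma=0.8, size=(n_seasons, n_members))  # i.i.d. shocks

      solo = revenue[:, 0]                          # income of one member acting alone
      pooled = revenue.mean(axis=1)                 # equal-share split of the club's total revenue

      print("solo   mean/std:", solo.mean().round(2), solo.std().round(2))
      print("pooled mean/std:", pooled.mean().round(2), pooled.std().round(2))
      # Same expected income, much lower volatility: the economic insurance that, per the
      # paper, can also blunt the incentive to over-harvest.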

  1. Interlibrary Cooperation: A County-Wide Network.

    ERIC Educational Resources Information Center

    Russell, Susan S.; Hanf, Betty

    The lack of proximity to large information centers combined with limited budgets, resources, and expertise prompted the special libraries of Berks County, Pennsylvania, to attempt a network for interlibrary sharing. Information specialists from large and small industrial, medical, and public resource centers organized and operated a network of…

  2. Improving data workflow systems with cloud services and use of open data for bioinformatics research.

    PubMed

    Karim, Md Rezaul; Michel, Audrey; Zappa, Achille; Baranov, Pavel; Sahay, Ratnesh; Rebholz-Schuhmann, Dietrich

    2017-04-16

    Data workflow systems (DWFSs) enable bioinformatics researchers to combine components for data access and data analytics, and to share the final data analytics approach with their collaborators. Increasingly, such systems have to cope with large-scale data, such as full genomes (about 200 GB each), public fact repositories (about 100 TB of data) and 3D imaging data at even larger scales. As moving the data becomes cumbersome, the DWFS needs to embed its processes into a cloud infrastructure, where the data are already hosted. As standardized public data play an increasingly important role, the DWFS needs to comply with Semantic Web technologies. This advancement would reduce overhead costs and accelerate progress in bioinformatics research based on large-scale data and public resources, as researchers would require less specialized IT knowledge for the implementation. Furthermore, the high data growth rates in bioinformatics research drive the demand for parallel and distributed computing, which then imposes a need for scalability and high-throughput capabilities onto the DWFS. As a result, requirements for data sharing and access to public knowledge bases suggest that compliance of the DWFS with Semantic Web standards is necessary. In this article, we analyze existing DWFSs with regard to their capabilities for public open data use as well as large-scale computational and human interface requirements. We untangle the parameters for selecting a preferable solution for bioinformatics research, with particular consideration of cloud services and Semantic Web technologies. Our analysis leads to research guidelines and recommendations toward the development of future DWFSs for the bioinformatics research community. © The Author 2017. Published by Oxford University Press.

  3. Data management strategies for multinational large-scale systems biology projects.

    PubMed

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. By the use of high-throughput methods in many research areas from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust therefore have developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs) which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have proven successful in large-scale projects.

  4. Data management strategies for multinational large-scale systems biology projects

    PubMed Central

    Peuker, Martin; Regenbrecht, Christian R.A.

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and eventually becomes mandatory [Wellcome Trust will Penalise Scientists Who Don’t Embrace Open Access. The Guardian 2012]. By the use of high-throughput methods in many research areas from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust therefore have developed data sharing policies and request commitment to data management and sharing in grant applications. Increased citation rates are a profound argument for sharing publication data. Pre-publication sharing might be rewarded by a data citation credit system via digital object identifiers (DOIs) which have initially been in use for data objects. Besides policies and incentives, good practice in data management is indispensable. However, appropriate systems for data management of large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have proven successful in large-scale projects. PMID:23047157

  5. Processing structure in language and music: a case for shared reliance on cognitive control.

    PubMed

    Slevc, L Robert; Okada, Brooke M

    2015-06-01

    The relationship between structural processing in music and language has received increasing interest in the past several years, spurred by the influential Shared Syntactic Integration Resource Hypothesis (SSIRH; Patel, Nature Neuroscience, 6, 674-681, 2003). According to this resource-sharing framework, music and language rely on separable syntactic representations but recruit shared cognitive resources to integrate these representations into evolving structures. The SSIRH is supported by findings of interactions between structural manipulations in music and language. However, other recent evidence suggests that such interactions also can arise with nonstructural manipulations, and some recent neuroimaging studies report largely nonoverlapping neural regions involved in processing musical and linguistic structure. These conflicting results raise the question of exactly what shared (and distinct) resources underlie musical and linguistic structural processing. This paper suggests that one shared resource is prefrontal cortical mechanisms of cognitive control, which are recruited to detect and resolve conflict that occurs when expectations are violated and interpretations must be revised. By this account, musical processing involves not just the incremental processing and integration of musical elements as they occur, but also the incremental generation of musical predictions and expectations, which must sometimes be overridden and revised in light of evolving musical input.

  6. Sharing resources: opportunities for smaller primary care practices to increase their capacity for patient care. Findings from the 2009 Commonwealth Fund International Health Policy Survey of Primary Care Physicians.

    PubMed

    Fryer, Ashley-Kay; Doty, Michelle M; Audet, Anne-Marie J

    2011-03-01

    Most Americans get their health care in small physician practices. Yet, small practice settings are often unable to provide the same range of services or participate in quality improvement initiatives as large practices because they lack the staff, information technology, and office systems. One promising strategy is to share clinical support services and information systems with other practices. New findings from the 2009 Commonwealth Fund International Health Policy Survey of Primary Care Physicians suggest smaller practices that share resources are more likely than those without shared resources to have advanced electronic medical records and health information technology, routinely track and manage patient information, have after-hours care arrangements, and engage in quality monitoring and benchmarking. This issue brief highlights strategies that can increase resources among small- and medium-sized practices and efforts supported by states, the private sector, and the Affordable Care Act that encourage the expansion of shared-resource models.

  7. Theoretical and Empirical Comparison of Big Data Image Processing with Apache Hadoop and Sun Grid Engine.

    PubMed

    Bao, Shunxing; Weitendorf, Frederick D; Plassard, Andrew J; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A

    2017-02-11

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., "short" processing times and/or "large" datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply "large scale" processing transitions into "big data" and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and non-relevant for medical imaging.
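
    A rough sketch of the kind of wall-clock comparison described above, assuming a simple model in which the shared-storage (NFS) configuration pays a network transfer cost per wave of jobs while a data-local (Hadoop-style) configuration pays only a small local read cost; all bandwidths, job counts, and job lengths are illustrative assumptions, not the paper's fitted parameters.

      # Toy wall-clock model: shared-storage cluster vs data-local processing.
      # Parameter values are illustrative assumptions only.

      def wallclock_nfs(n_jobs, compute_s, data_gb, nfs_bandwidth_gbps=1.0, cores=100):
          """Jobs read input over a shared NFS link; concurrent jobs in each wave
          split the shared bandwidth, so transfer time grows with concurrency."""
          waves = -(-n_jobs // cores)                      # ceil(n_jobs / cores)
          per_wave_transfer = (min(n_jobs, cores) * data_gb * 8) / nfs_bandwidth_gbps
          return waves * (per_wave_transfer + compute_s)

      def wallclock_datalocal(n_jobs, compute_s, local_read_s=5.0, cores=100):
          """Data is co-located with computation; only a small local read cost."""
          waves = -(-n_jobs // cores)
          return waves * (local_read_s + compute_s)

      for compute_s in (30, 300, 3000):                    # "short" vs "long" jobs
          nfs = wallclock_nfs(n_jobs=1000, compute_s=compute_s, data_gb=0.3)
          dl = wallclock_datalocal(n_jobs=1000, compute_s=compute_s)
          print(f"job={compute_s:4d}s  NFS={nfs/3600:6.2f}h  data-local={dl/3600:6.2f}h")
      # Short jobs on sizeable images are transfer-bound, so the data-local design
      # wins clearly; for long jobs the shared-storage penalty is amortized, which
      # is the kind of crossover between "large scale" and "big data" the paper
      # characterizes.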

  8. Theoretical and empirical comparison of big data image processing with Apache Hadoop and Sun Grid Engine

    NASA Astrophysics Data System (ADS)

    Bao, Shunxing; Weitendorf, Frederick D.; Plassard, Andrew J.; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A.

    2017-03-01

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., "short" processing times and/or "large" datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply "large scale" processing transitions into "big data" and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and nonrelevant for medical imaging.

  9. Ricebase - a resource for rice breeding

    USDA-ARS?s Scientific Manuscript database

    Ricebase combines accessions, traits, markers, and genes with genome-scale datasets to empower rice breeders and geneticists to explore big-data resources. The underlying code and schema are shared with CassavaBase and the Sol Genomics Network (SGN) databases. Ricebase was launched specifically to m...

  10. Large-Scale Event Extraction from Literature with Multi-Level Gene Normalization

    PubMed Central

    Wei, Chih-Hsuan; Hakala, Kai; Pyysalo, Sampo; Ananiadou, Sophia; Kao, Hung-Yu; Lu, Zhiyong; Salakoski, Tapio; Van de Peer, Yves; Ginter, Filip

    2013-01-01

    Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from http://evexdb.org/download/, under the Creative Commons – Attribution – Share Alike (CC BY-SA) license. PMID:23613707
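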

  11. Findings and Challenges in Fine-Resolution Large-Scale Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Her, Y. G.

    2017-12-01

    Fine-resolution large-scale (FL) modeling can provide the overall picture of the hydrological cycle and transport while taking into account unique local conditions in the simulation. It can also help develop water resources management plans consistent across spatial scales by describing the spatial consequences of decisions and hydrological events extensively. FL modeling is expected to be common in the near future as global-scale remotely sensed data are emerging, and computing resources have been advanced rapidly. There are several spatially distributed models available for hydrological analyses. Some of them rely on numerical methods such as finite difference/element methods (FDM/FEM), which require excessive computing resources (implicit scheme) to manipulate large matrices or small simulation time intervals (explicit scheme) to maintain the stability of the solution, to describe two-dimensional overland processes. Others make unrealistic assumptions such as constant overland flow velocity to reduce the computational loads of the simulation. Thus, simulation efficiency often comes at the expense of precision and reliability in FL modeling. Here, we introduce a new FL continuous hydrological model and its application to four watersheds in different landscapes and sizes from 3.5 km2 to 2,800 km2 at the spatial resolution of 30 m on an hourly basis. The model provided acceptable accuracy statistics in reproducing hydrological observations made in the watersheds. The modeling outputs including the maps of simulated travel time, runoff depth, soil water content, and groundwater recharge, were animated, visualizing the dynamics of hydrological processes occurring in the watersheds during and between storm events. Findings and challenges were discussed in the context of modeling efficiency, accuracy, and reproducibility, which we found can be improved by employing advanced computing techniques and hydrological understandings, by using remotely sensed hydrological observations such as soil moisture and radar rainfall depth and by sharing the model and its codes in public domain, respectively.
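
    A small illustration (with hypothetical numbers) of the stability constraint behind the point above that explicit schemes force small time steps at fine grid resolution: a Courant-Friedrichs-Lewy (CFL)-style condition ties the allowable time step to cell size and flow celerity.

      # CFL-style time-step limit for an explicit overland-flow scheme.
      # Celerity and Courant number are illustrative assumptions.

      def max_stable_dt(dx_m, celerity_m_s, courant=1.0):
          """Largest stable time step (s) for an explicit scheme: dt <= C * dx / c."""
          return courant * dx_m / celerity_m_s

      celerity = 2.0  # m/s, assumed peak wave celerity during a storm event
      for dx in (1000, 90, 30):   # coarse grids vs the 30 m resolution used above
          dt = max_stable_dt(dx, celerity)
          steps_per_hour = 3600 / dt
          print(f"dx={dx:5d} m  dt<={dt:7.1f} s  -> {steps_per_hour:6.0f} steps per simulated hour")
      # At 30 m resolution the explicit scheme needs ~240 steps per simulated hour
      # versus ~7 at 1 km, which is why fine-resolution large-scale runs become
      # dominated by computing cost unless the numerics or hardware are improved.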

  12. Theme II Joint Work Plan -2017 Collaboration and Knowledge Sharing on Large-scale Demonstration Projects

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoliang; Stauffer, Philip H.

    This effort is designed to expedite learnings from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.

  13. Landscapes for Energy and Wildlife: Conservation Prioritization for Golden Eagles across Large Spatial Scales

    PubMed Central

    Tack, Jason D.; Fedy, Bradley C.

    2015-01-01

    Proactive conservation planning for species requires the identification of important spatial attributes across ecologically relevant scales in a model-based framework. However, it is often difficult to develop predictive models, as the explanatory data required for model development across regional management scales is rarely available. Golden eagles are a large-ranging predator of conservation concern in the United States that may be negatively affected by wind energy development. Thus, identifying landscapes least likely to pose conflict between eagles and wind development via shared space prior to development will be critical for conserving populations in the face of imposing development. We used publicly available data on golden eagle nests to generate predictive models of golden eagle nesting sites in Wyoming, USA, using a suite of environmental and anthropogenic variables. By overlaying predictive models of golden eagle nesting habitat with wind energy resource maps, we highlight areas of potential conflict among eagle nesting habitat and wind development. However, our results suggest that wind potential and the relative probability of golden eagle nesting are not necessarily spatially correlated. Indeed, the majority of our sample frame includes areas with disparate predictions between suitable nesting habitat and potential for developing wind energy resources. Map predictions cannot replace on-the-ground monitoring for potential risk of wind turbines on wildlife populations, though they provide industry and managers a useful framework to first assess potential development. PMID:26262876

  14. Landscapes for energy and wildlife: conservation prioritization for golden eagles across large spatial scales

    USGS Publications Warehouse

    Tack, Jason D.; Fedy, Bradley C.

    2015-01-01

    Proactive conservation planning for species requires the identification of important spatial attributes across ecologically relevant scales in a model-based framework. However, it is often difficult to develop predictive models, as the explanatory data required for model development across regional management scales is rarely available. Golden eagles are a large-ranging predator of conservation concern in the United States that may be negatively affected by wind energy development. Thus, identifying landscapes least likely to pose conflict between eagles and wind development via shared space prior to development will be critical for conserving populations in the face of imposing development. We used publicly available data on golden eagle nests to generate predictive models of golden eagle nesting sites in Wyoming, USA, using a suite of environmental and anthropogenic variables. By overlaying predictive models of golden eagle nesting habitat with wind energy resource maps, we highlight areas of potential conflict among eagle nesting habitat and wind development. However, our results suggest that wind potential and the relative probability of golden eagle nesting are not necessarily spatially correlated. Indeed, the majority of our sample frame includes areas with disparate predictions between suitable nesting habitat and potential for developing wind energy resources. Map predictions cannot replace on-the-ground monitoring for potential risk of wind turbines on wildlife populations, though they provide industry and managers a useful framework to first assess potential development.

  15. Landscapes for Energy and Wildlife: Conservation Prioritization for Golden Eagles across Large Spatial Scales.

    PubMed

    Tack, Jason D; Fedy, Bradley C

    2015-01-01

    Proactive conservation planning for species requires the identification of important spatial attributes across ecologically relevant scales in a model-based framework. However, it is often difficult to develop predictive models, as the explanatory data required for model development across regional management scales is rarely available. Golden eagles are a large-ranging predator of conservation concern in the United States that may be negatively affected by wind energy development. Thus, identifying landscapes least likely to pose conflict between eagles and wind development via shared space prior to development will be critical for conserving populations in the face of imposing development. We used publicly available data on golden eagle nests to generate predictive models of golden eagle nesting sites in Wyoming, USA, using a suite of environmental and anthropogenic variables. By overlaying predictive models of golden eagle nesting habitat with wind energy resource maps, we highlight areas of potential conflict among eagle nesting habitat and wind development. However, our results suggest that wind potential and the relative probability of golden eagle nesting are not necessarily spatially correlated. Indeed, the majority of our sample frame includes areas with disparate predictions between suitable nesting habitat and potential for developing wind energy resources. Map predictions cannot replace on-the-ground monitoring for potential risk of wind turbines on wildlife populations, though they provide industry and managers a useful framework to first assess potential development.

  16. Accelerating large-scale simulation of seismic wave propagation by multi-GPUs and three-dimensional domain decomposition

    NASA Astrophysics Data System (ADS)

    Okamoto, Taro; Takenaka, Hiroshi; Nakamura, Takeshi; Aoki, Takayuki

    2010-12-01

    We adopted the GPU (graphics processing unit) to accelerate the large-scale finite-difference simulation of seismic wave propagation. The simulation can benefit from the high-memory bandwidth of GPU because it is a "memory intensive" problem. In a single-GPU case we achieved a performance of about 56 GFlops, which was about 45-fold faster than that achieved by a single core of the host central processing unit (CPU). We confirmed that the optimized use of fast shared memory and registers were essential for performance. In the multi-GPU case with three-dimensional domain decomposition, the non-contiguous memory alignment in the ghost zones was found to impose quite long time in data transfer between GPU and the host node. This problem was solved by using contiguous memory buffers for ghost zones. We achieved a performance of about 2.2 TFlops by using 120 GPUs and 330 GB of total memory: nearly (or more than) 2200 cores of host CPUs would be required to achieve the same performance. The weak scaling was nearly proportional to the number of GPUs. We therefore conclude that GPU computing for large-scale simulation of seismic wave propagation is a promising approach as a faster simulation is possible with reduced computational resources compared to CPUs.
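
    A simplified sketch of the fix described above, with NumPy arrays standing in for device memory: the strided ghost-zone slice is copied once into a contiguous staging buffer, so the host-device (or inter-node) transfer moves one contiguous block instead of many scattered elements. Array sizes and the axis chosen are arbitrary assumptions.

      # Packing a non-contiguous ghost zone into a contiguous buffer before transfer.
      import numpy as np

      nx, ny, nz = 128, 128, 128
      field = np.arange(nx * ny * nz, dtype=np.float32).reshape(nx, ny, nz)

      # Ghost layer on the +z face: in C order this slice is NOT contiguous,
      # so sending it element-by-element would cause many small transfers.
      ghost_view = field[:, :, -1]
      print("contiguous view?", ghost_view.flags["C_CONTIGUOUS"])   # False

      # Pack the layer into a contiguous staging buffer with one copy, then hand
      # the buffer to the GPU<->host or MPI transfer as a single block.
      send_buffer = np.ascontiguousarray(ghost_view)

      # Receiving side: unpack the contiguous buffer into the neighbor's halo slot.
      field[:, :, 0] = send_buffer                 # stand-in for the remote update

      print("staged", send_buffer.nbytes, "bytes as one contiguous transfer")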

  17. ClearedLeavesDB: an online database of cleared plant leaf images

    PubMed Central

    2014-01-01

    Background Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. Description The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. Conclusions We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org. PMID:24678985

  18. ClearedLeavesDB: an online database of cleared plant leaf images.

    PubMed

    Das, Abhiram; Bucksch, Alexander; Price, Charles A; Weitz, Joshua S

    2014-03-28

    Leaf vein networks are critical to both the structure and function of leaves. A growing body of recent work has linked leaf vein network structure to the physiology, ecology and evolution of land plants. In the process, multiple institutions and individual researchers have assembled collections of cleared leaf specimens in which vascular bundles (veins) are rendered visible. In an effort to facilitate analysis and digitally preserve these specimens, high-resolution images are usually created, either of entire leaves or of magnified leaf subsections. In a few cases, collections of digital images of cleared leaves are available for use online. However, these collections do not share a common platform nor is there a means to digitally archive cleared leaf images held by individual researchers (in addition to those held by institutions). Hence, there is a growing need for a digital archive that enables online viewing, sharing and disseminating of cleared leaf image collections held by both institutions and individual researchers. The Cleared Leaf Image Database (ClearedLeavesDB) is an online web-based resource for a community of researchers to contribute, access and share cleared leaf images. ClearedLeavesDB leverages resources of large-scale, curated collections while enabling the aggregation of small-scale collections within the same online platform. ClearedLeavesDB is built on Drupal, an open source content management platform. It allows plant biologists to store leaf images online with corresponding meta-data, share image collections with a user community and discuss images and collections via a common forum. We provide tools to upload processed images and results to the database via a web services client application that can be downloaded from the database. We developed ClearedLeavesDB, a database focusing on cleared leaf images that combines interactions between users and data via an intuitive web interface. The web interface allows storage of large collections and integrates with leaf image analysis applications via an open application programming interface (API). The open API allows uploading of processed images and other trait data to the database, further enabling distribution and documentation of analyzed data within the community. The initial database is seeded with nearly 19,000 cleared leaf images representing over 40 GB of image data. Extensible storage and growth of the database is ensured by using the data storage resources of the iPlant Discovery Environment. ClearedLeavesDB can be accessed at http://clearedleavesdb.org.

  19. Expanding Access to Large-Scale Genomic Data While Promoting Privacy: A Game Theoretic Approach.

    PubMed

    Wan, Zhiyu; Vorobeychik, Yevgeniy; Xia, Weiyi; Clayton, Ellen Wright; Kantarcioglu, Murat; Malin, Bradley

    2017-02-02

    Emerging scientific endeavors are creating big data repositories of data from millions of individuals. Sharing data in a privacy-respecting manner could lead to important discoveries, but high-profile demonstrations show that links between de-identified genomic data and named persons can sometimes be reestablished. Such re-identification attacks have focused on worst-case scenarios and spurred the adoption of data-sharing practices that unnecessarily impede research. To mitigate concerns, organizations have traditionally relied upon legal deterrents, like data use agreements, and are considering suppressing or adding noise to genomic variants. In this report, we use a game theoretic lens to develop more effective, quantifiable protections for genomic data sharing. This is a fundamentally different approach because it accounts for adversarial behavior and capabilities and tailors protections to anticipated recipients with reasonable resources, not adversaries with unlimited means. We demonstrate this approach via a new public resource with genomic summary data from over 8,000 individuals-the Sequence and Phenotype Integration Exchange (SPHINX)-and show that risks can be balanced against utility more effectively than with traditional approaches. We further show the generalizability of this framework by applying it to other genomic data collection and sharing endeavors. Recognizing that such models are dependent on a variety of parameters, we perform extensive sensitivity analyses to show that our findings are robust to their fluctuations. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.
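
    A toy sketch of the kind of trade-off described above, with entirely hypothetical payoffs (not the paper's model or parameters): the data sharer chooses how much summary data to suppress, an adversary with bounded resources attacks only when the expected gain exceeds the attack cost, and the sharer picks the least suppression that still deters the attack.

      # Toy sharer-vs-adversary game for genomic summary data (hypothetical payoffs).

      def adversary_gain(frac_suppressed, base_gain=10.0):
          """Expected value of a re-identification attempt; suppressing more
          summary statistics lowers the attacker's success odds."""
          return base_gain * (1.0 - frac_suppressed) ** 2

      def sharer_utility(frac_suppressed, attack_happens, research_value=100.0,
                         breach_penalty=50.0):
          """Research utility lost to suppression, plus a penalty if attacked."""
          utility = research_value * (1.0 - frac_suppressed)
          return utility - (breach_penalty if attack_happens else 0.0)

      attack_cost = 4.0  # adversary with "reasonable resources", not unlimited means
      best = None
      for pct in range(0, 101, 5):
          s = pct / 100.0
          attacks = adversary_gain(s) > attack_cost          # adversary best response
          u = sharer_utility(s, attacks)
          if best is None or u > best[1]:
              best = (s, u, attacks)

      s, u, attacks = best
      print(f"optimal suppression ~{s:.0%}, sharer utility {u:.1f}, attack occurs: {attacks}")
      # The optimum suppresses just enough that attacking stops being worthwhile,
      # rather than defending against a worst-case adversary with unlimited means.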

  20. Collaborative science, policy development and program implementation in the transboundary Georgia Basin/Puget Sound ecosystem.

    PubMed

    Fraser, David A; Gaydos, Joseph K; Karlsen, Erik; Rylko, Michael S

    2006-02-01

    The transboundary Georgia Basin Puget Sound ecosystem is situated in the southwest corner of British Columbia and northwest corner of Washington State. While bountiful and beautiful, this international region is facing significant threats to its marine and freshwater resources, air quality, habitats and species. These environmental challenges are compounded by rapid population growth and attendant urban sprawl. As ecosystem stresses amplified and partnerships formed around possible solutions, it became increasingly clear that the shared sustainability challenges in the Georgia Basin and Puget Sound required shared solutions. Federal, state and provincial institutional arrangements were made between jurisdictions, which formalized small-scale interest in transboundary management of this ecosystem. Formal agreements, however, can only do so much to further management of an ecosystem that spans international borders. A transboundary regional research meeting, the 2003 GB/PS Research Conference, opened the doors for large-scale informal cross-border cooperation and management. In addition to cooperation, continued efforts to stem toxic pollution, contain urban growth, and protect and restore ecosystems, require a commitment from scientists, educators and policy makers to better integrate research and science with decision-making.

  1. Cross-Jurisdictional Resource Sharing in Local Health Departments: Implications for Services, Quality, and Cost.

    PubMed

    Humphries, Debbie L; Hyde, Justeen; Hahn, Ethan; Atherly, Adam; O'Keefe, Elaine; Wilkinson, Geoffrey; Eckhouse, Seth; Huleatt, Steve; Wong, Samuel; Kertanis, Jennifer

    2018-01-01

    Forty-one percent of local health departments in the U.S. serve jurisdictions with populations of 25,000 or less. Researchers, policymakers, and advocates have long questioned how to strengthen public health systems in smaller municipalities. Cross-jurisdictional sharing may increase quality of service, access to resources, and efficiency of resource use. We aimed to characterize perceived strengths and challenges of independent and comprehensive sharing approaches, and to assess cost, quality, and breadth of services provided by independent and sharing health departments in Connecticut (CT) and Massachusetts (MA). We interviewed local health directors or their designees from 15 comprehensive resource-sharing jurisdictions and 54 single-municipality jurisdictions in CT and MA using a semi-structured interview. Quantitative data were drawn from closed-ended questions in the semi-structured interviews; municipal demographic data were drawn from the American Community Survey and other public sources. Qualitative data were drawn from open-ended questions in the semi-structured interviews. The findings from this multistate study highlight advantages and disadvantages of two common public health service delivery models - independent and shared. Shared service jurisdictions provided more community health programs and services, and invested significantly more ($120 per thousand (1K) population vs. $69.5/1K population) on healthy food access activities. Sharing departments had more indicators of higher quality food safety inspections (FSIs), and there was a non-linear relationship between cost per FSI and number of FSIs. Minimum cost per FSI was reached above the total number of FSIs conducted by all but four of the jurisdictions sampled. Independent jurisdictions perceived their governing bodies to have greater understanding of the roles and responsibilities of local public health, while shared service jurisdictions had fewer staff per 1,000 population. There are trade-offs with sharing and remaining independent. Independent health departments serving small jurisdictions have limited resources but strong local knowledge. Multi-municipality departments have more resources but require more time and investment in governance and decision-making. When making decisions about the right service delivery model for a given municipality, careful consideration should be given to local culture and values. Some economies of scale may be achieved through resource sharing for municipalities <25,000 population.

  2. The Scalable Checkpoint/Restart Library

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moody, A.

    The Scalable Checkpoint/Restart (SCR) library provides an interface that codes may use to write out and read in application-level checkpoints in a scalable fashion. In the current implementation, checkpoint files are cached in local storage (hard disk or RAM disk) on the compute nodes. This technique provides scalable aggregate bandwidth and uses storage resources that are fully dedicated to the job. This approach addresses the two common drawbacks of checkpointing a large-scale application to a shared parallel file system, namely, limited bandwidth and file system contention. In fact, on current platforms, SCR scales linearly with the number of compute nodes. It has been benchmarked as high as 720 GB/s on 1,094 nodes of Atlas, which is nearly two orders of magnitude faster than the parallel file system.
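
    A schematic Python sketch of the caching idea described above (not the SCR library's actual C interface): the application writes every checkpoint to fast node-local storage and only occasionally copies ("flushes") one to the shared parallel file system, so most checkpoints avoid the shared-file-system bottleneck. Paths and the flush interval are illustrative assumptions.

      # Schematic application-level checkpointing to node-local storage, with an
      # occasional flush to the parallel file system. This is an illustration of
      # the pattern, not the SCR API.
      import pickle, shutil
      from pathlib import Path

      LOCAL_CACHE = Path("/tmp/ckpt")          # stands in for node-local RAM/SSD disk
      PARALLEL_FS = Path("/tmp/parallel_fs")   # stands in for the shared file system
      FLUSH_EVERY = 10                         # flush 1 in 10 checkpoints

      def checkpoint(step, state):
          LOCAL_CACHE.mkdir(parents=True, exist_ok=True)
          path = LOCAL_CACHE / f"ckpt_{step:06d}.pkl"
          with path.open("wb") as f:
              pickle.dump(state, f)            # fast: dedicated local bandwidth
          if step % FLUSH_EVERY == 0:          # rare: contended shared bandwidth
              PARALLEL_FS.mkdir(parents=True, exist_ok=True)
              shutil.copy2(path, PARALLEL_FS / path.name)
          return path

      state = {"step": 0, "fields": [0.0] * 1000}
      for step in range(1, 31):
          state["step"] = step
          checkpoint(step, state)
      print("local checkpoints:", len(list(LOCAL_CACHE.glob("*.pkl"))),
            "| flushed to shared FS:", len(list(PARALLEL_FS.glob("*.pkl"))))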

  3. An interactive web application for the dissemination of human systems immunology data.

    PubMed

    Speake, Cate; Presnell, Scott; Domico, Kelly; Zeitner, Brad; Bjork, Anna; Anderson, David; Mason, Michael J; Whalen, Elizabeth; Vargas, Olivia; Popov, Dimitry; Rinchai, Darawan; Jourde-Chiche, Noemie; Chiche, Laurent; Quinn, Charlie; Chaussabel, Damien

    2015-06-19

    Systems immunology approaches have proven invaluable in translational research settings. The current rate at which large-scale datasets are generated presents unique challenges and opportunities. Mining aggregates of these datasets could accelerate the pace of discovery, but new solutions are needed to integrate the heterogeneous data types with the contextual information that is necessary for interpretation. In addition, enabling tools and technologies facilitating investigators' interaction with large-scale datasets must be developed in order to promote insight and foster knowledge discovery. State of the art application programming was employed to develop an interactive web application for browsing and visualizing large and complex datasets. A collection of human immune transcriptome datasets were loaded alongside contextual information about the samples. We provide a resource enabling interactive query and navigation of transcriptome datasets relevant to human immunology research. Detailed information about studies and samples are displayed dynamically; if desired the associated data can be downloaded. Custom interactive visualizations of the data can be shared via email or social media. This application can be used to browse context-rich systems-scale data within and across systems immunology studies. This resource is publicly available online at [Gene Expression Browser Landing Page ( https://gxb.benaroyaresearch.org/dm3/landing.gsp )]. The source code is also available openly [Gene Expression Browser Source Code ( https://github.com/BenaroyaResearch/gxbrowser )]. We have developed a data browsing and visualization application capable of navigating increasingly large and complex datasets generated in the context of immunological studies. This intuitive tool ensures that, whether taken individually or as a whole, such datasets generated at great effort and expense remain interpretable and a ready source of insight for years to come.

  4. Transmission Challenges and Best Practices for Cost-Effective Renewable Energy Delivery across State and Provincial Boundaries

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Shengru; Hurlbut, David J.; Bird, Lori A.

    A strategically planned transmission network is an important source of flexibility for the integration of large-scale renewable energy (RE). Such a network can offer access to a broad geographic diversity of resources, which can reduce flexibility needs and facilitate sharing between neighboring balancing areas. This report builds on two previous NREL technical reports - Advancing System Flexibility for High Penetration Renewable Integration (Milligan et al. 2015) and 'Renewables-Friendly' Grid Development Strategies (Hurlbut et al. 2015) - which discuss various flexibility options and provide an overview of U.S. market models and grid planning. This report focuses on addressing issues with cross-regional/provincial transmission in China with the aim of integrating renewable resources that are concentrated in remote areas and require inter-regional/provincial power exchange.

  5. PERSPECTIVES ON LARGE-SCALE NATURAL RESOURCES SURVEYS WHEN CAUSE-EFFECT IS A POTENTIAL ISSUE

    EPA Science Inventory

    Our objective is to present a perspective on large-scale natural resource monitoring when cause-effect is a potential issue. We believe that the approach of designing a survey to meet traditional commodity production and resource state descriptive objectives is too restrictive an...

  6. Does a House Divided Stand? Kinship and the Continuity of Shared Living Arrangements

    ERIC Educational Resources Information Center

    Glick, Jennifer E.; Van Hook, Jennifer

    2011-01-01

    Shared living arrangements can provide housing, economies of scale, and other instrumental support and may become an important resource in times of economic constraint. But the extent to which such living arrangements experience continuity or rapid change in composition is unclear. Previous research on extended-family households tended to focus on…

  7. Thermal energy storage for CSP (Concentrating Solar Power)

    NASA Astrophysics Data System (ADS)

    Py, Xavier; Sadiki, Najim; Olives, Régis; Goetz, Vincent; Falcoz, Quentin

    2017-07-01

    The major advantage of concentrating solar power over photovoltaics is the possibility of storing thermal energy at large scale, allowing dispatchability. Thus, only CSP plants that include thermal storage can be operated 24 h/day using exclusively the solar resource. Nevertheless, because the availability of mined nitrate salts is too low, the currently mature two-tank molten-salt technology cannot be applied to achieve the expected international share of power production by 2050. Alternative storage materials are therefore under study, such as natural rocks and recycled ceramics made from industrial wastes. The present paper is a review of those alternative approaches.

  8. Towards Device-Independent Information Processing on General Quantum Networks

    NASA Astrophysics Data System (ADS)

    Lee, Ciarán M.; Hoban, Matty J.

    2018-01-01

    The violation of certain Bell inequalities allows for device-independent information processing secure against nonsignaling eavesdroppers. However, this only holds for the Bell network, in which two or more agents perform local measurements on a single shared source of entanglement. To overcome the practical constraints that entangled systems can only be transmitted over relatively short distances, large-scale multisource networks have been employed. Do there exist analogs of Bell inequalities for such networks, whose violation is a resource for device independence? In this Letter, the violation of recently derived polynomial Bell inequalities will be shown to allow for device independence on multisource networks, secure against nonsignaling eavesdroppers.
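
    As a concrete illustration of a polynomial Bell inequality for a multisource network, consider the well-known constraint for the simplest two-source ("bilocality") scenario; this is a representative example rather than necessarily one of the specific inequalities derived in this Letter. Writing the central party's four-outcome measurement as two output bits $B^{0}$ and $B^{1}$, every model with two independent sources obeys
    \[
      \sqrt{|I|} + \sqrt{|J|} \;\le\; 1,
      \qquad
      I = \tfrac{1}{4}\sum_{x,z\in\{0,1\}} \langle A_x\, B^{0}\, C_z \rangle,
      \qquad
      J = \tfrac{1}{4}\sum_{x,z\in\{0,1\}} (-1)^{x+z}\, \langle A_x\, B^{1}\, C_z \rangle,
    \]
    and entanglement distributed by the two independent sources can violate this bound, which is the sense in which violations of such polynomial inequalities serve as a device-independent resource on networks.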

  9. Towards Device-Independent Information Processing on General Quantum Networks.

    PubMed

    Lee, Ciarán M; Hoban, Matty J

    2018-01-12

    The violation of certain Bell inequalities allows for device-independent information processing secure against nonsignaling eavesdroppers. However, this only holds for the Bell network, in which two or more agents perform local measurements on a single shared source of entanglement. To overcome the practical constraints that entangled systems can only be transmitted over relatively short distances, large-scale multisource networks have been employed. Do there exist analogs of Bell inequalities for such networks, whose violation is a resource for device independence? In this Letter, the violation of recently derived polynomial Bell inequalities will be shown to allow for device independence on multisource networks, secure against nonsignaling eavesdroppers.

  10. Principles for an interactive multi-scale assessment of sustainable production limits - lessons from the Limpopo river basin case, South Africa

    NASA Astrophysics Data System (ADS)

    Froebrich, Jochen; de Cleccq, Willem; Veraart, Jeroen; Vullings, Wies

    2015-04-01

    About 7.2 billion people currently live on the Earth and the population is projected to reach 9.6 billion by 2050; that growth will occur mainly in developing countries, with more than half in Africa (United Nations 2013). Any local extension of irrigated agriculture in a region of scarce natural resources may restrict the possibility of extending land and water use at another location of the same river basin. In order to support, develop, and assess such future interventions, it is important to define limits within which sustainable production can take place at a given location, taking into account competing claims on natural resources, human welfare and impacts on environmental quality. We define sustainable production limits as limits on possible resource use within which production can be extended without restricting growth opportunities at a neighbouring location. The more threatened the natural resources become, the more important it is to consider the effect of other upcoming interventions within the same region. As a consequence, interventions for future resource use have to be assessed against the future available natural resources. This is of particular relevance for evaluating possible extensions of irrigation areas within a river basin. Investigating possible limits for extending irrigated agriculture at local scale requires an understanding of the complexity, including boundaries, activities, stakeholders, and opportunities at river basin scale, and more. Switching between the scales in this information, in a participatory process, appears to be a challenge in its own right. Within the Limpopo River basin (South Africa), we analysed (i) possible interventions at local scale (transdisciplinary innovation of irrigation by smallholders, launching of PPPs), (ii) restrictions for developing irrigation at the Letaba sub-basin scale, and (iii) water balance at the scale of the Limpopo basin. Experience from the Limpopo case revealed that, within the field of socio-hydrology, interventions affecting land and water use depend to a large extent on entrepreneurial or at least human initiative and an enabling environment. Such variables cannot be included in quantitative deterministic models. Therefore we have to find other ways to anticipate future developments. Furthermore, for the upscaling and downscaling of local interventions it is important to reduce complexity. Instead of providing a plethora of scenarios, which will only hinder decision making, the process of defining sustainable production limits has to culminate in a jointly shared strategy for the most likely future use of land, water and biodiversity resources. More experience must be gained in how best to facilitate such interactive development of a jointly shared strategy. Modern interactive IT tools can play a major role, but require a strong interaction with hydrological models and water balance calculations at the various scales. Acknowledgement: The work has been partly executed within the EU project FP7 EAU4Food, the PPP project Inno Giyani, which is funded by the Dutch Government, and the Dutch-funded project "More food on smaller footprint". The authors are grateful to all project partners and colleagues for the contributions and discussion.

  11. "I'm Not Sharing My Work!" An Approach to Community Building

    ERIC Educational Resources Information Center

    Lewis, David; Slapak-Barski, Judith

    2014-01-01

    The faculty of a large southeastern university were brought together to form a "community of faculty." With support from a federal grant we set out to build what came to be called the Faculty Toolbox. This website serves as a public repository that brings the faculty together to share and contribute to its library of shared resources.…

  12. A Real-Time Web of Things Framework with Customizable Openness Considering Legacy Devices

    PubMed Central

    Zhao, Shuai; Yu, Le; Cheng, Bo

    2016-01-01

    With the development of the Internet of Things (IoT), resources and applications based on it have emerged on a large scale. However, most efforts are “silo” solutions where devices and applications are tightly coupled. Infrastructures are needed to connect sensors to the Internet, open up and break the current application silos and move to a horizontal application mode. Based on the concept of Web of Things (WoT), many infrastructures have been proposed to integrate the physical world with the Web. However, issues such as no real-time guarantee, lack of fine-grained control of data, and the absence of explicit solutions for integrating heterogeneous legacy devices, hinder their widespread and practical use. To address these issues, this paper proposes a WoT resource framework that provides the infrastructures for the customizable openness and sharing of users’ data and resources under the premise of ensuring the real-time behavior of their own applications. The proposed framework is validated by actual systems and experimental evaluations. PMID:27690038

  13. A Real-Time Web of Things Framework with Customizable Openness Considering Legacy Devices.

    PubMed

    Zhao, Shuai; Yu, Le; Cheng, Bo

    2016-09-28

    With the development of the Internet of Things (IoT), resources and applications based on it have emerged on a large scale. However, most efforts are "silo" solutions where devices and applications are tightly coupled. Infrastructures are needed to connect sensors to the Internet, open up and break the current application silos and move to a horizontal application mode. Based on the concept of Web of Things (WoT), many infrastructures have been proposed to integrate the physical world with the Web. However, issues such as no real-time guarantee, lack of fine-grained control of data, and the absence of explicit solutions for integrating heterogeneous legacy devices, hinder their widespread and practical use. To address these issues, this paper proposes a WoT resource framework that provides the infrastructures for the customizable openness and sharing of users' data and resources under the premise of ensuring the real-time behavior of their own applications. The proposed framework is validated by actual systems and experimental evaluations.

  14. Decentralized Data Sharing of Tissue Microarrays for Investigative Research in Oncology

    PubMed Central

    Chen, Wenjin; Schmidt, Cristina; Parashar, Manish; Reiss, Michael; Foran, David J.

    2007-01-01

    Tissue microarray technology (TMA) is a relatively new approach for efficiently and economically assessing protein and gene expression across large ensembles of tissue specimens. Tissue microarray technology holds great potential for reducing the time and cost associated with conducting research in tissue banking, proteomics, and outcome studies. However, the sheer volume of images and other data generated from even limited studies involving tissue microarrays quickly approach the processing capacity and resources of a division or department. This challenge is compounded by the fact that large-scale projects in several areas of modern research rely upon multi-institutional efforts in which investigators and resources are spread out over multiple campuses, cities, and states. To address some of the data management issues several leading institutions have begun to develop their own “in-house” systems, independently, but such data will be only minimally useful if it isn’t accessible to others in the scientific community. Investigators at different institutions studying the same or related disorders might benefit from the synergy of sharing results. To facilitate sharing of TMA data across different database implementations, the Technical Standards Committee of the Association for Pathology Informatics organized workshops in efforts to establish a standardized TMA data exchange specification. The focus of our research does not relate to the establishment of standards for exchange, but rather builds on these efforts and concentrates on the design, development and deployment of a decentralized collaboratory for the unsupervised characterization, and seamless and secure discovery and sharing of TMA data. Specifically, we present a self-organizing, peer-to-peer indexing and discovery infrastructure for quantitatively assessing digitized TMA’s. The system utilizes a novel, optimized decentralized search engine that supports flexible querying, while guaranteeing that once information has been stored in the system, it will be found with bounded costs. PMID:19081778
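
    A minimal sketch of one common way such a bounded-cost guarantee is achieved (a structured, consistent-hashing overlay); the abstract does not specify the collaboratory's actual routing protocol, and the peer names here are hypothetical. Every stored key maps deterministically to a responsible peer, so any query can be answered by contacting a predictable node rather than flooding the network.

      # Minimal consistent-hashing index: keys map deterministically to peers, so a
      # stored item is always found by querying one predictable node. Generic
      # illustration only; not the TMA collaboratory's specific protocol.
      import bisect, hashlib

      def h(value: str) -> int:
          return int(hashlib.sha1(value.encode()).hexdigest(), 16)

      class ConsistentHashIndex:
          def __init__(self, peers):
              self.ring = sorted((h(p), p) for p in peers)
              self.keys = [k for k, _ in self.ring]

          def responsible_peer(self, key: str) -> str:
              """Deterministic key -> peer mapping (first peer clockwise on the ring)."""
              i = bisect.bisect_right(self.keys, h(key)) % len(self.ring)
              return self.ring[i][1]

      peers = [f"node{i}.example.org" for i in range(8)]     # hypothetical peer names
      index = ConsistentHashIndex(peers)
      for key in ("TMA-slide-0042", "TMA-slide-0917", "stain=ER;score>2"):
          print(key, "->", index.responsible_peer(key))
      # Both the peer that stores a record and the peer that later looks it up
      # compute the same mapping, so discovery cost stays bounded as the network grows.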

  15. How institutions shaped the last major evolutionary transition to large-scale human societies

    PubMed Central

    Powers, Simon T.; van Schaik, Carel P.; Lehmann, Laurent

    2016-01-01

    What drove the transition from small-scale human societies centred on kinship and personal exchange, to large-scale societies comprising cooperation and division of labour among untold numbers of unrelated individuals? We propose that the unique human capacity to negotiate institutional rules that coordinate social actions was a key driver of this transition. By creating institutions, humans have been able to move from the default ‘Hobbesian’ rules of the ‘game of life’, determined by physical/environmental constraints, into self-created rules of social organization where cooperation can be individually advantageous even in large groups of unrelated individuals. Examples include rules of food sharing in hunter–gatherers, rules for the usage of irrigation systems in agriculturalists, property rights and systems for sharing reputation between mediaeval traders. Successful institutions create rules of interaction that are self-enforcing, providing direct benefits both to individuals that follow them, and to individuals that sanction rule breakers. Forming institutions requires shared intentionality, language and other cognitive abilities largely absent in other primates. We explain how cooperative breeding likely selected for these abilities early in the Homo lineage. This allowed anatomically modern humans to create institutions that transformed the self-reliance of our primate ancestors into the division of labour of large-scale human social organization. PMID:26729937

  16. Researching Resistance to Open Education Resource Contribution: An Activity Theory Approach

    ERIC Educational Resources Information Center

    Cox, Glenda

    2013-01-01

    Higher education and associated institutions are beginning to share teaching materials known as Open Educational Resources (OER) or open courseware across the globe. Their success depends largely on the willingness of academics at these institutions to add their teaching resources. In a survey of the literature on OER there are several articles…

  17. Using Linked Data to Annotate and Search Educational Video Resources for Supporting Distance Learning

    ERIC Educational Resources Information Center

    Yu, Hong Qing; Pedrinaci, C.; Dietze, S.; Domingue, J.

    2012-01-01

    Multimedia educational resources play an important role in education, particularly for distance learning environments. With the rapid growth of the multimedia web, large numbers of educational video resources are increasingly being created by several different organizations. It is crucial to explore, share, reuse, and link these educational…

  18. ASK-LDT 2.0: A Web-Based Graphical Tool for Authoring Learning Designs

    ERIC Educational Resources Information Center

    Zervas, Panagiotis; Fragkos, Konstantinos; Sampson, Demetrios G.

    2013-01-01

    During the last decade, Open Educational Resources (OERs) have gained increased attention for their potential to support open access, sharing and reuse of digital educational resources. Therefore, a large amount of digital educational resources have become available worldwide through web-based open access repositories which are referred to as…

  19. Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets

    PubMed Central

    Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L

    2014-01-01

    Background As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze it. Methods Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Results Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Conclusions Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. PMID:24464852

  20. Measure for Measure: Urban Water and Energy

    NASA Astrophysics Data System (ADS)

    Chini, C.; Stillwell, A. S.

    2017-12-01

    Urban environments in the United States account for a majority of the population and, as such, require large volumes of treated drinking water supply and wastewater removal, both of which need energy. Despite the large share of water that urban environments demand, there is limited accounting of these water resources outside of the city itself. In this study, we provide and analyze a database of drinking water and wastewater utility flows and energy that comprise anthropogenic fluxes of water through the urban environment. We present statistical analyses of the database at an annual, spatial, and intra-annual scale. The average daily per person water flux is estimated as 563 liters of drinking water and 496 liters of wastewater, requiring 340 kWh/1000 m3 and 430 kWh/1000 m3 of energy, respectively, to treat these resources. This energy demand accounts for 1% of the total annual electricity production of the United States. Additionally, the water and embedded energy loss associated with non-revenue water (estimated at 15.8% annually) accounts for 9.1 km3 of water and 3600 GWh, enough electricity to power 300,000 U.S. households annually. Through the analysis and benchmarking of the current state of urban water fluxes, we propose the term 'blue city,' which promotes urban sustainability and conservation policy focusing on water resources. As the nation's water resources become scarcer and more unpredictable, it is essential to include water resources in urban sustainability planning and continue data collection of these vital resources.
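
    The per-person energy figures above can be checked with simple arithmetic. The sketch below (Python, written for this summary; the constants are exactly the values quoted in the abstract) converts the daily per-person water fluxes to cubic metres and multiplies by the reported treatment energy intensities.

        # Arithmetic check of the per-person figures quoted above: daily water
        # volumes converted to cubic metres, multiplied by the reported
        # treatment energy intensities (kWh per 1000 m3).
        DRINKING_L_PER_DAY, DRINKING_KWH_PER_1000M3 = 563, 340
        WASTE_L_PER_DAY, WASTE_KWH_PER_1000M3 = 496, 430

        def daily_kwh(litres, kwh_per_1000m3):
            """Energy needed to treat a daily per-person water flux."""
            return (litres / 1000.0) * (kwh_per_1000m3 / 1000.0)

        per_person_day = (daily_kwh(DRINKING_L_PER_DAY, DRINKING_KWH_PER_1000M3)
                          + daily_kwh(WASTE_L_PER_DAY, WASTE_KWH_PER_1000M3))
        print(f"~{per_person_day:.2f} kWh per person per day")         # about 0.40 kWh
        print(f"~{per_person_day * 365:.0f} kWh per person per year")  # about 148 kWh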

  1. Dynamic Collaboration Infrastructure for Hydrologic Science

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Castillo, C.; Yi, H.; Jiang, F.; Jones, N.; Goodall, J. L.

    2016-12-01

    Data and modeling infrastructure is becoming increasingly accessible to water scientists. HydroShare is a collaborative environment that currently offers water scientists the ability to access modeling and data infrastructure in support of data intensive modeling and analysis. It supports the sharing of and collaboration around "resources" which are social objects defined to include both data and models in a structured standardized format. Users collaborate around these objects via comments, ratings, and groups. HydroShare also supports web services and cloud based computation for the execution of hydrologic models and analysis and visualization of hydrologic data. However, the quantity and variety of data and modeling infrastructure that can be accessed from environments like HydroShare are increasing. Storage infrastructure can range from one's local PC to campus or organizational storage to storage in the cloud. Modeling or computing infrastructure can range from one's desktop to departmental clusters to national HPC resources to grid and cloud computing resources. How does one orchestrate this vast array of data and computing infrastructure without having to learn each new system? A common limitation across these systems is the lack of efficient integration between data transport mechanisms and the corresponding high-level services to support large distributed data and compute operations. A scientist running a hydrology model from their desktop may require processing a large collection of files across the aforementioned storage and compute resources and various national databases. To address these community challenges, a proof-of-concept prototype was created integrating HydroShare with RADII (Resource Aware Data-centric collaboration Infrastructure) to provide software infrastructure to enable the comprehensive and rapid dynamic deployment of what we refer to as "collaborative infrastructure." In this presentation we discuss the results of this proof-of-concept prototype, which enabled HydroShare users to readily instantiate virtual infrastructure marshaling arbitrary combinations, varieties, and quantities of distributed data and computing infrastructure to address big problems in hydrology.

  2. Governance of global health research consortia: Sharing sovereignty and resources within Future Health Systems.

    PubMed

    Pratt, Bridget; Hyder, Adnan A

    2017-02-01

    Global health research partnerships are increasingly taking the form of consortia that conduct programs of research in low and middle-income countries (LMICs). An ethical framework has been developed that describes how the governance of consortia comprised of institutions from high-income countries and LMICs should be structured to promote health equity. It encompasses initial guidance for sharing sovereignty in consortia decision-making and sharing consortia resources. This paper describes a first effort to examine whether and how consortia can uphold that guidance. Case study research was undertaken with the Future Health Systems consortium, which performs research to improve health service delivery for the poor in Bangladesh, China, India, and Uganda. Data were thematically analysed and revealed that proposed ethical requirements for sharing sovereignty and sharing resources are largely upheld by Future Health Systems. Facilitating factors included having a decentralised governance model, LMIC partners with good research capacity, and firm budgets. Higher labour costs in the US and UK and the funder's policy of allocating funds to consortia on a reimbursement basis prevented full alignment with guidance on sharing resources. The lessons described in this paper can assist other consortia to more systematically link their governance policy and practice to the promotion of health equity. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared-memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.

  4. Exploring the effectiveness of sustainable water management structures in the Upper Pungwe river basin

    NASA Astrophysics Data System (ADS)

    Nyikadzino, B.; Chibisa, P.; Makurira, H.

    The study endeavoured to assess the effectiveness of stakeholder structures and their participation in sustainable water resources management in the Upper Pungwe river basin shared by Zimbabwe and Mozambique. The study sought to assess the level and effectiveness of stakeholder, gender, and vulnerable-group representation in sustainable water resources management, as well as the whole stakeholder participation process. The study employed both qualitative and quantitative methods for data collection and analysis. Sample data were obtained from 15 stakeholder representatives (councillors) constituting the Pungwe Subcatchment Council, 30 water users ranging from small-scale to large-scale users, and professionals in water resources management. Two different questionnaires and three structured interviews were administered during the study. The water permit database, financial reports, and other source documents were also analysed. The study established that the sustainability and effectiveness of stakeholder structures and their participation in water resources management are being compromised by a lack of stakeholder awareness. Water utilisation is very high in the subcatchment (99%) while women's participation is still low (20%). The study therefore recommends the use of quotas for the participation of women in stakeholder structures. Stakeholder structures are encouraged to intensify stakeholder awareness on issues of river protection, efficient water use and pollution control. Further research is recommended on the effectiveness of stakeholder structures in combating water pollution and enhancing river protection.

  5. Scaling issues in sustainable river basin management

    NASA Astrophysics Data System (ADS)

    Timmerman, Jos; Froebich, Jochen

    2014-05-01

    Sustainable river basin management implies considering the whole river basin when managing the water resources. Management measures aim at dividing the water over different uses (nature, agriculture, industry, households) thereby avoiding calamities such as having too much, too little, or poor-quality water. Water management measures are taken at the local level, usually considering the sub-national and sometimes national effects of such measures. A large part of the world's freshwater resources, however, is contained in river basins and groundwater systems that are shared by two or more countries. Sustainable river basin management consequently has to encompass local, regional, national and international scales. This requires coordination over and cooperation between these levels that is currently compressed into the term 'water governance'. Governance takes into account that a large number of stakeholders in different regimes (the principles, rules and procedures that steer management) contribute to policy and management of a resource. Governance includes the increasing importance of basically non-hierarchical modes of governing, where non-state actors (formal organizations like NGOs, private companies, consumer associations, etc.) participate in the formulation and implementation of public policy. Land use determines the run-off generation and use of irrigation water. Land use is increasingly determined by private sector initiatives at local scale. This is a complicating factor in the governance issue because, compared with earlier developments of large-scale irrigation systems, planning institutions at the state level then have less insight into actual water consumption. The water management regime of a basin consequently has to account for the different scales of water management and, within each of these scales, for both state and non-state actors. The central elements of regimes include the policy setting (the policies and water management strategies), legal setting (national and international laws and agreements), the institutional setting (the formal networks), information management (the information collection and dissemination system), and financing systems (the public and private sources that cover the water management costs). These elements are usually designed for a specific level and are ideally aligned with the other levels. The presentation will go into detail on connecting the different elements of the water management regime between different levels as well as on the overarching governance issues that play a role and will present opportunities and limitations of the linking options.

  6. Tried & Tested. Ideas from Teacher Centers in the Southeast.

    ERIC Educational Resources Information Center

    Bohstedt, Jinx, Ed.; Eisenmann-Donahue, Pat, Ed.

    Throughout the southeastern United States, teacher centers share much in common. The conceptual framework of teachers helping teachers inspires the development of resources and services which are similar whether the center serves a large district or only a few schools. Although the teacher centers share similar philosophies, concerns, successes,…

  7. Supporting Shared Resource Usage for a Diverse User Community: the OSG Experience and Lessons Learned

    NASA Astrophysics Data System (ADS)

    Garzoglio, Gabriele; Levshina, Tanya; Rynge, Mats; Sehgal, Chander; Slyz, Marko

    2012-12-01

    The Open Science Grid (OSG) supports a diverse community of new and existing users in adopting and making effective use of the Distributed High Throughput Computing (DHTC) model. The LHC user community has deep local support within the experiments. For other smaller communities and individual users the OSG provides consulting and technical services through the User Support area. We describe these sometimes successful and sometimes not so successful experiences and analyze lessons learned that are helping us improve our services. The services offered include forums to enable shared learning and mutual support, tutorials and documentation for new technology, and troubleshooting of problematic or systemic failure modes. For new communities and users, we bootstrap their use of the distributed high throughput computing technologies and resources available on the OSG by following a phased approach. We first adapt the application and run a small production campaign on a subset of “friendly” sites. Only then do we move the user to run full production campaigns across the many remote sites on the OSG, adding to the community resources up to hundreds of thousands of CPU hours per day. This scaling up generates new challenges - like no determinism in the time to job completion, and diverse errors due to the heterogeneity of the configurations and environments - so some attention is needed to get good results. We cover recent experiences with image simulation for the Large Synoptic Survey Telescope (LSST), small-file large volume data movement for the Dark Energy Survey (DES), civil engineering simulation with the Network for Earthquake Engineering Simulation (NEES), and accelerator modeling with the Electron Ion Collider group at BNL. We will categorize and analyze the use cases and describe how our processes are evolving based on lessons learned.

  8. Program on application of communications satellites to educational development

    NASA Technical Reports Server (NTRS)

    Morgan, R. P.; Singh, J. P.

    1971-01-01

    Interdisciplinary research in needs analysis, communications technology studies, and systems synthesis is reported. Existing and planned educational telecommunications services are studied and library utilization of telecommunications is described. Preliminary estimates are presented of ranges of utilization of educational telecommunications services for 1975 and 1985; instructional and public television, computer-aided instruction, computing resources, and information resource sharing for various educational levels and purposes. Communications technology studies include transmission schemes for still-picture television, use of Gunn effect devices, and TV receiver front ends for direct satellite reception at 12 GHz. Two major studies in the systems synthesis project concern (1) organizational and administrative aspects of a large-scale instructional satellite system to be used with schools and (2) an analysis of future development of instructional television, with emphasis on the use of video tape recorders and cable television. A communications satellite system synthesis program developed for NASA is now operational on the university IBM 360-50 computer.

  9. A new workforce in the making? A case study of strategic human resource management in a whole-system change effort in healthcare.

    PubMed

    Macfarlane, Fraser; Greenhalgh, Trish; Humphrey, Charlotte; Hughes, Jane; Butler, Ceri; Pawson, Ray

    2011-01-01

    This paper seeks to describe the exploration of human resource issues in one large-scale program of innovation in healthcare. It is informed by established theories of management in the workplace and a multi-level model of diffusion of innovations. A realist approach was used based on interviews, ethnographic observation and documentary analysis. Five main approaches ("theories of change") were adopted to develop and support the workforce: recruiting staff with skills in service transformation; redesigning roles and creating new roles; enhancing workforce planning; linking staff development to service needs; creating opportunities for shared learning and knowledge exchange. Each had differing levels of success. The paper includes HR implications for the modernisation of a complex service organisation. This is the first time a realist evaluation of a complex health modernisation initiative has been undertaken.

  10. Expanding Approaches for Understanding Impact: Integrating Technology, Curriculum, and Open Educational Resources in Science Education

    ERIC Educational Resources Information Center

    Ye, Lei; Recker, Mimi; Walker, Andrew; Leary, Heather; Yuan, Min

    2015-01-01

    This article reports results from a scale-up study of the impact of a software tool designed to support teachers in the digital learning era. This tool, the Curriculum Customization Service (CCS), enables teachers to access open educational resources from multiple providers, customize them for classroom instruction, and share them with other…

  11. Environmental management of small-scale and artisanal mining: the Portovelo-Zaruma goldmining area, southern Ecuador.

    PubMed

    Tarras-Wahlberg, N H

    2002-06-01

    This paper considers technical measures and policy initiatives needed to improve environmental management in the Portovelo-Zaruma mining district of southern Ecuador. In this area, gold is mined by a large number of small-scale and artisanal operators, and discharges of cyanide and metal-laden tailings have had a severe impact on the shared Ecuadorian-Peruvian Puyango river system. It is shown to be technically possible to confine mining waste and tailings at a reasonable cost. However, the complex topography of the mining district forces tailings management to be communal, where all operators are connected to one central tailings impoundment. This, in turn, implies two things: (i) that a large number of operators must agree to pool resources to bring such a facility into reality; and (ii) that miners must move away from rudimentary operations that survive on a day-to-day basis, towards bigger, mechanized and longer-term sustainable operations that are based on proven ore reserves. It is deemed unlikely that existing environmental regulations and the provision of technical solutions will be sufficient to resolve the environmental problems. Important impediments relate to the limited financial resources available to each individual miner and the problems of pooling these resources, and to the fact that the main impacts of pollution are suffered downstream of the mining district and, hence, do not affect the miners themselves. Three policy measures are therefore suggested. First, the enforcement of existing regulations must be improved, and this may be achieved by the strengthening of the central authority charged with supervision and control of mining activities. Second, local government involvement and local public participation in environmental management needs to be promoted. Third, a clear policy should be defined which promotes the reorganisation of small operations into larger units that are strong enough to sustain rational exploration and environmental obligations. The case study suggests that mining policy in lesser-developed countries should develop to enable small-scale and artisanal miners to form entities that are of a sufficiently large scale to allow adequate and cost-effective environmental protection.

  12. Performance of distributed multiscale simulations

    PubMed Central

    Borgdorff, J.; Ben Belgacem, M.; Bona-Casas, C.; Fazendeiro, L.; Groen, D.; Hoenen, O.; Mizeranschi, A.; Suter, J. L.; Coster, D.; Coveney, P. V.; Dubitzky, W.; Hoekstra, A. G.; Strand, P.; Chopard, B.

    2014-01-01

    Multiscale simulations model phenomena across natural scales using monolithic or component-based code, running on local or distributed resources. In this work, we investigate the performance of distributed multiscale computing of component-based models, guided by six multiscale applications with different characteristics and from several disciplines. Three modes of distributed multiscale computing are identified: supplementing local dependencies with large-scale resources, load distribution over multiple resources, and load balancing of small- and large-scale resources. We find that the first mode has the apparent benefit of increasing simulation speed, and the second mode can increase simulation speed if local resources are limited. Depending on resource reservation and model coupling topology, the third mode may result in a reduction of resource consumption. PMID:24982258

  13. The landscape for epigenetic/epigenomic biomedical resources

    PubMed Central

    Shakya, Kabita; O'Connell, Mary J.; Ruskin, Heather J.

    2012-01-01

    Recent advances in molecular biology and computational power have seen the biomedical sector enter a new era, with corresponding development of Bioinformatics as a major discipline. Generation of enormous amounts of data has driven the need for more advanced storage solutions and shared access through a range of public repositories. The number of such biomedical resources is increasing constantly and mining these large and diverse data sets continues to present real challenges. This paper attempts a general overview of currently available resources, together with remarks on their data mining and analysis capabilities. Of interest here is the recent shift in focus from genetic to epigenetic/epigenomic research and the emergence and extension of resource provision to support this both at local and global scale. Biomedical text and numerical data mining are both considered, the first dealing with automated methods for analyzing research content and information extraction, and the second (broadly) with pattern recognition and prediction. Any summary and selection of resources is inherently limited, given the spectrum available, but the aim is to provide a guideline for the assessment and comparison of currently available provision, particularly as this relates to epigenetics/epigenomics. PMID:22874136

  14. [Thirty years of US long-term ecological research: characteristics, results, and lessons learned--taking the Virginia Coast Reserve as an example].

    PubMed

    Zhu, Gao-Ru; Porter, John H; Xu, Xue-Gong

    2011-06-01

    In order to observe and understand long-term and large-scale ecological changes, the US National Science Foundation initiated a Long-Term Ecological Research (LTER) program in 1980. Over the past 30 years, the US LTER program has achieved advances in ecological and social science research, and in the development of site-based research infrastructure. This paper attributed the success of the program to five characteristics, i.e., 1) consistency of research topics and data across the network, 2) long-term time scale of both the research and the program, 3) flexibility in research content and funding procedures, 4) growth of LTER to include international partners, new disciplines such as social science, advanced research methods, and cooperation among sites, and 5) sharing of data and educational resources. The Virginia Coast Reserve LTER site was taken as an example to illustrate how the US LTER works at the site level. Some suggestions were made for long-term ecological research in China, including strengthening institutional construction, improving network and inter-site cooperation, emphasizing data quality, management, and sharing, reinforcing multidisciplinary cooperation, and expanding public influence.

  15. Disaster and Contingency Planning for Scientific Shared Resource Cores.

    PubMed

    Mische, Sheenah; Wilkerson, Amy

    2016-04-01

    Progress in biomedical research is largely driven by improvements, innovations, and breakthroughs in technology, accelerating the research process, and an increasingly complex collaboration of both clinical and basic science. This increasing sophistication has driven the need for centralized shared resource cores ("cores") to serve the scientific community. From a biomedical research enterprise perspective, centralized resource cores are essential to increased scientific, operational, and cost effectiveness; however, the concentration of instrumentation and resources in the cores may render them highly vulnerable to damage from severe weather and other disasters. As such, protection of these assets and the ability to recover from a disaster is increasingly critical to the mission and success of the institution. Therefore, cores should develop and implement both disaster and business continuity plans and be an integral part of the institution's overall plans. Here we provide an overview of key elements required for core disaster and business continuity plans, guidance, and tools for developing these plans, and real-life lessons learned at a large research institution in the aftermath of Superstorm Sandy.

  16. 77 FR 58415 - Large Scale Networking (LSN); Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET). SUMMARY: The JET, established in 1997, provides for information sharing among Federal...

  17. 78 FR 70076 - Large Scale Networking (LSN)-Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET)#title. SUMMARY: The JET, established in 1997, provides for information sharing among...

  18. Studying Teacher Selection of Resources in an Ultra-Large Scale Interactive System: Does Metadata Guide the Way?

    ERIC Educational Resources Information Center

    Abramovich, Samuel; Schunn, Christian

    2012-01-01

    Ultra-large-scale interactive systems on the Internet have begun to change how teachers prepare for instruction, particularly in regards to resource selection. Consequently, it is important to look at how teachers are currently selecting resources beyond content or keyword search. We conducted a two-part observational study of an existing popular…

  19. Supporting information technology across health boards in New Zealand: themes emerging from the development of a shared services organization.

    PubMed

    Day, K J; Norris, A C

    2006-03-01

    Shared services organizations are ascribed with adding value to business in several ways but especially by sharing resources and leading to economies of scale. However, these gains are not automatic and in some instances, particularly healthcare, they are difficult to achieve. This article describes a project to develop a shared services information technology infrastructure across two district health boards in New Zealand. The study reveals valuable insight into the crisis issues that accompany change management and identifies emergent themes that can be used to reduce negative impact.

  20. Indirect Reciprocity, Resource Sharing, and Environmental Risk: Evidence from Field Experiments in Siberia

    PubMed Central

    Howe, E. Lance; Murphy, James J.; Gerkey, Drew; West, Colin Thor

    2016-01-01

    Integrating information from existing research, qualitative ethnographic interviews, and participant observation, we designed a field experiment that introduces idiosyncratic environmental risk and a voluntary sharing decision into a standard public goods game. Conducted with subsistence resource users in rural villages on the Kamchatka Peninsula in Northeast Siberia, we find evidence consistent with a model of indirect reciprocity and local social norms of helping the needy. When participants are allowed to develop reputations in the experiments, as is the case in most small-scale societies, we find that sharing is increasingly directed toward individuals experiencing hardship, good reputations increase aid, and the pooling of resources through voluntary sharing becomes more effective. We also find high levels of voluntary sharing without a strong commitment device; however, this form of cooperation does not increase contributions to the public good. Our results are consistent with previous experiments and theoretical models, suggesting strategic risks tied to rewards, punishments, and reputations are important. However, unlike studies that focus solely on strategic risks, we find the effects of rewards, punishments, and reputations are altered by the presence of environmental factors. Unexpected changes in resource abundance increase interdependence and may alter the costs and benefits of cooperation, relative to defection. We suggest environmental factors that increase interdependence are critically important to consider when developing and testing theories of cooperation. PMID:27442434

  1. Feasibility of large-scale power plants based on thermoelectric effects

    NASA Astrophysics Data System (ADS)

    Liu, Liping

    2014-12-01

    Heat resources of small temperature difference are easily accessible, free and enormous on the Earth. Thermoelectric effects provide the technology for converting these heat resources directly into electricity. We present designs for electricity generators based on thermoelectric effects that utilize heat resources of small temperature difference, e.g., ocean water at different depths and geothermal resources, and conclude that large-scale power plants based on thermoelectric effects are feasible and economically competitive. The key observation is that the power factor of thermoelectric materials, unlike the figure of merit, can be improved by orders of magnitude upon laminating good conductors and good thermoelectric materials. The predicted large-scale power generators based on thermoelectric effects, if validated, will have the advantages of the scalability, renewability, and free supply of heat resources of small temperature difference on the Earth.
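
    For readers unfamiliar with the distinction the abstract draws, the standard textbook definitions are stated below for reference (they are not taken from the paper itself). The power factor does not involve the thermal conductivity, which is consistent with the abstract's observation that it, unlike the figure of merit, can be raised by laminating good conductors with good thermoelectric materials.

        % Standard definitions: Seebeck coefficient S, electrical conductivity
        % \sigma, thermal conductivity \kappa, absolute temperature T.
        \[
          zT \;=\; \frac{S^{2}\sigma}{\kappa}\,T,
          \qquad
          \mathrm{PF} \;=\; S^{2}\sigma .
        \]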

  2. Scaling and Sustaining Effective Early Childhood Programs Through School-Family-University Collaboration.

    PubMed

    Reynolds, Arthur J; Hayakawa, Momoko; Ou, Suh-Ruu; Mondi, Christina F; Englund, Michelle M; Candee, Allyson J; Smerillo, Nicole E

    2017-09-01

    We describe the development, implementation, and evaluation of a comprehensive preschool to third grade prevention program for the goals of sustaining services at a large scale. The Midwest Child-Parent Center (CPC) Expansion is a multilevel collaborative school reform model designed to improve school achievement and parental involvement from ages 3 to 9. By increasing the dosage, coordination, and comprehensiveness of services, the program is expected to enhance the transition to school and promote more enduring effects on well-being in multiple domains. We review and evaluate evidence from two longitudinal studies (Midwest CPC, 2012 to present; Chicago Longitudinal Study, 1983 to present) and four implementation examples of how the guiding principles of shared ownership, committed resources, and progress monitoring for improvement can promote effectiveness. The implementation system of partners and further expansion using "Pay for Success" financing shows the feasibility of scaling the program while continuing to improve effectiveness. © 2017 The Authors. Child Development published by Wiley Periodicals, Inc. on behalf of Society for Research in Child Development.

  3. Nebhydro: Sharing Geospatial Data to Supportwater Management in Nebraska

    NASA Astrophysics Data System (ADS)

    Kamble, B.; Irmak, A.; Hubbard, K.; Deogun, J.; Dvorak, B.

    2012-12-01

    Recent advances in web-enabled geographical technologies have the potential to make a dramatic impact on development of highly interactive spatial applications on the web for visualization of large-scale geospatial data by water resources and irrigation scientists. Spatial and point-scale water resources data visualization is an emerging and challenging application domain. Query based visual explorations of geospatial hydrological data can play an important role in stimulating scientific hypotheses and seeking causal relationships among hydro variables. The Nebraska Hydrological Information System (NebHydro) utilizes ESRI's ArcGIS server technology to increase technological awareness among farmers, irrigation managers and policy makers. Web-based geospatial applications are an effective way to expose scientific hydrological datasets to the research community and the public. NebHydro uses Adobe Flex technology to offer an online visualization and data analysis system for presentation of social and economic data. Internet mapping services are an integrated product of GIS and Internet technologies and a favored solution for achieving the interoperability of GIS. The development of Internet based GIS services in the state of Nebraska showcases the benefits of sharing geospatial hydrological data among agencies, resource managers and policy makers. Geospatial hydrological information (evapotranspiration from remote sensing, vegetation indices (NDVI), USGS stream gauge data, climatic data, etc.) is generally generated through model simulation (METRIC, SWAP, Linux, Python-based scripting, etc.). Information is compiled into and stored within object oriented relational spatial databases using a geodatabase information model that supports the key data types needed by applications including features, relationships, networks, imagery, terrains, maps and layers. The system provides online access, querying, visualization, and analysis of hydrological data from several sources in one place. The study indicates that internet GIS, developed using advanced technologies, provides valuable education potential to users in hydrology and irrigation engineering and suggests that such a system can support advanced hydrological data access and analysis tools to improve the utility of data in operations. Keywords: Hydrological Information System, NebHydro, Water Management, data sharing, data visualization, ArcGIS server.

  4. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    NASA Astrophysics Data System (ADS)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services as a virtualisation layer providing a user interface that researchers can use. There are many scientific workflow systems but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Given this effort, it is not reasonable to expect researchers to learn a new workflow system merely to run workflows developed in it. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability concept (CGI). It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept and DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely pre-deployed workflow engines or submits workflow engines with the workflow to local or remote resources to execute workflows. The SHIWA Proxy Server manages certificates needed to execute the workflows on different DCIs. Currently SSP supports sharing of ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflows. Further workflow systems can be added to the simulation platform as required by research communities. The FP7 'Building a European Research Community through Interoperable Workflows and Data' (ER-flow) project disseminates the achievements of the SHIWA project to build workflow user communities across Europe. ER-flow provides application support to research communities within (Astrophysics, Computational Chemistry, Heliophysics and Life Sciences) and beyond (Hydrometeorology and Seismology) to develop, share and run workflows through the simulation platform. The simulation platform supports four usage scenarios: creating and publishing workflows in the repository, searching and selecting workflows in the repository, executing non-native workflows and creating and running meta-workflows. The presentation will outline the CGI concept, the SHIWA Simulation Platform, the ER-flow usage scenarios and how the Hydrometeorology research community runs simulations on SSP.

  5. Overview of the LINCS architecture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.; Watson, R.W.

    1982-01-13

    Computing at the Lawrence Livermore National Laboratory (LLNL) has evolved over the past 15 years with a computer network based resource sharing environment. The increasing use of low cost and high performance micro, mini and midi computers and commercially available local networking systems will accelerate this trend. Further, even the large scale computer systems, on which much of the LLNL scientific computing depends, are evolving into multiprocessor systems. It is our belief that the most cost effective use of this environment will depend on the development of application systems structured into cooperating concurrent program modules (processes) distributed appropriately over different nodes of the environment. A node is defined as one or more processors with a local (shared) high speed memory. Given the latter view, the environment can be characterized as consisting of: multiple nodes communicating over noisy channels with arbitrary delays and throughput, heterogeneous base resources and information encodings, no single administration controlling all resources, distributed system state, and no uniform time base. The system design problem is how to turn the heterogeneous base hardware/firmware/software resources of this environment into a coherent set of resources that facilitate development of cost effective, reliable, and human engineered applications. We believe the answer lies in developing a layered, communication oriented distributed system architecture; layered and modular to support ease of understanding, reconfiguration, extensibility, and hiding of implementation or nonessential local details; communication oriented because that is a central feature of the environment. The Livermore Interactive Network Communication System (LINCS) is a hierarchical architecture designed to meet the above needs. While having characteristics in common with other architectures, it differs in several respects.

  6. International Observe the Moon Night: Using Public Outreach Events to Tell Your Story to the Public

    NASA Astrophysics Data System (ADS)

    Hsu, B. C.; International Observe the Moon Night Coordinating Committee

    2011-12-01

    From various interpretations of the lunar "face," early pictograms of the Moon's phases, or to the use of the lunar cycle for festivals or harvests, the Moon has an undeniable influence on human civilization. International Observe the Moon Night (InOMN) capitalizes on the human connection to the Moon by engaging the public in annual lunar observation campaigns that share the excitement of lunar science and exploration. In 2010 (InOMN's inaugural year), over 500,000 people attended events in 53 countries around the world. About 68% of InOMN hosts - astronomy clubs, museums, schools, or other groups - used the resources on the InOMN website (http://observethemoonnight.org). The InOMN website provided supporting materials for InOMN event hosts in the form of downloadable advertising materials, Moon maps, suggestions for hands-on educational activities, and links to lunar science content. InOMN event participants shared their experiences with the world using the Web and social media, event hosts shared their experiences with evaluation data, and amateur astronomers and photographers shared their images of the Moon through the lunar photography contest. The overwhelming response from InOMN in 2010 represents an untapped potential for infusing cutting edge lunar science and exploration into a large-scale public outreach event.

  7. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline.

    PubMed

    Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric

    2014-01-29

    Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.

  8. Network bandwidth utilization forecast model on high bandwidth networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growth in data size, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and scheduling data movements on high-bandwidth networks to accommodate the ever-increasing data volume for large-scale scientific data applications. A univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with a traditional approach such as the Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt network usage changes. The accuracy of the forecast model is within the standard deviation of the monitored measurements.
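
    As a rough illustration of the decompose-then-model approach named above, the sketch below (Python with statsmodels; the hourly seasonal period and the ARIMA order are illustrative assumptions, not the parameters used in the cited work) removes an STL seasonal component from a path-utilization series, fits an ARIMA model to the remainder, and adds the last seasonal cycle back onto the forecast.

        # Minimal STL + ARIMA sketch for forecasting path utilization. The series
        # is assumed to be hourly SNMP utilization; the period and ARIMA order
        # are illustrative choices only.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.seasonal import STL
        from statsmodels.tsa.arima.model import ARIMA

        def forecast_utilization(series: pd.Series, steps: int = 24) -> pd.Series:
            """Deseasonalize with STL, forecast with ARIMA, re-add seasonality."""
            stl = STL(series, period=24, robust=True).fit()   # assume a 24-h cycle
            deseasonalized = series - stl.seasonal
            model = ARIMA(deseasonalized, order=(1, 1, 1)).fit()
            forecast = model.forecast(steps=steps)
            seasonal_cycle = stl.seasonal.to_numpy()[-24:]
            return forecast + np.resize(seasonal_cycle, steps)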

  9. Network Bandwidth Utilization Forecast Model on High Bandwidth Network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growth in data size, it has become more challenging for users to achieve the best possible network performance on a shared network. We have developed a forecast model to predict expected bandwidth utilization for high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and scheduling data movements on high-bandwidth networks to accommodate the ever-increasing data volume for large-scale scientific data applications. A univariate model is developed with STL and ARIMA on SNMP path utilization data. Compared with a traditional approach such as the Box-Jenkins methodology, our forecast model reduces computation time by 83.2%. It also shows resilience against abrupt network usage changes. The accuracy of the forecast model is within the standard deviation of the monitored measurements.

  10. Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets.

    PubMed

    Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L

    2014-01-01

    As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze it. Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal, and associated middleware that provides a single entry point and a single sign on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the software infrastructure required. Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computer resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  11. A 10-year ecosystem restoration community of practice tracks large-scale restoration trends

    EPA Science Inventory

    In 2004, a group of large-scale ecosystem restoration practitioners across the United States convened to start the process of sharing restoration science, management, and best practices under the auspices of a traditional conference umbrella. This forum allowed scientists and dec...

  12. Large-scale P2P network based distributed virtual geographic environment (DVGE)

    NASA Astrophysics Data System (ADS)

    Tan, Xicheng; Yu, Liang; Bian, Fuling

    2007-06-01

    The Virtual Geographic Environment (VGE) has attracted wide attention as a kind of software information system that helps us understand and analyze the real geographic environment, and it has expanded into an application service system for distributed environments--the distributed virtual geographic environment system (DVGE)--with some notable achievements. However, limited by factors such as the massive data volumes of VGE, network bandwidth, the large number of concurrent requests, and economic constraints, DVGE still faces challenges and problems that prevent it from providing the public with high-quality service under the current network mode. The rapid development of peer-to-peer network technology offers new ideas for addressing these challenges and problems. Peer-to-peer network technology can effectively publish and search network resources so as to realize efficient sharing of information. Accordingly, this paper proposes large-scale peer-to-peer network extension of DVGE and presents a detailed study of the network framework, routing mechanism, and DVGE data management on a P2P network.

  13. A Survey on Virtualization of Wireless Sensor Networks

    PubMed Central

    Islam, Md. Motaharul; Hassan, Mohammad Mehedi; Lee, Ga-Won; Huh, Eui-Nam

    2012-01-01

    Wireless Sensor Networks (WSNs) are gaining tremendous importance thanks to their broad range of commercial applications such as in smart home automation, health-care and industrial automation. In these applications multi-vendor and heterogeneous sensor nodes are deployed. Due to strict administrative control over the specific WSN domains, communication barriers, conflicting goals and the economic interests of different WSN sensor node vendors, it is difficult to introduce a large scale federated WSN. By allowing heterogeneous sensor nodes in WSNs to coexist on a shared physical sensor substrate, virtualization in sensor network may provide flexibility, cost effective solutions, promote diversity, ensure security and increase manageability. This paper surveys the novel approach of using the large scale federated WSN resources in a sensor virtualization environment. Our focus in this paper is to introduce a few design goals, the challenges and opportunities of research in the field of sensor network virtualization as well as to illustrate a current status of research in this field. This paper also presents a wide array of state-of-the art projects related to sensor network virtualization. PMID:22438759

  14. A survey on virtualization of Wireless Sensor Networks.

    PubMed

    Islam, Md Motaharul; Hassan, Mohammad Mehedi; Lee, Ga-Won; Huh, Eui-Nam

    2012-01-01

    Wireless Sensor Networks (WSNs) are gaining tremendous importance thanks to their broad range of commercial applications such as in smart home automation, health-care and industrial automation. In these applications multi-vendor and heterogeneous sensor nodes are deployed. Due to strict administrative control over the specific WSN domains, communication barriers, conflicting goals and the economic interests of different WSN sensor node vendors, it is difficult to introduce a large scale federated WSN. By allowing heterogeneous sensor nodes in WSNs to coexist on a shared physical sensor substrate, virtualization in sensor network may provide flexibility, cost effective solutions, promote diversity, ensure security and increase manageability. This paper surveys the novel approach of using the large scale federated WSN resources in a sensor virtualization environment. Our focus in this paper is to introduce a few design goals, the challenges and opportunities of research in the field of sensor network virtualization as well as to illustrate a current status of research in this field. This paper also presents a wide array of state-of-the art projects related to sensor network virtualization.

  15. Terra II--A Spaceship Earth Simulation.

    ERIC Educational Resources Information Center

    Mastrude, Peggy

    1985-01-01

    This simulation helps students in grades four to eight see their planet as one environment with limited resources shared by all. Students learn that the earth is a large system comprised of small systems, that systems are interdependent and often have irreplaceable parts, and that resources are not equally divided among countries. (RM)

  16. Economic and hydrogeologic disparities govern the vulnerability of shared groundwater to strategic overdraft

    NASA Astrophysics Data System (ADS)

    Mullen, C.; Muller, M. F.

    2017-12-01

    Groundwater resources are depleting globally at an alarming rate. When the resource is shared, exploitation by individual users affects groundwater levels and increases pumping costs to all users. This incentivizes individual users to strategically over-pump, an effect that is challenging to keep in check because the underground nature of the resource often precludes regulations from being effectively implemented. As a result, shared groundwater resources are prone to tragedies of the commons that exacerbate their rapid depletion. However, we showed in a recent study that the vulnerability of aquifer systems to strategic overuse is strongly affected by local economic and physical characteristics, which suggests that not all shared aquifers are subject to tragedies of the commons. Building on these findings, we develop a vulnerability index based on coupled game theoretical and groundwater flow models. We show that vulnerability to strategic overdraft is driven by four intuitively interpretable adimensional parameters that describe economic and hydrogeologic disparities between the agents exploiting the aquifer. This suggests a scale-independent relation between the vulnerability of groundwater systems to common-pool overdraft and their economic and physical characteristics. We investigate this relation for a sample of existing aquifer systems and explore implications for enforceable groundwater agreements that would effectively mitigate strategic overdraft.

  17. Data Validation and Sharing in a Large Research Program

    EPA Science Inventory

    Appropriate data handling practices are important in the support of large research teams with shifting and competing priorities. Determining those best practices is an ongoing effort for the US EPA’s National Aquatic Resource Surveys. We focus on the well understood data ...

  18. High-resolution digital brain atlases: a Hubble telescope for the brain.

    PubMed

    Jones, Edward G; Stone, James M; Karten, Harvey J

    2011-05-01

    We describe implementation of a method for digitizing at microscopic resolution brain tissue sections containing normal and experimental data and for making the content readily accessible online. Web-accessible brain atlases and virtual microscopes for online examination can be developed using existing computer and internet technologies. Resulting databases, made up of hierarchically organized, multiresolution images, enable rapid, seamless navigation through the vast image datasets generated by high-resolution scanning. Tools for visualization and annotation of virtual microscope slides enable remote and universal data sharing. Interactive visualization of a complete series of brain sections digitized at subneuronal levels of resolution offers fine grain and large-scale localization and quantification of many aspects of neural organization and structure. The method is straightforward and replicable; it can increase accessibility and facilitate sharing of neuroanatomical data. It provides an opportunity for capturing and preserving irreplaceable, archival neurohistological collections and making them available to all scientists in perpetuity, if resources could be obtained from hitherto uninterested agencies of scientific support. © 2011 New York Academy of Sciences.
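
    The "hierarchically organized, multiresolution images" mentioned above are typically stored as tiled image pyramids, so a viewer only ever fetches the handful of fixed-size tiles visible at the current zoom level. The sketch below (Python; the slide dimensions and tile size are hypothetical, and this is a generic pyramid calculation rather than the authors' implementation) shows how many levels and tiles such a pyramid contains.

        # Generic tiled-pyramid size calculation (illustrative, not the authors'
        # implementation): number of zoom levels and total tiles for an image.
        import math

        def pyramid_summary(width_px: int, height_px: int, tile: int = 256):
            """Return (levels, total_tiles) for a power-of-two tile pyramid."""
            levels = math.ceil(math.log2(max(width_px, height_px) / tile)) + 1
            total_tiles = 0
            for lvl in range(levels):
                scale = 2 ** (levels - 1 - lvl)       # level 0 = most zoomed out
                w = math.ceil(width_px / scale)
                h = math.ceil(height_px / scale)
                total_tiles += math.ceil(w / tile) * math.ceil(h / tile)
            return levels, total_tiles

        # Hypothetical whole-slide scan of 100,000 x 60,000 pixels:
        # about 10 levels and on the order of 120,000 tiles.
        print(pyramid_summary(100_000, 60_000))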

  19. Cybersecurity and privacy issues for socially integrated mobile healthcare applications operating in a multi-cloud environment.

    PubMed

    Al-Muhtadi, Jalal; Shahzad, Basit; Saleem, Kashif; Jameel, Wasif; Orgun, Mehmet A

    2017-05-01

    Social media has enabled information-sharing across massively large networks of people without spending much financial resources and time that are otherwise required in the print and electronic media. Mobile-based social media applications have overwhelmingly changed the information-sharing perspective. However, with the advent of such applications at an unprecedented scale, the privacy of the information is compromised to a larger extent if breach mitigation is not adequate. Since healthcare applications are also being developed for mobile devices so that they also benefit from the power of social media, cybersecurity privacy concerns for such sensitive applications have become critical. This article discusses the architecture of a typical mobile healthcare application, in which customized privacy levels are defined for the individuals participating in the system. It then elaborates on how the communication across a social network in a multi-cloud environment can be made more secure and private, especially for healthcare applications.

  20. The AI Bus architecture for distributed knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain

    1991-01-01

    The AI Bus architecture is a layered, distributed, object-oriented framework developed to support the requirements of advanced technology programs for an order-of-magnitude improvement in software costs. The consequent need for highly autonomous computer systems, adaptable to new technology advances over a long lifespan, led to the design of an open architecture and toolbox for building large-scale, robust, production-quality systems. The AI Bus accommodates a mix of knowledge-based and conventional components, running in heterogeneous, distributed real-world and testbed environments. The concepts and design of the AI Bus architecture are described, along with its current implementation status as a Unix C++ library of reusable objects. Each high-level semiautonomous agent process consists of a number of knowledge sources together with interagent communication mechanisms based on shared blackboards and message-passing acquaintances. Standard interfaces and protocols are followed for combining and validating subsystems. Dynamic probes, or demons, provide an event-driven means of giving active objects shared access to resources, and to each other, while not violating their security.
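
    The blackboard-plus-demons pattern described above can be illustrated with a small sketch: knowledge sources post results to a shared blackboard, and "demon" callbacks fire when entries they watch for appear. This is a hypothetical illustration of the pattern, not the NASA implementation; all names are invented.

        # Minimal blackboard with event-driven demons (illustrative pattern only).
        from collections import defaultdict
        from typing import Any, Callable

        class Blackboard:
            def __init__(self):
                self._entries: dict[str, Any] = {}
                self._demons: dict[str, list[Callable[[str, Any], None]]] = defaultdict(list)

            def watch(self, key: str, demon: Callable[[str, Any], None]) -> None:
                """Register an event-driven demon triggered whenever `key` is written."""
                self._demons[key].append(demon)

            def post(self, key: str, value: Any) -> None:
                """A knowledge source posts a result; any watching demons run immediately."""
                self._entries[key] = value
                for demon in self._demons[key]:
                    demon(key, value)

            def read(self, key: str, default=None):
                return self._entries.get(key, default)

        # Two cooperating knowledge sources sharing one blackboard.
        bb = Blackboard()
        bb.watch("telemetry/temperature",
                 lambda k, v: bb.post("alerts/overheat", v) if v > 100 else None)
        bb.post("telemetry/temperature", 130)
        print(bb.read("alerts/overheat"))   # -> 130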

  1. Application of large-scale, multi-resolution watershed modeling framework using the Hydrologic and Water Quality System (HAWQS)

    USDA-ARS?s Scientific Manuscript database

    In recent years, large-scale watershed modeling has been implemented broadly in the field of water resources planning and management. Complex hydrological, sediment, and nutrient processes can be simulated by sophisticated watershed simulation models for important issues such as water resources all...

  2. A Prescriptive, Intergenerational-Tension Ageism Scale: Succession, Identity, and Consumption (SIC)

    PubMed Central

    North, Michael S.; Fiske, Susan T.

    2014-01-01

    We introduce a novel ageism scale, focusing on prescriptive beliefs concerning potential intergenerational tensions: active, envied resource Succession, symbolic Identity avoidance, and passive, shared-resource Consumption (SIC). Four studies (2,010 total participants) developed the scale. EFA formed an initial 20-item, three-factor solution (Study 1). The scale converges appropriately with other prejudice measures and diverges from other social control measures (Study 2). It diverges from anti-youth ageism (Study 3). Study 4’s experiment yielded both predictive and divergent validity apropos another ageism measure. Structural equation modeling confirmed model fit across all studies. Per an intergenerational-tension focus, younger people consistently scored the highest. As generational equity issues intensify, the scale provides a contemporary tool for current and future ageism research. PMID:23544391

  3. SEEK: a systems biology data and model management platform.

    PubMed

    Wolstencroft, Katherine; Owen, Stuart; Krebs, Olga; Nguyen, Quyen; Stanford, Natalie J; Golebiewski, Martin; Weidemann, Andreas; Bittkowski, Meik; An, Lihua; Shockley, David; Snoep, Jacky L; Mueller, Wolfgang; Goble, Carole

    2015-07-11

    Systems biology research typically involves the integration and analysis of heterogeneous data types in order to model and predict biological processes. Researchers therefore require tools and resources to facilitate the sharing and integration of data, and for linking of data to systems biology models. There are a large number of public repositories for storing biological data of a particular type, for example transcriptomics or proteomics, and there are several model repositories. However, this silo-type storage of data and models is not conducive to systems biology investigations. Interdependencies between multiple omics datasets and between datasets and models are essential. Researchers require an environment that will allow the management and sharing of heterogeneous data and models in the context of the experiments which created them. The SEEK is a suite of tools to support the management, sharing and exploration of data and models in systems biology. The SEEK platform provides an access-controlled, web-based environment for scientists to share and exchange data and models for day-to-day collaboration and for public dissemination. A plug-in architecture allows the linking of experiments, their protocols, data, models and results in a configurable system that is available 'off the shelf'. Tools to run model simulations, plot experimental data and assist with data annotation and standardisation combine to produce a collection of resources that support analysis as well as sharing. Underlying semantic web resources additionally extract and serve SEEK metadata in RDF (Resource Description Framework). SEEK RDF enables rich semantic queries, both within SEEK and between related resources in the web of Linked Open Data. The SEEK platform has been adopted by many systems biology consortia across Europe. It is a data management environment that has a low barrier of uptake and provides rich resources for collaboration. This paper provides an update on the functions and features of the SEEK software, and describes the use of the SEEK in the SysMO consortium (Systems biology for Micro-organisms) and the VLN (Virtual Liver Network), two large systems biology initiatives with different research aims and different scientific communities.
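
    Serving metadata as RDF is what makes cross-resource semantic queries possible. The sketch below shows the general shape of such a query using the rdflib library against a tiny in-memory graph; the vocabulary (ex:Model, ex:DataFile, ex:linkedTo) is invented for illustration and is not the actual SEEK schema.

        # Illustrative SPARQL query over RDF metadata (hypothetical vocabulary, not SEEK's).
        from rdflib import Graph, Namespace, Literal, RDF

        EX = Namespace("http://example.org/seek/")
        g = Graph()

        # Tiny in-memory stand-in for exported metadata linking a model to a dataset.
        model = EX["model/glycolysis"]
        data = EX["data/timecourse42"]
        g.add((model, RDF.type, EX.Model))
        g.add((data, RDF.type, EX.DataFile))
        g.add((model, EX.linkedTo, data))
        g.add((data, EX.createdBy, Literal("SysMO partner lab")))

        results = g.query("""
            PREFIX ex: <http://example.org/seek/>
            SELECT ?model ?dataset WHERE {
                ?model a ex:Model .
                ?model ex:linkedTo ?dataset .
            }
        """)
        for row in results:
            print(row.model, row.dataset)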

  4. Virtual Small Business Emergency Operations Center (VSBEOC): Shared Awareness and Decision Making for Small Business

    DTIC Science & Technology

    2011-06-01

    Shared Awareness and Decision Making for Small Business Topic(s) 2. Topic 1: Concepts, Theory, and Policy 1. Topic 5: Collaboration, Shared... emergencies do not have the time or the resources to collaborate on a continual basis with a large number of organizations. 3. A primary Crisis Management... Center (CMC) should be identified in advance. This is the initial site used by the Crisis Management Team and Response Teams for directing and

  5. Disaster and Contingency Planning for Scientific Shared Resource Cores

    PubMed Central

    Wilkerson, Amy

    2016-01-01

    Progress in biomedical research is largely driven by improvements, innovations, and breakthroughs in technology, accelerating the research process, and an increasingly complex collaboration of both clinical and basic science. This increasing sophistication has driven the need for centralized shared resource cores (“cores”) to serve the scientific community. From a biomedical research enterprise perspective, centralized resource cores are essential to increased scientific, operational, and cost effectiveness; however, the concentration of instrumentation and resources in the cores may render them highly vulnerable to damage from severe weather and other disasters. As such, protection of these assets and the ability to recover from a disaster is increasingly critical to the mission and success of the institution. Therefore, cores should develop and implement both disaster and business continuity plans and be an integral part of the institution’s overall plans. Here we provide an overview of key elements required for core disaster and business continuity plans, guidance, and tools for developing these plans, and real-life lessons learned at a large research institution in the aftermath of Superstorm Sandy. PMID:26848285

  6. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    NASA Astrophysics Data System (ADS)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  7. Groundwater and human development: challenges and opportunities in livelihoods and environment.

    PubMed

    Shah, T

    2005-01-01

    At less than 1000 km³/year, the world's annual use of groundwater is 1.5% of the renewable water resource but contributes the lion's share of water-induced human welfare. Global groundwater use, however, has increased manifold in the past 50 years, and the human race has never had to manage groundwater use on such a large scale. Sustaining the massive welfare gains groundwater development has created without ruining the resource is a key water challenge facing the world today. In exploring this challenge, we have focused a good deal on conditions of resource occurrence but less so on resource use. I offer a typology of five groundwater demand systems as Groundwater Socio-ecologies (GwSE), each embodying a unique pattern of interactions between socio-economic and ecological variables, and each facing a distinct groundwater governance challenge. During the past century, a growing corpus of experiential knowledge has accumulated in the industrialized world on managing groundwater in various uses and contexts. A daunting global groundwater issue today is to apply this knowledge intelligently to the far more formidable challenge that has arisen in developing regions of Asia and Africa, where groundwater irrigation has evolved into a colossal anarchy supporting billions of livelihoods but threatening the resource itself.

  8. Conceptualizing the Science Curriculum: 40 Years of Developing Assessment Frameworks in Three Large-Scale Assessments

    ERIC Educational Resources Information Center

    Kind, Per Morten

    2013-01-01

    The paper analyzes conceptualizations in the science frameworks in three large-scale assessments, Trends in Mathematics and Science Study (TIMSS), Programme for International Student Assessment (PISA), and National Assessment of Educational Progress (NAEP). The assessments have a shared history, but have developed different conceptualizations. The…

  9. The shared and unique values of optical, fluorescence, thermal and microwave satellite data for estimating large-scale crop yields

    USDA-ARS?s Scientific Manuscript database

    Large-scale crop monitoring and yield estimation are important for both scientific research and practical applications. Satellite remote sensing provides an effective means for regional and global cropland monitoring, particularly in data-sparse regions that lack reliable ground observations and rep...

  10. Scale and modeling issues in water resources planning

    USGS Publications Warehouse

    Lins, H.F.; Wolock, D.M.; McCabe, G.J.

    1997-01-01

    Resource planners and managers interested in utilizing climate model output as part of their operational activities immediately confront the dilemma of scale discordance. Their functional responsibilities cover relatively small geographical areas and necessarily require data of relatively high spatial resolution. Climate models cover a large geographical, i.e. global, domain and produce data at comparatively low spatial resolution. Although the scale differences between model output and planning input are large, several techniques have been developed for disaggregating climate model output to a scale appropriate for use in water resource planning and management applications. With techniques in hand to reduce the limitations imposed by scale discordance, water resource professionals must now confront a more fundamental constraint on the use of climate models: the inability to produce accurate representations and forecasts of regional climate. Given the current capabilities of climate models, and the likelihood that the uncertainty associated with long-term climate model forecasts will remain high for some years to come, the water resources planning community may find it impractical to utilize such forecasts operationally.
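
    One widely used disaggregation technique of the kind the abstract refers to is the "delta change" method: only the climate model's relative change signal is used, applied to a locally observed series, so the coarse model's absolute biases never enter the planning dataset. The sketch below illustrates the method with made-up monthly numbers; it is not tied to any particular model, basin, or the paper's own procedure.

        # Delta-change downscaling of monthly precipitation (illustrative values only).
        import numpy as np

        def delta_change(observed_monthly: np.ndarray,
                         gcm_baseline: np.ndarray,
                         gcm_future: np.ndarray) -> np.ndarray:
            """Scale observed monthly precipitation by the model's future/baseline ratio.

            All arrays hold 12 monthly values; the ratio is computed per calendar month,
            so only the model's relative change is used, not its coarse absolute values.
            """
            change_factor = gcm_future / gcm_baseline
            return observed_monthly * change_factor

        obs = np.array([80, 70, 65, 50, 40, 20, 10, 15, 30, 55, 75, 85], dtype=float)   # gauge, mm/month
        base = np.array([60, 55, 50, 45, 35, 25, 15, 18, 28, 45, 58, 62], dtype=float)  # model cell, baseline
        fut = np.array([66, 58, 48, 40, 30, 20, 12, 16, 30, 50, 64, 70], dtype=float)   # model cell, future

        print(np.round(delta_change(obs, base, fut), 1))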

  11. The GOBLET training portal: a global repository of bioinformatics training materials, courses and trainers

    PubMed Central

    Corpas, Manuel; Jimenez, Rafael C.; Bongcam-Rudloff, Erik; Budd, Aidan; Brazas, Michelle D.; Fernandes, Pedro L.; Gaeta, Bruno; van Gelder, Celia; Korpelainen, Eija; Lewitter, Fran; McGrath, Annette; MacLean, Daniel; Palagi, Patricia M.; Rother, Kristian; Taylor, Jan; Via, Allegra; Watson, Mick; Schneider, Maria Victoria; Attwood, Teresa K.

    2015-01-01

    Summary: Rapid technological advances have led to an explosion of biomedical data in recent years. The pace of change has inspired new collaborative approaches for sharing materials and resources to help train life scientists both in the use of cutting-edge bioinformatics tools and databases and in how to analyse and interpret large datasets. A prototype platform for sharing such training resources was recently created by the Bioinformatics Training Network (BTN). Building on this work, we have created a centralized portal for sharing training materials and courses, including a catalogue of trainers and course organizers, and an announcement service for training events. For course organizers, the portal provides opportunities to promote their training events; for trainers, the portal offers an environment for sharing materials, for gaining visibility for their work and promoting their skills; for trainees, it offers a convenient one-stop shop for finding suitable training resources and identifying relevant training events and activities locally and worldwide. Availability and implementation: http://mygoblet.org/training-portal Contact: manuel.corpas@tgac.ac.uk PMID:25189782

  12. Integrating TRENCADIS components in gLite to share DICOM medical images and structured reports.

    PubMed

    Blanquer, Ignacio; Hernández, Vicente; Salavert, José; Segrelles, Damià

    2010-01-01

    The problem of sharing medical information among different centres has been tackled by many projects. Several of them target the specific problem of sharing DICOM images and structured reports (DICOM-SR), such as the TRENCADIS project. In this paper we propose sharing and organizing DICOM data and DICOM-SR metadata benefiting from existing deployed Grid infrastructures compliant with gLite, such as EGEE or the Spanish NGI. These infrastructures contribute with a large amount of storage resources for creating knowledge databases and also provide metadata storage resources (such as AMGA) to semantically organize reports in a tree structure. First, in this paper, we present the extension of the TRENCADIS architecture to use gLite components (LFC, AMGA, SE) for the sake of increasing interoperability. Using the metadata from DICOM-SR, and maintaining its tree structure, enables federating different but compatible diagnostic structures and simplifies the definition of complex queries. This article describes how to do this in AMGA and it shows an approach to efficiently code radiology reports to enable the multi-centre federation of data resources.

  13. Development and Validation of a PTSD-Related Impairment Scale

    DTIC Science & Technology

    2012-06-01

    Social Adjustment Scale (SAS-SR) [58]; Dyadic Adjustment Scale (DAS) [59]; Life Stressors and Social Resources Inventory (LISRES) [60]... the Life Stressors and Social Resources Inventory (LISRES; Moos & ...), a measure that gauges ongoing life stressors and social resources across domains such as spouse/partner and finances, as well... measures (e.g., ICF checklist; LISRES; Moos, Penn, & Billings, 1988) may not be practical or desirable in many healthcare settings or in large-scale

  14. CLIMB (the Cloud Infrastructure for Microbial Bioinformatics): an online resource for the medical microbiology community

    PubMed Central

    Smith, Andy; Southgate, Joel; Poplawski, Radoslaw; Bull, Matthew J.; Richardson, Emily; Ismail, Matthew; Thompson, Simon Elwood-; Kitchen, Christine; Guest, Martyn; Bakke, Marius

    2016-01-01

    The increasing availability and decreasing cost of high-throughput sequencing has transformed academic medical microbiology, delivering an explosion in available genomes while also driving advances in bioinformatics. However, many microbiologists are unable to exploit the resulting large genomics datasets because they do not have access to relevant computational resources and to an appropriate bioinformatics infrastructure. Here, we present the Cloud Infrastructure for Microbial Bioinformatics (CLIMB) facility, a shared computing infrastructure that has been designed from the ground up to provide an environment where microbiologists can share and reuse methods and data. PMID:28785418

  15. CLIMB (the Cloud Infrastructure for Microbial Bioinformatics): an online resource for the medical microbiology community.

    PubMed

    Connor, Thomas R; Loman, Nicholas J; Thompson, Simon; Smith, Andy; Southgate, Joel; Poplawski, Radoslaw; Bull, Matthew J; Richardson, Emily; Ismail, Matthew; Thompson, Simon Elwood-; Kitchen, Christine; Guest, Martyn; Bakke, Marius; Sheppard, Samuel K; Pallen, Mark J

    2016-09-01

    The increasing availability and decreasing cost of high-throughput sequencing has transformed academic medical microbiology, delivering an explosion in available genomes while also driving advances in bioinformatics. However, many microbiologists are unable to exploit the resulting large genomics datasets because they do not have access to relevant computational resources and to an appropriate bioinformatics infrastructure. Here, we present the Cloud Infrastructure for Microbial Bioinformatics (CLIMB) facility, a shared computing infrastructure that has been designed from the ground up to provide an environment where microbiologists can share and reuse methods and data.

  16. In Defense of the National Labs and Big-Budget Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodwin, J R

    2008-07-29

    The purpose of this paper is to present the unofficial and unsanctioned opinions of a Visiting Scientist at Lawrence Livermore National Laboratory on the values of LLNL and the other National Labs. The basic founding value and goal of the National Labs is big-budget scientific research, along with smaller-budget scientific research that cannot easily be done elsewhere. The most important example in the latter category is classified defense-related research. The historical guiding light here is the Manhattan Project. This endeavor was unique in human history, and might remain so. The scientific expertise and wealth of an entire nation was tapped in a project that was huge beyond reckoning, with no advance guarantee of success. It was in many respects a clash of scientific titans, with a large supporting cast, collaborating toward a single well-defined goal. Never had scientists received so much respect, so much money, and so much intellectual freedom to pursue scientific progress. And never was the gap between theory and implementation so rapidly narrowed, with results that changed the world, completely. Enormous resources are spent at the national or international level on large-scale scientific projects. LLNL has the most powerful computer in the world, Blue Gene/L. (Oops, Los Alamos just seized the title with Roadrunner; such titles regularly change hands.) LLNL also has the largest laser in the world, the National Ignition Facility (NIF). Lawrence Berkeley National Lab (LBNL) has the most powerful microscope in the world. Not only is it beyond the resources of most large corporations to make such expenditures, but the risk exceeds the possible rewards for those corporations that could. Nor can most small countries afford to finance large scientific projects, and not even the richest can afford largess, especially if Congress is under major budget pressure. Some big-budget research efforts are funded by international consortiums, such as the Large Hadron Collider (LHC) at CERN, and the International Thermonuclear Experimental Reactor (ITER) in Cadarache, France, a magnetic-confinement fusion research project. The post-WWII histories of particle and fusion physics contain remarkable examples of both international competition, with an emphasis on secrecy, and international cooperation, with an emphasis on shared knowledge and resources. Initiatives to share sometimes came from surprising directions. Most large-scale scientific projects have potential defense applications. NIF certainly does; it is primarily designed to create small-scale fusion explosions. Blue Gene/L operates in part in service to NIF, and in part to various defense projects. The most important defense projects include stewardship of the national nuclear weapons stockpile, and the proposed redesign and replacement of those weapons with fewer, safer, more reliable, longer-lived, and less apocalyptic warheads. Many well-meaning people will consider the optimal lifetime of a nuclear weapon to be zero, but most thoughtful people, when asked how much longer they think this nation will require them, will ask for some time to think. NIF is also designed to create exothermic small-scale fusion explosions. The malapropos 'exothermic' here is a convenience to cover a profusion of complexities, but the basic idea is that the explosions will create more recoverable energy than was used to create them.
One can hope that the primary future benefits of success for NIF will be in cost-effective generation of electrical power through controlled small-scale fusion reactions, rather than in improved large-scale fusion explosions. Blue Gene/L also services climate research, genomic research, materials research, and a myriad of other computational problems that become more feasible, reliable, and precise the larger the number of computational nodes employed. Blue Gene/L has to be sited within a security complex for obvious reasons, but its value extends to the nation and the world. There is a duality here between large-scale scientific research machines and the supercomputers used to model them. An astounding example is illustrated in a graph released by EFDA-JET, at Oxfordshire, UK, presently the largest operating magnetic-confinement fusion experiment. The graph shows plasma confinement times (an essential performance parameter) for all the major tokamaks in the international fusion program, over their existing lifetimes. The remarkable thing about the data is not so much confinement time versus date or scale, but the fact that the data are given for both the computer model predictions and the actual experimental measurements, and the two are in phenomenal agreement over the extended range of scales. Supercomputer models, sometimes operating with the intricacy of Schrödinger's equation at quantum physical scales, have become a costly but enormously cost-saving tool.

  17. Higher Education ERP: Lessons Learned.

    ERIC Educational Resources Information Center

    Swartz, Dave; Orgill, Ken

    2001-01-01

    Shares experiences and lessons learned by chief information officers of large universities about enterprise resource planning (ERP). Specifically, provides a framework for approaching an ERP that could save universities millions of dollars. (EV)

  18. Shared versus distributed memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Jordan, Harry F.

    1991-01-01

    The question of whether multiprocessors should have shared or distributed memory has attracted a great deal of attention. Some researchers argue strongly for building distributed memory machines, while others argue just as strongly for programming shared memory multiprocessors. A great deal of research is underway on both types of parallel systems. Special emphasis is placed on systems with a very large number of processors for computation-intensive tasks, and research and implementation trends are considered. It appears that the two types of systems will likely converge to a common form for large scale multiprocessors.

  19. The agent-based spatial information semantic grid

    NASA Astrophysics Data System (ADS)

    Cui, Wei; Zhu, YaQiong; Zhou, Yong; Li, Deren

    2006-10-01

    Analyzing the characteristics of multi-agent systems and geographic ontology, the concept of the Agent-based Spatial Information Semantic Grid (ASISG) is defined and its architecture is advanced. ASISG is composed of multi-agents and geographic ontology. The multi-agent systems comprise User Agents, a General Ontology Agent, Geo-Agents, Broker Agents, Resource Agents, Spatial Data Analysis Agents, Spatial Data Access Agents, a Task Execution Agent and a Monitor Agent. The architecture of ASISG has three layers: the fabric layer, the grid management layer and the application layer. The fabric layer, which is composed of the Data Access Agent, Resource Agent and Geo-Agent, encapsulates the data of spatial information systems so as to present a conceptual interface to the grid management layer. The grid management layer, which is composed of the General Ontology Agent, Task Execution Agent, Monitor Agent and Data Analysis Agent, uses a hybrid method to manage all resources that are registered in a General Ontology Agent described by a general ontology system. The hybrid method combines resource dissemination and resource discovery: resource dissemination pushes resources from the Local Ontology Agents to the General Ontology Agent, and resource discovery pulls resources from the General Ontology Agent to the Local Ontology Agents. A Local Ontology Agent is derived from a specific domain and describes the semantic information of a local GIS. The Local Ontology Agents can be filtered to construct a virtual organization that provides a global scheme. The virtual organization lightens the burden on users because they need not search for information site by site manually. The application layer, which is composed of the User Agent, Geo-Agent and Task Execution Agent, presents a corresponding interface to a domain user. The functions that ASISG should provide are: 1) Integration of different spatial information systems on the semantic level: the grid management layer establishes a virtual environment that seamlessly integrates all GIS nodes. 2) When the resource management system searches data across different spatial information systems, it transfers the meaning of different Local Ontology Agents rather than accessing data directly, so search and query operate on the semantic level. 3) The data access procedure is transparent to users; that is, they can access information from remote sites as if from a local disk, because the General Ontology Agent automatically links data through the Data Agents that connect ontology concepts to GIS data. 4) The capability of processing massive spatial data: storing, accessing and managing massive spatial data from TB to PB; efficiently analyzing and processing spatial data to produce models, information and knowledge; and providing 3D and multimedia visualization services. 5) The capability of high-performance computing and processing of spatial information: solving spatial problems with high precision, high quality and on a large scale, and processing spatial information in real time or on time, with high speed and high efficiency. 6) The capability of sharing spatial resources.
The distributed, heterogeneous spatial information resources are shared, integrated and interoperated on the semantic level, so as to make the best use of spatial information resources such as computing resources, storage devices, spatial data (integrated from GIS, RS and GPS), spatial applications and services, and GIS platforms. 7) The capability of integrating legacy GIS systems: ASISG can not only be used to construct new advanced spatial application systems, but can also integrate legacy GIS systems, so as to preserve extensibility and inheritance and protect users' investment. 8) The capability of collaboration: large-scale spatial information applications and services always involve different departments in different geographic places, so remote and uniform services are needed. 9) The capability of supporting the integration of heterogeneous systems: large-scale spatial information systems are typically composite applications, so ASISG should provide interoperation and consistency by adopting open, applied technology standards. 10) The capability of adapting to dynamic changes: business requirements, application patterns, management strategies and IT products change endlessly for any department, so ASISG should be self-adaptive. Two examples are provided in this paper; they show in detail how to design a semantic grid based on multi-agent systems and ontology. In conclusion, a semantic grid for spatial information systems can improve the integration and interoperability of the spatial information grid.
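
    The dissemination/discovery split described above boils down to a registry that local agents push descriptions into and that clients pull matching resources out of, so no site-by-site search is needed. The following is a minimal, hypothetical sketch of that pattern; the class and field names are invented for illustration and do not come from the paper.

        # Push (dissemination) / pull (discovery) against a shared semantic registry (illustrative).
        from dataclasses import dataclass, field

        @dataclass
        class GeneralOntologyAgent:
            registry: dict[str, dict] = field(default_factory=dict)

            def disseminate(self, resource_id: str, description: dict) -> None:
                """A Local Ontology Agent pushes a semantic description of one resource."""
                self.registry[resource_id] = description

            def discover(self, **criteria) -> list[str]:
                """Pull: return resource ids whose description matches all given criteria."""
                return [rid for rid, desc in self.registry.items()
                        if all(desc.get(k) == v for k, v in criteria.items())]

        goa = GeneralOntologyAgent()
        goa.disseminate("gis-node-A/landcover", {"theme": "landcover", "region": "Hubei", "format": "raster"})
        goa.disseminate("gis-node-B/roads", {"theme": "transport", "region": "Hubei", "format": "vector"})
        print(goa.discover(region="Hubei", theme="landcover"))   # -> ['gis-node-A/landcover']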

  20. Future energy system challenges for Africa: Insights from Integrated Assessment Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucas, Paul; Nielsen, Jens; Calvin, Katherine V.

    Although Africa’s share in the global energy system is only small today, the ongoing population growth and economic development imply that this can change significantly. In this paper, we discuss long-term energy developments in Africa using the results of the LIMITS model inter-comparison study. The analysis focusses on the position of Africa in the wider global energy system and climate mitigation. The results show a considerable spread in model outcomes. Without specific climate policy, Africa’s share in global CO2 emissions is projected to increase from around 1-4% today to 3-23% by 2100. In all models, emissions only start to become really significant on a global scale after 2050. Furthermore, by 2030 still around 50% of total household energy use is supplied through traditional bio-energy, in contrast to existing ambitions from international organisations to provide access to modern energy for all. After 2050, the energy mix is projected to converge towards a global average energy mix with high shares of fossil fuels and electricity use. Finally, although the continent is now a large net exporter of oil and gas, towards 2050 it most likely needs most of its resources to meet its rapidly growing domestic demand. With respect to climate policy, the rapid expansion of the industrial and power sectors also creates large mitigation potential and thereby the possibility to align the investment peak in the energy system with climate policy and potential revenues from international carbon trading.

  1. Integrating Green and Blue Water Management Tools for Land and Water Resources Planning

    NASA Astrophysics Data System (ADS)

    Jewitt, G. P. W.

    2009-04-01

    The role of land use and land use change in the hydrological cycle is well known. However, the impacts of large-scale land use change are poorly considered in water resources planning, unless they require direct abstraction of water resources and associated development of infrastructure, e.g., irrigation schemes. Yet large-scale deforestation for the supply of raw materials, expansion of the areas of plantation forestry, increasing areas under food production and major plans for cultivation of biofuels in many developing countries are likely to result in extensive land use change. Given the spatial extent and temporal longevity of these proposed developments, major impacts on water resources are inevitable. It is imperative that managers and planners consider the consequences for downstream ecosystems and users in such developments. However, many popular tools, such as the virtual water approach, provide only coarse-scale "order of magnitude" type estimates with poor consideration of, and limited usefulness for, land use planning. In this paper, a framework for the consideration of the impacts of large-scale land use change on water resources at a range of temporal and spatial scales is presented. Drawing on experiences from South Africa, where the establishment of exotic commercial forest plantations is only permitted once a water use license has been granted, the framework adopts the "green water concept" for the identification of potential high-impact areas of land use change and provides for integration with traditional "blue water" water resources planning tools for more detailed planning. Appropriate tools, ranging from simple spreadsheet solutions to more sophisticated remote sensing and hydrological models, are described, and the application of the framework for consideration of water resources impacts associated with the establishment of large-scale Tectona grandis, sugar cane and Jatropha curcas plantations is illustrated through examples in Mozambique and South Africa. Keywords: Land use change, water resources, green water, blue water, biofuels, developing countries

  2. Exploring Cloud Computing for Large-scale Scientific Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Guang; Han, Binh; Yin, Jian

    This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on-demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often just require a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a system biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.

  3. A Conceptual Framework to Enhance the Interoperability of Observatories among Countries, Continents and the World

    NASA Astrophysics Data System (ADS)

    Loescher, H.; Fundamental Instrument Unit

    2013-05-01

    Ecological research addresses challenges relating to the dynamics of the planet, such as changes in climate, biodiversity, ecosystem functioning and services, carbon and energy cycles, natural and human-induced hazards, and adaptation and mitigation strategies that involve many science and engineering disciplines and cross national boundaries. Because of the global nature of these challenges, greater international collaboration is required for knowledge sharing and technology deployment to advance earth science investigations and enhance societal benefits. For example, the Working Group on Biodiversity Preservation and Ecosystem Services (PCAST 2011) noted the scale and complexity of the physical and human resources needed to address these challenges. Many of the most pressing ecological research questions require global-scale data and global-scale solutions (Suresh 2012), e.g., interdisciplinary data access from data centers managing ecological resources and hazards, drought, heat islands, carbon cycle, or data used to forecast the rate of spread of invasive species or zoonotic diseases. Variability and change at one location or in one region may well result from the superposition of global processes coupled together with regional and local modes of variability. For example, we know that large-scale modes of variability such as the El Niño-Southern Oscillation in the coupled terrestrial-aquatic-atmospheric system correlate with variability in regional rainfall and ecosystem functions. It is therefore a high priority of government and non-government organizations to develop the necessary large-scale, world-class research infrastructures for environmental research, and the framework by which these data can be shared, discovered, and utilized by a broad user community of scientists and policymakers alike. Given that there are many, albeit nascent, efforts to build new environmental observatories/networks globally (e.g., EU-ICOS, EU-Lifewatch, AU-TERN, China-CERN, GEOSS, GEO-BON, NutNet, etc.) and domestically (e.g., NSF-CZO, USDA-LTAR, DOE-NGEE, Soil Carbon Network, etc.), there is a strong and mutual desire to assure interoperability of data. Interoperability is the degree to which each of the following is mapped between observatories (entities): i) science requirements linked with science questions, ii) traceability of measurements to nationally and internationally accepted standards, iii) how data products are derived, i.e., algorithms, procedures, and methods, and iv) the bioinformatics, which broadly include data formats, metadata, controlled vocabularies, and semantics. Here, we explore the rationale and focus areas for interoperability, the governance and work structures, example projects (NSF-NEON, EU-ICOS, and AU-TERN), and the emergent roles of scientists in these endeavors.

  4. Experience in using commercial clouds in CMS

    NASA Astrophysics Data System (ADS)

    Bauerdick, L.; Bockelman, B.; Dykstra, D.; Fuess, S.; Garzoglio, G.; Girone, M.; Gutsche, O.; Holzman, B.; Hufnagel, D.; Kim, H.; Kennedy, R.; Mason, D.; Spentzouris, P.; Timm, S.; Tiradani, A.; Vaandering, E.; CMS Collaboration

    2017-10-01

    Historically, high energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single-site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of the WLCG resources are used for LHC computing and the resources are scheduled to be continuously used throughout the year. In the last several years there has been an explosion in capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest amongst the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this presentation we will discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS through a matching grant to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We will discuss the planning and technical challenges involved in organizing the most IO-intensive CMS workflows on a large-scale set of virtualized resources provisioned by the Fermilab HEPCloud. We will describe the data handling and data management challenges. Also, we will discuss the economic issues and cost and operational efficiency comparison to our dedicated resources. At the end we will consider the changes in the working model of HEP computing in a domain with the availability of large-scale resources scheduled at peak times.

  5. Experience in using commercial clouds in CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bauerdick, L.; Bockelman, B.; Dykstra, D.

    Historically, high energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single-site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of the WLCG resources are used for LHC computing and the resources are scheduled to be continuously used throughout the year. In the last several years there has been an explosion in capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest amongst the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this presentation we will discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS through a matching grant to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We will discuss the planning and technical challenges involved in organizing the most IO-intensive CMS workflows on a large-scale set of virtualized resources provisioned by the Fermilab HEPCloud. We will describe the data handling and data management challenges. Also, we will discuss the economic issues and cost and operational efficiency comparison to our dedicated resources. At the end we will consider the changes in the working model of HEP computing in a domain with the availability of large-scale resources scheduled at peak times.

  6. How much is enough? The recurrent problem of setting measurable objectives in conservation

    USGS Publications Warehouse

    Tear, T.H.; Kareiva, P.; Angermeier, P.L.; Comer, P.; Czech, B.; Kautz, R.; Landon, L.; Mehlman, D.; Murphy, K.; Ruckelshaus, M.; Scott, J.M.; Wilhere, G.

    2005-01-01

    International agreements, environmental laws, resource management agencies, and environmental nongovernmental organizations all establish objectives that define what they hope to accomplish. Unfortunately, quantitative objectives in conservation are typically set without consistency and scientific rigor. As a result, conservationists are failing to provide credible answers to the question "How much is enough?" This is a serious problem because objectives profoundly shape where and how limited conservation resources are spent, and help to create a shared vision for the future. In this article we develop guidelines to help steer conservation biologists and practitioners through the process of objective setting. We provide three case studies to highlight the practical challenges of objective setting in different social, political, and legal contexts. We also identify crucial gaps in our science, including limited knowledge of species distributions and of large-scale, long-term ecosystem dynamics, that must be filled if we hope to do better than setting conservation objectives through intuition and best guesses. ?? 2005 American Institute of Biological Sciences.

  7. Network support for turn-taking in multimedia collaboration

    NASA Astrophysics Data System (ADS)

    Dommel, Hans-Peter; Garcia-Luna-Aceves, Jose J.

    1997-01-01

    The effectiveness of collaborative multimedia systems depends on the regulation of access to their shared resources, such as continuous media or instruments used concurrently by multiple parties. Existing applications use only simple protocols to mediate such resource contention. Their cooperative rules follow a strict agenda and are largely application-specific. The inherent problem of floor control lacks a systematic methodology. This paper presents a general model of floor control for correct, scalable, fine-grained and fair resource sharing that integrates user interaction with network conditions, and adaptation to various media types. The notion of turn-taking known from psycholinguistics in studies on discourse structure is adapted for this framework. Viewed as a computational analogy to speech communication, online collaboration revolves around dynamically allocated access permissions called floors. The control semantics of floors derives from concurrency control methodology. An explicit specification and verification of a novel distributed Floor Control Protocol are presented. Hosts assume sharing roles that allow for efficient dissemination of control information, agreeing on a floor holder which is granted mutually exclusive access to a resource. Performance-analytic aspects of floor control protocols are also briefly discussed.
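
    The core idea of floor control is simple enough to sketch: participants request the floor for a shared resource, a controller grants it to one holder at a time (mutual exclusion), and queued requests are served in turn when the holder yields. The following is an illustrative single-node model of that idea, not the authors' distributed Floor Control Protocol; all names are invented.

        # Minimal floor-control model: one holder at a time, FIFO hand-off (illustrative).
        from collections import deque

        class FloorController:
            def __init__(self, resource: str):
                self.resource = resource
                self.holder = None            # participant currently holding the floor
                self.queue = deque()          # pending floor requests, served in order

            def request(self, participant: str) -> bool:
                """Grant the floor immediately if it is free, otherwise queue the request."""
                if self.holder is None:
                    self.holder = participant
                    return True
                self.queue.append(participant)
                return False

            def release(self, participant: str):
                """Holder yields the floor; it passes to the next queued participant, if any."""
                if participant != self.holder:
                    raise ValueError(f"{participant} does not hold the floor")
                self.holder = self.queue.popleft() if self.queue else None
                return self.holder

        fc = FloorController("shared-whiteboard")
        print(fc.request("alice"))   # True  -> alice holds the floor
        print(fc.request("bob"))     # False -> bob is queued
        print(fc.release("alice"))   # 'bob' -> floor passes to bob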

  8. Method for prefetching non-contiguous data structures

    DOEpatents

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Hoenicke, Dirk [Ossining, NY; Ohmacht, Martin [Brewster, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Takken, Todd E [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2009-05-05

    A low latency memory system access is provided in association with a weakly-ordered multiprocessor system. Each processor in the multiprocessor shares resources, and each shared resource has an associated lock within a locking device that provides support for synchronization between the multiple processors in the multiprocessor and the orderly sharing of the resources. A processor only has permission to access a resource when it owns the lock associated with that resource, and an attempt by a processor to own a lock requires only a single load operation, rather than a traditional atomic load followed by store, such that the processor only performs a read operation and the hardware locking device performs a subsequent write operation rather than the processor. A simple prefetching scheme for non-contiguous data structures is also disclosed. A memory line is redefined so that in addition to the normal physical memory data, every line includes a pointer that is large enough to point to any other line in the memory, wherein the pointers determine which memory line to prefetch rather than some other predictive algorithm. This enables hardware to effectively prefetch memory access patterns that are non-contiguous, but repetitive.
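
    The prefetching idea in the patent abstract (repeated in the two records that follow) is that each memory line carries, alongside its data, a pointer naming the next line to fetch, so the hardware can prefetch a non-contiguous but repetitive access pattern without guessing. The sketch below is a pure software simulation of that idea for illustration; the names and the "prefetch" bookkeeping are hypothetical and do not describe the hardware design.

        # Software simulation of pointer-driven prefetching over linked memory lines (illustrative).
        from dataclasses import dataclass

        @dataclass
        class MemoryLine:
            data: bytes
            next_line: object   # pointer stored with the line, used as the prefetch hint (or None)

        def traverse_with_prefetch(memory: dict, start: int):
            """Walk the linked lines; at each step the stored pointer says what to prefetch next."""
            addr, prefetched = start, set()
            while addr is not None:
                line = memory[addr]
                if line.next_line is not None:
                    prefetched.add(line.next_line)      # issue the prefetch from the embedded pointer
                yield addr, line.data, sorted(prefetched)
                addr = line.next_line

        # Three lines scattered through the address space, linked 0x40 -> 0x1000 -> 0x08.
        mem = {0x40: MemoryLine(b"a", 0x1000),
               0x1000: MemoryLine(b"b", 0x08),
               0x08: MemoryLine(b"c", None)}
        for step in traverse_with_prefetch(mem, 0x40):
            print(step)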

  9. Low latency memory access and synchronization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumrich, Matthias A.; Chen, Dong; Coteus, Paul W.

    A low latency memory system access is provided in association with a weakly-ordered multiprocessor system. Each processor in the multiprocessor shares resources, and each shared resource has an associated lock within a locking device that provides support for synchronization between the multiple processors in the multiprocessor and the orderly sharing of the resources. A processor only has permission to access a resource when it owns the lock associated with that resource, and an attempt by a processor to own a lock requires only a single load operation, rather than a traditional atomic load followed by store, such that the processor only performs a read operation and the hardware locking device performs a subsequent write operation rather than the processor. A simple prefetching scheme for non-contiguous data structures is also disclosed. A memory line is redefined so that in addition to the normal physical memory data, every line includes a pointer that is large enough to point to any other line in the memory, wherein the pointers determine which memory line to prefetch rather than some other predictive algorithm. This enables hardware to effectively prefetch memory access patterns that are non-contiguous, but repetitive.

  10. Low latency memory access and synchronization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blumrich, Matthias A.; Chen, Dong; Coteus, Paul W.

    A low latency memory system access is provided in association with a weakly-ordered multiprocessor system. Each processor in the multiprocessor shares resources, and each shared resource has an associated lock within a locking device that provides support for synchronization between the multiple processors in the multiprocessor and the orderly sharing of the resources. A processor only has permission to access a resource when it owns the lock associated with that resource, and an attempt by a processor to own a lock requires only a single load operation, rather than a traditional atomic load followed by store, such that the processor only performs a read operation and the hardware locking device performs a subsequent write operation rather than the processor. A simple prefetching scheme for non-contiguous data structures is also disclosed. A memory line is redefined so that in addition to the normal physical memory data, every line includes a pointer that is large enough to point to any other line in the memory, wherein the pointers determine which memory line to prefetch rather than some other predictive algorithm. This enables hardware to effectively prefetch memory access patterns that are non-contiguous, but repetitive.

  11. Identification of Volcanic Landforms and Processes on Earth and Mars using Geospatial Analysis (Invited)

    NASA Astrophysics Data System (ADS)

    Fagents, S. A.; Hamilton, C. W.

    2009-12-01

    Nearest neighbor (NN) analysis enables the identification of landforms using non-morphological parameters and can be useful for constraining the geological processes contributing to observed patterns of spatial distribution. Explosive interactions between lava and water can generate volcanic rootless cone (VRC) groups that are well suited to geospatial analyses because they consist of a large number of landforms that share a common formation mechanism. We have applied NN analysis tools to quantitatively compare the spatial distribution of VRCs in the Laki lava flow in Iceland to analogous landforms in the Tartarus Colles Region of eastern Elysium Planitia, Mars. Our results show that rootless eruption sites on both Earth and Mars exhibit systematic variations in spatial organization that are related to variations in the distribution of resources (lava and water) at different scales. Field observations in Iceland reveal that VRC groups are composite structures formed by the emplacement of chronologically and spatially distinct domains. Regionally, rootless cones cluster into groups and domains, but within domains NN distances exhibit random to repelled distributions. This suggests that on regional scales VRCs cluster in locations that contain sufficient resources, whereas on local scales rootless eruption sites tend to self-organize into distributions that maximize the utilization of limited resources (typically groundwater). Within the Laki lava flow, near-surface water is abundant and pre-eruption topography appears to exert the greatest control on both lava inundation regions and clustering of rootless eruption sites. In contrast, lava thickness appears to be the controlling factor in the formation of rootless eruption sites in the Tartarus Colles Region. A critical lava thickness may be required to initiate rootless eruptions on Mars because the lava flows must contain sufficient heat for transferred thermal energy to reach the underlying cryosphere and volatilize buried ground ice. In both environments, the spatial distribution of rootless eruption sites on local scales may either be random, which indicates that rootless eruption sites form independently of one another, or repelled, which implies resource limitation. Where competition for limited groundwater causes rootless eruption sites to develop greater than random NN separation, rootless eruption sites can be modeled as a system of pumping wells that extract water from a shared aquifer, thereby generating repelled distributions due to non-initiation or early cessation of rootless explosive activity at sites with insufficient access to groundwater. Thus statistical NN analyses can be combined with field observations and remote sensing to obtain information about self-organization processes within geological systems and the effects of environmental resource limitation on the spatial distribution of volcanic landforms. NN analyses may also be used to quantitatively compare the spatial distribution of landforms in different planetary environments and for supplying non-morphological evidence to discriminate between feature identities and geological formation mechanisms.
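
    A common statistic behind this kind of nearest-neighbour analysis is the Clark-Evans ratio, which compares the mean observed NN distance with the value expected under complete spatial randomness: R < 1 indicates clustering, R > 1 a "repelled" (dispersed) pattern. The sketch below computes it for synthetic coordinates; it is illustrative only and is not the authors' data or code, nor does it include their edge corrections.

        # Clark-Evans nearest-neighbour ratio on synthetic point patterns (illustrative).
        import numpy as np
        from scipy.spatial import cKDTree

        def clark_evans_ratio(points: np.ndarray, area: float) -> float:
            """points: (n, 2) array of coordinates; area: area of the study region."""
            tree = cKDTree(points)
            # k=2 because each point's nearest neighbour in the tree is itself at distance 0.
            dists, _ = tree.query(points, k=2)
            mean_observed = dists[:, 1].mean()
            density = len(points) / area
            expected_csr = 0.5 / np.sqrt(density)   # expected mean NN distance under randomness
            return mean_observed / expected_csr

        rng = np.random.default_rng(0)
        random_pts = rng.uniform(0, 1000, size=(200, 2))                  # ~random pattern
        grid_pts = np.stack(np.meshgrid(np.linspace(50, 950, 14),
                                        np.linspace(50, 950, 14)), -1).reshape(-1, 2)
        print(round(clark_evans_ratio(random_pts, 1000 * 1000), 2))       # close to 1
        print(round(clark_evans_ratio(grid_pts, 1000 * 1000), 2))         # > 1: repelled/dispersed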

  12. Price schedules coordination for electricity pool markets

    NASA Astrophysics Data System (ADS)

    Legbedji, Alexis Motto

    2002-04-01

    We consider the optimal coordination of a class of mathematical programs with equilibrium constraints, which is formally interpreted as a resource-allocation problem. Many decomposition techniques were proposed to circumvent the difficulty of solving large systems with limited computer resources. The considerable improvement in computer architecture has allowed the solution of large-scale problems with increasing speed. Consequently, interest in decomposition techniques has waned. Nonetheless, there is an important class of applications for which decomposition techniques will still be relevant, among others, distributed systems---the Internet, perhaps, being the most conspicuous example---and competitive economic systems. Conceptually, a competitive economic system is a collection of agents that have similar or different objectives while sharing the same system resources. In theory, constructing a large-scale mathematical program and solving it centrally, using currently available computing power can optimize such systems of agents. In practice, however, because agents are self-interested and not willing to reveal some sensitive corporate data, one cannot solve these kinds of coordination problems by simply maximizing the sum of agent's objective functions with respect to their constraints. An iterative price decomposition or Lagrangian dual method is considered best suited because it can operate with limited information. A price-directed strategy, however, can only work successfully when coordinating or equilibrium prices exist, which is not generally the case when a weak duality is unavoidable. Showing when such prices exist and how to compute them is the main subject of this thesis. Among our results, we show that, if the Lagrangian function of a primal program is additively separable, price schedules coordination may be attained. The prices are Lagrange multipliers, and are also the decision variables of a dual program. In addition, we propose a new form of augmented or nonlinear pricing, which is an example of the use of penalty functions in mathematical programming. Applications are drawn from mathematical programming problems of the form arising in electric power system scheduling under competition.
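
    The price-directed coordination the abstract describes can be illustrated with a tiny Lagrangian (dual) decomposition loop: a coordinator posts a price for the shared resource, each agent independently maximizes its own profit at that price without revealing its internal data, and a subgradient step adjusts the price until total demand meets capacity. The quadratic agent utilities below are an illustrative assumption, not the thesis model.

        # Price coordination by dual decomposition with a subgradient price update (illustrative).
        def agent_demand(a: float, b: float, price: float) -> float:
            """Agent maximizes a*x - b*x**2 - price*x, giving x = max(0, (a - price) / (2b))."""
            return max(0.0, (a - price) / (2.0 * b))

        def coordinate(agents, capacity: float, steps: int = 2000, step_size: float = 0.01):
            price = 0.0
            for _ in range(steps):
                total = sum(agent_demand(a, b, price) for a, b in agents)
                # Subgradient of the dual: excess demand raises the price, slack lowers it.
                price = max(0.0, price + step_size * (total - capacity))
            return price, [agent_demand(a, b, price) for a, b in agents]

        agents = [(10.0, 0.5), (8.0, 1.0), (6.0, 0.25)]     # (a, b) per agent
        price, allocation = coordinate(agents, capacity=12.0)
        print(round(price, 2), [round(x, 2) for x in allocation])   # price ~4.0, allocation sums to 12

    Each iteration needs only the agents' aggregate demand at the posted price, which is why price-directed schemes can operate with the limited information exchange the abstract emphasizes.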

  13. The Computing and Data Grid Approach: Infrastructure for Distributed Science Applications

    NASA Technical Reports Server (NTRS)

    Johnston, William E.

    2002-01-01

    With the advent of Grids - infrastructure for using and managing widely distributed computing and data resources in the science environment - there is now an opportunity to provide a standard, large-scale computing, data, instrument, and collaboration environment for science that spans many different projects and provides the required infrastructure and services in a relatively uniform and supportable way. Grid technology has evolved over the past several years to provide the services and infrastructure needed for building 'virtual' systems and organizations. We argue that Grid technology provides an excellent basis for the creation of integrated environments that can combine the resources needed to support the large-scale science projects located at multiple laboratories and universities. We present some science case studies indicating that a paradigm shift in the process of science will come about as a result of Grids providing transparent and secure access to advanced and integrated information and technology infrastructure: powerful computing systems, large-scale data archives, scientific instruments, and collaboration tools. These changes will take the form of services that can be integrated with the user's work environment, and that enable uniform and highly capable access to these computers, data, and instruments, regardless of the location or exact nature of these resources. These services will integrate transient-use resources like computing systems, scientific instruments, and data caches (e.g., as they are needed to perform a simulation or analyze data from a single experiment); persistent-use resources, such as databases, data catalogues, and archives; and collaborators, whose involvement will continue for the lifetime of a project or longer. While we largely address large-scale science in this paper, Grids, particularly when combined with Web Services, will address a broad spectrum of science scenarios, both large and small scale.

  14. Insect density-plant density relationships: a modified view of insect responses to resource concentrations.

    PubMed

    Andersson, Petter; Löfstedt, Christer; Hambäck, Peter A

    2013-12-01

    Habitat area is an important predictor of spatial variation in animal densities. However, the area often correlates with the quantity of resources within habitats, complicating our understanding of the factors shaping animal distributions. We addressed this problem by investigating densities of insect herbivores in habitat patches with a constant area but varying numbers of plants. Using a mathematical model, we derived predictions of scale-dependent immigration and emigration rates for insects moving into patches with different densities of host plants. Moreover, a field experiment was conducted in which the scaling properties of odour-mediated attraction in relation to the number of odour sources were estimated, in order to derive a prediction of immigration rates for olfactory searchers. The theoretical model predicted that immigration rates of contact and visual searchers should be determined by patch area, with a steep scaling coefficient, μ = -1. The field experiment suggested that olfactory searchers should show a less steep scaling coefficient, with μ ≈ -0.5. Parameter estimation and analysis of published data revealed a correspondence between observations and predictions, and density variation among groups could largely be explained by search behaviour. Aphids showed scaling coefficients corresponding to the prediction for contact/visual searchers, whereas moths, flies and beetles corresponded to the prediction for olfactory searchers. As density responses varied considerably among groups, and the variation could be explained by a single trait, we conclude that a general theory of insect responses to habitat heterogeneity should be based on shared traits, rather than a general prediction for all species.
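
    The scaling coefficients discussed above can be estimated from patch data by log-log regression of per-patch insect density against the number of host plants. The sketch below (synthetic data, not the authors' dataset; assumes NumPy) recovers an assumed coefficient of μ ≈ -0.5.

      import numpy as np

      rng = np.random.default_rng(1)
      plants = np.array([1, 2, 4, 8, 16, 32, 64], dtype=float)   # host plants per patch
      true_mu = -0.5                                             # e.g. olfactory searchers
      density = plants**true_mu * rng.lognormal(0.0, 0.1, plants.size)   # noisy densities

      mu_hat, _ = np.polyfit(np.log(plants), np.log(density), 1)
      print(f"estimated scaling coefficient mu = {mu_hat:.2f}")
      # mu near -1 would instead indicate contact/visual (area-determined) searchers.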

  15. Emerging tools for continuous nutrient monitoring networks: Sensors advancing science and water resources protection

    USGS Publications Warehouse

    Pellerin, Brian; Stauffer, Beth A; Young, Dwane A; Sullivan, Daniel J.; Bricker, Suzanne B.; Walbridge, Mark R; Clyde, Gerard A; Shaw, Denice M

    2016-01-01

    Sensors and enabling technologies are becoming increasingly important tools for water quality monitoring and associated water resource management decisions. In particular, nutrient sensors are of interest because of the well-known adverse effects of nutrient enrichment on coastal hypoxia, harmful algal blooms, and impacts to human health. Accurate and timely information on nutrient concentrations and loads is integral to strategies designed to minimize risk to humans and manage the underlying drivers of water quality impairment. Using nitrate sensors as an example, we highlight the types of applications in freshwater and coastal environments that are likely to benefit from continuous, real-time nutrient data. The concurrent emergence of new tools to integrate, manage and share large data sets is critical to the successful use of nutrient sensors and has made it possible for the field of continuous nutrient monitoring to rapidly move forward. We highlight several near-term opportunities for Federal agencies, as well as the broader scientific and management community, that will help accelerate sensor development, build and leverage sites within a national network, and develop open data standards and data management protocols that are key to realizing the benefits of a large-scale, integrated monitoring network. Investing in these opportunities will provide new information to guide management and policies designed to protect and restore our nation’s water resources.

  16. [Research on tumor information grid framework].

    PubMed

    Zhang, Haowei; Qin, Zhu; Liu, Ying; Tan, Jianghao; Cao, Haitao; Chen, Youping; Zhang, Ke; Ding, Yuqing

    2013-10-01

    In order to realize tumor disease information sharing and unified management, we utilized grid technology to effectively integrate the data and software resources distributed across various medical institutions, so that heterogeneous resources become consistent and interoperable at both the semantic and syntactic levels. This article describes the tumor grid framework, in which service types are described in the Web Services Description Language (WSDL) and XML Schema Definition (XSD), and clients use the serialized documents to operate on the distributed resources. Service objects can be modeled in the Unified Modeling Language (UML) and act as middleware for creating application programming interfaces. All grid resources are registered in the index and released as Web Services based on the Web Services Resource Framework (WSRF). Using this system, a multi-center, large-sample, networked tumor disease resource-sharing framework can be built to improve the level of development of medical scientific research institutions and patients' quality of life.

  17. Critical Issues in Large-Scale Assessment: A Resource Guide.

    ERIC Educational Resources Information Center

    Redfield, Doris

    The purpose of this document is to provide practical guidance and support for the design, development, and implementation of large-scale assessment systems that are grounded in research and best practice. Information is included about existing large-scale testing efforts, including national testing programs, state testing programs, and…

  18. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  19. Radiologic image communication and archive service: a secure, scalable, shared approach

    NASA Astrophysics Data System (ADS)

    Fellingham, Linda L.; Kohli, Jagdish C.

    1995-11-01

    The Radiologic Image Communication and Archive (RICA) service is designed to provide a shared archive for medical images to the widest possible audience of customers. Images are acquired from a number of different modalities, each available from many different vendors. Images are acquired digitally from those modalities which support direct digital output and by digitizing films for projection x-ray exams. The RICA Central Archive receives standard DICOM 3.0 messages and data streams from the medical imaging devices at customer institutions over the public telecommunication network. RICA represents a completely scalable resource. The user pays only for what he is using today with the full assurance that as the volume of image data that he wishes to send to the archive increases, the capacity will be there to accept it. To provide this seamless scalability imposes several requirements on the RICA architecture: (1) RICA must support the full array of transport services. (2) The Archive Interface must scale cost-effectively to support local networks that range from the very small (one x-ray digitizer in a medical clinic) to the very large and complex (a large hospital with several CTs, MRs, Nuclear medicine devices, ultrasound machines, CRs, and x-ray digitizers). (3) The Archive Server must scale cost-effectively to support rapidly increasing demands for service providing storage for and access to millions of patients and hundreds of millions of images. The architecture must support the incorporation of improved technology as it becomes available to maintain performance and remain cost-effective as demand rises.

  20. Automatic Tools for Enhancing the Collaborative Experience in Large Projects

    NASA Astrophysics Data System (ADS)

    Bourilkov, D.; Rodriquez, J. L.

    2014-06-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of data analysis gains in importance. A key feature of collaboration in large-scale projects is keeping a log of what is being done and how - for private use, for reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. It is even better if the log is created automatically, on the fly, while the scientist or software developer works in a habitual way, without the need for extra effort. This saves time and enables a team to do more with the same resources. The CODESH (COllaborative DEvelopment SHell) and CAVES (Collaborative Analysis Versioning Environment System) projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach to any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working, scalable systems for analysis of petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.
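
    The flavor of such an automatic logbook can be conveyed with a toy wrapper (a sketch only, not CODESH or CAVES code): each command is executed and its timestamp, working directory and exit status are appended to a persistent, shareable log file. The log file name is a hypothetical placeholder.

      import datetime, json, os, subprocess

      LOGBOOK = "session_logbook.jsonl"   # hypothetical logbook file

      def run_and_log(command):
          result = subprocess.run(command, shell=True, capture_output=True, text=True)
          entry = {
              "time": datetime.datetime.now().isoformat(timespec="seconds"),
              "cwd": os.getcwd(),
              "command": command,
              "returncode": result.returncode,
          }
          with open(LOGBOOK, "a") as log:   # append-only virtual logbook
              log.write(json.dumps(entry) + "\n")
          return result

      run_and_log("echo analysing run 42")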

  1. LETTER TO THE EDITOR: Iteratively-coupled propagating exterior complex scaling method for electron hydrogen collisions

    NASA Astrophysics Data System (ADS)

    Bartlett, Philip L.; Stelbovics, Andris T.; Bray, Igor

    2004-02-01

    A newly-derived iterative coupling procedure for the propagating exterior complex scaling (PECS) method is used to efficiently calculate the electron-impact wavefunctions for atomic hydrogen. An overview of this method is given along with methods for extracting scattering cross sections. Differential scattering cross sections at 30 eV are presented for the electron-impact excitation to the n = 1, 2, 3 and 4 final states, for both PECS and convergent close coupling (CCC), which are in excellent agreement with each other and with experiment. PECS results are presented at 27.2 eV and 30 eV for symmetric and asymmetric energy-sharing triple differential cross sections, which are in excellent agreement with CCC and exterior complex scaling calculations, and with experimental data. At these intermediate energies, the efficiency of the PECS method with iterative coupling has allowed highly accurate partial-wave solutions of the full Schrödinger equation, for L ≤ 50 and a large number of coupled angular momentum states, to be obtained with minimal computing resources.

  2. Computational biology in the cloud: methods and new insights from computing at scale.

    PubMed

    Kasson, Peter M

    2013-01-01

    The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and to provide easy reproducibility by making the datasets and computational methods easily available.

  3. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  4. Dynamic VM Provisioning for TORQUE in a Cloud Environment

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.

    2014-06-01

    Cloud computing, also known as an Infrastructure-as-a-Service (IaaS), is attracting more interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high throughput computing clusters.
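
    The general shape of such dynamic provisioning can be sketched as follows (a schematic outline only, not the paper's tooling; the image, flavor and key names are hypothetical placeholders): when the TORQUE queue backlog crosses a threshold, an additional OpenStack worker instance is booted via the command-line client.

      import subprocess

      QUEUED_THRESHOLD = 50   # hypothetical backlog threshold

      def queued_jobs():
          # 'qstat' is TORQUE's queue listing; count jobs reported in the Q (queued) state.
          out = subprocess.run(["qstat"], capture_output=True, text=True).stdout
          return sum(1 for line in out.splitlines() if " Q " in line)

      if queued_jobs() > QUEUED_THRESHOLD:
          subprocess.run([
              "openstack", "server", "create",
              "--image", "worker-node-image",   # hypothetical image name
              "--flavor", "m1.large",           # hypothetical flavor
              "--key-name", "cluster-key",      # hypothetical keypair
              "dynamic-worker-01",
          ])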

  5. The social brain: scale-invariant layering of Erdős-Rényi networks in small-scale human societies.

    PubMed

    Harré, Michael S; Prokopenko, Mikhail

    2016-05-01

    The cognitive ability to form social links that can bind individuals together into large cooperative groups for safety and resource sharing was a key development in human evolutionary and social history. The 'social brain hypothesis' argues that the size of these social groups is based on a neurologically constrained capacity for maintaining long-term stable relationships. No model to date has been able to combine a specific socio-cognitive mechanism with the discrete scale invariance observed in ethnographic studies. We show that these properties result in nested layers of self-organizing Erdős-Rényi networks formed by each individual's ability to maintain only a small number of social links. Each set of links plays a specific role in the formation of different social groups. The scale invariance in our model is distinct from previous 'scale-free networks' studied using much larger social groups; here, the scale invariance is in the relationship between group sizes, rather than in the link degree distribution. We also compare our model with a dominance-based hierarchy and conclude that humans were probably egalitarian in hunter-gatherer-like societies, maintaining an average maximum of four or five social links connecting all members in a largest social network of around 132 people. © 2016 The Author(s).
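
    The layered structure can be illustrated with a small simulation (a sketch under assumed parameters, not the published model; assumes the networkx library): each layer is an Erdős-Rényi network whose mean degree equals the small number of links an individual can maintain in that layer, and the typical connected group grows from layer to layer.

      import networkx as nx

      N = 500                          # hypothetical community size
      links_per_layer = [1, 2, 3, 4]   # links an individual maintains in each layer

      for layer, k in enumerate(links_per_layer, start=1):
          p = k / (N - 1)              # edge probability giving mean degree ~k
          g = nx.erdos_renyi_graph(N, p, seed=layer)
          largest = max(len(c) for c in nx.connected_components(g))
          print(f"layer {layer}: mean degree ~{k}, largest connected group = {largest}")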

  6. Sustainability in Health care by Allocating Resources Effectively (SHARE) 11: reporting outcomes of an evidence-driven approach to disinvestment in a local healthcare setting.

    PubMed

    Harris, Claire; Allen, Kelly; Ramsey, Wayne; King, Richard; Green, Sally

    2018-05-30

    This is the final paper in a thematic series reporting a program of Sustainability in Health care by Allocating Resources Effectively (SHARE) in a local healthcare setting. The SHARE Program was established to explore a systematic, integrated, evidence-based organisation-wide approach to disinvestment in a large Australian health service network. This paper summarises the findings, discusses the contribution of the SHARE Program to the body of knowledge and understanding of disinvestment in the local healthcare setting, and considers implications for policy, practice and research. The SHARE program was conducted in three phases. Phase One was undertaken to understand concepts and practices related to disinvestment and the implications for a local health service and, based on this information, to identify potential settings and methods for decision-making about disinvestment. The aim of Phase Two was to implement and evaluate the proposed methods to determine which were sustainable, effective and appropriate in a local health service. A review of the current literature incorporating the SHARE findings was conducted in Phase Three to contribute to the understanding of systematic approaches to disinvestment in the local healthcare context. SHARE differed from many other published examples of disinvestment in several ways: by seeking to identify and implement disinvestment opportunities within organisational infrastructure rather than as standalone projects; considering disinvestment in the context of all resource allocation decisions rather than in isolation; including allocation of non-monetary resources as well as financial decisions; and focusing on effective use of limited resources to optimise healthcare outcomes. The SHARE findings provide a rich source of new information about local health service decision-making, in a level of detail not previously reported, to inform others in similar situations. Multiple innovations related to disinvestment were found to be acceptable and feasible in the local setting. Factors influencing decision-making, implementation processes and final outcomes were identified; and methods for further exploration, or avoidance, in attempting disinvestment in this context are proposed based on these findings. The settings, frameworks, models, methods and tools arising from the SHARE findings have potential to enhance health care and patient outcomes.

  7. Versatile Gene-Specific Sequence Tags for Arabidopsis Functional Genomics: Transcript Profiling and Reverse Genetics Applications

    PubMed Central

    Hilson, Pierre; Allemeersch, Joke; Altmann, Thomas; Aubourg, Sébastien; Avon, Alexandra; Beynon, Jim; Bhalerao, Rishikesh P.; Bitton, Frédérique; Caboche, Michel; Cannoot, Bernard; Chardakov, Vasil; Cognet-Holliger, Cécile; Colot, Vincent; Crowe, Mark; Darimont, Caroline; Durinck, Steffen; Eickhoff, Holger; de Longevialle, Andéol Falcon; Farmer, Edward E.; Grant, Murray; Kuiper, Martin T.R.; Lehrach, Hans; Léon, Céline; Leyva, Antonio; Lundeberg, Joakim; Lurin, Claire; Moreau, Yves; Nietfeld, Wilfried; Paz-Ares, Javier; Reymond, Philippe; Rouzé, Pierre; Sandberg, Goran; Segura, Maria Dolores; Serizet, Carine; Tabrett, Alexandra; Taconnat, Ludivine; Thareau, Vincent; Van Hummelen, Paul; Vercruysse, Steven; Vuylsteke, Marnik; Weingartner, Magdalena; Weisbeek, Peter J.; Wirta, Valtteri; Wittink, Floyd R.A.; Zabeau, Marc; Small, Ian

    2004-01-01

    Microarray transcript profiling and RNA interference are two new technologies crucial for large-scale gene function studies in multicellular eukaryotes. Both rely on sequence-specific hybridization between complementary nucleic acid strands, inciting us to create a collection of gene-specific sequence tags (GSTs) representing at least 21,500 Arabidopsis genes and which are compatible with both approaches. The GSTs were carefully selected to ensure that each of them shared no significant similarity with any other region in the Arabidopsis genome. They were synthesized by PCR amplification from genomic DNA. Spotted microarrays fabricated from the GSTs show good dynamic range, specificity, and sensitivity in transcript profiling experiments. The GSTs have also been transferred to bacterial plasmid vectors via recombinational cloning protocols. These cloned GSTs constitute the ideal starting point for a variety of functional approaches, including reverse genetics. We have subcloned GSTs on a large scale into vectors designed for gene silencing in plant cells. We show that in planta expression of GST hairpin RNA results in the expected phenotypes in silenced Arabidopsis lines. These versatile GST resources provide novel and powerful tools for functional genomics. PMID:15489341

  8. An incremental anomaly detection model for virtual machines.

    PubMed

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

    The Self-Organizing Map (SOM) algorithm, as an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organization and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Moreover, cloud platforms with large numbers of virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing character, which causes the algorithm to exhibit low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large scale and highly dynamic nature of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms.
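
    A minimal SOM training step with a weighted Euclidean distance (a sketch of the general technique, not the IISOM implementation; the feature weights and grid size are made up) looks as follows: the best-matching unit is chosen under the weighted metric and a Gaussian neighborhood on the map grid pulls nearby units toward the input.

      import numpy as np

      rng = np.random.default_rng(0)
      n_features = 4
      grid = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)  # 5x5 map
      weights = rng.random((len(grid), n_features))   # codebook vectors
      feature_w = np.array([2.0, 1.0, 1.0, 0.5])      # hypothetical feature weights

      def train_step(x, lr=0.5, sigma=1.0):
          d = np.sqrt(((weights - x) ** 2 * feature_w).sum(axis=1))   # weighted distance
          bmu = np.argmin(d)                                          # best-matching unit
          g = np.exp(-((grid - grid[bmu]) ** 2).sum(axis=1) / (2 * sigma**2))
          weights[:] = weights + lr * g[:, None] * (x - weights)      # neighborhood update

      for sample in rng.random((200, n_features)):   # toy stand-in for VM metric vectors
          train_step(sample)
      print("trained codebook shape:", weights.shape)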

  9. An incremental anomaly detection model for virtual machines

    PubMed Central

    Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu

    2017-01-01

    The Self-Organizing Map (SOM) algorithm, as an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organization and automatic anomaly prediction. However, because the algorithm is initialized randomly, it takes a long time to train a detection model. Moreover, cloud platforms with large numbers of virtual machines are prone to performance anomalies due to their highly dynamic and resource-sharing character, which causes the algorithm to exhibit low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large scale and highly dynamic nature of virtual machines on cloud platforms. To demonstrate the effectiveness, experiments on the common benchmark KDD Cup dataset and a real dataset have been performed. Results suggest that IISOM has advantages in accuracy and convergence velocity of anomaly detection for virtual machines on cloud platforms. PMID:29117245

  10. Are We Ready for Mass Fatality Incidents? Preparedness of the US Mass Fatality Infrastructure.

    PubMed

    Merrill, Jacqueline A; Orr, Mark; Chen, Daniel Y; Zhi, Qi; Gershon, Robyn R

    2016-02-01

    To assess the preparedness of the US mass fatality infrastructure, we developed and tested metrics for 3 components of preparedness: organizational, operational, and resource sharing networks. In 2014, data were collected from 5 response sectors: medical examiners and coroners, the death care industry, health departments, faith-based organizations, and offices of emergency management. Scores were calculated within and across sectors and a weighted score was developed for the infrastructure. A total of 879 respondents reported highly variable organizational capabilities: 15% had responded to a mass fatality incident (MFI); 42% reported staff trained for an MFI, but only 27% for an MFI involving hazardous contaminants. Respondents estimated that 75% of their staff would be willing and able to respond, but only 53% if contaminants were involved. Most perceived their organization as somewhat prepared, but 13% indicated "not at all." Operational capability scores ranged from 33% (death care industry) to 77% (offices of emergency management). Network capability analysis found that only 42% of possible reciprocal relationships between resource-sharing partners were present. The cross-sector composite score was 51%; that is, half the key capabilities for preparedness were in place. The sectors in the US mass fatality infrastructure report suboptimal capability to respond. National leadership is needed to ensure sector-specific and infrastructure-wide preparedness for a large-scale MFI.

  11. Business and public health collaboration for emergency preparedness in Georgia: a case study.

    PubMed

    Buehler, James W; Whitney, Ellen A; Berkelman, Ruth L

    2006-11-20

    Governments may be overwhelmed by a large-scale public health emergency, such as a massive bioterrorist attack or natural disaster, requiring collaboration with businesses and other community partners to respond effectively. In Georgia, public health officials and members of the Business Executives for National Security have successfully collaborated to develop and test procedures for dispensing medications from the Strategic National Stockpile. Lessons learned from this collaboration should be useful to other public health and business leaders interested in developing similar partnerships. The authors conducted a case study based on interviews with 26 government, business, and academic participants in this collaboration. The partnership is based on shared objectives to protect public health and assure community cohesion in the wake of a large-scale disaster, on the recognition that acting alone neither public health agencies nor businesses are likely to manage such a response successfully, and on the realization that business and community continuity are intertwined. The partnership has required participants to acknowledge and address multiple challenges, including differences in business and government cultures and operational constraints, such as concerns about the confidentiality of shared information, liability, and the limits of volunteerism. The partnership has been facilitated by a business model based on defining shared objectives, identifying mutual needs and vulnerabilities, developing carefully-defined projects, and evaluating proposed project methods through exercise testing. Through collaborative engagement in progressively more complex projects, increasing trust and understanding have enabled the partners to make significant progress in addressing these challenges. As a result of this partnership, essential relationships have been established, substantial private resources and capabilities have been engaged in government preparedness programs, and a model for collaborative, emergency mass dispensing of pharmaceuticals has been developed, tested, and slated for expansion. The lessons learned from this collaboration in Georgia should be considered by other government and business leaders seeking to develop similar partnerships.

  12. Business and public health collaboration for emergency preparedness in Georgia: a case study

    PubMed Central

    Buehler, James W; Whitney, Ellen A; Berkelman, Ruth L

    2006-01-01

    Background Governments may be overwhelmed by a large-scale public health emergency, such as a massive bioterrorist attack or natural disaster, requiring collaboration with businesses and other community partners to respond effectively. In Georgia, public health officials and members of the Business Executives for National Security have successfully collaborated to develop and test procedures for dispensing medications from the Strategic National Stockpile. Lessons learned from this collaboration should be useful to other public health and business leaders interested in developing similar partnerships. Methods The authors conducted a case study based on interviews with 26 government, business, and academic participants in this collaboration. Results The partnership is based on shared objectives to protect public health and assure community cohesion in the wake of a large-scale disaster, on the recognition that acting alone neither public health agencies nor businesses are likely to manage such a response successfully, and on the realization that business and community continuity are intertwined. The partnership has required participants to acknowledge and address multiple challenges, including differences in business and government cultures and operational constraints, such as concerns about the confidentiality of shared information, liability, and the limits of volunteerism. The partnership has been facilitated by a business model based on defining shared objectives, identifying mutual needs and vulnerabilities, developing carefully-defined projects, and evaluating proposed project methods through exercise testing. Through collaborative engagement in progressively more complex projects, increasing trust and understanding have enabled the partners to make significant progress in addressing these challenges. Conclusion As a result of this partnership, essential relationships have been established, substantial private resources and capabilities have been engaged in government preparedness programs, and a model for collaborative, emergency mass dispensing of pharmaceuticals has been developed, tested, and slated for expansion. The lessons learned from this collaboration in Georgia should be considered by other government and business leaders seeking to develop similar partnerships. PMID:17116256

  13. Challenges in Managing Trustworthy Large-scale Digital Science

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories: model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by broad communities, and far exceed the raw instrument outputs in volume. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable management of the information across distributed resources. Users necessarily rely on these underlying "black boxes" in order to be productive and produce new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems, ranging from the fundamental reliability of the compute hardware, through system software stacks and libraries, to the model software itself. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store historical versions of all model and derived data. Instead, the emphasis is on the ability to access updated products and on the reliability with which previous outcomes remain relevant and can be updated with new information. We will discuss these challenges and some of the approaches underway to address these issues.

  14. Infrastructure for Large-Scale Quality-Improvement Projects: Early Lessons from North Carolina Improving Performance in Practice

    ERIC Educational Resources Information Center

    Newton, Warren P.; Lefebvre, Ann; Donahue, Katrina E.; Bacon, Thomas; Dobson, Allen

    2010-01-01

    Introduction: Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods: Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3)…

  15. Why Do Countries Participate in International Large-Scale Assessments? The Case of PISA. Policy Research Working Paper 7447

    ERIC Educational Resources Information Center

    Lockheed, Marlaine E.

    2015-01-01

    The number of countries that regularly participate in international large-scale assessments has increased sharply over the past 15 years, with the share of countries participating in the Programme for International Student Assessment growing from one-fifth of countries in 2000 to over one-third of countries in 2015. What accounts for this…

  16. Scaling Law of Urban Ride Sharing.

    PubMed

    Tachet, R; Sagarra, O; Santi, P; Resta, G; Szell, M; Strogatz, S H; Ratti, C

    2017-03-06

    Sharing rides could drastically improve the efficiency of car and taxi transportation. Unleashing such potential, however, requires understanding how urban parameters affect the fraction of individual trips that can be shared, a quantity that we call shareability. Using data on millions of taxi trips in New York City, San Francisco, Singapore, and Vienna, we compute the shareability curves for each city, and find that a natural rescaling collapses them onto a single, universal curve. We explain this scaling law theoretically with a simple model that predicts the potential for ride sharing in any city, using a few basic urban quantities and no adjustable parameters. Accurate extrapolations of this type will help planners, transportation companies, and society at large to shape a sustainable path for urban growth.
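
    The notion of shareability can be made concrete with a crude greedy estimate (a toy sketch, not the paper's trip-matching model; the thresholds and trip data are synthetic): two trips are considered shareable if their origins, destinations and start times all fall within given tolerances, and shareability is the fraction of trips that find such a partner.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 400
      origins = rng.uniform(0, 10, (n, 2))   # km
      dests = rng.uniform(0, 10, (n, 2))
      starts = rng.uniform(0, 60, n)         # minutes

      def shareability(max_km=1.0, max_min=5.0):
          unmatched, shared = set(range(n)), 0
          for i in range(n):
              if i not in unmatched:
                  continue
              for j in range(i + 1, n):
                  if (j in unmatched
                          and np.linalg.norm(origins[i] - origins[j]) < max_km
                          and np.linalg.norm(dests[i] - dests[j]) < max_km
                          and abs(starts[i] - starts[j]) < max_min):
                      unmatched -= {i, j}   # greedily pair the two trips
                      shared += 2
                      break
          return shared / n

      print(f"shareable fraction of trips: {shareability():.2f}")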

  17. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline

    PubMed Central

    2014-01-01

    Background Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. Results To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. Conclusions By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples. PMID:24475911

  18. The Impact of Varying Statutory Arrangements on Spatial Data Sharing and Access in Regional NRM Bodies

    NASA Astrophysics Data System (ADS)

    Paudyal, D. R.; McDougall, K.; Apan, A.

    2014-12-01

    Spatial information plays an important role in many social, environmental and economic decisions and is increasingly acknowledged as a national resource essential for wider societal and environmental benefits. Natural resource management is one area where spatial information can be used for improved planning and decision-making processes. In Australia, state government organisations are the custodians of spatial information necessary for natural resource management, and regional NRM bodies are responsible for regional delivery of NRM activities. The access and sharing of spatial information between government agencies and regional NRM bodies is therefore an important issue for improving natural resource management outcomes. The aim of this paper is to evaluate the current status of spatial information access, sharing and use under varying statutory arrangements, and its impacts on spatial data infrastructure (SDI) development in the catchment management sector in Australia. Further, it critically examines whether trends and significant variations exist due to different institutional arrangements (statutory versus non-statutory). A survey method was used to collect primary data from 56 regional natural resource management (NRM) bodies responsible for catchment management in Australia. Descriptive statistics were used to show the similarities and differences between statutory and non-statutory arrangements. The key factors which influence sharing of and access to spatial information are also explored. The results show that the current statutory and administrative arrangements and the regional focus for natural resource management are reasonable from a spatial information management perspective and provide an opportunity for building SDI at the catchment scale. However, effective institutional arrangements should align catchment SDI development activities with sub-national and national SDI development activities to address catchment management issues. We found minor differences in spatial information access, use and sharing due to the varying institutional environment (statutory versus non-statutory). The non-statutory group appears to be more flexible and self-sufficient, whilst statutory regional NRM bodies may lack flexibility in their spatial information management practices. We found that spatial information access, use and sharing have significant impacts on spatial data infrastructure development in the catchment management sector in Australia.

  19. Internet and mobile technologies: addressing the mental health of trauma survivors in less resourced communities.

    PubMed

    Ruzek, J I; Yeager, C M

    2017-01-01

    Internet and mobile technologies offer potentially critical ways of delivering mental health support in low-resource settings. Much evidence indicates an enormous negative impact of mental health problems in low- and middle-income countries (LMICs), and many of these problems are caused, or worsened, by exposure to wars, conflicts, natural and human-caused disasters, and other traumatic events. Though specific mental health treatments have been found to be efficacious and cost-effective for low-resource settings, most individuals living in these areas do not have access to them. Low-intensity task-sharing interventions will help, but there is a limit to the scalability and sustainability of human resources in these settings. To address the needs of trauma survivors, it will be important to develop and implement Internet and mobile technology resources to help reduce the scarcity, inequity, and inefficiency of current mental health services in LMICs. Mobile and Internet resources are experiencing a rapid growth in LMICs and can help address time, stigma, and cost barriers and connect those who have been socially isolated by traumatic events. This review discusses current research in technological interventions in low-resource settings and outlines key issues and future challenges and opportunities. Though formidable challenges exist for large-scale deployment of mobile and Internet mental health technologies, work to date indicates that these technologies are indeed feasible to develop, evaluate, and deliver to those in need of mental health services, and that they can be effective.

  20. Statistical physics of language dynamics

    NASA Astrophysics Data System (ADS)

    Loreto, Vittorio; Baronchelli, Andrea; Mukherjee, Animesh; Puglisi, Andrea; Tria, Francesca

    2011-04-01

    Language dynamics is a rapidly growing field that focuses on all processes related to the emergence, evolution, change and extinction of languages. Recently, the study of self-organization and evolution of language and meaning has led to the idea that a community of language users can be seen as a complex dynamical system, which collectively solves the problem of developing a shared communication framework through the back-and-forth signaling between individuals. We shall review some of the progress made in the past few years and highlight potential future directions of research in this area. In particular, the emergence of a common lexicon and of a shared set of linguistic categories will be discussed, as examples corresponding to the early stages of a language. The extent to which synthetic modeling is nowadays contributing to the ongoing debate in cognitive science will be pointed out. In addition, the burst of growth of the web is providing new experimental frameworks. It makes available a huge amount of resources, both as novel tools and data to be analyzed, allowing quantitative and large-scale analysis of the processes underlying the emergence of a collective information and language dynamics.
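
    One minimal model frequently used in this literature to study the emergence of a shared lexicon is the naming game; the sketch below (not drawn from the review itself) shows how repeated pairwise negotiations drive a population to consensus on a single name.

      import random

      random.seed(0)
      N = 100
      vocab = [[] for _ in range(N)]    # each agent's inventory of candidate names
      fresh_names = iter(range(10**6))  # source of newly invented names

      for _ in range(200_000):
          speaker, hearer = random.sample(range(N), 2)
          if not vocab[speaker]:
              vocab[speaker].append(next(fresh_names))
          word = random.choice(vocab[speaker])
          if word in vocab[hearer]:
              vocab[speaker] = [word]      # success: both collapse to the agreed name
              vocab[hearer] = [word]
          else:
              vocab[hearer].append(word)   # failure: the hearer learns the new name

      distinct = {w for inventory in vocab for w in inventory}
      print("distinct names remaining:", len(distinct))   # typically 1: consensus reached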

  1. A survey of tools and resources for the next generation analyst

    NASA Astrophysics Data System (ADS)

    Hall, David L.; Graham, Jake; Catherman, Emily

    2015-05-01

    We have previously argued that a combination of trends in information technology (IT) and changing habits of people using IT provide opportunities for the emergence of a new generation of analysts that can perform effective intelligence, surveillance and reconnaissance (ISR) on a "do it yourself" (DIY) or "armchair" approach (see D.L. Hall and J. Llinas (2014)). Key technology advances include: i) new sensing capabilities including the use of micro-scale sensors and ad hoc deployment platforms such as commercial drones, ii) advanced computing capabilities in mobile devices that allow advanced signal and image processing and modeling, iii) intelligent interconnections due to advances in "web N" capabilities, and iv) global interconnectivity and increasing bandwidth. In addition, the changing habits of the digital natives reflect new ways of collecting and reporting information, sharing information, and collaborating in dynamic teams. This paper provides a survey and assessment of tools and resources to support this emerging analysis approach. The tools range from large-scale commercial tools such as IBM i2 Analyst Notebook, Palantir, and GeoSuite to emerging open source tools such as GeoViz and DECIDE from university research centers. The tools include geospatial visualization tools, social network analysis tools and decision aids. A summary of tools is provided along with links to web sites for tool access.

  2. Stochastic multi-objective auto-optimization for resource allocation decision-making in fixed-input health systems.

    PubMed

    Bastian, Nathaniel D; Ekin, Tahir; Kang, Hyojung; Griffin, Paul M; Fulton, Lawrence V; Grannan, Benjamin C

    2017-06-01

    The management of hospitals within fixed-input health systems such as the U.S. Military Health System (MHS) can be challenging due to the large number of hospitals, as well as the uncertainty in input resources and achievable outputs. This paper introduces a stochastic multi-objective auto-optimization model (SMAOM) for resource allocation decision-making in fixed-input health systems. The model can automatically identify where to re-allocate system input resources at the hospital level in order to optimize overall system performance, while considering uncertainty in the model parameters. The model is applied to 128 hospitals in the three services (Air Force, Army, and Navy) in the MHS using hospital-level data from 2009 - 2013. The results are compared to the traditional input-oriented variable returns-to-scale Data Envelopment Analysis (DEA) model. The application of SMAOM to the MHS increases the expected system-wide technical efficiency by 18 % over the DEA model while also accounting for uncertainty of health system inputs and outputs. The developed method is useful for decision-makers in the Defense Health Agency (DHA), who have a strategic level objective of integrating clinical and business processes through better sharing of resources across the MHS and through system-wide standardization across the services. It is also less sensitive to data outliers or sampling errors than traditional DEA methods.
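
    The traditional input-oriented, variable-returns-to-scale DEA model used here as the comparison baseline can be written as one small linear program per hospital (a sketch with made-up data, not the SMAOM model or the MHS dataset; assumes NumPy and SciPy): minimize the input contraction factor theta subject to envelopment of the hospital's inputs and outputs by a convex combination of its peers.

      import numpy as np
      from scipy.optimize import linprog

      X = np.array([[5.0, 8.0, 6.0, 10.0],          # inputs  (m x n), e.g. staff
                    [300.0, 500.0, 400.0, 700.0]])  #                  e.g. budget
      Y = np.array([[100.0, 150.0, 140.0, 160.0]])  # outputs (s x n), e.g. visits
      m, n = X.shape
      s = Y.shape[0]

      def efficiency(o):
          # Decision variables: [theta, lambda_1, ..., lambda_n].
          c = np.r_[1.0, np.zeros(n)]
          A_ub = np.vstack([np.c_[-X[:, [o]], X],            # sum(lambda*x) <= theta*x_o
                            np.c_[np.zeros((s, 1)), -Y]])    # sum(lambda*y) >= y_o
          b_ub = np.r_[np.zeros(m), -Y[:, o]]
          A_eq = np.r_[0.0, np.ones(n)].reshape(1, -1)       # VRS convexity: sum(lambda)=1
          res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                        bounds=[(None, None)] + [(0, None)] * n)
          return res.x[0]

      for o in range(n):
          print(f"hospital {o}: input-oriented technical efficiency = {efficiency(o):.3f}")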

  3. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  4. CERN data services for LHC computing

    NASA Astrophysics Data System (ADS)

    Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.

    2017-10-01

    Dependability, resilience, adaptability and efficiency: growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad range of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution, while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent, complex production workloads. In parallel our systems provide the platform for continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR for large-scale storage; CERNBox for end-user access and sharing; Ceph as the data back-end for the CERN OpenStack infrastructure, NFS services and S3 functionality; and AFS for legacy distributed-file-system services. In this paper we summarise the experience of supporting LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment with pluggable protocols, tuneable QoS, sharing capabilities and fine-grained ACL management, while continuing to guarantee dependable and robust services.

  5. Cram

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gamblin, T.

    2014-08-29

    Large-scale systems like Sequoia allow running small numbers of very large (1M+ process) jobs, but their resource managers and schedulers do not allow large numbers of small (4, 8, 16, etc.) process jobs to run efficiently. Cram is a tool that allows users to launch many small MPI jobs within one large partition, and to overcome the limitations of current resource management software for large ensembles of jobs.
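
    The underlying idea, running many small independent jobs inside one large allocation, can be sketched with MPI communicator splitting (a conceptual sketch only, not Cram's implementation; assumes mpi4py and a launch such as `mpiexec -n 64 python script.py`).

      from mpi4py import MPI

      world = MPI.COMM_WORLD
      ranks_per_job = 4                            # hypothetical small-job size
      job_id = world.Get_rank() // ranks_per_job   # which small job this rank joins
      job_comm = world.Split(color=job_id, key=world.Get_rank())

      # Each small job now runs its own independent workload on job_comm.
      if job_comm.Get_rank() == 0:
          print(f"job {job_id} started with {job_comm.Get_size()} ranks")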

  6. The extent of interorganizational resource sharing among local health departments: the association with organizational characteristics and institutional factors.

    PubMed

    Vest, Joshua R; Shah, Gulzar H

    2012-11-01

    Resource sharing, arrangements between local health departments (LHDs) for joint programs or to share staff, is a growing occurrence. The post-9/11 influx of federal funding and new public health preparedness responsibilities dramatically increased the occurrence of these inter-LHD relationships, and several states have pursued more intrastate collaboration. This article describes the current state of resource sharing among LHDs and identifies the factors associated with resource sharing. Using the National Association of County & City Health Officials' 2010 Profile Survey, we determined the self-reported number of shared programmatic activities and the number of shared organizational functions for a sample of LHDs. Negative binomial regression models described the relationships between factors suggested by interorganizational theory and the counts of sharing activities. We examined the extent of resource sharing using 2 different count variables: (1) number of shared programmatic activities and (2) number of shared organizational functions. About one-half of all LHDs are engaged in resource sharing. The extent of sharing was lower for those serving larger populations, with city jurisdictions, or of larger size. Sharing was more extensive for state-governed LHDs, those covering multiple jurisdictions, states with centralized governance, and in instances of financial constraint. Many LHDs are engaged in a greater extent of resource sharing than others. Leaders of LHDs can work within the context of these factors to leverage resource sharing to meet their organizational needs.
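
    The class of model used in the study, a negative binomial regression of a count of shared activities on organizational characteristics, can be sketched as follows (synthetic data, not the Profile Survey; assumes NumPy and statsmodels).

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(3)
      n = 300
      log_population = rng.normal(10.0, 1.0, n)   # hypothetical jurisdiction-size covariate
      state_governed = rng.integers(0, 2, n)      # hypothetical governance indicator
      mean_count = np.exp(0.5 - 0.1 * (log_population - 10.0) + 0.4 * state_governed)
      shared_count = rng.poisson(mean_count)      # synthetic count of shared activities

      X = sm.add_constant(np.column_stack([log_population, state_governed]))
      model = sm.GLM(shared_count, X, family=sm.families.NegativeBinomial())
      print(model.fit().summary())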

  7. Formation of large-scale structure from cosmic-string loops and cold dark matter

    NASA Technical Reports Server (NTRS)

    Melott, Adrian L.; Scherrer, Robert J.

    1987-01-01

    Some results from a numerical simulation of the formation of large-scale structure from cosmic-string loops are presented. It is found that even though Gμ is required to be lower than 2 × 10^-6 (where μ is the mass per unit length of the string) to give a low enough autocorrelation amplitude, there is excessive power on smaller scales, so that galaxies would be more dense than observed. The large-scale structure does not include a filamentary or connected appearance and shares with more conventional models based on Gaussian perturbations the lack of cluster-cluster correlation at the mean cluster separation scale as well as excessively small bulk velocities at these scales.

  8. Data sharing by scientists: Practices and perceptions

    USGS Publications Warehouse

    Tenopir, C.; Allard, S.; Douglass, K.; Aydinoglu, A.U.; Wu, L.; Read, E.; Manoff, M.; Frame, M.

    2011-01-01

    Background: Scientific research in the 21st century is more data intensive and collaborative than in the past. It is important to study the data practices of researchers - data accessibility, discovery, re-use, preservation and, particularly, data sharing. Data sharing is a valuable part of the scientific method allowing for verification of results and extending research from prior results. Methodology/Principal Findings: A total of 1329 scientists participated in this survey exploring current data sharing practices and perceptions of the barriers and enablers of data sharing. Scientists do not make their data electronically available to others for various reasons, including insufficient time and lack of funding. Most respondents are satisfied with their current processes for the initial and short-term parts of the data or research lifecycle (collecting their research data; searching for, describing or cataloging, analyzing, and short-term storage of their data) but are not satisfied with long-term data preservation. Many organizations do not provide support to their researchers for data management both in the short- and long-term. If certain conditions are met (such as formal citation and sharing reprints) respondents agree they are willing to share their data. There are also significant differences and approaches in data management practices based on primary funding agency, subject discipline, age, work focus, and world region. Conclusions/Significance: Barriers to effective data sharing and preservation are deeply rooted in the practices and culture of the research process as well as the researchers themselves. New mandates for data management plans from NSF and other federal agencies and world-wide attention to the need to share and preserve data could lead to changes. Large scale programs, such as the NSF-sponsored DataNET (including projects like DataONE) will both bring attention and resources to the issue and make it easier for scientists to apply sound data management principles. © 2011 Tenopir et al.

  9. Data Sharing by Scientists: Practices and Perceptions

    PubMed Central

    Tenopir, Carol; Allard, Suzie; Douglass, Kimberly; Aydinoglu, Arsev Umur; Wu, Lei; Read, Eleanor; Manoff, Maribeth; Frame, Mike

    2011-01-01

    Background Scientific research in the 21st century is more data intensive and collaborative than in the past. It is important to study the data practices of researchers – data accessibility, discovery, re-use, preservation and, particularly, data sharing. Data sharing is a valuable part of the scientific method allowing for verification of results and extending research from prior results. Methodology/Principal Findings A total of 1329 scientists participated in this survey exploring current data sharing practices and perceptions of the barriers and enablers of data sharing. Scientists do not make their data electronically available to others for various reasons, including insufficient time and lack of funding. Most respondents are satisfied with their current processes for the initial and short-term parts of the data or research lifecycle (collecting their research data; searching for, describing or cataloging, analyzing, and short-term storage of their data) but are not satisfied with long-term data preservation. Many organizations do not provide support to their researchers for data management both in the short- and long-term. If certain conditions are met (such as formal citation and sharing reprints) respondents agree they are willing to share their data. There are also significant differences and approaches in data management practices based on primary funding agency, subject discipline, age, work focus, and world region. Conclusions/Significance Barriers to effective data sharing and preservation are deeply rooted in the practices and culture of the research process as well as the researchers themselves. New mandates for data management plans from NSF and other federal agencies and world-wide attention to the need to share and preserve data could lead to changes. Large scale programs, such as the NSF-sponsored DataNET (including projects like DataONE) will both bring attention and resources to the issue and make it easier for scientists to apply sound data management principles. PMID:21738610

  10. Taking Teacher Learning to Scale: Sharing Knowledge and Spreading Ideas across Geographies

    ERIC Educational Resources Information Center

    Klein, Emily J.; Jaffe-Walter, Reva; Riordan, Megan

    2016-01-01

    This research reports data from case studies of three intermediary organizations facing the challenge of scaling up teacher learning. The turn of the century launched scaling-up efforts of all three intermediaries, growing from intimate groups, where founding teachers and staff were key supports for teacher learning, to large multistate…

  11. The fundamental closed-form solution of control-related states of kth order S3PR system with left-side non-sharing resource places of Petri nets

    NASA Astrophysics Data System (ADS)

    Chao, Daniel Yuh; Yu, Tsung Hsien

    2016-01-01

    Due to the state explosion problem, it has long been impractical to enumerate the reachable states of Petri nets. Chao broke this barrier earlier by developing the first closed-form solution for the number of reachable and other states for marked graphs and the kth order system. Instead of using the first-met bad marking, we propose 'the moment to launch resource allocation' (MLR) as a partial deadlock avoidance policy for a large, real-time dynamic resource allocation system. At present, the future deadlock ratio of the current state can serve as the MLR indicator, because that ratio can be obtained in real time from a closed-form formula. This paper advances the application of the MLR concept to Gen-Left kth order systems (one non-sharing resource place in any position of the left-side process), which is also the most fundamental asymmetric net structure, by constructing a closed-form solution for the system's control-related states (reachable, forbidden, live and deadlock states) with a formula that depends on k and the location of the non-sharing resource. This opens the way to real-time, dynamic resource allocation decisions via a generalisation formula for kth order systems (Gen-Left) with r* on the left side at arbitrary locations.

  12. SmallTool - a toolkit for realizing shared virtual environments on the Internet

    NASA Astrophysics Data System (ADS)

    Broll, Wolfgang

    1998-09-01

    With increasing graphics capabilities of computers and higher network communication speed, networked virtual environments have become available to a large number of people. While the virtual reality modelling language (VRML) provides users with the ability to exchange 3D data, there is still a lack of appropriate support to realize large-scale multi-user applications on the Internet. In this paper we will present SmallTool, a toolkit to support shared virtual environments on the Internet. The toolkit consists of a VRML-based parsing and rendering library, a device library, and a network library. This paper will focus on the networking architecture, provided by the network library - the distributed worlds transfer and communication protocol (DWTP). DWTP provides an application-independent network architecture to support large-scale multi-user environments on the Internet.

  13. Composing Data Parallel Code for a SPARQL Graph Engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellana, Vito G.; Tumeo, Antonino; Villa, Oreste

    Big data analytics processes large amounts of data to extract knowledge from them. Semantic databases are big data applications that adopt the Resource Description Framework (RDF) to structure metadata through a graph-based representation. The graph-based representation provides several benefits, such as the possibility of performing in-memory processing with large amounts of parallelism. SPARQL is a language used to perform queries on RDF-structured data through graph matching. In this paper we present a tool that automatically translates SPARQL queries to parallel graph crawling and graph matching operations. The tool also supports complex SPARQL constructs, which require more than basic graph matching for their implementation. The tool generates parallel code annotated with OpenMP pragmas for x86 shared-memory multiprocessors (SMPs). With respect to commercial database systems such as Virtuoso, our approach reduces the memory occupation due to join operations and provides higher performance. We show the scaling of the automatically generated graph-matching code on a 48-core SMP.
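    The sketch below is a hedged, serial illustration of how a SPARQL basic graph pattern reduces to graph matching over RDF triples. It is not the paper's code generator and omits the OpenMP parallelization the tool actually emits; the triples and pattern are made up.

    ```python
    # Serial illustration: evaluate a basic graph pattern (two triple patterns
    # joined on a shared variable) against a toy RDF triple store.
    triples = [
        ("alice", "worksAt", "lab1"),
        ("bob",   "worksAt", "lab1"),
        ("alice", "authored", "paper1"),
        ("bob",   "authored", "paper2"),
    ]

    # Pattern: ?person worksAt lab1 . ?person authored ?paper
    pattern = [("?person", "worksAt", "lab1"), ("?person", "authored", "?paper")]

    def match(pattern, triples):
        """Enumerate variable bindings that satisfy every triple pattern."""
        bindings = [{}]
        for (s, p, o) in pattern:
            new_bindings = []
            for b in bindings:
                for (ts, tp, to) in triples:
                    trial = dict(b)
                    ok = True
                    for var, val in ((s, ts), (p, tp), (o, to)):
                        if var.startswith("?"):
                            if trial.get(var, val) != val:
                                ok = False
                                break
                            trial[var] = val
                        elif var != val:
                            ok = False
                            break
                    if ok:
                        new_bindings.append(trial)
            bindings = new_bindings
        return bindings

    print(match(pattern, triples))
    # [{'?person': 'alice', '?paper': 'paper1'}, {'?person': 'bob', '?paper': 'paper2'}]
    ```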

  14. Collaborative Working for Large Digitisation Projects

    ERIC Educational Resources Information Center

    Yeates, Robin; Guy, Damon

    2006-01-01

    Purpose: To explore the effectiveness of large-scale consortia for disseminating local heritage via the web. To describe the creation of a large geographically based cultural heritage consortium in the South East of England and management lessons resulting from a major web site digitisation project. To encourage the improved sharing of experience…

  15. Scaling to diversity: The DERECHOS distributed infrastructure for analyzing and sharing data

    NASA Astrophysics Data System (ADS)

    Rilee, M. L.; Kuo, K. S.; Clune, T.; Oloso, A.; Brown, P. G.

    2016-12-01

    Integrating Earth Science data from diverse sources such as satellite imagery and simulation output can be expensive and time-consuming, limiting scientific inquiry and the quality of our analyses. Reducing these costs will improve innovation and quality in science. The current Earth Science data infrastructure focuses on downloading data based on requests formed from the search and analysis of associated metadata. And while the data products provided by archives may use the best available data sharing technologies, scientist end-users generally do not have such resources (including staff) available to them. Furthermore, only once an end-user has received the data from multiple diverse sources and has integrated them can the actual analysis and synthesis begin. The cost of getting from idea to where synthesis can start dramatically slows progress. In this presentation we discuss a distributed computational and data storage framework that eliminates much of the aforementioned cost. The SciDB distributed array database is central as it is optimized for scientific computing involving very large arrays, performing better than less specialized frameworks like Spark. Adding spatiotemporal functions to the SciDB creates a powerful platform for analyzing and integrating massive, distributed datasets. SciDB allows Big Earth Data analysis to be performed "in place" without the need for expensive downloads and end-user resources. Spatiotemporal indexing technologies such as the hierarchical triangular mesh enable the compute and storage affinity needed to efficiently perform co-located and conditional analyses minimizing data transfers. These technologies automate the integration of diverse data sources using the framework, a critical step beyond current metadata search and analysis. Instead of downloading data into their idiosyncratic local environments, end-users can generate and share data products integrated from diverse multiple sources using a common shared environment, turning distributed active archive centers (DAACs) from warehouses into distributed active analysis centers.
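    The abstract above credits spatiotemporal indexing (the hierarchical triangular mesh) with the compute and storage affinity that makes co-located analysis possible. The sketch below is a hedged stand-in for that idea, not the HTM library or the SciDB API: a simple quadkey-style hierarchical cell index in which nearby observations share a bucket, so records from two hypothetical datasets can be joined by cell instead of by full scan.

    ```python
    # Hedged stand-in for hierarchical spatial indexing (not HTM or SciDB):
    # a quadkey-style cell id where nearby points share the same bucket.
    from collections import defaultdict

    def quadkey(lat, lon, level=8):
        """Hierarchical cell id: nearby points share long key prefixes."""
        key = []
        lat_lo, lat_hi, lon_lo, lon_hi = -90.0, 90.0, -180.0, 180.0
        for _ in range(level):
            lat_mid = (lat_lo + lat_hi) / 2
            lon_mid = (lon_lo + lon_hi) / 2
            cell = 0
            if lat >= lat_mid:
                cell |= 2
                lat_lo = lat_mid
            else:
                lat_hi = lat_mid
            if lon >= lon_mid:
                cell |= 1
                lon_lo = lon_mid
            else:
                lon_hi = lon_mid
            key.append(str(cell))
        return "".join(key)

    # Two hypothetical datasets (satellite retrievals and model output), joined by cell.
    satellite = [(37.42, -122.08, 291.3), (64.20, -21.80, 268.1)]
    model = [(37.40, -122.10, 290.8), (64.25, -21.90, 267.5)]

    buckets = defaultdict(lambda: {"sat": [], "mod": []})
    for lat, lon, val in satellite:
        buckets[quadkey(lat, lon)]["sat"].append(val)
    for lat, lon, val in model:
        buckets[quadkey(lat, lon)]["mod"].append(val)

    for cell, group in buckets.items():
        if group["sat"] and group["mod"]:
            print(cell, group)   # co-located observations fall in the same bucket
    ```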

  16. Webinar July 28: H2@Scale - A Potential Opportunity | News | NREL

    Science.gov Websites

    role of hydrogen at the grid scale and the efforts of a large, national lab team assembled to evaluate the potential of hydrogen to play a critical role in our energy future. Presenters will share facts

  17. Continuous quality improvement: a shared governance model that maximizes agent-specific knowledge.

    PubMed

    Burkoski, Vanessa; Yoon, Jennifer

    2013-01-01

    Motivate, Innovate, Celebrate: an innovative shared governance model through the establishment of continuous quality improvement (CQI) councils was implemented across the London Health Sciences Centre (LHSC). The model leverages agent-specific knowledge at the point of care and provides a structure aimed at building human resources capacity and sustaining enhancements to quality and safe care delivery. Interprofessional and cross-functional teams work through the CQI councils to identify, formulate, execute and evaluate CQI initiatives. In addition to a structure that facilitates collaboration, accountability and ownership, a corporate CQI Steering Committee provides the forum for scaling up and spreading this model. Point-of-care staff, clinical management and educators were trained in LEAN methodology and patient experience-based design to ensure sufficient knowledge and resources to support the implementation.

  18. Sibling relationships as a resource for coping with traumatic events.

    PubMed

    Perricone, Giovanna; Fontana, Valentina; Burgio, Sofia; Polizzi, Concetta

    2014-01-01

    The study investigated the correlation between the perception of the sibling relationship as a resource for coping with an adverse occurrence - the partial collapse of a primary school - and indicators of the traumatic impact set off by the event, elicited by soliciting the child's reminiscence of the catastrophic experience. One hundred trauma-exposed children were recruited from a Sicilian primary school and were administered the following research instruments: the Trauma Symptom Checklist for Children (TSCC-A), to investigate the traumatized response that can be triggered in the children involved; the Brother as a Resource Questionnaire (BRQ), to delve into the perception of sibling relationship as a resource. The outcomes showed statistically significant negative correlations between the Anxiety scale of the TSCC-A and the Scaffolding factors (r = -.260, p < .05) and Decision making process (r = -.315, p < .05) of the BRQ; between the Depression scale and the Scaffolding factors (r = -.147, p < .05), Emotional sharing (r = -.168, p < .05) and Decision making process (r = -.281, p < .05). The Anger scale correlated negatively with the Emotional sharing (r = -.187, p < .05), the Decision making process (r = -.182, p < .05) and the Scaffolding factors (r = -.279, p < .05); the Post-traumatic Stress correlated negatively with the Scaffolding factors (r = -.203, p < .05) and the Decision making process (r = -.238, p < .05). Lastly, the Dissociation correlated negatively with the Decision making process (r = -.270, p < .05).

  19. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole system aircraft simulation and whole system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  20. Compiler-directed cache management in multiprocessors

    NASA Technical Reports Server (NTRS)

    Cheong, Hoichi; Veidenbaum, Alexander V.

    1990-01-01

    The necessity of finding alternatives to hardware-based cache coherence strategies for large-scale multiprocessor systems is discussed. Three different software-based strategies sharing the same goals and general approach are presented. They consist of a simple invalidation approach, a fast selective invalidation scheme, and a version control scheme. The strategies are suitable for shared-memory multiprocessor systems with interconnection networks and a large number of processors. Results of trace-driven simulations conducted on numerical benchmark routines to compare the performance of the three schemes are presented.
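    The version control scheme mentioned above lends itself to a small illustration. The sketch below is a hedged, simplified rendering of the general idea (each shared variable carries a version number, and a cached copy is usable only while its recorded version is current); it is not the paper's compiler-directed scheme, and the class names are made up.

    ```python
    # Simplified illustration of version-based cache invalidation, not the
    # paper's exact scheme: a cached copy is stale when its recorded version
    # no longer matches the shared variable's current version.
    class SharedVariable:
        def __init__(self, value):
            self.value = value
            self.version = 0

        def write(self, value):
            self.value = value
            self.version += 1          # every write produces a new version

    class ProcessorCache:
        def __init__(self):
            self.lines = {}            # name -> (value, version seen at fill time)

        def read(self, name, shared):
            var = shared[name]
            cached = self.lines.get(name)
            if cached is not None and cached[1] == var.version:
                return cached[0]       # hit: cached copy is still current
            self.lines[name] = (var.value, var.version)   # miss or stale: refill
            return var.value

    shared = {"x": SharedVariable(1)}
    p0, p1 = ProcessorCache(), ProcessorCache()
    print(p0.read("x", shared))   # 1 (fill)
    shared["x"].write(42)         # another processor updates x
    print(p1.read("x", shared))   # 42 (fill)
    print(p0.read("x", shared))   # 42 (stale copy detected via version mismatch)
    ```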

  1. Consolidation of cloud computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  2. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory Vhub.org [Palma et al, J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources, and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the collaboratory's tools into educational and research workflows, it has become imperative to: a) redesign tools into robust, open source reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source development (facilitating community contributions), modularity, and reusability. Our initial targets are four popular tools on Vhub - TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories that are privately shared by "sneaker-net". As a partial solution to this we tested mechanisms using irods software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage such as uncertainty quantification for hazard analysis using physical models.

  3. [The digital information platform after-sale service of medical equipment].

    PubMed

    Cao, Shaoping; Li, Bin

    2015-01-01

    This paper describes an information management platform for the after-sale service of medical equipment, which uses large-scale shared data resources to enhance customer service throughout the medical service management process, strengthen quality management, and control medical risk.

  4. The Autism Brain Imaging Data Exchange: Towards Large-Scale Evaluation of the Intrinsic Brain Architecture in Autism

    PubMed Central

    Di Martino, Adriana; Yan, Chao-Gan; Li, Qingyang; Denio, Erin; Castellanos, Francisco X.; Alaerts, Kaat; Anderson, Jeffrey S.; Assaf, Michal; Bookheimer, Susan Y.; Dapretto, Mirella; Deen, Ben; Delmonte, Sonja; Dinstein, Ilan; Ertl-Wagner, Birgit; Fair, Damien A.; Gallagher, Louise; Kennedy, Daniel P.; Keown, Christopher L.; Keysers, Christian; Lainhart, Janet E.; Lord, Catherine; Luna, Beatriz; Menon, Vinod; Minshew, Nancy; Monk, Christopher S.; Mueller, Sophia; Müller, Ralph-Axel; Nebel, Mary Beth; Nigg, Joel T.; O’Hearn, Kirsten; Pelphrey, Kevin A.; Peltier, Scott J.; Rudie, Jeffrey D.; Sunaert, Stefan; Thioux, Marc; Tyszka, J. Michael; Uddin, Lucina Q.; Verhoeven, Judith S.; Wenderoth, Nicole; Wiggins, Jillian L.; Mostofsky, Stewart H.; Milham, Michael P.

    2014-01-01

    Autism spectrum disorders (ASD) represent a formidable challenge for psychiatry and neuroscience because of their high prevalence, life-long nature, complexity and substantial heterogeneity. Facing these obstacles requires large-scale multidisciplinary efforts. While the field of genetics has pioneered data sharing for these reasons, neuroimaging had not kept pace. In response, we introduce the Autism Brain Imaging Data Exchange (ABIDE) – a grassroots consortium aggregating and openly sharing 1112 existing resting-state functional magnetic resonance imaging (R-fMRI) datasets with corresponding structural MRI and phenotypic information from 539 individuals with ASD and 573 age-matched typical controls (TC; 7–64 years) (http://fcon_1000.projects.nitrc.org/indi/abide/). Here, we present this resource and demonstrate its suitability for advancing knowledge of ASD neurobiology based on analyses of 360 males with ASD and 403 male age-matched TC. We focused on whole-brain intrinsic functional connectivity and also survey a range of voxel-wise measures of intrinsic functional brain architecture. Whole-brain analyses reconciled seemingly disparate themes of both hypo and hyperconnectivity in the ASD literature; both were detected, though hypoconnectivity dominated, particularly for cortico-cortical and interhemispheric functional connectivity. Exploratory analyses using an array of regional metrics of intrinsic brain function converged on common loci of dysfunction in ASD (mid and posterior insula, posterior cingulate cortex), and highlighted less commonly explored regions such as thalamus. The survey of the ABIDE R-fMRI datasets provides unprecedented demonstrations of both replication and novel discovery. By pooling multiple international datasets, ABIDE is expected to accelerate the pace of discovery setting the stage for the next generation of ASD studies. PMID:23774715
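    The connectivity analyses described above reduce, at their core, to correlating regional time series. The sketch below is a minimal illustration of that step with simulated data; it is not the ABIDE preprocessing or group-analysis pipeline.

    ```python
    # Minimal sketch of the functional-connectivity step (correlating regional
    # time series); dimensions and data are simulated for illustration only.
    import numpy as np

    rng = np.random.default_rng(1)
    n_timepoints, n_regions = 180, 10            # hypothetical R-fMRI dimensions
    timeseries = rng.standard_normal((n_timepoints, n_regions))

    # Region-by-region functional connectivity matrix.
    fc = np.corrcoef(timeseries, rowvar=False)   # shape (n_regions, n_regions)

    # Fisher r-to-z transform, commonly applied before group comparisons.
    z = np.arctanh(np.clip(fc, -0.999999, 0.999999))
    np.fill_diagonal(z, 0.0)
    print(z.shape, z[:3, :3])
    ```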

  5. The autism brain imaging data exchange: towards a large-scale evaluation of the intrinsic brain architecture in autism.

    PubMed

    Di Martino, A; Yan, C-G; Li, Q; Denio, E; Castellanos, F X; Alaerts, K; Anderson, J S; Assaf, M; Bookheimer, S Y; Dapretto, M; Deen, B; Delmonte, S; Dinstein, I; Ertl-Wagner, B; Fair, D A; Gallagher, L; Kennedy, D P; Keown, C L; Keysers, C; Lainhart, J E; Lord, C; Luna, B; Menon, V; Minshew, N J; Monk, C S; Mueller, S; Müller, R-A; Nebel, M B; Nigg, J T; O'Hearn, K; Pelphrey, K A; Peltier, S J; Rudie, J D; Sunaert, S; Thioux, M; Tyszka, J M; Uddin, L Q; Verhoeven, J S; Wenderoth, N; Wiggins, J L; Mostofsky, S H; Milham, M P

    2014-06-01

    Autism spectrum disorders (ASDs) represent a formidable challenge for psychiatry and neuroscience because of their high prevalence, lifelong nature, complexity and substantial heterogeneity. Facing these obstacles requires large-scale multidisciplinary efforts. Although the field of genetics has pioneered data sharing for these reasons, neuroimaging had not kept pace. In response, we introduce the Autism Brain Imaging Data Exchange (ABIDE)-a grassroots consortium aggregating and openly sharing 1112 existing resting-state functional magnetic resonance imaging (R-fMRI) data sets with corresponding structural MRI and phenotypic information from 539 individuals with ASDs and 573 age-matched typical controls (TCs; 7-64 years) (http://fcon_1000.projects.nitrc.org/indi/abide/). Here, we present this resource and demonstrate its suitability for advancing knowledge of ASD neurobiology based on analyses of 360 male subjects with ASDs and 403 male age-matched TCs. We focused on whole-brain intrinsic functional connectivity and also survey a range of voxel-wise measures of intrinsic functional brain architecture. Whole-brain analyses reconciled seemingly disparate themes of both hypo- and hyperconnectivity in the ASD literature; both were detected, although hypoconnectivity dominated, particularly for corticocortical and interhemispheric functional connectivity. Exploratory analyses using an array of regional metrics of intrinsic brain function converged on common loci of dysfunction in ASDs (mid- and posterior insula and posterior cingulate cortex), and highlighted less commonly explored regions such as the thalamus. The survey of the ABIDE R-fMRI data sets provides unprecedented demonstrations of both replication and novel discovery. By pooling multiple international data sets, ABIDE is expected to accelerate the pace of discovery setting the stage for the next generation of ASD studies.

  6. Theoretical and Empirical Comparison of Big Data Image Processing with Apache Hadoop and Sun Grid Engine

    PubMed Central

    Bao, Shunxing; Weitendorf, Frederick D.; Plassard, Andrew J.; Huo, Yuankai; Gokhale, Aniruddha; Landman, Bennett A.

    2016-01-01

    The field of big data is generally concerned with the scale of processing at which traditional computational paradigms break down. In medical imaging, traditional large scale processing uses a cluster computer that combines a group of workstation nodes into a functional unit that is controlled by a job scheduler. Typically, a shared-storage network file system (NFS) is used to host imaging data. However, data transfer from storage to processing nodes can saturate network bandwidth when data is frequently uploaded/retrieved from the NFS, e.g., “short” processing times and/or “large” datasets. Recently, an alternative approach using Hadoop and HBase was presented for medical imaging to enable co-location of data storage and computation while minimizing data transfer. The benefits of using such a framework must be formally evaluated against a traditional approach to characterize the point at which simply “large scale” processing transitions into “big data” and necessitates alternative computational frameworks. The proposed Hadoop system was implemented on a production lab-cluster alongside a standard Sun Grid Engine (SGE). Theoretical models for wall-clock time and resource time for both approaches are introduced and validated. To provide real example data, three T1 image archives were retrieved from a university secure, shared web database and used to empirically assess computational performance under three configurations of cluster hardware (using 72, 109, or 209 CPU cores) with differing job lengths. Empirical results match the theoretical models. Based on these data, a comparative analysis is presented for when the Hadoop framework will be relevant and non-relevant for medical imaging. PMID:28736473
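    The paper introduces theoretical models for wall-clock and resource time; those models are not reproduced here. The sketch below is a deliberately crude, hypothetical wall-clock comparison that captures the qualitative point: on a shared-NFS cluster every wave of jobs pays a data-transfer cost on a saturable link, whereas a data-local (Hadoop-style) layout does not.

    ```python
    # Crude, hypothetical wall-clock models (not the paper's validated models).
    def nfs_wallclock(n_jobs, data_gb_per_job, compute_s, cores, nfs_gbps=1.0):
        """Each wave of jobs first pulls its input over one shared NFS link."""
        waves, leftover = divmod(n_jobs, cores)
        wave_sizes = [cores] * waves + ([leftover] if leftover else [])
        total = 0.0
        for size in wave_sizes:
            transfer_s = size * data_gb_per_job * 8 / nfs_gbps   # link is shared
            total += transfer_s + compute_s
        return total

    def datalocal_wallclock(n_jobs, compute_s, cores):
        """Data already resides on the compute nodes (Hadoop-style layout)."""
        waves = -(-n_jobs // cores)          # ceiling division
        return waves * compute_s

    print(nfs_wallclock(n_jobs=500, data_gb_per_job=0.5, compute_s=60, cores=100))  # 2300.0
    print(datalocal_wallclock(n_jobs=500, compute_s=60, cores=100))                 # 300
    ```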

  7. The importance of considering external influences during presuppression wildfire planning

    Treesearch

    Marc R. Wiitala; Andrew E. Wilson

    2008-01-01

    Few administrative units involved in wildland fire protection are islands unto themselves when it comes to wildfire activity and suppression. If not directly affected by the wildfire workload of their neighbors, they are affected by the availability of nationally shared resources impacted by wildfire activity at the regional and national scale. These external...

  8. Framework for Shared Drinking Water Risk Assessment.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lowry, Thomas Stephen; Tidwell, Vincent C.; Peplinski, William John

    Central to protecting our nation's critical infrastructure is the development of methodologies for prioritizing action and supporting resource allocation decisions associated with risk-reduction initiatives. Toward this need a web-based risk assessment framework that promotes the anonymous sharing of results among water utilities is demonstrated. Anonymous sharing of results offers a number of potential advantages such as assistance in recognizing and correcting bias, identification of 'unknown unknowns', self-assessment and benchmarking for the local utility, treatment of shared assets and/or threats across multiple utilities, and prioritization of actions beyond the scale of a single utility. The constructed framework was demonstrated for three water utilities. Demonstration results were then compared to risk assessment results developed using a different risk assessment application by a different set of analysts.

  9. Food for contagion: Synthesis and future directions for studying host–parasite responses to resource shifts in anthropogenic environments

    PubMed Central

    Altizer, Sonia. M.; Becker, Daniel J.; Epstein, Jonathan H.; Forbes, Kristian M.; Gillespie, Thomas R.; Hall, Richard J.; Hawley, Dana; Hernandez, Sonia M.; Martin, Lynn B.; Plowright, Raina K.; Satterfield, Dara A.; Streicker, Daniel G.

    2018-01-01

    Human-provided resource subsidies for wildlife are diverse, common, and have profound consequences for wildlife–pathogen interactions, as demonstrated by papers in this themed issue spanning empirical, theoretical, and management perspectives from a range of study systems. Contributions cut across scales of organization, from the within-host dynamics of immune function, to population-level impacts on parasite transmission, to landscape- and regional-scale patterns of infection. In this concluding paper, we identify common threads and key findings from author contributions, including the consequences of resource subsidies for (i) host immunity; (ii) animal aggregation and contact rates; (iii) host movement and landscape-level infection patterns; and (iv) inter-specific contacts and cross-species transmission. Exciting avenues for future work include studies that integrate mechanistic modeling and empirical approaches to better explore cross-scale processes, and experimental manipulations of food resources to quantify host and pathogen responses. Work is also needed to examine evolutionary responses to provisioning, and ask how diet-altered changes to the host microbiome influence infection processes. Given the massive public health and conservation implications of anthropogenic resource shifts, we end by underscoring the need for practical recommendations to manage supplemental feeding practices, limit human–wildlife conflicts over shared food resources, and reduce cross-species transmission risks, including to humans. PMID:29531154

  10. Breaking barriers through collaboration: the example of the Cell Migration Consortium.

    PubMed

    Horwitz, Alan Rick; Watson, Nikki; Parsons, J Thomas

    2002-10-15

    Understanding complex integrated biological processes, such as cell migration, requires interdisciplinary approaches. The Cell Migration Consortium, funded by a Large-Scale Collaborative Project Award from the National Institute of General Medical Science, develops and disseminates new technologies, data, reagents, and shared information to a wide audience. The development and operation of this Consortium may provide useful insights for those who plan similarly large-scale, interdisciplinary approaches.

  11. ESRI applications of GIS technology: Mineral resource development

    NASA Technical Reports Server (NTRS)

    Derrenbacher, W.

    1981-01-01

    The application of geographic information systems technology to large scale regional assessment related to mineral resource development, identifying candidate sites for related industry, and evaluating sites for waste disposal is discussed. Efforts to develop data bases were conducted at scales ranging from 1:3,000,000 to 1:25,000. In several instances, broad screening was conducted for large areas at a very general scale with more detailed studies subsequently undertaken in promising areas windowed out of the generalized data base. Increasingly, the systems which are developed are structured as the spatial framework for the long-term collection, storage, referencing, and retrieval of vast amounts of data about large regions. Typically, the reconnaissance data base for a large region is structured at 1:250,000 scale, data bases for smaller areas being structured at 1:25,000, 1:50,000 or 1:63,360. An integrated data base for the coterminous US was implemented at a scale of 1:3,000,000 for two separate efforts.

  12. Risk and the evolution of human exchange.

    PubMed

    Kaplan, Hillard S; Schniter, Eric; Smith, Vernon L; Wilson, Bart J

    2012-08-07

    Compared with other species, exchange among non-kin is a hallmark of human sociality in both the breadth of individuals and total resources involved. One hypothesis is that extensive exchange evolved to buffer the risks associated with hominid dietary specialization on calorie dense, large packages, especially from hunting. 'Lucky' individuals share food with 'unlucky' individuals with the expectation of reciprocity when roles are reversed. Cross-cultural data provide prima facie evidence of pair-wise reciprocity and an almost universal association of high-variance (HV) resources with greater exchange. However, such evidence is not definitive; an alternative hypothesis is that food sharing is really 'tolerated theft', in which individuals possessing more food allow others to steal from them, owing to the threat of violence from hungry individuals. Pair-wise correlations may reflect proximity providing greater opportunities for mutual theft of food. We report a laboratory experiment of foraging and food consumption in a virtual world, designed to test the risk-reduction hypothesis by determining whether people form reciprocal relationships in response to variance of resource acquisition, even when there is no external enforcement of any transfer agreements that might emerge. Individuals can forage in a high-mean, HV patch or a low-mean, low-variance (LV) patch. The key feature of the experimental design is that individuals can transfer resources to others. We find that sharing hardly occurs after LV foraging, but among HV foragers sharing increases dramatically over time. The results provide strong support for the hypothesis that people are pre-disposed to evaluate gains from exchange and respond to unsynchronized variance in resource availability through endogenous reciprocal trading relationships.
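    The risk-reduction hypothesis above has a simple quantitative core: pooling high-variance returns lowers each individual's consumption variance without changing the mean. The sketch below is a hedged Monte Carlo illustration of that logic, not the authors' virtual-world experiment.

    ```python
    # Hedged Monte Carlo sketch of risk reduction through sharing: two foragers
    # with high-variance returns pool and split each day's harvest.
    import numpy as np

    rng = np.random.default_rng(7)
    days = 10_000
    # High-variance foraging: a big package on a lucky day, nothing otherwise.
    a = rng.binomial(1, 0.3, days) * 10.0
    b = rng.binomial(1, 0.3, days) * 10.0

    solo_var = a.var()
    shared_var = ((a + b) / 2).var()
    print(f"mean return {a.mean():.2f}, solo variance {solo_var:.2f}, "
          f"shared variance {shared_var:.2f}")   # sharing roughly halves the variance
    ```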

  13. Data Mashups: Linking Human Health and Wellbeing with Weather, Climate and the Environment

    NASA Astrophysics Data System (ADS)

    Fleming, L. E.; Sarran, C.; Golding, B.; Haines, A.; Kessel, A.; Djennad, M.; Hajat, S.; Nichols, G.; Gordon Brown, H.; Depledge, M.

    2016-12-01

    A large part of the global disease burden can be linked to environmental factors, underpinned by unhealthy behaviours. Research into these linkages suffers from a lack of common tools and databases for investigations across many different scientific disciplines to explore these complex associations. The MEDMI (Medical and Environmental Data Mash-up Infrastructure) Partnership brings together leading organisations and researchers in climate, weather, environment, and human health. We have created a proof-of-concept central data and analysis system, the internet-based MEDMI Platform (www.data-mashup.org.uk), holding UK Met Office and Public Health England data, to serve as a common resource for researchers to link and analyse complex meteorological, environmental and epidemiological data in the UK. The Platform is hosted on its own dedicated server, with secure internet and in-person access with appropriate safeguards for ethical, copyright, security, preservation, and data sharing issues. Via the Platform, there is a demonstration Browser Application with access to user-selected subsets of the data for: a) analyses using time series (e.g. mortality/environmental variables), and b) data visualizations (e.g. infectious diseases/environmental variables). One demonstration project is linking climate change, harmful algal blooms and oceanographic modelling, building on coupled hydrodynamic-biogeochemical models; in situ and satellite observations as well as UK HAB data and hospital episode statistics data are being used for model verification and future forecasting. The MEDMI Project provides a demonstration of the potential, barriers, and challenges of these "data mashups" of environment and health data. Although there remain many challenges to creating and sustaining such a shared resource, these activities and resources are essential to truly explore the complex interactions between climate and other environmental change and health at the local and global scale.
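    The time-series linkage described above (e.g. health counts against environmental variables) can be illustrated with a small sketch. The example below is hypothetical and does not use the MEDMI Platform's actual interfaces; it simply aligns a simulated daily temperature series with a simulated daily case-count series and inspects lagged correlations.

    ```python
    # Hypothetical linkage of a daily environmental series with a daily health
    # series: merge on date and examine lagged correlation (not MEDMI code).
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(3)
    dates = pd.date_range("2014-01-01", periods=365, freq="D")
    temp = 10 + 8 * np.sin(np.arange(365) / 365 * 2 * np.pi) + rng.normal(0, 2, 365)
    # Hypothetical daily case counts that respond to temperature two days earlier.
    cases = rng.poisson(np.clip(20 + 0.8 * np.roll(temp, 2), 1, None))

    weather = pd.DataFrame({"date": dates, "temp": temp})
    health = pd.DataFrame({"date": dates, "cases": cases})
    merged = weather.merge(health, on="date")

    for lag in range(4):
        r = merged["cases"].corr(merged["temp"].shift(lag))
        print(f"lag {lag} days: r = {r:.2f}")   # strongest association near lag 2
    ```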

  14. Evaluating the implementation of a national disclosure policy for large-scale adverse events in an integrated health care system: identification of gaps and successes.

    PubMed

    Maguire, Elizabeth M; Bokhour, Barbara G; Wagner, Todd H; Asch, Steven M; Gifford, Allen L; Gallagher, Thomas H; Durfee, Janet M; Martinello, Richard A; Elwy, A Rani

    2016-11-11

    Many healthcare organizations have developed disclosure policies for large-scale adverse events, including the Veterans Health Administration (VA). This study evaluated VA's national large-scale disclosure policy and identified gaps and successes in its implementation. Semi-structured qualitative interviews were conducted with leaders, hospital employees, and patients at nine sites to elicit their perceptions of recent large-scale adverse event notifications and the national disclosure policy. Data were coded using the constructs of the Consolidated Framework for Implementation Research (CFIR). We conducted 97 interviews. Insights included how to handle the communication of large-scale disclosures through multiple levels of a large healthcare organization and how to manage ongoing communications about the event with employees. Of the 5 CFIR constructs and 26 sub-constructs assessed, seven were prominent in interviews. Leaders and employees specifically mentioned key problem areas involving 1) networks and communications during disclosure, 2) organizational culture, 3) engagement of external change agents during disclosure, and 4) a need for reflecting on and evaluating the policy implementation and disclosure itself. Patients shared 5) preferences for personal outreach by phone in place of the current use of certified letters. All interviewees discussed 6) issues with execution and 7) costs of the disclosure. The CFIR analysis reveals key problem areas that need to be addressed during disclosure, including timely communication patterns throughout the organization, establishing a supportive culture prior to implementation, using patient-approved, effective communication strategies during disclosures, providing follow-up support for employees and patients, and sharing lessons learned.

  15. Legal Agreements and the Governance of Research Commons: Lessons from Materials Sharing in Mouse Genomics

    PubMed Central

    Mishra, Amrita

    2014-01-01

    Abstract Omics research infrastructure such as databases and bio-repositories requires effective governance to support pre-competitive research. Governance includes the use of legal agreements, such as Material Transfer Agreements (MTAs). We analyze the use of such agreements in the mouse research commons, including by two large-scale resource development projects: the International Knockout Mouse Consortium (IKMC) and International Mouse Phenotyping Consortium (IMPC). We combine an analysis of legal agreements and semi-structured interviews with 87 members of the mouse model research community to examine legal agreements in four contexts: (1) between researchers; (2) deposit into repositories; (3) distribution by repositories; and (4) exchanges between repositories, especially those that are consortium members of the IKMC and IMPC. We conclude that legal agreements for the deposit and distribution of research reagents should be kept as simple and standard as possible, especially when minimal enforcement capacity and resources exist. Simple and standardized legal agreements reduce transactional bottlenecks and facilitate the creation of a vibrant and sustainable research commons, supported by repositories and databases. PMID:24552652

  16. Effective Tooling for Linked Data Publishing in Scientific Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Purohit, Sumit; Smith, William P.; Chappell, Alan R.

    Challenges that make it difficult to find, share, and combine published data, such as data heterogeneity and resource discovery, have led to increased adoption of semantic data standards and data publishing technologies. To make data more accessible, interconnected and discoverable, some domains are being encouraged to publish their data as Linked Data. Consequently, this trend greatly increases the amount of data that semantic web tools are required to process, store, and interconnect. In attempting to process and manipulate large data sets, tools (ranging from simple text editors to modern triplestores) eventually break down upon reaching undefined thresholds. This paper offers a systematic approach that data publishers can use to categorize suitable tools to meet their data publishing needs. We present a real-world use case, the Resource Discovery for Extreme Scale Collaboration (RDESC), which features a scientific dataset (maximum size of 1.4 billion triples) used to evaluate a toolbox for data publishing in climate research. This paper also introduces a semantic data publishing software suite developed for the RDESC project.
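    As a small illustration of the Linked Data publishing workflow discussed above, the sketch below builds and queries a few RDF triples with the rdflib Python library. It is not the RDESC software suite, and the example namespace and properties are made up.

    ```python
    # Hedged sketch of Linked Data publishing with rdflib (not the RDESC tools);
    # the namespace and properties are invented for illustration.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import RDF

    EX = Namespace("http://example.org/climate/")
    g = Graph()
    g.bind("ex", EX)

    obs = URIRef(EX["observation/1"])
    g.add((obs, RDF.type, EX.Observation))
    g.add((obs, EX.site, Literal("Barrow")))
    g.add((obs, EX.airTemperatureC, Literal(-12.4)))

    # Publishable serialization (Turtle) of the small graph.
    print(g.serialize(format="turtle"))

    # SPARQL query over the same graph.
    results = g.query("""
        PREFIX ex: <http://example.org/climate/>
        SELECT ?site ?temp WHERE {
            ?o a ex:Observation ; ex:site ?site ; ex:airTemperatureC ?temp .
        }
    """)
    for row in results:
        print(row.site, row.temp)
    ```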

  17. Electronic Resource Sharing in Community Colleges: A Snapshot of Florida, Wisconsin, Texas, and Louisiana.

    ERIC Educational Resources Information Center

    Mahoney, Brian D.

    2000-01-01

    States that several states are establishing networks for resource sharing. Florida offers these resources through the Florida Distance Learning Library Initiative, Wisconsin has BadgerLink and WISCAT, TexShare provides library resource sharing in Texas, and Louisiana has LOUIS and LLN. These are some of the states successfully demonstrating…

  18. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chase Qishi; Zhu, Michelle Mengxia

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, hence significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project aims to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific workflows with the convenience of a few mouse clicks while hiding the implementation and technical details from end users. Particularly, we will consider two types of applications with distinct performance requirements: data-centric and service-centric applications. For data-centric applications, the main workflow task involves large-volume data generation, catalog, storage, and movement, typically from supercomputers or experimental facilities to a team of geographically distributed users; for service-centric applications, the main focus of the workflow is on data archiving, preprocessing, filtering, synthesis, visualization, and other application-specific analysis. We will conduct a comprehensive comparison of existing workflow systems and choose the best suited one with open-source code, a flexible system structure, and a large user base as the starting point for our development. Based on the chosen system, we will develop and integrate new components including a black box design of computing modules, performance monitoring and prediction, and workflow optimization and reconfiguration, which are missing from existing workflow systems. A modular design for separating specification, execution, and monitoring aspects will be adopted to establish a common generic infrastructure suited for a wide spectrum of science applications.
We will further design and develop efficient workflow mapping and scheduling algorithms to optimize the workflow performance in terms of minimum end-to-end delay, maximum frame rate, and highest reliability. We will develop and demonstrate the SWAMP system in a local environment, the grid network, and the 100 Gbps Advanced Network Initiative (ANI) testbed. The demonstration will target scientific applications in climate modeling and high energy physics, and the functions to be demonstrated include workflow deployment, execution, steering, and reconfiguration. Throughout the project period, we will work closely with the science communities in the fields of climate modeling and high energy physics, including the Spallation Neutron Source (SNS) and Large Hadron Collider (LHC) projects, to mature the system for production use.
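    One of the optimization targets named above, minimum end-to-end delay of a workflow, can be illustrated with a small sketch. The example below is hypothetical and is not SWAMP: it computes the critical-path end-to-end delay of a toy DAG workflow under assumed module execution times.

    ```python
    # Hypothetical sketch (not SWAMP): end-to-end delay of a DAG workflow,
    # taken as the critical-path length under assumed module execution times.
    from graphlib import TopologicalSorter

    # Hypothetical workflow: module -> (execution time in seconds, predecessors)
    workflow = {
        "acquire":   (30, []),
        "filter":    (20, ["acquire"]),
        "simulate":  (90, ["acquire"]),
        "visualize": (15, ["filter", "simulate"]),
    }

    order = TopologicalSorter({m: deps for m, (_, deps) in workflow.items()}).static_order()

    finish = {}
    for module in order:
        cost, deps = workflow[module]
        finish[module] = cost + max((finish[d] for d in deps), default=0)

    print("end-to-end delay:", max(finish.values()), "seconds")   # 135
    ```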

  19. Cloud computing: a new business paradigm for biomedical information sharing.

    PubMed

    Rosenthal, Arnon; Mork, Peter; Li, Maya Hao; Stanford, Jean; Koester, David; Reynolds, Patti

    2010-04-01

    We examine how the biomedical informatics (BMI) community, especially consortia that share data and applications, can take advantage of a new resource called "cloud computing". Clouds generally offer resources on demand. In most clouds, charges are pay per use, based on large farms of inexpensive, dedicated servers, sometimes supporting parallel computing. Substantial economies of scale potentially yield costs much lower than dedicated laboratory systems or even institutional data centers. Overall, even with conservative assumptions, for applications that are not I/O intensive and do not demand a fully mature environment, the numbers suggested that clouds can sometimes provide major improvements, and should be seriously considered for BMI. Methodologically, it was very advantageous to formulate analyses in terms of component technologies; focusing on these specifics enabled us to bypass the cacophony of alternative definitions (e.g., exactly what does a cloud include) and to analyze alternatives that employ some of the component technologies (e.g., an institution's data center). Relative analyses were another great simplifier. Rather than listing the absolute strengths and weaknesses of cloud-based systems (e.g., for security or data preservation), we focus on the changes from a particular starting point, e.g., individual lab systems. We often find a rough parity (in principle), but one needs to examine individual acquisitions--is a loosely managed lab moving to a well managed cloud, or a tightly managed hospital data center moving to a poorly safeguarded cloud? 2009 Elsevier Inc. All rights reserved.

  20. Recreational use in dispersed public lands measured using social media data and on-site counts.

    PubMed

    Fisher, David M; Wood, Spencer A; White, Eric M; Blahna, Dale J; Lange, Sarah; Weinberg, Alex; Tomco, Michael; Lia, Emilia

    2018-09-15

    Outdoor recreation is one of many important benefits provided by public lands. Data on recreational use are critical for informing management of recreation resources, however, managers often lack actionable information on visitor use for large protected areas that lack controlled access points. The purpose of this study is to explore the potential for social media data (e.g., geotagged images shared on Flickr and trip reports shared on a hiking forum) to provide land managers with useful measures of recreational use to dispersed areas, and to provide lessons learned from comparing several more traditional counting methods. First, we measure daily and monthly visitation rates to individual trails within the Mount Baker-Snoqualmie National Forest (MBSNF) in western Washington. At 15 trailheads, we compare counts of hikers from infrared sensors, timelapse cameras, and manual on-site counts, to counts based on the number of shared geotagged images and trip reports from those locations. Second, we measure visitation rates to each National Forest System (NFS) unit across the US and compare annual measurements derived from the number of geotagged images to estimates from the US Forest Service National Visitor Use Monitoring Program. At both the NFS unit and the individual-trail scales, we found strong correlations between traditional measures of recreational use and measures based on user-generated content shared on the internet. For national forests in every region of the country, correlations between official Forest Service statistics and geotagged images ranged between 55% and 95%. For individual trails within the MBSNF, monthly visitor counts from on-site measurements were strongly correlated with counts from geotagged images (79%) and trip reports (91%). The convenient, cost-efficient and timely nature of collecting and analyzing user-generated data could allow land managers to monitor use over different seasons of the year and at sites and scales never previously monitored, contributing to a more comprehensive understanding of recreational use patterns and values. Copyright © 2018 Elsevier Ltd. All rights reserved.
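    The core comparison described above is a correlation between traditional visitor counts and counts derived from user-generated content. The sketch below illustrates that calculation with made-up monthly numbers; it is not the study's data or code.

    ```python
    # Made-up monthly numbers: correlate on-site trail counts with counts of
    # geotagged photos shared online (illustration only, not study data).
    import numpy as np
    from scipy.stats import pearsonr

    onsite_counts  = np.array([120,  95, 180, 260, 410, 640, 820, 790, 450, 300, 150, 110])
    geotagged_pics = np.array([ 14,  10,  22,  30,  48,  75, 101,  92,  55,  33,  18,  12])

    r, p = pearsonr(onsite_counts, geotagged_pics)
    print(f"Pearson r = {r:.2f} (p = {p:.3g})")
    ```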

  1. Improved Cognitive Development in Preterm Infants with Shared Book Reading.

    PubMed

    Braid, Susan; Bernstein, Jenny

    2015-01-01

    To examine the effect of shared book reading on the cognitive development of children born preterm and to determine what factors influence shared book reading in this population. Secondary analysis using the Early Childhood Longitudinal Study-Birth Cohort, a large, nationally representative survey of children born in the United States in 2001. One thousand four hundred singleton preterm infants (22-36 weeks gestation). Cognitive development measured using the Bayley Mental Scale score from the Bayley Scales of Infant Development Research Edition. Adjusting for neonatal, maternal, and socioeconomic characteristics, reading aloud more than two times a week is associated with higher cognitive development scores in two-year-old children born preterm (p < .001). Race/ethnicity and maternal education affect how often parents read to their children. Shared book reading holds potential as an early developmental intervention for this population.

  2. Social networks and environmental outcomes.

    PubMed

    Barnes, Michele L; Lynham, John; Kalberg, Kolter; Leung, PingSun

    2016-06-07

    Social networks can profoundly affect human behavior, which is the primary force driving environmental change. However, empirical evidence linking microlevel social interactions to large-scale environmental outcomes has remained scarce. Here, we leverage comprehensive data on information-sharing networks among large-scale commercial tuna fishers to examine how social networks relate to shark bycatch, a global environmental issue. We demonstrate that the tendency for fishers to primarily share information within their ethnic group creates segregated networks that are strongly correlated with shark bycatch. However, some fishers share information across ethnic lines, and examinations of their bycatch rates show that network contacts are more strongly related to fishing behaviors than ethnicity. Our findings indicate that social networks are tied to actions that can directly impact marine ecosystems, and that biases toward within-group ties may impede the diffusion of sustainable behaviors. Importantly, our analysis suggests that enhanced communication channels across segregated fisher groups could have prevented the incidental catch of over 46,000 sharks between 2008 and 2012 in a single commercial fishery.

  3. Scalable Triadic Analysis of Large-Scale Graphs: Multi-Core vs. Multi-Processor vs. Multi-Threaded Shared Memory Architectures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George; Marquez, Andres; Choudhury, Sutanay

    2012-09-01

    Triadic analysis encompasses a useful set of graph mining methods that is centered on the concept of a triad, which is a subgraph of three nodes and the configuration of directed edges across the nodes. Such methods are often applied in the social sciences as well as many other diverse fields. Triadic methods commonly operate on a triad census that counts the number of triads of every possible edge configuration in a graph. Like other graph algorithms, triadic census algorithms do not scale well when graphs reach tens of millions to billions of nodes. To enable the triadic analysis of large-scale graphs, we developed and optimized a triad census algorithm to efficiently execute on shared memory architectures. We will retrace the development and evolution of a parallel triad census algorithm. Over the course of several versions, we continually adapted the code’s data structures and program logic to expose more opportunities to exploit parallelism on shared memory that would translate into improved computational performance. We will recall the critical steps and modifications that occurred during code development and optimization. Furthermore, we will compare the performances of triad census algorithm versions on three specific systems: Cray XMT, HP Superdome, and an AMD multi-core NUMA machine. These three systems have shared memory architectures but with markedly different hardware capabilities to manage parallelism.
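    For readers unfamiliar with the triad census, the sketch below is a serial, simplified illustration that classifies every three-node subgraph of a toy directed graph by its dyad pattern (mutual, asymmetric, or null). It is coarser than the standard 16-type census and is not the optimized shared-memory code compared in the paper.

    ```python
    # Serial, simplified triad-census sketch: classify each 3-node subgraph
    # by its dyad pattern (M = mutual, A = asymmetric, N = null).
    from itertools import combinations
    from collections import Counter

    edges = {(1, 2), (2, 1), (2, 3), (3, 4), (4, 3), (1, 4)}
    nodes = {n for e in edges for n in e}

    def dyad(u, v):
        uv, vu = (u, v) in edges, (v, u) in edges
        if uv and vu:
            return "M"          # mutual
        if uv or vu:
            return "A"          # asymmetric
        return "N"              # null

    census = Counter()
    for a, b, c in combinations(sorted(nodes), 3):
        pattern = "".join(sorted(dyad(x, y) for x, y in ((a, b), (a, c), (b, c))))
        census[pattern] += 1

    print(census)   # e.g. Counter({'AMN': 4}) for this toy graph
    ```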

  4. Biodiversity, extinctions, and evolution of ecosystems with shared resources

    NASA Astrophysics Data System (ADS)

    Kozlov, Vladimir; Vakulenko, Sergey; Wennergren, Uno

    2017-03-01

    We investigate the formation of stable ecological networks where many species share the same resource. We show that such a stable ecosystem naturally occurs as a result of extinctions. We obtain an analytical relation for the number of coexisting species, and we find a relation describing how many species may become extinct as a result of a sharp environmental change. We introduce a special parameter that is a combination of species traits and resource characteristics used in the model formulation. This parameter describes the pressure on the system to converge through extinctions. When that stress parameter is large, we find that the species traits become concentrated at certain values. This stress parameter thereby determines the final level of biodiversity of the system. Moreover, we show that the dynamics of this limit system can be described by simple differential equations.

  5. Assessing Knowledge Sharing Among Academics: A Validation of the Knowledge Sharing Behavior Scale (KSBS).

    PubMed

    Ramayah, T; Yeap, Jasmine A L; Ignatius, Joshua

    2014-04-01

    There is a belief that academics tend to hold on tightly to their knowledge and intellectual resources. However, not much effort has been put into the creation of a valid and reliable instrument to measure knowledge sharing behavior among the academics. To apply and validate the Knowledge Sharing Behavior Scale (KSBS) as a measure of knowledge sharing behavior within the academic community. Respondents (N = 447) were academics from arts and science streams in 10 local, public universities in Malaysia. Data were collected using the 28-item KSBS that assessed four dimensions of knowledge sharing behavior namely written contributions, organizational communications, personal interactions, and communities of practice. The exploratory factor analysis showed that the items loaded on the dimension constructs that they were supposed to represent, thus proving construct validity. A within-factor analysis revealed that each set of items representing their intended dimension loaded on only one construct, therefore establishing convergent validity. All four dimensions were not perfectly correlated with each other or organizational citizenship behavior, thereby proving discriminant validity. However, all four dimensions correlated with organizational commitment, thus confirming predictive validity. Furthermore, all four factors correlated with both tacit and explicit sharing, which confirmed their concurrent validity. All measures also possessed sufficient reliability (α > .70). The KSBS is a valid and reliable instrument that can be used to formally assess the types of knowledge artifacts residing among academics and the degree of knowledge sharing in relation to those artifacts. © The Author(s) 2014.
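    The reliability criterion reported above (α > .70) refers to Cronbach's alpha. The sketch below computes it from hypothetical Likert-scale item responses; the data are simulated and are not the KSBS study data.

    ```python
    # Cronbach's alpha for one hypothetical set of Likert-scale items
    # (simulated data, not the KSBS responses).
    import numpy as np

    rng = np.random.default_rng(5)
    true_score = rng.normal(3.5, 0.8, size=200)
    items = np.clip(np.round(true_score[:, None] + rng.normal(0, 0.7, size=(200, 7))), 1, 5)

    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    alpha = k / (k - 1) * (1 - item_vars / total_var)
    print(f"Cronbach's alpha = {alpha:.2f}")
    ```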

  6. Policy approaches to renewable energy investment in the Mediterranean region

    NASA Astrophysics Data System (ADS)

    Patt, A.; Komendantova, N.; Battaglini, A.; Lilliestam, J.; Williges, K.

    2009-04-01

    Europe's climate policy objective of 20% renewable energy by 2020, and the call by the IPCC to reduce greenhouse gas emissions by 80% by 2050, pose major challenges for the European Union. Several policy options are available to move towards these objectives. In this paper, we will address the most critical policy and governance issues associated with one particular approach to scaling up renewable energy resources: reliance on large-scale energy generation facilities outside the European continent, such as onshore and offshore wind farms and concentrating solar power (CSP) facilities in the Mediterranean region. Several feasibility studies completed over the past three years (German Aerospace Center 2006; German Aerospace Center 2005; Czisch, Elektrotechnik 2005, p. 488; Lorenz, Pinner, Seitz, McKinsey Quarterly 2008, p.10; German Aerospace Center 2005; Knies 2008, The Club of Rome; Khosla, Breaking the Climate Deadlock Briefing Papers, 2008, p.19) have convincingly demonstrated that large-scale wind and CSP projects ought to be very attractive for a number of reasons, including cost, reliability of power supply, and technological maturity. According to these studies it would be technically possible for Europe to rely on large-scale wind and CSP for the majority of its power needs by 2050—indeed enough to completely replace its reliance on fossil fuels for power generation—at a cost competitive with its current, carbon-intensive system. While it has been shown to be technically feasible to develop renewable resources in North Africa to account for a large share of Europe's energy needs, doing so would require sustained double-digit rates of growth in generating and long-distance transmission capacity, and would potentially require a very different high-voltage grid architecture within Europe. Doing so at a large scale could require enormous up-front investments in technical capacity, financial instruments and human resources. What are the policy instruments best suited to achieving such growth quickly and smoothly? What bottlenecks—in terms of supply chains, human capital, finance, and transmission capacity—need to be anticipated and addressed if the rate of capacity growth is to be sustained over several decades? What model of governance would create a safe investment climate consistent with new EU legislation (i.e., the EU Renewable Energy Directive) as well as expected post-Kyoto targets and mechanisms? The material that we present here is based on a series of workshops held between November 2008 and January 2009, in which a wide range of stakeholders expressed their views about the fundamental needs for policy intervention. The results from these workshops have been supplemented by additional expert interviews and basic financial modeling. One of the interesting results from this research is the need for a multi-pronged approach. First, there is a need for a support scheme, potentially compatible with, and in all cases supplementing, the EU REN Directive, that would create a stable market for North African electricity in Europe. Second, there is a need for policies that facilitate the formation of public-private partnerships in North Africa as the specific investment vehicle, as a way to manage some of the uncertainties associated with large-scale investments in the region. Third, attention has to be paid to the development of supply chains within the Mediterranean region, as a way of ensuring the compatibility of such investments with sustainable development.

  7. Wireless Shared Resources: Sharing Right-Of-Way For Wireless Telecommunications, Guidance On Legal And Institutional Issues

    DOT National Transportation Integrated Search

    1997-06-06

    Public-private partnerships: shared resource projects are public-private arrangements that involve sharing public property such as rights-of-way and private resources such as telecommunications capacity and expertise. Typically, private telecommuni...

  8. The HydroShare Collaborative Repository for the Hydrology Community

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Couch, A.; Hooper, R. P.; Dash, P. K.; Stealey, M.; Yi, H.; Bandaragoda, C.; Castronova, A. M.

    2017-12-01

    HydroShare is an online collaboration system for sharing hydrologic data, analytical tools, and models. It supports the sharing of, and collaboration around, "resources" which are defined by standardized content types for data formats and models commonly used in hydrology. With HydroShare you can: Share your data and models with colleagues; Manage who has access to the content that you share; Share, access, visualize and manipulate a broad set of hydrologic data types and models; Use the web services application programming interface (API) to program automated and client access; Publish data and models and obtain a citable digital object identifier (DOI); Aggregate your resources into collections; Discover and access data and models published by others; Use web apps to visualize, analyze and run models on data in HydroShare. This presentation will describe the functionality and architecture of HydroShare, highlighting our approach to making this system easy to use and serving the needs of the hydrology community represented by the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI). Metadata for uploaded files is harvested automatically or captured using easy-to-use web user interfaces. Users are encouraged to add or create resources in HydroShare early in the data life cycle. To encourage this, we allow users to share and collaborate on HydroShare resources privately among individual users or groups, entering metadata while doing the work. HydroShare also provides enhanced functionality for users through web apps that provide tools and computational capability for actions on resources. HydroShare's architecture is broadly composed of: (1) resource storage, (2) a resource exploration website, and (3) web apps for actions on resources. System components are loosely coupled and interact through APIs, which enhances robustness, as components can be upgraded and advanced relatively independently. The full power of this paradigm is the extensibility it supports. Web apps are hosted on separate servers, which may be third-party servers. They are registered in HydroShare using a web app resource that configures the connectivity for them to be discovered and launched directly from the resource types they are associated with.
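
    The abstract mentions a web services API for automated access. The sketch below shows what programmatic discovery of resources might look like using plain HTTP requests; the endpoint path, query parameters and JSON field names are assumptions made for illustration, so the official HydroShare API documentation should be treated as authoritative.

        # Illustrative sketch of querying HydroShare's web services API.
        # The endpoint path and response field names below are assumptions.
        import requests

        BASE = "https://www.hydroshare.org/hsapi"   # assumed base URL for the API

        # Request the first page of publicly discoverable resources.
        resp = requests.get(f"{BASE}/resource/", params={"page": 1}, timeout=30)
        resp.raise_for_status()

        for res in resp.json().get("results", []):   # assumed pagination format
            print(res.get("resource_id"), "-", res.get("resource_title"))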

  9. Transition from lognormal to χ2-superstatistics for financial time series

    NASA Astrophysics Data System (ADS)

    Xu, Dan; Beck, Christian

    2016-07-01

    Share price returns on different time scales can be well modelled by superstatistical dynamics. Here we investigate which type of superstatistics is most suitable for properly describing share price dynamics on various time scales. It is shown that while χ2-superstatistics works well on a time scale of days, on a much smaller time scale of minutes the price changes are better described by lognormal superstatistics. The system dynamics thus exhibits a transition from lognormal to χ2 superstatistics as a function of time scale. We discuss a more general model interpolating between both statistics, which fits the observed data very well. We also present results on correlation functions of the extracted superstatistical volatility parameter, which exhibits exponential decay for returns on large time scales, whereas for returns on small time scales there are long-range correlations and power-law decay.
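
    For orientation, the superstatistical marginal distribution of returns and the two candidate distributions of the inverse-variance parameter β can be written in the standard Beck-Cohen form below; the paper's exact parameterization may differ, so this is only a reference sketch.

        \[
        p(u) = \int_0^\infty f(\beta)\,\sqrt{\frac{\beta}{2\pi}}\; e^{-\tfrac{1}{2}\beta u^2}\, d\beta ,
        \]
        \[
        f_{\chi^2}(\beta) = \frac{1}{\Gamma(n/2)}\left(\frac{n}{2\beta_0}\right)^{n/2}\beta^{n/2-1} e^{-n\beta/(2\beta_0)},
        \qquad
        f_{\mathrm{LN}}(\beta) = \frac{1}{\beta s\sqrt{2\pi}}\exp\!\left(-\frac{\bigl(\ln(\beta/\mu)\bigr)^2}{2s^2}\right).
        \]

    A χ2 (gamma) distributed β yields Student-t (q-Gaussian) return distributions, whereas a lognormally distributed β yields lognormal superstatistics; the transition the abstract describes is a change in which f(β) best matches the empirically extracted volatility fluctuations.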

  10. Using learning networks to understand complex systems: a case study of biological, geophysical and social research in the Amazon.

    PubMed

    Barlow, Jos; Ewers, Robert M; Anderson, Liana; Aragao, Luiz E O C; Baker, Tim R; Boyd, Emily; Feldpausch, Ted R; Gloor, Emanuel; Hall, Anthony; Malhi, Yadvinder; Milliken, William; Mulligan, Mark; Parry, Luke; Pennington, Toby; Peres, Carlos A; Phillips, Oliver L; Roman-Cuesta, Rosa Maria; Tobias, Joseph A; Gardner, Toby A

    2011-05-01

    Developing high-quality scientific research will be most effective if research communities with diverse skills and interests are able to share information and knowledge, are aware of the major challenges across disciplines, and can exploit economies of scale to provide robust answers and better inform policy. We evaluate opportunities and challenges facing the development of a more interactive research environment by developing an interdisciplinary synthesis of research on a single geographic region. We focus on the Amazon as it is of enormous regional and global environmental importance and faces a highly uncertain future. To take stock of existing knowledge and provide a framework for analysis we present a set of mini-reviews from fourteen different areas of research, encompassing taxonomy, biodiversity, biogeography, vegetation dynamics, landscape ecology, earth-atmosphere interactions, ecosystem processes, fire, deforestation dynamics, hydrology, hunting, conservation planning, livelihoods, and payments for ecosystem services. Each review highlights the current state of knowledge and identifies research priorities, including major challenges and opportunities. We show that while substantial progress is being made across many areas of scientific research, our understanding of specific issues is often dependent on knowledge from other disciplines. Accelerating the acquisition of reliable and contextualized knowledge about the fate of complex pristine and modified ecosystems is partly dependent on our ability to exploit economies of scale in shared resources and technical expertise, recognise and make explicit interconnections and feedbacks among sub-disciplines, increase the temporal and spatial scale of existing studies, and improve the dissemination of scientific findings to policy makers and society at large. Enhancing interaction among research efforts is vital if we are to make the most of limited funds and overcome the challenges posed by addressing large-scale interdisciplinary questions. Bringing together a diverse scientific community with a single geographic focus can help increase awareness of research questions both within and among disciplines, and reveal the opportunities that may exist for advancing acquisition of reliable knowledge. This approach could be useful for a variety of globally important scientific questions. © 2010 The Authors. Biological Reviews © 2010 Cambridge Philosophical Society.

  11. Creating a data resource: what will it take to build a medical information commons?

    PubMed

    Deverka, Patricia A; Majumder, Mary A; Villanueva, Angela G; Anderson, Margaret; Bakker, Annette C; Bardill, Jessica; Boerwinkle, Eric; Bubela, Tania; Evans, Barbara J; Garrison, Nanibaa' A; Gibbs, Richard A; Gentleman, Robert; Glazer, David; Goldstein, Melissa M; Greely, Hank; Harris, Crane; Knoppers, Bartha M; Koenig, Barbara A; Kohane, Isaac S; La Rosa, Salvatore; Mattison, John; O'Donnell, Christopher J; Rai, Arti K; Rehm, Heidi L; Rodriguez, Laura L; Shelton, Robert; Simoncelli, Tania; Terry, Sharon F; Watson, Michael S; Wilbanks, John; Cook-Deegan, Robert; McGuire, Amy L

    2017-09-22

    National and international public-private partnerships, consortia, and government initiatives are underway to collect and share genomic, personal, and healthcare data on a massive scale. Ideally, these efforts will contribute to the creation of a medical information commons (MIC), a comprehensive data resource that is widely available for both research and clinical uses. Stakeholder participation is essential in clarifying goals, deepening understanding of areas of complexity, and addressing long-standing policy concerns such as privacy and security and data ownership. This article describes eight core principles proposed by a diverse group of expert stakeholders to guide the formation of a successful, sustainable MIC. These principles promote formation of an ethically sound, inclusive, participant-centric MIC and provide a framework for advancing the policy response to data-sharing opportunities and challenges.

  12. Changing from computing grid to knowledge grid in life-science grid.

    PubMed

    Talukdar, Veera; Konar, Amit; Datta, Ayan; Choudhury, Anamika Roy

    2009-09-01

    Grid computing has a great potential to become a standard cyber infrastructure for life sciences that often require high-performance computing and large data handling, which exceeds the computing capacity of a single institution. Grid computing applies the resources of many computers in a network to a single problem at the same time. It is useful for scientific problems that require a great number of computer processing cycles or access to a large amount of data. As biologists, we are constantly discovering millions of genes and genome features, which are assembled in a library and distributed on computers around the world. This means that new, innovative methods must be developed that exploit the resources available for extensive calculations - for example, grid computing. This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life scientific problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation to share tacit knowledge within a community. By extending the concept of grid from computing grid to knowledge grid, it is possible to make use of a grid not only as sharable computing resources, but also as a time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  13. Managing large-scale workflow execution from resource provisioning to provenance tracking: The CyberShake example

    USGS Publications Warehouse

    Deelman, E.; Callaghan, S.; Field, E.; Francoeur, H.; Graves, R.; Gupta, N.; Gupta, V.; Jordan, T.H.; Kesselman, C.; Maechling, P.; Mehringer, J.; Mehta, G.; Okaya, D.; Vahi, K.; Zhao, L.

    2006-01-01

    This paper discusses the process of building an environment where large-scale, complex, scientific analysis can be scheduled onto a heterogeneous collection of computational and storage resources. The example application is the Southern California Earthquake Center (SCEC) CyberShake project, an analysis designed to compute probabilistic seismic hazard curves for sites in the Los Angeles area. We explain which software tools were used to build the system and describe their functionality and interactions. We show the results of running the CyberShake analysis, which included over 250,000 jobs, using resources available through SCEC and the TeraGrid. © 2006 IEEE.

  14. Summary of Research 1992

    DTIC Science & Technology

    1992-12-01

    Hsiao, D. K., "Federated Databases and Systems: A Tutorial on Their Data Sharing," The International Journal on Very Large Data Bases (VLDB Journal), Vol. 1, No. 1, July 1992. Hsiao, D. K., "Federated Databases and Systems: A Tutorial on Their Resource Consolidation," The International Journal on Very Large Data Bases (VLDB Journal), Vol. 1, No. 2... [record truncated; a remaining fragment refers to a paper on a game-theoretic normal approximation accepted for publication by the International Journal of Game Theory]

  15. Transactive memory systems scale for couples: development and validation

    PubMed Central

    Hewitt, Lauren Y.; Roberts, Lynne D.

    2015-01-01

    People in romantic relationships can develop shared memory systems by pooling their cognitive resources, allowing each person access to more information but with less cognitive effort. Research examining such memory systems in romantic couples largely focuses on remembering word lists or performing lab-based tasks, but these types of activities do not capture the processes underlying couples’ transactive memory systems, and may not be representative of the ways in which romantic couples use their shared memory systems in everyday life. We adapted an existing measure of transactive memory systems for use with romantic couples (TMSS-C), and conducted an initial validation study. In total, 397 participants who each identified as being a member of a romantic relationship of at least 3 months duration completed the study. The data provided a good fit to the anticipated three-factor structure of the components of couples’ transactive memory systems (specialization, credibility and coordination), and there was reasonable evidence of both convergent and divergent validity, as well as strong evidence of test–retest reliability across a 2-week period. The TMSS-C provides a valuable tool that can quickly and easily capture the underlying components of romantic couples’ transactive memory systems. It has potential to help us better understand this intriguing feature of romantic relationships, and how shared memory systems might be associated with other important features of romantic relationships. PMID:25999873

  16. Baseline Suitability Analysis

    DTIC Science & Technology

    2013-07-18

    [Truncated record: table fragment listing DoD components (DFAS, AAFES, DLA, DoDEA, DHRA) and functions including Human Resources - HR Shared Services (Indianapolis, IN), Personnel Security, and Force Protection, with Yes/No suitability indicators; the table structure is not recoverable from the source.]

  17. Preservice Teachers' Participation and Perceptions of Twitter Live Chats as Personal Learning Networks

    ERIC Educational Resources Information Center

    Luo, Tian; Sickel, Jamie; Cheng, Li

    2017-01-01

    This study presents two cases in which undergraduates were introduced to Twitter in their teacher preparation program as a means of developing a personal learning network. Twitter live chats are synchronous discussions that allow education stakeholders to discuss issues and share resources, engaging on potentially a global scale via the social…

  18. Assessing the accuracy of a regional land cover classification

    Treesearch

    William Clerke; Raymond Czaplewski; Jeff Campbell; Janet Fahringer

    1996-01-01

    The Southern Region USDA Forest Service recently completed the Southern Appalachian Assessment (SAA). The Assessment is a broad-scale interagency analysis and sharing of existing information relative to the natural and human resources of the region. The SAA encompasses over 36 million acres extending from Northern Virginia to Northern Alabama. It was clear early in the...

  19. Forecasting the Depletion of Transboundary Groundwater Resources in Hyper-Arid Environments

    NASA Astrophysics Data System (ADS)

    Mazzoni, A.; Heggy, E.

    2014-12-01

    Growing awareness over recent decades of the overexploitation of transboundary groundwater resources in hyper-arid environments has highlighted the need to better map, monitor and manage these resources. Climate change, economic and population growth are driving forces that put more pressure on these fragile but fundamental resources. The aim of our approach is to address the question of whether or not groundwater resources, especially non-renewable ones, could serve as a "backstop" water resource during the water shortage periods that will probably affect drylands over the coming 100 years. The high dependence of arid regions on these resources requires prudent management to be able to preserve their fossil aquifers and exploit them in a more sustainable way. We use the NetLogo environment with the FAO Aquastat Database to evaluate whether current trends in the extraction, consumption and use of non-renewable groundwater resources would remain feasible under future climate change impacts and population growth scenarios. Three case studies were selected: the Nubian Sandstone Aquifer System, shared between Egypt, Libya, Sudan and Chad; the North Western Sahara Aquifer System, shared between Algeria, Tunisia and Libya; and the central part of the Umm Radhuma-Dammam Aquifer, shared between Saudi Arabia, Qatar and Bahrain. The reasons these three fossil aquifers were selected are manifold. First, they are properly transboundary non-renewable groundwater resources, with all the implications that derive from this, i.e., the necessity of scientific and socio-political cooperation among riparian states, the importance of monitoring the status of shared resources and the need to elaborate a shared management policy. Furthermore, each country is characterized by hyper-arid climatic conditions, which will be exacerbated in the next century by climate change and lead to probable severe water shortage periods. Together with climate change, population growth rates will reach unprecedented levels in these areas, causing the water demand of these nations to grow substantially. Our preliminary simulation results suggest that fossil aquifers cannot be used as a long-term solution for water shortage in hyper-arid environments. Aquifers in the Arabian Peninsula are forecast to be depleted within decades.
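
    The kind of depletion forecast described above can be illustrated with a deliberately simple projection of fossil-aquifer storage under growing extraction. All numbers below are placeholder assumptions, not data for the aquifers named in the abstract, and the loop is a toy stand-in for the agent-based NetLogo simulations the authors use.

        # Toy projection of fossil-aquifer depletion under growing extraction.
        # Storage, extraction and growth-rate values are illustrative assumptions.

        storage_km3    = 1000.0   # assumed recoverable storage
        extraction_km3 = 5.0      # assumed current annual extraction
        growth_rate    = 0.03     # assumed annual growth in extraction

        year = 0
        while storage_km3 > 0 and year < 500:
            storage_km3 -= extraction_km3          # draw down the non-renewable stock
            extraction_km3 *= 1.0 + growth_rate    # demand grows with population/economy
            year += 1

        print(f"aquifer effectively depleted after ~{year} years")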

  20. Space Industrialization: The Mirage of Abundance.

    ERIC Educational Resources Information Center

    Deudney, Daniel

    1982-01-01

    Large-scale space industrialization is not a viable solution to the population, energy, and resource problems of earth. The expense and technological difficulties involved in the development and maintenance of space manufacturing facilities, space colonies, and large-scale satellites for solar power are discussed. (AM)

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marrinan, Thomas; Leigh, Jason; Renambot, Luc

    Mixed presence collaboration involves remote collaboration between multiple collocated groups. This paper presents the design and results of a user study that focused on mixed presence collaboration using large-scale tiled display walls. The research was conducted in order to compare data synchronization schemes for multi-user visualization applications. Our study compared three techniques for sharing data between display spaces with varying constraints and affordances. The results provide empirical evidence that using data sharing techniques with continuous synchronization between the sites leads to improved collaboration for a search and analysis task between remotely located groups. We have also identified aspects of synchronized sessions that result in increased remote collaborator awareness and parallel task coordination. It is believed that this research will lead to better utilization of large-scale tiled display walls for distributed group work.

  2. Are There Shared Environmental Influences on Attention-Deficit/Hyperactivity Disorder? Reply to Wood, Buitelaar, Rijsdijk, Asherson, and Kunsti (2010)

    ERIC Educational Resources Information Center

    Burt, S. Alexandra

    2010-01-01

    A recent large-scale meta-analysis of twin and adoption studies indicated that shared environmental influences make important contributions to most forms of child and adolescent psychopathology (Burt, 2009b). The sole exception to this robust pattern of results was observed for attention-deficit/hyperactivity disorder (ADHD), which appeared to be…

  3. From Okra to Oak: Reforestation of Abandoned Agricultural Fields in the Lower Mississippi Alluvial Valley

    Treesearch

    Callie Jo Schweitzer; John A. Stanturf

    1997-01-01

    There has been a tremendous upsurge in interest in reforestation of bottomland hardwoods. In the lower Mississippi alluvial valley, reforestation projects are occurring on a large scale on abandoned agricultural fields, often in conjunction with state or federal cost-share programs. This paper describes some of the cost share programs used to establish bottomland...

  4. The Year of the Solar System: An E/PO Community's Approach to Sharing Planetary Science

    NASA Astrophysics Data System (ADS)

    Shipp, S. S.; Boonstra, D.; Shupla, C.; Dalton, H.; Scalice, D.; Planetary Science E/Po Community

    2010-12-01

    YSS offers the opportunity to raise awareness, build excitement, and make connections with educators, students and the public about planetary science activities. The planetary science education and public outreach (E/PO) community is engaging and educating their audiences through ongoing mission and program activities. Based on discussion with partners, the community is presenting its products in the context of monthly thematic topics that are tied to the big questions of planetary science: how did the Sun’s family of planets and bodies originate and how have they evolved; and how did life begin and evolve on Earth, has it evolved elsewhere in our solar system, and what are the characteristics that lead to the origins of life? Each month explores different compelling aspects of the solar system - its formation, volcanism, ice, life. Resources, activities, and events are interwoven in thematic context, and presented with ideas through which formal and informal educators can engage their audiences. The month-to-month themes place the big questions in a logical sequence of deepening learning experiences - and highlight mission milestones and viewing events. YSS encourages active participation and communication with its audiences. It includes nation-wide activities, such as a Walk Through the Solar System, held from October 2010 to March 2011, in which museums, libraries, science centers, schools, planetariums, amateur astronomers, and others are kicking off YSS by creating their own scale models of the solar system and sharing their events through online posting of pictures, video, and stories. YSS offers the E/PO community the opportunity to collaborate with one another and with partners. The thematic approach leverages existing products, providing a home and allowing a “shelf life” that can outlast individual projects and missions. The broad themes highlight missions and programs multiple times. YSS also leverages existing online resources and social media. Hosted on the popular and long-lived Solar System Exploration website (http://solarsystem.nasa.gov/yss), multiple points of entry lead to YSS, ensuring sustained accessibility of thematic topics. Likewise, YSS is being shared through social media avenues of existing missions and programs, reaching a large audience without investment in building a fan-base on YSS-specific social media conduits. Create and share your own YSS event with the tools and resources offered on the website. Join the celebration!

  5. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    NASA Astrophysics Data System (ADS)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and ground water) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population increases and economic developments would intensively affect the availability of water resources for agricultural production. While many studies assessed the impacts of climate change on agriculture, there are few studies that dynamically account for changes in water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Irrigation management in response to subseasonal variability in weather and crop growth also varies by region and crop. To deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimations. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model was based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoff simulated by the land surface sub-model was input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, was input as irrigation water to the land surface sub-model. The timing and amount of irrigation water were simulated at a daily step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields. To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of climate change on crop productivity in a watershed. The first was carried out by the large-scale crop model alone. The second was carried out by the integrated model of the large-scale crop model and the H08 model. The former projected that changes in temperature and precipitation due to future climate change would increase the water stress in crops. Nevertheless, the latter projected that the increasing amount of agricultural water resources in the watershed would supply a sufficient amount of water for irrigation and consequently reduce the water stress. The integrated model demonstrated the importance of taking into account the water circulation in the watershed when predicting regional crop production.
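
    The daily coupling between the regional water-resources side and the crop side can be pictured with the toy loop below: rainfall is split into runoff and soil water, runoff feeds the regional supply, irrigation withdrawals are limited by that supply, and crop growth responds to the resulting water stress. Every equation and parameter value here is a placeholder chosen for illustration, not the authors' model.

        # Toy daily coupling between a water-balance component and a crop component.
        # All values and equations are illustrative placeholders.

        def simulate_season(rainfall_mm, target_soil_mm=60.0, runoff_frac=0.3):
            soil_mm, river_mm, biomass = target_soil_mm, 0.0, 0.0
            for rain in rainfall_mm:                     # one iteration per day
                runoff = runoff_frac * rain              # land surface: split rainfall
                soil_mm += rain - runoff
                river_mm += runoff                       # regional water resources
                deficit = max(0.0, target_soil_mm - soil_mm)
                irrigation = min(deficit, river_mm)      # withdrawal limited by supply
                river_mm -= irrigation
                soil_mm += irrigation
                stress = max(0.0, 1.0 - soil_mm / target_soil_mm)
                biomass += 1.0 - stress                  # growth reduced by water stress
                soil_mm = max(0.0, soil_mm - 5.0)        # daily evapotranspiration
            return biomass, river_mm

        print(simulate_season([0, 12, 0, 0, 30, 0, 0, 5, 0, 0]))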

  6. When the globe is your classroom: teaching and learning about large-scale environmental change online

    NASA Astrophysics Data System (ADS)

    Howard, E. A.; Coleman, K. J.; Barford, C. L.; Kucharik, C.; Foley, J. A.

    2005-12-01

    Understanding environmental problems that cross physical and disciplinary boundaries requires a more holistic view of the world - a "systems" approach. Yet it is a challenge for many learners to start thinking this way, particularly when the problems are large in scale and not easily visible. We will describe our online university course, "Humans and the Changing Biosphere," which takes a whole-systems perspective for teaching regional to global-scale environmental science concepts, including climate, hydrology, ecology, and human demographics. We will share our syllabus and learning objectives and summarize our efforts to incorporate "best" practices for online teaching. We will describe challenges we have faced, and our efforts to reach different learner types. Our goals for this presentation are: (1) to communicate how a systems approach ties together environmental sciences (including climate, hydrology, ecology, biogeochemistry, and demography) that are often taught as separate disciplines; (2) to generate discussion about challenges of teaching large-scale environmental processes; (3) to share our experiences in teaching these topics online; (4) to receive ideas and feedback on future teaching strategies. We will explain why we developed this course online, and share our experiences about benefits and challenges of teaching over the web - including some suggestions about how to use technology to supplement face-to-face learning experiences (and vice versa). We will summarize assessment data about what students learned during the course, and discuss key misconceptions and barriers to learning. We will highlight the role of an online discussion board in creating classroom community, identifying misconceptions, and engaging different types of learners.

  7. Large transboundary watersheds: Climate, water and streams of thought

    NASA Astrophysics Data System (ADS)

    Pulwarty, R. S.

    2001-05-01

    Water is a "fugitive" resource in the sense that it flows naturally from one place to another, from one reserve to another (e.g., groundwater to surface), and from one physical state (solid, liquid and gas) to another. Thus "trans-boundary" can mean many things including: transitions from wet to arid zones, from upstream to downstream, from one country or province to the next etc. The Convention on the Protection and Use of Transboundary Watercourses and International Lakes (1992) defines "transboundary waters" to mean "any surface or ground waters which mark, cross or are located on the boundaries between two or more states". Emerging issues in water resources emanate from three categories of problems; (1) transboundary water availability; (2) transboundary groundwater allocation, management, and conservation; and (3) transboundary water quality. Transboundary fluctuations and changes in river flow can be attributed to (1) climate variations and change on several timescales, and, (2) physical and biological transformations of basin hydrology including increased storage, diversions, and landscape changes. Researchers and practitioners have identified numerous factors underlying international disputes involving river flows, including: the variability and uncertainty of supply, interdependencies among users, increasing over-allocation and rising costs, the increasing vulnerability of water quality and aquatic ecosystems to human activities, ways and means of supplying safe water facilities, and the mobilization of financial resources for water development and management. Many of these issues derive from general concerns in water resources management. How these concerns are met is strongly shaped by the choice of the spatial unit within which studies and management actions are conducted, by the way problems have been defined and changed over time, and by who benefits from defining problems in a particular way. In the following discussion the scales of human activities and interactions with large river basins are put in the context of streamflow changes on the time scales of century, decadal, seasonal and extreme events. These conditioning factors on flow variability and change are discussed in general. Three basins, the Nile, the Colorado, and the Parana-Paraguay River systems, are then selected for detailed illustration. While governing institutions that more closely correspond with the physical water system can help to assure appropriate consideration of efficiency and equity, domestic policy can pose major institutional barriers to international agreements and management across national borders. Ultimately, the main tasks in the foreseeable future will be how to share common but variable water resources in a catchment area between upstream and downstream users, between various sectors, between rural and urban areas, between preservation of functioning ecosystems and more direct tangible needs. Engaging the many dimensions of transboundary river flow requires, more than ever, the need to understand these "regions" as integrators of social, cultural, climatic, economic, and ecological histories and networks, that help to shape shared community interests and values.

  8. Energy and material flows of megacities

    PubMed Central

    Kennedy, Christopher A.; Stewart, Iain; Facchini, Angelo; Cersosimo, Igor; Mele, Renata; Chen, Bin; Uda, Mariko; Kansal, Arun; Chiu, Anthony; Kim, Kwi-gon; Dubeux, Carolina; Lebre La Rovere, Emilio; Cunha, Bruno; Pincetl, Stephanie; Keirstead, James; Barles, Sabine; Pusaka, Semerdanta; Gunawan, Juniati; Adegbile, Michael; Nazariha, Mehrdad; Hoque, Shamsul; Marcotullio, Peter J.; González Otharán, Florencia; Genena, Tarek; Ibrahim, Nadine; Farooqui, Rizwan; Cervantes, Gemma; Sahin, Ahmet Duran

    2015-01-01

    Understanding the drivers of energy and material flows of cities is important for addressing global environmental challenges. Accessing, sharing, and managing energy and material resources is particularly critical for megacities, which face enormous social stresses because of their sheer size and complexity. Here we quantify the energy and material flows through the world’s 27 megacities with populations greater than 10 million people as of 2010. Collectively the resource flows through megacities are largely consistent with scaling laws established in the emerging science of cities. Correlations are established for electricity consumption, heating and industrial fuel use, ground transportation energy use, water consumption, waste generation, and steel production in terms of heating-degree-days, urban form, economic activity, and population growth. The results help identify megacities exhibiting high and low levels of consumption and those making efficient use of resources. The correlation between per capita electricity use and urbanized area per capita is shown to be a consequence of gross building floor area per capita, which is found to increase for lower-density cities. Many of the megacities are growing rapidly in population but are growing even faster in terms of gross domestic product (GDP) and energy use. In the decade from 2001–2011, electricity use and ground transportation fuel use in megacities grew at approximately half the rate of GDP growth. PMID:25918371
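
    The "scaling laws established in the emerging science of cities" referred to above are commonly written in the power-law form below; the exponents quoted are typical literature values given for context, not results of this study.

        \[
        Y = Y_0\, N^{\beta},
        \]

    where Y is a resource flow or output (e.g., energy use or GDP), N the city population, and \beta the scaling exponent; infrastructure-like quantities typically scale sublinearly (\beta \approx 0.85), socioeconomic outputs superlinearly (\beta \approx 1.15), and individual material consumption roughly linearly (\beta \approx 1).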

  9. Energy and material flows of megacities.

    PubMed

    Kennedy, Christopher A; Stewart, Iain; Facchini, Angelo; Cersosimo, Igor; Mele, Renata; Chen, Bin; Uda, Mariko; Kansal, Arun; Chiu, Anthony; Kim, Kwi-Gon; Dubeux, Carolina; Lebre La Rovere, Emilio; Cunha, Bruno; Pincetl, Stephanie; Keirstead, James; Barles, Sabine; Pusaka, Semerdanta; Gunawan, Juniati; Adegbile, Michael; Nazariha, Mehrdad; Hoque, Shamsul; Marcotullio, Peter J; González Otharán, Florencia; Genena, Tarek; Ibrahim, Nadine; Farooqui, Rizwan; Cervantes, Gemma; Sahin, Ahmet Duran

    2015-05-12

    Understanding the drivers of energy and material flows of cities is important for addressing global environmental challenges. Accessing, sharing, and managing energy and material resources is particularly critical for megacities, which face enormous social stresses because of their sheer size and complexity. Here we quantify the energy and material flows through the world's 27 megacities with populations greater than 10 million people as of 2010. Collectively the resource flows through megacities are largely consistent with scaling laws established in the emerging science of cities. Correlations are established for electricity consumption, heating and industrial fuel use, ground transportation energy use, water consumption, waste generation, and steel production in terms of heating-degree-days, urban form, economic activity, and population growth. The results help identify megacities exhibiting high and low levels of consumption and those making efficient use of resources. The correlation between per capita electricity use and urbanized area per capita is shown to be a consequence of gross building floor area per capita, which is found to increase for lower-density cities. Many of the megacities are growing rapidly in population but are growing even faster in terms of gross domestic product (GDP) and energy use. In the decade from 2001-2011, electricity use and ground transportation fuel use in megacities grew at approximately half the rate of GDP growth.

  10. Smart Data Infrastructure: The Sixth Generation of Mediation for Data Science

    NASA Astrophysics Data System (ADS)

    Fox, P. A.

    2014-12-01

    In the emergent "fourth paradigm" (data-driven) science, the scientific method is enhanced by the integration of significant data sources into the practice of scientific research. To address Big Science, there are challenges in understanding the role of data in enabling researchers to attack not just disciplinary issues, but also the system-level, large-scale, and transdisciplinary global scientific challenges facing society. Recognizing that the volume of data is only one of many dimensions to be considered, there is a clear need for improved data infrastructures to mediate data and information exchange, which we contend will need to be powered by semantic technologies. One clear need is to provide computational approaches for researchers to discover appropriate data resources, rapidly integrate data collections from heterogeneous resources or multiple data sets, and inter-compare results to allow generation and validation of hypotheses. Another trend is toward automated tools that allow researchers to better find and reuse data that they currently don't know they need, let alone know how to find. Again semantic technologies will be required. Finally, to turn data analytics from "art to science", technical solutions are needed for cross-dataset validation, reproducibility studies on data-driven results, and the concomitant citation of data products allowing recognition for those who curate and share important data resources.
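
    One concrete way semantic technologies mediate dataset discovery is to describe resources with a shared vocabulary such as DCAT and query those descriptions with SPARQL. The miniature catalogue below is invented purely for illustration and is not tied to any particular infrastructure discussed in the abstract.

        # Minimal illustration of semantically mediated data discovery with rdflib.
        # The catalogue content is invented for illustration.
        from rdflib import Graph

        catalog_ttl = """
        @prefix dcat: <http://www.w3.org/ns/dcat#> .
        @prefix dct:  <http://purl.org/dc/terms/> .
        <urn:ds:1> a dcat:Dataset ; dct:title "Streamflow, Basin A" ; dcat:keyword "hydrology" .
        <urn:ds:2> a dcat:Dataset ; dct:title "Crop yields, Region B" ; dcat:keyword "agriculture" .
        """

        g = Graph()
        g.parse(data=catalog_ttl, format="turtle")

        # Find datasets tagged with a keyword of interest.
        query = """
        PREFIX dcat: <http://www.w3.org/ns/dcat#>
        PREFIX dct:  <http://purl.org/dc/terms/>
        SELECT ?ds ?title WHERE {
          ?ds a dcat:Dataset ; dct:title ?title ; dcat:keyword "hydrology" .
        }
        """
        for row in g.query(query):
            print(row.ds, row.title)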

  11. Importance of balanced architectures in the design of high-performance imaging systems

    NASA Astrophysics Data System (ADS)

    Sgro, Joseph A.; Stanton, Paul C.

    1999-03-01

    Imaging systems employed in demanding military and industrial applications, such as automatic target recognition and computer vision, typically require real-time high-performance computing resources. While high-performance computing systems have traditionally relied on proprietary architectures and custom components, recent advances in high-performance general-purpose microprocessor technology have produced an abundance of low-cost components suitable for use in high-performance computing systems. A common pitfall in the design of high-performance imaging systems, particularly systems employing scalable multiprocessor architectures, is the failure to balance computational and memory bandwidth. The performance of standard cluster designs, for example, in which several processors share a common memory bus, is typically constrained by memory bandwidth. The symptom characteristic of this problem is the failure of system performance to scale as more processors are added. The problem is exacerbated if I/O and memory functions share the same bus. The recent introduction of microprocessors with large internal caches and high-performance external memory interfaces makes it practical to design high-performance imaging systems with balanced computational and memory bandwidth. Real-world examples of such designs will be presented, along with a discussion of adapting algorithm design to best utilize available memory bandwidth.
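
    The compute/memory balance the abstract emphasizes can be checked with back-of-the-envelope arithmetic: compare a kernel's arithmetic intensity (FLOPs per byte moved) against the machine's ratio of peak compute to memory bandwidth. The numbers below are illustrative assumptions, not measurements of any system discussed.

        # Roofline-style balance check for an image-processing kernel.
        # All hardware numbers are illustrative assumptions.

        peak_gflops = 200.0      # assumed aggregate peak compute, GFLOP/s
        mem_bw_gbs  = 25.0       # assumed sustained memory bandwidth, GB/s

        # Example kernel: 3x3 convolution on 32-bit pixels.
        flops_per_pixel = 9 * 2           # 9 multiply-adds per output pixel
        bytes_per_pixel = 2 * 4           # read input + write output (ignoring reuse)
        intensity = flops_per_pixel / bytes_per_pixel      # FLOPs per byte of traffic

        machine_balance = peak_gflops / mem_bw_gbs         # FLOPs per byte the machine can feed
        attainable_gflops = min(peak_gflops, intensity * mem_bw_gbs)

        print(f"kernel intensity      : {intensity:.2f} FLOP/byte")
        print(f"machine balance point : {machine_balance:.2f} FLOP/byte")
        print("memory-bound" if intensity < machine_balance else "compute-bound",
              f"-> attainable ~{attainable_gflops:.0f} GFLOP/s")

    A kernel whose intensity falls below the machine balance point is limited by memory bandwidth, which is exactly the non-scaling symptom the abstract describes for bus-shared cluster designs.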

  12. Shared resources : sharing right-of-way for telecommunications : identification, review and analysis of legal and institutional issues

    DOT National Transportation Integrated Search

    1996-04-01

    This report presents the results of research on the institutional and non-technical issues related to shared resource projects. Shared resource projects are a particular form of public-private partnering that may help public agencies underwrite their...

  13. Fair Shares and Sharing Fairly: A Survey of Public Views on Open Science, Informed Consent and Participatory Research in Biobanking.

    PubMed

    Joly, Yann; Dalpé, Gratien; So, Derek; Birko, Stanislav

    2015-01-01

    Biobanks are important resources which enable large-scale genomic research with human samples and data, raising significant ethical concerns about how participants' information is managed and shared. Three previous studies of the Canadian public's opinion about these topics have been conducted. Building on those results, an online survey representing the first study of public perceptions about biobanking spanning all Canadian provinces was conducted. Specifically, this study examined qualitative views about biobank objectives, governance structure, control and ownership of samples and data, benefit sharing, consent practices and data sharing norms, as well as additional questions and ethical concerns expressed by the public. Over half the respondents preferred to give a one-time general consent for the future sharing of their samples among researchers. Most expressed willingness for their data to be shared with the international scientific community rather than used by one or more Canadian institutions. Whereas more respondents indicated a preference for one-time general consent than any other model of consent, they constituted less than half of the total responses, revealing a lack of consensus among survey respondents regarding this question. Respondents identified biobank objectives, governance structure and accountability as the most important information to provide participants. Respondents' concerns about biobanking generally centred around the control and ownership of biological samples and data, especially with respect to potential misuse by insurers, the government and other third parties. Although almost half the respondents suggested that these should be managed by the researchers' institutions, results indicate that the public is interested in being well-informed about these projects and suggest the importance of increased involvement from participants. In conclusion, the study discusses the viability of several proposed models for informed consent, including e-governance, independent trustees and the use of exclusion clauses, in the context of these new findings about the views of the Canadian public.

  14. Fair Shares and Sharing Fairly: A Survey of Public Views on Open Science, Informed Consent and Participatory Research in Biobanking

    PubMed Central

    Joly, Yann; Dalpé, Gratien; So, Derek; Birko, Stanislav

    2015-01-01

    Context Biobanks are important resources which enable large-scale genomic research with human samples and data, raising significant ethical concerns about how participants’ information is managed and shared. Three previous studies of the Canadian public’s opinion about these topics have been conducted. Building on those results, an online survey representing the first study of public perceptions about biobanking spanning all Canadian provinces was conducted. Specifically, this study examined qualitative views about biobank objectives, governance structure, control and ownership of samples and data, benefit sharing, consent practices and data sharing norms, as well as additional questions and ethical concerns expressed by the public. Results Over half the respondents preferred to give a one-time general consent for the future sharing of their samples among researchers. Most expressed willingness for their data to be shared with the international scientific community rather than used by one or more Canadian institutions. Whereas more respondents indicated a preference for one-time general consent than any other model of consent, they constituted less than half of the total responses, revealing a lack of consensus among survey respondents regarding this question. Respondents identified biobank objectives, governance structure and accountability as the most important information to provide participants. Respondents’ concerns about biobanking generally centred around the control and ownership of biological samples and data, especially with respect to potential misuse by insurers, the government and other third parties. Although almost half the respondents suggested that these should be managed by the researchers’ institutions, results indicate that the public is interested in being well-informed about these projects and suggest the importance of increased involvement from participants. In conclusion, the study discusses the viability of several proposed models for informed consent, including e-governance, independent trustees and the use of exclusion clauses, in the context of these new findings about the views of the Canadian public. PMID:26154134

  15. On the Cross-Country Comparability of Indicators of Socioeconomic Resources in PISA

    ERIC Educational Resources Information Center

    Pokropek, Artur; Borgonovi, Francesca; McCormick, Carina

    2017-01-01

    Large-scale international assessments rely on indicators of the resources that students report having in their homes to capture the financial capital of their families. The scaling methodology currently used to develop the Programme for International Student Assessment (PISA) background indices is designed to maximize within-country comparability…

  16. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  17. Risk and the evolution of human exchange

    PubMed Central

    Kaplan, Hillard S.; Schniter, Eric; Smith, Vernon L.; Wilson, Bart J.

    2012-01-01

    Compared with other species, exchange among non-kin is a hallmark of human sociality in both the breadth of individuals and total resources involved. One hypothesis is that extensive exchange evolved to buffer the risks associated with hominid dietary specialization on calorie dense, large packages, especially from hunting. ‘Lucky’ individuals share food with ‘unlucky’ individuals with the expectation of reciprocity when roles are reversed. Cross-cultural data provide prima facie evidence of pair-wise reciprocity and an almost universal association of high-variance (HV) resources with greater exchange. However, such evidence is not definitive; an alternative hypothesis is that food sharing is really ‘tolerated theft’, in which individuals possessing more food allow others to steal from them, owing to the threat of violence from hungry individuals. Pair-wise correlations may reflect proximity providing greater opportunities for mutual theft of food. We report a laboratory experiment of foraging and food consumption in a virtual world, designed to test the risk-reduction hypothesis by determining whether people form reciprocal relationships in response to variance of resource acquisition, even when there is no external enforcement of any transfer agreements that might emerge. Individuals can forage in a high-mean, HV patch or a low-mean, low-variance (LV) patch. The key feature of the experimental design is that individuals can transfer resources to others. We find that sharing hardly occurs after LV foraging, but among HV foragers sharing increases dramatically over time. The results provide strong support for the hypothesis that people are pre-disposed to evaluate gains from exchange and respond to unsynchronized variance in resource availability through endogenous reciprocal trading relationships. PMID:22513855
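
    The risk-reduction logic can be made concrete with a toy simulation: pooling harvests between two foragers in a high-variance patch leaves expected intake unchanged but reduces its variance. The payoff distribution below is an arbitrary illustrative choice, not the experiment's actual parameters.

        # Toy illustration of variance reduction from reciprocal sharing.
        import random
        import statistics

        random.seed(0)
        DAYS = 10000

        def hv_harvest():
            # "Lucky" with probability 0.2 (large package), otherwise nothing.
            return 50.0 if random.random() < 0.2 else 0.0

        solo, shared = [], []
        for _ in range(DAYS):
            a, b = hv_harvest(), hv_harvest()
            solo.append(a)                  # forager A keeps her own harvest
            shared.append((a + b) / 2.0)    # A and B pool and split equally

        print("mean  solo/shared:", round(statistics.mean(solo), 2),
              round(statistics.mean(shared), 2))
        print("stdev solo/shared:", round(statistics.stdev(solo), 2),
              round(statistics.stdev(shared), 2))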

  18. MiMiR – an integrated platform for microarray data sharing, mining and analysis

    PubMed Central

    Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence

    2008-01-01

    Background Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large scale mining of experimental and clinical data. Results A user friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures the easy access and high accuracy of meta-data collected. Experiments are programmatically built in the MiMiR database from the submitted information and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open source analysis web portal or via export of data and meta-data into Rosetta Resolver data analysis package. Conclusion The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies. PMID:18801157

  19. MiMiR--an integrated platform for microarray data sharing, mining and analysis.

    PubMed

    Tomlinson, Chris; Thimma, Manjula; Alexandrakis, Stelios; Castillo, Tito; Dennis, Jayne L; Brooks, Anthony; Bradley, Thomas; Turnbull, Carly; Blaveri, Ekaterini; Barton, Geraint; Chiba, Norie; Maratou, Klio; Soutter, Pat; Aitman, Tim; Game, Laurence

    2008-09-18

    Despite considerable efforts within the microarray community for standardising data format, content and description, microarray technologies present major challenges in managing, sharing, analysing and re-using the large amount of data generated locally or internationally. Additionally, it is recognised that inconsistent and low quality experimental annotation in public data repositories significantly compromises the re-use of microarray data for meta-analysis. MiMiR, the Microarray data Mining Resource was designed to tackle some of these limitations and challenges. Here we present new software components and enhancements to the original infrastructure that increase accessibility, utility and opportunities for large scale mining of experimental and clinical data. A user friendly Online Annotation Tool allows researchers to submit detailed experimental information via the web at the time of data generation rather than at the time of publication. This ensures the easy access and high accuracy of meta-data collected. Experiments are programmatically built in the MiMiR database from the submitted information and details are systematically curated and further annotated by a team of trained annotators using a new Curation and Annotation Tool. Clinical information can be annotated and coded with a clinical Data Mapping Tool within an appropriate ethical framework. Users can visualise experimental annotation, assess data quality, download and share data via a web-based experiment browser called MiMiR Online. All requests to access data in MiMiR are routed through a sophisticated middleware security layer thereby allowing secure data access and sharing amongst MiMiR registered users prior to publication. Data in MiMiR can be mined and analysed using the integrated EMAAS open source analysis web portal or via export of data and meta-data into Rosetta Resolver data analysis package. The new MiMiR suite of software enables systematic and effective capture of extensive experimental and clinical information with the highest MIAME score, and secure data sharing prior to publication. MiMiR currently contains more than 150 experiments corresponding to over 3000 hybridisations and supports the Microarray Centre's large microarray user community and two international consortia. The MiMiR flexible and scalable hardware and software architecture enables secure warehousing of thousands of datasets, including clinical studies, from microarray and potentially other -omics technologies.

  20. The Alzheimerization of Aging.

    ERIC Educational Resources Information Center

    Adelman, Richard C.

    1995-01-01

    The National Institute on Aging (NIA) invests a disproportionately large share of its resources in research on Alzheimer's Disease at the expense of other interests of the broader scientific community in gerontology. Complex social forces that continue to shape this outcome embrace discipline-specific traditions of science advocacy, as well as…

  1. Perceived Benefits, Harms, and Views About How to Share Data Responsibly: A Qualitative Study of Experiences With and Attitudes Toward Data Sharing Among Research Staff and Community Representatives in Thailand.

    PubMed

    Cheah, Phaik Yeong; Tangseefa, Decha; Somsaman, Aimatcha; Chunsuttiwat, Tri; Nosten, François; Day, Nicholas P J; Bull, Susan; Parker, Michael

    2015-07-01

    The Thailand Major Overseas Programme coordinates large multi-center studies in tropical medicine and generates vast amounts of data. As the data sharing movement gains momentum, we wanted to understand attitudes and experiences of relevant stakeholders about what constitutes good data sharing practice. We conducted 15 interviews and three focus groups discussions involving 25 participants and found that they generally saw data sharing as something positive. Data sharing was viewed as a means to contribute to scientific progress and lead to better quality analysis, better use of resources, greater accountability, and more outputs. However, there were also important reservations including potential harms to research participants, their communities, and the researchers themselves. Given these concerns, several areas for discussion were identified: data standardization, appropriate consent models, and governance. © The Author(s) 2015.

  2. eBird—Using citizen-science data to help solve real-world conservation challenges (Invited)

    NASA Astrophysics Data System (ADS)

    Sullivan, B. L.; Iliff, M. J.; Wood, C. L.; Fink, D.; Kelling, S.

    2010-12-01

    eBird (www.ebird.org) is an Internet-based citizen-science project that collects bird observations worldwide. eBird is foremost a tool for birders, providing users with a resource for bird information and a way to keep track of their personal bird lists, thus establishing a model for sustained participation and new project growth. Importantly, eBird data are shared with scientists and conservationists working to save birds and their habitats. Here we highlight two different ways these data are used: as a real-time data gathering and visualization tool; and as the primary resource for developing large-scale bird distribution models that explore species-habitat associations and climate change scenarios. eBird provides data across broad temporal and spatial scales, and is a valuable tool for documenting and monitoring bird populations facing a multitude of anthropogenic and environmental impacts. For example, a focused effort to monitor birds on Gulf Coast beaches using eBird is providing essential baseline data and enabling long-term monitoring of bird populations throughout the region. Additionally, new data visualization tools that incorporate data from eBird, NOAA, and Google, are specifically designed to highlight the potential impacts of the Gulf oil spill on bird populations. Through a collaboration of partners in the DataONE network, such as the Oak Ridge National Laboratory, we will use supercomputing time from the National Science Foundation’s TeraGrid to allow Lab scientists to model bird migration phenology at the population level based on eBird data. The process involves combining bird observations with remotely sensed variables such as landcover and greening index to predict bird movements. Preliminary results of these models allow us to animate bird movements across large spatial scales, and to explore how migration timing might be affected under different climate change scenarios.
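
    The modelling step described above (relating bird observations to remotely sensed covariates such as land cover and a greening index) can be illustrated with a minimal, hypothetical sketch. The synthetic data, covariate names, and the choice of logistic regression below are illustrative assumptions only; they are not the eBird modelling pipeline.

    ```python
    # Minimal sketch (not the eBird pipeline): predicting occurrence probability
    # from remotely sensed covariates with logistic regression.
    # All data below are synthetic placeholders.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 5000

    # Hypothetical covariates per checklist location: a greening index (NDVI-like),
    # fraction of forest cover, and day of year (a migration-timing proxy).
    greening = rng.uniform(0.0, 1.0, n)
    forest = rng.uniform(0.0, 1.0, n)
    doy = rng.integers(1, 366, n)

    # Synthetic "observed" presence/absence driven by the covariates.
    logit = -2.0 + 3.0 * greening + 1.5 * forest - 0.005 * np.abs(doy - 140)
    presence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([greening, forest, doy])
    model = LogisticRegression(max_iter=1000).fit(X, presence)

    # Predicted occurrence probability for a new grid cell at day-of-year 140.
    print(model.predict_proba([[0.8, 0.6, 140]])[0, 1])
    ```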

  3. Why a regional approach to postgraduate water education makes sense - the WaterNet experience in Southern Africa

    NASA Astrophysics Data System (ADS)

    Jonker, L.; van der Zaag, P.; Gumbo, B.; Rockström, J.; Love, D.; Savenije, H. H. G.

    2012-03-01

    This paper reports the experience of a regional network of academic departments involved in water education that started as a project and evolved, over a period of 12 yr, into an independent network organisation. The paper pursues three objectives. First, it argues that it makes good sense to organise postgraduate education and research on water resources on a regional scale. This is because water has a transboundary dimension that poses delicate sharing questions, an approach that promotes a common understanding of what the real water-related issues are, results in future water specialists speaking a common (water) language, enhances mutual respect, and can thus be considered an investment in future peace. Second, it presents the WaterNet experience as an example that a regional approach can work and has an impact. Third, it draws three generalised lessons from the WaterNet experience. Lesson 1: For a regional capacity building network to be effective, it must have a legitimate ownership structure and a clear mandate. Lesson 2: Organising water-related training opportunities at a regional and transboundary scale makes sense - not only because knowledge resources are scattered, but also because the topic - water - has a regional and transboundary scope. Lesson 3: Jointly developing educational programmes by sharing expertise and resources requires intense intellectual management and sufficient financial means.

  4. Synchronization of Finite State Shared Resources

    DTIC Science & Technology

    1976-03-01

    Synchronization of Finite State Shared Resources. Edward A. Schneider, Department of Computer… Abstract: The problem of synchronizing a set of operations defined on a shared resource…

  5. Social Networking Adapted for Distributed Scientific Collaboration

    NASA Technical Reports Server (NTRS)

    Karimabadi, Homa

    2012-01-01

    Sci-Share is a social networking site with novel, specially designed feature sets to enable simultaneous remote collaboration and sharing of large data sets among scientists. The site will include not only the standard features found on popular consumer-oriented social networking sites such as Facebook and Myspace, but also a number of powerful tools to extend its functionality to a science collaboration site. A Virtual Observatory is a promising technology for making data accessible from various missions and instruments through a Web browser. Sci-Share augments services provided by Virtual Observatories by enabling distributed collaboration and sharing of downloaded and/or processed data among scientists. This will, in turn, increase science returns from NASA missions. Sci-Share also enables better utilization of NASA's high-performance computing resources by providing an easy and central mechanism to access and share large files in users' space or those saved on mass storage. The most common means of remote scientific collaboration today remains the trio of e-mail for electronic communication, FTP for file sharing, and personalized Web sites for dissemination of papers and research results. Each of these tools has well-known limitations. Sci-Share transforms the social networking paradigm into a scientific collaboration environment by offering powerful tools for cooperative discourse and digital content sharing. Sci-Share differentiates itself by serving as an online repository for users' digital content with the following unique features: a) sharing of any file type, any size, from anywhere; b) creation of projects and groups for controlled sharing; c) a module for sharing files on HPC (High Performance Computing) sites; d) universal accessibility of staged files as embedded links on other sites (e.g. Facebook) and tools (e.g. e-mail); e) drag-and-drop transfer of large files, replacing awkward e-mail attachments (and file size limitations); f) enterprise-level data and messaging encryption; and g) an easy-to-use, intuitive workflow.

  6. Organization and scaling in water supply networks

    NASA Astrophysics Data System (ADS)

    Cheng, Likwan; Karney, Bryan W.

    2017-12-01

    Public water supply is one of society's most vital resources and most costly infrastructures. Traditional concepts of these networks capture their engineering identity as isolated, deterministic hydraulic units, but overlook their physics identity as related entities in a probabilistic, geographic ensemble, characterized by size organization and property scaling. Although discoveries of allometric scaling in natural supply networks (organisms and rivers) raised the prospect of similar findings in anthropogenic supplies, so far such a finding has not been reported in public water or related civic resource supplies. Examining an empirical ensemble of large number and wide size range, we show that water supply networks possess self-organized size abundance and theory-explained allometric scaling in spatial, infrastructural, and resource- and emission-flow properties. These discoveries establish scaling physics for water supply networks and may lead to novel applications in resource- and jurisdiction-scale water governance.
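
    Allometric scaling of the kind reported above is conventionally summarised as a power law Y = aX^b, with the exponent b estimated by ordinary least squares in log-log space. The short sketch below illustrates only that estimation step; the variable names and values are synthetic placeholders, not the empirical water-supply-network ensemble.

    ```python
    # Minimal sketch of estimating an allometric scaling exponent b in
    # Y = a * X**b by linear regression in log-log space. The data here are
    # synthetic placeholders, not the water-supply-network ensemble.
    import numpy as np

    rng = np.random.default_rng(1)
    size = rng.lognormal(mean=10.0, sigma=1.5, size=500)     # e.g. population served
    flow = 2.0 * size**0.85 * rng.lognormal(0.0, 0.1, 500)   # e.g. resource flow

    b, log_a = np.polyfit(np.log(size), np.log(flow), 1)
    print(f"estimated exponent b = {b:.3f}, prefactor a = {np.exp(log_a):.3f}")
    ```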

  7. The Dynamics of a Semi-Arid Region in Response to Climate and Water - Use Policy

    NASA Technical Reports Server (NTRS)

    Mustard, John F.; Hamburg, Steve; Grant, John A.; Manning, Sara J.; Steinwand, Aaron; Howard, Chris

    2000-01-01

    The objectives of this project were to determine the response of semi-arid ecosystems to the combined forcings of climate variability and anthropogenic stress. Arid and semi-arid systems encompass close to 40% of the world's land surface. The ecology of these regions is principally limited by water, and as the water resources wax and wane, so should the health and vigor of the ecosystems. Water, however, is a necessary and critical resource for humans living in these same regions. Thus, for many arid and semi-arid regions the natural systems and human systems are in direct competition for a limited resource. Increasing competition through development of arid and semi-arid regions, export of water resources, and potential persistent changes in weather patterns are likely to lead to fundamental changes in the carrying capacity, resilience, and ecology of these regions. A detailed understanding of how these systems respond to forcing on regional and local scales is required in order to better prepare for and manage future changes in the availability of water. In the Owens Valley, CA, decadal changes in rainfall and increased use of groundwater resources by Los Angeles (which derives 60-70% of its water from this region) have resulted in a large-scale experiment on the impacts of these changes on semi-arid ecosystems. This project works directly with the Inyo County Water Department (the local water authority) and the Los Angeles Department of Water and Power (the regional demand on water resources) to understand changes, their causes, and impacts. Very detailed records have been kept for a number of selected sites in the valley, which provide essential ground truth. These results are then scaled up through remotely sensed data to the regional scale to assess large-scale patterns and link them to the fundamental decisions regarding the water resources of this region. A fundamental goal is to understand how resilient the native ecosystems are to large changes in water resources: do they behave like a spring (when resources are removed and then returned, the systems return to their original state) or like a vector (when water returns, the systems have fundamentally changed)?

  8. Generation of Earth's First-Order Biodiversity Pattern

    NASA Astrophysics Data System (ADS)

    Krug, Andrew Z.; Jablonski, David; Valentine, James W.; Roy, Kaustuv

    2009-02-01

    The first-order biodiversity pattern on Earth today and at least as far back as the Paleozoic is the latitudinal diversity gradient (LDG), a decrease in richness of species and higher taxa from the equator to the poles. LDGs are produced by geographic trends in origination, extinction, and dispersal over evolutionary timescales, so that analyses of static patterns will be insufficient to reveal underlying processes. The fossil record of marine bivalve genera, a model system for the analysis of biodiversity dynamics over large temporal and spatial scales, shows that an origination and range-expansion gradient plays a major role in generating the LDG. Peak origination rates and peak diversities fall within the tropics, with range expansion out of the tropics the predominant spatial dynamic thereafter. The origination-diversity link occurs even in a "contrarian" group whose diversity peaks at midlatitudes, an exception proving the rule that spatial variations in origination are key to latitudinal diversity patterns. Extinction rates are lower in polar latitudes (≥60°) than in temperate zones and thus cannot create the observed gradient alone. They may, however, help to explain why origination and immigration are evidently damped in higher latitudes. We suggest that species require more resources in higher latitudes, for the seasonality of primary productivity increases by more than an order of magnitude from equatorial to polar regions. Higher-latitude species are generalists that, unlike potential immigrants, are adapted to garner the large share of resources required for incumbency in those regions. When resources are opened up by extinctions, lineages spread chiefly poleward and chiefly through speciation.

  9. Generation of Earth's first-order biodiversity pattern.

    PubMed

    Krug, Andrew Z; Jablonski, David; Valentine, James W; Roy, Kaustuv

    2009-01-01

    The first-order biodiversity pattern on Earth today and at least as far back as the Paleozoic is the latitudinal diversity gradient (LDG), a decrease in richness of species and higher taxa from the equator to the poles. LDGs are produced by geographic trends in origination, extinction, and dispersal over evolutionary timescales, so that analyses of static patterns will be insufficient to reveal underlying processes. The fossil record of marine bivalve genera, a model system for the analysis of biodiversity dynamics over large temporal and spatial scales, shows that an origination and range-expansion gradient plays a major role in generating the LDG. Peak origination rates and peak diversities fall within the tropics, with range expansion out of the tropics the predominant spatial dynamic thereafter. The origination-diversity link occurs even in a "contrarian" group whose diversity peaks at midlatitudes, an exception proving the rule that spatial variations in origination are key to latitudinal diversity patterns. Extinction rates are lower in polar latitudes (> or =60 degrees ) than in temperate zones and thus cannot create the observed gradient alone. They may, however, help to explain why origination and immigration are evidently damped in higher latitudes. We suggest that species require more resources in higher latitudes, for the seasonality of primary productivity increases by more than an order of magnitude from equatorial to polar regions. Higher-latitude species are generalists that, unlike potential immigrants, are adapted to garner the large share of resources required for incumbency in those regions. When resources are opened up by extinctions, lineages spread chiefly poleward and chiefly through speciation.

  10. Project BALLOTS: Bibliographic Automation of Large Library Operations Using a Time-Sharing System. Progress Report (3/27/69 - 6/26/69).

    ERIC Educational Resources Information Center

    Veaner, Allen B.

    Project BALLOTS is a large-scale library automation development project of the Stanford University Libraries which has demonstrated the feasibility of conducting on-line interactive searches of complex bibliographic files, with a large number of users working simultaneously in the same or different files. This report documents the continuing…

  11. How much a galaxy knows about its large-scale environment?: An information theoretic perspective

    NASA Astrophysics Data System (ADS)

    Pandey, Biswajit; Sarkar, Suman

    2017-05-01

    The small-scale environment characterized by the local density is known to play a crucial role in determining galaxy properties, but the role of the large-scale environment in galaxy formation and evolution remains less clear. We propose an information theoretic framework to investigate the influence of the large-scale environment on galaxy properties and apply it to data from the Galaxy Zoo project, which provides visual morphological classifications of ˜1 million galaxies from the Sloan Digital Sky Survey. We find a non-zero mutual information between morphology and environment that decreases with increasing length-scale but persists throughout the entire range of length-scales probed. We estimate the conditional mutual information and the interaction information between morphology and environment by conditioning the environment on different length-scales and find a synergistic interaction between them that operates up to length-scales of at least ˜30 h-1 Mpc. Our analysis indicates that these interactions largely arise from the mutual information shared between the environments on different length-scales.
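
    The basic quantity in this framework, the mutual information between a discrete morphology label X and a binned environmental density Y, can be estimated from a joint histogram as I(X;Y) = Σ p(x,y) log[p(x,y)/(p(x)p(y))]. The sketch below does exactly that on synthetic labels; it is illustrative only and uses neither Galaxy Zoo nor SDSS data.

    ```python
    # Minimal sketch of a mutual-information estimate I(X;Y) between a discrete
    # morphology label and a binned environmental density, computed from a joint
    # histogram. Labels and densities below are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 20000
    morphology = rng.integers(0, 2, n)                  # toy labels: 0 = spiral, 1 = elliptical
    density = rng.normal(loc=morphology, scale=2.0)     # toy environment correlated with label

    # Bin the environmental density into 10 quantile bins.
    edges = np.quantile(density, np.linspace(0, 1, 11)[1:-1])
    env_bins = np.digitize(density, edges)

    joint = np.histogram2d(morphology, env_bins, bins=[2, 10])[0]
    p_xy = joint / joint.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)

    nz = p_xy > 0
    mi = np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz]))   # in nats
    print(f"I(morphology; environment) = {mi:.4f} nats")
    ```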

  12. Implementation of a Shared Resource Financial Management System

    PubMed Central

    Caldwell, T.; Gerlach, R.; Israel, M.; Bobin, S.

    2010-01-01

    CF-6 Norris Cotton Cancer Center (NCCC), an NCI-designated Comprehensive Cancer Center at Dartmouth Medical School, administers 12 Life Sciences Shared Resources. These resources are diverse and offer multiple products and services. Previous methods for tracking resource use, billing, and financial management were time consuming, error prone, and lacked appropriate financial management tools. To address these problems, we developed and implemented a web-based application with a built-in authorization system that uses Perl, ModPerl, Apache2, and Oracle as the software infrastructure. The application uses a role-based system to differentiate administrative users from those requesting services and includes many features requested by users and administrators. To begin development, we chose a resource that had an uncomplicated service, a large number of users, and required the use of all of the application's features. The Molecular Biology Core Facility at NCCC fit these requirements and was used as a model for developing and testing the application. After model development, institution-wide deployment followed a three-stage process. The first stage was to interview the resource manager and staff to understand day-to-day operations. At the second stage, we generated and tested customized forms defining resource services. During the third stage, we added new resource users and administrators to the system before final deployment. Twelve months after deployment, resource administrators reported that the new system performed well for internal and external billing and tracking resource utilization. Users preferred the application's web-based system for distribution of DNA sequencing and other data. The sample tracking features have enhanced day-to-day resource operations, and an on-line scheduling module for shared instruments has proven a much-needed utility. Principal investigators are now able to restrict user spending to specific accounts and have final approval of invoices before billing, which has significantly reduced the number of unpaid invoices.

  13. The relationship between human resource investments and organizational performance: a firm-level examination of equilibrium theory.

    PubMed

    Subramony, Mahesh; Krause, Nicole; Norton, Jacqueline; Burns, Gary N

    2008-07-01

    It is commonly believed that human resource investments can yield positive performance-related outcomes for organizations. Utilizing the theory of organizational equilibrium (H. A. Simon, D. W. Smithburg, & V. A. Thompson, 1950; J. G. March & H. A. Simon, 1958), the authors proposed that organizational inducements in the form of competitive pay will lead to 2 firm-level performance outcomes--labor productivity and customer satisfaction--and that financially successful organizations would be more likely to provide these inducements to their employees. To test their hypotheses, the authors gathered employee-survey and objective performance data from a sample of 126 large publicly traded U.S. organizations over a period of 3 years. Results indicated that (a) firm-level financial performance (net income) predicted employees' shared perceptions of competitive pay, (b) shared pay perceptions predicted future labor productivity, and (c) the relationship between shared pay perceptions and customer satisfaction was fully mediated by employee morale.

  14. The Importance and Satisfaction of Collaborative Innovation for Strategic Entrepreneurship

    ERIC Educational Resources Information Center

    Tsai, I-Chang; Lei, Han-Sheng

    2016-01-01

    Building on network, learning, resource-based and real options theories, collaborative innovation through the sharing of ideas, knowledge, expertise, and opportunities can enable both small and large firms to successfully engage in strategic entrepreneurship. We use the real case of a research-oriented organization and its incubator for analysis…

  15. African Schoolnet Toolkit

    ERIC Educational Resources Information Center

    Marquard, Stephen

    2005-01-01

    A schoolnet program is an organized set of activities that expands the use of ICTs and promotes sharing of educational resources by teachers and students at schools. Schoolnet programmes may be located inside or outside government, may be large initiatives with substantial funding, or may be smaller innovative projects without big budgets. The…

  16. Big Data and Dementia: Charting the Route Ahead for Research, Ethics, and Policy

    PubMed Central

    Ienca, Marcello; Vayena, Effy; Blasimme, Alessandro

    2018-01-01

    Emerging trends in pervasive computing and medical informatics are creating the possibility for large-scale collection, sharing, aggregation and analysis of unprecedented volumes of data, a phenomenon commonly known as big data. In this contribution, we review the existing scientific literature on big data approaches to dementia, as well as commercially available mobile-based applications in this domain. Our analysis suggests that big data approaches to dementia research and care hold promise for improving current preventive and predictive models, casting light on the etiology of the disease, enabling earlier diagnosis, optimizing resource allocation, and delivering more tailored treatments to patients with specific disease trajectories. Such promissory outlook, however, has not materialized yet, and raises a number of technical, scientific, ethical, and regulatory challenges. This paper provides an assessment of these challenges and charts the route ahead for research, ethics, and policy. PMID:29468161

  17. Sustainability, arid grasslands and grazing: New applications for technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pregenzer, A.L.; Parmenter, R.; Passell, H.D.

    1999-12-08

    The study of ecology is taking on increasing global importance as the value of well-functioning ecosystems to human well-being becomes better understood. However, the use of technological systems for the study of ecology lags behind the use of technologies in the study of other disciplines important to human well-being, such as medicine, chemistry and physics. The authors outline four different kinds of large-scale data needs required by land managers for the development of sustainable land use strategies, and which can be obtained with current or future technological systems. They then outline a hypothetical resource management scenario in which data on all those needs are collected using remote and in situ technologies, transmitted to a central location, analyzed, and then disseminated for regional use in maintaining sustainable grazing systems. They conclude by highlighting various data-collection systems and data-sharing networks already in operation.

  18. Egalitarianism in young children.

    PubMed

    Fehr, Ernst; Bernhard, Helen; Rockenbach, Bettina

    2008-08-28

    Human social interaction is strongly shaped by other-regarding preferences, that is, a concern for the welfare of others. These preferences are important for a unique aspect of human sociality, namely large-scale cooperation with genetic strangers, but little is known about their developmental roots. Here we show that young children's other-regarding preferences assume a particular form, inequality aversion, which develops strongly between the ages of 3 and 8. At age 3-4, the overwhelming majority of children behave selfishly, whereas most children at age 7-8 prefer resource allocations that remove advantageous or disadvantageous inequality. Moreover, inequality aversion is strongly shaped by parochialism, a preference for favouring the members of one's own social group. These results indicate that human egalitarianism and parochialism have deep developmental roots, and the simultaneous emergence of altruistic sharing and parochialism during childhood is intriguing in view of recent evolutionary theories which predict that the same evolutionary process jointly drives both human altruism and parochialism.

  19. Big Data and Dementia: Charting the Route Ahead for Research, Ethics, and Policy.

    PubMed

    Ienca, Marcello; Vayena, Effy; Blasimme, Alessandro

    2018-01-01

    Emerging trends in pervasive computing and medical informatics are creating the possibility for large-scale collection, sharing, aggregation and analysis of unprecedented volumes of data, a phenomenon commonly known as big data. In this contribution, we review the existing scientific literature on big data approaches to dementia, as well as commercially available mobile-based applications in this domain. Our analysis suggests that big data approaches to dementia research and care hold promise for improving current preventive and predictive models, casting light on the etiology of the disease, enabling earlier diagnosis, optimizing resource allocation, and delivering more tailored treatments to patients with specific disease trajectories. Such promissory outlook, however, has not materialized yet, and raises a number of technical, scientific, ethical, and regulatory challenges. This paper provides an assessment of these challenges and charts the route ahead for research, ethics, and policy.

  20. Computer graphics for management: An abstract of capabilities and applications of the EIS system

    NASA Technical Reports Server (NTRS)

    Solem, B. J.

    1975-01-01

    The Executive Information Services (EIS) system, developed as a computer-based, time-sharing tool for making and implementing management decisions, and including computer graphics capabilities, was described. The following resources are available through the EIS languages: centralized corporate/gov't data base, customized and working data bases, report writing, general computational capability, specialized routines, modeling/programming capability, and graphics. Nearly all EIS graphs can be created by a single, on-line instruction. A large number of options are available, such as selection of graphic form, line control, shading, placement on the page, multiple images on a page, control of scaling and labeling, plotting of cum data sets, optical grid lines, and stack charts. The following are examples of areas in which the EIS system may be used: research, estimating services, planning, budgeting, and performance measurement, national computer hook-up negotiations.

  1. Dealing with uncertainty in water scarcity footprints

    NASA Astrophysics Data System (ADS)

    Scherer, Laura; Pfister, Stephan

    2016-05-01

    Water scarcity adversely affects ecosystems, human well-being and the economy. It can be described by water scarcity indices (WSIs) which we calculated globally for the decades 1981-1990 and 2001-2010. Based on a model ensemble, we calculated the WSI for both decades including uncertainties. While there is a slight tendency of increased water scarcity in 2001-2010, the likelihood of the increase is rather low (53%). Climate change played only a minor role, but increased water consumption is more decisive. In the last decade, a large share of the global population already lived under highly water scarce conditions with a global average monthly WSI of 0.51 (on a scale from 0 to 1). Considering that globally there are enough water resources to satisfy all our needs, this highlights the need for regional optimization of water consumption. In addition, crop choices within a food group can help reduce humanity’s water scarcity footprint without reducing its nutritional value.
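
    Indices of this kind typically map a withdrawal- or consumption-to-availability ratio onto the 0-1 range with a monotonic (often logistic) function. The sketch below shows one such mapping purely for illustration; the slope and midpoint parameters are hypothetical and are not the WSI formulation used in the study above.

    ```python
    # Illustrative sketch only: mapping a withdrawal-to-availability (WTA) ratio
    # to a 0-1 water scarcity index with a logistic curve. The slope and midpoint
    # below are hypothetical tuning parameters, not those used in the study.
    import math

    def wsi(wta: float, midpoint: float = 0.4, slope: float = 10.0) -> float:
        """Logistic water scarcity index in (0, 1); WSI = 0.5 at wta == midpoint."""
        return 1.0 / (1.0 + math.exp(-slope * (wta - midpoint)))

    for wta in (0.1, 0.2, 0.4, 0.8):
        print(f"WTA = {wta:.2f} -> WSI = {wsi(wta):.2f}")
    ```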

  2. Food for contagion: synthesis and future directions for studying host-parasite responses to resource shifts in anthropogenic environments.

    PubMed

    Altizer, Sonia; Becker, Daniel J; Epstein, Jonathan H; Forbes, Kristian M; Gillespie, Thomas R; Hall, Richard J; Hawley, Dana M; Hernandez, Sonia M; Martin, Lynn B; Plowright, Raina K; Satterfield, Dara A; Streicker, Daniel G

    2018-05-05

    Human-provided resource subsidies for wildlife are diverse, common and have profound consequences for wildlife-pathogen interactions, as demonstrated by papers in this themed issue spanning empirical, theoretical and management perspectives from a range of study systems. Contributions cut across scales of organization, from the within-host dynamics of immune function, to population-level impacts on parasite transmission, to landscape- and regional-scale patterns of infection. In this concluding paper, we identify common threads and key findings from author contributions, including the consequences of resource subsidies for (i) host immunity; (ii) animal aggregation and contact rates; (iii) host movement and landscape-level infection patterns; and (iv) interspecific contacts and cross-species transmission. Exciting avenues for future work include studies that integrate mechanistic modelling and empirical approaches to better explore cross-scale processes, and experimental manipulations of food resources to quantify host and pathogen responses. Work is also needed to examine evolutionary responses to provisioning, and ask how diet-altered changes to the host microbiome influence infection processes. Given the massive public health and conservation implications of anthropogenic resource shifts, we end by underscoring the need for practical recommendations to manage supplemental feeding practices, limit human-wildlife conflicts over shared food resources and reduce cross-species transmission risks, including to humans. This article is part of the theme issue 'Anthropogenic resource subsidies and host-parasite dynamics in wildlife'. © 2018 The Author(s).

  3. SEARCHBreast: a new resource to locate and share surplus archival material from breast cancer animal models to help address the 3Rs.

    PubMed

    Blyth, Karen; Carter, Phil; Morrissey, Bethny; Chelala, Claude; Jones, Louise; Holen, Ingunn; Speirs, Valerie

    2016-04-01

    Animal models have contributed to our understanding of breast cancer, with publication of results in high-impact journals almost invariably requiring extensive in vivo experimentation. As such, many laboratories hold large collections of surplus animal material, with only a fraction being used in publications relating to the original projects. Despite being developed at considerable cost, this material is an invisible and hence an underutilised resource, which often ends up being discarded. Within the breast cancer research community there is both a need and desire to make this valuable material available for researchers. Lack of a coordinated system for visualisation and localisation of this has prevented progress. To fulfil this unmet need, we have developed a novel initiative called Sharing Experimental Animal Resources: Coordinating Holdings-Breast (SEARCHBreast) which facilitates sharing of archival tissue between researchers on a collaborative basis and, de facto will reduce overall usage of animal models in breast cancer research. A secure searchable database has been developed where researchers can find, share, or upload materials related to animal models of breast cancer, including genetic and transplant models. SEARCHBreast is a virtual compendium where the physical material remains with the original laboratory. A bioanalysis pipeline is being developed for the analysis of transcriptomics data associated with mouse models, allowing comparative study with human and cell line data. Additionally, SEARCHBreast is committed to promoting the use of humanised breast tissue models as replacement alternatives to animals. Access to this unique resource is freely available to all academic researchers following registration at https://searchbreast.org.

  4. Multidimensional model to assess the readiness of Saudi Arabia to implement evidence based child maltreatment prevention programs at a large scale.

    PubMed

    Almuneef, Maha A; Qayad, Mohamed; Noor, Ismail K; Al-Eissa, Majid A; Albuhairan, Fadia S; Inam, Sarah; Mikton, Christopher

    2014-03-01

    There has been increased awareness of child maltreatment in Saudi Arabia recently. This study assessed the readiness for implementing large-scale evidence-based child maltreatment prevention programs in Saudi Arabia. Key informants, who were key decision makers and senior managers in the field of child maltreatment, were invited to participate in the study. A multidimensional tool, developed by WHO and collaborators from several middle and low income countries, was used to assess 10 dimensions of readiness. A group of experts also gave an objective assessment of the 10 dimensions and key informants' and experts' scores were compared. On a scale of 100, the key informants gave a readiness score of 43% for Saudi Arabia to implement large-scale, evidence-based CM prevention programs, and experts gave an overall readiness score of 40%. Both the key informants and experts agreed that 4 of the dimensions (attitudes toward child maltreatment prevention, institutional links and resources, material resources, and human and technical resources) had low readiness scores (<5) each and three dimensions (knowledge of child maltreatment prevention, scientific data on child maltreatment prevention, and will to address child maltreatment problem) had high readiness scores (≥5) each. There was significant disagreement between key informants and experts on the remaining 3 dimensions. Overall, Saudi Arabia has a moderate/fair readiness to implement large-scale child maltreatment prevention programs. Capacity building; strengthening of material resources; and improving institutional links, collaborations, and attitudes toward the child maltreatment problem are required to improve the country's readiness to implement such programs. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Measuring the learning capacity of organisations: development and factor analysis of the Questionnaire for Learning Organizations.

    PubMed

    Oudejans, S C C; Schippers, G M; Schramade, M H; Koeter, M W J; van den Brink, W

    2011-04-01

    To investigate internal consistency and factor structure of a questionnaire measuring learning capacity based on Senge's theory of the five disciplines of a learning organisation: Personal Mastery, Mental Models, Shared Vision, Team Learning, and Systems Thinking. Cross-sectional study. Substance-abuse treatment centres (SATCs) in The Netherlands. A total of 293 SATC employees from outpatient and inpatient treatment departments, financial and human resources departments. Psychometric properties of the Questionnaire for Learning Organizations (QLO), including factor structure, internal consistency, and interscale correlations. A five-factor model representing the five disciplines of Senge showed good fit. The scales for Personal Mastery, Shared Vision and Team Learning had good internal consistency, but the scales for Systems Thinking and Mental Models had low internal consistency. The proposed five-factor structure was confirmed in the QLO, which makes it a promising instrument to assess learning capacity in teams. The Systems Thinking and the Mental Models scales have to be revised. Future research should be aimed at testing criterion and discriminatory validity.
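
    The internal consistency reported for scales such as these is usually quantified with Cronbach's alpha, computed from a respondents-by-items matrix. The sketch below applies the standard formula to synthetic responses; it is illustrative only and does not use QLO data.

    ```python
    # Minimal sketch of Cronbach's alpha for one questionnaire scale, computed from
    # a respondents-by-items matrix. The responses below are synthetic, not QLO data.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: 2-D array, rows = respondents, columns = items of one scale."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

    rng = np.random.default_rng(3)
    latent = rng.normal(size=(300, 1))                          # one underlying trait
    responses = latent + rng.normal(scale=0.8, size=(300, 5))   # 5 noisy items
    print(f"alpha = {cronbach_alpha(responses):.2f}")
    ```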

  6. From Planetary Boundaries to national fair shares of the global safe operating space - How can the scales be bridged?

    NASA Astrophysics Data System (ADS)

    Häyhä, Tiina; Cornell, Sarah; Lucas, Paul; van Vuuren, Detlef; Hoff, Holger

    2016-04-01

    The planetary boundaries framework proposes precautionary quantitative global limits to the anthropogenic perturbation of crucial Earth system processes. In this way, it marks out a planetary 'safe operating space' for human activities. However, decisions regarding resource use and emissions are mostly made at much smaller scales, largely by (sub-)national and regional governments, businesses, and other local actors. To operationalize the planetary boundaries, they need to be translated into and aligned with targets that are relevant at these smaller scales. In this paper, we develop a framework that addresses the three dimensions of bridging across scales (biophysical, socio-economic and ethical) to provide a consistent, universally applicable approach for translating the planetary boundaries into national-level, context-specific and fair shares of the safe operating space. We discuss our findings in the context of previous studies and their implications for future analyses and policymaking. In this way, we help link the planetary boundaries framework to widely applied operational and policy concepts for more robust strong-sustainability decision-making.

  7. Taking Stock: Existing Resources for Assessing a New Vision of Science Learning

    ERIC Educational Resources Information Center

    Alonzo, Alicia C.; Ke, Li

    2016-01-01

    A new vision of science learning described in the "Next Generation Science Standards"--particularly the science and engineering practices and their integration with content--pose significant challenges for large-scale assessment. This article explores what might be learned from advances in large-scale science assessment and…

  8. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Wucherl; Sim, Alex

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate ever-increasing data volumes for large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach to training the ARIMA model, such as the Box-Jenkins methodology, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.

  9. Time-Series Forecast Modeling on High-Bandwidth Network Measurements

    DOE PAGES

    Yoo, Wucherl; Sim, Alex

    2016-06-24

    With the increasing number of geographically distributed scientific collaborations and the growing sizes of scientific data, it has become challenging for users to achieve the best possible network performance on a shared network. In this paper, we have developed a model to forecast expected bandwidth utilization on high-bandwidth wide area networks. The forecast model can improve the efficiency of resource utilization and the scheduling of data movements on high-bandwidth networks to accommodate ever-increasing data volumes for large-scale scientific data applications. A univariate time-series forecast model is developed with the Seasonal decomposition of Time series by Loess (STL) and the AutoRegressive Integrated Moving Average (ARIMA) on Simple Network Management Protocol (SNMP) path utilization measurement data. Compared with a traditional approach to training the ARIMA model, such as the Box-Jenkins methodology, our forecast model reduces computation time by up to 92.6%. It also shows resilience against abrupt network usage changes. Finally, our forecast model conducts a large number of multi-step forecasts, and the forecast errors are within the mean absolute deviation (MAD) of the monitored measurements.
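
    The workflow described in this record, seasonal decomposition with STL followed by an ARIMA fit on the deseasonalised series, can be sketched with statsmodels as below. The synthetic hourly series, the assumed daily period, and the (1, 1, 1) order are illustrative choices, not the authors' SNMP data or model configuration.

    ```python
    # Minimal sketch of the STL + ARIMA workflow: remove the seasonal component
    # with STL, fit ARIMA to the deseasonalised series, and add the last seasonal
    # cycle back onto the forecast. The series and the (1, 1, 1) order are
    # synthetic/hypothetical, not the SNMP data or the order used in the paper.
    import numpy as np
    from statsmodels.tsa.seasonal import STL
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(4)
    period = 24                                            # assume a daily cycle in hourly data
    t = np.arange(24 * 60)
    utilization = (50 + 0.01 * t                           # slow trend
                   + 10 * np.sin(2 * np.pi * t / period)   # daily seasonality
                   + rng.normal(0, 2, t.size))             # noise

    stl = STL(utilization, period=period).fit()
    deseasonalised = utilization - stl.seasonal

    arima = ARIMA(deseasonalised, order=(1, 1, 1)).fit()
    steps = period                                         # forecast one seasonal cycle ahead
    forecast = arima.forecast(steps) + stl.seasonal[-period:]
    print(forecast[:5])
    ```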

  10. Crossing Scales and Disciplines to Understand Challenges for Climate Change Adaptation and Water Resources Management in Chile and California

    NASA Astrophysics Data System (ADS)

    Vicuna, S.; Melo, O.; Meza, F. J.; Medellin-Azuara, J.; Herman, J. D.; Sandoval Solis, S.

    2017-12-01

    California and Chile share similarities in terms of climate, ecosystems, topography and water use. In both regions, the hydro-climatologic system is characterized by a typical Mediterranean climate, rainy winters and dry summers, highly variable annual precipitation, and snowmelt-dependent water supply systems. Water use in both regions also has key similarities, with the highest share devoted to high-value irrigated crops, followed by urban water use and a significant hydropower-driven power supply system. Snowmelt-driven basins in semiarid regions are highly sensitive to climate change for two reasons: temperature effects on snowmelt timing, and water scarcity in regions subject to ever-increasing demands. Research in both regions also coincides in terms of the potential climate change impacts. Expected impacts on Californian and Chilean water resources have been well documented in terms of changes in water supply and water demand, though significant uncertainties remain. Both regions have recently experienced prolonged droughts, providing an opportunity to understand the future challenges and potential adaptive responses under climate change. This study connects researchers from Chile and California with the goal of understanding how to adapt to climate change impacts on water resources and agriculture at various spatial and temporal scales. The project takes advantage of the complementary contexts of Chile and California in terms of similar climate and hydrologic conditions, water management institutions, patterns of water consumption and, importantly, the shared experience of recent droughts, to understand the challenges posed by a changing climate.

  11. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    NASA Astrophysics Data System (ADS)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States as an Internet-centred approach that provides a standard, open way of sharing services over the network. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs. Cloud computing, which uses Internet technology to provide shared services, has therefore become an important means of sharing digital education applications in current higher education. Based on the cloud computing environment, this paper analyses the existing problems in sharing digital educational resources among the independent colleges of Jiangxi Province. Drawing on cloud computing's characteristics of mass storage, efficient operation and low cost, the authors explore the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the design of the shared model is put into practical application.

  12. Gas-Centered Swirl Coaxial Liquid Injector Evaluations

    NASA Technical Reports Server (NTRS)

    Cohn, A. K.; Strakey, P. A.; Talley, D. G.

    2005-01-01

    Development of liquid rocket engines is expensive, and extensive testing at large scales is usually required: verifying engine lifetime demands a large number of tests, yet limited resources are available for development. Sub-scale cold-flow and hot-fire testing is extremely cost effective; it could be a necessary (but not sufficient) condition for demonstrating long engine lifetime and reduces the overall costs and risk of large-scale testing. The goal is to determine what knowledge can be gained from sub-scale cold-flow and hot-fire evaluations of LRE injectors, and to determine relationships between cold-flow and hot-fire data.

  13. Internest food sharing within wood ant colonies: resource redistribution behavior in a complex system

    PubMed Central

    Robinson, Elva J.H.

    2016-01-01

    Resource sharing is an important cooperative behavior in many animals. Sharing resources is particularly important in social insect societies, as division of labor often results in most individuals including, importantly, the reproductives, relying on other members of the colony to provide resources. Sharing resources between individuals is therefore fundamental to the success of social insects. Resource sharing is complicated if a colony inhabits several spatially separated nests, a nesting strategy common in many ant species. Resources must be shared not only between individuals in a single nest but also between nests. We investigated the behaviors facilitating resource redistribution between nests in a dispersed-nesting population of wood ant Formica lugubris. We marked ants, in the field, as they transported resources along the trails between nests of a colony, to investigate how the behavior of individual workers relates to colony-level resource exchange. We found that workers from a particular nest “forage” to other nests in the colony, treating them as food sources. Workers treating other nests as food sources means that simple, pre-existing foraging behaviors are used to move resources through a distributed system. It may be that this simple behavioral mechanism facilitates the evolution of this complex life-history strategy. PMID:27004016

  14. Resources for Functional Genomics Studies in Drosophila melanogaster

    PubMed Central

    Mohr, Stephanie E.; Hu, Yanhui; Kim, Kevin; Housden, Benjamin E.; Perrimon, Norbert

    2014-01-01

    Drosophila melanogaster has become a system of choice for functional genomic studies. Many resources, including online databases and software tools, are now available to support design or identification of relevant fly stocks and reagents or analysis and mining of existing functional genomic, transcriptomic, proteomic, etc. datasets. These include large community collections of fly stocks and plasmid clones, “meta” information sites like FlyBase and FlyMine, and an increasing number of more specialized reagents, databases, and online tools. Here, we introduce key resources useful to plan large-scale functional genomics studies in Drosophila and to analyze, integrate, and mine the results of those studies in ways that facilitate identification of highest-confidence results and generation of new hypotheses. We also discuss ways in which existing resources can be used and might be improved and suggest a few areas of future development that would further support large- and small-scale studies in Drosophila and facilitate use of Drosophila information by the research community more generally. PMID:24653003

  15. HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation

    DOE PAGES

    Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian; ...

    2017-09-29

    Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we will discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.

  16. HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian

    Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be able to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we will discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.

  17. Quantitative analysis of population-scale family trees with millions of relatives.

    PubMed

    Kaplanis, Joanna; Gordon, Assaf; Shor, Tal; Weissbrod, Omer; Geiger, Dan; Wahl, Mary; Gershovits, Michael; Markus, Barak; Sheikh, Mona; Gymrek, Melissa; Bhatia, Gaurav; MacArthur, Daniel G; Price, Alkes L; Erlich, Yaniv

    2018-04-13

    Family trees have vast applications in fields as diverse as genetics, anthropology, and economics. However, the collection of extended family trees is tedious and usually relies on resources with limited geographical scope and complex data usage restrictions. We collected 86 million profiles from publicly available online data shared by genealogy enthusiasts. After extensive cleaning and validation, we obtained population-scale family trees, including a single pedigree of 13 million individuals. We leveraged the data to partition the genetic architecture of human longevity and to provide insights into the geographical dispersion of families. We also report a simple digital procedure to overlay other data sets with our resource. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  18. Linking climate projections to performance: A yield-based decision scaling assessment of a large urban water resources system

    NASA Astrophysics Data System (ADS)

    Turner, Sean W. D.; Marlow, David; Ekström, Marie; Rhodes, Bruce G.; Kularathna, Udaya; Jeffrey, Paul J.

    2014-04-01

    Despite a decade of research into climate change impacts on water resources, the scientific community has delivered relatively few practical methodological developments for integrating uncertainty into water resources system design. This paper presents an application of the "decision scaling" methodology for assessing climate change impacts on water resources system performance and asks how such an approach might inform planning decisions. The decision scaling method reverses the conventional ethos of climate impact assessment by first establishing the climate conditions that would compel planners to intervene. Climate model projections are introduced at the end of the process to characterize climate risk in such a way that avoids the process of propagating those projections through hydrological models. Here we simulated 1000 multisite synthetic monthly streamflow traces in a model of the Melbourne bulk supply system to test the sensitivity of system performance to variations in streamflow statistics. An empirical relation was derived to convert decision-critical flow statistics to climatic units, against which 138 alternative climate projections were plotted and compared. We defined the decision threshold in terms of a system yield metric constrained by multiple performance criteria. Our approach allows for fast and simple incorporation of demand forecast uncertainty and demonstrates the reach of the decision scaling method through successful execution in a large and complex water resources system. Scope for wider application in urban water resources planning is discussed.
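
    At its core, a decision-scaling assessment is a stress test: perturb the decision-relevant streamflow (and demand) statistics over a grid, re-run the system model, and record where performance drops below the planning threshold; climate projections are only overlaid afterwards. The toy monthly reservoir below stands in for the Melbourne bulk supply model purely for illustration; all numbers, and the reliability metric used in place of the paper's yield metric, are hypothetical.

    ```python
    # Minimal sketch of a decision-scaling stress test: perturb mean inflow and
    # demand over a grid, run a simple system model, and flag where a reliability
    # threshold is crossed. The toy reservoir below stands in for the Melbourne
    # bulk supply model; all numbers are hypothetical.
    import numpy as np

    rng = np.random.default_rng(5)
    base_inflow = rng.gamma(shape=2.0, scale=50.0, size=1200)   # synthetic monthly inflow

    def reliability(inflow_scale: float, demand: float,
                    capacity: float = 2000.0) -> float:
        """Fraction of months in which demand is fully met."""
        storage, met = capacity / 2, 0
        for q in base_inflow * inflow_scale:
            storage = min(capacity, storage + q)
            supplied = min(demand, storage)
            storage -= supplied
            met += supplied >= demand
        return met / base_inflow.size

    threshold = 0.95
    for scale in (0.7, 0.85, 1.0, 1.15):
        for demand in (60.0, 80.0, 100.0):
            ok = reliability(scale, demand) >= threshold
            print(f"inflow x{scale:.2f}, demand {demand:>5.1f}: "
                  f"{'OK' if ok else 'intervention needed'}")
    ```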

  19. Using Analytics to Support Petabyte-Scale Science on the NASA Earth Exchange (NEX)

    NASA Astrophysics Data System (ADS)

    Votava, P.; Michaelis, A.; Ganguly, S.; Nemani, R. R.

    2014-12-01

    NASA Earth Exchange (NEX) is a data, supercomputing and knowledge collaboratory that houses NASA satellite, climate and ancillary data where a focused community can come together to address large-scale challenges in Earth sciences. Analytics within NEX occurs at several levels - data, workflows, science and knowledge. At the data level, we are focusing on collecting and analyzing any information that is relevant to efficient acquisition, processing and management of data at the smallest granularity, such as files or collections. This includes processing and analyzing all local and many external metadata that are relevant to data quality, size, provenance, usage and other attributes. This then helps us better understand usage patterns and improve efficiency of data handling within NEX. When large-scale workflows are executed on NEX, we capture information that is relevant to processing and that can be analyzed in order to improve efficiencies in job scheduling, resource optimization, or data partitioning that would improve processing throughput. At this point we also collect data provenance as well as basic statistics of intermediate and final products created during the workflow execution. These statistics and metrics form basic process and data QA that, when combined with analytics algorithms, helps us identify issues early in the production process. We have already seen impact in some petabyte-scale projects, such as global Landsat processing, where we were able to reduce processing times from days to hours and enhance process monitoring and QA. While the focus so far has been mostly on support of NEX operations, we are also building a web-based infrastructure that enables users to perform direct analytics on science data - such as climate predictions or satellite data. Finally, as one of the main goals of NEX is knowledge acquisition and sharing, we began gathering and organizing information that associates users and projects with data, publications, locations and other attributes that can then be analyzed as a part of the NEX knowledge graph and used to greatly improve advanced search capabilities. Overall, we see data analytics at all levels as an important part of NEX as we are continuously seeking improvements in data management, workflow processing, use of resources, usability and science acceleration.

  20. Prospective CO2 saline resource estimation methodology: Refinement of existing US-DOE-NETL methods based on data availability

    DOE PAGES

    Goodman, Angela; Sanguinito, Sean; Levine, Jonathan S.

    2016-09-28

    Carbon storage resource estimation in subsurface saline formations plays an important role in establishing the scale of carbon capture and storage activities for governmental policy and commercial project decision-making. Prospective CO2 resource estimation of large regions or subregions, such as a basin, occurs at the initial screening stages of a project using only limited publicly available geophysical data, i.e., prior to project-specific site selection data generation. As the scale of investigation is narrowed and selected areas and formations are identified, prospective CO2 resource estimation can be refined and uncertainty narrowed when site-specific geophysical data are available. Here, we refine the United States Department of Energy – National Energy Technology Laboratory (US-DOE-NETL) methodology as the scale of investigation is narrowed from very large regional assessments down to selected areas and formations that may be developed for commercial storage. In addition, we present a new notation that explicitly identifies differences between data availability and data sources used for geologic parameters and efficiency factors as the scale of investigation is narrowed. This CO2 resource estimation method is available for screening formations in a tool called CO2-SCREEN.
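    The refined estimates still rest on the volumetric relation used in the US-DOE-NETL approach, in which prospective storage is the product of formation geometry, porosity, in-situ CO2 density, and a storage efficiency factor. The sketch below shows that arithmetic with illustrative numbers only; none of the values are taken from the paper or from CO2-SCREEN.

    ```python
    # Volumetric prospective CO2 storage estimate in the spirit of the US-DOE-NETL method:
    #   G_CO2 = A * h * phi * rho_CO2 * E
    # where A is formation area, h is net thickness, phi is porosity, rho_CO2 is in-situ
    # CO2 density, and E is a storage efficiency factor. All numbers below are illustrative.

    A_m2 = 2_500e6        # formation area: 2,500 km^2 expressed in m^2
    h_m = 80.0            # net saline-formation thickness (m)
    phi = 0.15            # total porosity (fraction)
    rho_co2 = 700.0       # in-situ CO2 density (kg/m^3) at reservoir pressure and temperature
    E = 0.02              # storage efficiency factor (fraction of pore volume)

    G_kg = A_m2 * h_m * phi * rho_co2 * E
    print(f"Prospective storage resource: {G_kg / 1e12:.2f} Gt CO2")  # 1 Gt = 1e12 kg
    ```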

  1. Prospective CO2 saline resource estimation methodology: Refinement of existing US-DOE-NETL methods based on data availability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodman, Angela; Sanguinito, Sean; Levine, Jonathan S.

    Carbon storage resource estimation in subsurface saline formations plays an important role in establishing the scale of carbon capture and storage activities for governmental policy and commercial project decision-making. Prospective CO2 resource estimation of large regions or subregions, such as a basin, occurs at the initial screening stages of a project using only limited publicly available geophysical data, i.e., prior to project-specific site selection data generation. As the scale of investigation is narrowed and selected areas and formations are identified, prospective CO2 resource estimation can be refined and uncertainty narrowed when site-specific geophysical data are available. Here, we refine the United States Department of Energy – National Energy Technology Laboratory (US-DOE-NETL) methodology as the scale of investigation is narrowed from very large regional assessments down to selected areas and formations that may be developed for commercial storage. In addition, we present a new notation that explicitly identifies differences between data availability and data sources used for geologic parameters and efficiency factors as the scale of investigation is narrowed. This CO2 resource estimation method is available for screening formations in a tool called CO2-SCREEN.

  2. Aetiology for the covariation between combined type ADHD and reading difficulties in a family study: the role of IQ

    PubMed Central

    Cheung, Celeste H.M.; Wood, Alexis C.; Paloyelis, Yannis; Arias-Vasquez, Alejandro; Buitelaar, Jan K.; Franke, Barbara; Miranda, Ana; Mulas, Fernando; Rommelse, Nanda; Sergeant, Joseph A.; Sonuga-Barke, Edmund J.; Faraone, Stephen V.; Asherson, Philip; Kuntsi, Jonna

    2012-01-01

    Background Twin studies using both clinical and population-based samples suggest that the frequent co-occurrence of attention deficit hyperactivity disorder (ADHD) and reading ability/disability (RD) is largely driven by shared genetic influences. While both disorders are associated with lower IQ, recent twin data suggest that the shared genetic variability between reading difficulties and ADHD inattention symptoms is largely independent from genetic influences contributing to general cognitive ability. The current study aimed to extend the previous findings that were based on rating scale measures in a population sample by examining the generalizability of the findings to a clinical population, and by measuring reading difficulties both with a rating scale and with an objective task. We therefore investigated the familial relationships between ADHD, reading difficulties and IQ in a sample of individuals diagnosed with ADHD combined type, their siblings and control sibling pairs. Methods We ran multivariate familial models on data from 1789 individuals aged 6 to 19. Reading difficulties were measured with both a rating scale and an objective task. IQ was obtained using the Wechsler Intelligence Scales (WISC-III / WAIS-III). Results Significant phenotypic (0.2–0.4) and familial (0.3–0.5) correlations were observed among ADHD, reading difficulties and IQ. Yet 53% to 72% of the overlapping familial influences between ADHD and reading difficulties were not shared with IQ. Conclusions Our finding that familial influences shared with general cognitive ability, though present, do not account for the majority of the overlapping familial influences on ADHD and reading difficulties extends previous findings from a population-based study to a clinically-ascertained sample with combined type ADHD. PMID:22324316

  3. E-Learning in a Large Organization: A Study of the Critical Role of Information Sharing

    ERIC Educational Resources Information Center

    Netteland, Grete; Wasson, Barbara; Morch, Anders I

    2007-01-01

    Purpose: The purpose of this paper is to provide new insights into the implementation of large-scale learning projects; thereby better understanding the difficulties, frustrations, and obstacles encountered when implementing enterprise-wide e-learning as a tool for training and organization transformation in a complex organization.…

  4. The Role of Evaluative Metadata in an Online Teacher Resource Exchange

    ERIC Educational Resources Information Center

    Abramovich, Samuel; Schunn, Christian D.; Correnti, Richard J.

    2013-01-01

    A large-scale online teacher resource exchange is studied to examine the ways in which metadata influence teachers' selection of resources. A hierarchical linear modeling approach was used to tease apart the simultaneous effects of resource features and author features. From a decision heuristics theoretical perspective, teachers appear to…

  5. A resource-oriented web service for environmental modeling

    NASA Astrophysics Data System (ADS)

    Ferencik, Ioan

    2013-04-01

    Environmental modeling is a widely adopted practice in the study of natural phenomena. Environmental models can be difficult to build and use, and thus sharing them within the community is an important aspect. The most common approach to sharing a model is to expose it as a web service. In practice, interaction with such a web service is cumbersome due to the lack of a standardized contract and the complexity of the model being exposed. In this work we investigate the use of a resource-oriented approach to exposing environmental models as web services. We view a model as a layered resource built atop the object concept from object-oriented programming, augmented with persistence capabilities provided by an embedded object database to keep track of its state, and implementing the four basic principles of resource-oriented architectures: addressability, statelessness, representation and uniform interface. For implementation we use exclusively open source software: the Django framework, the dyBase object-oriented database and the Python programming language. We developed a generic framework of resources structured into a hierarchy of types and then extended this typology with resources specific to the domain of environmental modeling. To test our web service we used cURL, a robust command-line web client.
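    A minimal sketch of the idea, using only the Python standard library rather than the authors' Django/dyBase implementation, is shown below; the model class, URLs and fields are hypothetical. It illustrates addressability (each model has its own URL), a uniform interface (plain HTTP GET), a representation (JSON), and statelessness (no per-client session on the server).

    ```python
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical in-memory "model" resources; the paper persists model state in an object database.
    MODELS = {
        "1": {"id": "1", "name": "toy-runoff", "parameters": {"k": 0.3}, "state": "idle"},
    }

    class ModelResourceHandler(BaseHTTPRequestHandler):
        """Uniform interface: GET returns a JSON representation of an addressable model resource."""

        def do_GET(self):
            # Addressability: /models/<id> identifies exactly one model resource.
            parts = self.path.strip("/").split("/")
            if len(parts) == 2 and parts[0] == "models" and parts[1] in MODELS:
                body = json.dumps(MODELS[parts[1]]).encode()
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_response(404)
                self.end_headers()

    if __name__ == "__main__":
        # Statelessness: each request carries everything needed; the server keeps no client session.
        HTTPServer(("localhost", 8000), ModelResourceHandler).serve_forever()
    ```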

  6. A Protocol for Generating and Exchanging (Genome-Scale) Metabolic Resource Allocation Models.

    PubMed

    Reimers, Alexandra-M; Lindhorst, Henning; Waldherr, Steffen

    2017-09-06

    In this article, we present a protocol for generating a complete (genome-scale) metabolic resource allocation model, as well as a proposal for how to represent such models in the systems biology markup language (SBML). Such models are used to investigate enzyme levels and achievable growth rates in large-scale metabolic networks. Although the idea of metabolic resource allocation studies has been present in the field of systems biology for some years, no guidelines for generating such a model have been published up to now. This paper presents step-by-step instructions for building a (dynamic) resource allocation model, starting with prerequisites such as a genome-scale metabolic reconstruction, through building protein and noncatalytic biomass synthesis reactions and assigning turnover rates for each reaction. In addition, we explain how one can use SBML level 3 in combination with the flux balance constraints and our resource allocation modeling annotation to represent such models.
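    To make the protocol steps concrete, the sketch below lays out, as plain Python data structures, the ingredients such a model adds on top of a genome-scale reconstruction: metabolic reactions with assigned turnover rates, enzyme (protein) synthesis reactions, a noncatalytic biomass synthesis reaction, and the capacity coupling between enzyme amount and flux. It is a schematic illustration, not the SBML encoding or the authors' tooling, and all names and numbers are invented.

    ```python
    # Schematic (non-SBML) sketch of a metabolic resource allocation model's ingredients.

    metabolic_reactions = {
        "R_glycolysis": {"stoichiometry": {"glucose": -1, "atp": 2, "pyruvate": 2},
                         "enzyme": "E_glyc", "kcat_per_h": 3.6e5},   # turnover rate assigned per reaction
        "R_respiration": {"stoichiometry": {"pyruvate": -1, "atp": 15},
                          "enzyme": "E_resp", "kcat_per_h": 1.2e5},
    }

    # Protein synthesis reactions consume precursors to produce each catalytic enzyme.
    protein_synthesis = {
        "P_E_glyc": {"stoichiometry": {"amino_acids": -350, "atp": -1400, "E_glyc": 1}},
        "P_E_resp": {"stoichiometry": {"amino_acids": -900, "atp": -3600, "E_resp": 1}},
    }

    # Noncatalytic biomass (membranes, DNA, ...) is produced by its own synthesis reaction.
    biomass_synthesis = {"P_quota": {"stoichiometry": {"amino_acids": -100, "atp": -400, "quota": 1}}}

    # Capacity coupling: flux through a reaction is bounded by enzyme amount times its kcat.
    def flux_bound(enzyme_amount_mmol, kcat_per_h):
        return enzyme_amount_mmol * kcat_per_h

    print(flux_bound(1e-4, metabolic_reactions["R_glycolysis"]["kcat_per_h"]))
    ```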

  7. Production Level CFD Code Acceleration for Hybrid Many-Core Architectures

    NASA Technical Reports Server (NTRS)

    Duffy, Austen C.; Hammond, Dana P.; Nielsen, Eric J.

    2012-01-01

    In this work, a novel graphics processing unit (GPU) distributed sharing model for hybrid many-core architectures is introduced and employed in the acceleration of a production-level computational fluid dynamics (CFD) code. The latest generation graphics hardware allows multiple processor cores to simultaneously share a single GPU through concurrent kernel execution. This feature has allowed the NASA FUN3D code to be accelerated in parallel with up to four processor cores sharing a single GPU. For codes to scale and fully use resources on these and the next generation machines, codes will need to employ some type of GPU sharing model, as presented in this work. Findings include the effects of GPU sharing on overall performance. A discussion of the inherent challenges that parallel unstructured CFD codes face in accelerator-based computing environments is included, with considerations for future generation architectures. This work was completed by the author in August 2010, and reflects the analysis and results of the time.

  8. MonitoringResources.org—Supporting coordinated and cost-effective natural resource monitoring across organizations

    USGS Publications Warehouse

    Bayer, Jennifer M.; Scully, Rebecca A.; Weltzin, Jake F.

    2018-05-21

    Natural resource managers who oversee the Nation’s resources require data to support informed decision-making at a variety of spatial and temporal scales that often cross typical jurisdictional boundaries such as states, agency regions, and watersheds. These data come from multiple agencies, programs, and sources, often with their own methods and standards for data collection and organization. Coordinating standards and methods is often prohibitively time-intensive and expensive. MonitoringResources.org offers a suite of tools and resources that support coordination of monitoring efforts, cost-effective planning, and sharing of knowledge among organizations. The website was developed by the Pacific Northwest Aquatic Monitoring Partnership—a collaboration of Federal, state, tribal, local, and private monitoring programs—and the U.S. Geological Survey (USGS), with funding from the Bonneville Power Administration and USGS. It is a key component of a coordinated monitoring and information network.

  9. Coal resources, reserves and peak coal production in the United States

    USGS Publications Warehouse

    Milici, Robert C.; Flores, Romeo M.; Stricker, Gary D.

    2013-01-01

    In spite of the United States' large endowment of coal resources, recent studies have indicated that its coal production is destined to reach a maximum and begin an irreversible decline sometime during the middle of the current century. However, studies and assessments illustrating coal reserve data essential for making accurate forecasts of United States coal production have not been compiled on a national basis. As a result, there is a great deal of uncertainty in the accuracy of the production forecasts. A very large percentage of the coal mined in the United States comes from a few large-scale mines (mega-mines) in the Powder River Basin of Wyoming and Montana. Reported reserves at these mines do not account for future potential reserves or for future development of technology that may convert coal currently classified as resources into reserves. In order to maintain United States coal production at or near current levels for an extended period of time, existing mines will eventually have to increase their recoverable reserves and/or new large-scale mines will have to be opened elsewhere. Accordingly, in order to facilitate energy planning for the United States, this paper suggests that probabilistic assessments of the remaining coal reserves in the country would improve long-range forecasts of coal production. As in the United States coal assessment projects currently being conducted, a major priority of probabilistic assessments would be to identify the numbers and sizes of remaining large blocks of coal capable of supporting large-scale mining operations for extended periods of time and to conduct economic evaluations of those resources.

  10. An Assessment of Potential Mining Impacts on Salmon ...

    EPA Pesticide Factsheets

    The Bristol Bay watershed in southwestern Alaska supports the largest sockeye salmon fishery in the world, is home to 25 federally recognized tribal governments, and contains large mineral resources. The potential for large-scale mining activities in the watershed has raised concerns about the impact of mining on the sustainability of Bristol Bay’s world-class commercial, recreational and subsistence fisheries and the future of Alaska Native tribes in the watershed who have maintained a salmon-based culture and subsistence-based way of life for at least 4,000 years. The purpose of this assessment is to provide a characterization of the biological and mineral resources of the Bristol Bay watershed, increase understanding of the potential impacts of large-scale mining on the region’s fish resources, and inform future government decisions related to protecting and maintaining the chemical, physical, and biological integrity of the watershed. It will also serve as a technical resource for the public, tribes, and governments who must consider how best to address the challenges of mining and ecological protection in the Bristol Bay watershed. The purpose of this assessment is to understand how future large-scale mining may affect water quality and the Bristol Bay salmon fisheries, which includes the largest wild sockeye salmon fishery in the world. Bristol Bay, Alaska, is home to a salmon fishery that is of significant economic and subsistence value to the peopl

  11. Participatory Modeling Processes to Build Community Knowledge Using Shared Model and Data Resources and in a Transboundary Pacific Northwest Watershed (Nooksack River Basin, Washington, USA)

    NASA Astrophysics Data System (ADS)

    Bandaragoda, C.; Dumas, M.

    2014-12-01

    As with many western US watersheds, the Nooksack River Basin faces strong pressures associated with climate variability and change, rapid population growth, and deep-rooted water law. This transboundary basin includes contributing areas in British Columbia, Canada, and has a long history of joint data collection, model development, and facilitated communication between governmental (federal, tribal, state, local), environmental, timber, agricultural, and recreational user groups. However, each entity in the watershed responds to unique data coordination, information sharing, and adaptive management regimes and thresholds, further increasing the complexity of watershed management. Over the past four years, participatory methods were used to compile and review scientific data and models, including fish habitat (endangered salmonid species), channel hydraulics, climate data, agricultural, municipal and industrial water use, and integrated watershed-scale distributed hydrologic models from over 15 years of projects (from jointly funded to independent shared work by individual companies, agencies, and universities). A specific outcome of the work includes participatory design of a collective problem statement used for guidance on future investment of shared resources and development of a data-generation process in which modeling results are communicated in three tiers, for (1) public/decision-making, (2) technical, and (3) research audiences. We establish features for successful participation using tools that are iteratively developed, tested for usability through incremental knowledge building, and designed to provide rigor in modeling. A general outcome of the work is ongoing support by tribal, state, and local governments, as well as the agricultural community, to continue the generation of shared watershed data using models in a dynamic legal and regulatory setting, where two federally recognized tribes have requested federal court resolution of federal treaty rights. Our participatory modeling process aims to integrate disciplines and watershed processes over time and space, while building capacity for more holistic watershed-scale thinking, or community knowledge, by research, governmental and public interests.

  12. A new resource for developing and strengthening large-scale community health worker programs.

    PubMed

    Perry, Henry; Crigler, Lauren; Lewin, Simon; Glenton, Claire; LeBan, Karen; Hodgins, Steve

    2017-01-12

    Large-scale community health worker programs are now growing in importance around the world in response to the resurgence of interest and growing evidence of the importance of community-based primary health care for improving the health of populations in resource-constrained, high-mortality settings. These programs, because of their scale and operational challenges, merit special consideration by the global health community, national policy-makers, and program implementers. A new online resource is now available to assist in that effort: Developing and Strengthening Community Health Worker Programs at Scale: A Reference Guide and Case Studies for Program Managers and Policymakers ( http://www.mchip.net/CHWReferenceGuide ). This CHW Reference Guide is the product of 27 different collaborators who, collectively, have a formidable breadth and depth of experience and knowledge about CHW programming around the world. It provides a thoughtful discussion about the many operational issues that large-scale CHW programs need to address as they undergo the process of development, expansion or strengthening. Detailed case studies of 12 national CHW programs are included in the Appendix, the most current and complete set of case studies currently available. Future articles in this journal will highlight many of the themes in the CHW Reference Guide and provide an update of recent advances and experiences. These articles will serve, we hope, to (1) increase awareness about the CHW Reference Guide and its usefulness and (2) connect a broader audience to the critical importance of strengthening large-scale CHW programs for the health benefits that they can bring to underserved populations around the world.

  13. Integrating market processes into utility resource planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahn, E.P.

    1992-11-01

    Integrated resource planning has resulted in an abundance of alternatives for meeting existing and new demand for electricity services: (1) utility demand-side management (DSM) programs, (2) DSM bidding, (3) competitive bidding for private power supplies, (4) utility re-powering, and (5) new utility construction. Each alternative relies on a different degree of planning for implementation and, therefore, each alternative relies on markets to a greater or lesser degree. This paper shows how the interaction of planning processes and market forces results in resource allocations among the alternatives. The discussion focuses on three phenomena that are driving forces behind the unanticipated consequences of contemporary integrated resource planning efforts. These forces are: (1) large-scale DSM efforts, (2) customer bypass, and (3) large-scale independent power projects. 22 refs., 3 figs., 2 tabs.

  14. Integrating genome-wide association studies and gene expression data highlights dysregulated multiple sclerosis risk pathways.

    PubMed

    Liu, Guiyou; Zhang, Fang; Jiang, Yongshuai; Hu, Yang; Gong, Zhongying; Liu, Shoufeng; Chen, Xiuju; Jiang, Qinghua; Hao, Junwei

    2017-02-01

    Much effort has been expended on identifying the genetic determinants of multiple sclerosis (MS). Existing large-scale genome-wide association study (GWAS) datasets provide strong support for using pathway and network-based analysis methods to investigate the mechanisms underlying MS. However, no shared genetic pathways have been identified to date. We hypothesize that shared genetic pathways may indeed exist in different MS-GWAS datasets. Here, we report results from a three-stage analysis of GWAS and expression datasets. In stage 1, we conducted multiple pathway analyses of two MS-GWAS datasets. In stage 2, we performed a candidate pathway analysis of the large-scale MS-GWAS dataset. In stage 3, we performed a pathway analysis using the dysregulated MS gene list from seven human MS case-control expression datasets. In stage 1, we identified 15 shared pathways. In stage 2, we successfully replicated 14 of these 15 significant pathways. In stage 3, we found that dysregulated MS genes were significantly enriched in 10 of 15 MS risk pathways identified in stages 1 and 2. We report shared genetic pathways in different MS-GWAS datasets and highlight some new MS risk pathways. Our findings provide new insights on the genetic determinants of MS.
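    The stage 3 enrichment step can be illustrated with a standard hypergeometric (one-sided Fisher) test; the paper does not specify its exact statistic, and the counts below are invented for illustration only.

    ```python
    from scipy.stats import hypergeom

    # Hypothetical counts: is the dysregulated-MS gene list enriched in one candidate risk pathway?
    N = 20_000   # background genes
    K = 150      # genes annotated to the pathway
    n = 800      # dysregulated genes from the expression datasets
    k = 18       # overlap between the two lists

    # P(overlap >= k) when n genes are drawn at random from the background of N
    p_value = hypergeom.sf(k - 1, N, K, n)
    print(f"enrichment p-value: {p_value:.3g}")
    ```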

  15. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provides seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.

  16. HydroShare: An online, collaborative environment for the sharing of hydrologic data and models (Invited)

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Arrigo, J.; Hooper, R. P.; Valentine, D. W.; Maidment, D. R.

    2013-12-01

    HydroShare is an online, collaborative system being developed for sharing hydrologic data and models. The goal of HydroShare is to enable scientists to easily discover and access data and models, retrieve them to their desktop or perform analyses in a distributed computing environment that may include grid, cloud or high performance computing model instances as necessary. Scientists may also publish outcomes (data, results or models) into HydroShare, using the system as a collaboration platform for sharing data, models and analyses. HydroShare is expanding the data sharing capability of the CUAHSI Hydrologic Information System by broadening the classes of data accommodated, creating new capability to share models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. One of the fundamental concepts in HydroShare is that of a Resource. All content is represented using a Resource Data Model that separates system and science metadata and has elements common to all resources as well as elements specific to the types of resources HydroShare will support. These will include different data types used in the hydrology community and models and workflows that require metadata on execution functionality. HydroShare will use the integrated Rule-Oriented Data System (iRODS) to manage federated data content and perform rule-based background actions on data and model resources, including parsing to generate metadata catalog information and the execution of models and workflows. This presentation will introduce the HydroShare functionality developed to date, describe key elements of the Resource Data Model and outline the roadmap for future development.
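    One way to picture the Resource Data Model idea, separating system metadata common to every resource from science metadata specific to a resource type, is the small sketch below. The field names are hypothetical and are not HydroShare's actual schema.

    ```python
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class SystemMetadata:
        """Elements common to every resource: identity, ownership, sharing, provenance."""
        resource_id: str
        owner: str
        created: str
        shared_with: List[str] = field(default_factory=list)

    @dataclass
    class TimeSeriesScienceMetadata:
        """Type-specific science metadata for one kind of hydrologic resource."""
        site_code: str
        variable: str
        units: str
        method: str

    @dataclass
    class Resource:
        system: SystemMetadata
        science: TimeSeriesScienceMetadata
        files: Dict[str, str] = field(default_factory=dict)  # filename -> storage path

    r = Resource(
        system=SystemMetadata("abc123", "dtarb", "2013-10-01"),
        science=TimeSeriesScienceMetadata("USGS:10109000", "discharge", "m^3/s", "stage-discharge rating"),
    )
    print(r.system.resource_id, r.science.variable)
    ```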

  17. Pesticide use and biodiversity conservation in the Amazonian agricultural frontier.

    PubMed

    Schiesari, Luis; Waichman, Andrea; Brock, Theo; Adams, Cristina; Grillitsch, Britta

    2013-06-05

    Agricultural frontiers are dynamic environments characterized by the conversion of native habitats to agriculture. Because they are currently concentrated in diverse tropical habitats, agricultural frontiers are areas where the largest number of species is exposed to hazardous land management practices, including pesticide use. Focusing on the Amazonian frontier, we show that producers have varying access to resources, knowledge, control and reward mechanisms to improve land management practices. With poor education and no technical support, pesticide use by smallholders sharply deviated from agronomical recommendations, tending to overutilization of hazardous compounds. By contrast, with higher levels of technical expertise and resources, and aiming at more restrictive markets, large-scale producers adhered more closely to technical recommendations and even voluntarily replaced more hazardous compounds. However, the ecological footprint increased significantly over time because of increased dosage or because formulations that are less toxic to humans may be more toxic to other biodiversity. Frontier regions appear to be unique in terms of the conflicts between production and conservation, and the necessary pesticide risk management and risk reduction can only be achieved through responsibility-sharing by diverse stakeholders, including governmental and intergovernmental organizations, NGOs, financial institutions, pesticide and agricultural industries, producers, academia and consumers.

  18. Hydrological processes in glacierized high-altitude basins of the western Himalayas

    NASA Astrophysics Data System (ADS)

    Jeelani, Ghulam; Shah, Rouf A.; Fryar, Alan E.; Deshpande, Rajendrakumar D.; Mukherjee, Abhijit; Perrin, Jerome

    2018-03-01

    The western Himalaya is a strategically important region whose water resources are shared by China, India and Pakistan. The economy of the region is largely dependent on the water delivered by snow and glacier melt. The present study used stable isotopes of water to further understand the basin-scale hydro-meteorological, hydrological and recharge processes in three high-altitude mountainous basins of the western Himalayas. The study provided new insights into the dominant factors affecting the isotopic composition of the precipitation, snowpack, glacier melt, streams and springs. It was observed that elevation-dependent post-depositional processes and snowpack evolution resulted in a higher isotopic altitude gradient in snowpacks. The similar temporal trends of isotopic signals in rivers and karst springs reflect rapid flow transfer due to karstification of the carbonate aquifers. The attenuation of the extreme isotopic input signal in karst springs appears to be due to the mixing of source waters with the underground karst reservoirs. Basin-wise, the input-output response demonstrates the vital role of winter precipitation in maintaining the perennial flow in streams and karst springs in the region. Isotopic data were also used to estimate the mean recharge altitude of the springs.
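    The final step mentioned above, estimating mean recharge altitude from spring isotope values, typically inverts an isotopic altitude gradient. The calculation below is illustrative only; the gradient and delta-18O values are invented, not taken from the paper.

    ```python
    # Estimate the mean recharge altitude of a spring from delta-18O and an isotopic altitude gradient:
    #   z_recharge = z_ref + (d18O_spring - d18O_ref) / gradient * 100
    # Illustrative numbers only; the paper derives its gradients from local precipitation/snowpack data.

    gradient = -0.28      # per mil delta-18O per 100 m of elevation (illustrative magnitude)
    z_ref = 2000.0        # reference station altitude (m)
    d18O_ref = -9.5       # per mil at the reference station
    d18O_spring = -12.3   # per mil measured at the spring

    z_recharge = z_ref + (d18O_spring - d18O_ref) / gradient * 100.0
    print(f"Estimated mean recharge altitude: {z_recharge:.0f} m")
    ```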

  19. Pesticide use and biodiversity conservation in the Amazonian agricultural frontier

    PubMed Central

    Schiesari, Luis; Waichman, Andrea; Brock, Theo; Adams, Cristina; Grillitsch, Britta

    2013-01-01

    Agricultural frontiers are dynamic environments characterized by the conversion of native habitats to agriculture. Because they are currently concentrated in diverse tropical habitats, agricultural frontiers are areas where the largest number of species is exposed to hazardous land management practices, including pesticide use. Focusing on the Amazonian frontier, we show that producers have varying access to resources, knowledge, control and reward mechanisms to improve land management practices. With poor education and no technical support, pesticide use by smallholders sharply deviated from agronomical recommendations, tending to overutilization of hazardous compounds. By contrast, with higher levels of technical expertise and resources, and aiming at more restrictive markets, large-scale producers adhered more closely to technical recommendations and even voluntarily replaced more hazardous compounds. However, the ecological footprint increased significantly over time because of increased dosage or because formulations that are less toxic to humans may be more toxic to other biodiversity. Frontier regions appear to be unique in terms of the conflicts between production and conservation, and the necessary pesticide risk management and risk reduction can only be achieved through responsibility-sharing by diverse stakeholders, including governmental and intergovernmental organizations, NGOs, financial institutions, pesticide and agricultural industries, producers, academia and consumers. PMID:23610177

  20. Educational needs of employed family caregivers of older adults: Evaluation of a workplace project.

    PubMed

    Curry, Linda Cox; Walker, Charles; Hogstel, Mildred O

    2006-01-01

    Family members provide 80% of care for older adults in the United States. Many family caregivers are employed either full or part time. For employed caregivers, personal health, job performance, and the ability to advance their career are affected by the weight of their caregiving responsibilities. Some find it necessary to quit their jobs. Employed caregivers report a need for caregiving information; however, they seldom think of their workplace as a valuable resource. Results of the second of a 3-phase research and service project are discussed. Based on a needs assessment completed by employees of a large institution, educational sessions were offered during 3 consecutive months. Thirty-five employees attended 1 or more sessions. The sessions were evaluated highly on a 5-point Likert-type scale for usefulness of information, quality of presentation, and value of session. Sharing project results with the employing institution's human resources department yielded commitment to integrate caregiver education and referral into a newly organized work-life program. When properly managed, such workplace programs can provide needed assistance to employed caregivers. A nurse working with older adults is an ideal provider to initiate and manage this kind of program.

  1. Shared Socio-Economic Pathways of the Energy Sector – Quantifying the Narratives

    DOE PAGES

    Bauer, Nico; Calvin, Katherine; Emmerling, Johannes; ...

    2016-08-23

    Energy is crucial for supporting basic human needs, development and well-being. The future evolution of the scale and character of the energy system will be fundamentally shaped by socioeconomic conditions and drivers, available energy resources, technologies of energy supply and transformation, and end-use energy demand. However, because energy-related activities are significant sources of greenhouse gas (GHG) emissions and other environmental and social externalities, energy system development will also be influenced by social acceptance and strategic policy choices. All of these uncertainties have important implications for many aspects of economic and environmental sustainability, and climate change in particular. In the Shared Socioeconomic Pathway (SSP) framework these uncertainties are structured into five narratives, arranged according to the challenges to climate change mitigation and adaptation. In this study we explore future energy sector developments across the five SSPs using Integrated Assessment Models (IAMs), and we also provide summary output and analysis for selected scenarios of global emissions mitigation policies. The mitigation challenge strongly corresponds with global baseline energy sector growth over the 21st century, which varies between 40% and 230% depending on final energy consumer behavior, technological improvements, resource availability and policies. The future baseline CO2-emission range is even larger, as the most energy-intensive SSP also incorporates a comparatively high share of carbon-intensive fossil fuels, and vice versa. Inter-regional disparities in the SSPs are consistent with the underlying socioeconomic assumptions; these differences are particularly strong in the SSPs with large adaptation challenges, which have little inter-regional convergence in long-term income and final energy demand levels. The scenarios presented do not include feedbacks of climate change on energy sector development. The energy sector SSPs with and without emissions mitigation policies are introduced and analyzed here in order to contribute to future research in climate sciences, mitigation analysis, and studies on impacts, adaptation and vulnerability.

  2. IEDA: Making Small Data BIG Through Interdisciplinary Partnerships Among Long-tail Domains

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V. L.; Hsu, L.; Song, L.; Ghiorso, M. S.; Walker, D. J.

    2014-12-01

    The Big Data world in the Earth Sciences so far exists primarily for disciplines that generate massive volumes of observational or computed data using large-scale, shared instrumentation such as global sensor networks, satellites, or high-performance computing facilities. These data are typically managed and curated by well-supported community data facilities that also provide the tools for exploring the data through visualization or statistical analysis. In many other domains, especially those where data are primarily acquired by individual investigators or small teams (known as 'Long-tail data'), data are poorly shared and integrated, lacking a community-based data infrastructure that ensures persistent access, quality control, standardization, and integration of data, as well as appropriate tools to fully explore and mine the data within the context of broader Earth Science datasets. IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded by the US NSF to develop and operate data services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds on a strong foundation of mature disciplinary data systems for marine geology and geophysics, geochemistry, and geochronology. These systems have dramatically advanced data resources in those long-tail Earth science domains. IEDA has strengthened these resources by establishing a consolidated, enterprise-grade infrastructure that is shared by the domain-specific data systems, and implementing joint data curation and data publication services that follow community standards. In recent years, other domain-specific data efforts have partnered with IEDA to take advantage of this infrastructure and improve data services to their respective communities with formal data publication, long-term preservation of data holdings, and better sustainability. IEDA hopes to foster such partnerships with streamlined data services, including user-friendly, single-point interfaces for data submission, discovery, and access across the partner systems to support interdisciplinary science.

  3. The Resource Consumption Principle: Attention and Memory in Volumes of Neural Tissue

    NASA Astrophysics Data System (ADS)

    Montague, P. Read

    1996-04-01

    In the cerebral cortex, the small volume of the extracellular space in relation to the volume enclosed by synapses suggests an important functional role for this relationship. It is well known that there are atoms and molecules in the extracellular space that are absolutely necessary for synapses to function (e.g., calcium). I propose here the hypothesis that the rapid shift of these atoms and molecules from extracellular to intrasynaptic compartments represents the consumption of a shared, limited resource available to local volumes of neural tissue. Such consumption results in a dramatic competition among synapses for resources necessary for their function. In this paper, I explore a theory in which this resource consumption plays a critical role in the way local volumes of neural tissue operate. On short time scales, this principle of resource consumption permits a tissue volume to choose those synapses that function in a particular context and thereby helps to integrate the many neural signals that impinge on a tissue volume at any given moment. On longer time scales, the same principle aids in the stable storage and recall of information. The theory provides one framework for understanding how cerebral cortical tissue volumes integrate, attend to, store, and recall information. In this account, the capacity of neural tissue to attend to stimuli is intimately tied to the way tissue volumes are organized at fine spatial scales.
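    The qualitative idea of competition for a shared, slowly replenished resource can be pictured with a toy simulation. This is not the paper's model; the drive values, uptake rate and replenishment constant below are arbitrary, and the sketch only illustrates how strongly driven synapses end up capturing most of a limited common pool.

    ```python
    import numpy as np

    # Toy illustration only: five synapses in a tissue volume share a limited extracellular resource.
    drive = np.array([1.0, 0.8, 0.5, 0.3, 0.1])   # hypothetical presynaptic drive per synapse
    pool = 1.0                                     # shared resource pool (arbitrary units)
    replenish = 0.05                               # amount restored each time step
    uptake_rate = 0.4

    activity = np.zeros_like(drive)
    for t in range(200):
        demand = drive * uptake_rate * pool        # each synapse draws in proportion to its drive
        total = demand.sum()
        if total > pool:                           # competition: demands scaled to the available resource
            demand *= pool / total
        pool = min(1.0, pool - demand.sum() + replenish)
        activity += demand

    print(np.round(activity / activity.sum(), 2))  # strongly driven synapses capture most of the resource
    ```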

  4. The Application of Large-Scale Hypermedia Information Systems to Training.

    ERIC Educational Resources Information Center

    Crowder, Richard; And Others

    1995-01-01

    Discusses the use of hypermedia in electronic information systems that support maintenance operations in large-scale industrial plants. Findings show that after establishing an information system, the same resource base can be used to train personnel how to use the computer system and how to perform operational and maintenance tasks. (Author/JMV)

  5. Large-scale hybrid poplar production economics: 1995 Alexandria, Minnesota establishment cost and management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Downing, M.; Langseth, D.; Stoffel, R.

    1996-12-31

    The purpose of this project was to track and monitor the costs of planting, maintaining, and monitoring large-scale commercial plantings of hybrid poplar in Minnesota. These cost data assist potential growers and purchasers of this resource in determining the ways in which supply and demand may be secured through developing markets.

  6. 38 CFR 17.240 - Sharing specialized medical resources.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... medical resources. 17.240 Section 17.240 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS MEDICAL Sharing of Medical Facilities, Equipment, and Information § 17.240 Sharing specialized medical resources. Subject to such terms and conditions as the Under Secretary for Health shall prescribe...

  7. 38 CFR 17.240 - Sharing specialized medical resources.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... medical resources. 17.240 Section 17.240 Pensions, Bonuses, and Veterans' Relief DEPARTMENT OF VETERANS AFFAIRS MEDICAL Sharing of Medical Facilities, Equipment, and Information § 17.240 Sharing specialized medical resources. Subject to such terms and conditions as the Under Secretary for Health shall prescribe...

  8. Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling (Final Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    William J. Schroeder

    2011-11-13

    This report contains the comprehensive summary of the work performed on the SBIR Phase II, Collaborative Visualization for Large-Scale Accelerator Electromagnetic Modeling at Kitware Inc. in collaboration with Stanford Linear Accelerator Center (SLAC). The goal of the work was to develop collaborative visualization tools for large-scale data as illustrated in the figure below. The solutions we proposed address the typical problems faced by geographically and organizationally separated research and engineering teams, who produce large data (either through simulation or experimental measurement) and wish to work together to analyze and understand their data. Because the data is large, we expect that it cannot be easily transported to each team member's work site, and that the visualization server must reside near the data. Further, we also expect that each work site has heterogeneous resources: some with large computing clients, tiled (or large) displays and high bandwidth; other sites as simple as a team member on a laptop computer. Our solution is based on the open-source, widely used ParaView large-data visualization application. We extended this tool to support multiple collaborative clients who may locally visualize data, and then periodically rejoin and synchronize with the group to discuss their findings. Options for managing session control, adding annotation, and defining the visualization pipeline, among others, were incorporated. We also developed and deployed a Web visualization framework based on ParaView that enables the Web browser to act as a participating client in a collaborative session. The ParaView Web Visualization framework leverages various Web technologies including WebGL, JavaScript, Java and Flash to enable interactive 3D visualization over the web using ParaView as the visualization server. We steered the development of this technology by teaming with the SLAC National Accelerator Laboratory. SLAC has a computationally-intensive problem important to the nation's scientific progress as described shortly. Further, SLAC researchers routinely generate massive amounts of data, and frequently collaborate with other researchers located around the world. Thus SLAC is an ideal teammate through which to develop, test and deploy this technology. The nature of the datasets generated by simulations performed at SLAC presented unique visualization challenges, especially when dealing with higher-order elements, that were addressed during this Phase II. During this Phase II, we have developed a strong platform for collaborative visualization based on ParaView. We have developed and deployed a ParaView Web Visualization framework that can be used for effective collaboration over the Web. Collaborating and visualizing over the Web presents the community with unique opportunities for sharing and accessing visualization and HPC resources that hitherto were either inaccessible or difficult to use. The technology we developed here will alleviate both these issues as it becomes widely deployed and adopted.
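    A client-side session against such a remote visualization server can be sketched with ParaView's Python interface. The host, port and toy pipeline below are placeholders, and the sketch assumes a pvserver has already been started near the data with multi-client collaboration enabled; it is not the SLAC workflow.

    ```python
    # Minimal client-side sketch using ParaView's Python API (paraview.simple).
    # Assumes a pvserver is already running near the data, e.g.:
    #   pvserver --multi-clients --server-port=11111
    # Host, port, and the toy pipeline below are placeholders.
    from paraview.simple import Connect, Sphere, Show, Render, SaveScreenshot

    Connect("visualization-server.example.org", 11111)  # attach this client to the shared session

    source = Sphere(ThetaResolution=64, PhiResolution=64)  # stand-in for the large simulation dataset
    Show(source)
    Render()
    SaveScreenshot("shared_view.png")
    ```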

  9. Strengths amidst vulnerabilities: the paradox of resistance in a mining-affected community in Guatemala.

    PubMed

    Caxaj, C Susana; Berman, Helene; Ray, Susan L; Restoule, Jean-Paul; Varcoe, Coleen

    2014-11-01

    The influence of large-scale mining on the psychosocial wellbeing and mental health of diverse Indigenous communities has attracted increased attention. In previous reports, we have discussed the influence of a gold mining operation on the health of a community in the Western highlands of Guatemala. Here, we discuss the community strengths and acts of resistance of this community, that is, the community processes that promoted mental health in this context. Using an anti-colonial narrative methodology that incorporated participatory action research principles, we developed a research design in collaboration with community leaders and participants. Data collection involved focus groups, individual interviews and photo-sharing with 54 men and women between the ages of 18 and 67. Data analysis was guided by iterative and ongoing conversations with participants and McCormack's narrative lenses. Study findings revealed key mechanisms and sources of resistance, including a shared cultural identity, a spiritual knowing and being, 'defending our rights, defending our territory,' and speaking truth to power. These overlapping strengths were identified by participants as key protective factors in facing challenges and adversity. Yet ultimately, these same strengths were often the most eroded or endangered due to the influence of large-scale mining operations in the region. These community strengths and acts of resistance reveal important priorities for promoting mental health and wellbeing for populations impacted by large-scale mining operations. Mental health practitioners must attend to both the strengths and parallel vulnerabilities that may be occasioned by large-scale projects of this nature.

  10. Demonstration of Essential Reliability Services by a 300-MW Solar Photovoltaic Power Plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Loutan, Clyde; Klauer, Peter; Chowdhury, Sirajul

    The California Independent System Operator (CAISO), First Solar, and the National Renewable Energy Laboratory (NREL) conducted a demonstration project on a large utility-scale photovoltaic (PV) power plant in California to test its ability to provide essential ancillary services to the electric grid. With increasing shares of solar- and wind-generated energy on the electric grid, traditional generation resources equipped with automatic governor control (AGC) and automatic voltage regulation controls -- specifically, fossil thermal -- are being displaced. The deployment of utility-scale, grid-friendly PV power plants that incorporate advanced capabilities to support grid stability and reliability is essential for the large-scale integration of PV generation into the electric power grid, among other technical requirements. A typical PV power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. In this way, PV power plants can be used to mitigate the impact of variability on the grid, a role typically reserved for conventional generators. In August 2016, testing was completed on First Solar's 300-MW PV power plant, and a large amount of test data was produced and analyzed that demonstrates the ability of PV power plants to use grid-friendly controls to provide essential reliability services. These data showed how the development of advanced power controls can enable PV to become a provider of a wide range of grid services, including spinning reserves, load following, voltage support, ramping, frequency response, variability smoothing, and frequency regulation to power quality. Specifically, the tests conducted included various forms of active power control such as AGC and frequency regulation; droop response; and reactive power, voltage, and power factor controls. This project demonstrated that advanced power electronics and solar generation can be controlled to contribute to system-wide reliability. It was shown that the First Solar plant can provide essential reliability services related to different forms of active and reactive power controls, including plant participation in AGC, primary frequency control, ramp rate control, and voltage regulation. For AGC participation in particular, by comparing the PV plant testing results to the typical performance of individual conventional technologies, we showed that regulation accuracy by the PV plant is 24-30 points better than fast gas turbine technologies. The plant's ability to provide volt-ampere reactive control during periods of extremely low power generation was demonstrated as well. The project team developed a pioneering demonstration concept and test plan to show how various types of active and reactive power controls can leverage PV generation's value from being a simple variable energy resource to a resource that provides a wide range of ancillary services. With this project's approach to a holistic demonstration on an actual, large, utility-scale, operational PV power plant and dissemination of the obtained results, the team sought to close some gaps in perspectives that exist among various stakeholders in California and nationwide by providing real test data.
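    Primary frequency (droop) response, one of the controls exercised in the tests, follows a standard relation that can be sketched with made-up plant parameters; the settings below are illustrative and are not the demonstration plant's actual configuration.

    ```python
    # Frequency droop response of a curtailed PV plant (illustrative numbers only):
    #   delta_P = -(1 / R) * (delta_f / f_nom) * P_rated, limited by the available headroom.

    P_rated = 300.0      # MW plant rating
    P_setpoint = 240.0   # MW current operating point (curtailed, so headroom exists)
    headroom = P_rated - P_setpoint
    R = 0.05             # 5% droop
    f_nom = 60.0         # Hz
    deadband = 0.017     # Hz

    def droop_response(f_measured):
        delta_f = f_measured - f_nom
        if abs(delta_f) <= deadband:
            return 0.0
        delta_p = -(1.0 / R) * (delta_f / f_nom) * P_rated
        # Under-frequency response is capped by headroom; over-frequency by the current output.
        return max(-P_setpoint, min(headroom, delta_p))

    print(droop_response(59.90))  # low frequency -> plant raises output (up to its headroom)
    print(droop_response(60.10))  # high frequency -> plant curtails further
    ```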

  11. Networked Microcomputers--The Next Generation in College Computing.

    ERIC Educational Resources Information Center

    Harris, Albert L.

    The evolution of computer hardware for college computing has mirrored the industry's growth. When computers were introduced into the educational environment, they had limited capacity and served one user at a time. Then came large mainframes with many terminals sharing the resource. Next, the use of computers in office automation emerged. As…

  12. Transient synchrony among populations of five foliage-feeding Lepidoptera

    Treesearch

    Maartje J. Klapwijk; Jonathan A. Walter; Anikó Hirka; György Csóka; Christer Björkman; Andrew M. Liebhold

    2018-01-01

    Studies of transient population dynamics have largely focused on temporal changes in dynamical behaviour, such as the transition between periods of stability and instability. This study explores a related dynamic pattern, namely transient synchrony during a 49-year period among populations of five sympatric species of forest insects that share host tree resources. The...
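    Transient synchrony of this kind is commonly quantified as the mean pairwise correlation among populations within a sliding window. The sketch below uses simulated abundance series, not the study's data, in which a shared signal is present only in the middle of a 49-year record.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    years, n_species = 49, 5
    common = rng.normal(size=years)
    # Hypothetical log-abundance series: a shared signal only in the middle of the record.
    weights = np.where((np.arange(years) > 15) & (np.arange(years) < 35), 0.8, 0.1)
    series = weights * common + rng.normal(scale=0.6, size=(n_species, years))

    def mean_pairwise_corr(window):
        c = np.corrcoef(window)
        iu = np.triu_indices_from(c, k=1)
        return c[iu].mean()

    win = 10
    synchrony = [mean_pairwise_corr(series[:, t:t + win]) for t in range(years - win + 1)]
    print(np.round(synchrony, 2))  # elevated values mark the transiently synchronous period
    ```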

  13. Dare to Share

    ERIC Educational Resources Information Center

    Briggs, Linda L.

    2007-01-01

    Today, as difficult as it is for large institutions to keep software and hardware up-to-date, the challenge and expense of keeping up is only amplified for smaller colleges and universities. In the area of data-driven decision-making (DDD), the challenge can be even greater. Because smaller schools are pressed for time and resources on nearly all…

  14. The Human Salivary Microbiome Is Shaped by Shared Environment Rather than Genetics: Evidence from a Large Family of Closely Related Individuals.

    PubMed

    Shaw, Liam; Ribeiro, Andre L R; Levine, Adam P; Pontikos, Nikolas; Balloux, Francois; Segal, Anthony W; Roberts, Adam P; Smith, Andrew M

    2017-09-12

    The human microbiome is affected by multiple factors, including the environment and host genetics. In this study, we analyzed the salivary microbiomes of an extended family of Ashkenazi Jewish individuals living in several cities and investigated associations with both shared household and host genetic similarities. We found that environmental effects dominated over genetic effects. While there was weak evidence of geographical structuring at the level of cities, we observed a large and significant effect of shared household on microbiome composition, supporting the role of the immediate shared environment in dictating the presence or absence of taxa. This effect was also seen when including adults who had grown up in the same household but moved out prior to the time of sampling, suggesting that the establishment of the salivary microbiome earlier in life may affect its long-term composition. We found weak associations between host genetic relatedness and microbiome dissimilarity when using family pedigrees as proxies for genetic similarity. However, this association disappeared when using more-accurate measures of kinship based on genome-wide genetic markers, indicating that the environment rather than host genetics is the dominant factor affecting the composition of the salivary microbiome in closely related individuals. Our results support the concept that there is a consistent core microbiome conserved across global scales but that small-scale effects due to a shared living environment significantly affect microbial community composition. IMPORTANCE Previous research shows that the salivary microbiomes of relatives are more similar than those of nonrelatives, but it remains difficult to distinguish the effects of relatedness and shared household environment. Furthermore, pedigree measures may not accurately measure host genetic similarity. In this study, we include genetic relatedness based on genome-wide single nucleotide polymorphisms (SNPs) (rather than pedigree measures) and shared environment in the same analysis. We quantify the relative importance of these factors by studying the salivary microbiomes in members of a large extended Ashkenazi Jewish family living in different locations. We find that host genetics plays no significant role and that the dominant factor is the shared environment at the household level. We also find that this effect appears to persist in individuals who have moved out of the parental household, suggesting that aspects of salivary microbiome composition established during upbringing can persist over a time scale of years. Copyright © 2017 Shaw et al.

  15. Security controls in an integrated Biobank to protect privacy in data sharing: rationale and study design.

    PubMed

    Takai-Igarashi, Takako; Kinoshita, Kengo; Nagasaki, Masao; Ogishima, Soichi; Nakamura, Naoki; Nagase, Sachiko; Nagaie, Satoshi; Saito, Tomo; Nagami, Fuji; Minegishi, Naoko; Suzuki, Yoichi; Suzuki, Kichiya; Hashizume, Hiroaki; Kuriyama, Shinichi; Hozawa, Atsushi; Yaegashi, Nobuo; Kure, Shigeo; Tamiya, Gen; Kawaguchi, Yoshio; Tanaka, Hiroshi; Yamamoto, Masayuki

    2017-07-06

    With the goal of realizing genome-based personalized healthcare, we have developed a biobank that integrates personal health, genome, and omics data along with biospecimens donated by 150,000 volunteers. Such large-scale data integration involves obvious risks of privacy violation. The research use of personal genome and health information is a topic of global discussion with regard to the protection of privacy while promoting scientific advancement. The present paper reports on our plans, current attempts, and accomplishments in addressing security problems involved in data sharing to ensure donor privacy while promoting scientific advancement. Biospecimens and data have been collected in prospective cohort studies under a comprehensive agreement. The sample size of 150,000 participants was required for multiple research uses, including genome-wide screening of gene-by-environment interactions, haplotype phasing, and parametric linkage analysis. We established the Tohoku Medical Megabank (TMM) data sharing policy: a privacy protection rule that requires physical, personnel, and technological safeguards against privacy violation regarding the use and sharing of data. The proposed policy refers to that of NCBI and that of the Sanger Institute. The proposed policy classifies shared data according to the strength of re-identification risks. Local committees organized by TMM evaluate re-identification risk and assign a security category to a dataset. Every dataset is stored in an assigned segment of a supercomputer in accordance with its security category. A security manager should be designated to handle all security problems at individual data use locations. The proposed policy requires closed networks and IP-VPN remote connections. The mission of the biobank is to distribute biological resources most productively. This mission motivated us to collect biospecimens and health data and simultaneously analyze genome/omics data in-house. The biobank also has the mission of improving the quality and quantity of the contents of the biobank. This motivated us to request users to share the results of their research as feedback to the biobank. The TMM data sharing policy has tackled every security problem originating with these missions. We believe our current implementation to be the best way to protect privacy in data sharing.

  16. 2016 Offshore Wind Energy Resource Assessment for the United States

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musial, Walt; Heimiller, Donna; Beiter, Philipp

    2016-09-01

    This report, the 2016 Offshore Wind Energy Resource Assessment for the United States, was developed by the National Renewable Energy Laboratory. It updates and refines a previous national resource assessment study and reaffirms that the available wind resource is sufficient for offshore wind to be a large-scale contributor to the nation's electric energy supply.

  17. The Development of GIS Educational Resources Sharing among Central Taiwan Universities

    NASA Astrophysics Data System (ADS)

    Chou, T.-Y.; Yeh, M.-L.; Lai, Y.-C.

    2011-09-01

    Using GIS in the classroom enhances students' computer skills and broadens their range of knowledge. This paper highlights GIS integration on an e-learning platform and introduces a variety of educational resources. The research project demonstrates tools for an e-learning environment and delivers case studies of learning interaction from central Taiwan universities. Feng Chia University (FCU) obtained a remarkable academic project subsidized by the Ministry of Education and developed an e-learning platform for excellence in teaching and learning programs among central Taiwan's universities. The aim of the project is to integrate the educational resources of 13 universities in central Taiwan, with FCU serving as the hub university. To overcome the problem of distance, e-platforms have been established to create collaborative, enhanced learning experiences. The e-platforms coordinate web service access across the educational community and deliver GIS educational resources. Most of the GIS-related courses cover the development of GIS, principles of cartography, spatial data analysis and overlay, terrain analysis, buffer analysis, 3D GIS applications, remote sensing, GPS technology, WebGIS, MobileGIS, and ArcGIS operation. In each GIS case study, students are taught to understand the geographic context, collect spatial data, and then use ArcGIS software to analyze the data. One of the e-learning platforms provides lesson plans and presentation slides, so students can learn ArcGIS online. As they analyze spatial data, they can connect to the GIS hub to obtain the data they need, including satellite images, aerial photos, and vector data. Moreover, the e-learning platforms provide solutions and resources, and different levels of image scale have been integrated into the systems. Multi-scale spatial development and analyses in central Taiwan integrate academic research resources among CTTLRC partners, thereby establishing a decision-making support mechanism for teaching and learning and accelerating communication, cooperation, and sharing among academic units.

  18. Cloud-based crowd sensing: a framework for location-based crowd analyzer and advisor

    NASA Astrophysics Data System (ADS)

    Aishwarya, K. C.; Nambi, A.; Hudson, S.; Nadesh, R. K.

    2017-11-01

    Cloud computing is an emerging field of computer science that integrates large, powerful computing and storage systems for personal as well as enterprise requirements. Mobile cloud computing extends this concept to mobile handheld devices. Crowdsensing, or more precisely mobile crowdsensing, is the process of sharing resources such as data, memory and bandwidth from an available group of mobile handheld devices to perform a single task for collective reasons. In this paper, we propose a framework that uses crowdsensing to analyze the crowd at a location and advise the user on whether to go there. This is ongoing research in a new direction toward which cloud computing has shifted, and it is viable for further expansion in the near future.

  19. Mechanisation of large-scale agricultural fields in developing countries - a review.

    PubMed

    Onwude, Daniel I; Abdulstter, Rafia; Gomes, Chandima; Hashim, Norhashila

    2016-09-01

    Mechanisation of large-scale agricultural fields often requires the application of modern technologies such as mechanical power, automation, control and robotics. These technologies are generally associated with relatively well developed economies. The application of these technologies in some developing countries in Africa and Asia is limited by factors such as technology compatibility with the environment, availability of resources to facilitate technology adoption, cost of technology purchase, government policies, adequacy of the technology and its appropriateness in addressing the needs of the population. As a result, many of the available resources have been used inadequately by farmers, who continue to rely mostly on conventional means of agricultural production, using traditional tools and equipment in most cases. This has led to low productivity and high cost of production, among other problems. This paper therefore evaluates the application of present-day technology and its limitations to the advancement of large-scale mechanisation in developing countries of Africa and Asia. Particular emphasis is given to a general understanding of the various levels of mechanisation, present-day technology, its management and its application to large-scale agricultural fields. The review also emphasizes a future outlook that would enable a gradual, evolutionary and sustainable technological change. The study concludes that large-scale agricultural farm mechanisation for sustainable food production in Africa and Asia must be anchored in a coherent strategy based on the actual needs and priorities of large-scale farmers. © 2016 Society of Chemical Industry.

  20. Using the satellite-derived normalized difference vegetation index (NDVI) to explain ranging patterns in a lek-breeding antelope: the importance of scale.

    PubMed

    Bro-Jørgensen, Jakob; Brown, Molly E; Pettorelli, Nathalie

    2008-11-01

    Lek-breeding species are characterized by a negative association between territorial resource availability and male mating success; however, the impact of resources on the overall distribution patterns of the two sexes in lek systems is not clear. The normalized difference vegetation index (NDVI) has recently emerged as a powerful proxy measure for primary productivity, allowing the links between the distributions of animals and resources to be explored. Using NDVI at four spatial resolutions, we here investigate how the distribution of the two sexes in a lek-breeding population of topi antelopes relates to resource abundance before and during the rut. We found that in the dry season preceding the rut, topi density correlated positively with NDVI at the large, but not the fine, scale. This suggests that before the rut, when resources were relatively scant, topi preferred pastures where green grass was widely abundant. The pattern was less pronounced in males, suggesting that the need for territorial attendance prevents males from tracking resources as freely as females do. During the rut, which occurs in the wet season, both male and female densities correlated negatively with NDVI at the fine scale. At this time, resources were generally plentiful and the results suggest that, rather than by resource maximization, distribution during the rut was determined by benefits of aggregating on relatively resource-poor leks for mating, and possibly antipredator, purposes. At the large scale, no correlation between density and NDVI was found during the rut in either sex, which can be explained by leks covering areas too small to be reflected at this resolution. The study illustrates that when investigating spatial organization, it is important: (1) to choose the appropriate analytic scale, and (2) to consider behavioural as well as strictly ecological factors.
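
    To illustrate the kind of scale-dependent analysis described above, the sketch below block-averages two co-registered rasters (stand-ins for animal density and NDVI) to a coarser resolution and computes the Spearman correlation at each scale. It is a generic illustration with synthetic data, not the analysis code used in the study.

```python
# Minimal sketch of a scale-dependent density-NDVI correlation, assuming two
# co-registered rasters (animal density and NDVI) on a fine grid.
# Illustrative only; not the study's analysis code.
import numpy as np
from scipy.stats import spearmanr

def block_mean(grid: np.ndarray, factor: int) -> np.ndarray:
    """Aggregate a 2-D grid to a coarser resolution by block averaging."""
    h, w = grid.shape
    h2, w2 = h // factor * factor, w // factor * factor
    trimmed = grid[:h2, :w2]
    return trimmed.reshape(h2 // factor, factor, w2 // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(0)
ndvi = rng.random((120, 120))                        # synthetic NDVI raster
density = 0.4 * ndvi + 0.6 * rng.random((120, 120))  # synthetic animal density

for factor in (1, 10):  # fine scale vs. coarse scale
    d = block_mean(density, factor).ravel()
    n = block_mean(ndvi, factor).ravel()
    rho, p = spearmanr(d, n)
    print(f"aggregation x{factor}: Spearman rho={rho:.2f}, p={p:.3g}")
```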

  1. Exploring clouds, weather, climate, and modeling using bilingual content and activities from the Windows to the Universe program and the Center for Multiscale Modeling of Atmospheric Processes

    NASA Astrophysics Data System (ADS)

    Foster, S. Q.; Johnson, R. M.; Randall, D.; Denning, S.; Russell, R.; Gardiner, L.; Hatheway, B.; Genyuk, J.; Bergman, J.

    2008-12-01

    Inadequate representation of cloud processes in climate models has been one of the most important limitations on the reliability of climate-change simulations. Now in its third year, the National Science Foundation-funded Center for Multi-scale Modeling of Atmospheric Processes (CMMAP) at Colorado State University is addressing this problem through a revolutionary new approach to representing cloud processes on their native scales, including the cloud-scale interaction processes that are active in cloud systems. CMMAP has set ambitious education and human-resource goals to share basic information about the atmosphere, clouds, weather, climate, and modeling with diverse K-12 and public audiences through its affiliation with the Windows to the Universe (W2U) program at the University Corporation for Atmospheric Research (UCAR). W2U web pages are written at three levels in English and Spanish. This information targets learners at all levels, educators, and families who seek to understand and share resources and information about the nature of weather and the climate system, and career role models from related research fields. The resource can also help educators build bridges in the classroom between the sciences, the arts, and literacy. Visitors to W2U's CMMAP web portal can access a beautiful new clouds image gallery; information about each cloud type and the atmospheric processes that produce them; a Clouds in Art interactive; collections of weather-themed poetry, art, and myths; links to games and puzzles for children; and extensive classroom-ready resources and activities for K-12 teachers. Biographies of CMMAP scientists and graduate students are featured. Basic science concepts important to understanding the atmosphere, such as condensation, atmospheric pressure, and lapse rate, have been developed, as well as 'microworlds' that enable students to interact with experimental tools while building fundamental knowledge. These resources can be accessed online at no cost by the entire atmospheric science K-12 and informal science education community.

  2. Collaboratively Architecting a Scalable and Adaptable Petascale Infrastructure to Support Transdisciplinary Scientific Research for the Australian Earth and Environmental Sciences

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Evans, B. J. K.; Pugh, T.; Lescinsky, D. T.; Foster, C.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) at the Australian National University (ANU) is a partnership between CSIRO, ANU, the Bureau of Meteorology (BoM) and Geoscience Australia. Recent investments in a 1.2 PFlop supercomputer (Raijin), ~20 PB of data storage using Lustre filesystems and a 3000-core high-performance cloud have created a hybrid platform for high-performance computing and data-intensive science, enabling large-scale earth and climate systems modelling and analysis. There are >3000 users actively logging in and >600 projects on the NCI system. Efficiently scaling and adapting data and software systems to petascale infrastructures requires the collaborative development of an architecture that is designed, programmed and operated to enable users to interactively invoke different forms of in-situ computation over complex and large-scale data collections. NCI makes available major and long-tail data collections from both the government and research sectors based on six themes: 1) weather, climate and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology and 6) astronomy, bio and social sciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. Collections are the operational form for data management and access, and similar data types from individual custodians are managed cohesively. Use of international standards for discovery and interoperability allows complex interactions within and between the collections. This design facilitates a transdisciplinary approach to research and enables a shift from small-scale, 'stove-piped' science efforts to large-scale, collaborative systems science. This new and complex infrastructure requires a move to shared, globally trusted software frameworks that can be maintained and updated. Workflow engines become essential and need to integrate provenance, versioning, traceability, repeatability and publication. There are also human-resource challenges, as highly skilled HPC/HPD specialists, specialist programmers, and data scientists are required whose skills can support scaling to the new paradigm of effective and efficient data-intensive earth science analytics on petascale, and soon exascale, systems.

  3. On-demand high-capacity ride-sharing via dynamic trip-vehicle assignment

    PubMed Central

    Alonso-Mora, Javier; Samaranayake, Samitha; Wallar, Alex; Frazzoli, Emilio; Rus, Daniela

    2017-01-01

    Ride-sharing services are transforming urban mobility by providing timely and convenient transportation to anybody, anywhere, and anytime. These services present enormous potential for positive societal impacts with respect to pollution, energy consumption, congestion, etc. Current mathematical models, however, do not fully address the potential of ride-sharing. Recently, a large-scale study highlighted some of the benefits of car pooling but was limited to static routes with two riders per vehicle (optimally) or three (with heuristics). We present a more general mathematical model for real-time high-capacity ride-sharing that (i) scales to large numbers of passengers and trips and (ii) dynamically generates optimal routes with respect to online demand and vehicle locations. The algorithm starts from a greedy assignment and improves it through a constrained optimization, quickly returning solutions of good quality and converging to the optimal assignment over time. We quantify experimentally the tradeoff between fleet size, capacity, waiting time, travel delay, and operational costs for low- to medium-capacity vehicles, such as taxis and van shuttles. The algorithm is validated with ∼3 million rides extracted from the New York City taxicab public dataset. Our experimental study considers ride-sharing with rider capacity of up to 10 simultaneous passengers per vehicle. The algorithm applies to fleets of autonomous vehicles and also incorporates rebalancing of idling vehicles to areas of high demand. This framework is general and can be used for many real-time multivehicle, multitask assignment problems. PMID:28049820

  4. On-demand high-capacity ride-sharing via dynamic trip-vehicle assignment.

    PubMed

    Alonso-Mora, Javier; Samaranayake, Samitha; Wallar, Alex; Frazzoli, Emilio; Rus, Daniela

    2017-01-17

    Ride-sharing services are transforming urban mobility by providing timely and convenient transportation to anybody, anywhere, and anytime. These services present enormous potential for positive societal impacts with respect to pollution, energy consumption, congestion, etc. Current mathematical models, however, do not fully address the potential of ride-sharing. Recently, a large-scale study highlighted some of the benefits of car pooling but was limited to static routes with two riders per vehicle (optimally) or three (with heuristics). We present a more general mathematical model for real-time high-capacity ride-sharing that (i) scales to large numbers of passengers and trips and (ii) dynamically generates optimal routes with respect to online demand and vehicle locations. The algorithm starts from a greedy assignment and improves it through a constrained optimization, quickly returning solutions of good quality and converging to the optimal assignment over time. We quantify experimentally the tradeoff between fleet size, capacity, waiting time, travel delay, and operational costs for low- to medium-capacity vehicles, such as taxis and van shuttles. The algorithm is validated with ∼3 million rides extracted from the New York City taxicab public dataset. Our experimental study considers ride-sharing with rider capacity of up to 10 simultaneous passengers per vehicle. The algorithm applies to fleets of autonomous vehicles and also incorporates rebalancing of idling vehicles to areas of high demand. This framework is general and can be used for many real-time multivehicle, multitask assignment problems.
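
    To convey the "start from a greedy assignment, then improve it" structure described in the abstract, the toy sketch below assigns trips to vehicles greedily and then applies a pairwise-swap improvement pass. The cost matrix, capacity handling, and local search are deliberate simplifications and are not the published anytime optimal algorithm.

```python
# Toy sketch of the "greedy assignment, then improve" idea behind anytime
# trip-vehicle assignment. Costs, capacities, and the improvement step are
# simplified stand-ins, not the published algorithm.
from itertools import permutations

def greedy_assign(cost, capacity):
    """Assign each trip to the cheapest vehicle with remaining capacity."""
    load = [0] * len(cost)
    assignment = {}
    for trip in range(len(cost[0])):
        best = min((v for v in range(len(cost)) if load[v] < capacity),
                   key=lambda v: cost[v][trip])
        assignment[trip] = best
        load[best] += 1
    return assignment

def improve(assignment, cost):
    """One pass of pairwise trip swaps that reduce total cost (local search)."""
    for a, b in permutations(list(assignment), 2):
        va, vb = assignment[a], assignment[b]
        if cost[va][a] + cost[vb][b] > cost[vb][a] + cost[va][b]:
            assignment[a], assignment[b] = vb, va
    return assignment

# cost[v][t]: detour cost of serving trip t with vehicle v (made-up numbers)
cost = [[4, 9, 3], [7, 2, 8]]
assignment = improve(greedy_assign(cost, capacity=2), cost)
total = sum(cost[v][t] for t, v in assignment.items())
print(assignment, total)
```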

  5. Contention Modeling for Multithreaded Distributed Shared Memory Machines: The Cray XMT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Secchi, Simone; Tumeo, Antonino; Villa, Oreste

    Distributed Shared Memory (DSM) machines are a wide class of multi-processor computing systems where a large virtually-shared address space is mapped on a network of physically distributed memories. High memory latency and network contention are two of the main factors that limit performance scaling of such architectures. Modern high-performance computing DSM systems have evolved toward exploitation of massive hardware multi-threading and fine-grained memory hashing to tolerate irregular latencies, avoid network hot-spots and enable high scaling. In order to model the performance of such large-scale machines, parallel simulation has been proved to be a promising approach to achieve good accuracy in reasonable times. One of the most critical factors in solving the simulation speed-accuracy trade-off is network modeling. The Cray XMT is a massively multi-threaded supercomputing architecture that belongs to the DSM class, since it implements a globally-shared address space abstraction on top of a physically distributed memory substrate. In this paper, we discuss the development of a contention-aware network model intended to be integrated in a full-system XMT simulator. We start by measuring the effects of network contention in a 128-processor XMT machine and then investigate the trade-off that exists between simulation accuracy and speed, by comparing three network models which operate at different levels of accuracy. The comparison and model validation is performed by executing a string-matching algorithm on the full-system simulator and on the XMT, using three datasets that generate noticeably different contention patterns.

  6. A lack of crowding? Body size does not decrease with density for two behavior-manipulating parasites

    USGS Publications Warehouse

    Weinersmith, KL; Warinner, Chloe B.; Tan, Virgina; Harris, David J.; Mora, Adrienne B.; Kuris, Armand M.; Lafferty, Kevin D.; Hechinger, Ryan F.

    2014-01-01

    For trophically transmitted parasites that manipulate the phenotype of their hosts, whether the parasites do or do not experience resource competition depends on such factors as the size of the parasites relative to their hosts, the intensity of infection, the extent to which parasites share the cost of defending against the host’s immune system or manipulating their host, and the extent to which parasites share transmission goals. Despite theoretical expectations for situations in which either no, or positive, or negative density-dependence should be observed, most studies document only negative density-dependence for trophically transmitted parasites. However, this trend may be an artifact of most studies having focused on systems in which parasites are large relative to their hosts. Yet, systems are common where parasites are small relative to their hosts, and these trophically transmitted parasites may be less likely to experience resource limitation. We looked for signs of density-dependence in Euhaplorchis californiensis (EUHA) and Renicola buchanani (RENB), two manipulative trematode parasites infecting wild-caught California killifish (Fundulus parvipinnis). These parasites are small relative to killifish (suggesting resources are not limiting), and are associated with changes in killifish behavior that are dependent on parasite-intensity and that increase predation rates by the parasites’ shared final host (indicating the possibility for cost sharing). We did not observe negative density-dependence in either species, indicating that resources are not limiting. In fact, observed patterns indicate possible mild positive density-dependence for EUHA. Although experimental confirmation is required, our findings suggest that some behavior-manipulating parasites suffer no reduction in size, and may even benefit when "crowded" by conspecifics.

  7. Hybrid MPI+OpenMP Programming of an Overset CFD Solver and Performance Investigations

    NASA Technical Reports Server (NTRS)

    Djomehri, M. Jahed; Jin, Haoqiang H.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    This report describes a two-level parallelization of a Computational Fluid Dynamics (CFD) solver with multi-zone overset structured grids. The approach is based on a hybrid MPI+OpenMP programming model suitable for shared-memory machines and clusters of shared-memory machines. Performance investigations of the hybrid application on an SGI Origin2000 (O2K) machine are reported using medium and large scale test problems.

  8. Harmonising phenomics information for a better interoperability in the rare disease field.

    PubMed

    Maiella, Sylvie; Olry, Annie; Hanauer, Marc; Lanneau, Valérie; Lourghi, Halima; Donadille, Bruno; Rodwell, Charlotte; Köhler, Sebastian; Seelow, Dominik; Jupp, Simon; Parkinson, Helen; Groza, Tudor; Brudno, Michael; Robinson, Peter N; Rath, Ana

    2018-02-07

    HIPBI-RD (Harmonising phenomics information for a better interoperability in the rare disease field) is a three-year project which started in 2016 funded via the E-Rare 3 ERA-NET program. This project builds on three resources largely adopted by the rare disease (RD) community: Orphanet, its ontology ORDO (the Orphanet Rare Disease Ontology), HPO (the Human Phenotype Ontology) as well as PhenoTips software for the capture and sharing of structured phenotypic data for RD patients. Our project is further supported by resources developed by the European Bioinformatics Institute and the Garvan Institute. HIPBI-RD aims to provide the community with an integrated, RD-specific bioinformatics ecosystem that will harmonise the way phenomics information is stored in databases and patient files worldwide, and thereby contribute to interoperability. This ecosystem will consist of a suite of tools and ontologies, optimized to work together, and made available through commonly used software repositories. The project workplan follows three main objectives: The HIPBI-RD ecosystem will contribute to the interpretation of variants identified through exome and full genome sequencing by harmonising the way phenotypic information is collected, thus improving diagnostics and delineation of RD. The ultimate goal of HIPBI-RD is to provide a resource that will contribute to bridging genome-scale biology and a disease-centered view on human pathobiology. Achievements in Year 1. Copyright © 2018. Published by Elsevier Masson SAS.

  9. Leveraging Resources to Address Transportation Needs: Transportation Pooled Fund Program

    DOT National Transportation Integrated Search

    2004-05-28

    This brochure describes the Transportation Pooled Fund (TPF) Program. The objectives of the TPF Program are to leverage resources, avoid duplication of effort, undertake large-scale projects, obtain greater input on project definition, achieve broade...

  10. Achieving Land, Energy, and Environmental Compatibility: Utility-Scale Solar Energy Potential and Land-Use in California

    NASA Astrophysics Data System (ADS)

    Hoffacker, M. K.; Hernandez, R. R.; Field, C. B.

    2013-12-01

    Solar energy is an archetype renewable energy technology with great potential to reduce greenhouse gas emissions when substituted for carbon-intensive energy. Utility-scale solar energy (USSE; i.e., > 1 MW) necessitates large quantities of space making the efficient use of land for USSE development critical to realizing its full potential. However, studies elucidating the interaction between land-use and utility-scale solar energy (USSE) are limited. In this study, we assessed 1) the theoretical and technical potential of terrestrial-based USSE systems, and 2) land-use and land-cover change impacts from actual USSE installations (> 20 MW; planned, under construction, operating), using California as a case study due to its early adoption of renewable energy systems, unique constraints on land availability, immense energy demand, and vast natural resources. We used topo-climatic (e.g., slope, irradiance), infrastructural (e.g., proximity to transmission lines), and ecological constraints (e.g., threatened and endangered species) to determine highly favorable, favorable, and unfavorable locations for USSE and to assess its technical potential. We found that the theoretical potential of photovoltaic (PV) and concentrating solar power (CSP) in California is 26,097 and 29,422 kWh/m2/day, respectively. We identified over 150 planned, under construction, and operating USSE installations in California, ranging in size from 20 to 1,000 MW. Currently, 29% are located on shrub- and scrublands, 23% on cultivated crop land, 13% on pasture/hay areas, 11% on grassland/herbaceous and developed open space, and 7% in the built environment. Understanding current land-use decisions of USSE systems and assessing its future potential can be instructive for achieving land, energy, and environmental compatibility, especially for other global regions that share similar resource demands and limitations.

  11. Integrating land and resource management plans and applied large-scale research on two national forests

    Treesearch

    Callie Jo Schweitzer; Stacy Clark; Glen Gaines; Paul Finke; Kurt Gottschalk; David Loftis

    2008-01-01

    Researchers working out of the Southern and Northern Research Stations have partnered with two National Forests to conduct two large-scale studies designed to assess the effectiveness of silvicultural techniques used to restore and maintain upland oak (Quercus spp.)-dominated ecosystems in the Cumberland Plateau Region of the southeastern United...

  12. Scalable Iterative Classification for Sanitizing Large-Scale Datasets

    PubMed Central

    Li, Bo; Vorobeychik, Yevgeniy; Li, Muqun; Malin, Bradley

    2017-01-01

    Cheap ubiquitous computing enables the collection of massive amounts of personal data in a wide variety of domains. Many organizations aim to share such data while obscuring features that could disclose personally identifiable information. Much of this data exhibits weak structure (e.g., text), such that machine learning approaches have been developed to detect and remove identifiers from it. While learning is never perfect, and relying on such approaches to sanitize data can leak sensitive information, a small risk is often acceptable. Our goal is to balance the value of published data and the risk of an adversary discovering leaked identifiers. We model data sanitization as a game between 1) a publisher who chooses a set of classifiers to apply to data and publishes only instances predicted as non-sensitive and 2) an attacker who combines machine learning and manual inspection to uncover leaked identifying information. We introduce a fast iterative greedy algorithm for the publisher that ensures a low utility for a resource-limited adversary. Moreover, using five text data sets we illustrate that our algorithm leaves virtually no automatically identifiable sensitive instances for a state-of-the-art learning algorithm, while sharing over 93% of the original data, and completes after at most 5 iterations. PMID:28943741
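
    As a rough, non-game-theoretic skeleton of the iterative classification idea, the sketch below repeatedly trains a sensitivity classifier, withholds the records it flags, adds them to the training pool, and publishes what remains once nothing is flagged. It uses scikit-learn as a stand-in and invented example records; it is not the algorithm from the paper.

```python
# Simplified skeleton of iterative classification for data sanitization:
# repeatedly train a sensitivity classifier and withhold flagged records.
# A stand-in for the idea only; the published method is game-theoretic and
# considerably more involved.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def sanitize(train_texts, train_labels, candidates, max_iter=5):
    """Iteratively train a sensitivity classifier and withhold flagged records.
    A label of 1 means the record leaks an identifier (e.g., from manual audit)."""
    train_texts, train_labels = list(train_texts), list(train_labels)
    published = list(candidates)
    vec = TfidfVectorizer()
    for _ in range(max_iter):
        clf = LogisticRegression(max_iter=1000).fit(
            vec.fit_transform(train_texts), train_labels)
        preds = clf.predict(vec.transform(published))
        flagged = [text for text, p in zip(published, preds) if p == 1]
        if not flagged:          # nothing predicted sensitive: publish the rest
            break
        train_texts += flagged   # treat withheld records as new positive examples
        train_labels += [1] * len(flagged)
        published = [text for text in published if text not in flagged]
    return published

train = ["name: John Smith", "glucose 5.1", "call 555-0100", "bp stable"]
labels = [1, 0, 1, 0]
candidates = ["contact Jane Doe", "hemoglobin 13.2", "phone 555-0199", "no acute distress"]
print(sanitize(train, labels, candidates))
```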

  13. B-CAN: a resource sharing platform to improve the operation, visualization and integrated analysis of TCGA breast cancer data.

    PubMed

    Wen, Can-Hong; Ou, Shao-Min; Guo, Xiao-Bo; Liu, Chen-Feng; Shen, Yan-Bo; You, Na; Cai, Wei-Hong; Shen, Wen-Jun; Wang, Xue-Qin; Tan, Hai-Zhu

    2017-12-12

    Breast cancer is a high-risk heterogeneous disease with myriad subtypes and complicated biological features. The Cancer Genome Atlas (TCGA) breast cancer database provides researchers with large-scale genomic and clinical data via web portals and FTP services. Researchers can gain new insights into their fields and evaluate experimental discoveries with TCGA. However, researchers with little experience in databases and bioinformatics find it difficult to access and operate on the data because of TCGA's complex data formats and diverse files. For ease of use, we built the breast cancer (B-CAN) platform, which enables data customization, data visualization, and a private data center. The B-CAN platform runs on an Apache server and interacts with a back-end MySQL database via PHP. Users can customize data based on their needs by combining tables from the original TCGA database and selecting variables from each table. The private data center is applicable to private data and two types of customized data. A key feature of B-CAN is that it provides single-table and multiple-table displays; customized data with one barcode corresponding to many records, as well as processed customized data, are handled in the multiple-table display. B-CAN is an intuitive and highly efficient data-sharing platform.
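
    B-CAN itself is implemented in PHP on MySQL; as a language-neutral analogy of the "combine tables, then select variables" customization it describes, the pandas sketch below joins a clinical table to an expression table on a TCGA-style barcode (one barcode mapping to many records) and keeps a few variables. The column names and values are invented for illustration.

```python
# Analogy (in pandas rather than B-CAN's PHP/MySQL stack) of customizing data
# by joining tables on a TCGA-style barcode and selecting variables of interest.
# Column names and values are made up for illustration.
import pandas as pd

clinical = pd.DataFrame({
    "barcode": ["TCGA-A1-0001", "TCGA-A1-0002"],
    "age_at_diagnosis": [52, 67],
    "er_status": ["Positive", "Negative"],
})
expression = pd.DataFrame({
    "barcode": ["TCGA-A1-0001", "TCGA-A1-0001", "TCGA-A1-0002"],
    "gene": ["ESR1", "ERBB2", "ESR1"],
    "log2_expression": [8.4, 5.1, 2.3],
})

# One barcode may map to many expression records (a "multiple table" view).
customized = clinical.merge(expression, on="barcode")[
    ["barcode", "er_status", "gene", "log2_expression"]
]
print(customized)
```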

  14. Managing aquatic ecosystems and water resources under multiple stress--an introduction to the MARS project.

    PubMed

    Hering, Daniel; Carvalho, Laurence; Argillier, Christine; Beklioglu, Meryem; Borja, Angel; Cardoso, Ana Cristina; Duel, Harm; Ferreira, Teresa; Globevnik, Lidija; Hanganu, Jenica; Hellsten, Seppo; Jeppesen, Erik; Kodeš, Vit; Solheim, Anne Lyche; Nõges, Tiina; Ormerod, Steve; Panagopoulos, Yiannis; Schmutz, Stefan; Venohr, Markus; Birk, Sebastian

    2015-01-15

    Water resources globally are affected by a complex mixture of stressors resulting from a range of drivers, including urban and agricultural land use, hydropower generation and climate change. Understanding how stressors interfere and impact upon ecological status and ecosystem services is essential for developing effective River Basin Management Plans and shaping future environmental policy. This paper details the nature of these problems for Europe's water resources and the need to find solutions at a range of spatial scales. In terms of the latter, we describe the aims and approaches of the EU-funded project MARS (Managing Aquatic ecosystems and water Resources under multiple Stress) and the conceptual and analytical framework that it is adopting to provide this knowledge, understanding and tools needed to address multiple stressors. MARS is operating at three scales: At the water body scale, the mechanistic understanding of stressor interactions and their impact upon water resources, ecological status and ecosystem services will be examined through multi-factorial experiments and the analysis of long time-series. At the river basin scale, modelling and empirical approaches will be adopted to characterise relationships between multiple stressors and ecological responses, functions, services and water resources. The effects of future land use and mitigation scenarios in 16 European river basins will be assessed. At the European scale, large-scale spatial analysis will be carried out to identify the relationships amongst stress intensity, ecological status and service provision, with a special focus on large transboundary rivers, lakes and fish. The project will support managers and policy makers in the practical implementation of the Water Framework Directive (WFD), of related legislation and of the Blueprint to Safeguard Europe's Water Resources by advising the 3rd River Basin Management Planning cycle, the revision of the WFD and by developing new tools for diagnosing and predicting multiple stressors. Copyright © 2014. Published by Elsevier B.V.

  15. Adaptive responses and disruptive effects: how major wildfire influences kinship-based social interactions in a forest marsupial.

    PubMed

    Banks, Sam C; Blyton, Michaela D J; Blair, David; McBurney, Lachlan; Lindenmayer, David B

    2012-02-01

    Environmental disturbance is predicted to play a key role in the evolution of animal social behaviour. This is because disturbance affects key factors underlying social systems, such as demography, resource availability and genetic structure. However, because natural disturbances are unpredictable there is little information on their effects on social behaviour in wild populations. Here, we investigated how a major wildfire affected cooperation (sharing of hollow trees) by a hollow-dependent marsupial. We based two alternative social predictions on the impacts of fire on population density, genetic structure and resources. We predicted an adaptive social response from previous work showing that kin selection in den-sharing develops as competition for den resources increases. Thus, kin selection should occur in burnt areas because the fire caused loss of the majority of hollow-bearing trees, but no detectable mortality. Alternatively, fire may have a disruptive social effect, whereby postfire home range-shifts 'neutralize' fine-scale genetic structure, thereby removing opportunities for kin selection between neighbours. Both predictions occurred: the disruptive social effect in burnt habitat and the adaptive social response in adjacent unburnt habitat. The latter followed a massive demographic influx to unburnt 'refuge' habitat that increased competition for dens, leading to a density-related kin selection response. Our results show remarkable short-term plasticity of animal social behaviour and demonstrate how the social effects of disturbance extend into undisturbed habitat owing to landscape-scale demographic shifts. We predicted long-term changes in kinship-based cooperative behaviour resulting from the genetic and resource impacts of forecast changes to fire regimes in these forests. © 2011 Blackwell Publishing Ltd.

  16. Current and future trends of Volcanology in Italy and abroad

    NASA Astrophysics Data System (ADS)

    Papale, P.

    2010-12-01

    Volcanology in Italy and in the world has developed rapidly during recent decades. In the Seventies, stratigraphy and petrology provided the basic knowledge of volcanic activity that still forms the root of modern volcano research. During the Eighties and Nineties the interest shifted to the quantitative description of volcanic processes, with enormous progress in different but complementary fields including laboratory measurements and experiments, physico-mathematical modeling and numerical simulations, geophysical surveys and inverse analysis, and volcano monitoring and surveillance. By the year 2000 a large number of magma properties and magmatic and volcanic processes had been characterized to a first or higher order. Volcano research in Italy during the first decade of the new millennium has developed further along those lines. To date, the very-high-risk Campi Flegrei and Vesuvius volcanoes, and the less risky but permanently active Etna and Stromboli volcanoes, are among the best monitored and most deeply investigated worldwide. The last decade has also seen coordinated efforts aimed at exploring the exploitation of knowledge and skills for the benefit of society. A series of projects focused on volcanic hazard and risk has joined >1000 researchers from Italian and foreign (European, US, Japanese) universities and research centers on themes and objectives jointly defined by scientists from INGV and end-users from the national Civil Protection Department. These projects provide a global picture of volcano research in 2010, which appears to be evolving through i) further rapid developments in the fields of investigation listed above, ii) their merging into effective multidisciplinary approaches, and iii) the full inclusion of the concepts of uncertainty and probability in volcanic scenario predictions and hazard forecasts. The latter reflects the large inaccessibility of volcanic systems, the extreme non-linear behaviour of volcanic processes brought to light by numerical studies, and the need to communicate in a formal and structured way the uncertain nature of volcanic predictions to emergency management authorities. Projections to the year 2020 suggest a progressive relevance of structured volcano databases, which will provide large-scale sharing of basic knowledge and data for statistical analyses, as epidemiological databases do in medicine; full coverage of the frequency range of geophysical and geochemical signals at active volcanoes, not yet fully achieved today; the development of standard volcano models and of global volcano simulator resources and tools, allowing separate sets of observations to be organized in a consistent global picture of volcano dynamics; the further development of methods for the evaluation of probabilistic scenarios and their organization in event tree systems and hazard forecasting tools; the creation of large-scale volcano infrastructures for the sharing of laboratory and computational resources; and the definition of international best practices for volcanic hazard and risk evaluation and for emergency preparedness and response activities. Recent initiatives in Italy and Europe (e.g., EPOS, DIVO, INGV-DPC, Exploris, and others) are developing largely along these lines, providing a view of the expected progress in volcanology in the next decade.

  17. Quantitative variability of renewable energy resources in Norway

    NASA Astrophysics Data System (ADS)

    Christakos, Konstantinos; Varlas, George; Cheliotis, Ioannis; Aalstad, Kristoffer; Papadopoulos, Anastasios; Katsafados, Petros; Steeneveld, Gert-Jan

    2017-04-01

    Based on European Union (EU) targets for 2030, the share of renewable energy (RE) in consumption should be increased to 27%. RE resources such as hydropower, wind, wave power and solar power depend strongly on the chaotic behavior of weather conditions and climate. Due to this dependency, predicting the spatiotemporal variability of RE resources is a more crucial factor than for other energy resources (e.g., carbon-based energy). The fluctuation of RE resources can affect the development of RE technologies, the energy grid, supply and prices. This study investigates the variability of potential RE resources in Norway. More specifically, hydropower, wind, wave, and solar power are quantitatively analyzed and correlated with respect to various spatial and temporal scales. To analyze these diversities and their interrelationships, reanalysis and observational data on wind, precipitation, waves, and solar radiation are used for a quantitative assessment. The results indicate a high variability of marine RE resources in the North Sea and the Norwegian Sea.

  18. Pushing HTCondor and glideinWMS to 200K+ Jobs in a Global Pool for CMS before Run 2

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Belforte, S.; Bockelman, B.; Gutsche, O.; Khan, F.; Larson, K.; Letts, J.; Mascheroni, M.; Mason, D.; McCrea, A.; Saiz-Santos, M.; Sfiligoi, I.

    2015-12-01

    The CMS experiment at the LHC relies on HTCondor and glideinWMS as its primary batch and pilot-based Grid provisioning system. So far we have been running several independent resource pools, but we are working on unifying them all to reduce the operational load and more effectively share resources between various activities in CMS. The major challenge of this unification activity is scale. The combined pool size is expected to reach 200K job slots, which is significantly bigger than any other multi-user HTCondor based system currently in production. To get there we have studied scaling limitations in our existing pools, the biggest of which tops out at about 70K slots, providing valuable feedback to the development communities, who have responded by delivering improvements which have helped us reach higher and higher scales with more stability. We have also worked on improving the organization and support model for this critical service during Run 2 of the LHC. This contribution will present the results of the scale testing and experiences from the first months of running the Global Pool.

  19. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    NASA Astrophysics Data System (ADS)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

    The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production takes the biggest part of the computing resources currently in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS experiment for Run 2 and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To benefit optimally from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  20. Methane hydrates and the future of natural gas

    USGS Publications Warehouse

    Ruppel, Carolyn

    2011-01-01

    For decades, gas hydrates have been discussed as a potential resource, particularly for countries with limited access to conventional hydrocarbons or a strategic interest in establishing alternative, unconventional gas reserves. Methane has never been produced from gas hydrates at a commercial scale and, barring major changes in the economics of natural gas supply and demand, commercial production at a large scale is considered unlikely to commence within the next 15 years. Given the overall uncertainty still associated with gas hydrates as a potential resource, they have not been included in the EPPA model in MITEI’s Future of Natural Gas report. Still, gas hydrates remain a potentially large methane resource and must necessarily be included in any consideration of the natural gas supply beyond two decades from now.

  1. Klamath Basin: A Watershed Approach to Support Habitat Restoration, Species Recovery, and Water Resource Planning

    USGS Publications Warehouse

    VanderKooi, S.P.; Thorsteinson, L.

    2007-01-01

    Water allocation among human and natural resource uses in the American West is challenging. Western rivers have been largely managed for hydropower, irrigation, drinking water, and navigation. Today land and water use practices have gained importance, particularly as aging dams face re-licensing requirements and provisions of the Endangered Species and Clean Water Acts. Rising demand for scarce water heightens the need for scientific research to predict the consequences of management actions on habitats, human resource use, and fish and wildlife. Climate change, introduction of invasive species, or restoration of fish passage can have large, landscape-scale consequences; research must expand to encompass the appropriate scale and, by applying multiple scientific disciplines to complex ecosystem challenges, improve the adaptive management framework for decision-making.

  2. Key Technology Research on Open Architecture for The Sharing of Heterogeneous Geographic Analysis Models

    NASA Astrophysics Data System (ADS)

    Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.

    2013-10-01

    In recent years, the increasing development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture tends to provide a centralized solution to end users, while all the required resources are often offered by large enterprises or special agencies; it is thus a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can package and deploy their models into the cloud conveniently, while model users can search, access and utilize those models with cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies, namely a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services, are discussed in detail, and related experiments are conducted for further verification.

  3. Are there shared environmental influences on attention-deficit/hyperactivity disorder? Reply to Wood, Buitelaar, Rijsdijk, Asherson, and Kuntsi [corrected] (2010).

    PubMed

    Burt, S Alexandra

    2010-05-01

    A recent large-scale meta-analysis of twin and adoption studies indicated that shared environmental influences make important contributions to most forms of child and adolescent psychopathology (Burt, 2009b). The sole exception to this robust pattern of results was observed for attention-deficit/hyperactivity disorder (ADHD), which appeared to be largely genetic (and particularly nonadditive genetic) in origin, with no observable influence of the shared environment. The central thesis of Wood, Buitelaar, Rijsdijk, Asherson, and Kuntsi [corrected] (2010) is that, contrary to these findings, shared environmental influences are important for ADHD. As evidence for this thesis, Wood et al. presented a summary of prior twin studies, followed by a discussion of 4 methodological issues that may account for my findings in Burt (2009b). I argue that, although the methodological concerns raised by Wood et al. are very important, they do not undermine my earlier results (Burt, 2009b). I close with a discussion of 2 issues that may allow for some shared environmental influences on ADHD. (c) 2010 APA, all rights reserved.

  4. Aquatic synthesis for Voyageurs National Park

    USGS Publications Warehouse

    Kallemeyn, Larry A.; Holmberg, Kerry L.; Perry, Jim A.; Odde, Beth Y.

    2003-01-01

    Voyageurs National Park (VOYA), which was established in 1975, contains significant aquatic resources, with about 50% of its total area of 883 km2 (341 mi2) consisting of aquatic habitats. In addition to the Park's 30 named lakes, there are numerous wetlands, including hundreds of beaver ponds. Due to the Park's size and location in the drainage basin, aquatic resources within the Park are particularly susceptible to activities and developments that occur outside its boundary. This is particularly true of the water quality and aquatic communities in the four large lakes that comprise 96% of the Park's total lake area of 34,400 ha (133 mi2). Because most Park activities center on the lakes, particularly the large lakes, resource managers need knowledge and understanding of VOYA's aquatic resources to effectively preserve, in an unimpaired condition, the ecological processes, biological and cultural diversity, and history of the northwoods, lake-country border shared with Canada.

  5. Army Logistician. Volume 34, Issue 2, March-April 2002

    DTIC Science & Technology

    2002-04-01

    potential coalition partners; and, • Leverage of U.S. resources through cost sharing and economies of scale." DOD guidance focuses on the broad goals of...be funded privately through the Army Historical Foundation. • Operations research and operations management. • Engineering economy, life-cycle cost...will be implemented this summer. A review of Army organizations below the headquarters level should be completed this spring. EDGEWOOD ENZYMATIC DECON

  6. Library Partnerships: Making Connections between School and Public Libraries

    ERIC Educational Resources Information Center

    Squires, Tasha

    2009-01-01

    Connecting to share ideas, resources, and programs offers school and public libraries an exciting means of achieving their own goals as well as those of the community at large. In this timely guide, young adult library consultant Tasha Squires delves into the many possible avenues for partnership, from summer reading programs to book talks to…

  7. Strategies and Exemplars for Public Outreach Events: Planning, Implementation, Evaluation

    NASA Astrophysics Data System (ADS)

    Cobb, W. H.; Buxner, S.; Shipp, S. S.; Shebby, S.

    2015-12-01

    Introduction: Each year the National Aeronautics and Space Administration (NASA) sponsors a variety of public outreach events to share information with educators, students, and the general public. These events are designed to increase interest in and awareness of the mission and goals of NASA. Planning and implementation best practices gleaned from NASA SMD Education's review of large-scale events, "Best Practices in Outreach Events," will be shared, as will outcomes from i C Ceres, an event celebrating the Dawn mission's arrival at the dwarf planet Ceres that utilized these strategies. The best practices included can be pertinent for all event organizers and evaluators regardless of event size. Background: The literature review focused on identifying evaluations of large-scale public outreach events and, within these evaluations, identifying best practices. The following criteria were used to identify journal articles and reports for potential inclusion: public, science-related events open to adults and children; events with more than 1,000 attendees; events that occurred during the last 5 years; evaluations that included information on data collected from visitors and/or volunteers; and evaluations that specified the type of data collected, the methodology, and the associated results. Planning and Implementation Best Practices: The literature review revealed key considerations for planning and implementing large-scale events. A summary of the related best practices follows: 1) advertise the event; 2) use and advertise access to scientists; 3) recruit scientists using these findings; 4) ensure that the event is group and, particularly, child friendly; and 5) target specific event outcomes. Best Practices Informing Real-world Planning, Implementation and Evaluation: The Dawn mission's collaborative design of a series of events, i C Ceres, including in-person, interactive events geared to families and live presentations, will be shared, along with the outcomes and lessons learned arising from these events and their evaluation. There will be a focus on the family event, in particular the evidence that scientist participation was a particular driver of the event's impact and success.

  8. The Sequenced Angiosperm Genomes and Genome Databases.

    PubMed

    Chen, Fei; Dong, Wei; Zhang, Jiawei; Guo, Xinyue; Chen, Junhao; Wang, Zhengjia; Lin, Zhenguo; Tang, Haibao; Zhang, Liangsheng

    2018-01-01

    Angiosperms, the flowering plants, provide essential resources for human life, such as food, energy, oxygen, and materials. They also promoted the evolution of humans, animals, and the planet Earth. Despite the numerous advances in genome reports and sequencing technologies, no review covers all the released angiosperm genomes and the genome databases available for data sharing. Based on the rapid advances and innovations in database reconstruction in the last few years, here we provide a comprehensive review of three major types of angiosperm genome databases: databases for a single species, for a specific angiosperm clade, and for multiple angiosperm species. The scope, tools, and data of each type of database and their features are concisely discussed. Genome databases for a single species or a clade of species are especially popular with specific groups of researchers, while a regularly updated comprehensive database is more powerful for addressing major scientific questions at the genome scale. Considering the low coverage of flowering plants in any available database, we propose the construction of a comprehensive database to facilitate large-scale comparative studies of angiosperm genomes and to promote collaborative studies of important questions in plant biology.

  9. Accelerating advances in continental domain hydrologic modeling

    USGS Publications Warehouse

    Archfield, Stacey A.; Clark, Martyn; Arheimer, Berit; Hay, Lauren E.; McMillan, Hilary; Kiang, Julie E.; Seibert, Jan; Hakala, Kirsti; Bock, Andrew R.; Wagener, Thorsten; Farmer, William H.; Andreassian, Vazken; Attinger, Sabine; Viglione, Alberto; Knight, Rodney; Markstrom, Steven; Over, Thomas M.

    2015-01-01

    In the past, hydrologic modeling of surface water resources has mainly focused on simulating the hydrologic cycle at local to regional catchment modeling domains. There now exists a level of maturity among the catchment, global water security, and land surface modeling communities such that these communities are converging toward continental domain hydrologic models. This commentary, written from a catchment hydrology community perspective, provides a review of progress in each community toward this achievement, identifies common challenges the communities face, and details immediate and specific areas in which these communities can mutually benefit one another from the convergence of their research perspectives. Those include: (1) creating new incentives and infrastructure to report and share model inputs, outputs, and parameters in data services and open access, machine-independent formats for model replication or reanalysis; (2) ensuring that hydrologic models have: sufficient complexity to represent the dominant physical processes and adequate representation of anthropogenic impacts on the terrestrial water cycle, a process-based approach to model parameter estimation, and appropriate parameterizations to represent large-scale fluxes and scaling behavior; (3) maintaining a balance between model complexity and data availability as well as uncertainties; and (4) quantifying and communicating significant advancements toward these modeling goals.

  10. The Sequenced Angiosperm Genomes and Genome Databases

    PubMed Central

    Chen, Fei; Dong, Wei; Zhang, Jiawei; Guo, Xinyue; Chen, Junhao; Wang, Zhengjia; Lin, Zhenguo; Tang, Haibao; Zhang, Liangsheng

    2018-01-01

    Angiosperms, the flowering plants, provide essential resources for human life, such as food, energy, oxygen, and materials. They also promoted the evolution of humans, animals, and the planet Earth. Despite the numerous advances in genome reports and sequencing technologies, no review covers all the released angiosperm genomes and the genome databases available for data sharing. Based on the rapid advances and innovations in database reconstruction in the last few years, here we provide a comprehensive review of three major types of angiosperm genome databases: databases for a single species, for a specific angiosperm clade, and for multiple angiosperm species. The scope, tools, and data of each type of database and their features are concisely discussed. Genome databases for a single species or a clade of species are especially popular with specific groups of researchers, while a regularly updated comprehensive database is more powerful for addressing major scientific questions at the genome scale. Considering the low coverage of flowering plants in any available database, we propose the construction of a comprehensive database to facilitate large-scale comparative studies of angiosperm genomes and to promote collaborative studies of important questions in plant biology. PMID:29706973

  11. SWATShare- A Platform for Collaborative Hydrology Research and Education with Cyber-enabled Sharing, Running and Visualization of SWAT Models

    NASA Astrophysics Data System (ADS)

    Rajib, M. A.; Merwade, V.; Song, C.; Zhao, L.; Kim, I. L.; Zhe, S.

    2014-12-01

    Setting up any hydrologic model requires a large amount of effort, including compiling data, creating input files, and performing calibration and validation. Given the effort involved, it is possible that models for a watershed are created multiple times by multiple groups or organizations to accomplish different research, educational or policy goals. To reduce this duplication of effort and enable collaboration among different groups or organizations around an already existing hydrology model, a platform is needed where anyone can search for existing models, perform simple scenario analyses and visualize model results. The creator and users of a model on such a platform can then collaborate to accomplish new research or educational objectives. From this perspective, a prototype cyber-infrastructure (CI), called SWATShare, has been developed for sharing, running and visualizing Soil and Water Assessment Tool (SWAT) models in an interactive, GIS-enabled web environment. Users can utilize SWATShare to publish or upload their own models, search and download existing SWAT models developed by others, and run simulations, including calibration, using high-performance resources provided by XSEDE and the cloud. Besides running and sharing, SWATShare hosts a novel spatio-temporal visualization system for SWAT model outputs. At the temporal scale, the system creates time-series plots for all the hydrology and water-quality variables available along the reach as well as at the watershed level. At the spatial scale, the system can dynamically generate sub-basin-level thematic maps for any variable at any user-defined date or date range, thereby allowing users to run animations or download the data for subsequent analyses. In addition to research, SWATShare can also be used within a classroom setting as an educational tool for modeling and comparing hydrologic processes under different geographic and climatic settings. SWATShare is publicly available at https://www.water-hub.org/swatshare.

  12. HydroShare: A Platform for Collaborative Data and Model Sharing in Hydrology

    NASA Astrophysics Data System (ADS)

    Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Couch, A.; Hooper, R. P.; Dash, P. K.; Stealey, M.; Yi, H.; Bandaragoda, C.; Castronova, A. M.

    2017-12-01

    HydroShare is an online collaboration system for sharing hydrologic data, analytical tools, and models. It supports the sharing of, and collaboration around, "resources," which are defined by standardized content types for data formats and models commonly used in hydrology. With HydroShare you can: Share your data and models with colleagues; Manage who has access to the content that you share; Share, access, visualize and manipulate a broad set of hydrologic data types and models; Use the web services application programming interface (API) to program automated and client access; Publish data and models and obtain a citable digital object identifier (DOI); Aggregate your resources into collections; Discover and access data and models published by others; Use web apps to visualize, analyze and run models on data in HydroShare. This presentation will describe the functionality and architecture of HydroShare, highlighting its use as a virtual environment supporting education and research. HydroShare has components that support: (1) resource storage, (2) resource exploration, and (3) web apps for actions on resources. The HydroShare data discovery, sharing and publishing functions, as well as the HydroShare web apps, provide the capability to analyze data and execute models completely in the cloud (on servers remote from the user), overcoming desktop platform limitations. The HydroShare GIS app provides a basic capability to visualize spatial data. The HydroShare JupyterHub Notebook app provides flexible and documentable execution of Python code snippets for analysis and modeling, in a way that results can be shared among HydroShare users and groups to support research collaboration and education. We will discuss how these developments can be used to support different types of educational efforts in hydrology, where being completely web based is of value because all students have access to the same functionality regardless of the computers they use.
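
    As a rough illustration of the programmatic access mentioned above, the sketch below lists a page of public resources through HydroShare's REST interface using the requests library. The endpoint path follows HydroShare's public /hsapi/ API, but the query parameter and response field names used here should be treated as illustrative assumptions.

      # Minimal sketch of programmatic access to HydroShare's REST API.
      # The /hsapi/resource/ path follows the public interface; the query
      # parameter and response fields below are illustrative assumptions.
      import requests

      BASE = "https://www.hydroshare.org/hsapi"

      def list_public_resources(max_items=5):
          """Fetch a page of public HydroShare resources and print basic metadata."""
          resp = requests.get(f"{BASE}/resource/", params={"count": max_items}, timeout=30)
          resp.raise_for_status()
          for res in resp.json().get("results", []):
              print(res.get("resource_id"), "-", res.get("resource_title"))

      if __name__ == "__main__":
          list_public_resources()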

  13. Exploring Resource Sharing between Secondary School Teachers of Agriculture and Science Departments Nationally.

    ERIC Educational Resources Information Center

    Dormody, Thomas J.

    1992-01-01

    A survey of 372 secondary agriculture teachers (274 responses) found that a majority of agriculture and science departments share resources, although at low levels. Many more predicted future sharing. Equipment and supplies were shared most often, instructional services least often. (SK)

  14. Representing Hydrologic Models as HydroShare Resources to Facilitate Model Sharing and Collaboration

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Goodall, J. L.; Mbewe, P.

    2013-12-01

    The CUAHSI HydroShare project is a collaborative effort that aims to provide software for sharing data and models within the hydrologic science community. One of the early focuses of this work has been establishing metadata standards for describing models and model-related data as HydroShare resources. By leveraging this metadata definition, a prototype extension has been developed to create model resources that can be shared within the community using the HydroShare system. The extension uses a general model metadata definition to create resource objects, and was designed so that model-specific parsing routines can extract and populate metadata fields from model input and output files. The long-term goal is to establish a library of supported models where, for each model, the system can extract key metadata fields automatically, thereby establishing standardized model metadata that will serve as the foundation for model sharing and collaboration within HydroShare. The Soil and Water Assessment Tool (SWAT) is used to demonstrate this concept through a case study application.
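
    A model-specific parsing routine of the kind described above can be sketched as a small function that scans a SWAT master control file (file.cio) for a few keywords and fills a generic metadata dictionary. The keyword list and line layout below are simplifying assumptions for illustration, not the actual HydroShare parser.

      # Illustrative sketch of a model-specific parsing routine that extracts
      # metadata fields from a SWAT-style control file into a generic resource
      # metadata dictionary. The keyword matching is a simplifying assumption,
      # not the parser used by HydroShare.
      import re

      def extract_swat_metadata(cio_text: str) -> dict:
          """Scan file.cio-like text for simulation-period keywords."""
          fields = {"NBYR": "simulation_years", "IYR": "start_year", "NYSKIP": "warmup_years"}
          metadata = {}
          for line in cio_text.splitlines():
              for key, name in fields.items():
                  # lines typically look like "   5    | NBYR : number of years simulated"
                  if f"| {key}" in line:
                      match = re.search(r"^\s*(\d+)", line)
                      if match:
                          metadata[name] = int(match.group(1))
          return metadata

      example = "     5    | NBYR : number of years simulated\n  1990    | IYR : beginning year"
      print(extract_swat_metadata(example))  # {'simulation_years': 5, 'start_year': 1990}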

  15. Outbreaks associated to large open air festivals, including music festivals, 1980 to 2012.

    PubMed

    Botelho-Nevers, E; Gautret, P

    2013-03-14

    In the minds of many, large-scale open-air festivals have become associated with spring and summer, attracting many people and, in the case of music festivals, thousands of music fans. These festivals share the usual health risks associated with large mass gatherings, including transmission of communicable diseases and risk of outbreaks. Large-scale open-air festivals have, however, specific characteristics, including outdoor settings, on-site housing and food supply, and the generally young age of the participants. Outbreaks at large-scale open-air festivals have been caused by Cryptosporidium parvum, Campylobacter spp., Escherichia coli, Salmonella enterica, Shigella sonnei, Staphylococcus aureus, hepatitis A virus, influenza virus, measles virus, mumps virus and norovirus. Faecal-oral and respiratory transmission of pathogens results from non-compliance with hygiene rules, inadequate sanitation and insufficient vaccination coverage. Sexual transmission of infectious diseases may also occur and is likely to be underestimated and underreported. Enhanced surveillance during and after festivals is essential. Preventive measures such as immunisation of participants and advice on-site and via social networks should be considered to reduce outbreaks at these large-scale open-air festivals.

  16. MaRaCluster: A Fragment Rarity Metric for Clustering Fragment Spectra in Shotgun Proteomics.

    PubMed

    The, Matthew; Käll, Lukas

    2016-03-04

    Shotgun proteomics experiments generate large amounts of fragment spectra as primary data, normally with high redundancy between and within experiments. Here, we have devised a clustering technique to identify fragment spectra stemming from the same species of peptide. This is a powerful alternative to traditional search engines for analyzing spectra and is particularly useful for larger-scale mass spectrometry studies. As an aid in this process, we propose a distance calculation relying on the rarity of experimental fragment peaks, following the intuition that peaks shared by only a few spectra offer more evidence than peaks shared by a large number of spectra. We used this distance calculation and a complete-linkage scheme to cluster data from a recent large-scale mass spectrometry-based study. The clusterings produced by our method have up to 40% more identified peptides for their consensus spectra compared to those produced by the previous state-of-the-art method. We anticipate that our method will advance the construction of spectral libraries and serve as a tool for mining large sets of fragment spectra. The source code and Ubuntu binary packages are available at https://github.com/statisticalbiotechnology/maracluster (under an Apache 2.0 license).
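
    The core idea, weighting shared fragment peaks by their rarity and then applying complete-linkage clustering, can be illustrated with a toy example. The scoring below is a deliberately simplified stand-in and is not MaRaCluster's actual metric.

      # Toy sketch of rarity-weighted spectrum clustering: peaks shared by few
      # spectra carry more weight than ubiquitous peaks, and spectra are grouped
      # with complete-linkage clustering. Illustrative only, not MaRaCluster's metric.
      from collections import Counter
      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      spectra = [  # toy spectra as sets of discretized fragment m/z bins
          {100, 200, 300, 412}, {100, 200, 300, 415}, {100, 500, 600, 700},
      ]
      peak_counts = Counter(p for s in spectra for p in s)
      rarity = {p: 1.0 / peak_counts[p] for p in peak_counts}   # rarer peak -> larger weight

      def distance(a, b):
          shared = sum(rarity[p] for p in a & b)
          total = sum(rarity[p] for p in a | b)
          return 1.0 - shared / total  # low distance when rare peaks are shared

      n = len(spectra)
      dmat = np.zeros((n, n))
      for i in range(n):
          for j in range(i + 1, n):
              dmat[i, j] = dmat[j, i] = distance(spectra[i], spectra[j])

      labels = fcluster(linkage(squareform(dmat), method="complete"), t=0.7, criterion="distance")
      print(labels)  # the first two toy spectra end up in the same cluster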

  17. Converting ODM Metadata to FHIR Questionnaire Resources.

    PubMed

    Doods, Justin; Neuhaus, Philipp; Dugas, Martin

    2016-01-01

    Interoperability between systems and data sharing between domains are becoming more and more important. The portal medical-data-models.org offers more than 5,300 UMLS-annotated forms in CDISC ODM format to support interoperability, and several additional export formats are available. CDISC's ODM and HL7's FHIR Questionnaire resource were analyzed, a mapping between their elements was created, and a converter was implemented. The converter was integrated into the portal with FHIR Questionnaire XML and JSON download options. New FHIR applications can now use this large library of forms.
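
    A single step of such a conversion can be sketched as mapping each ODM ItemDef to one item of a FHIR Questionnaire. The snippet below uses a minimal ODM fragment and a simplified type mapping; it illustrates the mapping idea and is not the converter deployed at medical-data-models.org.

      # Minimal sketch of an ODM-to-FHIR conversion step: each ODM ItemDef becomes
      # one item of a FHIR Questionnaire resource. The element mapping is a
      # simplified illustration, not the portal's converter.
      import json
      import xml.etree.ElementTree as ET

      ODM_NS = "{http://www.cdisc.org/ns/odm/v1.3}"

      ODM_SNIPPET = """<ODM xmlns="http://www.cdisc.org/ns/odm/v1.3">
        <ItemDef OID="I.1" Name="Systolic blood pressure" DataType="integer"/>
        <ItemDef OID="I.2" Name="Smoker" DataType="boolean"/>
      </ODM>"""

      TYPE_MAP = {"integer": "integer", "boolean": "boolean", "text": "string", "date": "date"}

      def odm_to_questionnaire(odm_xml: str) -> dict:
          root = ET.fromstring(odm_xml)
          items = []
          for item_def in root.iter(f"{ODM_NS}ItemDef"):
              items.append({
                  "linkId": item_def.get("OID"),
                  "text": item_def.get("Name"),
                  "type": TYPE_MAP.get(item_def.get("DataType"), "string"),
              })
          return {"resourceType": "Questionnaire", "status": "draft", "item": items}

      print(json.dumps(odm_to_questionnaire(ODM_SNIPPET), indent=2))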

  18. Semantic Concept Discovery for Large Scale Zero Shot Event Detection

    DTIC Science & Technology

    2015-07-25

    sources and can be shared among many different events, including unseen ones. Based on this idea, events can be detected by inspecting the individual ... 2013]. Partial success along this vein has also been achieved in the zero-shot setting, e.g. [Habibian et al., 2014; Wu et al., 2014], but the ... candle”, “birthday cake” and “applauding”. Since concepts are shared among many different classes (events) and each concept classifier can be trained

  19. Avoiding and tolerating latency in large-scale next-generation shared-memory multiprocessors

    NASA Technical Reports Server (NTRS)

    Probst, David K.

    1993-01-01

    A scalable solution to the memory-latency problem is necessary to prevent the large latencies of synchronization and memory operations inherent in large-scale shared-memory multiprocessors from limiting performance. We distinguish latency avoidance and latency tolerance. Latency is avoided when data is brought to nearby locales for future reference. Latency is tolerated when references are overlapped with other computation. Latency-avoiding locales include: processor registers, data caches used temporally, and nearby memory modules. Tolerating communication latency requires parallelism, allowing the overlap of communication and computation. Latency-tolerating techniques include: vector pipelining, data caches used spatially, prefetching in various forms, and multithreading in various forms. Relaxing the consistency model permits increased use of avoidance and tolerance techniques. Each consistency model is a mapping from the program text to sets of partial orders on program operations; it is a convention about which temporal precedences among program operations are necessary. Information about temporal locality and parallelism constrains the use of avoidance and tolerance techniques. Suitable architectural primitives and compiler technology are required to exploit the increased freedom to reorder and overlap operations in relaxed models.
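
    The difference between waiting on a long-latency operation and overlapping it with useful work can be illustrated in software with a simple prefetch pipeline. The sketch below is a toy analogy of latency tolerance implemented with a thread pool; it is not the hardware primitives or consistency-model machinery discussed in the abstract.

      # Toy illustration of latency tolerance: remote fetches are overlapped with
      # computation by prefetching the next block while the current one is processed.
      # This mimics, in software, the overlap that prefetching and multithreading
      # provide in shared-memory hardware; it is not the paper's mechanism itself.
      import time
      from concurrent.futures import ThreadPoolExecutor

      def fetch(block_id):          # stands in for a high-latency remote memory access
          time.sleep(0.1)
          return list(range(block_id * 1000, (block_id + 1) * 1000))

      def compute(block):           # stands in for useful work on already-local data
          return sum(x * x for x in block)

      def run(num_blocks=8):
          total = 0
          with ThreadPoolExecutor(max_workers=1) as pool:
              next_block = pool.submit(fetch, 0)              # issue the first fetch
              for i in range(num_blocks):
                  block = next_block.result()                 # wait only if prefetch not done
                  if i + 1 < num_blocks:
                      next_block = pool.submit(fetch, i + 1)  # prefetch while computing
                  total += compute(block)
          return total

      start = time.time()
      print(run(), f"({time.time() - start:.2f}s with overlap)")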

  20. Water limited agriculture in Africa: Climate change sensitivity of large scale land investments

    NASA Astrophysics Data System (ADS)

    Rulli, M. C.; D'Odorico, P.; Chiarelli, D. D.; Davis, K. F.

    2015-12-01

    The past few decades have seen unprecedented changes in the global agricultural system, with a dramatic increase in the rates of food production fueled by an escalating demand for food calories as a result of demographic growth, dietary changes, and, more recently, new bioenergy policies. Food prices have become consistently higher and increasingly volatile, with dramatic spikes in 2007-08 and 2010-11. The confluence of these factors has heightened demand for land and brought a wave of land investment to the developing world: some of the more affluent countries are trying to secure land rights in areas suitable for agriculture. According to some estimates, roughly 38 million hectares have been acquired worldwide to date by large-scale investors, 16 million of which are in Africa. More than 85% of large-scale land acquisitions in Africa are by foreign investors. Many land deals are motivated not only by the need for fertile land but also by the water resources required for crop production. Despite some recent assessments of the water appropriation associated with large-scale land investments, their impact on the water resources of the target countries under present conditions and climate change scenarios remains poorly understood. Here we investigate the irrigation water requirements of various crops planted on the acquired land as an indicator of the pressure land investors are likely to place on the ("blue") water resources of target regions in Africa, and evaluate the sensitivity of these requirements to climate change scenarios.

  1. LSD: Large Survey Database framework

    NASA Astrophysics Data System (ADS)

    Juric, Mario

    2012-09-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures.
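
    LSD exposes its own query language and Python API; the generic positional cross-match below, written with astropy, is included only to illustrate the kind of operation LSD optimizes at billion-row scale and makes no assumption about LSD's actual interface.

      # Generic positional cross-match between two small catalogs using astropy,
      # shown only to illustrate the operation LSD optimizes at >10^9-row scale;
      # this is not LSD's own query API.
      import astropy.units as u
      from astropy.coordinates import SkyCoord

      survey_a = SkyCoord(ra=[10.001, 45.300, 120.75] * u.deg, dec=[-5.0, 22.1, 33.3] * u.deg)
      survey_b = SkyCoord(ra=[10.000, 120.751, 200.0] * u.deg, dec=[-5.0002, 33.3001, -10.0] * u.deg)

      # For each source in survey_a, find its nearest neighbour in survey_b.
      idx, sep2d, _ = survey_a.match_to_catalog_sky(survey_b)
      matched = sep2d < 1.0 * u.arcsec   # accept matches within 1 arcsecond

      for i, (j, sep, ok) in enumerate(zip(idx, sep2d.arcsec, matched)):
          print(f"A[{i}] -> B[{j}] sep={sep:.3f} arcsec matched={ok}")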

  2. A Bibliographic Bank for Resource Sharing in Library Systems: A Feasibility Study. Final Report.

    ERIC Educational Resources Information Center

    Schwartz, Eugene S.; Saxe, Henry I.

    This study of resource sharing among public libraries was made possible by six library systems in northern Illinois. With the organization of the library systems and development of interlibrary loan services and other cooperative activities, the problem of extending resource sharing among member libraries and between library systems arose. Several…

  3. Shared Resources

    Treesearch

    David B. Butts

    1987-01-01

    Wildfires do not respect property boundaries. Whole geographic regions are typically impacted by major wildfire outbreaks. Various fire related resources can be shared to solve such crises; whether they are shared, and how they are shared depends to a great extent upon the rapport among the agencies involved. Major progress has been achieved over the past decade...

  4. Shared communications. Volume I, a summary and literature review

    DOT National Transportation Integrated Search

    2004-09-01

    This paper provides a review of examples from the literature of shared communication resources and of agencies and/or organizations that share communication resources. The primary emphasis is on rural, intelligent transportation system communications...

  5. Oral Narrative Genres as Dialogic Resources for Classroom Literature Study: A Contextualized Case Study of Conversational Narrative Discussion

    ERIC Educational Resources Information Center

    Juzwik, Mary M.; Nystrand, Martin; Kelly, Sean; Sherry, Michael B.

    2008-01-01

    Five questions guided a case study exploring the relationship between oral narrative and discussion in middle school literature study: (a) Relative to similar classrooms in a large-scale study, how can overall literature instruction be characterized? (b) Relative to similar classrooms in a large-scale study, how well do students achieve in the…

  6. Generating community-built tools for data sharing and analysis in environmental networks

    USGS Publications Warehouse

    Read, Jordan S.; Gries, Corinna; Read, Emily K.; Klug, Jennifer; Hanson, Paul C.; Hipsey, Matthew R.; Jennings, Eleanor; O'Reilley, Catherine; Winslow, Luke A.; Pierson, Don; McBride, Christopher G.; Hamilton, David

    2016-01-01

    Rapid data growth in many environmental sectors has necessitated tools to manage and analyze these data. The development of tools often lags behind the proliferation of data, however, which may slow exploratory opportunities and scientific progress. The Global Lake Ecological Observatory Network (GLEON) collaborative model supports an efficient and comprehensive data–analysis–insight life cycle, including implementations of data quality control checks, statistical calculations/derivations, models, and data visualizations. These tools are community-built and openly shared. We discuss the network structure that enables tool development and a culture of sharing, leading to optimized output from limited resources. Specifically, data sharing and a flat collaborative structure encourage the development of tools that enable scientific insights from these data. Here we provide a cross-section of scientific advances derived from global-scale analyses in GLEON. We document enhancements to science capabilities made possible by the development of analytical tools and highlight opportunities to expand this framework to benefit other environmental networks.

  7. Assessing Public Metabolomics Metadata, Towards Improving Quality.

    PubMed

    Ferreira, João D; Inácio, Bruno; Salek, Reza M; Couto, Francisco M

    2017-12-13

    Public resources need to be appropriately annotated with metadata in order to make them discoverable, reproducible and traceable, further enabling them to be interoperable or integrated with other datasets. While data-sharing policies exist to promote the annotation process by data owners, these guidelines are still largely ignored. In this manuscript, we analyse automatic measures of metadata quality and suggest their application as a means to encourage data owners to increase the metadata quality of their resources and submissions, thereby contributing to higher-quality data, improved data sharing, and the overall accountability of scientific publications. We analyse these metadata quality measures in the context of a real-world repository of metabolomics data (i.e., MetaboLights), including a manual validation of the measures and an analysis of their evolution over time. Our findings suggest that the proposed measures can be used to mimic a manual assessment of metadata quality.
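
    One of the simplest automatic measures one could compute is a weighted completeness score over the expected metadata fields of a submission. The field list and weights below are illustrative assumptions and are not the measures used in the MetaboLights analysis.

      # Sketch of one simple automatic metadata-quality measure: weighted completeness
      # of a submission record. Field names and weights are illustrative assumptions.
      EXPECTED_FIELDS = {          # field name -> weight reflecting importance
          "title": 1.0, "organism": 2.0, "instrument": 2.0,
          "protocol": 3.0, "contact_email": 1.0, "publication_doi": 1.0,
      }

      def completeness_score(record: dict) -> float:
          """Return a 0-1 score: weighted fraction of expected fields that are filled in."""
          total = sum(EXPECTED_FIELDS.values())
          filled = sum(w for f, w in EXPECTED_FIELDS.items() if str(record.get(f, "")).strip())
          return filled / total

      submission = {"title": "Serum metabolomics of ...", "organism": "Homo sapiens", "protocol": ""}
      print(f"completeness = {completeness_score(submission):.2f}")   # 0.30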

  8. Optimizing CMS build infrastructure via Apache Mesos

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Degano, Alessandro; Elmer, Peter; Eulisse, Giulio; Mendez, David; Muzaffar, Shahzad

    2015-12-01

    The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. We present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.

  9. All inequality is not equal: children correct inequalities using resource value.

    PubMed

    Shaw, Alex; Olson, Kristina R

    2013-01-01

    Fairness concerns guide children's judgments about how to share resources with others. However, it is unclear from past research whether children take extant inequalities or the value of the resources involved in an inequality into account when sharing with others; these questions are the focus of the current studies. In all experiments, children saw an inequality between two recipients: one had two more resources than the other. What varied between conditions was the value of the resources that the child could subsequently distribute. When the resources were equal in value to those involved in the original inequality, children corrected the previous inequality by giving two resources to the child with fewer resources (Experiment 1). However, as the value of the resources increased relative to those initially shared by the experimenter, children were more likely to distribute the two high-value resources equally between the two recipients, presumably to minimize the overall inequality in value (Experiments 1 and 2). We found that children specifically use value, not just size, when trying to equalize outcomes (Experiment 3), and further found that children focus on the relative rather than absolute value of the resources they share: when the experimenter had unequally distributed the same high-value resource that the child would later share, children corrected the previous inequality by giving two high-value resources to the person who had received fewer high-value resources. These results illustrate that children attempt to correct past inequalities and try to maintain equality not just in the count of resources but also by using the value of resources.

  10. Dissecting the large-scale galactic conformity

    NASA Astrophysics Data System (ADS)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos ("one-halo conformity"). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or "large-scale conformity"). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  11. Learning about water resource sharing through game play

    NASA Astrophysics Data System (ADS)

    Ewen, Tracy; Seibert, Jan

    2016-10-01

    Games are an optimal way to teach about water resource sharing, as they allow real-world scenarios to be enacted. Both students and professionals learning about water resource management can benefit from playing games, through the process of understanding both the complexity of sharing resources between different groups and the outcomes of decisions. Here we address how games can be used to teach about water resource sharing, through both playing and developing water games. An evaluation of the web-based game Irrigania in the classroom setting, supported by feedback from several educators who have used Irrigania to teach about the sustainable use of water resources and decision making at university and high school levels, finds Irrigania to be an effective and easy tool to incorporate into a curriculum. The development of two water games in a course for masters students in geography is also presented as a way to teach and communicate about water resource sharing. Through game development, students learned soft skills, including critical thinking, problem solving, team work, and time management, and overall the process was found to be an effective way to learn about water resource decision outcomes. This paper concludes with a discussion of learning outcomes from both playing and developing water games.

  12. Cloud computing for genomic data analysis and collaboration.

    PubMed

    Langmead, Ben; Nellore, Abhinav

    2018-04-01

    Next-generation sequencing has made major strides in the past decade. Studies based on large sequencing data sets are growing in number, and public archives for raw sequencing data have been doubling in size every 18 months. Leveraging these data requires researchers to use large-scale computational resources. Cloud computing, a model whereby users rent computers and storage from large data centres, is a solution that is gaining traction in genomics research. Here, we describe how cloud computing is used in genomics for research and large-scale collaborations, and argue that its elasticity, reproducibility and privacy features make it ideally suited for the large-scale reanalysis of publicly available archived data, including privacy-protected data.

  13. Large-scale virtual screening on public cloud resources with Apache Spark.

    PubMed

    Capuccini, Marco; Ahmed, Laeeq; Schaal, Wesley; Laure, Erwin; Spjuth, Ola

    2017-01-01

    Structure-based virtual screening is an in-silico method to screen a target receptor against a virtual molecular library. Applying docking-based screening to large molecular libraries can be computationally expensive; however, it constitutes a trivially parallelizable task. Most of the available parallel implementations are based on the message passing interface, relying on low-failure-rate hardware and fast network connections. Google's MapReduce revolutionized large-scale analysis, enabling the processing of massive datasets on commodity hardware and cloud resources, providing transparent scalability and fault tolerance at the software level. Open-source implementations of MapReduce include Apache Hadoop and the more recent Apache Spark. We developed a method to run existing docking-based screening software on distributed cloud resources, utilizing the MapReduce approach. We benchmarked our method, which is implemented in Apache Spark, by docking a publicly available target receptor against ~2.2 million compounds. The performance experiments show a good parallel efficiency (87%) when running in a public cloud environment. Our method enables parallel structure-based virtual screening on public cloud resources or commodity computer clusters. The degree of scalability that we achieve allows our method to be tried out on relatively small libraries first and then scaled to larger libraries. Our implementation is named Spark-VS and is freely available as open source from GitHub (https://github.com/mcapuccini/spark-vs).
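
    The MapReduce-style parallelization can be sketched in a few lines of PySpark: distribute the compound library across partitions, apply a scoring function to each compound, and collect the best hits. The mock_dock function is a placeholder for real docking software, and the snippet is a generic illustration rather than the Spark-VS implementation.

      # Minimal PySpark sketch of the MapReduce pattern used for docking-based
      # screening: score compounds in parallel partitions, then keep the top hits.
      # mock_dock is a placeholder for real docking software; this is not Spark-VS.
      from pyspark.sql import SparkSession

      def mock_dock(smiles: str) -> float:
          """Stand-in scoring function; a real pipeline would call docking software here."""
          return -0.1 * len(smiles)          # pretend longer molecules bind better

      spark = SparkSession.builder.appName("vs-sketch").getOrCreate()
      library = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "CN1CCC[C@H]1c1cccnc1"]

      scores = (
          spark.sparkContext
          .parallelize(library, numSlices=2)                  # distribute the library
          .map(lambda smi: (smi, mock_dock(smi)))             # "dock" each compound
      )
      top_hits = scores.takeOrdered(2, key=lambda kv: kv[1])  # lowest (best) scores first
      print(top_hits)
      spark.stop()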

  14. Body size mediated coexistence of consumers competing for resources in space

    USGS Publications Warehouse

    Basset, A.; Angelis, D.L.

    2007-01-01

    Body size is a major phenotypic trait of individuals that commonly differentiates co-occurring species. We analyzed inter-specific competitive interactions between a large consumer and smaller competitors, whose energetics, selection and giving-up behaviour on identical resource patches scaled with individual body size. The aim was to investigate whether pure metabolic constraints on patch behaviour of vagile species can determine coexistence conditions consistent with existing theoretical and experimental evidence. We used an individual-based spatially explicit simulation model at a spatial scale defined by the home range of the large consumer, which was assumed to be parthenogenic and semelparous. Under exploitative conditions, competitive coexistence occurred in a range of body size ratios between 2 and 10. Asymmetrical competition and the mechanism underlying asymmetry, determined by the scaling of energetics and patch behaviour with consumer body size, were the proximate determinant of inter-specific coexistence. The small consumer exploited patches more efficiently, but searched for profitable patches less effectively than the larger competitor. Therefore, body-size related constraints induced niche partitioning, allowing competitive coexistence within a set of conditions where the large consumer maintained control over the small consumer and resource dynamics. The model summarises and extends the existing evidence of species coexistence on a limiting resource, and provides a mechanistic explanation for decoding the size-abundance distribution patterns commonly observed at guild and community levels. © Oikos.

  15. PubMed Central

    Fernandes, Pedro; Gevaert, Kris; Rothacker, Julie; Saiyed, Taslimarif; Detwiler, Michelle

    2012-01-01

    This roundtable will feature four international speakers who will discuss national and international collaborative initiatives and outreach efforts in which they participate. They will share how these efforts have facilitated access to cutting-edge technology, fostered new generations of scientists, and ultimately advanced the progression of global scientific research. Open discussion will follow the presentations! Centre for Cellular and Molecular Platforms, National Centre for Biological Sciences, India: experiences in implementing a national high-end core facility organization with the goals of improving regional technology access and enhancing the quality of research for scientists in academia, biotechnology companies, and the biopharmaceutical industry. Monash University Technology Platforms and Broader Victorian and Australian Networks: Australian initiatives to build global research capabilities and identify means to internationally benchmark regional capabilities to ensure delivery of world class infrastructure. Within the context of the current Australian strategic framework, funding considerations will be discussed, along with expectations for partner facilities to collaborate and be fully accessible to academia and industry. Instituto Gulbenkian de Ciencia, Portugal and beyond: Multiple roles of networking in science and extending outreach while consolidating community integration. Discussion will include achievement of community building and integration using concepts of sharing, training, resource availability, and the value and empowerment gained using acquired skills. The role of networking and institutional visibility will also be discussed. PRIME-XS: This EU-funded consortium provides an infrastructure of proteomics technologies to the European research community. The core is formed by six access facilities through which the consortium provides access to their technologies. Twelve partners work together to develop new resources to aid the community including the development of bioinformatic tools to analyze large-scale proteomics data and novel technologies to analyze protein interaction networks, post-translational modifications and more sensitive ways to detect protein and peptide biomarkers in complex samples.

  16. Divergent evolution of arrested development in the dauer stage of Caenorhabditis elegans and the infective stage of Heterodera glycines

    PubMed Central

    Elling, Axel A; Mitreva, Makedonka; Recknor, Justin; Gai, Xiaowu; Martin, John; Maier, Thomas R; McDermott, Jeffrey P; Hewezi, Tarek; McK Bird, David; Davis, Eric L; Hussey, Richard S; Nettleton, Dan; McCarter, James P; Baum, Thomas J

    2007-01-01

    Background: The soybean cyst nematode Heterodera glycines is the most important parasite in soybean production worldwide. A comprehensive analysis of large-scale gene expression changes throughout the development of plant-parasitic nematodes has been lacking to date. Results: We report an extensive genomic analysis of H. glycines, beginning with the generation of 20,100 expressed sequence tags (ESTs). In-depth analysis of these ESTs plus approximately 1,900 previously published sequences predicted 6,860 unique H. glycines genes and allowed a classification by function using InterProScan. Expression profiling of all 6,860 genes throughout the H. glycines life cycle was undertaken using the Affymetrix Soybean Genome Array GeneChip. Our data sets and results represent a comprehensive resource for molecular studies of H. glycines. Demonstrating the power of this resource, we were able to address whether arrested development in the Caenorhabditis elegans dauer larva and the H. glycines infective second-stage juvenile (J2) exhibits shared gene expression profiles. We determined that the gene expression profiles associated with the C. elegans dauer pathway are not uniformly conserved in H. glycines and that the expression profiles of genes for metabolic enzymes of C. elegans dauer larvae and H. glycines infective J2 are dissimilar. Conclusion: Our results indicate that hallmark gene expression patterns and metabolism features are not shared in the developmentally arrested life stages of C. elegans and H. glycines, suggesting that developmental arrest in these two nematode species has undergone more divergent evolution than previously thought and pointing to the need for detailed genomic analyses of individual parasite species. PMID:17919324

  17. Training and supervision of community health workers conducting population-based, noninvasive screening for CVD in LMIC: implications for scaling up.

    PubMed

    Abrahams-Gessel, Shafika; Denman, Catalina A; Montano, Carlos Mendoza; Gaziano, Thomas A; Levitt, Naomi; Rivera-Andrade, Alvaro; Carrasco, Diana Munguía; Zulu, Jabu; Khanam, Masuma Akter; Puoane, Thandi

    2015-03-01

    Community health workers (CHW) can screen for cardiovascular disease risk as well as health professionals using a noninvasive screening tool. However, this demonstrated success does not guarantee effective scaling of the intervention to a population level. This study sought to report lessons learned from supervisors' experiences monitoring CHW and perceptions of other stakeholders regarding features for successful scaling of interventions that incorporate task-sharing with CHW. We conducted a qualitative analysis of in-depth interviews to explore stakeholder perceptions. Data was collected through interviews of 36 supervisors and administrators at nongovernmental organizations contracted to deliver and manage primary care services using CHW, directors, and staff at the government health care clinics, and officials from the departments of health responsible for the implementation of health policy. CHW are recognized for their value in offsetting severe human resource shortages and for their expert community knowledge. There is a lack of clear definitions for roles, expectations, and career paths for CHW. Formal evaluation and supervisory systems are highly desirable but nonexistent or poorly implemented, creating a critical deficit for effective implementation of programs using task-sharing. There is acknowledgment of environmental challenges (e.g., safety) and systemic challenges (e.g., respect from trained health professionals) that hamper the effectiveness of CHW. The government-community relationships presumed to form the basis of redesigned health care services have to be supported more explicitly and consistently on both sides in order to increase the acceptability of CHW and their effectiveness. The criteria critical for successful scaling of CHW-led screening are consistent with evidence for scaling-up communicable disease programs. Policy makers have to commit appropriate levels of resources and political will to ensure successful scaling of this intervention. Copyright © 2015 World Heart Federation (Geneva). Published by Elsevier B.V. All rights reserved.

  18. Lessons from Training and Supervision of Community Health Workers conducting non-invasive, population-based screening for Cardiovascular Disease in Four Communities in Low and Middle-Income Settings: Implications for Scaling Up

    PubMed Central

    Denman, Catalina A.; Montano, Carlos Mendoza; Gaziano, Thomas A.; Levitt, Naomi; Rivera-Andrade, Alvaro; Carrasco, Diana Munguía; Zulu, Jabu; Khanam, Masuma Akter; Puoane, Thandi

    2015-01-01

    Background: Community health workers (CHWs) can screen for cardiovascular disease (CVD) risk as well as health professionals using a non-invasive screening tool (data unpublished). However, this demonstrated success does not guarantee effective scaling of the intervention to a population level. Objectives: To report lessons learned from supervisors' experiences monitoring CHWs and perceptions of other stakeholders regarding features for successful scaling of interventions which incorporate task-sharing with CHWs. Methods: We conducted a qualitative analysis of in-depth interviews to explore stakeholder perceptions. Data was collected through interviews of 36 supervisors and administrators at non-governmental organizations contracted to deliver and manage primary care services using CHWs, directors and staff at the government health care clinics, and officials from the departments of health responsible for the implementation of health policy. Results: CHWs are recognized for their value in offsetting severe human resource shortages and for their expert community knowledge. There is a lack of clear definitions for roles, expectations, and career paths for CHWs. Formal evaluation and supervisory systems are highly desirable but nonexistent or poorly implemented, creating a critical deficit for effective implementation of programs utilizing task sharing. There is acknowledgement of environmental challenges (e.g. safety) and systemic challenges (e.g. respect from trained health professionals) that hamper the effectiveness of CHWs. The government-community relationships presumed to form the basis of redesigned health care services have to be supported more explicitly and consistently on both sides in order to increase the acceptability of CHWs and their effectiveness. Conclusions: The criteria critical for successful scaling of CHW led screening are consistent with evidence for scaling up communicable disease programs. Policy makers have to commit appropriate levels of resources and political will to ensure successful scaling of this intervention. PMID:25754565

  19. Living in a network of scaling cities and finite resources.

    PubMed

    Qubbaj, Murad R; Shutters, Shade T; Muneepeerakul, Rachata

    2015-02-01

    Many urban phenomena exhibit remarkable regularity in the form of nonlinear scaling behaviors, but their implications for a system of networked cities have never been investigated. Such knowledge is crucial for our ability to harness the complexity of urban processes to further sustainability science. In this paper, we develop a dynamical modeling framework that embeds population-resource dynamics (a generalized Lotka-Volterra system with modifications to incorporate the urban scaling behaviors) in complex networks in which cities may be linked to the resources of other cities and people may migrate in pursuit of higher welfare. We find that isolated cities (i.e., no migration) are susceptible to collapse if they do not have access to adequate resources. Links to other cities may help cities that would otherwise collapse due to insufficient resources. The effects of inter-city links, however, can vary due to the interplay between the nonlinear scaling behaviors and network structure. The long-term population level of a city is, in many settings, largely a function of the city's access to resources over which the city has little or no competition. Nonetheless, careful investigation of dynamics is required to gain a mechanistic understanding of a particular city-resource network, because cities and resources may collapse and the scaling behaviors may influence the effects of inter-city links, thereby distorting what topological metrics really measure.
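
    A single-city caricature of such population-resource dynamics can be written down with an urban-scaling exponent on resource extraction. The equations and parameter values below are generic illustrative assumptions in the Lotka-Volterra spirit, not the networked model analyzed in the paper.

      # Toy single-city population-resource model with a scaling exponent beta on
      # resource extraction. The equations are a generic Lotka-Volterra-style
      # stand-in, not the networked model of the paper.
      import numpy as np
      from scipy.integrate import odeint

      r, K, beta, c, m, mu = 0.05, 1000.0, 1.15, 0.02, 0.01, 0.1

      def dynamics(state, t):
          N, R = state                              # city population, resource stock
          extraction = c * N**beta * R / (K + R)    # superlinear in N when beta > 1
          dN = mu * N * (extraction / N - m)        # grow when per-capita intake exceeds need m
          dR = r * R * (1.0 - R / K) - extraction   # logistic renewal minus extraction
          return [dN, dR]

      t = np.linspace(0.0, 500.0, 2001)
      trajectory = odeint(dynamics, [50.0, 800.0], t)
      print("final population: %.1f, final resource: %.1f" % tuple(trajectory[-1]))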

  20. Resource allocation in shared spectrum access communications for operators with diverse service requirements

    NASA Astrophysics Data System (ADS)

    Kibria, Mirza Golam; Villardi, Gabriel Porto; Ishizu, Kentaro; Kojima, Fumihide; Yano, Hiroyuki

    2016-12-01

    In this paper, we study inter-operator spectrum sharing and intra-operator resource allocation in shared spectrum access communication systems and propose efficient dynamic solutions to both the inter-operator and intra-operator resource allocation optimization problems. For inter-operator spectrum sharing, we present two competent approaches, namely subcarrier gain-based sharing and fragmentation-based sharing, which carry out fair and flexible allocation of the available shareable spectrum among the operators subject to certain well-defined sharing rules, traffic demands, and channel propagation characteristics. The subcarrier gain-based spectrum sharing scheme has been found to be more efficient in terms of achieved throughput, whereas the fragmentation-based sharing is more attractive in terms of computational complexity. For intra-operator resource allocation, we consider a resource allocation problem with users' dissimilar service requirements, where the operator simultaneously supports users with delay-constrained and non-delay-constrained service requirements. This optimization problem is a non-convex mixed-integer non-linear program, which is computationally very expensive, and its complexity grows exponentially with the number of integer variables. We propose a less complex and efficient suboptimal solution based on exact linearization, linear approximation, and convexification techniques for the non-linear and/or non-convex objective functions and constraints. An extensive simulation-based performance analysis validates the efficiency of the proposed solution.
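
    The subcarrier gain-based idea can be sketched as a greedy assignment: each shareable subcarrier goes to the operator with the strongest channel on it, subject to per-operator quotas proportional to traffic demand. The quota rule, channel model, and ordering below are illustrative simplifications, not the algorithm evaluated in the paper.

      # Simplified sketch of subcarrier gain-based inter-operator sharing: each
      # shareable subcarrier goes to the operator with the highest channel gain,
      # subject to a per-operator quota proportional to its traffic demand.
      # Generic greedy illustration only, not the paper's exact scheme.
      import numpy as np

      rng = np.random.default_rng(0)
      num_subcarriers, operators = 24, ["op_A", "op_B", "op_C"]
      demand = {"op_A": 0.5, "op_B": 0.3, "op_C": 0.2}            # normalized traffic demand
      gains = {op: rng.rayleigh(1.0, num_subcarriers) for op in operators}  # channel gains

      quota = {op: round(demand[op] * num_subcarriers) for op in operators}
      allocation = {op: [] for op in operators}

      # Visit subcarriers in order of their best gain so strong channels are settled first.
      order = sorted(range(num_subcarriers), key=lambda k: -max(gains[op][k] for op in operators))
      for k in order:
          # Among operators with remaining quota, pick the one with the best gain on k.
          candidates = [op for op in operators if len(allocation[op]) < quota[op]]
          if not candidates:                       # fallback if rounding left spare capacity
              candidates = operators
          best = max(candidates, key=lambda op: gains[op][k])
          allocation[best].append(k)

      for op in operators:
          print(op, "gets", len(allocation[op]), "subcarriers:", sorted(allocation[op]))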

  1. The Dockstore: enabling modular, community-focused sharing of Docker-based genomics tools and workflows

    PubMed Central

    O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent

    2017-01-01

    As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774

  2. The Dockstore: enabling modular, community-focused sharing of Docker-based genomics tools and workflows.

    PubMed

    O'Connor, Brian D; Yuen, Denis; Chung, Vincent; Duncan, Andrew G; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent

    2017-01-01

    As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore ( https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH).

  3. Women in Planetary Science: Career Resources and e-Mentoring on Blogs, Twitter, Facebook, Google+, and Pinterest

    NASA Astrophysics Data System (ADS)

    Niebur, S. M.; Singer, K.; Gardner-Vandy, K.

    2012-08-01

    Fifty-one interviews with women in planetary science are now available as an e-mentoring and teaching resource on WomeninPlanetaryScience.com. Each scientist was nominated and interviewed by a fellow member of the planetary science community, and each gladly shared her advice for advancement in the field. Women in Planetary Science was founded in 2008 to connect communities of current and prospective scientists, to promote proposal and award opportunities, and to stimulate discussion in the planetary science community at large. Regular articles, or posts, by nearly a dozen collaborators highlight a range of current issues for women in this field. These articles are promoted by collaborators on Twitter, Facebook, and Google+ and shared again by the collaborators' contacts, reaching a significantly wider audience. The group's latest project, on Pinterest, is a crowd-sourced photo gallery of more than 350 inspiring women in planetary science; each photo links to the scientist's CV. The interviews, the essays, and the photo gallery are available online as resources for prospective scientists, planetary scientists, parents, and educators.

  4. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    NASA Astrophysics Data System (ADS)

    Fonseca, R. A.; Vieira, J.; Fiuza, F.; Davidson, A.; Tsung, F. S.; Mori, W. B.; Silva, L. O.

    2013-12-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-Class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ~10^6 cores and sustained performance over ~2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios.

  5. Designing the Bridge: Perceptions and Use of Downscaled Climate Data by Climate Modelers and Resource Managers in Hawaii

    NASA Astrophysics Data System (ADS)

    Keener, V. W.; Brewington, L.; Jaspers, K.

    2016-12-01

    To build an effective bridge from the climate modeling community to natural resource managers, we assessed the existing landscape to see where different groups diverge in their perceptions of climate data and needs. An understanding of a given community's shared knowledge and differences can help design more actionable science. Resource managers in Hawaii are eager to have future climate projections at spatial scales relevant to the islands. National initiatives to downscale climate data often exclude US insular regions, so researchers in Hawaii have generated regional dynamically and statistically downscaled projections. Projections of precipitation diverge, however, leading to difficulties in communication and use. Recently, a two day workshop was held with scientists and managers to evaluate available models and determine a set of best practices for moving forward with decision-relevant downscaling in Hawaii. To seed the discussion, the Pacific Regional Integrated Sciences and Assessments (RISA) program conducted a pre-workshop survey (N=65) of climate modelers and freshwater, ecosystem, and wildfire managers working in Hawaii. Scientists reported spending less than half of their time on operational research, although the majority was eager to partner with managers on specific projects. Resource managers had varying levels of familiarity with downscaled climate projections, but reported needing more information about uncertainty for decision making, and were less interested in the technical model details. There were large differences between groups of managers, with 41.7% of freshwater managers reporting that they used climate projections regularly, while a majority of ecosystem and wildfire managers reported having "no familiarity". Scientists and managers rated which spatial and temporal scales were most relevant to decision making. Finally, when asked to compare how confident they were in projections of specific climate variables between the dynamical and statistical data, 80-90% of managers responded that they had no opinion. Workshop attendees were very interested in the survey results, adding to evidence of a need for sustained engagement between modeler and user groups, as well as different strategies for working with different types of resource managers.

  6. Development of high performance scientific components for interoperability of computing packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gulabani, Teena Pratap

    2008-01-01

    Three major high-performance quantum chemistry computational packages, NWChem, GAMESS, and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each package. Chemistry algorithms are hard and time consuming to develop; integration of large quantum chemistry packages allows resource sharing and thus avoids reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  7. CaseWorld™: Interactive, media rich, multidisciplinary case based learning.

    PubMed

    Gillham, David; Tucker, Katie; Parker, Steve; Wright, Victoria; Kargillis, Christina

    2015-11-01

    Nurse educators are challenged to keep up with highly specialised clinical practice, emerging research evidence, regulation requirements and rapidly changing information technology while teaching very large numbers of diverse students in a resource constrained environment. This complex setting provides the context for the CaseWorld project, which aims to simulate those aspects of clinical practice that can be represented by e-learning. This paper describes the development, implementation and evaluation of CaseWorld, a simulated learning environment that supports case based learning. CaseWorld provides nursing students with the opportunity to view unfolding authentic cases presented in a rich multimedia context. The first round of comprehensive summative evaluation of CaseWorld is discussed in the context of earlier formative evaluation, reference group input and strategies for integration of CaseWorld with subject content. This discussion highlights the unique approach taken in this project that involved simultaneous prototype development and large scale implementation, thereby necessitating strong emphasis on staff development, uptake and engagement. The lessons learned provide an interesting basis for further discussion of broad content sharing across disciplines and universities, and the contribution that local innovations can make to global education advancement. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  8. Lost in Cloud

    NASA Technical Reports Server (NTRS)

    Maluf, David A.; Shetye, Sandeep D.; Chilukuri, Sri; Sturken, Ian

    2012-01-01

    Cloud computing can reduce cost significantly because businesses can share computing resources. In recent years, Small and Medium Businesses (SMBs) have used the Cloud effectively for cost saving and for sharing IT expenses. With the success of SMBs, many perceive that larger enterprises ought to move into the Cloud environment as well. Government agencies' stove-piped environments are being considered as candidates for potential use of the Cloud, either as an enterprise entity or as pockets of small communities. Cloud computing is the delivery of computing as a service rather than as a product, whereby shared resources, software, and information are provided to computers and other devices as a utility over a network. Underneath the offered services, there exists a modern infrastructure whose cost is often spread across its services or its investors. As NASA is considered an Enterprise-class organization, like other enterprises, a shift has been occurring in perceiving its IT services as candidates for Cloud services. This paper discusses market trends in cloud computing from an enterprise angle and then addresses the topic of Cloud Computing for NASA in two possible forms: first, in the form of a public Cloud to support it as an enterprise, as well as to share it with the commercial and public sectors at large; second, as a private Cloud wherein the infrastructure is operated solely for NASA, whether managed internally or by a third party and hosted internally or externally. The paper addresses the strengths and weaknesses of both paradigms of public and private Clouds, in both internally and externally operated settings. The content of the paper is from a NASA perspective but is applicable to any large enterprise with thousands of employees and contractors.

  9. Deep-sea genetic resources: New frontiers for science and stewardship in areas beyond national jurisdiction

    NASA Astrophysics Data System (ADS)

    Harden-Davies, Harriet

    2017-03-01

    The deep sea is a large source of marine genetic resources (MGR), which have many potential uses and are a growing area of research. Much of the deep sea lies in areas beyond national jurisdiction (ABNJ), which cover some 65% of the global ocean. MGR in ABNJ occupy a significant gap in the international legal framework. Access and benefit sharing of MGR is a key issue in the development of a new international legally binding instrument under the United Nations Convention on the Law of the Sea (UNCLOS) for the conservation and sustainable use of marine biological diversity in ABNJ. This paper examines how this is relevant to deep-sea scientific research and identifies emerging challenges and opportunities. There is no internationally agreed definition of MGR; however, deep-sea genetic resources could incorporate any biological material, including genes, proteins and natural products. Deep-sea scientific research is the key actor accessing MGR in ABNJ and sharing benefits such as data, samples and knowledge. UNCLOS provides the international legal framework for marine scientific research, international science cooperation, capacity building and marine technology transfer. Enhanced implementation could support access and benefit sharing of MGR in ABNJ. Deep-sea scientific researchers could play an important role in informing practical new governance solutions for access and benefit sharing of MGR that promote scientific research in ABNJ and support deep-sea stewardship. Advancing knowledge of deep-sea biodiversity in ABNJ, enhancing open access to data and samples, standardisation and international marine science cooperation are significant potential opportunity areas.

  10. Perspectives of the optical coherence tomography community on code and data sharing

    NASA Astrophysics Data System (ADS)

    Lurie, Kristen L.; Mistree, Behram F. T.; Ellerbee, Audrey K.

    2015-03-01

    As optical coherence tomography (OCT) grows to be a mature and successful field, it is important for the research community to develop a stronger practice of sharing code and data. A prolific culture of sharing can enable new and emerging laboratories to enter the field, allow research groups to gain new exposure and notoriety, and enable benchmarking of new algorithms and methods. Our long-term vision is to build tools to facilitate a stronger practice of sharing within this community. In line with this goal, our first aim was to understand the perceptions and practices of the community with respect to sharing research contributions (i.e., as code and data). We surveyed 52 members of the OCT community using an online polling system. Our main findings indicate that while researchers infrequently share their code and data, they are willing to contribute their research resources to a shared repository, and they believe that such a repository would benefit both their research and the OCT community at large. We plan to use the results of this survey to design a platform targeted to the OCT research community - an effort that ultimately aims to facilitate a more prolific culture of sharing.

  11. Liquidation sales: Land speculation and landscape change

    NASA Astrophysics Data System (ADS)

    Lazarus, E.

    2012-12-01

    Large-scale land-use transitions can occur with astonishing speed, and landscape stability can change with equal suddenness: for example, the catastrophic dustbowl that paralyzed the Midwestern US in the early 1930s came barely 40 years after the derby for homestead land in Oklahoma in 1889. Some human-landscape systems, like the large prehistoric settlements in the Brazilian Amazon, persisted for centuries without environmental collapse. Others quickly exhausted all of the environmental resources available, as occurred with phosphate mining on the Pacific Island of Nauru. Although abrupt shifts from resource plenty to resource scarcity are theoretically interesting for their complexity, the very real consequences of modern social and environmental boom-bust dynamics can catalyze humanitarian crises. Drawing on historical examples and investigative reporting of current events, I explore the hypothesis that land speculation drives rapid transitions in physical landscapes at large spatial scales. "Land grabs" is one of four core environmental justice and equality issues Oxfam International is targeting in its GROW campaign, citing evidence that foreign investors are buying up vast tracts of land in developing countries, and as a consequence exacerbating food scarcity and marginalization of poor families. Al Jazeera has reported extensively on land-rights disputes in Honduras and investment deals involving foreign ownership of large areas of agricultural land in New Zealand, India, Africa, and South America. Overlapping coverage has also appeared in the New York Times, the Washington Post, the BBC News, the Guardian, and other outlets. Although land itself is only one kind of natural resource, land rights typically determine access to other natural resources (e.g. water, timber, minerals, fossil fuels). Consideration of land speculation therefore includes speculative bubbles in natural-resource markets more broadly. There are categorical commonalities in agricultural change and deforestation around the world. Although the details differ at local scales, even disparate cases of land use and landscape changes may express similar patterns and structures. Records of sediment flux in salt marshes and fluvial deposits indicate rates of past landscape responses to human activities; the 1930s dustbowl event left a sedimentary signature in western North American lakes. Petrochemicals and fertilizers from agricultural runoff are causing hypoxic dead zones in coastal waters to expand. In the Brazilian Amazon, regional-scale changes in weather and climate have been linked to deforestation, and deforestation has been linked to patterns of boom-bust development. But even when rampant land acquisition for agriculture or housing has been identified as problematic, the attendant environmental consequences are not necessarily obvious. The nonlinear attenuation of cause and effect is a function of the hierarchy of scales that typify these complex, human-landscape systems: the emergence of long-term, large-scale environmental dynamics lags behind the short-term, localized dynamics of a resource bubble. Insight into how these coupled systems behave may reveal the scales at which government, institutional, or self-organized social intervention may be most effective, and presents an opportunity to integrate evolving spheres of research from the behavioural sciences and Earth-surface processes.

  12. Development of a consent resource for genomic data sharing in the clinical setting.

    PubMed

    Riggs, Erin Rooney; Azzariti, Danielle R; Niehaus, Annie; Goehringer, Scott R; Ramos, Erin M; Rodriguez, Laura Lyman; Knoppers, Bartha; Rehm, Heidi L; Martin, Christa Lese

    2018-06-13

    Data sharing between clinicians, laboratories, and patients is essential for improvements in genomic medicine, but obtaining consent for individual-level data sharing is often hindered by a lack of time and resources. To address this issue, the Clinical Genome Resource (ClinGen) developed tools to facilitate consent, including a one-page consent form and online supplemental video with information on key topics, such as risks and benefits of data sharing. To determine whether the consent form and video accurately conveyed key data sharing concepts, we surveyed 5,162 members of the general public. We measured comprehension at baseline, after reading the form, and after watching the video. Additionally, we assessed participants' attitudes toward genomic data sharing. Participants' performance on comprehension questions significantly improved over baseline after reading the form and continued to improve after watching the video. Results suggest reading the form alone provided participants with important knowledge regarding broad data sharing, and watching the video allowed for broader comprehension. These materials are now available at http://www.clinicalgenome.org/share. These resources will provide patients a straightforward way to share their genetic and health information, and improve the scientific community's access to data generated through routine healthcare.

  13. Virtual Control Policy for Binary Ordered Resources Petri Net Class.

    PubMed

    Rovetto, Carlos A; Concepción, Tomás J; Cano, Elia Esther

    2016-08-18

    Prevention and avoidance of deadlocks in sensor networks that use the wormhole routing algorithm is an active research domain. Diverse control policies address this problem, and our approach is a new method. In this paper we present a virtual control policy for a new specialized Petri net subclass called the Binary Ordered Resources Petri Net (BORPN). Essentially, it is an ordinary class constructed from various state machines that share unitary resources in a complex form, which allows branching and joining of processes. The reduced structure of this new class gives advantages that allow analysis of the entire system's behavior, a task that is otherwise prohibitive for large systems because of their complexity and routing algorithms.
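    As a point of reference, the sketch below shows the ordinary Petri net mechanics that such resource-sharing subclasses build on: token-based enabling and firing over places that model unitary shared resources. This is a hedged illustration only, not the BORPN formalism or the authors' control policy, and all place and transition names are invented.

    ```python
    # Minimal ordinary Petri net sketch: places hold tokens, transitions fire
    # by consuming tokens from input places and producing tokens in outputs.

    def is_enabled(marking, pre):
        """A transition is enabled if every input place holds enough tokens."""
        return all(marking.get(p, 0) >= w for p, w in pre.items())

    def fire(marking, pre, post):
        """Consume tokens from input places and produce tokens in output places."""
        if not is_enabled(marking, pre):
            raise ValueError("transition not enabled")
        new_marking = dict(marking)
        for p, w in pre.items():
            new_marking[p] -= w
        for p, w in post.items():
            new_marking[p] = new_marking.get(p, 0) + w
        return new_marking

    # Two processes competing for a single shared (unitary) resource place "R".
    marking = {"idle_A": 1, "idle_B": 1, "R": 1}
    t_acquire_A = ({"idle_A": 1, "R": 1}, {"busy_A": 1})   # (pre, post)
    marking = fire(marking, *t_acquire_A)                   # B is now blocked on R
    print(marking)
    ```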

  14. Early Care and Education in Tribal Communities: Research to Policy Resources

    ERIC Educational Resources Information Center

    Grate, Sheree; Stephens, Samuel A.

    2016-01-01

    There are approximately 390,000 children age nine and under who are identified by their parents as being of American Indian or Alaska Native (AIAN) heritage alone, while more than 400,000 other children in the same age range share this heritage with that of other race and ethnic groups. The large majority--about 80 percent--of AIAN individuals…

  15. The OSG open facility: A sharing ecosystem

    DOE PAGES

    Jayatilaka, B.; Levshina, T.; Rynge, M.; ...

    2015-12-23

    The Open Science Grid (OSG) ties together individual experiments' computing power, connecting their resources to create a large, robust computing grid. This computing infrastructure started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero. In the years since, the OSG has broadened its focus to also address the needs of other US researchers and increased delivery of Distributed High Throughput Computing (DHTC) to users from a wide variety of disciplines via the OSG Open Facility. Presently, the Open Facility delivers about 100 million computing wall hours per year to researchers who are not already associated with the owners of the computing sites. This is primarily accomplished by harvesting and organizing the temporarily unused capacity (i.e., opportunistic cycles) from the sites in the OSG. Using these methods, OSG resource providers and scientists share computing hours with researchers in many other fields to enable their science, striving to make sure that this computing power is used with maximal efficiency. Furthermore, we believe that expanded access to DHTC is an essential tool for scientific innovation, and work continues in expanding this service.
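    As a rough illustration of the opportunistic-cycle idea described above, here is a hedged toy sketch (not OSG software) in which backfill jobs are mapped only onto slots that the owning experiment has left idle; all names and data are invented.

    ```python
    # Toy "opportunistic cycle" harvester: assign backfill jobs to idle slots
    # without ever preempting the owning experiment's workload.
    from dataclasses import dataclass

    @dataclass
    class Slot:
        site: str
        owner_busy: bool   # True if the owning experiment is using the slot

    def harvest_opportunistic(slots, backfill_jobs):
        """Map opportunistic jobs onto currently idle slots, preempting nothing."""
        assignments = []
        idle = (s for s in slots if not s.owner_busy)
        for job, slot in zip(backfill_jobs, idle):
            assignments.append((job, slot.site))
        return assignments

    slots = [Slot("siteA", True), Slot("siteB", False), Slot("siteC", False)]
    print(harvest_opportunistic(slots, ["bio-job-1", "chem-job-2"]))
    ```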

  16. Potential gains from hospital mergers in Denmark.

    PubMed

    Kristensen, Troels; Bogetoft, Peter; Pedersen, Kjeld Moeller

    2010-12-01

    The Danish hospital sector faces a major rebuilding program to centralize activity in fewer and larger hospitals. We aim to conduct an efficiency analysis of hospitals and to estimate the potential cost savings from the planned hospital mergers. We use Data Envelopment Analysis (DEA) to estimate a cost frontier. Based on this analysis, we calculate an efficiency score for each hospital and estimate the potential gains from the proposed mergers by comparing individual efficiencies with the efficiency of the combined hospitals. Furthermore, we apply a decomposition algorithm to split merger gains into technical efficiency, size (scale) and harmony (mix) gains. The motivation for this decomposition is that some of the apparent merger gains may actually be available with less than a full-scale merger, e.g., by sharing best practices and reallocating certain resources and tasks. Our results suggest that many hospitals are technically inefficient, and the expected "best practice" hospitals are quite efficient. Also, some mergers do not seem to lower costs. This finding indicates that some merged hospitals become too large and therefore experience diseconomies of scale. Other mergers lead to considerable cost reductions; we find potential gains resulting from learning better practices and the exploitation of economies of scope. To ensure robustness, we conduct a sensitivity analysis using two alternative returns-to-scale assumptions and two alternative estimation approaches. We consistently find potential gains from improving the technical efficiency and the exploitation of economies of scope from mergers.
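    To make the DEA step concrete, the following is a minimal sketch of an input-oriented, constant-returns-to-scale DEA efficiency score solved as a linear program with SciPy. The hospital inputs and outputs are invented, and this is not the paper's exact cost-frontier model or its merger-gain decomposition.

    ```python
    # Input-oriented CRS DEA: minimize theta such that a convex combination of
    # peers uses no more than theta times hospital k's inputs while producing
    # at least hospital k's outputs.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[100.0], [80.0], [120.0]])   # one input per hospital (e.g., cost)
    Y = np.array([[500.0], [450.0], [520.0]])  # one output per hospital (e.g., cases)

    def dea_efficiency(k, X, Y):
        n = X.shape[0]
        c = np.r_[1.0, np.zeros(n)]                     # minimize theta
        A_in = np.c_[-X[k].reshape(-1, 1), X.T]         # X^T lam - theta*x_k <= 0
        b_in = np.zeros(X.shape[1])
        A_out = np.c_[np.zeros((Y.shape[1], 1)), -Y.T]  # -Y^T lam <= -y_k
        b_out = -Y[k]
        res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                      bounds=[(0, None)] * (n + 1), method="highs")
        return res.fun

    for k in range(3):
        print(f"hospital {k}: efficiency {dea_efficiency(k, X, Y):.3f}")
    ```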

  17. Developing enterprise tools and capacities for large-scale natural resource monitoring: A visioning workshop

    USGS Publications Warehouse

    Bayer, Jennifer M.; Weltzin, Jake F.; Scully, Rebecca A.

    2017-01-01

    Objectives of the workshop were: 1) identify resources that support natural resource monitoring programs working across the data life cycle; 2) prioritize desired capacities and tools to facilitate monitoring design and implementation; 3) identify standards and best practices that improve discovery, accessibility, and interoperability of data across programs and jurisdictions; and 4) contribute to an emerging community of practice focused on natural resource monitoring.

  18. Complexity as a Factor of Quality and Cost in Large Scale Software Development.

    DTIC Science & Technology

    1979-12-01

    …allocating testing resources. …The role of complexity in resource estimation and allocation: it can be argued that blame for the… and allocation of testing resources by identifying independent substructures and identifying heavily used logic paths; setting a design threshold… (remaining OCR fragments cover resource estimation, quality and testing, and programming units)

  19. Design and implementation of a distributed large-scale spatial database system based on J2EE

    NASA Astrophysics Data System (ADS)

    Gong, Jianya; Chen, Nengcheng; Zhu, Xinyan; Zhang, Xia

    2003-03-01

    With the increasing maturity of distributed object technology, CORBA, .NET and EJB are universally used in the traditional IT field. However, theories and practices of distributed spatial databases need further improvement owing to the contradictions between large-scale spatial data and limited network bandwidth, and between transitory sessions and long transaction processing. Differences and trends among CORBA, .NET and EJB are discussed in detail; afterwards the concept, architecture and characteristics of a distributed large-scale seamless spatial database system based on J2EE are provided, comprising a GIS client application, web server, GIS application server and spatial data server. Moreover, the design and implementation of the components are explained: the GIS client application based on JavaBeans, the GIS engine based on servlets, and the GIS application server based on GIS Enterprise JavaBeans (containing session beans and entity beans). Besides, experiments on the relation between spatial data volume and response time under different conditions are conducted, which prove that a distributed spatial database system based on J2EE can be used to manage, distribute and share large-scale spatial data on the Internet. Lastly, a distributed large-scale seamless image database based on the Internet is presented.

  20. Command and Control for Large-Scale Hybrid Warfare Systems

    DTIC Science & Technology

    2014-06-05

    …in C2 architectures was proposed using Petri nets (PNs) [10]. Liao in [11] reported an architecture for… arises from the challenging and often-conflicting user requirements, scale, scope, inter-connectivity with different large-scale networked teams and… resources can be easily modelled and reconfigured by the notion of a block matrix. At any time, the various missions of the networked team can be added…

  1. Wireless shared resources : sharing of right-of-way for wireless technology : guidance on legal and institutional issues

    DOT National Transportation Integrated Search

    1997-06-06

    Shared resource projects offer an opportunity for public transportation agencies to leverage property assets in exchange for support for transportation programs. Intelligent transportation systems (ITS) require wireline infrastructure in roadway ROW ...

  2. AIMES Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, Daniel S; Jha, Shantenu; Weissman, Jon

    2017-01-31

    This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI, and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined and implemented: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented into software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and resource layer (Bundles), derives a suitable execution strategy for the given skeleton and enacts its execution by means of pilots on one or more resources, depending on the application requirements, and resource availabilities and capabilities.

  3. AIMES Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weissman, Jon; Katz, Dan; Jha, Shantenu

    2017-01-31

    This is the final technical report for the AIMES project. Many important advances in science and engineering are due to large-scale distributed computing. Notwithstanding this reliance, we are still learning how to design and deploy large-scale production Distributed Computing Infrastructures (DCI). This is evidenced by missing design principles for DCI, and an absence of generally acceptable and usable distributed computing abstractions. The AIMES project was conceived against this backdrop, following on the heels of a comprehensive survey of scientific distributed applications. AIMES laid the foundations to address the tripartite challenge of dynamic resource management, integrating information, and portable and interoperable distributed applications. Four abstractions were defined and implemented: skeleton, resource bundle, pilot, and execution strategy. The four abstractions were implemented into software modules and then aggregated into the AIMES middleware. This middleware successfully integrates information across the application layer (skeletons) and resource layer (Bundles), derives a suitable execution strategy for the given skeleton and enacts its execution by means of pilots on one or more resources, depending on the application requirements, and resource availabilities and capabilities.

  4. Building the Scientific Modeling Assistant: An interactive environment for specialized software design

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.

    1991-01-01

    The construction of scientific software models is an integral part of doing science, both within NASA and within the scientific community at large. Typically, model-building is a time-intensive and painstaking process, involving the design of very large, complex computer programs. Despite the considerable expenditure of resources involved, completed scientific models cannot easily be distributed and shared with the larger scientific community due to the low-level, idiosyncratic nature of the implemented code. To address this problem, we have initiated a research project aimed at constructing a software tool called the Scientific Modeling Assistant. This tool provides automated assistance to the scientist in developing, using, and sharing software models. We describe the Scientific Modeling Assistant, and also touch on some human-machine interaction issues relevant to building a successful tool of this type.

  5. Philanthropy and Beyond: Creating Shared Value to Promote Well-Being for Individuals in Their Communities.

    PubMed

    Kottke, Thomas E; Pronk, Nico; Zinkel, Andrew R; Isham, George J

    2017-01-01

    Health care organizations can magnify the impact of their community service and other philanthropic activities by implementing programs that create shared value. By definition, shared value is created when an initiative generates benefit for the sponsoring organization while also generating societal and community benefit. Because the programs generate benefit for the sponsoring organizations, the magnitude of any particular initiative is limited only by the market for the benefit and not the resources that are available for philanthropy. In this article we use three initiatives in sectors other than health care to illustrate the concept of shared value. We also present examples of five types of shared value programs that are sponsored by health care organizations: telehealth, worksite health promotion, school-based health centers, green and healthy housing, and clean and green health services. On the basis of the innovativeness of health care organizations that have already implemented programs that create shared value, we conclude that the opportunities for all health care organizations to create positive impact for individuals and communities through similar programs are large, and the limits have yet to be defined.

  6. Modeling relief demands in an emergency supply chain system under large-scale disasters based on a queuing network.

    PubMed

    He, Xinhua; Hu, Wenfa

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes; and emergency rescue services queue at multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model.
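    As a hedged illustration of the genetic-algorithm step only (not the authors' queuing network model), the toy sketch below evolves an assignment of rescue-demand locations to depots against an invented distance matrix and a simple congestion penalty standing in for queuing delay.

    ```python
    # Toy GA for a location-allocation assignment: minimize travel time plus a
    # per-depot congestion penalty. All data and parameters are invented.
    import random

    random.seed(0)
    N_DEMANDS, N_DEPOTS = 8, 3
    dist = [[random.uniform(1, 10) for _ in range(N_DEPOTS)] for _ in range(N_DEMANDS)]

    def response_time(assign):
        """Travel time plus a simple congestion penalty per depot load."""
        load = [assign.count(d) for d in range(N_DEPOTS)]
        return sum(dist[i][d] + 0.5 * load[d] for i, d in enumerate(assign))

    def evolve(pop_size=30, generations=100):
        pop = [[random.randrange(N_DEPOTS) for _ in range(N_DEMANDS)] for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=response_time)
            parents = pop[: pop_size // 2]                     # elitist selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, N_DEMANDS)           # one-point crossover
                child = a[:cut] + b[cut:]
                if random.random() < 0.2:                      # mutation
                    child[random.randrange(N_DEMANDS)] = random.randrange(N_DEPOTS)
                children.append(child)
            pop = parents + children
        return min(pop, key=response_time)

    best = evolve()
    print(best, round(response_time(best), 2))
    ```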

  7. Multimode resource-constrained multiple project scheduling problem under fuzzy random environment and its application to a large scale hydropower construction project.

    PubMed

    Xu, Jiuping; Feng, Cuiying

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
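    A minimal sketch of the defuzzification idea mentioned above, assuming the common formulation in which a triangular fuzzy number is reduced to a crisp value by mixing its lower and upper expectations with an optimistic-pessimistic index; this is not the paper's full hybrid crisp approach, and all parameter values are invented.

    ```python
    # Expected value of a triangular fuzzy number (a, m, b) weighted by an
    # optimistic-pessimistic index lam in [0, 1].

    def expected_value_triangular(a, m, b, lam=0.5):
        """Mix the pessimistic (lower) and optimistic (upper) expectations.

        Lower expectation: (a + m) / 2; upper expectation: (m + b) / 2.
        lam = 0 is fully pessimistic, lam = 1 fully optimistic.
        """
        lower = (a + m) / 2.0
        upper = (m + b) / 2.0
        return (1 - lam) * lower + lam * upper

    # Fuzzy activity duration "about 10 days, between 8 and 14":
    for lam in (0.0, 0.5, 1.0):
        print(lam, expected_value_triangular(8, 10, 14, lam))
    ```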

  8. Modeling Relief Demands in an Emergency Supply Chain System under Large-Scale Disasters Based on a Queuing Network

    PubMed Central

    He, Xinhua

    2014-01-01

    This paper presents a multiple-rescue model for an emergency supply chain system under uncertainties in the large-scale affected area of a disaster. The proposed methodology takes into consideration that the rescue demands caused by a large-scale disaster are scattered across several locations; the servers are arranged in multiple echelons (resource depots, distribution centers, and rescue center sites) located in different places but coordinated within one emergency supply chain system; depending on the types of rescue demands, one or more distinct servers dispatch emergency resources along different vehicle routes; and emergency rescue services queue at multiple rescue-demand locations. This emergency system is modeled as a minimal queuing response time model of location and allocation. A solution to this complex mathematical problem is developed based on a genetic algorithm. Finally, a case study of an emergency supply chain system operating in Shanghai is discussed. The results demonstrate the robustness and applicability of the proposed model. PMID:24688367

  9. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    PubMed Central

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  10. A large-scale full-length cDNA analysis to explore the budding yeast transcriptome

    PubMed Central

    Miura, Fumihito; Kawaguchi, Noriko; Sese, Jun; Toyoda, Atsushi; Hattori, Masahira; Morishita, Shinichi; Ito, Takashi

    2006-01-01

    We performed a large-scale cDNA analysis to explore the transcriptome of the budding yeast Saccharomyces cerevisiae. We sequenced two cDNA libraries, one from cells exponentially growing in a minimal medium and the other from meiotic cells. Both libraries were generated by using a vector-capping method that allows the accurate mapping of transcription start sites (TSSs). Consequently, we identified 11,575 TSSs associated with 3,638 annotated genomic features, including 3,599 ORFs, suggesting that most yeast genes have two or more TSSs. In addition, we identified 45 previously undescribed introns, including those affecting current ORF annotations and those spliced alternatively. Furthermore, the analysis revealed 667 transcription units in the intergenic regions and transcripts derived from antisense strands of 367 known features. We also found that 348 ORFs carry TSSs in their 3′-halves to generate sense transcripts starting from inside the ORFs. These results indicate that the budding yeast transcriptome is considerably more complex than previously thought, and it shares many recently revealed characteristics with the transcriptomes of mammals and other higher eukaryotes. Thus, genome-wide active transcription that generates novel classes of transcripts appears to be an intrinsic feature of eukaryotic cells. The budding yeast will serve as a versatile model for studies on these aspects of the transcriptome, and the full-length cDNA clones can function as an invaluable resource in such studies. PMID:17101987

  11. Complex dewetting scenarios of ultrathin silicon films for large-scale nanoarchitectures

    PubMed Central

    Naffouti, Meher; Backofen, Rainer; Salvalaglio, Marco; Bottein, Thomas; Lodari, Mario; Voigt, Axel; David, Thomas; Benkouider, Abdelmalek; Fraj, Ibtissem; Favre, Luc; Ronda, Antoine; Berbezier, Isabelle; Grosso, David; Abbarchi, Marco; Bollani, Monica

    2017-01-01

    Dewetting is a ubiquitous phenomenon in nature; many different thin films of organic and inorganic substances (such as liquids, polymers, metals, and semiconductors) share this shape instability driven by surface tension and mass transport. Via templated solid-state dewetting, we frame complex nanoarchitectures of monocrystalline silicon on insulator with unprecedented precision and reproducibility over large scales. Phase-field simulations reveal the dominant role of surface diffusion as a driving force for dewetting and provide a predictive tool to further engineer this hybrid top-down/bottom-up self-assembly method. Our results demonstrate that patches of thin monocrystalline films of metals and semiconductors share the same dewetting dynamics. We also prove the potential of our method by fabricating nanotransfer molding of metal oxide xerogels on silicon and glass substrates. This method allows the novel possibility of transferring these Si-based patterns on different materials, which do not usually undergo dewetting, offering great potential also for microfluidic or sensing applications. PMID:29296680

  12. Complex dewetting scenarios of ultrathin silicon films for large-scale nanoarchitectures.

    PubMed

    Naffouti, Meher; Backofen, Rainer; Salvalaglio, Marco; Bottein, Thomas; Lodari, Mario; Voigt, Axel; David, Thomas; Benkouider, Abdelmalek; Fraj, Ibtissem; Favre, Luc; Ronda, Antoine; Berbezier, Isabelle; Grosso, David; Abbarchi, Marco; Bollani, Monica

    2017-11-01

    Dewetting is a ubiquitous phenomenon in nature; many different thin films of organic and inorganic substances (such as liquids, polymers, metals, and semiconductors) share this shape instability driven by surface tension and mass transport. Via templated solid-state dewetting, we frame complex nanoarchitectures of monocrystalline silicon on insulator with unprecedented precision and reproducibility over large scales. Phase-field simulations reveal the dominant role of surface diffusion as a driving force for dewetting and provide a predictive tool to further engineer this hybrid top-down/bottom-up self-assembly method. Our results demonstrate that patches of thin monocrystalline films of metals and semiconductors share the same dewetting dynamics. We also prove the potential of our method by fabricating nanotransfer molding of metal oxide xerogels on silicon and glass substrates. This method allows the novel possibility of transferring these Si-based patterns on different materials, which do not usually undergo dewetting, offering great potential also for microfluidic or sensing applications.

  13. Challenges and Opportunities: One Stop Processing of Automatic Large-Scale Base Map Production Using Airborne LIDAR Data Within GIS Environment. Case Study: Makassar City, Indonesia

    NASA Astrophysics Data System (ADS)

    Widyaningrum, E.; Gorte, B. G. H.

    2017-05-01

    LiDAR data acquisition is recognized as one of the fastest solutions for providing base data for large-scale topographical base maps worldwide. Automatic LiDAR processing is believed to be one possible scheme to accelerate large-scale topographic base map provision by the Geospatial Information Agency in Indonesia. As an advanced and still-progressing technology, Geographic Information Systems (GIS) open possibilities for automatic geospatial data processing and analyses. Considering further needs for spatial data sharing and integration, one-stop processing of LiDAR data in a GIS environment is considered a powerful and efficient approach for base map provision. The quality of the automated topographic base map is assessed and analysed in terms of its completeness, correctness and overall quality, derived from a confusion matrix.
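    For reference, a small sketch of the completeness, correctness, and quality measures conventionally derived from a binary confusion matrix in map-extraction evaluation; the counts are invented, and this is not the authors' evaluation code.

    ```python
    # Completeness/correctness/quality from TP, FP, FN counts of extracted
    # map objects versus a reference dataset.

    def completeness(tp, fn):
        """Share of reference objects that were detected (a.k.a. recall)."""
        return tp / (tp + fn)

    def correctness(tp, fp):
        """Share of detected objects that are correct (a.k.a. precision)."""
        return tp / (tp + fp)

    def quality(tp, fp, fn):
        """Combined measure penalizing both misses and false detections."""
        return tp / (tp + fp + fn)

    tp, fp, fn = 820, 60, 120
    print(f"completeness={completeness(tp, fn):.2f}, "
          f"correctness={correctness(tp, fp):.2f}, quality={quality(tp, fp, fn):.2f}")
    ```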

  14. Space-time dependence between energy sources and climate related energy production

    NASA Astrophysics Data System (ADS)

    Engeland, Kolbjorn; Borga, Marco; Creutin, Jean-Dominique; Ramos, Maria-Helena; Tøfte, Lena; Warland, Geir

    2014-05-01

    The European Renewable Energy Directive adopted in 2009 focuses on achieving a 20% share of renewable energy in the EU overall energy mix by 2020. A major part of renewable energy production is related to climate, called "climate related energy" (CRE) production. CRE production systems (wind, solar, and hydropower) are characterized by a large degree of intermittency and variability on both short and long time scales due to the natural variability of climate variables. The main strategies to handle the variability of CRE production include energy storage, transport, diversity and information (smart grids). The first three strategies aim to smooth out the intermittency and variability of CRE production in time and space, whereas the last strategy aims to provide a more optimal interaction between energy production and demand, i.e. to smooth out the residual load (the difference between demand and production). In order to increase the CRE share in the electricity system, it is essential to understand the space-time co-variability between the weather variables and CRE production under both current and future climates. This study presents a review of the literature that seeks to tackle these problems. It reveals that the majority of studies deal with either a single CRE source or with the combination of two CREs, mostly wind and solar. This may be due to the fact that the most advanced countries in terms of wind equipment also have very little hydropower potential (Denmark, Ireland or the UK, for instance). Hydropower is characterized by both a large storage capacity and flexibility in electricity production, and therefore has a large potential for both balancing and storing energy from wind and solar power. Several studies look at how to better connect regions with a large share of hydropower (e.g., Scandinavia and the Alps) to regions with high shares of wind and solar power (e.g., a green-battery North Sea net). Considering time scales, various studies consider wind and solar power production and their co-fluctuation at small time scales. The multi-scale nature of the variability is less studied; in particular, potential adverse or favourable co-fluctuations at intermediate time scales, involving water scarcity or abundance, are less present in the literature. Our review points out that it could be especially interesting to promote research on how the pronounced large-scale fluctuations in inflow to hydropower (intra-annual run-off) and smaller-scale fluctuations in wind and solar power interact in an energy system. There is a need to better represent the profound difference between wind, solar and hydro energy sources. On the one hand, they are all directly linked to the 2-D horizontal dynamics of meteorology. On the other hand, the branching structure of hydrological systems transforms this variability and governs the complex combination of natural inflows and reservoir storage. Finally, we note that CRE production is, in addition to weather, also influenced by the energy system and market, i.e. energy transport and demand across scales as well as changes in market regulation. The CRE production system thus lies at the nexus between climate, energy systems and market regulation. The work presented is part of the FP7 project COMPLEX (Knowledge based climate mitigation systems for a low carbon economy; http://www.complex.ac.uk)

  15. A Study of Veterans Administration/Department of Defense Health Care Resources Sharing at Keller Army Community Hospital West Point, New York 10996

    DTIC Science & Technology

    1984-04-01

    …civilian facility. In FY 79, of the $20 million that the Veterans Administration (VA) spent on shared services, only $17,000 was for services shared… (2) present incentives to encourage shared services are inadequate; and (3) such sharing of resources can be effected without a detrimental impact on… "Regionalization in Perspective", which provided an excellent review of hospital regionalization and the potential benefits associated with shared services.

  16. The medical science DMZ: a network design pattern for data-intensive medical science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Dart, Eli; Barnett, William

    We describe a detailed solution for maintaining high-capacity, data-intensive network flows (eg, 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet-filter firewalls, network intrusion-detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs. The exponentially increasing amounts of "omics" data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research "Big Data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows. By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements.

  17. The Medical Science DMZ.

    PubMed

    Peisert, Sean; Barnett, William; Dart, Eli; Cuff, James; Grossman, Robert L; Balas, Edward; Berman, Ari; Shankar, Anurag; Tierney, Brian

    2016-11-01

    We describe use cases and an institutional reference architecture for maintaining high-capacity, data-intensive network flows (e.g., 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet filter firewalls, network intrusion detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive data sets between research institutions over national research networks. The exponentially increasing amounts of "omics" data, the rapid increase of high-quality imaging, and other rapidly growing clinical data sets have resulted in the rise of biomedical research "big data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large data sets. Maintaining data-intensive flows that comply with HIPAA and other regulations presents a new challenge for biomedical research. Recognizing this, we describe a strategy that marries performance and security by borrowing from and redefining the concept of a "Science DMZ"-a framework that is used in physical sciences and engineering research to manage high-capacity data flows. By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  18. The medical science DMZ: a network design pattern for data-intensive medical science.

    PubMed

    Peisert, Sean; Dart, Eli; Barnett, William; Balas, Edward; Cuff, James; Grossman, Robert L; Berman, Ari; Shankar, Anurag; Tierney, Brian

    2017-10-06

    We describe a detailed solution for maintaining high-capacity, data-intensive network flows (eg, 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. High-end networking, packet-filter firewalls, network intrusion-detection systems. We describe a "Medical Science DMZ" concept as an option for secure, high-volume transport of large, sensitive datasets between research institutions over national research networks, and give 3 detailed descriptions of implemented Medical Science DMZs. The exponentially increasing amounts of "omics" data, high-quality imaging, and other rapidly growing clinical datasets have resulted in the rise of biomedical research "Big Data." The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large datasets. Maintaining data-intensive flows that comply with the Health Insurance Portability and Accountability Act (HIPAA) and other regulations presents a new challenge for biomedical research. We describe a strategy that marries performance and security by borrowing from and redefining the concept of a Science DMZ, a framework that is used in physical sciences and engineering research to manage high-capacity data flows. By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.

  19. The Medical Science DMZ

    PubMed Central

    Barnett, William; Dart, Eli; Cuff, James; Grossman, Robert L; Balas, Edward; Berman, Ari; Shankar, Anurag; Tierney, Brian

    2016-01-01

    Objective We describe use cases and an institutional reference architecture for maintaining high-capacity, data-intensive network flows (e.g., 10, 40, 100 Gbps+) in a scientific, medical context while still adhering to security and privacy laws and regulations. Materials and Methods High-end networking, packet filter firewalls, network intrusion detection systems. Results We describe a “Medical Science DMZ” concept as an option for secure, high-volume transport of large, sensitive data sets between research institutions over national research networks. Discussion The exponentially increasing amounts of “omics” data, the rapid increase of high-quality imaging, and other rapidly growing clinical data sets have resulted in the rise of biomedical research “big data.” The storage, analysis, and network resources required to process these data and integrate them into patient diagnoses and treatments have grown to scales that strain the capabilities of academic health centers. Some data are not generated locally and cannot be sustained locally, and shared data repositories such as those provided by the National Library of Medicine, the National Cancer Institute, and international partners such as the European Bioinformatics Institute are rapidly growing. The ability to store and compute using these data must therefore be addressed by a combination of local, national, and industry resources that exchange large data sets. Maintaining data-intensive flows that comply with HIPAA and other regulations presents a new challenge for biomedical research. Recognizing this, we describe a strategy that marries performance and security by borrowing from and redefining the concept of a “Science DMZ”—a framework that is used in physical sciences and engineering research to manage high-capacity data flows. Conclusion By implementing a Medical Science DMZ architecture, biomedical researchers can leverage the scale provided by high-performance computer and cloud storage facilities and national high-speed research networks while preserving privacy and meeting regulatory requirements. PMID:27136944

  20. Climate change collaboration among natural resource management agencies: lessons learned from two US regions

    USGS Publications Warehouse

    Lemieux, Christopher J.; Thompson, Jessica; Slocombe, D. Scott; Schuster, Rudy

    2015-01-01

    It has been argued that regional collaboration can facilitate adaptation to climate change impacts through integrated planning and management. In an attempt to understand the underlying institutional factors that either support or contest this assumption, this paper explores the institutional factors influencing adaptation to climate change at the regional scale, where multiple public land and natural resource management jurisdictions are involved. Insights from two mid-western US case studies reveal that several challenges to collaboration persist and prevent fully integrative multi-jurisdictional adaptation planning at a regional scale. We propose that some of these challenges, such as lack of adequate time, funding and communication channels, be reframed as opportunities to build interdependence, identify issue-linkages and collaboratively explore the nature and extent of organisational trade-offs with respect to regional climate change adaptation efforts. Such a reframing can better facilitate multi-jurisdictional adaptation planning and management of shared biophysical resources generally while simultaneously enhancing organisational capacity to mitigate negative effects and take advantage of potentially favourable future conditions in an era characterised by rapid climate change.

  1. Higher-Order Exploratory Factor Analysis of the Reynolds Intellectual Assessment Scales with a Referred Sample

    ERIC Educational Resources Information Center

    Nelson, Jason M.; Canivez, Gary L.; Lindstrom, Will; Hatt, Clifford V.

    2007-01-01

    The factor structure of the Reynolds Intellectual Assessment Scales (RIAS; [Reynolds, C.R., & Kamphaus, R.W. (2003). "Reynolds Intellectual Assessment Scales". Lutz, FL: Psychological Assessment Resources, Inc.]) was investigated with a large (N=1163) independent sample of referred students (ages 6-18). More rigorous factor extraction criteria…

  2. Trophic pathways supporting juvenile Chinook and Coho salmon in the glacial Susitna River, Alaska: patterns of freshwater, marine, and terrestrial resource use across a seasonally dynamic habitat mosaic

    USGS Publications Warehouse

    Rine, Kristin M.; Wipfli, Mark S.; Schoen, Erik R.; Nightengale, Timothy L.; Stricker, Craig A.

    2016-01-01

    Contributions of terrestrial-, freshwater-, and marine-derived prey resources to stream fishes vary over time and space, altering the energy pathways that regulate production. In this study, we determined large-scale use of these resources by juvenile Chinook and coho salmon (Oncorhynchus tshawytscha and Oncorhynchus kisutch, respectively) in the glacial Susitna River, Alaska. We resolved spatial and temporal trophic patterns among multiple macrohabitat types along a 97 km segment of the river corridor via stable isotope and stomach content analyses. Juvenile salmon were supported primarily by freshwater-derived resources and secondarily by marine and terrestrial sources. The relative contribution of marine-derived prey to rearing salmon was greatest in the fall within off-channel macrohabitats, whereas the contributions of terrestrial invertebrate prey were generally greatest during midsummer, across all macrohabitats. No longitudinal (upstream–downstream) diet pattern was discernable. These results highlight large-scale spatial and seasonal patterns of energy flow and the dynamic interplay of pulsed marine and terrestrial prey subsidies to juvenile Chinook and coho salmon in a large, complex, and relatively pristine glacial river.

  3. A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size.

    PubMed

    Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E

    2015-01-01

    One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics.
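    To illustrate the kind of resource-driven growth-and-fission rules such agent-based models build on, here is a heavily hedged toy sketch. It is not the authors' model, uses invented parameters on a one-dimensional resource strip, and makes no claim of reproducing their U-shaped group-size distribution.

    ```python
    # Toy resource-limited group lifecycle model: groups grow with local
    # resources and found a new neighbouring group once they exceed a threshold.
    import random

    random.seed(1)
    L = 40
    resources = [random.uniform(0.5, 1.5) for _ in range(L)]   # per-cell resource level
    groups = {L // 2: 10.0}                                     # cell -> group size

    for step in range(200):
        new_groups = dict(groups)
        for cell, size in groups.items():
            growth = 0.05 * resources[cell] * size              # resource-limited growth
            new_groups[cell] = size + growth - 0.02 * size      # minus baseline mortality
            if new_groups[cell] > 50:                           # fission into a free neighbour
                for nb in (cell - 1, cell + 1):
                    if 0 <= nb < L and nb not in new_groups:
                        new_groups[cell] -= 10
                        new_groups[nb] = 10.0
                        break
        groups = {c: s for c, s in new_groups.items() if s > 1}  # tiny groups disappear

    sizes = sorted(groups.values(), reverse=True)
    print(f"{len(sizes)} groups; largest={sizes[0]:.0f}, smallest={sizes[-1]:.0f}")
    ```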

  4. A Life-Cycle Model of Human Social Groups Produces a U-Shaped Distribution in Group Size

    PubMed Central

    Salali, Gul Deniz; Whitehouse, Harvey; Hochberg, Michael E.

    2015-01-01

    One of the central puzzles in the study of sociocultural evolution is how and why transitions from small-scale human groups to large-scale, hierarchically more complex ones occurred. Here we develop a spatially explicit agent-based model as a first step towards understanding the ecological dynamics of small and large-scale human groups. By analogy with the interactions between single-celled and multicellular organisms, we build a theory of group lifecycles as an emergent property of single cell demographic and expansion behaviours. We find that once the transition from small-scale to large-scale groups occurs, a few large-scale groups continue expanding while small-scale groups gradually become scarcer, and large-scale groups become larger in size and fewer in number over time. Demographic and expansion behaviours of groups are largely influenced by the distribution and availability of resources. Our results conform to a pattern of human political change in which religions and nation states come to be represented by a few large units and many smaller ones. Future enhancements of the model should include decision-making rules and probabilities of fragmentation for large-scale societies. We suggest that the synthesis of population ecology and social evolution will generate increasingly plausible models of human group dynamics. PMID:26381745

  5. Global teaching and training initiatives for emerging cohort studies

    PubMed Central

    Paulus, Jessica K.; Santoyo-Vistrain, Rocío; Havelick, David; Cohen, Amy; Kalyesubula, Robert; Ajayi, Ikeoluwapo O.; Mattsson, Jens G.; Adami, Hans-Olov; Dalal, Shona

    2015-01-01

    A striking disparity exists across the globe, with essentially no large-scale longitudinal studies ongoing in regions that will be significantly affected by the oncoming non-communicable disease epidemic. The successful implementation of cohort studies in most low-resource research environments presents unique challenges that may be aided by coordinated training programs. Leaders of emerging cohort studies attending the First World Cohort Integration Workshop were surveyed about training priorities, unmet needs and potential cross-cohort solutions to these barriers through an electronic pre-workshop questionnaire and focus groups. Cohort studies representing India, Mexico, Nigeria, South Africa, Sweden, Tanzania and Uganda described similar training needs, including on-the-job training, data analysis software instruction, and database and bio-bank management. A lack of funding and protected time for training activities were commonly identified constraints. Proposed solutions include a collaborative cross-cohort teaching platform with web-based content and interactive teaching methods for a range of research personnel. An international network for research mentorship and idea exchange, and modifying the graduate thesis structure were also identified as key initiatives. Cross-cohort integrated educational initiatives will efficiently meet shared needs, catalyze the development of emerging cohorts, speed closure of the global disparity in cohort research, and may fortify scientific capacity development in low-resource settings. PMID:23856451

  6. Interacting with the National Database for Autism Research (NDAR) via the LONI Pipeline workflow environment.

    PubMed

    Torgerson, Carinna M; Quinn, Catherine; Dinov, Ivo; Liu, Zhizhong; Petrosyan, Petros; Pelphrey, Kevin; Haselgrove, Christian; Kennedy, David N; Toga, Arthur W; Van Horn, John Darrell

    2015-03-01

    Under the umbrella of the National Database for Clinical Trials (NDCT) related to mental illnesses, the National Database for Autism Research (NDAR) seeks to gather, curate, and make openly available neuroimaging data from NIH-funded studies of autism spectrum disorder (ASD). NDAR has recently made its database accessible through the LONI Pipeline workflow design and execution environment to enable large-scale analyses of cortical architecture and function via local, cluster, or "cloud"-based computing resources. This presents a unique opportunity to overcome many of the customary limitations to fostering biomedical neuroimaging as a science of discovery. Providing open access to primary neuroimaging data, workflow methods, and high-performance computing will increase uniformity in data collection protocols, encourage greater reliability of published data, results replication, and broaden the range of researchers now able to perform larger studies than ever before. To illustrate the use of NDAR and LONI Pipeline for performing several commonly performed neuroimaging processing steps and analyses, this paper presents example workflows useful for ASD neuroimaging researchers seeking to begin using this valuable combination of online data and computational resources. We discuss the utility of such database and workflow processing interactivity as a motivation for the sharing of additional primary data in ASD research and elsewhere.

  7. Parallelizing ATLAS Reconstruction and Simulation: Issues and Optimization Solutions for Scaling on Multi- and Many-CPU Platforms

    NASA Astrophysics Data System (ADS)

    Leggett, C.; Binet, S.; Jackson, K.; Levinthal, D.; Tatarkhanov, M.; Yao, Y.

    2011-12-01

    Thermal limitations have forced CPU manufacturers to shift from simply increasing clock speeds to improve processor performance, to producing chip designs with multi- and many-core architectures. Further, the cores themselves can run multiple threads with a near-zero-overhead context switch, allowing low-level resource sharing (Intel Hyperthreading). To maximize bandwidth and minimize memory latency, memory access has become non-uniform (NUMA). As manufacturers add more cores to each chip, a careful understanding of the underlying architecture is required in order to fully utilize the available resources. We present AthenaMP and the ATLAS event loop manager, the driver of the simulation and reconstruction engines, which have been rewritten to make use of multiple cores by means of event-based parallelism and final-stage I/O synchronization. However, initial studies on 8 and 16 core Intel architectures have shown marked non-linearities as parallel process counts increase, with as much as 30% reductions in event throughput in some scenarios. Since the Intel Nehalem architecture (both Gainestown and Westmere) will be the most common choice for the next round of hardware procurements, an understanding of these scaling issues is essential. Using hardware-based event counters and Intel's Performance Tuning Utility, we have studied the performance bottlenecks at the hardware level, and discovered optimization schemes to maximize processor throughput. We have also produced optimization mechanisms, common to all large experiments, that address the extreme nature of today's HEP code, which, due to its size, places huge burdens on the memory infrastructure of today's processors.
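    To show the event-based parallelism pattern in generic terms, here is a minimal process-pool sketch with a final merge step standing in for the last-stage I/O synchronization. This is not AthenaMP or ATLAS code; the worker function, event counts, and summary fields are invented.

    ```python
    # Event-based parallelism: each worker process handles a disjoint chunk of
    # events; a final merge combines per-worker outputs into one ordered stream.
    from multiprocessing import Pool

    def reconstruct(event_id):
        """Stand-in for per-event reconstruction; returns a small summary record."""
        return {"event": event_id, "ntracks": (event_id * 7) % 13}

    def merge(results):
        """Final-stage synchronization: combine worker outputs into one stream."""
        return sorted(results, key=lambda r: r["event"])

    if __name__ == "__main__":
        events = range(32)
        with Pool(processes=4) as pool:              # roughly one worker per core
            results = pool.map(reconstruct, events, chunksize=8)
        merged = merge(results)
        print(len(merged), merged[0])
    ```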

  8. International Comparison of Poststroke Resource Use: A Longitudinal Analysis in Europe.

    PubMed

    Matchar, David B; Bilger, Marcel; Do, Young K; Eom, Kirsten

    2015-10-01

    Long-term costs often represent a large proportion of the total costs induced by stroke, but data on long-term poststroke resource use are sparse, especially regarding the trajectory of costs by severity. We used a multinational longitudinal survey to estimate patterns of poststroke resource use by degree of functional disability and to compare resource use between regions. The Survey of Health, Ageing and Retirement in Europe (SHARE) is a multinational database of adults 50 years and older, which includes demographic information about respondents, age when stroke first occurred, current activity of daily living (ADL) limitations, and health care resource use in the year before interview. We modeled resource use with a 2-part regression for number of hospital days, home nursing hours, and paid and unpaid home caregiving hours. After accounting for time since stroke, number of strokes and comorbidities, age, gender, and European regions, we found that poststroke resource use was strongly associated with ADL limitations. The duration since the stroke event was significantly associated only with inpatient care, and informal help showed significant regional heterogeneity across all ADL limitation levels. Poststroke physical deficits appear to be a strong driver of long-term resource utilization; treatments that decrease such deficits offer substantial potential for downstream cost savings. Analyzing internationally comparable panel data, such as SHARE, provides valuable insight into the long-term cost of stroke. More comprehensive international comparisons will require registries with follow-up, particularly for informal and formal home-based care. Copyright © 2015 National Stroke Association. Published by Elsevier Inc. All rights reserved.
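
    As a rough sketch of a two-part model of this kind, the snippet below fits a logistic first part (whether any hospital use occurred) and a Poisson second part (amount of use among users); the file name, column names, and the choice of a Poisson second part are assumptions for illustration and are not taken from the SHARE analysis.

    # Illustrative two-part (hurdle-style) model for health-care utilisation:
    # part 1 models whether any use occurred, part 2 models how much use,
    # conditional on any use. Column names are hypothetical.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("resource_use.csv")          # hypothetical input file
    X = sm.add_constant(df[["adl_limitations", "years_since_stroke", "age"]])

    # Part 1: probability of any hospital days (logistic regression).
    any_use = (df["hospital_days"] > 0).astype(int)
    part1 = sm.Logit(any_use, X).fit()

    # Part 2: number of hospital days among users (Poisson GLM).
    users = df["hospital_days"] > 0
    part2 = sm.GLM(df.loc[users, "hospital_days"], X.loc[users],
                   family=sm.families.Poisson()).fit()

    # Expected days = P(any use) * E[days | any use].
    expected_days = part1.predict(X) * part2.predict(X)
    print(np.round(expected_days.head(), 2))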

  9. A methodology towards virtualisation-based high performance simulation platform supporting multidisciplinary design of complex products

    NASA Astrophysics Data System (ADS)

    Ren, Lei; Zhang, Lin; Tao, Fei; (Luke) Zhang, Xiaolong; Luo, Yongliang; Zhang, Yabin

    2012-08-01

    Multidisciplinary design of complex products leads to an increasing demand for high performance simulation (HPS) platforms. One great challenge is how to achieve highly efficient utilisation of large-scale simulation resources in distributed and heterogeneous environments. This article reports a virtualisation-based methodology to realise an HPS platform. This research is driven by the issues concerning large-scale simulation resources deployment and complex simulation environment construction, efficient and transparent utilisation of fine-grained simulation resources, and highly reliable simulation with fault tolerance. A framework of virtualisation-based simulation platform (VSIM) is first proposed. Then the article investigates and discusses key approaches in VSIM, including simulation resources modelling, a method for automatically deploying simulation resources for dynamic construction of the system environment, and a live migration mechanism in case of faults in run-time simulation. Furthermore, the proposed methodology is applied to a multidisciplinary design system for aircraft virtual prototyping and some experiments are conducted. The experimental results show that the proposed methodology can (1) significantly improve the utilisation of fine-grained simulation resources, (2) result in a great reduction in deployment time and an increased flexibility for simulation environment construction, and (3) achieve fault tolerant simulation.
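
    The paper's mechanisms are specific to its platform; purely as a hedged sketch of the general idea (a simple resource model, automatic placement of a simulation task on a suitable virtualised host, and re-placement when a host fails, standing in for live migration), the snippet below uses invented names and is not the VSIM implementation.

    # Hypothetical sketch of virtualised simulation-resource placement and
    # fault-triggered migration; not the VSIM implementation from the paper.
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class Host:
        name: str
        free_cpus: int
        free_mem_gb: int
        healthy: bool = True

    @dataclass
    class SimTask:
        name: str
        cpus: int
        mem_gb: int

    def place(task: SimTask, hosts: List[Host]) -> Optional[Host]:
        """Pick the first healthy host with enough spare CPU and memory."""
        for host in hosts:
            if host.healthy and host.free_cpus >= task.cpus and host.free_mem_gb >= task.mem_gb:
                host.free_cpus -= task.cpus
                host.free_mem_gb -= task.mem_gb
                return host
        return None

    def migrate_on_fault(task: SimTask, current: Host, hosts: List[Host]) -> Optional[Host]:
        """If the current host fails, re-place the task elsewhere (migration stand-in)."""
        if current.healthy:
            return current
        return place(task, [h for h in hosts if h is not current])

    hosts = [Host("vm-a", 8, 32), Host("vm-b", 16, 64)]
    task = SimTask("cfd-wing", cpus=12, mem_gb=48)
    target = place(task, hosts)
    print(target.name if target else "no capacity")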

  10. Large-Scale Residential Demolition

    EPA Pesticide Factsheets

    The EPA provides resources for handling residential demolitions or renovations. This includes planning, handling harmful materials, recycling, funding, compliance assistance, good practices and regulations.

  11. The Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial and Its Associated Research Resource

    PubMed Central

    2013-01-01

    The Prostate, Lung, Colorectal, and Ovarian (PLCO) Cancer Screening Trial is a large-scale research effort conducted by the National Cancer Institute. PLCO offers an example of coordinated research by both the extramural and intramural communities of the National Institutes of Health. The purpose of this article is to describe the PLCO research resource and how it is managed and to assess the productivity and the costs associated with this resource. Such an in-depth analysis of a single large-scale project can shed light on questions such as how large-scale projects should be managed, what metrics should be used to assess productivity, and how costs can be compared with productivity metrics. A comprehensive publication analysis identified 335 primary research publications resulting from research using PLCO data and biospecimens from 2000 to 2012. By the end of 2012, a total of 9679 citations (excluding self-citations) have resulted from this body of research publications, with an average of 29.7 citations per article, and an h index of 45, which is comparable with other large-scale studies, such as the Nurses’ Health Study. In terms of impact on public health, PLCO trial results have been used by the US Preventive Services Task Force in making recommendations concerning prostate and ovarian cancer screening. The overall cost of PLCO was $454 million over 20 years, adjusted to 2011 dollars, with approximately $37 million for the collection, processing, and storage of biospecimens, including blood samples, buccal cells, and pathology tissues. PMID:24115361

  12. Vehicle-to-Grid Automatic Load Sharing with Driver Preference in Micro-Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yubo; Nazaripouya, Hamidreza; Chu, Chi-Cheng

    Integration of Electric Vehicles (EVs) with the power grid not only brings new challenges for load management, but also opportunities for distributed storage and generation. This paper comprehensively models and analyzes distributed Vehicle-to-Grid (V2G) for automatic load sharing with driver preference. In a micro-grid with limited communications, V2G EVs need to decide load sharing based on their own power and voltage profiles. A droop-based controller taking into account driver preference is proposed in this paper to address the distributed control of EVs. Simulations are designed for three fundamental V2G automatic load sharing scenarios that include all system dynamics of such applications. Simulation results demonstrate that active power sharing is achieved proportionally among V2G EVs with consideration of driver preference. In addition, the results also verify the system stability and reactive power sharing analysis in the system modelling, which sheds light on large-scale V2G automatic load sharing in more complicated cases.
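
    As a hedged illustration of voltage-droop load sharing weighted by a per-driver preference factor (the specific control law, gains, and names below are assumptions for illustration, not the paper's controller):

    # Illustrative voltage-droop sharing among V2G EVs; not the controller
    # from the paper. Each EV injects power in proportion to the voltage
    # deviation, scaled by a droop gain and a driver-preference factor
    # (preference = 0 means "do not discharge my battery").
    V_NOMINAL = 1.0   # per-unit nominal bus voltage

    def ev_power(v_measured: float, droop_gain: float, preference: float,
                 p_max: float) -> float:
        """Active power contribution of one EV under droop control."""
        p = droop_gain * preference * (V_NOMINAL - v_measured)
        return max(-p_max, min(p, p_max))   # respect charger power limits

    # Example: a sagging bus voltage (0.97 p.u.) and three EVs with different
    # driver preferences; shares are proportional to gain * preference.
    evs = [
        {"gain": 50.0, "preference": 1.0, "p_max": 10.0},
        {"gain": 50.0, "preference": 0.5, "p_max": 10.0},
        {"gain": 50.0, "preference": 0.0, "p_max": 10.0},  # opted out
    ]
    for i, ev in enumerate(evs):
        p = ev_power(0.97, ev["gain"], ev["preference"], ev["p_max"])
        print(f"EV {i}: {p:.2f} kW")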

  13. Paradigms and nursing management, analysis of the current organizational structure in a large hospital.

    PubMed

    Wilson, D

    1992-01-01

    Hospitals developed over the period of time when positivism became a predominant world view. Positivism was founded on four Western trends: preponderance of hierarchy and autocracy, popularization of bureaucracy, extensive application of a machine orientation to work, and predominance of "scientific" inquiry. Organizational theory developed largely from quantitative research findings arising from a positivistic world view. A case study, analyzing a current nursing organizational structure at one large hospital, is presented. Nursing management was found to be based upon the positivistic paradigm. The predominance of a machine orientation, and an autocratic and bureaucratic structure, are evidence of this. A change to shared governance had been attempted, indicating a shift to a more modern organizational structure based on a different paradigm. The article concludes by emphasizing that managers are largely responsible for facilitating change; change that will meet internal human resource needs and the cost-effectiveness crises of hospitals today through more effective use of human resources.

  14. Genetic Advances in Autism: Heterogeneity and Convergence on Shared Pathways

    PubMed Central

    Bill, Brent R.; Geschwind, Daniel H.

    2009-01-01

    The autism spectrum disorders (ASD) are a heterogeneous set of developmental disorders characterized at their core by deficits in social interaction and communication. Current psychiatric nosology groups this broad set of disorders with strong genetic liability and multiple etiologies into the same diagnostic category. This heterogeneity has challenged genetic analyses. But shared patient resources, genomic technologies, more refined phenotypes, and novel computational approaches have begun to yield dividends in defining the genetic mechanisms at work. Over the last five years, a large number of autism susceptibility loci have emerged, redefining our notion of autism’s etiologies, and reframing how we think about ASD. PMID:19477629

  15. Resource Sharing: New Technologies as a Must for Universal Availability of Information. International Essen Symposium (16th, Essen, Germany, October 18-21, 1993). Festschrift in Honor of Hans-Peter Geh.

    ERIC Educational Resources Information Center

    Helal, Ahmed H., Ed.; Weiss, Joachim W.

    This proceedings includes the following papers presented at the 16th International Essen Symposium: "Electronic Resource Sharing: It May Seem Obvious, But It's Not as Simple as it Looks" (Herbert S. White); "Resource Sharing through OCLC: A Comprehensive Approach" (Janet Mitchell); "The Business Information Network:…

  16. Individual and group-level job resources and their relationships with individual work engagement.

    PubMed

    Füllemann, Désirée; Brauchli, Rebecca; Jenny, Gregor J; Bauer, Georg F

    2016-06-16

    This study adds a multilevel perspective to the well-researched individual-level relationship between job resources and work engagement. In addition, we explored whether individual job resources cluster within work groups because of a shared psychosocial environment and investigated whether a resource-rich psychosocial work group environment is beneficial for employee engagement over and above the beneficial effect of individual job resources and independent of their variability within groups. Data of 1,219 employees nested in 103 work groups were obtained from a baseline employee survey of a large stress management intervention project implemented in six medium- and large-sized organizations in diverse sectors. A variety of important job resources were assessed and grouped into an overall job resources factor with three subfactors (manager behavior, peer behavior, and task-related resources). Data were analyzed using multilevel random coefficient modeling. The results indicated that job resources cluster within work groups and can be aggregated to a group-level job resources construct. However, a resource-rich environment, indicated by high group-level job resources, did not additionally benefit employee work engagement but, on the contrary, was negatively related to it. On the basis of this unexpected result, replication studies are encouraged and suggestions for future studies on possible underlying within-group processes are discussed. The study supports the presumed value of integrating the work group as a relevant psychosocial environment into the motivational process and indicates a need to further investigate emergent processes involved in aggregation procedures across levels.
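
    A minimal sketch of a random-intercept model in this spirit is shown below; the file name and column names are hypothetical, and the published analysis used multilevel random coefficient modeling, which may also have included random slopes.

    # Illustrative multilevel (random-intercept) model: individual work
    # engagement predicted by individual-level and group-mean job resources,
    # with employees nested in work groups. Column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("survey.csv")        # hypothetical baseline survey export

    # Group-level job resources as the group mean of individual reports.
    df["group_resources"] = df.groupby("work_group")["job_resources"].transform("mean")

    model = smf.mixedlm("engagement ~ job_resources + group_resources",
                        data=df, groups=df["work_group"])
    result = model.fit()
    print(result.summary())

    # Intraclass correlation: share of engagement variance attributable to
    # work groups (between-group variance over total variance).
    icc = result.cov_re.iloc[0, 0] / (result.cov_re.iloc[0, 0] + result.scale)
    print(f"ICC = {icc:.3f}")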

  17. A transient FETI methodology for large-scale parallel implicit computations in structural mechanics

    NASA Technical Reports Server (NTRS)

    Farhat, Charbel; Crivelli, Luis; Roux, Francois-Xavier

    1992-01-01

    Explicit codes are often used to simulate the nonlinear dynamics of large-scale structural systems, even for low frequency response, because the storage and CPU requirements entailed by the repeated factorizations traditionally found in implicit codes rapidly overwhelm the available computing resources. With the advent of parallel processing, this trend is accelerating because explicit schemes are also easier to parallelize than implicit ones. However, the time step restriction imposed by the Courant stability condition on all explicit schemes cannot yet -- and perhaps will never -- be offset by the speed of parallel hardware. Therefore, it is essential to develop efficient and robust alternatives to direct methods that are also amenable to massively parallel processing because implicit codes using unconditionally stable time-integration algorithms are computationally more efficient when simulating low-frequency dynamics. Here we present a domain decomposition method for implicit schemes that requires significantly less storage than factorization algorithms, that is several times faster than other popular direct and iterative methods, that can be easily implemented on both shared and local memory parallel processors, and that is both computationally and communication-wise efficient. The proposed transient domain decomposition method is an extension of the method of Finite Element Tearing and Interconnecting (FETI) developed by Farhat and Roux for the solution of static problems. Serial and parallel performance results on the CRAY Y-MP/8 and the iPSC-860/128 systems are reported and analyzed for realistic structural dynamics problems. These results establish the superiority of the FETI method over both the serial/parallel conjugate gradient algorithm with diagonal scaling and the serial/parallel direct method, and contrast the computational power of the iPSC-860/128 parallel processor with that of the CRAY Y-MP/8 system.
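
    FETI itself involves substructure factorizations and a dual interface problem; as a much simpler point of reference, the diagonally scaled (Jacobi-preconditioned) conjugate gradient baseline that the abstract compares against can be sketched as follows. The matrix, sizes, and tolerances are arbitrary illustrations.

    # Jacobi-preconditioned conjugate gradient, the "diagonal scaling" baseline
    # that the FETI results are compared against. Illustrative only.
    import numpy as np

    def pcg_jacobi(A: np.ndarray, b: np.ndarray, tol: float = 1e-10,
                   max_iter: int = 1000) -> np.ndarray:
        """Solve A x = b for symmetric positive-definite A with Jacobi PCG."""
        d_inv = 1.0 / np.diag(A)            # diagonal (Jacobi) preconditioner
        x = np.zeros_like(b)
        r = b - A @ x
        z = d_inv * r
        p = z.copy()
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol * np.linalg.norm(b):
                break
            z = d_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p
            rz = rz_new
        return x

    # Small symmetric positive-definite test system.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((50, 50))
    A = M @ M.T + 50 * np.eye(50)
    b = rng.standard_normal(50)
    x = pcg_jacobi(A, b)
    print(np.linalg.norm(A @ x - b))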

  18. Language influences music harmony perception: effects of shared syntactic integration resources beyond attention

    PubMed Central

    Willems, Roel M.; Hagoort, Peter

    2016-01-01

    Many studies have revealed shared music–language processing resources by finding an influence of music harmony manipulations on concurrent language processing. However, the nature of the shared resources has remained ambiguous. They have been argued to be syntax specific and thus due to shared syntactic integration resources. An alternative view regards them as related to general attention and, thus, not specific to syntax. The present experiments evaluated these accounts by investigating the influence of language on music. Participants were asked to provide closure judgements on harmonic sequences in order to assess the appropriateness of sequence endings. At the same time participants read syntactic garden-path sentences. Closure judgements revealed a change in harmonic processing as the result of reading a syntactically challenging word. We found no influence of an arithmetic control manipulation (experiment 1) or semantic garden-path sentences (experiment 2). Our results provide behavioural evidence for a specific influence of linguistic syntax processing on musical harmony judgements. A closer look reveals that the shared resources appear to be needed to hold a harmonic key online in some form of syntactic working memory or unification workspace related to the integration of chords and words. Overall, our results support the syntax specificity of shared music–language processing resources. PMID:26998339

  19. Encouraging Gender Analysis in Research Practice

    ERIC Educational Resources Information Center

    Thien, Deborah

    2009-01-01

    Few resources for practical teaching or fieldwork exercises exist which address gender in geographical contexts. This paper adds to teaching and fieldwork resources by describing an experience with designing and implementing a "gender intervention" for a large-scale, multi-university, bilingual research project that brought together a group of…

  20. INVENTORY AND CLASSIFICATION OF GREAT LAKES COASTAL WETLANDS FOR MONITORING AND ASSESSMENT AT LARGE SPATIAL SCALES

    EPA Science Inventory

    Monitoring aquatic resources for regional assessments requires an accurate and comprehensive inventory of the resource and useful classification of ecosystem similarities. Our research effort to create an electronic database and work with various ways to classify coastal wetlands...
