42 CFR § 512.500 - Sharing arrangements under the EPM.
Code of Federal Regulations, 2010 CFR
2017-10-01
... SERVICES (CONTINUED) HEALTH CARE INFRASTRUCTURE AND MODEL PROGRAMS EPISODE PAYMENT MODEL Financial... participant may enter into a sharing arrangement with an EPM collaborator to make a gainsharing payment, or to receive an alignment payment, or both. An EPM participant must not make a gainsharing payment or receive...
47 CFR 59.3 - Information concerning deployment of new services and equipment.
Code of Federal Regulations, 2010 CFR
2010-10-01
... services and equipment, including any software or upgrades of software integral to the use or operation of... services and equipment. 59.3 Section 59.3 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES (CONTINUED) INFRASTRUCTURE SHARING § 59.3 Information concerning deployment of...
Quantifying the conservation gains from shared access to linear infrastructure.
Runge, Claire A; Tulloch, Ayesha I T; Gordon, Ascelin; Rhodes, Jonathan R
2017-12-01
The proliferation of linear infrastructure such as roads and railways is a major global driver of cumulative biodiversity loss. One strategy for reducing habitat loss associated with development is to encourage linear infrastructure providers and users to share infrastructure networks. We quantified the reductions in biodiversity impact and capital costs under linear infrastructure sharing of a range of potential mine-to-port transportation links for 47 mine locations operated by 28 separate companies in the Upper Spencer Gulf Region of South Australia. We mapped transport links based on least-cost pathways for different levels of linear-infrastructure sharing and used expert-elicited impacts of linear infrastructure to estimate the consequences for biodiversity. Capital costs were calculated based on estimates of construction costs, compensation payments, and transaction costs. We evaluated proposed mine-port links by comparing biodiversity impacts and capital costs across 3 scenarios: an independent scenario, where no infrastructure is shared; a restricted-access scenario, where the largest mining companies share infrastructure but exclude smaller mining companies from sharing; and a shared scenario, where all mining companies share linear infrastructure. Fully shared development of linear infrastructure reduced overall biodiversity impacts by 76% and reduced capital costs by 64% compared with the independent scenario. However, there was considerable variation among companies. Our restricted-access scenario showed only modest biodiversity benefits relative to the independent scenario, indicating that reductions are likely to be limited if the dominant mining companies restrict access to infrastructure, which often occurs without policies that promote sharing of infrastructure. Our research helps illuminate the circumstances under which infrastructure sharing can minimize the biodiversity impacts of development. © 2017 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.
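A minimal sketch of the least-cost-pathway idea behind the mine-to-port routing described above, assuming a toy 2D cost raster in place of the study's GIS cost surfaces; the function and grid values are illustrative, not the authors' actual workflow:

```python
# Least-cost path over a cost raster via Dijkstra (4-neighbour moves).
# High cell values stand in for high-impact habitat the route should avoid.
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a 2D cost raster; a cell's cost is paid on entry."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[(r, c)]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk predecessors back from the goal to recover the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

raster = [[1, 1, 9, 1],
          [1, 9, 9, 1],
          [1, 1, 1, 1]]
route, total = least_cost_path(raster, (0, 0), (0, 3))
print(route, total)   # the route detours around the high-cost cells
```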
47 CFR 59.4 - Definition of “qualifying carrier”.
Code of Federal Regulations, 2010 CFR
2010-10-01
... (CONTINUED) INFRASTRUCTURE SHARING § 59.4 Definition of “qualifying carrier”. For purposes of this part, the term “qualifying carrier” means a telecommunications carrier that: (a) Lacks economies of scale or...
77 FR 60607 - National Cybersecurity Awareness Month, 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-04
... released the Blueprint for a Secure Cyber Future--a strategic plan to protect government, the private sector, and the public against cyber threats today and tomorrow. As we continue to improve our... infrastructure, facilitating greater cyber information sharing between government and the private sector, and...
Changes in Data Sharing and Data Reuse Practices and Perceptions among Scientists Worldwide
Tenopir, Carol; Dalton, Elizabeth D.; Allard, Suzie; Frame, Mike; Pjesivac, Ivanka; Birch, Ben; Pollock, Danielle; Dorsett, Kristina
2015-01-01
The incorporation of data sharing into the research lifecycle is an important part of modern scholarly debate. In this study, the DataONE Usability and Assessment working group addresses two primary goals: To examine the current state of data sharing and reuse perceptions and practices among research scientists as they compare to the 2009/2010 baseline study, and to examine differences in practices and perceptions across age groups, geographic regions, and subject disciplines. We distributed surveys to a multinational sample of scientific researchers at two different time periods (October 2009 to July 2010 and October 2013 to March 2014) to observe current states of data sharing and to see what, if any, changes have occurred in the past 3–4 years. We also looked at differences across age, geographic, and discipline-based groups as they currently exist in the 2013/2014 survey. Results point to increased acceptance of and willingness to engage in data sharing, as well as an increase in actual data sharing behaviors. However, there is also increased perceived risk associated with data sharing, and specific barriers to data sharing persist. There are also differences across age groups, with younger respondents feeling more favorably toward data sharing and reuse, yet making less of their data available than older respondents. Geographic differences exist as well, which can in part be understood in terms of collectivist and individualist cultural differences. An examination of subject disciplines shows that the constraints and enablers of data sharing and reuse manifest differently across disciplines. Implications of these findings include the continued need to build infrastructure that promotes data sharing while recognizing the needs of different research communities. Moving into the future, organizations such as DataONE will continue to assess, monitor, educate, and provide the infrastructure necessary to support such complex grand science challenges. PMID:26308551
Planning Quality for Successful International Environmental Monitoring
George M. Brilis; John G. Lyon; Jeffery C. Worthington
2006-01-01
Federal, State, and municipal government entities are increasingly depending on geospatial data for a myriad of purposes. This trend is expected to continue. Information sharing and interoperability are in line with the Federal Executive Order 12906 (Clinton, 1994) which calls for the establishment of the National Spatial Data Infrastructure (NSDI). If other...
42 CFR § 512.725 - Data sharing for FFS-CR participants.
Code of Federal Regulations, 2010 CFR
2017-10-01
... SERVICES (CONTINUED) HEALTH CARE INFRASTRUCTURE AND MODEL PROGRAMS EPISODE PAYMENT MODEL CR Incentive Payment Model for EPM and Medicare Fee-for-Service Participants Provisions for Ffs-Cr Participants § 512... quality. (3) Enhance efficiencies in the delivery of care. (4) Otherwise achieve the goals of the model...
ERIC Educational Resources Information Center
Wooldridge, Brooke; Taylor, Laurie; Sullivan, Mark
2009-01-01
Developing an Open Access, multi-institutional, multilingual, international digital library requires robust technological and institutional infrastructures that support both the needs of individual institutions alongside the needs of the growing partnership and ensure continuous communication and development of the shared vision for the digital…
Optical stabilization for time transfer infrastructure
NASA Astrophysics Data System (ADS)
Vojtech, Josef; Altmann, Michal; Skoda, Pavel; Horvath, Tomas; Slapak, Martin; Smotlacha, Vladimir; Havlis, Ondrej; Munster, Petr; Radil, Jan; Kundrat, Jan; Altmannova, Lada; Velc, Radek; Hula, Miloslav; Vohnout, Rudolf
2017-08-01
In this paper, we propose and present verification of all-optical methods for stabilizing the end-to-end delay of an optical fiber link. These methods are verified for deployment within an infrastructure for accurate time and stable frequency distribution, based on sharing fibers with a research and educational network carrying live data traffic. The methods range from path length control, through temperature conditioning, to transmit-wavelength control. Attention is given to achieving continuous control over a relatively broad range of delays. We summarize design rules for delay stabilization based on the character and the total jitter of the delay.
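A minimal sketch of the closed-loop idea behind such delay stabilization, assuming a measurable delay error and a generic actuator (for example a fiber stretcher or a temperature-conditioned spool); `read_delay_error`, `set_actuator`, the toy plant, and the gains are hypothetical stand-ins, not the paper's hardware:

```python
# Simple PI control loop holding a link delay at its setpoint.
def pi_stabilize(read_delay_error, set_actuator, kp=0.5, ki=0.05, steps=1000):
    integral, command = 0.0, 0.0
    for _ in range(steps):
        error = read_delay_error()        # e.g. picoseconds off the setpoint
        integral += error
        command = kp * error + ki * integral
        set_actuator(command)             # e.g. stretcher drive voltage
    return command

# Toy plant for demonstration: a delay offset the loop pulls back to zero.
state = {"delay": 5.0}
read = lambda: state["delay"]
def act(v):
    state["delay"] -= 0.8 * v             # actuator reduces the delay error
print(round(pi_stabilize(read, act, steps=200), 3), round(state["delay"], 6))
```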
CERN data services for LHC computing
NASA Astrophysics Data System (ADS)
Espinal, X.; Bocchi, E.; Chan, B.; Fiorot, A.; Iven, J.; Lo Presti, G.; Lopez, J.; Gonzalez, H.; Lamanna, M.; Mascetti, L.; Moscicki, J.; Pace, A.; Peters, A.; Ponce, S.; Rousseau, H.; van der Ster, D.
2017-10-01
Dependability, resilience, adaptability, and efficiency: these growing requirements call for tailored storage services and novel solutions. Unprecedented volumes of data coming from the broad number of experiments at CERN need to be quickly available in a highly scalable way for large-scale processing and data distribution, while in parallel they are routed to tape for long-term archival. These activities are critical for the success of HEP experiments. Nowadays we operate at high incoming throughput (14 GB/s during the 2015 LHC Pb-Pb run and 11 PB in July 2016) and with concurrent, complex production workloads. In parallel, our systems provide the platform for continuous user- and experiment-driven workloads for large-scale data analysis, including end-user access and sharing. The storage services at CERN cover the needs of our community: EOS and CASTOR for large-scale storage; CERNBox for end-user access and sharing; Ceph as the data back-end for the CERN OpenStack infrastructure, NFS services, and S3 functionality; and AFS for legacy distributed-file-system services. In this paper we summarise our experience in supporting the LHC experiments and the transition of our infrastructure from static monolithic systems to flexible components providing a more coherent environment, with pluggable protocols, tuneable QoS, sharing capabilities, and fine-grained ACL management, while continuing to guarantee dependable and robust services.
3 CFR 8875 - Proclamation 8875 of October 1, 2012. National Cybersecurity Awareness Month, 2012
Code of Federal Regulations, 2013 CFR
2013-01-01
... November 2011, we released the Blueprint for a Secure Cyber Future—a strategic plan to protect government, the private sector, and the public against cyber threats today and tomorrow. As we continue to improve... our critical infrastructure, facilitating greater cyber information sharing between government and the...
Lindau, Stacy Tessler; Makelarski, Jennifer A.; Chin, Marshall H.; Desautels, Shane; Johnson, Daniel; Johnson, Waldo E.; Miller, Doriane; Peters, Susan; Robinson, Connie; Schneider, John; Thicklin, Florence; Watson, Natalie P.; Wolfe, Marcus; Whitaker, Eric
2011-01-01
Objective To describe the roles community members can and should play in, and an asset-based strategy used by Chicago’s South Side Health and Vitality Studies for, building sustainable, large-scale community health research infrastructure. The Studies are a family of research efforts aiming to produce actionable knowledge to inform health policy, programming, and investments for the region. Methods Community and university collaborators, using a consensus-based approach, developed shared theoretical perspectives, guiding principles, and a model for collaboration in 2008, which were used to inform an asset-based operational strategy. Ongoing community engagement and relationship-building support the infrastructure and research activities of the Studies. Results Key steps in the asset-based strategy include: 1) continuous community engagement and relationship building, 2) identifying community priorities, 3) identifying community assets, 4) leveraging assets, 5) conducting research, 6) sharing knowledge and 7) informing action. Examples of community member roles, and how these are informed by the Studies’ guiding principles, are provided. Conclusions Community and university collaborators, with shared vision and principles, can effectively work together to plan innovative, large-scale community-based research that serves community needs and priorities. Sustainable, effective models are needed to realize NIH’s mandate for meaningful translation of biomedical discovery into improved population health. PMID:21236295
Dynamic Collaboration Infrastructure for Hydrologic Science
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Castillo, C.; Yi, H.; Jiang, F.; Jones, N.; Goodall, J. L.
2016-12-01
Data and modeling infrastructure is becoming increasingly accessible to water scientists. HydroShare is a collaborative environment that currently offers water scientists the ability to access modeling and data infrastructure in support of data-intensive modeling and analysis. It supports the sharing of and collaboration around "resources," which are social objects defined to include both data and models in a structured, standardized format. Users collaborate around these objects via comments, ratings, and groups. HydroShare also supports web services and cloud-based computation for the execution of hydrologic models and the analysis and visualization of hydrologic data. However, the quantity and variety of data and modeling infrastructure that can be accessed from environments like HydroShare are increasing. Storage infrastructure can range from one's local PC to campus or organizational storage to storage in the cloud. Modeling or computing infrastructure can range from one's desktop to departmental clusters to national HPC resources to grid and cloud computing resources. How does one orchestrate this vast array of data and computing infrastructure without having to learn each new system? A common limitation across these systems is the lack of efficient integration between data transport mechanisms and the corresponding high-level services to support large distributed data and compute operations. A scientist running a hydrology model from their desktop may need to process a large collection of files across the aforementioned storage and compute resources and various national databases. To address these community challenges, a proof-of-concept prototype was created integrating HydroShare with RADII (Resource Aware Data-centric collaboration Infrastructure) to provide software infrastructure enabling the comprehensive and rapid dynamic deployment of what we refer to as "collaborative infrastructure." In this presentation we discuss the results of this proof-of-concept prototype, which enabled HydroShare users to readily instantiate virtual infrastructure marshaling arbitrary combinations, varieties, and quantities of distributed data and computing infrastructure in addressing big problems in hydrology.
Commercial Space with Technology Maturation
NASA Technical Reports Server (NTRS)
McCleskey, Carey M.; Rhodes, Russell E.; Robinson, John W.
2013-01-01
To provide affordable space transportation, we must be capable of using common fixed assets and infrastructure for multiple purposes simultaneously. The Space Shuttle was operated for thirty years but was not able to establish an effective continuous improvement program because of the high risk to the crew on every mission. An unmanned capability is needed to provide an acceptable risk to the primary mission. This paper presents a case in which a commercial space venture could share the large fixed cost of operating the infrastructure with the government, while the government provides new advanced technology focused on reducing the operating cost of the common launch transportation system. A conceivable commercial space venture could provide educational entertainment for the country's youth that would stimulate their interest in science, technology, engineering, and mathematics (STEM) through access at entertainment parks or the existing Space Visitor Centers. The paper uses this example to demonstrate how growing public-private space market demand will re-orient space transportation industry priorities in flight and ground system design and technology development, and how the infrastructure is used and shared.
Sharing information among existing data sources
NASA Astrophysics Data System (ADS)
Ashley, W. R., III
1999-01-01
The sharing of information between law enforcement agencies is a prerequisite for the success of all jurisdictions. A wealth of information resides in both the databases and infrastructures of local, state, and regional agencies. However, this information is often not available to the law enforcement professionals who require it. When the information is available, individual investigators must not only know that it exists, but where it resides and how to retrieve it. In many cases, these types of cross-jurisdictional communications are limited to personal relationships that result from telephone calls, faxes, and in some cases, e-mail. As criminal elements become more sophisticated and distributed, law enforcement agencies must begin to develop infrastructures and common sharing mechanisms that address a constantly evolving criminal threat. Historically, criminals have taken advantage of the lack of communication between law enforcement agencies. Examples of this are evident in the search for stolen property and in monetary dealings. Pawned property, cash transactions, and failure to supply child support are three common cross-jurisdictional crimes that could be better enforced by strengthening the lines of communication. Criminal behavior demonstrates that it is easier for offenders to profit from their actions by dealing across separate jurisdictions. For example, stolen property is sold outside the jurisdiction of its origin. In most cases, simply traveling a short distance to the adjoining county or municipality is sufficient to ensure that apprehension of the criminal or seizure of the stolen property is highly unlikely. In addition to the traditional burglar, fugitives often sell or pawn property to finance their continued evasion of the law. Rapid sharing of information would increase the ability of law enforcement personnel to track and capture fugitives as well as other criminals. As an example of combating this threat, the State of Florida recently acted on the need to share crucial investigative information across jurisdictional bounds by establishing a communications infrastructure for all of its law enforcement jurisdictions. The Criminal Justice Network (CJ-Net) is a statewide TCP/IP network dedicated to the sharing of law enforcement information. CJ-Net is managed and maintained by the Florida Department of Law Enforcement (FDLE) and provides open access and privileges to any criminal justice agency, including the state court and penitentiary systems. In addition to Florida, other states, such as North Carolina, are beginning to implement common-protocol communication infrastructures and architectures in order to link local jurisdictions together throughout the state. The law enforcement domain is in an optimal situation for information-sharing technologies. Communication infrastructures are continually being established, and action is required to use these networks to their full potential. Information technologies that are best suited for the law enforcement domain must be evaluated and implemented in a cost-effective manner. Unlike the Defense Department and other large federal agencies, individual jurisdictions at both the local and state level cannot afford to expend limited resources on research and development of prototype systems. Therefore, we must identify enabling technologies that have matured in related domains and transition them into law enforcement at minimum cost.
Crucial to this measure is the selection of the appropriate levels of information-sharing technologies to be inserted. Information-sharing technologies that are unproven or have extensive recurring costs are not suitable for this domain. Information-sharing technologies traditionally exist between two distinct polar bounds: the data-warehousing approach and mediation across distributed heterogeneous data sources. These two ends of the spectrum represent extremely different philosophies for accomplishing the same goal. In the following sections of this paper, information-sharing mechanisms are discussed and the effectiveness of each is examined for the law enforcement domain. In each case, the author offers an opinion as to which approach lends itself to the most appropriate solution to the problem of effectively sharing criminal justice information.
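To make the "mediation" end of that spectrum concrete, here is a minimal sketch in which a mediator fans one query out to heterogeneous jurisdictional sources, each translating results into a shared schema; the source classes, record fields, and serial number are hypothetical:

```python
# Mediator pattern: one query, many autonomous sources, a shared schema.
class PawnShopSource:
    def find_property(self, serial):
        # Imagine a local DB lookup here, translated to the shared schema.
        return [{"serial": serial, "jurisdiction": "County A", "type": "pawn record"}]

class StolenPropertySource:
    def find_property(self, serial):
        return [{"serial": serial, "jurisdiction": "County B", "type": "theft report"}]

def mediated_search(serial, sources):
    """Union the translated results from every jurisdiction's adapter."""
    hits = []
    for source in sources:
        hits.extend(source.find_property(serial))
    return hits

print(mediated_search("SN-1234", [PawnShopSource(), StolenPropertySource()]))
```

The warehousing alternative would instead copy every jurisdiction's records into one central store ahead of time; mediation trades that recurring load for query-time translation.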
Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership
NASA Astrophysics Data System (ADS)
Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya
CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information-processing infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For accessibility, we adopted SSL-VPN (Secure Sockets Layer-Virtual Private Network) technology to allow access beyond firewalls. For security, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen security. We also set a fine-grained access control policy for shared tools and data and used a shared-key encryption method to protect them against leakage to third parties. For usability, we chose Web browsers as the user interface and developed a Web application providing functions to support the sharing of tools and data. Using the WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system on the grid infrastructure for atomic energy research, AEGIS (Atomic Energy Grid Infrastructure), developed by CCSE/JAEA. The prototype system was applied for trial use in the first period of GNEP.
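A minimal sketch of the WebDAV-style interaction such a prototype exposes, assuming a hypothetical endpoint and basic-auth credentials (the real system sits behind an SSL-VPN and a PKI authentication gateway); PROPFIND with `Depth: 1` lists a collection's immediate children:

```python
# WebDAV basics with the requests library: PUT uploads into the shared
# area, PROPFIND enumerates a collection. Endpoint and credentials are
# hypothetical stand-ins for illustration only.
import requests

BASE = "https://example-grid/webdav/shared"   # hypothetical endpoint
AUTH = ("user", "password")                   # gateway/PKI handled upstream

# Upload a shared tool input file.
requests.put(f"{BASE}/input_deck.dat", data=b"toy payload", auth=AUTH, timeout=30)

# List the collection (Depth: 1 restricts the listing to direct children).
resp = requests.request("PROPFIND", BASE, auth=AUTH,
                        headers={"Depth": "1"}, timeout=30)
print(resp.status_code, len(resp.text))
```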
Towards Social Radiology as an Information Infrastructure: Reconciling the Local With the Global
2014-01-01
The current widespread use of medical images and imaging procedures in clinical practice and patient diagnosis has brought about an increase in the demand for sharing medical imaging studies among health professionals in an easy and effective manner. This article reveals the existence of a polarization between the local and global demands of radiology practice. While there are no major barriers to sharing such studies when access is made from a (local) picture archive and communication system (PACS) within the domain of a healthcare organization, there are a number of impediments to sharing studies among health professionals on a global scale. Social radiology as an information infrastructure involves the notion of a shared infrastructure as a public good, affording a social space where people, organizations, and technical components may spontaneously form associations in order to share clinical information linked to patient care and radiology practice. This article shows, however, that such polarization establishes a tension between local and global demands, which hinders the emergence of social radiology as an information infrastructure. Based on an analysis of the social space for radiology practice, the present article observes that this tension persists due to the inertia of a locally installed base in radiology departments, which common teleradiology models are not truly capable of reorganizing into a global social space for radiology practice. Reconciling the local with the global means integrating PACS and teleradiology into an evolving, secure, heterogeneous, shared, open information infrastructure in which the conceptual boundaries between (local) PACS and (global) teleradiology are transparent, signaling the emergence of social radiology as an information infrastructure. PMID:25600710
Information Technology Strategic Plan 2009-2013
2009-01-01
and the absence of Enterprise funding models for shared services. Also, though progress has been made within the DHS IT community regarding...security access regulations for shared services; and difficulties associated with Office of the Chief Information Officer...infrastructure and shared services is the vision for the Infrastructure Transformation Program at DHS and is the means by which to reduce IT commodity
A physical layer perspective on access network sharing
NASA Astrophysics Data System (ADS)
Pfeiffer, Thomas
2015-12-01
Unlike in copper or wireless networks, there is not yet any sharing of resources in fiber access networks, other than bit-stream access or cable sharing, in which the fibers of a cable are let to one or multiple operators. Sharing optical resources on a single fiber among multiple operators or different services has not yet been applied. While this would allow for better exploitation of installed infrastructures, there are operational issues that still need to be resolved before this sharing model can be implemented in networks. Operating multiple optical systems and services over a common fiber plant, autonomously and independently of each other, can result in mutual distortions on the physical layer. These distortions will degrade the performance of the involved systems unless precautions are taken in the infrastructure hardware to eliminate them or to reduce them to an acceptable level. Moreover, the infrastructure needs to be designed so as to support different system technologies and to ensure a guaranteed quality of the end-to-end connections. In this paper, suitable means are proposed for fiber access infrastructures that will allow shared utilization of the fibers while safeguarding the operational needs and business interests of the involved parties.
Advanced Development of Certified OS Kernels
2015-06-01
It provides an infrastructure to map a physical page into multiple processes' page maps in different address spaces. Their ownership mechanism ensures...of their shared memory infrastructure. Trap module: The trap module specifies the behaviors of exception handlers and mCertiKOS system calls. In...layers), 1 pm for the shared memory infrastructure (3 layers), 3.5 pm for the thread management (10 layers), 1 pm for the process management (4 layers
COOPEUS - connecting research infrastructures in environmental sciences
NASA Astrophysics Data System (ADS)
Koop-Jakobsen, Ketil; Waldmann, Christoph; Huber, Robert
2015-04-01
The COOPEUS project was initiated in 2012, bringing together 10 research infrastructures (RIs) in environmental sciences from the EU and US in order to improve the discovery, access, and use of environmental information and data across scientific disciplines and across geographical borders. The COOPEUS mission is to facilitate readily accessible research infrastructure data to advance our understanding of Earth systems through an international community-driven effort, by: bringing together both user communities and top-down directives to address evolving societal and scientific needs; removing technical, scientific, cultural and geopolitical barriers to data use; and coordinating the flow, integrity and preservation of information. A survey of data availability was conducted among the COOPEUS research infrastructures for the purpose of discovering impediments to open international and cross-disciplinary sharing of environmental data. The survey showed that the majority of data offered by the COOPEUS research infrastructures is available via the internet (>90%), but the accessibility of these data differs significantly among research infrastructures; only 45% offer open access to their data, whereas the remaining infrastructures restrict access, e.g. they do not release raw or sensitive data, demand user registration, or require permission prior to the release of data. These rules and regulations are often in place as standard practice, whereas formal data policies are lacking in 40% of the infrastructures, primarily in the EU. To improve this situation, COOPEUS has established a common data-sharing policy, which has been agreed upon by all the COOPEUS research infrastructures. To investigate existing opportunities for improving interoperability among environmental research infrastructures, COOPEUS explored the GEOSS Common Infrastructure (GCI) by holding a hands-on workshop. Through exercises directly registering resources, the first steps were taken toward implementing the GCI as a platform for documenting the capabilities of the COOPEUS research infrastructures. COOPEUS recognizes the potential for the GCI to become an important platform promoting cross-disciplinary approaches in the study of multifaceted environmental challenges. Recommendations from the workshop participants also revealed that, in order to attract research infrastructures to the GCI, the registration process must be simplified and accelerated. However, the data policies of the individual research infrastructures, or the lack thereof, can also prevent the use of the GCI or other portals, due to a lack of clarity regarding data-management authority and data ownership. COOPEUS shall continue to promote cross-disciplinary data exchange in the environmental field and will in the future expand to include other geographical areas.
47 CFR 59.2 - Terms and conditions of infrastructure sharing.
Code of Federal Regulations, 2010 CFR
2010-10-01
... infrastructure, technology, information, or telecommunications facilities, or functions made available to a... infrastructure, technology, information, and telecommunications facilities, or functions available to a... infrastructure, technology, information and telecommunications facilities and functions pursuant to this part. ...
Sharing and community curation of mass spectrometry data with GNPS
Wang, Mingxun; Carver, Jeremy J; Phelan, Vanessa V; Sanchez, Laura M; Garg, Neha; Peng, Yao; Nguyen, Don Duy; Watrous, Jeramie; Kapono, Clifford A; Luzzatto-Knaan, Tal; Porto, Carla; Bouslimani, Amina; Melnik, Alexey V; Meehan, Michael J; Liu, Wei-Ting; Crüsemann, Max; Boudreau, Paul D; Esquenazi, Eduardo; Sandoval-Calderón, Mario; Kersten, Roland D; Pace, Laura A; Quinn, Robert A; Duncan, Katherine R; Hsu, Cheng-Chih; Floros, Dimitrios J; Gavilan, Ronnie G; Kleigrewe, Karin; Northen, Trent; Dutton, Rachel J; Parrot, Delphine; Carlson, Erin E; Aigle, Bertrand; Michelsen, Charlotte F; Jelsbak, Lars; Sohlenkamp, Christian; Pevzner, Pavel; Edlund, Anna; McLean, Jeffrey; Piel, Jörn; Murphy, Brian T; Gerwick, Lena; Liaw, Chih-Chuang; Yang, Yu-Liang; Humpf, Hans-Ulrich; Maansson, Maria; Keyzers, Robert A; Sims, Amy C; Johnson, Andrew R; Sidebottom, Ashley M; Sedio, Brian E; Klitgaard, Andreas; Larson, Charles B; P, Cristopher A Boya; Torres-Mendoza, Daniel; Gonzalez, David J; Silva, Denise B; Marques, Lucas M; Demarque, Daniel P; Pociute, Egle; O'Neill, Ellis C; Briand, Enora; Helfrich, Eric J N; Granatosky, Eve A; Glukhov, Evgenia; Ryffel, Florian; Houson, Hailey; Mohimani, Hosein; Kharbush, Jenan J; Zeng, Yi; Vorholt, Julia A; Kurita, Kenji L; Charusanti, Pep; McPhail, Kerry L; Nielsen, Kristian Fog; Vuong, Lisa; Elfeki, Maryam; Traxler, Matthew F; Engene, Niclas; Koyama, Nobuhiro; Vining, Oliver B; Baric, Ralph; Silva, Ricardo R; Mascuch, Samantha J; Tomasi, Sophie; Jenkins, Stefan; Macherla, Venkat; Hoffman, Thomas; Agarwal, Vinayak; Williams, Philip G; Dai, Jingqui; Neupane, Ram; Gurr, Joshua; Rodríguez, Andrés M C; Lamsa, Anne; Zhang, Chen; Dorrestein, Kathleen; Duggan, Brendan M; Almaliti, Jehad; Allard, Pierre-Marie; Phapale, Prasad; Nothias, Louis-Felix; Alexandrov, Theodore; Litaudon, Marc; Wolfender, Jean-Luc; Kyle, Jennifer E; Metz, Thomas O; Peryea, Tyler; Nguyen, Dac-Trung; VanLeer, Danielle; Shinn, Paul; Jadhav, Ajit; Müller, Rolf; Waters, Katrina M; Shi, Wenyuan; Liu, Xueting; Zhang, Lixin; Knight, Rob; Jensen, Paul R; Palsson, Bernhard O; Pogliano, Kit; Linington, Roger G; Gutiérrez, Marcelino; Lopes, Norberto P; Gerwick, William H; Moore, Bradley S; Dorrestein, Pieter C; Bandeira, Nuno
2016-08-09
The potential of the diverse chemistries present in natural products (NP) for biotechnology and medicine remains untapped because NP databases are not searchable with raw data and the NP community has no way to share data other than in published papers. Although mass spectrometry (MS) techniques are well-suited to high-throughput characterization of NP, there is a pressing need for an infrastructure to enable sharing and curation of data. We present Global Natural Products Social Molecular Networking (GNPS; http://gnps.ucsd.edu), an open-access knowledge base for community-wide organization and sharing of raw, processed or identified tandem mass (MS/MS) spectrometry data. In GNPS, crowdsourced curation of freely available community-wide reference MS libraries will underpin improved annotations. Data-driven social-networking should facilitate identification of spectra and foster collaborations. We also introduce the concept of 'living data' through continuous reanalysis of deposited data.
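A minimal sketch of the spectral matching that underlies molecular networking in a platform like GNPS: a plain cosine score between two peak lists. GNPS itself uses a modified cosine that also allows precursor-mass-shifted matches, so this simplified greedy version is illustrative only:

```python
# Cosine similarity between two MS/MS peak lists, matching peaks greedily
# within an m/z tolerance. Spectra are lists of (mz, intensity) pairs.
import math

def cosine_score(spec_a, spec_b, tol=0.01):
    score, used = 0.0, set()
    for mz_a, int_a in spec_a:
        for j, (mz_b, int_b) in enumerate(spec_b):
            if j not in used and abs(mz_a - mz_b) <= tol:
                score += int_a * int_b
                used.add(j)
                break
    norm_a = math.sqrt(sum(i * i for _, i in spec_a))
    norm_b = math.sqrt(sum(i * i for _, i in spec_b))
    return score / (norm_a * norm_b) if norm_a and norm_b else 0.0

a = [(100.1, 0.5), (150.2, 1.0), (200.3, 0.2)]
b = [(100.1, 0.6), (150.2, 0.9), (250.0, 0.3)]
print(round(cosine_score(a, b), 3))   # ~0.94: spectra largely agree
```

In a molecular network, each deposited spectrum becomes a node and an edge is drawn wherever this pairwise score exceeds a threshold, which is what lets unidentified spectra inherit candidate annotations from their neighbors.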
Open Data in Global Environmental Research: The Belmont Forum's Open Data Survey.
Schmidt, Birgit; Gemeinholzer, Birgit; Treloar, Andrew
2016-01-01
This paper presents the findings of the Belmont Forum's survey on Open Data, which targeted the global environmental research and data infrastructure community. It highlights users' perceptions of the term "open data", expectations of infrastructure functionalities, and barriers and enablers for the sharing of data. A wide range of good practice examples was pointed out by the respondents, which demonstrates a substantial uptake of data sharing through e-infrastructures and a further need for enhancement and consolidation. Among all policy responses, funder policies seem to be the most important motivator. This supports the conclusion that stronger mandates will strengthen the case for data sharing.
Connor, Thomas R; Loman, Nicholas J; Thompson, Simon; Smith, Andy; Southgate, Joel; Poplawski, Radoslaw; Bull, Matthew J; Richardson, Emily; Ismail, Matthew; Thompson, Simon Elwood-; Kitchen, Christine; Guest, Martyn; Bakke, Marius; Sheppard, Samuel K; Pallen, Mark J
2016-09-01
The increasing availability and decreasing cost of high-throughput sequencing has transformed academic medical microbiology, delivering an explosion in available genomes while also driving advances in bioinformatics. However, many microbiologists are unable to exploit the resulting large genomics datasets because they do not have access to relevant computational resources and to an appropriate bioinformatics infrastructure. Here, we present the Cloud Infrastructure for Microbial Bioinformatics (CLIMB) facility, a shared computing infrastructure that has been designed from the ground up to provide an environment where microbiologists can share and reuse methods and data.
DOT National Transportation Integrated Search
2017-10-25
Sharing Data between Mobile Devices, Connected Vehicles and Infrastructure was a U.S. DOT-sponsored research project to study the integration of mobile devices (such as smartphones) into the Connected Vehicle (CV) environment. Objectives includ...
Roadmap for Developing of Brokering as a Component of EarthCube
NASA Astrophysics Data System (ADS)
Pearlman, J.; Khalsa, S. S.; Browdy, S.; Duerr, R. E.; Nativi, S.; Parsons, M. A.; Pearlman, F.; Robinson, E. M.
2012-12-01
The goal of NSF's EarthCube is to create a sustainable infrastructure that enables the sharing of all geosciences data, information, and knowledge in an open, transparent and inclusive manner. Key to achieving the EarthCube vision is establishing a process that will guide the evolution of the infrastructure through community engagement and appropriate investment so that the infrastructure is embraced and utilized by the entire geosciences community. In this presentation we describe a roadmap, developed through the EarthCube Brokering Concept Award, for an evolutionary process of infrastructure and interoperability development. All geoscience communities already have, to a greater or lesser degree, elements of an information infrastructure in place. These elements include resources such as data archives, catalogs, and portals as well as vocabularies, data models, protocols, best practices and other community conventions. What is necessary now is a process for consolidating these diverse infrastructure elements into an overall infrastructure that provides easy discovery, access and utilization of resources across disciplinary boundaries. This process of consolidation will be achieved by creating "interfaces," what we call "brokers," between systems. Brokers connect disparate systems without imposing new burdens upon those systems, and enable the infrastructure to adjust to new technical developments and scientific requirements as they emerge. Robust cyberinfrastructure will arise only when social, organizational, and cultural issues are resolved in tandem with the creation of technology-based services. This is best done through use-case-driven requirements and agile, iterative development methods. It is important to start by solving real (not hypothetical) information access and use problems via small pilot projects that develop capabilities targeted to specific communities. These pilots can then grow into larger prototypes addressing intercommunity problems working towards a full-scale socio-technical infrastructure vision. Brokering, as a critical capability for connecting systems, evolves over time through more connections and increased functionality. This adaptive process allows for continual evaluation as to how well science-driven use cases are being met. Several NSF infrastructure projects are underway and beginning to shape the next generation of information sharing. There is a near term, and possibly unique, opportunity to increase the impact and interconnectivity of these projects, and further improve science research collaboration through brokering. Brokering has been demonstrated to be an essential part of a robust, adaptive infrastructure, but critical questions of governance and detailed implementation remain. Our roadmap proposes the expansion of brokering pilots into fully operational prototypes that work with the broader science and informatics communities to answer these questions, connect existing and emerging systems, and evolve the EarthCube infrastructure.
Effect of infrastructure design on commons dilemmas in social-ecological system dynamics.
Yu, David J; Qubbaj, Murad R; Muneepeerakul, Rachata; Anderies, John M; Aggarwal, Rimjhim M
2015-10-27
The use of shared infrastructure to direct natural processes for the benefit of humans has been a central feature of human social organization for millennia. Today, more than ever, people interact with one another and the environment through shared human-made infrastructure (the Internet, transportation, the energy grid, etc.). However, there has been relatively little work on how the design characteristics of shared infrastructure affect the dynamics of social-ecological systems (SESs) and the capacity of groups to solve social dilemmas associated with its provision. Developing such understanding is especially important in the context of global change where design criteria must consider how specific aspects of infrastructure affect the capacity of SESs to maintain vital functions in the face of shocks. Using small-scale irrigated agriculture (the most ancient and ubiquitous example of public infrastructure systems) as a model system, we show that two design features related to scale and the structure of benefit flows can induce fundamental changes in qualitative behavior, i.e., regime shifts. By relating the required maintenance threshold (a design feature related to infrastructure scale) to the incentives facing users under different regimes, our work also provides some general guidance on determinants of robustness of SESs under globalization-related stresses.
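A toy rendering of the paper's regime-shift intuition, under the assumption that infrastructure only channels maintenance effort while it is in working condition; the parameters are illustrative, not the calibrated model:

```python
# Shared infrastructure persists only if aggregate maintenance effort
# stays above a design threshold; below it, condition decays and, because
# effort itself depends on a working system, collapse is self-reinforcing.
def simulate(threshold, contribution_rate, n_users=100, steps=50):
    state = 1.0                                        # condition, 0..1
    for _ in range(steps):
        effort = n_users * contribution_rate * state   # effort needs a working system
        if effort >= threshold:
            state = min(1.0, state + 0.05)             # maintained: recovers
        else:
            state = max(0.0, state - 0.2)              # unmaintained: decays fast
    return state

# A modest change in the required maintenance threshold flips the regime.
print(simulate(threshold=40, contribution_rate=0.5))   # persists near 1.0
print(simulate(threshold=60, contribution_rate=0.5))   # collapses to 0.0
```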
Measuring Systemic Impacts of Bike Infrastructure Projects
DOT National Transportation Integrated Search
2018-05-01
This paper qualitatively identifies the impacts of bicycle infrastructure on all roadway users, including safety, operations, and travel route choice. Bicycle infrastructure includes shared lanes, conventional bike lanes, and separated bike lanes. Th...
ERIC Educational Resources Information Center
Lee, Ashley; Hobson, Joe; Bienkowski, Marie; Midgley, Steve; Currier, Sarah; Campbell, Lorna M.; Novoselova, Tatiana
2012-01-01
In this article, the authors describe an open-source, open-data digital infrastructure for sharing information about open educational resources (OERs) across disparate systems and platforms. The Learning Registry, which began as a project funded by the U.S. Departments of Education and Defense, currently has an active international community…
NASA Technical Reports Server (NTRS)
Sundaram, Meenakshi
2005-01-01
NASA and the aerospace industry are extremely serious about reducing the cost and improving the performance of launch vehicles, both manned and unmanned. In the aerospace industry, sharing infrastructure to manufacture more than one type of spacecraft is becoming a trend to achieve economies of scale. An example is the Boeing Decatur facility, where both Delta II and Delta IV launch vehicles are made. The author is not sure how Boeing estimates the costs of each spacecraft made in the same facility. Regardless of how a contractor estimates the cost, NASA's popular cost estimating tool, the NASA/Air Force Cost Model (NAFCOM), has to have a built-in method to account for the effect of infrastructure sharing. Since there is no provision in the most recent version, NAFCOM2002, to take care of this, the Engineering Cost Community at MSFC has found that the tool overestimates the manufacturing cost by as much as 30%. Therefore, the objective of this study is to develop a methodology for assessing the impact of infrastructure sharing so that better operations cost estimates may be made.
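A minimal sketch of the kind of adjustment the study motivates: when two vehicle lines share one facility, each line carries only its usage share of the fixed infrastructure cost, rather than the full amount a single-tenant cost model would assign. The numbers are illustrative, not NAFCOM's:

```python
# Allocating a shared facility's fixed cost by usage share.
def allocated_cost(variable_cost, fixed_infrastructure_cost, usage_share):
    """Annual cost carried by one vehicle line in a shared facility."""
    return variable_cost + fixed_infrastructure_cost * usage_share

fixed = 300.0                                        # $M/yr facility cost
single_tenant = allocated_cost(400.0, fixed, 1.0)    # sharing ignored
shared = allocated_cost(400.0, fixed, 0.6)           # line uses 60% of facility
print(single_tenant, shared)                         # 700.0 vs 580.0
print(round(1 - shared / single_tenant, 3))          # overestimate fraction
```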
Alternative Fuels Data Center: Natural Gas Fueling Infrastructure Development
Schilling, Lisa M.; Kwan, Bethany M.; Drolshagen, Charles T.; Hosokawa, Patrick W.; Brandt, Elias; Pace, Wilson D.; Uhrich, Christopher; Kamerick, Michael; Bunting, Aidan; Payne, Philip R.O.; Stephens, William E.; George, Joseph M.; Vance, Mark; Giacomini, Kelli; Braddy, Jason; Green, Mika K.; Kahn, Michael G.
2013-01-01
Introduction: Distributed Data Networks (DDNs) offer infrastructure solutions for sharing electronic health data from across disparate data sources to support comparative effectiveness research. Data sharing mechanisms must address technical and governance concerns stemming from network security and data disclosure laws and best practices, such as HIPAA. Methods: The Scalable Architecture for Federated Translational Inquiries Network (SAFTINet) deploys TRIAD grid technology, a common data model, detailed technical documentation, and custom software for data harmonization to facilitate data sharing in collaboration with stakeholders in the care of safety net populations. Data sharing partners host TRIAD grid nodes containing harmonized clinical data within their internal or hosted network environments. Authorized users can use a central web-based query system to request analytic data sets. Discussion: SAFTINet DDN infrastructure achieved a number of data sharing objectives, including scalable and sustainable systems for ensuring harmonized data structures and terminologies and secure distributed queries. Initial implementation challenges were resolved through iterative discussions, development and implementation of technical documentation, governance, and technology solutions. PMID:25848567
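A minimal sketch of the distributed-query pattern such DDNs rely on: a central broker poses one question to every partner node, each node evaluates it against locally held harmonized data, and only aggregate results leave the partner's environment. Node names and records are hypothetical toy data:

```python
# Federated aggregate query: raw rows never leave a partner's firewall.
NODES = {
    "clinic_a": [{"age": 34, "dx": "asthma"}, {"age": 61, "dx": "copd"}],
    "clinic_b": [{"age": 45, "dx": "asthma"}, {"age": 52, "dx": "asthma"}],
}

def local_count(records, dx):
    """Runs inside each partner node against harmonized local data."""
    return sum(1 for r in records if r["dx"] == dx)

def federated_count(dx):
    # The broker collects only the per-node aggregates.
    return {node: local_count(rows, dx) for node, rows in NODES.items()}

print(federated_count("asthma"))   # {'clinic_a': 1, 'clinic_b': 2}
```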
Distributed generation of shared RSA keys in mobile ad hoc networks
NASA Astrophysics Data System (ADS)
Liu, Yi-Liang; Huang, Qin; Shen, Ying
2005-12-01
Mobile ad hoc networks are a new concept in which mobile nodes are able to communicate with one another over wireless links in a self-organizing manner, independent of fixed physical infrastructure and centralized administration. However, the nature of ad hoc networks makes them very vulnerable to security threats. Generation and distribution of the shared keys for a CA (Certification Authority) is challenging for security solutions based on a distributed PKI (Public Key Infrastructure)/CA. The solutions that have been proposed in the literature, and some related issues, are discussed in this paper. We then propose a scheme for the distributed generation of shared threshold RSA keys for the CA. During the process of creating an RSA private-key share, every CA node holds only its own private share. Distributed arithmetic is used to create the CA's private share locally, eliminating the requirement for a centralized management institution. By fully accounting for the self-organizing character of mobile ad hoc networks, the scheme avoids the hidden security risk of any single node holding the CA's entire private key, enhancing the security and robustness of the system.
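The threshold idea underneath such schemes, shown with plain Shamir secret sharing over a prime field: any t of n shares reconstruct the secret and fewer reveal nothing. Note the paper's contribution is stronger, generating the shares distributively so no dealer ever holds the whole key; this dealer-based sketch is illustrative only:

```python
# Shamir (t, n) secret sharing: split a secret (standing in for a CA key)
# into n points on a random degree-(t-1) polynomial; any t points
# reconstruct the constant term via Lagrange interpolation at x = 0.
import random

P = 2**127 - 1   # Mersenne prime used as the field modulus

def make_shares(secret, t, n):
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P)
            for x in range(1, n + 1)]

def reconstruct(shares):
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse, since P is prime.
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = make_shares(123456789, t=3, n=5)
print(reconstruct(shares[:3]) == 123456789)   # any 3 of the 5 suffice
```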
Alternative Fuels Data Center: Electric Vehicle Infrastructure Projection Tool (EVI-Pro) Lite
Alternative Fuels Data Center: Developing Infrastructure to Charge Plug-In Electric Vehicles
NASA Astrophysics Data System (ADS)
Prodanovic, M.; Esteva, M.; Hanlon, M.; Nanda, G.; Agarwal, P.
2015-12-01
Recent advances in imaging have provided a wealth of 3D datasets that reveal pore space microstructure (nm to cm length scale) and allow investigation of nonlinear flow and mechanical phenomena from first principles using numerical approaches. This framework has popularly been called "digital rock physics". Researchers, however, have trouble storing and sharing the datasets, both due to their size and due to the lack of standardized image types and associated metadata for volumetric datasets. This impedes scientific cross-validation of the numerical approaches that characterize large-scale porous media properties, as well as the development of multiscale approaches required for correct upscaling. A single research group typically specializes in an imaging modality and/or related modeling on a single length scale, and lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal, that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geosciences or engineering researchers not necessarily trained in computer science or data analysis. Once widely accepted, the repository will jumpstart productivity and enable scientific inquiry and engineering decisions founded on a data-driven basis. This is the first repository of its kind. We show initial results on incorporating essential software tools and pipelines that make it easier for researchers to store and reuse data, and for educators to quickly visualize and illustrate concepts to a wide audience. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7 maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.
COINSTAC: Decentralizing the future of brain imaging analysis
Ming, Jing; Verner, Eric; Sarwate, Anand; Kelly, Ross; Reed, Cory; Kahleck, Torran; Silva, Rogers; Panta, Sandeep; Turner, Jessica; Plis, Sergey; Calhoun, Vince
2017-01-01
In the era of Big Data, sharing neuroimaging data across multiple sites has become increasingly important. However, researchers who want to engage in centralized, large-scale data sharing and analysis must often contend with problems such as high database cost, long data transfer time, extensive manual effort, and privacy issues for sensitive data. To remove these barriers and enable easier data sharing and analysis, we introduced a new, decentralized, privacy-enabled infrastructure model for brain imaging data called COINSTAC in 2016. We have continued development of COINSTAC since this model was first introduced. One of the challenges with such a model is adapting the required algorithms to function within a decentralized framework. In this paper, we report on how we are solving this problem, along with our progress on several fronts, including the implementation of additional decentralized algorithms, user interface enhancement, decentralized regression statistic calculation, and complete pipeline specifications. PMID:29123643
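A minimal sketch of the decentralized-computation idea behind such a framework: each site computes a gradient on its private data, and only the gradients travel to an aggregator. This illustrates the general pattern, not COINSTAC's actual implementation.

```python
# Toy decentralized linear regression in the spirit of COINSTAC: each site
# computes a gradient on its own data; only gradients (never raw data) are
# sent to the aggregator, which averages them and broadcasts updated weights.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Three "sites", each with private local data.
sites = []
for _ in range(3):
    X = rng.normal(size=(100, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    sites.append((X, y))

def local_gradient(w, X, y):
    return 2 * X.T @ (X @ w - y) / len(y)

w = np.zeros(2)
for _ in range(200):
    grads = [local_gradient(w, X, y) for X, y in sites]  # computed on-site
    w -= 0.1 * np.mean(grads, axis=0)                    # aggregator step

print(np.round(w, 3))  # ~ [ 2. -1.]
```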
Shared Knowledge for Decision-making on Environment and Health Issues in the Arctic
NASA Technical Reports Server (NTRS)
Maynard, Nancy G.
2003-01-01
This paper describes a remote sensing and GIS-based system that brings indigenous traditional knowledge together with contemporary scientific knowledge to address impacts resulting from changes in climate, environment, weather, and pollution in the Arctic. As scientists and policy-makers from both indigenous and non-indigenous communities continue to build closer partnerships to address common sustainability issues, such as the health impacts of climate change and anthropogenic activities, it becomes increasingly important to create shared information management systems that integrate all relevant factors for optimal information sharing and decision-making. The system is being designed to bring together remotely sensed, indigenous, and other data and observations for analysis, measuring, and monitoring parameters of interest (e.g., snow cover, rainfall, temperature, ice conditions, vegetation, infrastructure, fires). A description of the system and its components, as well as a preliminary application of the system in the Arctic, is presented.
77 FR 72673 - Critical Infrastructure Protection and Resilience Month, 2012
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-05
.... Cyber incidents can have devastating consequences on both physical and virtual infrastructure, which is... work within existing authorities to fortify our country against cyber risks, comprehensive legislation remains essential to improving infrastructure security, enhancing cyber information sharing between...
Rahman, Mahabubur; Watabe, Hiroshi
2018-05-01
Molecular imaging serves as an important tool for researchers and clinicians to visualize and investigate complex biochemical phenomena using specialized instruments; these instruments are either used individually or in combination with targeted imaging agents to obtain images related to specific diseases with high sensitivity, specificity, and signal-to-noise ratios. However, molecular imaging, which is a multidisciplinary research field, faces several challenges, including the integration of imaging informatics with bioinformatics and medical informatics, the requirement of reliable and robust image analysis algorithms, effective quality control of imaging facilities, and challenges related to individualized disease mapping, data sharing, software architecture, and knowledge management. As a cost-effective and open-source approach to address these challenges related to molecular imaging, we develop a flexible, transparent, and secure infrastructure, named MIRA, which stands for Molecular Imaging Repository and Analysis, primarily using the Python programming language, and a MySQL relational database system deployed on a Linux server. MIRA is designed with a centralized image archiving infrastructure and information database so that a multicenter collaborative informatics platform can be built. The capability of dealing with metadata, image file format normalization, and storing and viewing different types of documents and multimedia files makes MIRA considerably flexible. With features like logging, auditing, commenting, sharing, and searching, MIRA is useful as an Electronic Laboratory Notebook for effective knowledge management. In addition, the centralized approach for MIRA facilitates on-the-fly access to all its features remotely through any web browser. Furthermore, the open-source approach provides the opportunity for sustainable continued development. MIRA offers an infrastructure that can be used as a cross-boundary collaborative molecular imaging research platform to accelerate advances in cancer diagnosis and therapeutics. Copyright © 2018 Elsevier Ltd. All rights reserved.
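A self-contained sketch of the kind of image-metadata archiving and search such an infrastructure provides; MIRA itself uses MySQL, but sqlite3 from the Python standard library stands in here, and the schema is illustrative rather than MIRA's.

```python
# Sketch of a centralized image-metadata archive in Python. MIRA itself uses
# MySQL; sqlite3 (standard library) stands in here so the example is
# self-contained. Table and column names are illustrative, not MIRA's schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE images (
        id INTEGER PRIMARY KEY,
        modality TEXT,          -- e.g., PET, SPECT, MRI
        tracer TEXT,
        fmt TEXT,               -- normalized file format, e.g., NIfTI
        path TEXT,
        note TEXT               -- free-text lab-notebook style comment
    )
""")
con.execute(
    "INSERT INTO images (modality, tracer, fmt, path, note) VALUES (?,?,?,?,?)",
    ("PET", "FDG", "NIfTI", "/archive/study01/scan001.nii", "baseline scan"),
)

# Search support: find every FDG-PET image in the archive.
for row in con.execute(
    "SELECT path, note FROM images WHERE modality = ? AND tracer = ?",
    ("PET", "FDG"),
):
    print(row)
```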
Challenges for Data Archival Centers in Evolving Environmental Sciences
NASA Astrophysics Data System (ADS)
Wei, Y.; Cook, R. B.; Gu, L.; Santhana Vannan, S. K.; Beaty, T.
2015-12-01
Environmental science has entered into a big data era as enormous data about the Earth environment are continuously collected through field and airborne missions, remote sensing observations, model simulations, sensor networks, etc. An open-access and open-management data infrastructure for data-intensive science is a major grand challenge in global environmental research (BERAC, 2010). Such an infrastructure, as exemplified in EOSDIS, GEOSS, and NSF EarthCube, will provide a complete lifecycle of environmental data and ensure that data flow smoothly among the different phases of collection, preservation, integration, and analysis. Data archival centers, as the data integration units closest to data providers, serve as the driving force in compiling and integrating heterogeneous environmental data into this global infrastructure. This presentation discusses the interoperability challenges and practices of the geosciences from the perspective of data archival centers, based on the operational experiences of the NASA-sponsored Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) and related environmental data management activities. Specifically, we will discuss the challenges to 1) encourage and help scientists to more actively share data with the broader scientific community, so that valuable environmental data, especially the dark data collected by individual scientists in small independent projects, can be shared and integrated into the infrastructure to tackle big science questions; 2) curate heterogeneous multi-disciplinary data, focusing on the key aspects of identification, format, metadata, data quality, and semantics to make them ready to be plugged into a global data infrastructure, highlighting data curation practices at the ORNL DAAC for global campaigns such as BOREAS, LBA, and SAFARI 2000; and 3) enhance the capabilities to more effectively and efficiently expose and deliver "big" environmental data to a broad range of users and systems. Experiences and challenges with integrating large data sets via the ORNL DAAC's data discovery and delivery Web services will be discussed.
NASA Astrophysics Data System (ADS)
Dabolt, T. O.
2016-12-01
The proliferation of open data and data services continues to thrive and is creating new challenges for how researchers, policy analysts, and other decision-makers can quickly discover and use relevant data. While traditional metadata catalog approaches used by applications such as data.gov prove to be useful starting points for data search, they can quickly frustrate end users who are seeking ways to quickly find and then use data in machine-to-machine environments. The Geospatial Platform is overcoming these obstacles and providing end users and application developers a richer, more productive user experience. The Geospatial Platform leverages a collection of open-source and commercial technology hosted on Amazon Web Services, providing an ecosystem of services delivering trusted, consistent data in open formats to all users, as well as a shared infrastructure for federal partners to serve their spatial data assets. It supports a diverse array of communities of practice, ranging on topics from the 16 National Geospatial Data Assets Themes to homeland security and climate adaptation. Come learn how you can contribute your data and leverage others' or check it out on your own at https://www.geoplatform.gov/
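A hedged example of the machine-to-machine access pattern mentioned above, assuming a CKAN-style search endpoint like the one behind catalog.data.gov; the endpoint, parameters, and response fields are assumptions, not a documented Geospatial Platform API.

```python
# Machine-to-machine catalog search, assuming a CKAN-style endpoint like the
# one behind data.gov (catalog.data.gov/api/3/action/package_search). The
# endpoint, parameters, and response fields are assumptions for illustration.
import json
import urllib.parse
import urllib.request

def search_catalog(query, rows=5):
    params = urllib.parse.urlencode({"q": query, "rows": rows})
    url = f"https://catalog.data.gov/api/3/action/package_search?{params}"
    with urllib.request.urlopen(url, timeout=30) as resp:
        payload = json.load(resp)
    return [ds["title"] for ds in payload["result"]["results"]]

if __name__ == "__main__":
    for title in search_catalog("geospatial hydrography"):
        print(title)
```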
Tufts Health Sciences Database: lessons, issues, and opportunities.
Lee, Mary Y; Albright, Susan A; Alkasab, Tarik; Damassa, David A; Wang, Paul J; Eaton, Elizabeth K
2003-03-01
The authors present their seven-year experience with developing the Tufts Health Sciences Database (Tufts HSDB), a database-driven information management system that combines the strengths of a digital library, content delivery tools, and curriculum management. They describe a future where online tools will provide a health sciences learning infrastructure that fosters the work of an increasingly interdisciplinary community of learners and allows content to be shared across institutions as well as with academic and commercial information repositories. The authors note the key partners in Tufts HSDB's success--the close collaboration of the health sciences library, educational affairs, and information technology staff. Tufts HSDB moved quickly from serving the medical curriculum to supporting Tufts' veterinary, dental, biomedical sciences, and nutrition schools, thus leveraging Tufts HSDB research and development with university-wide efforts including Internet2 middleware, wireless access, information security, and digital libraries. The authors identify major effects on teaching and learning, e.g., what is better taught with multimedia, how faculty preparation and student learning time can be more efficient and effective, how content integration for interdisciplinary teaching and learning is promoted, and how continuous improvement methods can be integrated. Also addressed are issues of faculty development, copyright and intellectual property, budgetary concerns, and coordinating IT across schools and hospitals. The authors describe Tufts' recent experience with sharing its infrastructure with other schools, and welcome inquiries from those wishing to explore national and international partnerships to create a truly open and integrated infrastructure for education across the health sciences.
TCIA Secure Cyber Critical Infrastructure Modernization.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keliiaa, Curtis M.
The Sandia National Laboratories (Sandia Labs) tribal cyber infrastructure assurance initiative was developed in response to growing national cybersecurity concerns in the sixteen Department of Homeland Security (DHS) defined critical infrastructure sectors. Technical assistance is provided for the secure modernization of critical infrastructure and key resources from a cyber-ecosystem perspective with an emphasis on enhanced security, resilience, and protection. Our purpose is to address national critical infrastructure challenges as a shared responsibility.
Freshwater Choices in China: Options That Will Impact South and Southeast Asia
2014-12-04
...constructing hydro-engineering infrastructure upstream on shared international river basins within its borders, China will be able to effectively use the threat of restricting freshwater flows as a political weapon to...
NASA Astrophysics Data System (ADS)
Zeff, H. B.; Characklis, G. W.; Reed, P. M.; Herman, J. D.
2015-12-01
Water supply policies that integrate portfolios of short-term management decisions with long-term infrastructure development enable utilities to adapt to a range of future scenarios. An effective mix of short-term management actions can augment existing infrastructure, potentially forestalling new development. Likewise, coordinated expansion of infrastructure such as regional interconnections and shared treatment capacity can increase the effectiveness of some management actions like water transfers. Highly adaptable decision pathways that mix long-term infrastructure options and short-term management actions require decision triggers capable of incorporating the impact of these time-evolving decisions on growing water supply needs. Here, we adapt risk-based triggers to sequence a set of potential infrastructure options in combination with utility-specific conservation actions and inter-utility water transfers. Individual infrastructure pathways can be augmented with conservation or water transfers to reduce the cost of meeting utility objectives, but they can also include cooperatively developed, shared infrastructure that expands regional capacity to transfer water. This analysis explores the role of cooperation among four water utilities in the 'Research Triangle' region of North Carolina by formulating three distinct categories of adaptive policy pathways: independent action (utility-specific conservation and supply infrastructure only), weak cooperation (utility-specific conservation and infrastructure development with regional transfers), and strong cooperation (utility-specific conservation and jointly developed regional infrastructure that supports transfers). Results suggest that strong cooperation aids the utilities in meeting their individual objectives at substantially lower costs and with fewer irreversible infrastructure options.
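A minimal sketch of a risk-based trigger, assuming a made-up storage model: risk-of-failure (ROF) is estimated as the fraction of simulated traces that breach a critical storage level, and different thresholds trigger reversible actions versus irreversible infrastructure. All numbers and names are illustrative.

```python
# Sketch of a risk-based trigger of the kind used to sequence short-term
# actions and infrastructure options: estimate risk-of-failure (ROF) as the
# fraction of simulated weeks in which storage drops below a critical level,
# and act when ROF crosses a utility-specific threshold. Numbers are made up.
import numpy as np

rng = np.random.default_rng(42)

def risk_of_failure(storage_frac, n_sims=1000, horizon=52):
    """Fraction of simulated traces that ever fall below 20% storage."""
    inflow = rng.normal(0.02, 0.03, size=(n_sims, horizon))
    demand = 0.025
    traces = storage_frac + np.cumsum(inflow - demand, axis=1)
    return float((traces.min(axis=1) < 0.20).mean())

TRANSFER_TRIGGER = 0.02   # cheap, reversible action fires first
EXPANSION_TRIGGER = 0.10  # irreversible infrastructure fires at higher risk

rof = risk_of_failure(storage_frac=0.45)
if rof > EXPANSION_TRIGGER:
    print(f"ROF={rof:.2f}: begin jointly developed infrastructure expansion")
elif rof > TRANSFER_TRIGGER:
    print(f"ROF={rof:.2f}: request inter-utility transfer / conservation")
else:
    print(f"ROF={rof:.2f}: no action")
```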
The Earth System Grid Federation: An Open Infrastructure for Access to Distributed Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ananthakrishnan, Rachana; Bell, Gavin; Cinquini, Luca
2013-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
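A hedged example of querying the federation's distributed search API; the index-node URL and facet names below are typical of ESGF deployments but should be treated as assumptions rather than a normative interface.

```python
# Querying the federated search API that an ESGF index node exposes. The
# node URL and facet names below are typical of ESGF deployments but should
# be treated as assumptions; consult the federation's documentation.
import json
import urllib.parse
import urllib.request

params = urllib.parse.urlencode({
    "project": "CMIP5",
    "variable": "tas",            # near-surface air temperature
    "limit": 5,
    "format": "application/solr+json",
})
url = f"https://esgf-node.llnl.gov/esg-search/search?{params}"
with urllib.request.urlopen(url, timeout=30) as resp:
    docs = json.load(resp)["response"]["docs"]
for d in docs:
    print(d.get("id"))
```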
The Earth System Grid Federation : an Open Infrastructure for Access to Distributed Geospatial Data
NASA Technical Reports Server (NTRS)
Cinquini, Luca; Crichton, Daniel; Mattmann, Chris; Harney, John; Shipman, Galen; Wang, Feiyi; Ananthakrishnan, Rachana; Miller, Neill; Denvil, Sebastian; Morgan, Mark;
2012-01-01
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration that aims at developing the software infrastructure needed to facilitate and empower the study of climate change on a global scale. The ESGF's architecture employs a system of geographically distributed peer nodes, which are independently administered yet united by the adoption of common federation protocols and application programming interfaces (APIs). The cornerstones of its interoperability are the peer-to-peer messaging that is continuously exchanged among all nodes in the federation; a shared architecture and API for search and discovery; and a security infrastructure based on industry standards (OpenID, SSL, GSI and SAML). The ESGF software is developed collaboratively across institutional boundaries and made available to the community as open source. It has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the entire model output used for the next international assessment report on climate change (IPCC-AR5) and a suite of satellite observations (obs4MIPs) and reanalysis data sets (ANA4MIPs).
Franco, Natália M; Medeiros, Gabriel F; Silva, Edson A; Murta, Angela S; Machado, Aydano P; Fidalgo, Robson N
2015-01-01
This work presents a Modeling Language and its technological infrastructure to customize the vocabulary of Communication Boards (CB), which are important tools for providing more humanized health care. Using a technological infrastructure based on the Model-Driven Development (MDD) approach, our Modeling Language (ML) creates an abstraction layer between users (e.g., health professionals such as an audiologist or speech therapist) and application code. Moreover, the use of a metamodel enables a syntactic corrector that prevents the creation of wrong models. Our ML and metamodel give health professionals more autonomy in creating customized CB because they abstract complexities and permit users to deal only with the domain concepts (e.g., vocabulary and patient needs). Additionally, our infrastructure provides a configuration file that can be used to share and reuse models. This way, the vocabulary modeling effort will decrease over time as people share vocabulary models. Our study provides an infrastructure that aims to abstract the complexity of CB vocabulary customization, giving more autonomy to health professionals when they need to customize, share, and reuse vocabularies for CB.
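The metamodel-plus-syntactic-corrector idea can be sketched as follows; the classes, the validation rules, and the shareable configuration format are invented for illustration and are not the authors' metamodel.

```python
# Toy illustration of the metamodel idea: the model layer only lets a health
# professional combine domain concepts (vocabulary terms, board cells), and a
# validation step rejects syntactically wrong models before any code is
# generated. Class names are illustrative, not the authors' metamodel.
import json

class Vocabulary:
    def __init__(self, terms):
        self.terms = set(terms)

class Board:
    def __init__(self, vocabulary, rows, cols):
        self.vocabulary, self.rows, self.cols = vocabulary, rows, cols
        self.cells = {}  # (row, col) -> term

    def place(self, row, col, term):
        self.cells[(row, col)] = term

    def validate(self):
        """Syntactic corrector: every cell must sit on the grid and use a known term."""
        errors = []
        for (r, c), term in self.cells.items():
            if not (0 <= r < self.rows and 0 <= c < self.cols):
                errors.append(f"cell ({r},{c}) outside {self.rows}x{self.cols} grid")
            if term not in self.vocabulary.terms:
                errors.append(f"unknown term '{term}'")
        return errors

    def to_config(self):
        """Shareable configuration, so vocabularies can be reused across boards."""
        return json.dumps({f"{r},{c}": t for (r, c), t in self.cells.items()})

vocab = Vocabulary({"water", "pain", "yes", "no"})
board = Board(vocab, rows=2, cols=2)
board.place(0, 0, "water")
board.place(0, 1, "hungry")  # not in the vocabulary
print(board.validate())      # ["unknown term 'hungry'"]
```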
University of Maryland MRSEC - Facilities: Instrumentation Infrastructure
More Bang for the Buck: Integrating Green Infrastructure into Existing Public Works Projects
shares lessons learned from municipal and county officials experienced in coordinating green infrastructure applications with scheduled street maintenance, park improvements, and projects on public sites.
Alternative Fuels Data Center: Smith Dairy Deploys Natural Gas Vehicles and Fueling Infrastructure in the Midwest
NASA Astrophysics Data System (ADS)
Delle Fratte, C.; Kennedy, J. A.; Kluth, S.; Mazzaferro, L.
2015-12-01
In a grid computing infrastructure, tasks such as continuous upgrades, service installations, and software deployments are part of an admin's daily work. In such an environment, tools that help with the management, provisioning, and monitoring of the deployed systems and services have become crucial. As experiments such as the LHC increase in scale, the computing infrastructure also becomes larger and more complex. Moreover, today's admins increasingly work within teams that share responsibilities and tasks. Such a scaled-up situation requires tools that not only simplify the workload on administrators but also enable them to work seamlessly in teams. In this paper we present our experience of managing the Max Planck Institute Tier2 using Puppet and Gitolite in a cooperative way to support the system administrators in their daily work. In addition to describing the Puppet-Gitolite system, best practices and customizations are also shown.
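Puppet manifests are declarative descriptions of desired state that an agent converges each node toward, idempotently. Since Puppet's own DSL is outside the single example language used here, the sketch below illustrates the idea in Python; the resource model is invented.

```python
# The declarative idea behind Puppet, sketched in Python rather than in
# Puppet's own DSL: describe the desired state once, and let an idempotent
# "apply" step converge every node, so repeated runs are safe for a team.
desired_state = {
    "packages": {"httpd": "installed"},
    "services": {"httpd": "running"},
}

def apply(node_state, desired):
    """Converge node_state toward desired; report only what changed."""
    changes = []
    for section, items in desired.items():
        for name, want in items.items():
            have = node_state.setdefault(section, {}).get(name)
            if have != want:
                node_state[section][name] = want
                changes.append(f"{section}/{name}: {have} -> {want}")
    return changes

node = {"packages": {"httpd": "absent"}}
print(apply(node, desired_state))  # first run converges the node
print(apply(node, desired_state))  # second run is a no-op: []
```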
A Cloud-based Infrastructure and Architecture for Environmental System Research
NASA Astrophysics Data System (ADS)
Wang, D.; Wei, Y.; Shankar, M.; Quigley, J.; Wilson, B. E.
2016-12-01
The present availability of high-capacity networks, low-cost computers and storage devices, and the widespread adoption of hardware virtualization and service-oriented architecture provide a great opportunity to enable data and computing infrastructure sharing between closely related research activities. By taking advantage of these approaches, along with the world-class high-performance computing and data infrastructure located at Oak Ridge National Laboratory, a cloud-based infrastructure and architecture has been developed to efficiently deliver essential data and informatics services and utilities to the environmental system research community, and to provide unique capabilities that allow terrestrial ecosystem research projects to share their software utilities (tools), data, and even data submission workflows in a straightforward fashion. The infrastructure will minimize large disruptions to current project-based data submission workflows for better acceptance by existing projects, since many ecosystem research projects already have their own requirements or preferences for data submission and collection. The infrastructure will eliminate scalability problems with current project silos by providing unified data services and infrastructure. The infrastructure consists of two key components: (1) a collection of configurable virtual computing environments and user management systems that expedite data submission and collection from the environmental system research community, and (2) scalable data management services and systems, originated and developed by ORNL data centers.
Developing Governance for Federated Community-based EHR Data Sharing
Lin, Ching-Ping; Stephens, Kari A.; Baldwin, Laura-Mae; Keppel, Gina A.; Whitener, Ron J.; Echo-Hawk, Abigail; Korngiebel, Diane
2014-01-01
Bi-directional translational pathways between scientific discoveries and primary care are crucial for improving individual patient care and population health. The Data QUEST pilot project is a program supporting data sharing among community-based primary care practices and is built on a technical infrastructure to share electronic health record data. We developed a set of governance requirements from interviewing and collaborating with partner organizations. Recommendations from our partner organizations included: 1) partner organizations can physically terminate the link to the data sharing network and only approved data exits the local site; 2) partner organizations must approve or reject each query; 3) partner organizations and researchers must respect local processes, resource restrictions, and infrastructures; and 4) partner organizations can be seamlessly added and removed from any individual data sharing query or the entire network. PMID:25717404
A national-scale authentication infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, R.; Engert, D.; Foster, I.
2000-12-01
Today, individuals and institutions in science and industry are increasingly forming virtual organizations to pool resources and tackle a common goal. Participants in virtual organizations commonly need to share resources such as data archives, computer cycles, and networks - resources usually available only with restrictions based on the requested resource's nature and the user's identity. Thus, any sharing mechanism must have the ability to authenticate the user's identity and determine if the user is authorized to request the resource. Virtual organizations tend to be fluid, however, so authentication mechanisms must be flexible and lightweight, allowing administrators to quickly establish and change resource-sharing arrangements. However, because virtual organizations complement rather than replace existing institutions, sharing mechanisms cannot change local policies and must allow individual institutions to maintain control over their own resources. Our group has created and deployed an authentication and authorization infrastructure that meets these requirements: the Grid Security Infrastructure. GSI offers secure single sign-ons and preserves site control over access policies and local security. It provides its own versions of common applications, such as FTP and remote login, and a programming interface for creating secure applications.
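The authenticate-then-authorize flow, with authorization left to site-local policy, can be sketched as below; this illustrates the control flow only and is not the GSI implementation, which rests on X.509 proxy credentials.

```python
# Sketch of the two-step gate the abstract describes -- authenticate the
# user's identity once, then let *local* site policy decide authorization.
# This illustrates the flow only; it is not the GSI implementation.
SITE_POLICY = {
    # local access-control decisions stay under each institution's control
    "data-archive": {"alice@uchicago", "bob@anl"},
    "compute":      {"alice@uchicago"},
}

def authenticate(credential):
    """Stand-in for certificate verification; returns an identity or None."""
    return credential.get("identity") if credential.get("signature_ok") else None

def authorize(identity, resource):
    return identity in SITE_POLICY.get(resource, set())

def request_resource(credential, resource):
    identity = authenticate(credential)
    if identity is None:
        return "authentication failed"
    if not authorize(identity, resource):
        return f"{identity} not authorized for {resource}"
    return f"{identity} granted {resource}"

print(request_resource({"identity": "alice@uchicago", "signature_ok": True}, "compute"))
print(request_resource({"identity": "bob@anl", "signature_ok": True}, "compute"))
```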
DOT National Transportation Integrated Search
1997-06-06
Shared resource projects offer an opportunity for public transportation agencies to leverage property assets in exchange for support for transportation programs. Intelligent transportation systems (ITS) require wireline infrastructure in roadway ROW ...
Changes in mode of travel to work: a natural experimental study of new transport infrastructure.
Heinen, Eva; Panter, Jenna; Mackett, Roger; Ogilvie, David
2015-06-20
New transport infrastructure may promote a shift towards active travel, thereby improving population health. The purpose of this study was to determine the effect of a major transport infrastructure project on commuters' mode of travel, trip frequency and distance travelled to work. Quasi-experimental analysis nested within a cohort study of 470 adults working in Cambridge, UK. The intervention consisted of the opening of a guided busway with a path for walking and cycling in 2011. Exposure to the intervention was defined as the negative of the square root of the shortest distance from home to busway. The outcome measures were changes in commute mode share and number of commute trips - both based on a seven-day travel-to-work record collected before (2009) and after (2012) the intervention - and change in objective commute distance. The mode share outcomes were changes in the proportions of trips (i) involving any active travel, (ii) involving any public transport, and (iii) made entirely by car. Separate multinomial regression models were estimated adjusting for commute and sociodemographic characteristics, residential settlement size and life events. Proximity to the busway predicted an increased likelihood of a large (>30 %) increase in the share of commute trips involving any active travel (relative risk ratio [RRR] 1.80, 95 % CI 1.27, 2.55) and a large decrease in the share of trips made entirely by car (RRR 2.09, 95 % CI 1.35, 3.21), as well as a lower likelihood of a small (<30 %) reduction in the share of trips involving any active travel (RRR 0.47, 95 % CI 0.28, 0.81). It was not associated with changes in the share of commute trips involving any public transport, the number of commute trips, or commute distance. The new infrastructure promoted an increase in the share of commuting trips involving active travel and a decrease in the share made entirely by car. Further analysis will show the extent to which the changes in commute mode share were translated into an increase in time spent in active commuting and consequent health gain.
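The exposure metric defined above is simple enough to state as code; the distances here are hypothetical.

```python
# The exposure metric defined above, as a one-liner: more negative values
# mean the home is farther from the busway. Distances are hypothetical.
import math

def exposure(distance_km):
    """Negative square root of shortest home-to-busway distance."""
    return -math.sqrt(distance_km)

for d in (0.25, 1.0, 4.0):
    print(f"{d:4.2f} km -> exposure {exposure(d):+.2f}")
# 0.25 km -> -0.50 ; 1.00 km -> -1.00 ; 4.00 km -> -2.00
```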
Detmer, Don E
2003-01-01
Background Improving health in our nation requires strengthening four major domains of the health care system: personal health management, health care delivery, public health, and health-related research. Many avoidable shortcomings in the health sector that result in poor quality are due to inaccessible data, information, and knowledge. A national health information infrastructure (NHII) offers the connectivity and knowledge management essential to correct these shortcomings. Better health and a better health system are within our reach. Discussion A national health information infrastructure for the United States should address the needs of personal health management, health care delivery, public health, and research. It should also address relevant global dimensions (e.g., standards for sharing data and knowledge across national boundaries). The public and private sectors will need to collaborate to build a robust national health information infrastructure, essentially a 'paperless' health care system, for the United States. The federal government should assume leadership for assuring a national health information infrastructure as recommended by the National Committee on Vital and Health Statistics and the President's Information Technology Advisory Committee. Progress is needed in the areas of funding, incentives, standards, and continued refinement of a privacy (i.e., confidentiality and security) framework to facilitate personal identification for health purposes. Particular attention should be paid to NHII leadership and change management challenges. Summary A national health information infrastructure is a necessary step for improved health in the U.S. It will require a concerted, collaborative effort by both public and private sectors. If you cannot measure it, you cannot improve it. Lord Kelvin PMID:12525262
Metadata squared: enhancing its usability for volunteered geographic information and the GeoWeb
Poore, Barbara S.; Wolf, Eric B.; Sui, Daniel Z.; Elwood, Sarah; Goodchild, Michael F.
2013-01-01
The Internet has brought many changes to the way geographic information is created and shared. One aspect that has not changed is metadata. Static spatial data quality descriptions were standardized in the mid-1990s and cannot accommodate the current climate of data creation where nonexperts are using mobile phones and other location-based devices on a continuous basis to contribute data to Internet mapping platforms. The usability of standard geospatial metadata is being questioned by academics and neogeographers alike. This chapter analyzes current discussions of metadata to demonstrate how the media shift that is occurring has affected requirements for metadata. Two case studies of metadata use are presented—online sharing of environmental information through a regional spatial data infrastructure in the early 2000s, and new types of metadata that are being used today in OpenStreetMap, a map of the world created entirely by volunteers. Changes in metadata requirements are examined for usability, the ease with which metadata supports coproduction of data by communities of users, how metadata enhances findability, and how the relationship between metadata and data has changed. We argue that traditional metadata associated with spatial data infrastructures is inadequate and suggest several research avenues to make this type of metadata more interactive and effective in the GeoWeb.
QSIA--A Web-Based Environment for Learning, Assessing and Knowledge Sharing in Communities
ERIC Educational Resources Information Center
Rafaeli, Sheizaf; Barak, Miri; Dan-Gur, Yuval; Toch, Eran
2004-01-01
This paper describes a Web-based and distributed system named QSIA that serves as an environment for learning, assessing and knowledge sharing. QSIA--Questions Sharing and Interactive Assignments--offers a unified infrastructure for developing, collecting, managing and sharing of knowledge items. QSIA enhances collaboration in authoring via online…
Integrated Air Surveillance Concept of Operations
2011-11-01
information, intelligence, weather data, and other situational awareness-related information. 4.2.4 Shared Services: Automated processing of sensor and... other surveillance information will occur through shared services, accessible through an enterprise network infrastructure, that provide for collecting... also be provided, such as information discovery and translation. The IS architecture effort will identify specific shared services.
Eco-logical successes : second edition, January 2012
DOT National Transportation Integrated Search
2012-01-01
In 2006, leaders from eight Federal agencies signed the interagency document Eco-Logical: An Ecosystem Approach to Developing Infrastructure Projects. Eco-Logical outlines a shared vision of how to develop infrastructure projects in...
Implementation of a health data-sharing infrastructure across diverse primary care organizations.
Cole, Allison M; Stephens, Kari A; Keppel, Gina A; Lin, Ching-Ping; Baldwin, Laura-Mae
2014-01-01
Practice-based research networks bring together academic researchers and primary care clinicians to conduct research that improves health outcomes in real-world settings. The Washington, Wyoming, Alaska, Montana, and Idaho region Practice and Research Network implemented a health data-sharing infrastructure across 9 clinics in 3 primary care organizations. Following implementation, we identified challenges and solutions. Challenges included working with diverse primary care organizations, adoption of health information data-sharing technology in a rapidly changing local and national landscape, and limited resources for implementation. Overarching solutions included working with a multidisciplinary academic implementation team, maintaining flexibility, and starting with an established network for primary care organizations. Approaches outlined may generalize to similar initiatives and facilitate adoption of health data sharing in other practice-based research networks.
Challenges in sharing of geospatial data by data custodians in South Africa
NASA Astrophysics Data System (ADS)
Kay, Sissiel E.
2018-05-01
As most development planning and rendering of public services happens at a place or in a space, geospatial data is required. This geospatial data is best managed through a spatial data infrastructure, which has the sharing of geospatial data as a key objective. The collection and maintenance of geospatial data is expensive and time consuming, and so the principle of "collect once - use many times" should apply. It is best to obtain the geospatial data from the authoritative source - the appointed data custodian. In South Africa the South African Spatial Data Infrastructure (SASDI) is the means to achieve the requirement for geospatial data sharing. This requires geospatial data sharing to take place between the data custodian and the user. All data custodians are expected to comply with the Spatial Data Infrastructure Act (SDI Act) in terms of geospatial data sharing. Currently data custodians are experiencing challenges with regard to the sharing of geospatial data. This research is based on the current ten data themes selected by the Committee for Spatial Information and the organisations identified as the data custodians for these ten data themes. The objectives are to determine whether the identified data custodians comply with the SDI Act with respect to geospatial data sharing, and if not, what the reasons for this are. Through an international comparative assessment it then determines whether compliance with the SDI Act is too onerous on the data custodians. The research concludes that there are challenges with geospatial data sharing in South Africa and that the data custodians only partially comply with the SDI Act in terms of geospatial data sharing. However, it is shown that the South African legislation is not too onerous on the data custodians.
Building and strengthening infrastructure for data exchange: lessons from the beacon communities.
Torres, Gretchen W; Swietek, Karen; Ubri, Petry S; Singer, Rachel F; Lowell, Kristina H; Miller, Wilhelmine
2014-01-01
The Beacon Community Cooperative Agreement Program supports interventions, including care-delivery innovations, provider performance measurement and feedback initiatives, and tools for providers and consumers to enhance care. Using a learning health system framework, we examine the Beacon Communities' processes in building and strengthening health IT (HIT) infrastructures, specifically successes and challenges in sharing patient information to improve clinical care. In 2010, the Office of the National Coordinator for Health Information Technology (ONC) launched the three-year program, which provided $250 million to 17 Beacon Communities to invest in HIT and health information exchange (HIE) infrastructure. Beacon Communities used this funding to develop and disseminate HIT-enabled quality improvement practices found effective in particular community and practice environments. NORC conducted 7 site visits, November 2012-March 2013, selecting Communities to represent diverse program features. From August-October 2013, NORC held discussions with the remaining 10 Communities. Following each visit or discussion, NORC summarized the information gathered, including transcripts, team observations, and other documents the Community provided, to facilitate a within-Community analysis of context and stakeholders, intervention strategies, enabling factors, and challenges. Although each Community designed and implemented data-sharing strategies in a unique environment, similar challenges and enabling factors emerged across the Beacons. From a learning health system perspective, their strategies to build and strengthen data-sharing infrastructures address the following crosscutting priorities: promoting technical advances and innovations by helping providers adapt EHRs for data exchange and performance measurement with customizable IT and offering technical support to smaller, independent providers; engaging key stakeholders; and fostering transparent governance and stewardship of the infrastructure with neutral conveners. While all the Communities developed or strengthened data-exchange infrastructure, each did this in a unique environment of existing health care market and legal factors. The Communities, however, encountered similar challenges and enabling factors. Organizations undertaking collaborative data sharing, performance measurement and clinical transformation can learn from the Beacon Communities' experience.
Securing services in the cloud: an investigation of the threats and the mitigations
NASA Astrophysics Data System (ADS)
Farroha, Bassam S.; Farroha, Deborah L.
2012-05-01
Stakeholders' security concerns over data in the cloud (voice, video, and text) are a real concern to the DoD, the IC, and the private sector. This is primarily due to the lack of physical isolation of data when migrating to shared infrastructure platforms. The security concerns relate to the privacy and regulatory compliance required in many industries (healthcare, financial, law enforcement, DoD, etc.) and to corporate knowledge databases. The new paradigm depends on the service provider to ensure that the customer's information is continuously monitored and is kept available, secure, access-controlled, and isolated from potential adversaries.
Codifying Collegiality: Recent Developments in Data Sharing Policy in the Life Sciences
Pham-Kanter, Genevieve; Zinner, Darren E.; Campbell, Eric G.
2014-01-01
Over the last decade, there have been significant changes in data sharing policies and in the data sharing environment faced by life science researchers. Using data from a 2013 survey of over 1600 life science researchers, we analyze the effects of sharing policies of funding agencies and journals. We also examine the effects of new sharing infrastructure and tools (i.e., third party repositories and online supplements). We find that recently enacted data sharing policies and new sharing infrastructure and tools have had a sizable effect on encouraging data sharing. In particular, third party repositories and online supplements as well as data sharing requirements of funding agencies, particularly the NIH and the National Human Genome Research Institute, were perceived by scientists to have had a large effect on facilitating data sharing. In addition, we found a high degree of compliance with these new policies, although noncompliance resulted in few formal or informal sanctions. Despite the overall effectiveness of data sharing policies, some significant gaps remain: about one third of grant reviewers placed no weight on data sharing plans in their reviews, and a similar percentage ignored the requirements of material transfer agreements. These patterns suggest that although most of these new policies have been effective, there is still room for policy improvement. PMID:25259842
NASA Astrophysics Data System (ADS)
Prodanovic, M.; Esteva, M.; Ketcham, R. A.; Hanlon, M.; Pettengill, M.; Ranganath, A.; Venkatesh, A.
2016-12-01
Due to advances in imaging modalities such as X-ray microtomography and scanning electron microscopy, 2D and 3D imaged datasets of rock microstructure on the nanometer to centimeter length scale allow investigation of nonlinear flow and mechanical phenomena using numerical approaches. This in turn produces various upscaled parameters required by subsurface flow and deformation simulators. However, a single research group typically specializes in an imaging modality and/or related modeling on a single length scale, and the lack of data-sharing infrastructure makes it difficult to integrate different length scales. We developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (http://www.digitalrocksportal.org), that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of geosciences or engineering researchers not necessarily trained in computer science or data analysis. Our objective is to enable scientific inquiry and engineering decisions founded on a data-driven basis. We show how the data loaded in the portal can be documented, referenced in publications via digital object identifiers, visualized, and linked to other repositories. We then show preliminary results on integrating a remote parallel visualization and flow simulation workflow with the pore structures currently stored in the repository. We finally discuss the issues of collecting correct metadata, data discoverability, and repository sustainability. This is the first repository of its kind for this particular data, but it is part of the wider ecosystem of geoscience data and model cyber-infrastructure called "EarthCube" (http://earthcube.org/) sponsored by the National Science Foundation. For data sustainability and continuous access, the portal is implemented within the reliable, 24/7 maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative.
Providing the Tools for Information Sharing: Net-Centric Enterprise Services
2007-07-01
The Department of Defense (DoD) is establishing a net-centric environment that increasingly leverages shared services and Service-Oriented...transformational program that delivers a set of shared services as part of the DoD’s common infrastructure to enable networked joint force capabilities, improved interoperability, and increased information sharing across mission area services.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, D. N.
2015-06-22
The Earth System Grid Federation (ESGF) is a multi-agency, international collaboration whose purpose is to develop the software infrastructure needed to facilitate and empower the study of climate change on a global scale. ESGF's architecture employs a system of geographically distributed peer nodes that are independently administered yet united by common federation protocols and application programming interfaces. The cornerstones of its interoperability are the peer-to-peer messaging, which is continuously exchanged among all nodes in the federation; a shared architecture for search and discovery; and a security infrastructure based on industry standards. ESGF integrates popular application engines available from the open-source community with custom components (for data publishing, searching, user interface, security, and messaging) that were developed collaboratively by the team. The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP) output used by the Intergovernmental Panel on Climate Change assessment reports. ESGF is a successful example of integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community.
e!DAL--a framework to store, share and publish research data.
Arend, Daniel; Lange, Matthias; Chen, Jinbo; Colmsee, Christian; Flemming, Steffen; Hecht, Denny; Scholz, Uwe
2014-06-24
The life-science community faces a major challenge in handling "big data", highlighting the need for high quality infrastructures capable of sharing and publishing research data. Data preservation, analysis, and publication are the three pillars in the "big data life cycle". The infrastructures currently available for managing and publishing data are often designed to meet domain-specific or project-specific requirements, resulting in the repeated development of proprietary solutions and lower quality data publication and preservation overall. e!DAL is a lightweight software framework for publishing and sharing research data. Its main features are version tracking, metadata management, information retrieval, registration of persistent identifiers (DOI), an embedded HTTP(S) server for public data access, access as a network file system, and a scalable storage backend. e!DAL is available as an API for local non-shared storage and as a remote API featuring distributed applications. It can be deployed "out-of-the-box" as an on-site repository. e!DAL was developed based on experiences coming from decades of research data management at the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK). Initially developed as a data publication and documentation infrastructure for the IPK's role as a data center in the DataCite consortium, e!DAL has grown towards being a general data archiving and publication infrastructure. The e!DAL software has been deployed into the Maven Central Repository. Documentation and Software are also available at: http://edal.ipk-gatersleben.de.
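A hedged sketch of the publication primitives the abstract lists (versioned storage, metadata, and a persistent identifier per published version); e!DAL's real API is Java, so every name below is hypothetical, with the DataCite test prefix 10.5072 standing in for a real DOI prefix.

```python
# Illustration of the repository primitives listed above -- versioned storage,
# metadata, and a persistent identifier per published version. e!DAL's real
# API is Java; every name here is hypothetical.
import hashlib

class Dataset:
    def __init__(self, name):
        self.name = name
        self.versions = []  # (content, metadata, identifier)

    def publish(self, content: bytes, metadata: dict):
        version = len(self.versions) + 1
        # stand-in for DOI registration with an agency such as DataCite
        identifier = "10.5072/" + hashlib.sha1(
            f"{self.name}/{version}".encode()
        ).hexdigest()[:10]
        self.versions.append((content, metadata, identifier))
        return identifier

ds = Dataset("barley-phenotypes")
doi_v1 = ds.publish(b"...csv...", {"creator": "IPK", "year": 2014})
doi_v2 = ds.publish(b"...csv v2...", {"creator": "IPK", "year": 2014})
print(doi_v1, doi_v2, len(ds.versions))  # two versions, each citable
```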
Support for Taverna workflows in the VPH-Share cloud platform.
Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F
2017-07-01
To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical & bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. The results are: 1) seamless integration of VPH-Share with other components and systems; 2) an extended range of different tools for workflows; 3) successful integration of scientific workflows from other VPH projects; and 4) execution speed improvement for medical applications. The presented workflow integration provides VPH-Share users with a wide range of different possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, we acknowledge that further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.
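Workflow composition and execution of the kind Taverna expresses graphically reduces to running tasks in dependency order; the sketch below does this with Python's graphlib, with invented tasks, and is not Taverna's or VPH-Share's API.

```python
# Minimal workflow execution in topological order -- the kind of step
# orchestration a Taverna workflow expresses graphically. The tasks and
# wiring are invented; this is not Taverna's or VPH-Share's API.
from graphlib import TopologicalSorter

def segment(image):        return f"segmented({image})"
def mesh(seg):             return f"mesh({seg})"
def simulate(m):           return f"flow-solution({m})"

workflow = {
    "segment": ([], lambda inputs: segment("aorta.nii")),
    "mesh": (["segment"], lambda inputs: mesh(inputs["segment"])),
    "simulate": (["mesh"], lambda inputs: simulate(inputs["mesh"])),
}

results = {}
deps_graph = {step: set(deps) for step, (deps, _) in workflow.items()}
for step in TopologicalSorter(deps_graph).static_order():
    deps, fn = workflow[step]
    results[step] = fn({d: results[d] for d in deps})

print(results["simulate"])  # flow-solution(mesh(segmented(aorta.nii)))
```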
78 FR 16699 - National Maritime Security Advisory Committee; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-18
... Executive Order to strengthen the cybersecurity of critical infrastructure by increasing information sharing and by jointly developing and implementing a framework of cybersecurity practices with our...-press-office/2013/02/12/executive-order-improving-critical-infrastructure-cybersecurity . (2...
23 CFR 505.13 - Federal Government's share of project cost.
Code of Federal Regulations, 2010 CFR
2010-04-01
... INFRASTRUCTURE MANAGEMENT PROJECTS OF NATIONAL AND REGIONAL SIGNIFICANCE EVALUATION AND RATING § 505.13 Federal Government's share of project cost. (a) Based on engineering studies, studies of economic feasibility, and... 23 Highways 1 2010-04-01 2010-04-01 false Federal Government's share of project cost. 505.13...
NIST Document Sharing Test Facility
This site supports the IHE effort in Document Sharing. The test facility is based on the IHE IT Infrastructure Technical Framework. All testing done against this facility requires that Patient IDs be pre-registered before submitting metadata about them. To allocate new patient IDs...
Cybersecurity Information Sharing Between Public Private Sector Agencies
2015-03-01
Recognizing the lack of scholarly literature on PPPs and protecting CI from all hazards, including cyber-related threats, Nathan Busch and Austen...referred to as SLTT), and the owners and operators in charge of critical infrastructure, to manage risks and increase resiliency against all hazards. PPD...and hazards to critical infrastructure security and resilience, and called for an updated National Infrastructure Protection Plan (NIPP). Despite
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-30
... allows a member firm to assign up to 30 of its OUCH ports to a dedicated server infrastructure for its exclusive use. A Dedicated OUCH Port Infrastructure subscription is available to a member firm for a fee of... certainty to a subscribing member firm that its OUCH ports are not on shared infrastructure. 15 U.S.C...
Informatics Infrastructure for the Materials Genome Initiative
NASA Astrophysics Data System (ADS)
Dima, Alden; Bhaskarla, Sunil; Becker, Chandler; Brady, Mary; Campbell, Carelyn; Dessauw, Philippe; Hanisch, Robert; Kattner, Ursula; Kroenlein, Kenneth; Newrock, Marcus; Peskin, Adele; Plante, Raymond; Li, Sheng-Yen; Rigodiat, Pierre-François; Amaral, Guillaume Sousa; Trautt, Zachary; Schmitt, Xavier; Warren, James; Youssef, Sharief
2016-08-01
A materials data infrastructure that enables the sharing and transformation of a wide range of materials data is an essential part of achieving the goals of the Materials Genome Initiative. We describe two high-level requirements of such an infrastructure as well as an emerging open-source implementation consisting of the Materials Data Curation System and the National Institute of Standards and Technology Materials Resource Registry.
NASA Astrophysics Data System (ADS)
Bandaragoda, C.; Castronova, A. M.; Phuong, J.; Istanbulluoglu, E.; Strauch, R. L.; Nudurupati, S. S.; Tarboton, D. G.; Wang, S. W.; Yin, D.; Barnhart, K. R.; Tucker, G. E.; Hutton, E.; Hobley, D. E. J.; Gasparini, N. M.; Adams, J. M.
2017-12-01
The ability to test hypotheses about hydrology, geomorphology and atmospheric processes is invaluable to research in the era of big data. Although community resources are available, significant educational, logistical and time-investment barriers to their use remain. Knowledge infrastructure is an emerging intellectual framework for understanding how people create, share and distribute knowledge - an activity that has been dramatically transformed by Internet technologies. In addition to the technical and social components of a cyberinfrastructure system, knowledge infrastructure considers the educational, institutional, and open-source governance components required to advance knowledge. We are designing an infrastructure environment that lowers common barriers to reproducing modeling experiments for earth surface investigation. Landlab is an open-source modeling toolkit for building, coupling, and exploring two-dimensional numerical models. HydroShare is an online collaborative environment for sharing hydrologic data and models. CyberGIS-Jupyter is an innovative cyberGIS framework for data-intensive, reproducible, and scalable geospatial analytics using the Jupyter Notebook, based on ROGER - the first cyberGIS supercomputer - so that models can be elastically reproduced through cloud computing approaches. Our team of geomorphologists, hydrologists, and computational geoscientists has created a new infrastructure environment that combines these three pieces of software to enable knowledge discovery. Through this novel integration, any user can interactively execute and explore shared data and model resources. Landlab on HydroShare with CyberGIS-Jupyter supports the modeling continuum from fully developed modeling applications, to prototyping new science tools, to hands-on research demonstrations for training workshops and classroom applications. Computational geospatial models based on big data and high-performance computing can now be more efficiently developed, improved, scaled, and seamlessly reproduced among multidisciplinary users, thereby expanding active-learning curricula and research opportunities for students in earth surface modeling and informatics.
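Landlab itself is an open-source Python toolkit, so the modeling step described here can be sketched directly. The fragment below builds a small raster grid and runs Landlab's linear hillslope-diffusion component for a few steps; the grid size, diffusivity, and time step are arbitrary values chosen for illustration, not ones used in the study.

```python
# Minimal Landlab run: a raster grid smoothed by linear diffusion, the
# kind of model that can be shared and re-run via HydroShare notebooks.
# All parameter values here are arbitrary illustrations.
import numpy as np
from landlab import RasterModelGrid
from landlab.components import LinearDiffuser

grid = RasterModelGrid((25, 25), xy_spacing=10.0)  # 25x25 nodes, 10 m apart
z = grid.add_zeros("topographic__elevation", at="node")
z += np.random.rand(grid.number_of_nodes)          # small random relief

diffuser = LinearDiffuser(grid, linear_diffusivity=0.01)  # m^2/yr, assumed
for _ in range(200):                               # 200 steps of 1000 yr
    diffuser.run_one_step(1000.0)

print(z.mean())  # mean elevation after diffusion smooths the surface
```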
Leong, T-Y
2012-01-01
This paper summarizes the recent trends and highlights the challenges and opportunities in decision support and knowledge management for patient-centered, personalized, and personal health care. The discussion is based on a broad survey of related references, focusing on the most recent publications. Major advances are examined in the areas of i) shared decision-making paradigms, ii) continuity-of-care infrastructures and architectures, iii) human factors and system design approaches, iv) knowledge management innovations, and v) practical deployment and change considerations. Many important initiatives, projects, and plans with promising results have been identified. The common themes focus on supporting individual patients, who are playing an increasingly central role in their own care decision processes. New collaborative decision-making paradigms and information infrastructures are required to ensure effective continuity of care. Human factors and usability are crucial for the successful development and deployment of the relevant systems, tools, and aids. Advances in personalized medicine can be achieved through integrating genomic, phenotypic and other biological, individual, and population-level information, and gaining useful insights from building and analyzing biological and other models at multiple levels of abstraction. Therefore, new Information and Communication Technologies and evaluation approaches are needed to effectively manage the scale and complexity of biomedical and health information, and to adapt to the changing nature of clinical decision support. Recent research in decision support and knowledge management combines heterogeneous information and personal data to provide cost-effective, calibrated, personalized support in shared decision making at the point of care. Current and emerging efforts concentrate on developing or extending conventional paradigms, techniques, systems, and architectures for the new predictive, preemptive, and participatory health care model for patient-centered, personalized medicine. There is also an increasing emphasis on managing complexity with changing care models, processes, and settings.
Sharing simulation-based training courses between institutions: opportunities and challenges.
Laack, Torrey A; Lones, Ellen A; Schumacher, Donna R; Todd, Frances M; Cook, David A
2017-01-01
Sharing simulation-based training (SBT) courses between institutions could reduce the time needed to develop new content but also presents challenges. We evaluated the process of sharing SBT courses across institutions in a mixed-methods study, estimating the time required and identifying barriers and potential solutions. Two US academic medical institutions explored instructor experiences with the process of sharing four courses (two at each site) using personal interviews and a written survey, and estimated the time needed to develop new content versus implementing existing SBT courses. The project team spent approximately 618 h creating a collaboration infrastructure to support course sharing. Sharing two SBT courses was estimated to save 391 h compared with developing two new courses. In the qualitative analysis, participants noted that the primary benefit of course sharing was time savings. Barriers included difficulty finding information and understanding overall course flow. Suggestions for improvement included establishing a standardized template, clearly identifying the target audience, providing a course overview, communicating with someone familiar with the original SBT course, employing an intuitive file-sharing platform, and considering local culture, context, and needs. Sharing SBT courses between institutions is feasible but not without challenges. An initial investment in a sharing infrastructure may facilitate downstream time savings compared with developing content de novo.
78 FR 19277 - National Maritime Security Advisory Committee; Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-29
... Obama signed an Executive Order to strengthen the cybersecurity of critical infrastructure by increasing information sharing and by jointly developing and implementing a framework of cybersecurity practices with our...-press-office/2013/02/12/executive-order-improving-critical-infrastructure-cybersecurity . (2...
From Start-up to Sustainability: A Decade of Collaboration to Shape the Future of Nursing.
Gubrud, Paula; Spencer, Angela G; Wagner, Linda
This article describes the progress the Oregon Consortium for Nursing Education has made toward addressing the academic progression goals of the 2011 Institute of Medicine's Future of Nursing: Leading Change, Advancing Health report. The history of the consortium's development is described, emphasizing the creation of an efficient and sustainable organizational infrastructure that supports a shared curriculum provided through a community college/university partnership. Data and analysis describing progress and challenges related to supporting a shared curriculum and increasing access and affordability for nursing education across the state are presented. We identified four crucial attributes of maintaining a collaborative community that have been cultivated to ensure the consortium continues to make progress toward reaching the Institute of Medicine's Future of Nursing goals. The Oregon Consortium for Nursing Education provides important lessons for other statewide consortiums to consider when developing plans for sustainability.
NASA Astrophysics Data System (ADS)
Puchala, Brian; Tarcea, Glenn; Marquis, Emmanuelle A.; Hedstrom, Margaret; Jagadish, H. V.; Allison, John E.
2016-08-01
Accelerating the pace of materials discovery and development requires new approaches and means of collaborating and sharing information. To address this need, we are developing the Materials Commons, a collaboration platform and information repository for use by the structural materials community. The Materials Commons has been designed to be a continuous, seamless part of the scientific workflow process. Researchers upload the results of experiments and computations as they are performed, automatically where possible, along with the provenance information describing the experimental and computational processes. The Materials Commons website provides an easy-to-use interface for uploading and downloading data and data provenance, as well as for searching and sharing data. This paper provides an overview of the Materials Commons. Concepts are also outlined for integrating the Materials Commons with the broader Materials Information Infrastructure that is evolving to support the Materials Genome Initiative.
Optimizing CMS build infrastructure via Apache Mesos
NASA Astrophysics Data System (ADS)
Abdurachmanov, David; Degano, Alessandro; Elmer, Peter; Eulisse, Giulio; Mendez, David; Muzaffar, Shahzad
2015-12-01
The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general-use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. We present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos-enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.
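Writing a full Mesos framework is well beyond an abstract, but the core idea the entry relies on - matching resource offers from a shared pool against queued CI jobs instead of pinning jobs to static machines - can be shown with a toy scheduler. This is a conceptual sketch in plain Python, not the Mesos API, and the job and offer figures are invented.

```python
# Toy illustration of offer-based scheduling: CI jobs launch only when a
# node offers enough spare CPU/memory, so the pool is shared dynamically
# rather than statically partitioned. Not the actual Mesos API.
from collections import deque

pending = deque([{"name": "build-release", "cpus": 8, "mem": 16},
                 {"name": "unit-tests", "cpus": 2, "mem": 4},
                 {"name": "static-analysis", "cpus": 4, "mem": 8}])

offers = [{"node": "node-1", "cpus": 10, "mem": 20},
          {"node": "node-2", "cpus": 4, "mem": 8}]

for offer in offers:                        # consume each resource offer
    launched = []
    for job in list(pending):               # greedy first-fit placement
        if job["cpus"] <= offer["cpus"] and job["mem"] <= offer["mem"]:
            offer["cpus"] -= job["cpus"]    # reserve resources from offer
            offer["mem"] -= job["mem"]
            pending.remove(job)
            launched.append(job["name"])
    print(offer["node"], "->", launched)
```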
Building Stronger State Partnerships with the US Department of Energy (Energy Assurance)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mike Keogh
2011-09-30
From 2007 until 2011, the National Association of Regulatory Utility Commissioners (NARUC) engaged in a partnership with the National Energy Technology Lab (NETL) to improve State-Federal coordination on electricity policy and energy assurance issues. This project allowed State Public Utility Commissioners and their staffs to engage on the most cutting-edge level in the arenas of energy assurance and electricity policy. Four tasks were outlined in the Statement of Performance Objectives: Task 1 - Training for Commissions on Critical Infrastructure Topics; Task 2 - Analyze and Implement Recommendations on Energy Assurance Issues; Task 3 - Ongoing liaison activities and outreach to build stronger networks between federal agencies and state regulators; and Task 4 - Additional Activities. Although four tasks were prescribed, in practice these tasks were carried out under two major activity areas: the critical infrastructure and energy assurance partnership with the US Department of Energy's Infrastructure Security and Emergency Response office, and the National Council on Electricity Policy, a collaborative which since 1994 has brought together State and Federal policymakers to address the most pressing issues facing the grid, from restructuring to smart grid implementation. On critical infrastructure protection, this cooperative agreement helped State officials achieve several important advances. The lead role on NARUC's side was played by our Committee on Critical Infrastructure Protection. Key lessons learned in this arena include the following: (1) Tabletops and exercises work - They improve the capacity of policymakers and their industry counterparts to face the most challenging energy emergencies, and thereby equip these actors with the capacity to face everything up to that point as well. (2) Information sharing is critical - Connecting people who need information with people who have information is a key success factor. However, exposure of critical infrastructure information to bad actors also creates new vulnerabilities. (3) Tensions exist between the transparency-driven basis of regulatory activity and the information-protection requirements of asset protection. (4) Coordination between states is a key success factor - Because comparatively little federal authority exists over electricity and other energy infrastructure, the interstate nature of these energy grids defies centralized command-and-control governance. Patchwork responses are a risk when issues are addressed state by state. Coordination is the key to ensuring consistent response to shared threats. In electricity policy, the National Council on Electricity Policy continued to make important strides forward. Coordinated electricity policy among States remains the best surrogate for an absent national electricity policy. In every area from energy efficiency to clean coal, State policies are driving the country's electricity policy, and regional responses to climate change, infrastructure planning, market operation, and new technology deployment depend on a forum for bringing the States together.
Fuller, Daniel; Gauvin, Lise; Kestens, Yan
2013-02-01
Few studies have examined potential disparities in access to transportation infrastructures, an important determinant of population health. To examine individual- and area-level disparities in access to the road network, public transportation system, and a public bicycle share program in Montreal, Canada, 6,495 adult respondents (mean age, 48.7 years; 59.0% female) nested in 33 areas were included in a multilevel analysis examining associations between sociodemographic variables and access to these infrastructures. Individuals with lower incomes lived significantly closer to public transportation and the bicycle share program. At the area level, the interaction between low-education and low-income neighborhoods showed that these areas were significantly closer to public transportation and the bicycle share program, controlling for individual and urbanicity variables. More deprived areas of the Island of Montreal have better access to transportation infrastructure than less-deprived areas.
Health Information Infrastructure: Flows and Frictions
ERIC Educational Resources Information Center
Chung, Dahee
2017-01-01
The healthcare environment is becoming increasingly dependent on health information technology, with providers, patients, payers, and other players producing and sharing information to improve healthcare delivery. This, in turn, has brought the issue of Health Information Infrastructure (HII) to the forefront of policy, design, and law. While…
Genomes in the cloud: balancing privacy rights and the public good.
Ohno-Machado, Lucila; Farcas, Claudiu; Kim, Jihoon; Wang, Shuang; Jiang, Xiaoqian
2013-01-01
The NIH-funded iDASH National Center for Biomedical Computing was created in 2010 with the goal of developing infrastructure, algorithms, and tools to integrate Data for Analysis, 'anonymization,' and SHaring. iDASH is based on the premise that, while a strong case can be made for not sharing information in order to preserve individual privacy, an equally compelling case should be made for sharing genome information for the public good (i.e., to support new discoveries that promote health or alleviate the burden of disease). In fact, these cases do not need to be mutually exclusive: genome data sharing on a cloud does not necessarily have to compromise individual privacy, although current practices need significant improvement. So far, protection of subject data from re-identification and misuse has relied primarily on regulations such as HIPAA, the Common Rule, and GINA. However, protection of biometrics such as a genome requires specialized infrastructure and tools.
Challenges and Driving Forces for Business Plans in Biobanking.
Macheiner, Tanja; Huppertz, Berthold; Bayer, Michaela; Sargsyan, Karine
2017-04-01
Due to the increased utilization of biospecimens for research and the emergence of new technologies, the availability and quality of biospecimens and their collection are coming more and more into focus. However, the long-term economic situation of biobanks is still mostly unclear. Moreover, the common sustainable utilization of various international biobanks is challenging due to local differences in sample processing, law, and ethics. This article discusses possible strategies to achieve a sustainable utilization of biospecimens as part of the business plan of biobanks. The following questions were addressed as part of a business plan: (1) How can a biobank build up and maintain an up-to-date infrastructure? (2) What kind of funding can support the sustainability of a biobank? (3) Is there an international solution for informed consents to enable sample and data sharing? (4) How can a biobank react during economically unstable periods? (5) Which kind of biobanking research is innovative? (6) What kind of education could be most needed for knowledge transfer in biobanking? (7) Does an expiration date for a biobank make sense according to the period of funding? A strategy for optimal utilization begins with the sharing of resources, infrastructure, and investments at the planning stage of a biobank, and continues through the transfer of knowledge and know-how by education. For clinical biobanks in particular, a long-term funding and cost-recovery strategy is necessary for sustainable utilization.
A One Health Evaluation of the Southern African Centre for Infectious Disease Surveillance
Hanin, Marie C. E.; Queenan, Kevin; Savic, Sara; Karimuribo, Esron; Rüegg, Simon R.; Häsler, Barbara
2018-01-01
Rooted in the recognition that emerging infectious diseases occur at the interface of human, animal, and ecosystem health, the Southern African Centre for Infectious Disease Surveillance (SACIDS) initiative aims to promote a trans-sectoral approach to better address infectious disease risk management in five countries of the Southern African Development Community. Nine years after SACIDS' inception, this study aimed to evaluate the program by applying a One Health (OH) evaluation framework developed by the Network for Evaluation of One Health (NEOH). The evaluation included a description of the context and the initiative, illustration of the theory of change, identification of outputs and outcomes, and assessment of the One Healthness. The latter is the sum of characteristics that defines an integrated approach and includes OH thinking, OH planning, OH working, sharing infrastructure, learning infrastructure, and systemic organization. The protocols made available by NEOH were used to develop data collection protocols and identify the study design. The framework relies on a mixed-methods approach, combining a descriptive and qualitative assessment with a semi-quantitative evaluation (scoring). Data for the analysis were gathered in a document review, in group and individual interviews, and in an online survey. Operational aspects (i.e., OH thinking, planning, and working) were found to be balanced overall, with the highest score in the planning dimension, whereas the infrastructure scores (learning infrastructure, systemic organization, and sharing infrastructure) were high for the first two dimensions but low for sharing. The OH index calculated was 0.359, and the OH ratio calculated was 1.495. The program was praised for its great innovative energy in a difficult landscape dominated by poor infrastructure; its ability to create awareness of OH and enthuse people about the concept; and its training and networking activities. Shortcomings were identified regarding the balance of contributions, funds and activities across member countries in the South, lack of data sharing, unequal allocation of resources, top-down management structures, and limited horizontal collaboration. Despite these challenges, SACIDS is perceived to be an effective agent in tackling infectious diseases in an integrated manner. PMID:29616227
Code of Federal Regulations, 2013 CFR
2013-01-01
... also its vulnerabilities to emerging threats. Cyber incidents can have devastating consequences on both... against cyber risks, comprehensive legislation remains essential to improving infrastructure security, enhancing cyber information sharing between government and the private sector, and protecting the privacy...
Inaugural Genomics Automation Congress and the coming deluge of sequencing data.
Creighton, Chad J
2010-10-01
Presentations at Select Biosciences' first 'Genomics Automation Congress' (Boston, MA, USA) in 2010 focused on next-generation sequencing and the platforms and methodology around them. The meeting provided an overview of sequencing technologies, both new and emerging. Speakers shared their recent work on applying sequencing to profile cells at various levels of biomolecular complexity, including DNA sequence, DNA copy number, DNA methylation, mRNA and microRNA. With sequencing time and costs continuing to drop dramatically, a virtual explosion of very large sequencing datasets is at hand, which will probably present challenges and opportunities for high-level data analysis and interpretation, as well as for information technology infrastructure.
Privacy-preserving photo sharing based on a public key infrastructure
NASA Astrophysics Data System (ADS)
Yuan, Lin; McNally, David; Küpçü, Alptekin; Ebrahimi, Touradj
2015-09-01
A significant number of pictures are posted to social media sites or exchanged through instant messaging and cloud-based sharing services. Most social media services offer a range of access control mechanisms to protect users' privacy. As it is not in the best interest of many such services for their users to restrict access to their shared pictures, most services keep users' photos unprotected, which makes them available to all insiders. This paper presents an architecture for privacy-preserving photo sharing based on an image scrambling scheme and a public key infrastructure. Secure JPEG scrambling is applied to protect regional visual information in photos. Protected images remain compatible with JPEG coding and can therefore be viewed by anyone on any device. However, only those who are granted secret keys will be able to descramble the photos and view their original versions. The proposed architecture applies attribute-based encryption along with conventional public key cryptography to achieve secure transmission of secret keys and fine-grained control over who may view shared photos. In addition, we demonstrate the practical feasibility of the proposed photo sharing architecture with a prototype mobile application, ProShare, built on the iOS platform.
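The scheme pairs a symmetric scrambling key for each protected photo region with public-key distribution of that key. The sketch below shows this hybrid pattern using the Python cryptography package, substituting RSA-OAEP where the paper uses attribute-based encryption (for which no comparable standard library exists), and reducing region extraction plus JPEG scrambling to an AES-GCM encryption of the region bytes.

```python
# Hybrid sketch: a random symmetric key protects the photo region; the
# recipient's public key protects that symmetric key. RSA-OAEP stands in
# for the paper's attribute-based encryption - an assumption made here.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

recipient = rsa.generate_private_key(public_exponent=65537, key_size=2048)

region = b"stand-in bytes for a sensitive JPEG region"
region_key, nonce = os.urandom(32), os.urandom(12)
scrambled = AESGCM(region_key).encrypt(nonce, region, None)

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped = recipient.public_key().encrypt(region_key, oaep)

# Only a holder of the private key can unwrap the key and descramble:
unwrapped = recipient.decrypt(wrapped, oaep)
assert AESGCM(unwrapped).decrypt(nonce, scrambled, None) == region
```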
DOT National Transportation Integrated Search
2011-09-01
As a result of a federal requirement, all non-federal entities that own or operate critical infrastructure are required to develop Continuity of Operations/Continuity of Government (COOP/COG) Plans. Transportation is a critical infrastructure com...
Enhancing infrastructure resilience through business continuity planning.
Fisher, Ronald; Norman, Michael; Klett, Mary
2017-01-01
Critical infrastructure is crucial to the functionality and wellbeing of the world around us. It is a complex network that works together to create an efficient society. The core components of critical infrastructure are dependent on one another to function at their full potential. Organisations face unprecedented environmental risks such as increased reliance on information technology and telecommunications, increased infrastructure interdependencies and globalisation. Successful organisations should integrate the components of cyber-physical and infrastructure interdependencies into a holistic risk framework. Physical security plans, cyber security plans and business continuity plans can help mitigate environmental risks. Cyber security plans are becoming the most crucial to have, yet are the least commonly found in organisations. As the reliance on cyber continues to grow, it is imperative that organisations update their business continuity and emergency preparedness activities to include this.
eComLab: remote laboratory platform
NASA Astrophysics Data System (ADS)
Pontual, Murillo; Melkonyan, Arsen; Gampe, Andreas; Huang, Grant; Akopian, David
2011-06-01
Hands-on experiments with electronic devices have been recognized as an important element in the field of engineering, helping students get familiar with theoretical concepts and practical tasks. Continually increasing student numbers, costly laboratory equipment, and laboratory maintenance slow down physical lab efficiency. As information technology continues to evolve, the Internet has become a common medium in modern education. An Internet-based remote laboratory can overcome many of these restrictions, providing hands-on training that is flexible in time and allows the same equipment to be shared between different students. This article describes an ongoing remote hands-on experimental radio modulation, network and mobile applications lab project, "eComLab". Its main component is a remote laboratory infrastructure and server management system featuring various online media familiar to modern students, such as chat rooms and video streaming.
A genome-wide association study platform built on iPlant cyber-infrastructure
USDA-ARS's Scientific Manuscript database
We demonstrated a flexible Genome-Wide Association (GWA) Study (GWAS) platform built upon the iPlant Collaborative Cyber-infrastructure. The platform supports big data management, sharing, and large scale study of both genotype and phenotype data on clusters. End users can add their own analysis too...
DOE Office of Scientific and Technical Information (OSTI.GOV)
2013-04-04
Spindle is software infrastructure that solves file system scalability problems associated with starting dynamically linked applications in HPC environments. When an HPC application starts up thousands of processes at once, and those processes simultaneously access a shared file system to look for shared libraries, it can cause significant performance problems for both the application and other users. Spindle scalably coordinates the distribution of shared libraries to an application to avoid hammering the shared file system.
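The contention problem Spindle targets - thousands of processes simultaneously reading the same shared libraries at startup - is commonly avoided by reading once and broadcasting. The mpi4py sketch below shows that coordination pattern in miniature; it illustrates the idea only and is not Spindle's actual mechanism, and the library path is a placeholder.

```python
# Read-once, broadcast-to-all: rank 0 touches the shared file system a
# single time and distributes the bytes, so N ranks do not issue N reads.
# A conceptual analogue of Spindle's approach, not its implementation.
from mpi4py import MPI

comm = MPI.COMM_WORLD

if comm.Get_rank() == 0:
    with open("/shared/fs/libexample.so", "rb") as fh:  # placeholder path
        payload = fh.read()                 # the only read of the shared FS
else:
    payload = None

payload = comm.bcast(payload, root=0)       # tree broadcast over the network
# Each rank can now cache the bytes locally (e.g., on a ramdisk) and load
# the cached copy instead of hitting the shared file system.
print(comm.Get_rank(), len(payload))
```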
NASA Astrophysics Data System (ADS)
Allison, M. L.; Gurney, R. J.
2015-12-01
An e-infrastructure that supports data-intensive, multidisciplinary research is needed to accelerate the pace of science to address 21st century global change challenges. Data discovery, access, sharing and interoperability collectively form core elements of an emerging shared vision of e-infrastructure for scientific discovery. The pace and breadth of change in information management across the data lifecycle means that no one country or institution can unilaterally provide the leadership and resources required to use data and information effectively, or needed to support a coordinated, global e-infrastructure. An 18-month process involving ~120 experts in domain, computer, and social sciences from more than a dozen countries resulted in a formal set of recommendations to the Belmont Forum collaboration of national science funding agencies and others on what they are best suited to implement for development of an e-infrastructure in support of global change research, including: adoption of data principles that promote a global, interoperable e-infrastructure; establishment of information and data officers for coordination of global data management and e-infrastructure efforts; promotion of effective data planning; determination of best practices; and development of a cross-disciplinary training curriculum on data management and curation. The Belmont Forum is ideally poised to play a vital and transformative leadership role in establishing a sustained human and technical international data e-infrastructure to support global change research. The international collaborative process that went into forming these recommendations is helping national governments, funding agencies and international bodies work together to execute them.
Advanced e-Infrastructures for Civil Protection applications: the CYCLOPS Project
NASA Astrophysics Data System (ADS)
Mazzetti, P.; Nativi, S.; Verlato, M.; Ayral, P. A.; Fiorucci, P.; Pina, A.; Oliveira, J.; Sorani, R.
2009-04-01
During the full cycle of emergency management, Civil Protection operative procedures involve many actors belonging to several institutions (civil protection agencies, public administrations, research centers, etc.) playing different roles (decision-makers, data and service providers, emergency squads, etc.). In this context the sharing of information is a vital requirement for making correct and effective decisions. A European-wide technological infrastructure providing distributed and coordinated access to different kinds of resources (data, information, services, expertise, etc.) could therefore enhance existing Civil Protection applications and even enable new ones. Such a European Civil Protection e-Infrastructure should be designed taking into account the specific requirements of Civil Protection applications and the state of the art in the scientific and technological disciplines that could make emergency management more effective. In recent years Grid technologies have reached a mature state, providing a platform for secure and coordinated resource sharing between participants gathered in so-called Virtual Organizations. Moreover, Earth and Space Sciences Informatics provides the conceptual tools for modeling the geospatial information shared in Civil Protection applications during its entire lifecycle. A European Civil Protection e-Infrastructure might therefore be based on a Grid platform enhanced with Earth Sciences services. In the context of the 6th Framework Programme, the EU co-funded project CYCLOPS (CYber-infrastructure for CiviL protection Operative ProcedureS), which ended in December 2008, addressed the problem of defining the requirements and identifying the research strategies and innovation guidelines for an advanced e-Infrastructure for Civil Protection. Starting from the requirements analysis, CYCLOPS proposed an architectural framework for a European Civil Protection e-Infrastructure. This architectural framework was evaluated through the development of prototypes of two operative applications used by the Italian Civil Protection for Wild Fire Risk Assessment (RISICO) and by the French Civil Protection for Flash Flood Risk Management (SPC-GD). The results of these studies and proofs of concept were used as the basis for the definition of research and innovation strategies aimed at the detailed design and implementation of the infrastructure. In particular, the main research themes and topics to be addressed have been identified and detailed. Finally, the obstacles to the innovation required for the adoption of this infrastructure, and possible strategies to overcome them, are discussed.
Gilbert, Jack A; Dick, Gregory J; Jenkins, Bethany; Heidelberg, John; Allen, Eric; Mackey, Katherine R M; DeLong, Edward F
2014-06-15
The National Science Foundation's EarthCube End User Workshop was held at the USC Wrigley Marine Science Center on Catalina Island, California in August 2013. The workshop was designed to explore and characterize the needs and tools available to the community focusing on microbial and physical oceanography research, with a particular emphasis on 'omic research. The assembled researchers outlined existing concerns regarding the vast data resources being generated and how we will deal with these resources as their volume and diversity increase. Particular attention was focused on the tools for handling and analyzing the existing data, on the need for the construction and curation of diverse federated databases, and on the development of shared, interoperable, "big-data capable" analytical tools. The key outputs from this workshop include (i) critical scientific challenges and cyberinfrastructure constraints, (ii) the current and future ocean 'omics science grand challenges and questions, and (iii) the data management, analytical, and associated cyberinfrastructure capabilities required to meet critical current and future scientific challenges. The main thrust of the meeting, and the outcome of this report, is a definition of the 'omics tools, technologies and infrastructures that facilitate continued advances in ocean biology, marine biogeochemistry, and biological oceanography.
NASA Astrophysics Data System (ADS)
Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.
2015-12-01
Capacity expansion on resource networks is essential to adapting to economic and population growth and pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling we have created a system in which infrastructure modeling across various domains is facilitated. We introduce Hydra Platform, Free Open Source Software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer to which remote applications, called Apps, connect. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This client-server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardized description of the data structure required for storing network data such that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain; it is the Apps that create domain-specific functionality. Using Apps, researchers from different domains can incorporate different models within the same network, enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based Apps in other languages for remote functionality. Partner CH2M is developing a commercial user interface for Hydra Platform; however, custom interfaces and visualization tools can also be built. Hydra Platform is available on GitHub, while Apps will be shared on a central repository.
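Hydra Platform stores abstract node-link networks behind a web-service layer. To make that concrete, the sketch below posts a tiny two-node network to a JSON endpoint with Python's requests; the server URL, route, and field names are invented for illustration and are not Hydra Platform's published interface.

```python
# Hypothetical client interaction: store a two-node, one-link water
# network on a Hydra-style server. Endpoint and schema are assumptions,
# not the documented Hydra Platform API.
import requests

SERVER = "https://hydra.example.org/json"    # hypothetical service URL

network = {
    "name": "demo-water-system",
    "nodes": [{"id": 1, "name": "reservoir", "x": 0.0, "y": 0.0},
              {"id": 2, "name": "city-demand", "x": 5.0, "y": 1.0}],
    "links": [{"id": 10, "name": "aqueduct", "node_1": 1, "node_2": 2}],
}

resp = requests.post(SERVER, json={"add_network": {"net": network}},
                     timeout=30)
resp.raise_for_status()
print(resp.json())   # e.g. server-assigned IDs for the stored network
```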
assistance to qualified E85 or dual E15 and biodiesel retailers. Cost-share grants are available to upgrade or install new E85 or dual E15 and biodiesel infrastructure. Three-year cost-share grants are available for up to 50% of the total project cost, up to $30,000, and five-year cost-share
Educating English Language Learners: Opportunities for Improved Infrastructure. PERC Research Brief
ERIC Educational Resources Information Center
Rowland, Jeannette; Reumann-Moore, Rebecca; Hughes, Rosemary; Lin, Joshua
2016-01-01
Academic success for ELLs depends on high quality instruction and the infrastructure needed to support it (e.g., staff, curricular materials, collaboration, professional development). This brief examines the challenges schools face in these areas and the strategies they use to mediate them. The purpose of this brief is to share these strategies…
Data issues in the life sciences.
Thessen, Anne E; Patterson, David J
2011-01-01
We review technical and sociological issues facing the Life Sciences as they transform into more data-centric disciplines - the "Big New Biology". Three major challenges are: 1) lack of comprehensive standards; 2) lack of incentives for individual scientists to share data; 3) lack of appropriate infrastructure and support. Technological advances with standards, bandwidth, distributed computing, exemplar successes, and a strong presence in the emerging world of Linked Open Data are sufficient to conclude that technical issues will be overcome in the foreseeable future. While the community is motivated to have a shared open infrastructure and data pool, and is pressured by funding agencies to move in this direction, the sociological issues determine progress. Major sociological issues include our lack of understanding of the heterogeneous data cultures within the Life Sciences, and the impediments to progress include a lack of incentives to build appropriate infrastructures into projects and institutions or to encourage scientists to make data openly available.
Condition Assessment Modeling for Distribution Systems Using Shared Frailty Analysis
Condition Assessment (CA) modeling is drawing increasing interest as a methodology for managing drinking water infrastructure. This paper develops a Cox Proportional Hazard (PH)/shared frailty model and applies it to the problem of investment in the repair and replacement of dri...
The Importance of Biodiversity E-infrastructures for Megadiverse Countries
Canhos, Dora A. L.; Sousa-Baena, Mariane S.; de Souza, Sidnei; Maia, Leonor C.; Stehmann, João R.; Canhos, Vanderlei P.; De Giovanni, Renato; Bonacelli, Maria B. M.; Los, Wouter; Peterson, A. Townsend
2015-01-01
Addressing the challenges of biodiversity conservation and sustainable development requires global cooperation, support structures, and new governance models to integrate diverse initiatives and achieve massive, open exchange of data, tools, and technology. The traditional paradigm of sharing scientific knowledge through publications is not sufficient to meet contemporary demands that require not only the results but also data, knowledge, and skills to analyze the data. E-infrastructures are key in facilitating access to data and providing the framework for collaboration. Here we discuss the importance of e-infrastructures of public interest and the lack of long-term funding policies. We present the example of Brazil’s speciesLink network, an e-infrastructure that provides free and open access to biodiversity primary data and associated tools. SpeciesLink currently integrates 382 datasets from 135 national institutions and 13 institutions from abroad, openly sharing ~7.4 million records, 94% of which are associated to voucher specimens. Just as important as the data is the network of data providers and users. In 2014, more than 95% of its users were from Brazil, demonstrating the importance of local e-infrastructures in enabling and promoting local use of biodiversity data and knowledge. From the outset, speciesLink has been sustained through project-based funding, normally public grants for 2–4-year periods. In between projects, there are short-term crises in trying to keep the system operational, a fact that has also been observed in global biodiversity portals, as well as in social and physical sciences platforms and even in computing services portals. In the last decade, the open access movement propelled the development of many web platforms for sharing data. Adequate policies unfortunately did not follow the same tempo, and now many initiatives may perish. PMID:26204382
ERIC Educational Resources Information Center
Pereira, Francis; And Others
This survey was designed to elicit the perceptions of the members of the educational community on four issues concerning the NII (National Information Infrastructure), and to test whether these visions of the NII were shared by educators. The issues were: (1) the benefits of the NII to the education sector and specifically whether the NII will be…
Software and hardware infrastructure for research in electrophysiology
Mouček, Roman; Ježek, Petr; Vařeka, Lukáš; Řondík, Tomáš; Brůha, Petr; Papež, Václav; Mautner, Pavel; Novotný, Jiří; Prokop, Tomáš; Štěbeták, Jan
2014-01-01
As in other areas of experimental science, the operation of an electrophysiological laboratory, the design and performance of electrophysiological experiments, the collection, storage and sharing of experimental data and metadata, the analysis and interpretation of these data, and the publication of results are time-consuming activities. If these activities are well organized and supported by a suitable infrastructure, the work efficiency of researchers increases significantly. This article deals with the main concepts, design, and development of software and hardware infrastructure for research in electrophysiology. The described infrastructure has been primarily developed for the needs of the neuroinformatics laboratory at the University of West Bohemia, the Czech Republic. However, from the beginning it has also been designed and developed to be open and applicable in laboratories that do similar research. After introducing the laboratory and the overall architectural concept, the individual parts of the infrastructure are described. The central element of the software infrastructure is a web-based portal that enables community researchers to store, share, download and search data and metadata from electrophysiological experiments. The data model, domain ontology and usage of semantic web languages and technologies are described. The current data publication policy used in the portal is briefly introduced, and the registration of the portal within the Neuroscience Information Framework is described. Then the methods used for processing electrophysiological signals are presented. The specific modifications of these methods introduced by laboratory researchers are summarized, and the methods are organized into a laboratory workflow. Other parts of the software infrastructure include mobile and offline solutions for data/metadata storage and a hardware stimulator communicating with an EEG amplifier and recording software. PMID:24639646
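As one concrete example of the signal-processing methods such a workflow organizes, the fragment below band-pass filters a raw EEG channel with SciPy; the sampling rate, pass band, and synthetic signal are generic illustrative choices, not the laboratory's actual settings or pipeline.

```python
# Illustrative EEG preprocessing step: a zero-phase band-pass filter on
# one channel. All parameters are generic examples, not the lab's own.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)                 # ten seconds of synthetic data
raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

b, a = butter(4, [1.0, 40.0], btype="bandpass", fs=fs)  # keep 1-40 Hz
clean = filtfilt(b, a, raw)                  # forward-backward, zero phase

print(clean.shape)                           # filtered channel, same length
```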
Dot-gov: market failure and the creation of a national health information technology system.
Kleinke, J D
2005-01-01
The U.S. health care marketplace's continuing failure to adopt information technology (IT) is the result of economic problems unique to health care, business strategy problems typical of fragmented industries, and technology standardization problems common to infrastructure development in free-market economies. Given the information intensity of medicine, the quality problems associated with inadequate IT, the magnitude of U.S. health spending, and the large federal share of that spending, this market failure requires aggressive governmental intervention. Federal policies to compel the creation of a national health IT system would reduce aggregate health care costs and improve quality, goals that cannot be attained in the health care marketplace.
UNH Data Cooperative: A Cyber Infrastructure for Earth System Studies
NASA Astrophysics Data System (ADS)
Braswell, B. H.; Fekete, B. M.; Prusevich, A.; Gliden, S.; Magill, A.; Vorosmarty, C. J.
2007-12-01
Earth system scientists and managers have a continuously growing demand for a wide array of earth observations derived from various data sources including (a) modern satellite retrievals, (b) "in-situ" records, (c) various simulation outputs, and (d) assimilated data products combining model results with observational records. The sheer quantity of data and formatting inconsistencies make it difficult for users to take full advantage of this important information resource, so the system could benefit from a thorough retooling of our current data processing procedures and infrastructure. Emerging technologies like OPeNDAP and OGC map services, open standard data formats (NetCDF, HDF), and data cataloging systems (NASA ECHO, Global Change Master Directory, etc.) are providing the basis for a new approach to data management and processing, where web services are increasingly designed to serve computer-to-computer communications without human interaction and complex analyses can be carried out over distributed computer resources interconnected via cyberinfrastructure. The UNH Earth System Data Collaborative is designed to utilize these emerging web technologies to offer new means of access to earth system data. While the UNH Data Collaborative serves a wide array of data, ranging from weather station data (Climate Portal) to ocean buoy records and ship tracks (Portsmouth Harbor Initiative) to land cover characteristics, the underlying data architecture shares common components for data mining and data dissemination via web services. Perhaps the most unique element of the UNH Data Cooperative's IT infrastructure is its prototype modeling environment for regional ecosystem surveillance over the Northeast corridor, which allows the integration of complex earth system model components with the Cooperative's data services. While the complexity of the IT infrastructure needed to perform complex computations is continuously increasing, scientists are often forced to spend a considerable amount of time solving basic data management and preprocessing tasks and dealing with low-level computational design problems such as parallelization of model codes. Our modeling infrastructure is designed to take care of the bulk of the common tasks found in complex earth system models, such as I/O handling, computational domain and time management, and parallel execution of modeling tasks. It allows scientists to focus on the numerical implementation of the physical processes on single computational objects (typically grid cells) while the framework takes care of preprocessing input data, establishing the data exchange between computational objects, and executing the science code. In our presentation, we discuss the key concepts of our modeling infrastructure, demonstrate its integration with data services offered by the UNH Earth System Data Collaborative via web interfaces, and lay out the road map to turn our prototype modeling environment into a truly community framework for a wide range of earth system scientists and environmental managers.
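Several of the services named in this entry (OPeNDAP, NetCDF) are directly usable from analysis code, which is what makes the computer-to-computer access it describes practical. The snippet below opens a remote dataset over OPeNDAP with xarray; the URL and variable name are placeholders for whatever endpoints a portal like this would expose.

```python
# Sketch of computer-to-computer access over OPeNDAP: xarray reads a
# remote NetCDF-style dataset lazily, with no manual download step.
# The URL and variable name are hypothetical placeholders.
import xarray as xr

URL = "https://data.example.edu/opendap/climate/air_temp.nc"

ds = xr.open_dataset(URL)                    # only metadata is fetched here
subset = ds["air_temperature"].sel(time="2007-12")  # values stream on access
print(float(subset.mean()))                  # mean over the selected slice
```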
State of emergency preparedness for US health insurance plans.
Merchant, Raina M; Finne, Kristen; Lardy, Barbara; Veselovskiy, German; Korba, Casey; Margolis, Gregg S; Lurie, Nicole
2015-01-01
Health insurance plans serve a critical role in public health emergencies, yet little has been published about their collective emergency preparedness practices and policies. We evaluated, on a national scale, the state of health insurance plans' emergency preparedness and policies through a survey of health insurance plans. We queried members of America's Health Insurance Plans, the national trade association representing the health insurance industry, about four aspects of emergency preparedness: infrastructure, adaptability, connectedness, and best practices. Of 137 health insurance plans queried, 63% responded, representing 190.6 million members and 81% of US plan enrollment. All respondents had emergency plans for business continuity, and most (85%) had infrastructure for emergency teams. Some health plans also had established benchmarks for preparedness (eg, response time). Regarding adaptability, 85% had protocols to extend claim filing time and 71% could temporarily suspend prior medical authorization rules. Regarding connectedness, many plans shared their contingency plans with health officials but often cited challenges in identifying regulatory agency contacts. Some health insurance plans had specific policies for assisting individuals dependent on durable medical equipment or home healthcare. Many plans (60%) expressed interest in sharing best practices. Health insurance plans are prioritizing emergency preparedness. We identified 6 policy modifications that health insurance plans could undertake to potentially improve healthcare system preparedness: establishing metrics and benchmarks for emergency preparedness; identifying disaster-specific policy modifications; enhancing stakeholder connectedness; considering digital strategies to enhance communication; improving support and access for special-needs individuals; and developing regular forums for knowledge exchange about emergency preparedness.
Optimizing CMS build infrastructure via Apache Mesos
Abdurachmanov, David; Degano, Alessandro; Elmer, Peter; ...
2015-12-23
The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general-use open-source code. A critical ingredient in the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. Lastly, we present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos-enabled cluster and how this resulted in better resource usage, higher peak performance, and lower latency thanks to the dynamic scheduling capabilities of Mesos.
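The reported gains from dynamic scheduling can be illustrated with a toy queueing comparison. This sketch is neither CMS's configuration nor the Mesos API; it is an invented two-team simulation suggesting why a dynamically shared pool keeps job backlogs (a latency proxy) lower than a static partition.

```python
# Toy comparison: statically partitioned build nodes vs. a dynamically
# shared pool, echoing the motivation for moving CI onto Mesos.
import random

random.seed(1)
NODES, STEPS = 10, 1000

def simulate(shared):
    backlog = [0, 0]                        # queued jobs for two CI "teams"
    waiting = 0
    for _ in range(STEPS):
        for t in range(2):
            backlog[t] += random.choice([0, 0, 0, 8])  # bursty arrivals
        if shared:
            capacity = NODES                # one pool, allocated on demand
            for t in range(2):
                run = min(backlog[t], capacity)
                backlog[t] -= run
                capacity -= run
        else:
            for t in range(2):              # fixed half-pool per team
                backlog[t] -= min(backlog[t], NODES // 2)
        waiting += sum(backlog)
    return waiting / STEPS                  # mean queued jobs per step

print("static partition, mean backlog:", simulate(shared=False))
print("shared pool, mean backlog:    ", simulate(shared=True))
```

With bursty arrivals, the shared pool absorbs one team's spike using the other team's idle nodes, so its mean backlog should come out lower.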
Secure Infrastructure-Less Network (SINET)
2017-06-01
Military leaders and first responders desire the familiarity of commercial-off-the-shelf lightweight mobile devices while...since they lack reliable or secure communication infrastructure. Routine and simple mobile information-sharing tasks become a challenge over the...
Water scarcity and urban forest management: introduction
E. Gregory McPherson; Robert Prince
2013-01-01
Between 1997 and 2009 a serious drought affected much of Australia. Whether reasoned or unintentional, water policy decisions closed the tap, turning much of the urban forest's lifeline into a trickle. Green infrastructure became brown infrastructure, exposing its standing as a low priority relative to other consumptive sources. To share new solutions to water scarcity...
Ohmann, Christian; Banzi, Rita; Canham, Steve; Battaglia, Serena; Matei, Mihaela; Ariyo, Christopher; Becnel, Lauren; Bierer, Barbara; Bowers, Sarion; Clivio, Luca; Dias, Monica; Druml, Christiane; Faure, Hélène; Fenner, Martin; Galvez, Jose; Ghersi, Davina; Gluud, Christian; Houston, Paul; Karam, Ghassan; Kalra, Dipak; Krleža-Jerić, Karmela; Kubiak, Christine; Kuchinke, Wolfgang; Kush, Rebecca; Lukkarinen, Ari; Marques, Pedro Silverio; Newbigging, Andrew; O’Callaghan, Jennifer; Ravaud, Philippe; Schlünder, Irene; Shanahan, Daniel; Sitter, Helmut; Spalding, Dylan; Tudur-Smith, Catrin; van Reusel, Peter; van Veen, Evert-Ben; Visser, Gerben Rienk; Wilson, Julia; Demotes-Mainard, Jacques
2017-01-01
Objectives: We examined major issues associated with sharing of individual clinical trial data and developed a consensus document on providing access to individual participant data from clinical trials, using a broad interdisciplinary approach. Design and methods: This was a consensus-building process among the members of a multistakeholder task force, involving a wide range of experts (researchers, patient representatives, methodologists, information technology experts, and representatives from funders, infrastructures and standards development organisations). An independent facilitator supported the process using the nominal group technique. The consensus was reached in a series of three workshops held over 1 year, supported by exchange of documents and teleconferences within focused subgroups when needed. This work was set within the Horizon 2020-funded project CORBEL (Coordinated Research Infrastructures Building Enduring Life-science Services) and coordinated by the European Clinical Research Infrastructure Network. Thus, the focus was on non-commercial trials and the perspective mainly European. Outcome: We developed principles and practical recommendations on how to share data from clinical trials. Results: The task force reached consensus on 10 principles and 50 recommendations, representing the fundamental requirements of any framework used for the sharing of clinical trials data. The document covers the following main areas: making data sharing a reality (eg, cultural change, academic incentives, funding), consent for data sharing, protection of trial participants (eg, de-identification), data standards, rights, types and management of access (eg, data request and access models), data management and repositories, discoverability, and metadata. Conclusions: The adoption of the recommendations in this document would help to promote and support data sharing and reuse among researchers, adequately inform trial participants and protect their rights, and provide effective and efficient systems for preparing, storing and accessing data. The recommendations now need to be implemented and tested in practice. Further work needs to be done to integrate these proposals with those from other geographical areas and other academic domains. PMID:29247106
40 CFR 52.1991 - Section 110(a)(2) infrastructure requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Oregon § 52.1991 Section 110(a)(2) infrastructure requirements. On September 25, 2008, Oregon Department of Environmental...
Shared Web Information Systems for Heritage in Scotland and Wales - Flexibility in Partnership
NASA Astrophysics Data System (ADS)
Thomas, D.; McKeague, P.
2013-07-01
The Royal Commissions on the Ancient and Historical Monuments of Scotland and Wales were established in 1908 to investigate and record the archaeological and built heritage of their respective countries. The organisations have grown organically over the succeeding century, steadily developing their inventories and collections as card and paper indexes. Computerisation followed in the late 1980s and early 1990s, with RCAHMS releasing Canmore, an online searchable database, in 1998. Following a review of service provision in Wales, RCAHMW entered into partnership with RCAHMS in 2003 to deliver a database for their national inventories and collections. The resultant partnership enables both organisations to develop at their own pace whilst delivering efficiencies through a common experience and a shared IT infrastructure. Through innovative solutions the partnership has also delivered benefits to the wider historic environment community, providing online portals to a range of datasets, ultimately raising public awareness and appreciation of the heritage around them. Now celebrating its 10th year, Shared Web Information Systems for Heritage, or more simply SWISH, continues to underpin the work of both organisations in presenting information about the historic environment to the public.
New Generation Sensor Web Enablement
Bröring, Arne; Echterhoff, Johannes; Jirka, Simon; Simonis, Ingo; Everding, Thomas; Stasch, Christoph; Liang, Steve; Lemmens, Rob
2011-01-01
Many sensor networks have been deployed to monitor Earth’s environment, and more will follow in the future. Environmental sensors have improved continuously by becoming smaller, cheaper, and more intelligent. Due to the large number of sensor manufacturers and differing accompanying protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent and uniform way. The concept of the Sensor Web reflects such a kind of infrastructure for sharing, finding, and accessing sensors and their data across different applications. It hides the heterogeneous sensor hardware and communication protocols from the applications built on top of it. The Sensor Web Enablement initiative of the Open Geospatial Consortium standardizes web service interfaces and data encodings which can be used as building blocks for a Sensor Web. This article illustrates and analyzes the recent developments of the new generation of the Sensor Web Enablement specification framework. Further, we relate the Sensor Web to other emerging concepts such as the Web of Things and point out challenges and resulting future work topics for research on Sensor Web Enablement. PMID:22163760
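For flavor, retrieving observations from an SWE Sensor Observation Service (SOS) reduces to a standardized HTTP request. The sketch below uses the SOS 2.0 key-value-pair binding; the endpoint, offering, and observed property are placeholders, not a real service.

```python
# Hedged sketch: an OGC SOS 2.0 GetObservation request via KVP parameters.
import requests

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "http://example.org/offering/station-42",             # hypothetical
    "observedProperty": "http://example.org/property/air_temperature",
    "responseFormat": "http://www.opengis.net/om/2.0",
}
resp = requests.get("http://sos.example.org/service", params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:500])  # O&M-encoded observations (XML)
```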
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Changzheng; Oak Ridge National Lab.; Lin, Zhenhong
2016-12-08
Plug-in electric vehicles (PEVs) are widely regarded as an important component of the technology portfolio designed to accomplish policy goals in sustainability and energy security. However, the market acceptance of PEVs in the future remains largely uncertain from today's perspective. By integrating a consumer choice model based on nested multinomial logit with Monte Carlo simulation, this study analyzes the uncertainty of PEV market penetration. Results suggest that the future market for PEVs is highly uncertain and there is a substantial risk of low penetration in the early and midterm market. Top factors contributing to market share variability are price sensitivities, energy cost, range limitation, and charging availability. The results also illustrate the potential effect of public policies in promoting PEVs through investment in battery technology and infrastructure deployment. Here, continued improvement of battery technologies and deployment of charging infrastructure alone do not necessarily reduce the spread of market share distributions, but may shift distributions toward the right, i.e., increase the probability of having great market success.
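The study's approach can be sketched in miniature: draw uncertain behavioral parameters, push each draw through a discrete-choice share model, and read off the spread of outcomes. The sketch below collapses the nested multinomial logit to a binary logit for brevity, and every coefficient and distribution is invented.

```python
# Hedged sketch: Monte Carlo propagation of parameter uncertainty through
# a binary logit market-share model (the study used a nested MNL).
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # Monte Carlo draws

# Uncertain inputs (assumed distributions, not estimates from the paper)
b_price   = rng.normal(0.10, 0.02, N)  # disutility per $1000 of purchase price
b_energy  = rng.normal(0.30, 0.10, N)  # disutility per unit of energy cost
range_pen = rng.normal(-1.0, 0.5, N)   # range/charging-availability penalty

# Utility difference, PEV minus conventional vehicle (toy attribute values)
dv = (-b_price * 32 - b_energy * 1.0 + range_pen) - (-b_price * 25 - b_energy * 3.0)
share = 1.0 / (1.0 + np.exp(-dv))      # logit probability of choosing the PEV

print(f"median PEV share: {np.median(share):.1%}")
print(f"90% interval: [{np.percentile(share, 5):.1%}, {np.percentile(share, 95):.1%}]")
```

The width of the resulting interval is the kind of "spread of market share distributions" the abstract refers to; policy levers shift the whole distribution rather than narrow it.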
Lairmore, Michael D.; Oglesbee, Michael; Weisbrode, Steve E.; Wellman, Maxey; Rosol, Thomas; Stromberg, Paul
2011-01-01
Recent reports project a deficiency of veterinary pathologists, indicating a need to train highly qualified veterinary pathologists, particularly in academic veterinary medicine. The need to provide high-quality research training for veterinary pathologists has been recognized by the veterinary pathology training program of the Ohio State University (OSU) since its inception. The OSU program incorporates elements of both residency training and graduate education into a unified program. This review illustrates the components and structure of the training program and reflects on future challenges in training veterinary pathologists. Key elements of the OSU program include an experienced faculty, dedicated staff, and high-quality students who have a sense of common mission. The program is supported through cultural and infrastructure support. Financial compensation, limited research funding, and attractive work environments, including work–life balance, will undoubtedly continue to be forces in the marketplace for veterinary pathologists. To remain competitive and to expand the ability to train veterinary pathologists with research skills, programs must support strong faculty members, provide appropriate infrastructure support, and seek active partnerships with private industry to expand program opportunities. Shortages of trained faculty may be partially resolved by regional cooperation to share faculty expertise or through the use of communications technology to bridge distances between programs. To foster continued interest in academic careers, training programs will need to continue to evolve and respond to trainees' needs while maintaining strong allegiances to high-quality pathology training. Work–life balance, collegial environments that foster a culture of respect for veterinary pathology, and continued efforts to reach out to veterinary students to provide opportunities to learn about the diverse careers offered in veterinary pathology will pay long-term dividends for the future of the profession. PMID:18287474
Perspectives on Sharing Models and Related Resources in Computational Biomechanics Research.
Erdemir, Ahmet; Hunter, Peter J; Holzapfel, Gerhard A; Loew, Leslie M; Middleton, John; Jacobs, Christopher R; Nithiarasu, Perumal; Löhner, Rainald; Wei, Guowei; Winkelstein, Beth A; Barocas, Victor H; Guilak, Farshid; Ku, Joy P; Hicks, Jennifer L; Delp, Scott L; Sacks, Michael; Weiss, Jeffrey A; Ateshian, Gerard A; Maas, Steve A; McCulloch, Andrew D; Peng, Grace C Y
2018-02-01
The role of computational modeling for biomechanics research and related clinical care will be increasingly prominent. The biomechanics community has been developing computational models routinely for exploration of the mechanics and mechanobiology of diverse biological structures. As a result, a large array of models, data, and discipline-specific simulation software has emerged to support endeavors in computational biomechanics. Sharing computational models and related data and simulation software was at first a utilitarian interest; now it is a necessity. Exchange of models, in support of knowledge exchange provided by scholarly publishing, has important implications. Specifically, model sharing can facilitate assessment of reproducibility in computational biomechanics and can provide an opportunity for repurposing and reuse, and a venue for medical training. The community's desire to investigate biological and biomechanical phenomena crossing multiple systems, scales, and physical domains also motivates sharing of modeling resources, as blending models developed by domain experts will be a required step for comprehensive simulation studies as well as for enhancing their rigor and reproducibility. The goal of this paper is to understand current perspectives in the biomechanics community on the sharing of computational models and related resources. Opinions on opportunities, challenges, and pathways to model sharing, particularly as part of the scholarly publishing workflow, were sought. A group of journal editors and a handful of investigators active in computational biomechanics were approached to collect short opinion pieces as part of a larger effort of the IEEE EMBS Computational Biology and the Physiome Technical Committee to address model reproducibility through publications. A synthesis of these opinion pieces indicates that the community recognizes the necessity and usefulness of model sharing. There is a strong will to facilitate model sharing, and there are corresponding initiatives by the scientific journals. Outside the publishing enterprise, infrastructure to facilitate model sharing in biomechanics exists, and simulation software developers are interested in accommodating the community's needs for sharing of modeling resources. Encouragement for the use of standardized markups, concerns related to quality assurance, acknowledgement of increased burden, and the importance of stewardship of resources are noted. In the short term, it is advisable that the community build upon recent strategies and experiment with new pathways for continued demonstration of model sharing, its promotion, and its utility. Nonetheless, the need for a long-term strategy to unify approaches to sharing computational models and related resources is acknowledged. Development of a sustainable platform supported by a culture of open model sharing will likely evolve through continued and inclusive discussions bringing all stakeholders to the table, e.g., by possibly establishing a consortium.
Integrating Computing Resources: A Shared Distributed Architecture for Academics and Administrators.
ERIC Educational Resources Information Center
Beltrametti, Monica; English, Will
1994-01-01
Development and implementation of a shared distributed computing architecture at the University of Alberta (Canada) are described. Aspects discussed include design of the architecture, users' views of the electronic environment, technical and managerial challenges, and the campuswide human infrastructures needed to manage such an integrated…
Port Infrastructure: Financing of Navigation Projects at Small and Medium-Sized Ports
DOT National Transportation Integrated Search
2000-03-01
Under the Water Resources Development Act of 1986, all public ports have had to share in the cost of navigation projects with the Corps of Engineers by paying the nonfederal share of the project's cost, which ranges from 20-60 percent depending on the...
Development and Classroom Implementation of an Environmental Data Creation and Sharing Tool
ERIC Educational Resources Information Center
Brogan, Daniel S.; McDonald, Walter M.; Lohani, Vinod K.; Dymond, Randel L.; Bradner, Aaron J.
2016-01-01
Education is essential for solving the complex water-related challenges facing society. The Learning Enhanced Watershed Assessment System (LEWAS) and the Online Watershed Learning System (OWLS) provide data creation and data sharing infrastructures, respectively, that combine to form an environmental learning tool. This system collects, integrates…
The International Symposium on Grids and Clouds
NASA Astrophysics Data System (ADS)
The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 marks the tenth anniversary of ISGC, which over the last decade has tracked the convergence, collaboration and innovation of individual researchers across the Asia Pacific region into a coherent community. With the continuous support and dedication of the delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. The emergence of large-scale research infrastructures and instruments that produce a torrent of electronic data is forcing a generational change in the scientific process and the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and the production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure that connects the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics, Biomedicine & Life Sciences, Earth Science, Environmental Changes and Natural Disaster Mitigation, Humanities & Social Sciences, Operations & Management, Middleware & Interoperability, Security and Networking, Infrastructure Clouds & Virtualisation, Business Models & Sustainability, Data Management, Distributed Volunteer & Desktop Grid Computing, High Throughput Computing, and High Performance, Manycore & GPU Computing.
[Continuity of nutritional care at discharge in the era of ICT].
Martínez Olmos, Miguel Ángel
2015-05-07
Telemedicine represents the union of information technology and telecommunication services in health. It allows the improvement of health care, especially in underserved areas, by bringing professionals continuing education and by improving patient care at home. The application of telemedicine in various hospital complexes, clinics and health centers has helped to provide a better service, within the parameters of efficiency, effectiveness and cost-benefit, with increasing satisfaction of medical staff and patients. The development and application of various types of telemedicine, the technological development of audio, text, video and data, and the constant improvement of telecommunications infrastructure have favored the expansion and development of telemedicine in various medical specialties. The use of electronic health records by different health professionals can have a positive impact on the care provided to patients. This should also be supported by the development of better health policies, legal security and greater awareness among health professionals and patients regarding the potential benefits. Regarding clinical activity in Nutrition, new technologies also provide an opportunity to improve various educational, preventive, diagnostic and treatment aspects, including shared follow-up between Nutrition Units and Primary Care Teams for patients who need nutritional care at home, with shared protocols, teleconsultation where required, and avoidance of unnecessary travel to hospital.
Brokering Capabilities for EarthCube - supporting Multi-disciplinary Earth Science Research
NASA Astrophysics Data System (ADS)
Jodha Khalsa, Siri; Pearlman, Jay; Nativi, Stefano; Browdy, Steve; Parsons, Mark; Duerr, Ruth; Pearlman, Francoise
2013-04-01
The goal of NSF's EarthCube is to create a sustainable infrastructure that enables the sharing of all geosciences data, information, and knowledge in an open, transparent and inclusive manner. Brokering of data and improvements in discovery and access are a key to data exchange and promotion of collaboration across the geosciences. In this presentation we describe an evolutionary process of infrastructure and interoperability development focused on participation of existing science research infrastructures and augmenting them for improved access. All geosciences communities already have, to a greater or lesser degree, elements of an information infrastructure in place. These elements include resources such as data archives, catalogs, and portals as well as vocabularies, data models, protocols, best practices and other community conventions. What is necessary now is a process for leveraging these diverse infrastructure elements into an overall infrastructure that provides easy discovery, access and utilization of resources across disciplinary boundaries. Brokers connect disparate systems with only minimal burdens upon those systems, and enable the infrastructure to adjust to new technical developments and scientific requirements as they emerge. Robust cyberinfrastructure will arise only when social, organizational, and cultural issues are resolved in tandem with the creation of technology-based services. This is a governance issue, but is facilitated by infrastructure capabilities that can impact the uptake of new interdisciplinary collaborations and exchange. Thus brokering must address both the cyberinfrastructure and computer technology requirements and also the social issues to allow improved cross-domain collaborations. This is best done through use-case-driven requirements and agile, iterative development methods. It is important to start by solving real (not hypothetical) information access and use problems via small pilot projects that develop capabilities targeted to specific communities. Brokering, as a critical capability for connecting systems, evolves over time through more connections and increased functionality. This adaptive process allows for continual evaluation as to how well science-driven use cases are being met. There is a near-term, and possibly unique, opportunity through EarthCube and European e-Infrastructure projects to increase the impact and interconnectivity of projects. In the developments described in this presentation, brokering has been demonstrated to be an essential part of a robust, adaptive technical infrastructure, and demonstration and user scenarios can address both the governance and detailed implementation paths forward. The EarthCube Brokering roadmap proposes the expansion of brokering pilots into fully operational prototypes that work with the broader science and informatics communities to answer these questions, connect existing and emerging systems, and evolve the EarthCube infrastructure.
Payne, Philip R.O.
2014-01-01
Ongoing transformation relative to the funding climate for healthcare research programs housed in academic and non-profit research organizations has led to a new (or renewed) emphasis on the pursuit of non-traditional sustainability models. This need is often particularly acute in the context of data management and sharing infrastructure that is developed under the auspices of such research initiatives. One option for achieving sustainability of such data management and sharing infrastructure is the pursuit of technology licensing and commercialization, in an effort to establish public-private or equivalent partnerships that sustain and even expand upon the development and dissemination of research-oriented data management and sharing technologies. However, the critical success factors for technology licensing and commercialization efforts are often unknown to individuals outside of the private sector, thus making this type of endeavor challenging to investigators in academic and non-profit settings. In response to such a gap in knowledge, this article will review a number of generalizable lessons learned from an effort undertaken at The Ohio State University to commercialize a prototypical research-oriented data management and sharing infrastructure, known as the Translational Research Informatics and Data Management (TRIAD) Grid. It is important to note that the specific emphasis of these lessons learned is on the early stages of moving a technology from the research setting into a private-sector entity and as such are particularly relevant to academic investigators interested in pursuing such activities. PMID:25848609
Distributed Data Networks That Support Public Health Information Needs.
Tabano, David C; Cole, Elizabeth; Holve, Erin; Davidson, Arthur J
Data networks, consisting of pooled electronic health data assets from health care providers serving different patient populations, promote data sharing, population and disease monitoring, and methods to assess interventions. Better understanding of data networks, and their capacity to support public health objectives, will help foster partnerships, expand resources, and grow learning health systems. We conducted semistructured interviews with 16 key informants across the United States, identified as network stakeholders based on their respective experience in advancing health information technology and network functionality. Key informants were asked about their experience with and infrastructure used to develop data networks, including each network's utility to identify and characterize populations, usage, and sustainability. Among 11 identified data networks representing hundreds of thousands of patients, key informants described aggregated health care clinical data contributing to population health measures. Key informant interview responses were thematically grouped to illustrate how networks support public health, including (1) infrastructure and information sharing; (2) population health measures; and (3) network sustainability. Collaboration between clinical data networks and public health entities presents an opportunity to leverage infrastructure investments to support public health. Data networks can provide resources to enhance population health information and infrastructure.
EUFAR the key portal and network for airborne research in Europe
NASA Astrophysics Data System (ADS)
Gérard, Elisabeth; Brown, Philip
2017-04-01
Created in 2000 and supported by the EU Framework Programmes since then as an Integrating Activities project, EUFAR (European Facility of Airborne Research in environmental and Geo-sciences) was born out of the necessity to create a central network and access point for the airborne research community in Europe. With the aim of supporting researchers by granting them access to aircraft and instrumentation most suited to their needs across Europe, not accessible in their home countries, EUFAR also provides technical support and training in the field of airborne research for the environmental and geosciences, and enables the sharing of expertise and harmonisation of research practices. Today, EUFAR2 (2014-2018) coordinates and facilitates transnational access to 19 instrumented aircraft and 5 remote-sensing instruments through the 14 operators who are part of EUFAR's current 24-partner European consortium. In addition, the current project supports networking and joint research activities focused on providing an enabling environment for and to promote airborne research. Examples of some of these recent activities will be shown. EUFAR is currently seeking to establish itself as an AISBL (international non-profit association) to ensure its existence and operations beyond January 2018, when our present EC funding comes to an end. The objectives of the EUFAR AISBL will include continuing to develop the integration of the research aircraft community in Europe and also its links with other environmental research infrastructures, such as the community of research infrastructures under the umbrella of ENVRIplus. Another objective will be to continue to broaden access to research facilities beyond that supported solely by national funding streams so that EUFAR better approaches the status of a European open research infrastructure. Together with the implementation of an Open Access scheme by means of resource-sharing envisaged in late 2017, such a sustainable structure will contribute substantially toward broadening the user base of existing airborne research facilities in Europe and mobilising additional resources to this end. EUFAR AISBL will be the most appropriate organisation for the (i) coordination of joint activities among the European institutions involved in airborne research, and also (ii) coordination of projects funded by the European Commission or other bodies for supporting activities beyond the self-financing perimeter of the AISBL (transnational access projects, education and training events, joint research activities, etc.). This will confirm EUFAR's position as the key portal for airborne research in Europe. This central position opens the way for further collaboration with other communities (UAS, etc.) and environmental research infrastructures (IAGOS, ACTRIS, ENVRIplus, EUROFLEETS, etc.) to ensure the mutual benefit of joint efforts in addressing future science challenges in a multi-disciplinary approach to the study of the Earth system.
Are We Ready for Mass Fatality Incidents? Preparedness of the US Mass Fatality Infrastructure.
Merrill, Jacqueline A; Orr, Mark; Chen, Daniel Y; Zhi, Qi; Gershon, Robyn R
2016-02-01
To assess the preparedness of the US mass fatality infrastructure, we developed and tested metrics for 3 components of preparedness: organizational, operational, and resource sharing networks. In 2014, data were collected from 5 response sectors: medical examiners and coroners, the death care industry, health departments, faith-based organizations, and offices of emergency management. Scores were calculated within and across sectors and a weighted score was developed for the infrastructure. A total of 879 respondents reported highly variable organizational capabilities: 15% had responded to a mass fatality incident (MFI); 42% reported staff trained for an MFI, but only 27% for an MFI involving hazardous contaminants. Respondents estimated that 75% of their staff would be willing and able to respond, but only 53% if contaminants were involved. Most perceived their organization as somewhat prepared, but 13% indicated "not at all." Operational capability scores ranged from 33% (death care industry) to 77% (offices of emergency management). Network capability analysis found that only 42% of possible reciprocal relationships between resource-sharing partners were present. The cross-sector composite score was 51%; that is, half the key capabilities for preparedness were in place. The sectors in the US mass fatality infrastructure report suboptimal capability to respond. National leadership is needed to ensure sector-specific and infrastructure-wide preparedness for a large-scale MFI.
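The network-capability figure (42% of possible reciprocal relationships) is a straightforward computation on a directed resource-sharing graph. A small sketch with invented sector-to-sector edges:

```python
# Hedged sketch: share of partner pairs whose resource-sharing ties are
# reciprocal (mutual). Edges are invented, not the study's data.
edges = {("ME", "EM"), ("EM", "ME"),   # medical examiners <-> emergency mgmt
         ("HD", "EM"),                 # health dept -> emergency mgmt (one-way)
         ("FB", "HD"), ("HD", "FB"),   # faith-based <-> health dept
         ("DC", "ME")}                 # death care -> medical examiners

pairs = {frozenset(e) for e in edges}                   # pairs with any tie
mutual = sum((b, a) in edges for (a, b) in edges) // 2  # mutual pairs, counted once
print(f"reciprocal share: {mutual / len(pairs):.0%}")   # 2 of 4 pairs -> 50%
```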
ERIC Educational Resources Information Center
Inverness Research, 2016
2016-01-01
In facilities throughout the United States and abroad, communities of scientists share infrastructure, instrumentation, and equipment to conduct scientific research. In these large facilities--laboratories, accelerators, telescope arrays, and research vessels--scientists are researching key questions that have the potential to make a significant…
Integrating TRENCADIS components in gLite to share DICOM medical images and structured reports.
Blanquer, Ignacio; Hernández, Vicente; Salavert, José; Segrelles, Damià
2010-01-01
The problem of sharing medical information among different centres has been tackled by many projects. Several of them target the specific problem of sharing DICOM images and structured reports (DICOM-SR), such as the TRENCADIS project. In this paper we propose sharing and organizing DICOM data and DICOM-SR metadata by benefiting from the existing deployed Grid infrastructures compliant with gLite, such as EGEE or the Spanish NGI. These infrastructures contribute a large amount of storage resources for creating knowledge databases and also provide metadata storage resources (such as AMGA) to semantically organize reports in a tree structure. First, in this paper, we present the extension of the TRENCADIS architecture to use gLite components (LFC, AMGA, SE) for the sake of increasing interoperability. Using the metadata from DICOM-SR, and maintaining its tree structure, enables federating different but compatible diagnostic structures and simplifies the definition of complex queries. This article describes how to do this in AMGA and shows an approach to efficiently code radiology reports to enable the multi-centre federation of data resources.
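The tree organization can be pictured with a toy structure. The sketch below is plain Python mimicking the directory-like metadata layout a catalogue service such as AMGA provides; node paths and attributes are invented, and this is not the AMGA client API.

```python
# Hedged sketch: tree-structured report metadata with subtree queries.
from dataclasses import dataclass, field

@dataclass
class Node:
    attrs: dict = field(default_factory=dict)
    children: dict = field(default_factory=dict)

    def put(self, path, **attrs):
        node = self
        for part in path.strip("/").split("/"):
            node = node.children.setdefault(part, Node())
        node.attrs.update(attrs)

    def select(self, path, pred):
        node = self
        for part in path.strip("/").split("/"):
            node = node.children[part]
        return [(name, c.attrs) for name, c in node.children.items() if pred(c.attrs)]

root = Node()
root.put("/hospitalA/radiology/report-001", birads=4, modality="MG")
root.put("/hospitalA/radiology/report-002", birads=2, modality="MG")

# A query over one subtree: suspicious findings only
print(root.select("/hospitalA/radiology", lambda a: a.get("birads", 0) >= 4))
```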
NASA Technical Reports Server (NTRS)
Zuniga, Allison; Turner, Mark; Rasky, Dan
2017-01-01
A new concept study was initiated to examine the framework needed to gradually develop an economical and sustainable lunar infrastructure using a public-private partnership approach. This approach would establish partnership agreements between NASA and industry teams to develop cis-lunar and surface capabilities for mutual benefit while sharing cost and risk in the development phase, and would then allow for transfer of operation of these infrastructure services back to their industry owners in the execution phase. These infrastructure services may include but are not limited to the following: lunar cargo transportation, power stations, energy storage devices, communication relay satellites, local communication towers, and surface mobility operations.
Infrastructure-Less Communication Platform for Off-The-Shelf Android Smartphones.
Oide, Takuma; Abe, Toru; Suganuma, Takuo
2018-03-04
As smartphones and other small portable devices become more sophisticated and popular, opportunities for communication and information sharing among such device users have increased. In particular, since it is known that infrastructure-less device-to-device (D2D) communication platforms consisting only of such devices are excellent in terms of, for example, bandwidth efficiency, efforts are being made to merge their information sharing capabilities with conventional infrastructure. However, efficient multi-hop communication is difficult with the D2D communication protocol, and many conventional D2D communication platforms require modifications of the protocol and terminal operating systems (OSs). In response to these issues, this paper reports on a proposed tree-structured D2D communication platform for Android devices that combines Wi-Fi Direct and Wi-Fi functions. The proposed platform, which is expected to be used with terminals equipped with the general Android 4.0 (or higher) OS, makes it possible to construct an ad hoc network instantaneously without sharing prior knowledge among participating devices. We will show the feasibility of our proposed platform through its design and demonstrate the implementation of a prototype using real devices. In addition, we will report on our investigation into communication delays and stability based on the number of hops and on terminal performance through confirmation experiments.
NASA Technical Reports Server (NTRS)
Falke, Stefan; Husar, Rudolf
2011-01-01
The goal of this REASoN applications and technology project is to deliver and use Earth Science Enterprise (ESE) data and tools in support of air quality management. Its scope falls within the domain of air quality management and aims to develop a federated air quality information sharing network that includes data from NASA, EPA, US states and others. Project goals were achieved through access to satellite and ground observation data, web-services information technology, interoperability standards, and air quality community collaboration. In contributing to a network of NASA ESE data in support of particulate air quality management, the project developed access to distributed data, built Web infrastructure, and created tools for data processing and analysis. The key technologies used in the project include emerging web services for developing self-describing and modular data access and processing tools, and a service-oriented architecture for chaining web services together to assemble customized air quality management applications. The technology and tools required for this project were developed within DataFed.net, a shared infrastructure that supports collaborative atmospheric data sharing and processing web services. Much of the collaboration was facilitated through community interactions within the Federation of Earth Science Information Partners (ESIP) Air Quality Workgroup. The main activities during the project that successfully advanced DataFed, enabled air quality applications, and established community-oriented infrastructures were: developing access to distributed data (surface and satellite); building Web infrastructure to support data access, processing, and analysis; creating tools for data processing and analysis; and fostering air quality community collaboration and interoperability.
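The service-chaining pattern amounts to feeding one web service's output into another. A hedged sketch with hypothetical endpoints standing in for the actual DataFed services:

```python
# Illustrative SOA chaining: data-access service -> processing service.
# Both endpoints and parameters are placeholders, not real DataFed URLs.
import requests

BASE = "http://datafed.example.net"  # hypothetical

def get_observations(dataset, bbox, day):
    r = requests.get(f"{BASE}/access",
                     params={"dataset": dataset, "bbox": bbox,
                             "date": day, "format": "csv"},
                     timeout=60)
    r.raise_for_status()
    return r.text

def spatial_average(csv_payload):
    r = requests.post(f"{BASE}/process/average", data=csv_payload,
                      headers={"Content-Type": "text/csv"}, timeout=60)
    r.raise_for_status()
    return float(r.text)

pm25 = get_observations("pm25_surface", "-125,24,-66,50", "2006-07-04")
print("mean PM2.5:", spatial_average(pm25))
```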
A Drupal-Based Collaborative Framework for Science Workflows
NASA Astrophysics Data System (ADS)
Pinheiro da Silva, P.; Gandara, A.
2010-12-01
Cyber-infrastructure combines technical infrastructure, organizational practices, and social norms to support scientific teams that work together, or depend on each other, to conduct scientific research. Such cyber-infrastructure enables the sharing of information and data so that scientists can leverage knowledge and expertise through automation. Scientific workflow systems have been used to build automated scientific systems used by scientists to conduct scientific research and, as a result, create artifacts in support of scientific discoveries. These complex systems are often developed by teams of scientists who are located in different places, e.g., scientists working in distinct buildings, and sometimes in different time zones, e.g., scientists working in distinct national laboratories. The sharing of these workflow specifications is currently supported by the use of version control systems such as CVS or Subversion. Discussions about the design, improvement, and testing of these specifications, however, often happen elsewhere, e.g., through the exchange of email messages and IM chats. Carrying on a discussion about these specifications is challenging because comments and specifications are not necessarily connected. For instance, a person reading a comment about a given workflow specification may not be able to see the workflow, and even if they can see the workflow, they may not know to which part of the workflow a given comment applies. In this paper, we discuss the design, implementation and use of CI-Server, a Drupal-based infrastructure, to support the collaboration of both local and distributed teams of scientists using scientific workflows. CI-Server has three primary goals: to enable information sharing by providing tools that scientists can use within their scientific research to process data, publish and share artifacts; to build community by providing tools that support discussions between scientists about artifacts used or created through scientific processes; and to leverage the knowledge collected within the artifacts and scientific collaborations to support scientific discoveries.
ERIC Educational Resources Information Center
Clifton, Jennifer; Loveridge, Jordan; Long, Elenore
2016-01-01
It is not typically the bent of infrastructure to be continually responsive in a way that is expansive and inclusive; instead, for newcomers or those with alternative histories, aims, vision, values, and perspectives, the inertia of infrastructure is more likely to be experienced as infrastructural breakdowns. We ask: "What might wisdom look…
NASA Astrophysics Data System (ADS)
Tarboton, D. G.; Idaszak, R.; Horsburgh, J. S.; Ames, D. P.; Goodall, J. L.; Band, L. E.; Merwade, V.; Couch, A.; Hooper, R. P.; Maidment, D. R.; Dash, P. K.; Stealey, M.; Yi, H.; Gan, T.; Castronova, A. M.; Miles, B.; Li, Z.; Morsy, M. M.; Crawley, S.; Ramirez, M.; Sadler, J.; Xue, Z.; Bandaragoda, C.
2016-12-01
How do you share and publish hydrologic data and models for a large collaborative project? HydroShare is a new, web-based system for sharing hydrologic data and models with specific functionality aimed at making collaboration easier. HydroShare has been developed with U.S. National Science Foundation support under the auspices of the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) to support the collaboration and community cyberinfrastructure needs of the hydrology research community. Within HydroShare, we have developed new functionality for creating datasets, describing them with metadata, and sharing them with collaborators. We cast hydrologic datasets and models as "social objects" that can be shared, collaborated around, annotated, published and discovered. In addition to data and model sharing, HydroShare supports web application programs (apps) that can act on data stored in HydroShare, just as software programs on your PC act on your data locally. This can free you from some of the limitations of local computing capacity and challenges in installing and maintaining software on your own PC. HydroShare's web-based cyberinfrastructure can take work off your desk or laptop computer and onto infrastructure or "cloud" based data and processing servers. This presentation will describe HydroShare's collaboration functionality that enables both public and private sharing with individual users and collaborative user groups, and makes it easier for collaborators to iterate on shared datasets and models, creating multiple versions along the way, and publishing them with a permanent landing page, metadata description, and citable Digital Object Identifier (DOI) when the work is complete. This presentation will also describe the web app architecture that supports interoperability with third party servers functioning as application engines for analysis and processing of big hydrologic datasets. While developed to support the cyberinfrastructure needs of the hydrology community, the informatics infrastructure for programmatic interoperability of web resources has a generality beyond the solution of hydrology problems that will be discussed.
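As an illustration of the programmatic interoperability described above, resource discovery against HydroShare's REST interface might look like the sketch below. The /hsapi/ path, query parameter, and JSON field names are the author's assumptions rather than verified documentation, so treat them as placeholders.

```python
# Hedged sketch: list HydroShare resources matching a subject keyword.
# Endpoint path and response fields are assumptions, not verified API docs.
import requests

resp = requests.get(
    "https://www.hydroshare.org/hsapi/resource/",
    params={"subject": "streamflow"},  # assumed filter parameter
    timeout=30,
)
resp.raise_for_status()
for res in resp.json().get("results", []):
    print(res.get("resource_title"), "->", res.get("resource_id"))
```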
Optimizing health information technology's role in enabling comparative effectiveness research.
Navathe, Amol S; Conway, Patrick H
2010-12-01
Health information technology (IT) is a key enabler of comparative effectiveness research (CER). Health IT standards for data sharing are essential to advancing the research data infrastructure, and health IT is critical to the next step of incorporating clinical data into data sources. Four key principles for advancement of CER are (1) utilization of data as a strategic asset, (2) leveraging public-private partnerships, (3) building robust, scalable technology platforms, and (4) coordination of activities across government agencies. To maximize the value of the resources, payers and providers must contribute data to initiatives, engage with government agencies on lessons learned, continue to develop new technologies that address key challenges, and utilize the data to improve patient outcomes and conduct research.
NASA Astrophysics Data System (ADS)
Wang, Jingbo; Bastrakova, Irina; Evans, Ben; Gohar, Kashif; Santana, Fabiana; Wyborn, Lesley
2015-04-01
National Computational Infrastructure (NCI) manages national environmental research data collections (10+ PB) as part of its specialized high-performance data node of the Research Data Storage Infrastructure (RDSI) program. We manage 40+ data collections using NCI's Data Management Plan (DMP), which is compatible with the ISO 19100 metadata standards. We utilize ISO standards to make sure our metadata is transferable and interoperable for sharing and harvesting. The DMP is used, along with metadata from the data itself, to create a hierarchy of data collection, dataset and time series catalogues that is then exposed through GeoNetwork for standard discoverability. The catalogues in this hierarchy are linked using parent-child relationships. The hierarchical infrastructure of our GeoNetwork catalogue system aims to address both discoverability and in-house administrative use-cases. At NCI, we are currently improving the metadata interoperability in our catalogue by linking with standardized community vocabulary services. These emerging vocabulary services are being established to help harmonise data from different national and international scientific communities. One such vocabulary service is currently being established by the Australian National Data Service (ANDS). Data citation is another important aspect of the NCI data infrastructure: it allows tracking of data usage and infrastructure investment, encourages data sharing, and increases trust in research that is reliant on these data collections. We incorporate the standard vocabularies into the data citation metadata so that citations become machine-readable and semantically friendly for web-search purposes as well. By standardizing our metadata structure across our entire data corpus, we are laying the foundation to enable the application of appropriate semantic mechanisms to enhance discovery and analysis of NCI's national environmental research data. We expect that this will further increase data discoverability and encourage data sharing and reuse within the community, increasing the value of the data well beyond its current use.
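The parent-child linkage between catalogue levels is carried in the ISO metadata itself. A minimal, deliberately incomplete sketch of an ISO 19139-style fragment with an invented identifier pair:

```python
# Hedged sketch: a bare ISO 19139-style record linking a dataset to its
# parent collection via parentIdentifier. Far from a complete record.
from xml.etree.ElementTree import Element, SubElement, tostring

GMD = "http://www.isotc211.org/2005/gmd"
GCO = "http://www.isotc211.org/2005/gco"

md = Element(f"{{{GMD}}}MD_Metadata")
ident = SubElement(SubElement(md, f"{{{GMD}}}fileIdentifier"),
                   f"{{{GCO}}}CharacterString")
ident.text = "dataset-uuid-1234"        # invented dataset identifier
parent = SubElement(SubElement(md, f"{{{GMD}}}parentIdentifier"),
                    f"{{{GCO}}}CharacterString")
parent.text = "collection-uuid-5678"    # invented parent collection identifier

print(tostring(md, encoding="unicode"))
```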
NASA Astrophysics Data System (ADS)
van Hemert, Jano; Vilotte, Jean-Pierre
2010-05-01
Research in earthquake seismology addresses fundamental problems in understanding Earth's internal wave sources and structures, and augments applications addressing societal concerns about natural hazards, energy resources and environmental change. This community is central to the European Plate Observing System (EPOS), the ESFRI initiative in solid Earth sciences. Global and regional seismology monitoring systems are continuously operated and are transmitting a growing wealth of data from Europe and from around the world. These tremendous volumes of seismograms, i.e., records of ground motion as a function of time, have a definite multi-use attribute, which puts a great premium on open-access data infrastructures that are integrated globally. In Europe, the earthquake and seismology community is part of the European Integrated Data Archives (EIDA) infrastructure and is structured as "horizontal" data services. On top of this distributed data archive system, the community has recently developed, within the EC project NERIES, advanced SOA-based web services and a unified portal system. Enabling advanced analysis of these data by utilising a data-aware distributed computing environment is instrumental to fully exploiting the cornucopia of data and to guaranteeing optimal operation of the high-cost monitoring facilities. The strategy of VERCE is driven by the needs of data-intensive applications in data mining and modelling and will be illustrated through a set of applications. It aims to provide a comprehensive architecture and framework adapted to the scale and the diversity of these applications, and to integrate the community data infrastructure with Grid and HPC infrastructures. A first novel aspect is a service-oriented architecture that provides well-equipped integrated workbenches, with an efficient communication layer between data and Grid infrastructures, augmented with bridges to the HPC facilities. A second novel aspect is the coupling between Grid data analysis and HPC data modelling applications through workflow and data sharing mechanisms. VERCE will develop important interactions with the European infrastructure initiatives in Grid and HPC computing. The VERCE team: CNRS-France (IPG Paris, LGIT Grenoble), UEDIN (UK), KNMI-ORFEUS (Holland), EMSC, INGV (Italy), LMU (Germany), ULIV (UK), BADW-LRZ (Germany), SCAI (Germany), CINECA (Italy)
An open, component-based information infrastructure for integrated health information networks.
Tsiknakis, Manolis; Katehakis, Dimitrios G; Orphanoudakis, Stelios C
2002-12-18
A fundamental requirement for achieving continuity of care is the seamless sharing of multimedia clinical information. Different technological approaches can be adopted for enabling the communication and sharing of health record segments. In the context of the emerging global information society, the creation of and access to the integrated electronic health record (I-EHR) of a citizen has been assigned high priority in many countries. This requirement is complementary to an overall requirement for the creation of a health information infrastructure (HII) to support the provision of a variety of health telematics and e-health services. In developing a regional or national HII, the components or building blocks that make up the overall information system ought to be defined and an appropriate component architecture specified. This paper discusses current international priorities and trends in developing the HII. It presents technological challenges and alternative approaches towards the creation of an I-EHR, being the aggregation of health data created during all interactions of an individual with the healthcare system. It also presents results from an ongoing Research and Development (R&D) effort towards the implementation of the HII in HYGEIAnet, the regional health information network of Crete, Greece, using a component-based software engineering approach. Critical design decisions and related trade-offs, involved in the process of component specification and development, are also discussed and the current state of development of an I-EHR service is presented. Finally, Human Computer Interaction (HCI) and security issues, which are important for the deployment and use of any I-EHR service, are considered.
Code of Federal Regulations, 2014 CFR
2014-01-01
... hereby ordered as follows: Section 1. Policy. Repeated cyber intrusions into critical infrastructure demonstrate the need for improved cybersecurity. The cyber threat to critical infrastructure continues to grow... resilience of the Nation's critical infrastructure and to maintain a cyber environment that encourages...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-22
... relation to market price and net asset value (``NAV'') per common share) and the relationship between the... relation to NAV per share). Applicants state that the Independent Directors also considered what conflicts... appropriate in the public interest and consistent with the protection of investors and the purposes fairly...
ERIC Educational Resources Information Center
Dede, Chris
2009-01-01
Greenhow, Robelia, and Hughes (2009) argue that Web 2.0 media are well suited to enhancing the education research community's purpose of generating and sharing knowledge. The author of this comment article first articulates how a research infrastructure with capabilities for communal bookmarking, photo and video sharing, social networking, wikis,…
NASA Astrophysics Data System (ADS)
Cornaglia, Bruno; Young, Gavin; Marchetta, Antonio
2015-12-01
Fixed broadband network deployments are moving inexorably to the use of Next Generation Access (NGA) technologies and architectures. These NGA deployments involve building fiber infrastructure increasingly closer to the customer in order to increase the proportion of fiber on the customer's access connection (Fibre-To-The-Home/Building/Door/Cabinet… i.e. FTTx). This increases the speed of services that can be sold and will be increasingly required to meet the demands of new generations of video services as we evolve from HDTV to "Ultra-HD TV" with 4K and 8K video resolution. However, building fiber access networks is a costly endeavor: it requires significant capital to achieve meaningful geographic coverage. Hence many companies are forming partnerships and joint ventures in order to share the NGA network construction costs. One form of such a partnership involves two companies agreeing to each build out a certain geographic area and then "cross-selling" NGA products to each other in order to reach customers within their partner's footprint (NGA coverage area). This is tantamount to a bilateral wholesale partnership. The concept of Fixed Access Network Sharing (FANS) is to address the possibility of sharing infrastructure with a high degree of flexibility for all network operators involved. By providing greater configuration control over the NGA network infrastructure, the service provider has a greater ability to define the network and hence to define their product capabilities at the active layer. This gives the service provider partners greater product development autonomy plus the ability to differentiate from each other at the active network layer.
Ohmann, Christian; Banzi, Rita; Canham, Steve; Battaglia, Serena; Matei, Mihaela; Ariyo, Christopher; Becnel, Lauren; Bierer, Barbara; Bowers, Sarion; Clivio, Luca; Dias, Monica; Druml, Christiane; Faure, Hélène; Fenner, Martin; Galvez, Jose; Ghersi, Davina; Gluud, Christian; Groves, Trish; Houston, Paul; Karam, Ghassan; Kalra, Dipak; Knowles, Rachel L; Krleža-Jerić, Karmela; Kubiak, Christine; Kuchinke, Wolfgang; Kush, Rebecca; Lukkarinen, Ari; Marques, Pedro Silverio; Newbigging, Andrew; O'Callaghan, Jennifer; Ravaud, Philippe; Schlünder, Irene; Shanahan, Daniel; Sitter, Helmut; Spalding, Dylan; Tudur-Smith, Catrin; van Reusel, Peter; van Veen, Evert-Ben; Visser, Gerben Rienk; Wilson, Julia; Demotes-Mainard, Jacques
2017-12-14
We examined major issues associated with sharing of individual clinical trial data and developed a consensus document on providing access to individual participant data from clinical trials, using a broad interdisciplinary approach. This was a consensus-building process among the members of a multistakeholder task force, involving a wide range of experts (researchers, patient representatives, methodologists, information technology experts, and representatives from funders, infrastructures and standards development organisations). An independent facilitator supported the process using the nominal group technique. The consensus was reached in a series of three workshops held over 1 year, supported by exchange of documents and teleconferences within focused subgroups when needed. This work was set within the Horizon 2020-funded project CORBEL (Coordinated Research Infrastructures Building Enduring Life-science Services) and coordinated by the European Clinical Research Infrastructure Network. Thus, the focus was on non-commercial trials and the perspective mainly European. We developed principles and practical recommendations on how to share data from clinical trials. The task force reached consensus on 10 principles and 50 recommendations, representing the fundamental requirements of any framework used for the sharing of clinical trials data. The document covers the following main areas: making data sharing a reality (eg, cultural change, academic incentives, funding), consent for data sharing, protection of trial participants (eg, de-identification), data standards, rights, types and management of access (eg, data request and access models), data management and repositories, discoverability, and metadata. The adoption of the recommendations in this document would help to promote and support data sharing and reuse among researchers, adequately inform trial participants and protect their rights, and provide effective and efficient systems for preparing, storing and accessing data. The recommendations now need to be implemented and tested in practice. Further work needs to be done to integrate these proposals with those from other geographical areas and other academic domains. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Prepaid group practices offer lessons in staffing ratios.
2004-05-01
Capitated physician organizations and prepaid group practices share many similarities in staffing, care processes and infrastructure. Use these benchmarks to help conduct physician workforce planning.
Approach to sustainable e-Infrastructures - The case of the Latin American Grid
NASA Astrophysics Data System (ADS)
Barbera, Roberto; Diacovo, Ramon; Brasileiro, Francisco; Carvalho, Diego; Dutra, Inês; Faerman, Marcio; Gavillet, Philippe; Hoeger, Herbert; Lopez Pourailly, Maria Jose; Marechal, Bernard; Garcia, Rafael Mayo; Neumann Ciuffo, Leandro; Ramos Pollan, Paul; Scardaci, Diego; Stanton, Michael
2010-05-01
The EELA (E-Infrastructure shared between Europe and Latin America) and EELA-2 (E-science grid facility for Europe and Latin America) projects, co-funded by the European Commission under FP6 and FP7, respectively, have been successful in building a high-capacity, production-quality, scalable Grid Facility for a wide spectrum of applications (e.g. Earth & Life Sciences, High Energy Physics, etc.) from several European and Latin American user communities. This paper presents the 4-year experience of EELA and EELA-2 in:
• providing each member institution the unique opportunity to benefit from a huge distributed computing platform for its research activities, in particular through initiatives such as OurGrid, which proposes so-called opportunistic Grid computing, well adapted to small and medium research laboratories such as most of those in Latin America and Africa;
• developing a realistic strategy to ensure the long-term continuity of the e-Infrastructure on the Latin American continent, beyond the term of the EELA-2 project, in association with CLARA and in collaboration with EGI.
Previous interactions between EELA and African Grid members at events such as IST Africa '07, '08 and '09, the International Conference on Open Access '08 and EuroAfriCa-ICT '08, to which EELA and EELA-2 contributed, have shown that the e-Infrastructure situation in Africa compares well with the Latin American one. This means that African Grids are likely to face the same problems that EELA and EELA-2 experienced, especially in getting the necessary user and decision-maker support to create NGIs and, later, a possible continent-wide African Grid Initiative (AGI). The hope is that the EELA-2 endeavour towards sustainability, as described in this presentation, could help the progress of African Grids.
Standing Naval Forces and Global Security
1993-06-04
standards and good engineering practices. The team submits a report to IPPC recommending that the project be accepted by NATO. 8. Audit. The...established. A system of common funds and trailing audits must be in effect to pay for the infrastructure. NATO infrastructure appears to be a good example to...Search And Rescue and maritime safety, monitor marine pollution, 6. sharing maritime intelligence. Commodore Bateman foresees coupling these activities or
A Combination Therapy of JO-I and Chemotherapy in Ovarian Cancer Models
2013-10-01
which consists of a 3PAR storage backend and is sharing data via a highly available NetApp storage gateway and 2 high-throughput commodity storage...Environment is configured as self-service Enterprise cloud and currently hosts more than 700 virtual machines. The network infrastructure consists of...technology infrastructure and information system applications designed to integrate, automate, and standardize operations. These systems fuse state of
On opportunities for shared use of the city's centralized heat and water supply infrastructure
NASA Astrophysics Data System (ADS)
Zamaleev, M. M.; Gubin, I. V.; Sharapov, V. I.
2017-11-01
It is shown that joint use of the engineering infrastructure for centralized heat and water supply will be a cost-efficient solution for the city's municipal utilities. A new technology for regulated heating of drinking water in the condensers of combined heat and power plant steam turbines is proposed, and the energy efficiency gained by applying it is calculated.
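The abstract reports an energy-efficiency calculation without reproducing it. As a rough illustration of the kind of heat-balance estimate involved, with assumed symbols and invented numbers (not the authors' model):

```latex
% Illustrative heat balance, not the paper's actual calculation.
% Heat absorbed by drinking water preheated in the turbine condenser:
\[
  Q_c = G_w \, c_p \, (t_{\mathrm{out}} - t_{\mathrm{in}})
\]
% With assumed values G_w = 100 kg/s, c_p = 4.19 kJ/(kg K) and a 15 K rise:
\[
  Q_c = 100 \times 4.19 \times 15 \approx 6.3\ \mathrm{MW}
\]
% i.e. roughly 6.3 MW of condenser heat that would otherwise be rejected
% to the environment is put to use in the water-supply system.
```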
A Spatial Data Infrastructure for Environmental Noise Data in Europe.
Abramic, Andrej; Kotsev, Alexander; Cetl, Vlado; Kephalopoulos, Stylianos; Paviotti, Marco
2017-07-06
Access to high quality data is essential in order to better understand the environmental and health impact of noise in an increasingly urbanised world. This paper analyses how recent developments of spatial data infrastructures in Europe can significantly improve the utilization of data and streamline reporting on a pan-European scale. The Infrastructure for Spatial Information in the European Community (INSPIRE) and the Environmental Noise Directive (END) described in this manuscript provide principles for data management that, once applied, would lead to a better understanding of the state of environmental noise. Furthermore, shared, harmonised and easily discoverable environmental spatial data, required by INSPIRE, would also support the data collection needed for the assessment and development of strategic noise maps. Action plans designed by the EU Member States to reduce noise and mitigate related effects can be shared with the public through already established nodes of the European spatial data infrastructure. Finally, data flows for reporting on the state of the environment and END implementation at the European level can benefit from applying a decentralised, service-oriented e-reporting infrastructure. This would allow reported data to be maintained, frequently updated and enable pooling of information from/to other relevant and interrelated domains such as air quality, transportation, human health, population, marine environment or biodiversity. We describe those processes and provide a use case in which noise data from two neighbouring European countries are mapped to common data specifications, defined by INSPIRE, thus ensuring interoperability and harmonisation.
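To make the harmonisation step concrete, a deliberately simplified sketch of mapping two countries' strategic-noise records onto one common structure follows; the field names, units and target schema are invented for illustration and are far simpler than the real INSPIRE application schemas:

```python
# Hypothetical schema harmonisation: two national noise records with
# different field names and units mapped onto one common structure.
from dataclasses import dataclass

@dataclass
class NoiseFeature:            # simplified stand-in for a common target schema
    source_country: str
    lden_db: float             # day-evening-night level, dB
    lnight_db: float           # night level, dB

def from_country_a(rec):
    return NoiseFeature("A", rec["Lden"], rec["Lnight"])

def from_country_b(rec):
    # country B (hypothetically) reports levels in tenths of dB
    return NoiseFeature("B", rec["lden_d10"] / 10.0, rec["lnight_d10"] / 10.0)

merged = [from_country_a({"Lden": 62.0, "Lnight": 53.0}),
          from_country_b({"lden_d10": 618, "lnight_d10": 524})]
print(merged)  # both records now share one schema, ready for a joint noise map
```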
Hasson, Uri; Skipper, Jeremy I; Wilde, Michael J; Nusbaum, Howard C; Small, Steven L
2008-01-15
The increasingly complex research questions addressed by neuroimaging research impose substantial demands on computational infrastructures. These infrastructures need to support management of massive amounts of data in a way that affords rapid and precise data analysis, to allow collaborative research, and to achieve these aims securely and with minimum management overhead. Here we present an approach that overcomes many current limitations in data analysis and data sharing. This approach is based on open source database management systems that support complex data queries as an integral part of data analysis, flexible data sharing, and parallel and distributed data processing using cluster computing and Grid computing resources. We assess the strengths of these approaches as compared to current frameworks based on storage of binary or text files. We then describe in detail the implementation of such a system and provide a concrete description of how it was used to enable a complex analysis of fMRI time series data.
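A minimal sketch of the "queries as analysis" idea described above, using SQLite and a toy schema rather than the authors' actual DBMS and data model:

```python
# Toy illustration: once time series live in a relational store, an analysis
# step (here, mean signal per region) becomes a declarative query.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE bold (subject TEXT, roi TEXT, t INTEGER, signal REAL)")
con.executemany("INSERT INTO bold VALUES (?, ?, ?, ?)",
                [("s01", "STG", t, 100.0 + 0.5 * t) for t in range(10)])

for roi, mean_sig in con.execute(
        "SELECT roi, AVG(signal) FROM bold WHERE subject = ? GROUP BY roi",
        ("s01",)):
    print(roi, round(mean_sig, 2))
```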
Standard development at the Human Variome Project.
Smith, Timothy D; Vihinen, Mauno
2015-01-01
The Human Variome Project (HVP) is a world organization working towards facilitating the collection, curation, interpretation and free and open sharing of genetic variation information. A key component of HVP activities is the development of standards and guidelines. HVP Standards are systems, procedures and technologies that the HVP Consortium has determined must be used by HVP-affiliated data sharing infrastructure and should be used by the broader community. HVP guidelines are considered to be beneficial for HVP affiliated data sharing infrastructure and the broader community to adopt. The HVP also maintains a process for assessing systems, processes and tools that implement HVP Standards and Guidelines. Recommended System Status is an accreditation process designed to encourage the adoption of HVP Standards and Guidelines. Here, we describe the HVP standards development process and discuss the accepted standards, guidelines and recommended systems as well as those under acceptance. Certain HVP Standards and Guidelines are already widely adopted by the community and there are committed users for the others. © The Author(s) 2015. Published by Oxford University Press.
Chervenak, Ann L; van Erp, Theo G M; Kesselman, Carl; D'Arcy, Mike; Sobell, Janet; Keator, David; Dahm, Lisa; Murry, Jim; Law, Meng; Hasso, Anton; Ames, Joseph; Macciardi, Fabio; Potkin, Steven G
2012-01-01
Progress in our understanding of brain disorders increasingly relies on the costly collection of large standardized brain magnetic resonance imaging (MRI) data sets. Moreover, the clinical interpretation of brain scans benefits from compare and contrast analyses of scans from patients with similar, and sometimes rare, demographic, diagnostic, and treatment status. A solution to both needs is to acquire standardized, research-ready clinical brain scans and to build the information technology infrastructure to share such scans, along with other pertinent information, across hospitals. This paper describes the design, deployment, and operation of a federated imaging system that captures and shares standardized, de-identified clinical brain images in a federation across multiple institutions. In addition to describing innovative aspects of the system architecture and our initial testing of the deployed infrastructure, we also describe the Standardized Imaging Protocol (SIP) developed for the project and our interactions with the Institutional Review Board (IRB) regarding handling patient data in the federated environment.
NASA Astrophysics Data System (ADS)
Allison, M. Lee; Davis, Rowena
2016-04-01
An e-infrastructure that supports data-intensive, multidisciplinary research is needed to accelerate the pace of science to address 21st century global change challenges. Data discovery, access, sharing and interoperability collectively form core elements of an emerging shared vision of e-infrastructure for scientific discovery. The pace and breadth of change in information management across the data lifecycle means that no one country or institution can unilaterally provide the leadership and resources required to use data and information effectively, or needed to support a coordinated, global e-infrastructure. An 18-month process involving ~120 experts in domain, computer, and social sciences from more than a dozen countries resulted in a formal set of recommendations, adopted in fall 2015 by the Belmont Forum collaboration of national science funding agencies and international bodies, on what they are best suited to implement for development of an e-infrastructure in support of global change research, including:
• adoption of data principles that promote a global, interoperable e-infrastructure and that can be enforced
• establishment of information and data officers for coordination of global data management and e-infrastructure efforts
• promotion of effective data planning and stewardship
• determination of international and community best practices for adoption
• development of a cross-disciplinary training curriculum on data management and curation
The implementation plan is being executed under four internationally coordinated Action Themes towards a globally organized, internationally relevant e-infrastructure and data management capability drawn from existing components, protocols, and standards. The Belmont Forum anticipates opportunities to fund additional projects to fill key gaps and to integrate best practices into an e-infrastructure that supports its programs but can also be scaled up and deployed more widely. Background: The Belmont Forum is a global consortium established in 2009 to build on the work of the International Group of Funding Agencies for Global Change Research toward furthering collaborative efforts to deliver knowledge needed for action to avoid and adapt to detrimental environmental change, including extreme hazardous events.
Patient-controlled sharing of medical imaging data across unaffiliated healthcare organizations
Ahn, David K; Unde, Bhagyashree; Gage, H Donald; Carr, J Jeffrey
2013-01-01
Background: Current image sharing is carried out by manual transportation of CDs by patients or organization-coordinated sharing networks. The former places a significant burden on patients and providers. The latter faces challenges to patient privacy.
Objective: To allow healthcare providers efficient access to medical imaging data acquired at other unaffiliated healthcare facilities while ensuring strong protection of patient privacy and minimizing burden on patients, providers, and the information technology infrastructure.
Methods: An image sharing framework is described that involves patients as an integral part of, and with full control of, the image sharing process. Central to this framework is the Patient Controlled Access-key REgistry (PCARE) which manages the access keys issued by image source facilities. When digitally signed by patients, the access keys are used by any requesting facility to retrieve the associated imaging data from the source facility. A centralized patient portal, called a PCARE patient control portal, allows patients to manage all the access keys in PCARE.
Results: A prototype of the PCARE framework has been developed by extending open-source technology. The results for feasibility, performance, and user assessments are encouraging and demonstrate the benefits of patient-controlled image sharing.
Discussion: The PCARE framework is effective in many important clinical cases of image sharing and can be used to integrate organization-coordinated sharing networks. The same framework can also be used to realize a longitudinal virtual electronic health record.
Conclusion: The PCARE framework allows prior imaging data to be shared among unaffiliated healthcare facilities while protecting patient privacy with minimal burden on patients, providers, and infrastructure. A prototype has been implemented to demonstrate the feasibility and benefits of this approach. PMID:22886546
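A toy sketch of the access-key flow in the Methods above: the source facility issues a key for a study, the patient co-signs it, and the source verifies both before releasing images. Shared-secret HMACs stand in for the asymmetric digital signatures a real PCARE-style deployment would use; all identifiers are illustrative:

```python
# Toy model of patient-co-signed access keys (not the PCARE implementation).
import hashlib, hmac, secrets

FACILITY_SECRET = secrets.token_bytes(32)  # held by the image source facility
PATIENT_SECRET = secrets.token_bytes(32)   # stands in for the patient's signing key

def issue_access_key(study_uid):
    return hmac.new(FACILITY_SECRET, study_uid.encode(), hashlib.sha256).digest()

def patient_sign(access_key):
    return hmac.new(PATIENT_SECRET, access_key, hashlib.sha256).digest()

def verify(study_uid, access_key, patient_sig):
    ok_key = hmac.compare_digest(access_key, issue_access_key(study_uid))
    ok_sig = hmac.compare_digest(patient_sig, patient_sign(access_key))
    return ok_key and ok_sig               # release images only if both hold

key = issue_access_key("1.2.840.99999.1")  # hypothetical study UID
sig = patient_sign(key)
print(verify("1.2.840.99999.1", key, sig))  # True -> source facility releases data
```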
Community for Data Integration 2014 annual report
Langseth, Madison L.; Chang, Michelle Y.; Carlino, Jennifer; Birch, Daniella D.; Bradley, Joshua; Bristol, R. Sky; Conzelmann, Craig; Diehl, Robert H.; Earle, Paul S.; Ellison, Laura E.; Everette, Anthony L.; Fuller, Pamela L.; Gordon, Janice M.; Govoni, David L.; Guy, Michelle R.; Henkel, Heather S.; Hutchison, Vivian B.; Kern, Tim; Lightsom, Frances L.; Long, Joseph W.; Longhenry, Ryan; Preston, Todd M.; Smith, Stan W.; Viger, Roland J.; Wesenberg, Katherine; Wood, Eric C.
2015-10-02
To achieve these goals, the CDI operates within four applied areas: monthly forums, annual workshop/webinar series, working groups, and projects. The monthly forums, also known as the Opportunity/Challenge of the Month, provide an open dialogue to share and learn about data integration efforts or to present problems that invite the community to offer solutions, advice, and support. Since 2010, the CDI has also sponsored annual workshops/webinar series to encourage the exchange of ideas, sharing of activities, presentations of current projects, and networking among members. Stemming from common interests, the working groups are focused on efforts to address data management and technical challenges including the development of standards and tools, improving interoperability and information infrastructure, and data preservation within USGS and its partners. The growing support for the activities of the working groups led to the CDI’s first formal request for proposals (RFP) process in 2013 to fund projects that produced tangible products. As of 2014, the CDI continues to hold an annual RFP that creates data management tools and practices, collaboration tools, and training in support of data integration and delivery.
Accelerating the pace of progress.
Chinery-Hesse, M
1995-01-01
During the past 20 years, the proportion of women in paid employment has increased significantly. Yet women still do not have equal opportunity or receive equal treatment in the workplace. Under these conditions, society loses because it does not completely utilize all its human resources toward sustainable development. All levels of preparation for the Fourth World Conference on Women, scheduled for September 1995 in Beijing, have allowed participants to review the current social status of women. Globalization of the economy, technological transformation, and restructuring of the labor market pose difficult problems and emerging challenges for women: feminization of poverty, increasing instability of women's work, continued inequality in the sharing of family responsibilities, and inadequacy of social infrastructure to allow women both to work and to attend to their family's needs. Working women's rights continue to be neglected. The International Labor Organization's (ILO) 75 years of experience suggest that a comprehensive integrated approach and participation of all relevant actors are needed to address gender inequality in the workplace. ILO has proposed such a strategy.
A technological infrastructure to sustain Internetworked Enterprises
NASA Astrophysics Data System (ADS)
La Mattina, Ernesto; Savarino, Vincenzo; Vicari, Claudia; Storelli, Davide; Bianchini, Devis
In the Web 3.0 scenario, where information and services are connected by means of their semantics, organizations can improve their competitive advantage by publishing their business and service descriptions. In this scenario, Semantic Peer-to-Peer (P2P) can play a key role in defining dynamic and highly reconfigurable infrastructures. Organizations can share knowledge and services, using this infrastructure to move towards value networks, an emerging organizational model characterized by fluid boundaries and complex relationships. This chapter collects and defines the technological requirements and architecture of a modular, multi-layer peer-to-peer infrastructure for SOA-based applications. This technological infrastructure, based on the combination of Semantic Web and P2P technologies, is intended to sustain Internetworked Enterprise configurations, defining a distributed registry and enabling more expressive queries and efficient routing mechanisms. The following sections focus on the overall architecture, while describing the layers that form it.
40 CFR 52.1991 - Section 110(a)(2) infrastructure requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PROGRAMS (CONTINUED) APPROVAL AND PROMULGATION OF IMPLEMENTATION PLANS (CONTINUED) Oregon § 52.1991 Section 110(a)(2) infrastructure requirements. (a) On September 25, 2008, Oregon Department of Environmental...). (b) On September 25, 2008, December 23, 2010, August 17, 2011, and December 19, 2011, the Oregon...
77 FR 14462 - Space Transportation Infrastructure Matching Grants Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-09
... DEPARTMENT OF TRANSPORTATION Federal Aviation Administration Space Transportation Infrastructure... grant proposals for the Space Transportation Infrastructure Matching Grants Program. SUMMARY: This notice solicits Fiscal Year (FY) 2012 grant proposals to continue the development of a Commercial Space...
Image Sharing in Radiology-A Primer.
Chatterjee, Arindam R; Stalcup, Seth; Sharma, Arjun; Sato, T Shawn; Gupta, Pushpender; Lee, Yueh Z; Malone, Christopher; McBee, Morgan; Hotaling, Elise L; Kansagra, Akash P
2017-03-01
By virtue of its information technology-oriented infrastructure, the specialty of radiology is uniquely positioned to be at the forefront of efforts to promote data sharing across the healthcare enterprise, including particularly image sharing. The potential benefits of image sharing for clinical, research, and educational applications in radiology are immense. In this work, our group, the Association of University Radiologists (AUR) Radiology Research Alliance Task Force on Image Sharing, reviews the benefits of implementing image sharing capability, introduces current image sharing platforms and details their unique requirements, and presents emerging platforms that may see greater adoption in the future. By understanding this complex ecosystem of image sharing solutions, radiologists can become important advocates for the successful implementation of these powerful image sharing resources. Copyright © 2017 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Day, K J; Norris, A C
2006-03-01
Shared services organizations are ascribed with adding value to business in several ways but especially by sharing resources and leading to economies of scale. However, these gains are not automatic and in some instances, particularly healthcare, they are difficult to achieve. This article describes a project to develop a shared services information technology infrastructure across two district health boards in New Zealand. The study reveals valuable insight into the crisis issues that accompany change management and identifies emergent themes that can be used to reduce negative impact.
Strategic behaviors and governance challenges in social-ecological systems
NASA Astrophysics Data System (ADS)
Muneepeerakul, Rachata; Anderies, John M.
2017-08-01
The resource management and environmental policy literature focuses on devising regulations and incentive structures to achieve desirable goals. It often presumes the existence of public infrastructure that actualizes these incentives and regulations through a process loosely referred to as 'governance.' In many cases, it is not clear if and how such governance infrastructure can be created and supported. Here, we take a complex systems view in which 'governance' is an emergent phenomenon generated by interactions between social, economic, and environmental (both built and natural) factors. We present a framework and formal stylized model to explore under what circumstances stable governance structures may emerge endogenously in coupled infrastructure systems comprising shared natural, social, and built infrastructures, of which social-ecological systems are specific examples. The model allows us to derive general conditions for a sustainable coupled infrastructure system in which critical infrastructure (e.g., canals) is provided by a governing entity that enables resource users (e.g., farmers) to produce outputs from natural infrastructure (e.g., water) to meet their needs while supporting the governing entity.
NASA Technical Reports Server (NTRS)
Scott, John Carver
1991-01-01
During the course of recent years the frequency and magnitude of major disasters - of natural, technological, or ecological origin - have made the world community dramatically aware of the immense losses of human life and economic resources that are caused regularly by such calamities. Particularly hard hit are developing countries, for whom the magnitude of disasters frequently outstrips the ability of the society to cope with them. In many cases this situation can be prevented, and the recent trend in disaster management has been to emphasize the importance of preparedness and mitigation as a means of prevention. In cases of disaster, a system is needed to respond to relief requirements, particularly the delivery of medical care. There is no generic telecommunications infrastructure appropriate for the variety of applications in medical care and disaster management. The need to integrate telemedicine/telehealth into shared regional disaster management telecommunications networks is discussed. Focus is on the development of infrastructure designed to serve the needs of disaster prone regions of the developing world.
Scalability Issues for Remote Sensing Infrastructure: A Case Study.
Liu, Yang; Picard, Sean; Williamson, Carey
2017-04-29
For the past decade, a team of University of Calgary researchers has operated a large "sensor Web" to collect, analyze, and share scientific data from remote measurement instruments across northern Canada. This sensor Web receives real-time data streams from over a thousand Internet-connected sensors, with a particular emphasis on environmental data (e.g., space weather, auroral phenomena, atmospheric imaging). Through research collaborations, we had the opportunity to evaluate the performance and scalability of their remote sensing infrastructure. This article reports the lessons learned from our study, which considered both data collection and data dissemination aspects of their system. On the data collection front, we used benchmarking techniques to identify and fix a performance bottleneck in the system's memory management for TCP data streams, while also improving system efficiency on multi-core architectures. On the data dissemination front, we used passive and active network traffic measurements to identify and reduce excessive network traffic from the Web robots and JavaScript techniques used for data sharing. While our results are from one specific sensor Web system, the lessons learned may apply to other scientific Web sites with remote sensing infrastructure.
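As an illustration of the memory-management fix described for TCP data streams, a sketch of buffer reuse with recv_into(), which avoids allocating a fresh bytes object per packet; this is an assumed pattern, not the system's actual code:

```python
# Reuse one preallocated buffer per connection instead of allocating a new
# bytes object on every recv() -- reduces per-packet memory churn when
# serving thousands of concurrent sensor streams.
import socket

BUF_SIZE = 64 * 1024

def drain_stream(conn: socket.socket) -> int:
    buf = bytearray(BUF_SIZE)      # allocated once for the connection's lifetime
    view = memoryview(buf)
    total = 0
    while True:
        n = conn.recv_into(view)   # fills the existing buffer in place
        if n == 0:                 # peer closed the stream
            return total
        total += n
        # ... parse buf[:n] here without further copies ...
```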
Survey of Collaboration Technologies in Multi-level Security Environments
2014-04-28
infrastructure or resources. In this research program, the security implications of the US Air Force GeoBase (the US...The problem is that in many cases...design structure. ORA uses a Java interface for ease of use, and a C++ computational backend. The current version ORA 1.2 software is available on the...information: culture, policy, governance, economics and resources, and technology and infrastructure. This plan, the DoD Information Sharing
NASA Technical Reports Server (NTRS)
2005-01-01
KENNEDY SPACE CENTER, FLA. At the 1st Space Exploration Conference: Continuing the Voyage of Discovery, held at Disney's Contemporary Resort in Orlando, film director James Cameron (right) talks to Daniel Stearns, a 13-year-old student from Longmeadow, Mass., who won the Space Exploration Video Festival award sponsored by Lockheed Martin. Stearns shared first place with a team from McNair High School in DeKalb County, Ga. The Georgia school participates in NASA's Explorer School program. Cameron is one of the keynote speakers at the conference. Topics being presented focus on new missions, technologies and infrastructure needed to turn the vision for space exploration into reality. Other keynote speakers at the three-day conference are Congressman Dave Weldon and NASA's senior Mars scientist James Garvin. The conference has drawn attendees from around the world.
The Economic and Social Value of an Image Exchange Network: A Case for the Cloud.
Mayo, Ray Cody; Pearson, Kathryn L; Avrin, David E; Leung, Jessica W T
2017-01-01
As the health care environment continually changes, radiologists look to the ACR's Imaging 3.0® initiative to guide the search for value. By leveraging new technology, a cloud-based image exchange network could provide secure universal access to prior images, which were previously siloed, to facilitate accurate interpretation, improved outcomes, and reduced costs. The breast imaging department represents a viable starting point given the robust data supporting the benefit of access to prior imaging studies, existing infrastructure for image sharing, and the current workflow reliance on prior images. This concept is scalable not only to the remainder of the radiology department but also to the broader medical record. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
XDS in healthcare: Could it lead to a duplication problem? Field study from GVR Sweden
NASA Astrophysics Data System (ADS)
Wintell, M.; Lundberg, N.; Lindsköld, L.
2011-03-01
Managing multiple registries and repositories within healthcare regions increases the risk of holding almost the same information with different status and different content. This is because medical information is created in a dynamic process, so its content changes over its lifetime within the "active" healthcare phase. The information needs to be easily accessible, forming the platform that makes medical decisions transparent. In the Region Västra Götaland (VGR), Sweden, data is shared from 29 X-ray departments with different Picture Archive and Communication Systems (PACS) and Radiology Information Systems (RIS) through the Infobroker solution, which acts as a broker between the actors involved. Requests/reports from RIS are stored as Digital Imaging and Communications in Medicine (DICOM) Structured Report (SR) objects, together with the images. Every status change within these activities is updated in the information infrastructure, in line with the Integrating the Healthcare Enterprise (IHE) mission of Cross-enterprise Document Sharing for Imaging (XDS-I), where the registry and the central repository are the components used for sharing medical documentation. The VGR strategy was not to apply one regional XDS-I registry and repository; instead, VGR applied an Enterprise Architecture (EA) intertwined with the information infrastructure for dynamic delivery to consumers. The use of different regional XDS registries and repositories could lead to new ways of carrying out shared work, but it can also lead to problems: XDS and XDS-I implemented without a strategy could increase the number of statuses/versions and also duplicate information in the information infrastructure.
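A toy sketch of one mitigation for the duplication risk raised above: reconciling document entries across registries on their unique ID and keeping only the most recent version. Field names and records are illustrative, not actual XDS metadata:

```python
# Hypothetical reconciliation of the 'same document, different registries,
# different status/version' situation the field study warns about.
def reconcile(entries):
    latest = {}
    for e in entries:
        uid = e["uniqueId"]
        if uid not in latest or e["version"] > latest[uid]["version"]:
            latest[uid] = e
    return latest

entries = [
    {"uniqueId": "doc-17", "version": 1, "status": "Approved", "registry": "R1"},
    {"uniqueId": "doc-17", "version": 2, "status": "Deprecated", "registry": "R2"},
]
print(reconcile(entries))  # keeps only the version-2 record from registry R2
```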
Regional Charging Infrastructure for Plug-In Electric Vehicles: A Case Study of Massachusetts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric; Raghavan, Sesha; Rames, Clement
Given the complex issues associated with plug-in electric vehicle (PEV) charging and options in deploying charging infrastructure, there is interest in exploring scenarios of future charging infrastructure deployment to provide insight and guidance to national and regional stakeholders. The complexity and cost of PEV charging infrastructure pose challenges to decision makers, including individuals, communities, and companies considering infrastructure installations. The value of PEVs to consumers and fleet operators can be increased with well-planned and cost-effective deployment of charging infrastructure. This will increase the number of miles driven electrically and accelerate PEV market penetration, increasing the shared value of charging networks to an expanding consumer base. Given these complexities and challenges, the objective of the present study is to provide additional insight into the role of charging infrastructure in accelerating PEV market growth. To that end, existing studies on PEV infrastructure are summarized in a literature review. Next, an analysis of current markets is conducted with a focus on correlations between PEV adoption and public charging availability. A forward-looking case study is then conducted focused on supporting 300,000 PEVs by 2025 in Massachusetts. The report concludes with a discussion of potential methodology for estimating economic impacts of PEV infrastructure growth.
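The market analysis described correlates PEV adoption with public charging availability. A minimal sketch of that computation, with invented per-region numbers (statistics.correlation requires Python 3.10+):

```python
# Invented per-region figures, purely to illustrate the correlation step.
from statistics import correlation  # Python 3.10+

pev_per_1000 = [2.1, 4.8, 1.3, 6.9, 3.5]            # PEV registrations per 1,000 people
chargers_per_1000 = [0.11, 0.32, 0.07, 0.41, 0.20]  # public plugs per 1,000 people

r = correlation(pev_per_1000, chargers_per_1000)
print(f"Pearson r = {r:.2f}")  # a strong positive r motivates infrastructure buildout
```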
NASA Astrophysics Data System (ADS)
Mortazavi-Naeini, Mohammad; Kuczera, George; Cui, Lijie
2014-06-01
Significant population increase in urban areas is likely to result in a deterioration of drought security and level of service provided by urban water resource systems. One way to cope with this is to optimally schedule the expansion of system resources. However, the high capital costs and environmental impacts associated with expanding or building major water infrastructure warrant the investigation of scheduling system operational options such as reservoir operating rules, demand reduction policies, and drought contingency plans, as a way of delaying or avoiding the expansion of water supply infrastructure. Traditionally, minimizing cost has been considered the primary objective in scheduling capacity expansion problems. In this paper, we consider some of the drawbacks of this approach. It is shown that there is no guarantee that the social burden of coping with drought emergencies is shared equitably across planning stages. In addition, it is shown that previous approaches do not adequately exploit the benefits of joint optimization of operational and infrastructure options and do not adequately address the need for the high level of drought security expected for urban systems. To address these shortcomings, a new multiobjective optimization approach to scheduling capacity expansion in an urban water resource system is presented and illustrated in a case study involving the bulk water supply system for Canberra. The results show that the multiobjective approach can address the temporal equity issue of sharing the burden of drought emergencies and that joint optimization of operational and infrastructure options can provide solutions superior to those just involving infrastructure options.
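A minimal sketch of the multiobjective selection step: filtering candidate expansion schedules to the Pareto front on capital cost versus the burden of drought restrictions. Names and objective values are invented; the study itself uses a full multiobjective optimizer over the Canberra bulk supply model:

```python
# Keep only schedules not dominated on (capital cost, drought-restriction burden).
def pareto_front(schedules):
    front = []
    for name, cost, burden in schedules:
        dominated = any(c <= cost and b <= burden and (c, b) != (cost, burden)
                        for _, c, b in schedules)
        if not dominated:
            front.append((name, cost, burden))
    return front

candidates = [("infra-only", 950.0, 0.08),  # hypothetical (cost $M, burden) pairs
              ("ops-only",   420.0, 0.21),
              ("joint",      610.0, 0.09),
              ("late-build", 700.0, 0.15)]
print(pareto_front(candidates))  # 'late-build' drops out: 'joint' dominates it
```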
Healthcare information technology infrastructures in Turkey.
Dogac, A; Yuksel, M; Ertürkmen, G L; Kabak, Y; Namli, T; Yıldız, M H; Ay, Y; Ceyhan, B; Hülür, U; Oztürk, H; Atbakan, E
2014-05-22
The objective of this paper is to describe some of the major healthcare information technology (IT) infrastructures in Turkey, namely, Sağlık-Net (Turkish for "Health-Net"), the Centralized Hospital Appointment System, the Basic Health Statistics Module, the Core Resources Management System, and the e-prescription system of the Social Security Institution. International collaboration projects that are integrated with Sağlık-Net are also briefly summarized. The authors provide a survey of some of the major healthcare IT infrastructures in Turkey. Sağlık-Net has two main components: the National Health Information System (NHIS) and the Family Medicine Information System (FMIS). The NHIS is a nation-wide infrastructure for sharing patients' Electronic Health Records (EHRs). So far, EHRs of 78.9 million people have been created in the NHIS. Similarly, family medicine is operational in the whole country via FMIS. The Centralized Hospital Appointment System enables citizens to easily make appointments with healthcare providers. The Basic Health Statistics Module is used for collecting information about health status, risks and indicators across the country. The Core Resources Management System speeds up the flow of information between the headquarters and Provincial Health Directorates. The e-prescription system is linked with Sağlık-Net and seamlessly integrated with healthcare provider information systems. Finally, Turkey is involved in several international projects for experience sharing and disseminating national developments. With the introduction of the "Health Transformation Program" in 2003, a number of successful healthcare IT infrastructures have been developed in Turkey. Currently, work is going on to enhance and further improve their functionality.
Building an Economical and Sustainable Lunar Infrastructure to Enable Lunar Industrialization
NASA Technical Reports Server (NTRS)
Zuniga, Allison F.; Turner, Mark; Rasky, Daniel; Loucks, Mike; Carrico, John; Policastri, Daniel
2017-01-01
A new concept study was initiated to examine the architecture needed to gradually develop an economical, evolvable and sustainable lunar infrastructure using a public/private partnership approach. This approach would establish partnership agreements between NASA and industry teams to develop a lunar infrastructure system that would be mutually beneficial. It would also require NASA and its industry partners to share costs in the development phase and then transfer operation of these infrastructure services back to their industry owners in the execution phase. These infrastructure services may include, but are not limited to, the following: lunar cargo transportation, power stations, communication towers and satellites, autonomous rover operations, landing pads and resource extraction operations. The public/private partnership approach used in this study leveraged best practices from NASA's Commercial Orbital Transportation Services (COTS) program, which introduced an innovative and economical approach for partnering with industry to develop commercial cargo services to the International Space Station. This program was planned together with the ISS Commercial Resupply Services (CRS) contracts, which were responsible for initiating commercial cargo delivery services to the ISS for the first time. The public/private partnership approach undertaken in the COTS program proved very successful in dramatically reducing development costs for these ISS cargo delivery services as well as substantially reducing operational costs. To continue on this successful path towards installing economical infrastructure services for LEO and beyond, this new study, named Lunar COTS (Commercial Operations and Transport Services), was conducted to examine extending the NASA COTS model to cis-lunar space and the lunar surface. The goals of the Lunar COTS concept are to: 1) develop and demonstrate affordable and commercial cis-lunar and surface capabilities, such as lunar cargo delivery and surface power generation, in partnership with industry; 2) incentivize industry to establish economical and sustainable lunar infrastructure services to support NASA missions and initiate lunar commerce; and 3) encourage creation of new space markets for economic growth and benefit. A phased-development approach was also studied to allow for incremental development and demonstration of the capabilities needed to build a lunar infrastructure. This paper describes the Lunar COTS concept goals, objectives and approach for building an economical and sustainable lunar infrastructure. It also describes the technical challenges and advantages of developing and operating each infrastructure element, and the potential benefits and progress that can be accomplished in the initial phase of this Lunar COTS approach. Finally, the paper looks forward to the potential of a robust lunar industrialization environment and its potential effect on the next 50 years of space exploration.
Structural health monitoring of civil infrastructure.
Brownjohn, J M W
2007-02-15
Structural health monitoring (SHM) is a term increasingly used in the last decade to describe a range of systems implemented on full-scale civil infrastructures and whose purposes are to assist and inform operators about continued 'fitness for purpose' of structures under gradual or sudden changes to their state, to learn about either or both of the load and response mechanisms. Arguably, various forms of SHM have been employed in civil infrastructure for at least half a century, but it is only in the last decade or two that computer-based systems are being designed for the purpose of assisting owners/operators of ageing infrastructure with timely information for their continued safe and economic operation. This paper describes the motivations for and recent history of SHM applications to various forms of civil infrastructure and provides case studies on specific types of structure. It ends with a discussion of the present state-of-the-art and future developments in terms of instrumentation, data acquisition, communication systems and data mining and presentation procedures for diagnosis of infrastructural 'health'.
GLIDE: a grid-based light-weight infrastructure for data-intensive environments
NASA Technical Reports Server (NTRS)
Mattmann, Chris A.; Malek, Sam; Beckman, Nels; Mikic-Rakic, Marija; Medvidovic, Nenad; Chrichton, Daniel J.
2005-01-01
The promise of the grid is that it will enable public access and sharing of immense amounts of computational and data resources among dynamic coalitions of individuals and institutions. However, the current grid solutions make several limiting assumptions that curtail their widespread adoption. To address these limitations, we present GLIDE, a prototype light-weight, data-intensive middleware infrastructure that enables access to the robust data and computational power of the grid on DREAM platforms.
Information Collection using Handheld Devices in Unreliable Networking Environments
2014-06-01
different types of mobile devices that connect wirelessly to a database server. The actual backend database is not important to the mobile clients...Google's infrastructure and local servers with MySQL and PostgreSQL on the backend (ODK 2014b). (2) Google Fusion Tables are used to do basic link...how we conduct business. Our requirements to share information do not change simply because there is little or no existing infrastructure in our
HCP: A Flexible CNN Framework for Multi-label Image Classification.
Wei, Yunchao; Xia, Wei; Lin, Min; Huang, Junshi; Ni, Bingbing; Dong, Jian; Zhao, Yao; Yan, Shuicheng
2015-10-26
Convolutional Neural Network (CNN) has demonstrated promising performance in single-label image classification tasks. However, how CNN best copes with multi-label images still remains an open problem, mainly due to the complex underlying object layouts and insufficient multi-label training images. In this work, we propose a flexible deep CNN infrastructure, called Hypotheses-CNN-Pooling (HCP), where an arbitrary number of object segment hypotheses are taken as the inputs, a shared CNN is connected with each hypothesis, and finally the CNN output results from different hypotheses are aggregated with max pooling to produce the ultimate multi-label predictions. Some unique characteristics of this flexible deep CNN infrastructure include: 1) no ground-truth bounding box information is required for training; 2) the whole HCP infrastructure is robust to possibly noisy and/or redundant hypotheses; 3) the shared CNN is flexible and can be well pre-trained with a large-scale single-label image dataset, e.g., ImageNet; and 4) it naturally outputs multi-label prediction results. Experimental results on the Pascal VOC 2007 and VOC 2012 multi-label image datasets demonstrate the superiority of the proposed HCP infrastructure over other state-of-the-art approaches. In particular, the mAP reaches 90.5% with HCP alone and 93.2% after fusion with our complementary result in [44] based on hand-crafted features on the VOC 2012 dataset.
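A shape-level sketch of the HCP aggregation step: per-hypothesis scores from a shared network are fused by element-wise max pooling into one multi-label prediction. The shared CNN is stubbed with a linear layer and hypothesis extraction is omitted:

```python
# Cross-hypothesis max pooling as described for HCP (simplified stub).
import torch

num_hypotheses, num_labels = 10, 20
shared_cnn = torch.nn.Linear(256, num_labels)  # stand-in for the shared CNN

hypothesis_feats = torch.randn(num_hypotheses, 256)  # one vector per hypothesis

scores = shared_cnn(hypothesis_feats)   # shape: (hypotheses, labels)
fused = scores.max(dim=0).values        # max over hypotheses, per label
probs = torch.sigmoid(fused)            # multi-label confidences
print(probs.shape)                      # torch.Size([20])
```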
NASA Astrophysics Data System (ADS)
Glaves, Helen; Schaap, Dick
2017-04-01
In recent years there has been a paradigm shift in marine research, moving from the traditional discipline-based methodology employed at the national level by one or more organizations to a multidisciplinary, ecosystem-level approach conducted on an international scale. This increasingly holistic approach to marine research is in part being driven by policy and legislation. For example, the European Commission's Blue Growth strategy promotes sustainable growth in the marine environment, including the development of sea-basin strategies (European Commission 2014). As well as this policy-driven shift to ecosystem-level marine research, there are also scientific and economic drivers for a basin-level approach. Marine monitoring is essential for assessing the health of an ecosystem and determining the impacts of specific factors and activities on it. The availability of large volumes of good-quality data is fundamental to this increasingly holistic approach to ocean research, but there are significant barriers to its re-use. These stem from the heterogeneity of the data, which have been collected by many organizations around the globe using a variety of sensors mounted on a range of different platforms. The data are then delivered and archived in a range of formats, using various spatial coordinate systems and aligned with different standards. This heterogeneity, coupled with organizational and national policies on data sharing, makes access and re-use of marine data problematic. In response to the need for greater sharing of marine data, a number of e-infrastructures have been developed, but these have different levels of granularity, with the majority having been developed at the regional level to address specific requirements for data, e.g., SeaDataNet in Europe and the Australian Ocean Data Network (AODN). These data infrastructures are also frequently aligned with the priorities of the local funding agencies and have been created in isolation from those developed elsewhere. To add a further layer of complexity, there are also global initiatives providing marine data infrastructures, e.g., IOC-IODE and POGO, as well as those with a wider remit that includes environmental data, e.g., GEOSS and Copernicus. Ecosystem-level marine research requires a common framework for marine data management that supports the sharing of data across these regional and global data systems and provides the user with access to the data available from these services via a single point of access. This framework must be based on existing data systems and established by developing interoperability between them. The Ocean Data Interoperability Platform (ODIP/ODIP II) project brings together the organisations responsible for maintaining selected regional data infrastructures, along with other relevant experts, in order to identify the common standards and best practice necessary to underpin this framework, and to evaluate the differences and commonalities between the regional data infrastructures so that interoperability can be established between them for the purposes of data sharing. This coordinated approach is being demonstrated and validated through the development of a series of prototype interoperability solutions that demonstrate the mechanisms and standards necessary to facilitate the sharing of marine data across these existing data infrastructures.
Krishnan, Jerry A; Martin, Molly A; Lohff, Cortland; Mosnaim, Giselle S; Margellos-Anast, Helen; DeLisa, Julie A; McMahon, Kate; Erwin, Kim; Zun, Leslie S; Berbaum, Michael L; McDermott, Michael; Bracken, Nina E; Kumar, Rajesh; Paik, S Margaret; Nyenhuis, Sharmilee M; Ignoffo, Stacy; Press, Valerie G; Pittsenbarger, Zachary E; Thompson, Trevonne M
2017-06-01
Among children with asthma, black children are two to four times as likely to have an emergency department (ED) visit and die from asthma, respectively, compared to white children in the United States. Despite the availability of evidence-based asthma management guidelines, minority children are less likely than white children to receive or use effective options for asthma care. The CHICAGO Plan is a three-arm multi-center randomized pragmatic trial of children 5 to 11 years old presenting to the ED with uncontrolled asthma that compares: [1] an ED-focused intervention to improve the quality of care on discharge to home, [2] the same ED-focused intervention together with a home-based community health worker (CHW)-led intervention, and [3] enhanced usual care. All children receive spacers for the metered dose inhaler and teaching about its use. The Patient-Reported Outcomes Measurement Information System (PROMIS) Asthma Impact Scale and Satisfaction with Participation in Social Roles at 6 months are the primary outcomes in children and in caregivers, respectively. Other patient-reported outcomes and indicators of healthcare utilization are assessed as secondary outcomes. Innovative features of the CHICAGO Plan include early and continuous engagement of children, caregivers, the Chicago Department of Public Health, and other stakeholders to inform the design and implementation of the study, and a shared research infrastructure to coordinate study activities. The objective of this report is to describe the development of the CHICAGO Plan, including the methods and rationale for engaging stakeholders, the shared research infrastructure, and other features of the pragmatic clinical trial design. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Lindsköld, L.; Alvfeldt, G.; Wintell, M.
2015-03-01
One of the challenges of today's healthcare is that data from radiology is heterogeneous, stored and managed in silos created by PACS vendors. Also seen is a lack of coordinated use of harmonized reference information models and established healthcare standards. Radiology in Region Västra Götaland has been entering the world of "Big Data" since 2006: 34 departments, split into 4 private imaging centers, 2 small hospitals, 4 mid-sized hospital groups and one university hospital, use the same information infrastructure as a means of collaborating and sharing information between them. As an organization building for the future, we must meet the values and requirements of the stakeholders and count the patient as the major actor. Can "Big Data" analytics be a valuable asset from a regional management perspective? Our initial findings indicate that this is the case, based on three different perspectives: work practice changes, understanding data quality when sharing information, and introducing new services in work practice. Going from local to enterprise workflows utilizes the power of "Big Data", not only in volume but also in combining diverse sources and aggregating information domains, visualizing new trends as well as dependencies more effectively. Building trust in the use of Big Data in healthcare involves a long and winding journey, but the persevering infrastructure-oriented organization will gain new ways of collaboration for the enterprise it serves. It also involves continuous negotiation with people concerning how and why they should collaborate with new actors within the region to achieve patient-centric care. This will nurture a more open-minded, hopeful and life-affirming holistic approach involving all stakeholders, newcomers, specialists and patients.
Integrating Emerging Data Sources Into Operational Practice
DOT National Transportation Integrated Search
2018-05-15
Agencies have the potential to collect, use, and share data from connected and automated vehicles (CAV), connected travelers, and connected infrastructure elements to improve the performance of their traffic management systems and traffic management ...
42 CFR § 512.350 - Data sharing.
Code of Federal Regulations, 2010 CFR
2017-10-01
...) HEALTH CARE INFRASTRUCTURE AND MODEL PROGRAMS EPISODE PAYMENT MODEL Pricing and Payment § 512.350 Data... delivery of care. (4) Otherwise achieve the goals of the models described in this section. (b) Beneficiary...
7 CFR 1486.209 - How are program applications evaluated and approved?
Code of Federal Regulations, 2010 CFR
2010-01-01
... maintain U.S. market share; (ii) Marketing and distribution of value-added products, including new products... channels in emerging markets, including infrastructural impediments to U.S. exports; such studies may...
Connected vehicles and cybersecurity.
DOT National Transportation Integrated Search
2016-01-01
Connected vehicles are a next-generation technology in vehicles and in infrastructure that will make travel safer, cleaner, and more efficient. The advanced wireless technology enables vehicles to share and communicate information with each other and...
NASA Astrophysics Data System (ADS)
Pordes, Ruth; OSG Consortium; Petravick, Don; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Würthwein, Frank; Foster, Ian; Gardner, Rob; Wilde, Mike; Blatecky, Alan; McGee, John; Quick, Rob
2007-07-01
The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.
Purkayastha, S; Price, A; Biswas, R; Jai Ganesh, A U; Otero, P
2015-08-13
To share how an effectual merging of local and online networks in low-resource regions can supplement and strengthen the local practice of patient-centered care through the use of an online digital infrastructure powered by all stakeholders in healthcare. User Driven Health Care offers the dynamic integration of patient values and evidence-based solutions for improved medical communication in medical care. This paper conceptualizes patient care-coordination through the lens of engaged stakeholders using digital infrastructure tools to integrate information technology. We distinguish this lens from the prevalent conceptualization of dyadic ties between clinician-patient, patient-nurse, and clinician-nurse, and offer the holistic integration of all stakeholder inputs, in the clinic and augmented by online communication in a multi-national setting. We analyze an instance of user-driven health care (UDHC), a network of providers, patients, students and researchers working together to help manage patient care. The network currently focuses on patients from LMICs, but the provider network is global in reach. We describe UDHC and its opportunities and challenges in care-coordination to reduce costs, bring equity, improve care quality and share evidence. UDHC has resulted in coordinated, globally based local care, affecting multiple facets of medical practice. Shared information resources between providers with disparate knowledge result in better understanding by patients, unique and challenging cases for students, innovative community-based research, and discovery learning for all.
The Electronic Data Methods (EDM) forum for comparative effectiveness research (CER).
Holve, Erin; Segal, Courtney; Lopez, Marianne Hamilton; Rein, Alison; Johnson, Beth H
2012-07-01
AcademyHealth convened the Electronic Data Methods (EDM) Forum to collect, synthesize, and share lessons from eleven projects that are building infrastructure and using electronic clinical data for comparative effectiveness research (CER) and patient-centered outcomes research (PCOR). This paper provides a brief review of participating projects and a framework of common challenges. EDM Forum staff conducted a text review of relevant grant programs' funding opportunity announcements; projects' research plans; and available information on projects' websites. Additional information was obtained from presentations provided by each project; phone calls with project principal investigators, affiliated partners, and staff from the Agency for Healthcare Research and Quality (AHRQ); and six site visits. Projects participating in the EDM Forum are building infrastructure and developing innovative strategies to address a set of methodological, data, and informatics challenges, identified here in a common framework. The eleven networks represent more than 20 states and include a range of partnership models. Projects vary substantially in size, from 11,000 to more than 7.5 million individuals. Nearly all of the AHRQ priority populations and conditions are addressed. In partnership with the projects, the EDM Forum is focused on identifying and sharing lessons learned to advance the national dialogue on the use of electronic clinical data to conduct CER and PCOR. These efforts have the shared goal of addressing challenges in traditional research studies and data sources, and aim to build infrastructure and generate evidence to support a learning health care system that can improve patient outcomes.
Managing a tier-2 computer centre with a private cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara
2014-06-01
In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying Private Cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable also for smaller sites. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.
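Because the infrastructure exposes EC2-compatible APIs, standard cloud clients can in principle drive it. The following is a hedged sketch, not a tested recipe: it assumes an EC2-style endpoint (OpenNebula's EC2 interface implements only a subset of the full API), and the endpoint URL, credentials, and image ID are all hypothetical.

```python
import boto3

# Point a stock EC2 client at the (hypothetical) private-cloud endpoint.
ec2 = boto3.client(
    "ec2",
    endpoint_url="https://cloud.example.to.infn.it:4567",  # hypothetical
    region_name="default",
    aws_access_key_id="USER_KEY",        # placeholder credentials
    aws_secret_access_key="USER_SECRET",
)

# Launch one instance from a contextualized virtual image.
resp = ec2.run_instances(
    ImageId="ami-00000001",   # hypothetical image registered in the cloud
    InstanceType="m1.small",
    MinCount=1,
    MaxCount=1,
)
print(resp["Instances"][0]["InstanceId"])
```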
Schmidt, Markus H
2014-11-01
The energy allocation (EA) model defines behavioral strategies that optimize the temporal utilization of energy to maximize reproductive success. This model proposes that all species of the animal kingdom share a universal sleep function that shunts waking energy utilization toward sleep-dependent biological investment. For endotherms, REM sleep evolved to enhance energy appropriation for somatic and CNS-related processes by eliminating thermoregulatory defenses and skeletal muscle tone. Alternating REM with NREM sleep conserves energy by decreasing the need for core body temperature defense. Three EA phenotypes are proposed: sleep-wake cycling, torpor, and continuous (or predominant) wakefulness. Each phenotype carries inherent costs and benefits. Sleep-wake cycling downregulates specific biological processes in waking and upregulates them in sleep, thereby decreasing energy demands imposed by wakefulness, reducing cellular infrastructure requirements, and resulting in overall energy conservation. Torpor achieves the greatest energy savings, but critical biological operations are compromised. Continuous wakefulness maximizes niche exploitation, but endures the greatest energy demands. The EA model advances a new construct for understanding sleep-wake organization in ontogenetic and phylogenetic domains. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Rosinski, A.; Morentz, J.; Beilin, P.
2017-12-01
The principal function of the California Earthquake Clearinghouse is to provide State and Federal disaster response managers, and the scientific and engineering communities, with prompt information on ground failure, structural damage, and other consequences from significant seismic events such as earthquakes and tsunamis. The overarching problem highlighted in discussions with Clearinghouse partners is the confusion and frustration of many of the Operational Area representatives, and some regional utilities throughout the state, about what software applications they should be using and maintaining to meet State, Federal, and Local requirements, for what purposes, and how to deal with the limitations of these applications. This problem is getting in the way of making meaningful progress on developing multi-application interoperability and the necessary supporting cross-sector information-sharing procedures, and on dialogue about the essential common operational information that entities need to share for different all-hazards missions and related operational activities associated with continuity, security, and resilience. The XchangeCore-based system the Clearinghouse is evolving helps deal with this problem, and does not compound it by introducing yet another end-user application; there is no end-user interface with which one views XchangeCore, as all viewing of data provided through XchangeCore occurs in and on existing, third-party operational applications. The Clearinghouse efforts with XchangeCore are compatible with FEMA, which is currently using XchangeCore-provided data for regional and National Business Emergency Operations Center (the source of business information sharing during emergencies) response. Also important, and worth emphasizing, is that information-sharing is not just for response, but for preparedness, risk assessment/mitigation decision-making, and everyday operational needs for situational awareness. In other words, the benefits of the Clearinghouse information-sharing efforts transcend emergency response. The Clearinghouse is in the process of developing an Information-Sharing System Guide and CONOPS templates aimed at a multi-stakeholder, non-technical audience.
Distributed data networks: a blueprint for Big Data sharing and healthcare analytics.
Popovic, Jennifer R
2017-01-01
This paper defines the attributes of distributed data networks and outlines the data and analytic infrastructure needed to build and maintain a successful network. We use examples from one successful implementation of a large-scale, multisite, healthcare-related distributed data network, the U.S. Food and Drug Administration-sponsored Sentinel Initiative. Analytic infrastructure-development concepts are discussed from the perspective of promoting six pillars of analytic infrastructure: consistency, reusability, flexibility, scalability, transparency, and reproducibility. This paper also introduces one use case for machine learning algorithm development to fully utilize and advance the portfolio of population health analytics, particularly those using multisite administrative data sources. © 2016 New York Academy of Sciences.
Agile Infrastructure Monitoring
NASA Astrophysics Data System (ADS)
Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.
2014-06-01
At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.
Control Infrastructure for a Pulsed Ion Accelerator
NASA Astrophysics Data System (ADS)
Persaud, A.; Regis, M. J.; Stettler, M. W.; Vytla, V. K.
2016-10-01
We report on updates to the accelerator controls for the Neutralized Drift Compression Experiment II, a pulsed induction-type accelerator for heavy ions. The control infrastructure is built around a LabVIEW interface combined with an Apache Cassandra backend for data archiving. Recent upgrades added the storing and retrieving of device settings into the database, as well as ZeroMQ as a message broker that replaces LabVIEW's shared variables. Converting to ZeroMQ also allows easy access via other programming languages, such as Python.
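The ZeroMQ point is worth making concrete: once device data flow through a language-agnostic broker, any Python process can tap in without touching LabVIEW. A minimal sketch with pyzmq follows, assuming the controls publish readings on a PUB socket; the host, port, topic naming, and message format are hypothetical.

```python
import zmq

context = zmq.Context()
socket = context.socket(zmq.SUB)
socket.connect("tcp://controls-host.example:5556")   # hypothetical broker address
socket.setsockopt_string(zmq.SUBSCRIBE, "ndcx2.")    # hypothetical topic prefix

# Receive and print device readings as they are published,
# e.g. "ndcx2.solenoid1.current 12.4" (format is illustrative).
while True:
    message = socket.recv_string()
    topic, _, value = message.partition(" ")
    print(topic, value)
```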
Strategies for 96-hour critical infrastructure compliance.
Storbakken, Steven H; Kendall, Shannon; Lackey, Connie
2009-01-01
Organizations that stand the best chance at survival following a disaster do so because they can depend on the sharing of resources and mutual ideologies, the authors claim. They point out that when strategizing for 96-hour critical infrastructure compliance, it is important to keep at the forefront not only collaborative planning from within the organization--involving security and safety, clinical, facilities and administrative staffs--but also collaborative planning with the local and regional businesses surrounding the organization.
The ORAC-DR data reduction pipeline
NASA Astrophysics Data System (ADS)
Cavanagh, B.; Jenness, T.; Economou, F.; Currie, M. J.
2008-03-01
The ORAC-DR data reduction pipeline has been used by the Joint Astronomy Centre since 1998. Originally developed for an infrared spectrometer and a submillimetre bolometer array, it has since expanded to support twenty instruments from nine different telescopes. By using shared code and a common infrastructure, rapid development of an automated data reduction pipeline for nearly any astronomical data is possible. This paper discusses the infrastructure available to developers and estimates the development timescales expected to reduce data for new instruments using ORAC-DR.
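The recipe/primitive structure that makes this reuse possible is easy to sketch. ORAC-DR itself is implemented in Perl; the following Python sketch only illustrates the pattern, with entirely hypothetical primitive and recipe names: instrument-agnostic processing steps are shared, and a recipe is just an ordered list of them.

```python
# Shared, instrument-agnostic primitives.
def subtract_dark(frame):
    frame["data"] -= frame["dark"]
    return frame

def flat_field(frame):
    frame["data"] /= frame["flat"]
    return frame

# A recipe composes shared primitives; new instruments mostly reuse them.
RECIPES = {
    "IMAGING_BASIC": [subtract_dark, flat_field],
}

def run_recipe(name, frame):
    for primitive in RECIPES[name]:
        frame = primitive(frame)
    return frame

frame = {"data": 100.0, "dark": 2.0, "flat": 0.98}
print(run_recipe("IMAGING_BASIC", frame))
```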
IHE cross-enterprise document sharing for imaging: interoperability testing software.
Noumeir, Rita; Renaud, Bérubé
2010-09-21
With the deployments of Electronic Health Records (EHR), interoperability testing in healthcare is becoming crucial. EHR enables access to prior diagnostic information in order to assist in health decisions. It is a virtual system that results from the cooperation of several heterogeneous distributed systems. Interoperability between peers is therefore essential. Achieving interoperability requires various types of testing. Implementations need to be tested using software that simulates communication partners, and that provides test data and test plans. In this paper we describe software used to test systems involved in sharing medical images within the EHR. Our software is used as part of the Integrating the Healthcare Enterprise (IHE) testing process to test the Cross Enterprise Document Sharing for imaging (XDS-I) integration profile. We describe its architecture and functionalities; we also expose the challenges encountered and discuss the chosen design solutions. EHR is being deployed in several countries. The EHR infrastructure will be continuously evolving to embrace advances in the information technology domain. Our software is built on a web framework to allow for easy evolution with web technology. The testing software is publicly available; it can be used by system implementers to test their implementations. It can also be used by site integrators to verify and test the interoperability of systems, or by developers to understand specification ambiguities, or to resolve implementation difficulties.
23 CFR 505.17 - Applicability of Title 23, U.S. Code.
Code of Federal Regulations, 2010 CFR
2010-04-01
... INFRASTRUCTURE MANAGEMENT PROJECTS OF NATIONAL AND REGIONAL SIGNIFICANCE EVALUATION AND RATING § 505.17... until expended and the Federal share of the cost of a Project of National and Regional Significance...
Improving Access to Transit Through Crowdsourced Information
DOT National Transportation Integrated Search
2017-11-01
The purpose of this research was to facilitate the ongoing collection of information from the public about potential areas of multimodal service and infrastructure improvements and easily share these problems with transit agencies, departments of tra...
Positioning infrastructure and technologies for low-carbon urbanization
NASA Astrophysics Data System (ADS)
Chester, Mikhail V.; Sperling, Josh; Stokes, Eleanor; Allenby, Braden; Kockelman, Kara; Kennedy, Christopher; Baker, Lawrence A.; Keirstead, James; Hendrickson, Chris T.
2014-10-01
The expected urbanization of the planet in the coming century coupled with aging infrastructure in developed regions, increasing complexity of man-made systems, and pressing climate change impacts have created opportunities for reassessing the role of infrastructure and technologies in cities and how they contribute to greenhouse gas (GHG) emissions. Modern urbanization is predicated on complex, increasingly coupled infrastructure systems, and energy use continues to be largely met from fossil fuels. Until energy infrastructures evolve away from carbon-based fuels, GHG emissions are critically tied to the urbanization process. Further complicating the challenge of decoupling urban growth from GHG emissions are lock-in effects and interdependencies. This paper synthesizes state-of-the-art thinking for transportation, fuels, buildings, water, electricity, and waste systems and finds that GHG emissions assessments tend to view these systems as static and isolated from social and institutional systems. Despite significant understanding of methods and technologies for reducing infrastructure-related GHG emissions, physical, institutional, and cultural constraints continue to work against us, pointing to knowledge gaps that must be addressed. This paper identifies three challenge themes to improve our understanding of the role of infrastructure and technologies in urbanization processes and position these increasingly complex systems for low-carbon growth. The challenges emphasize how we can reimagine the role of infrastructure in the future and how people, institutions, and ecological systems interface with infrastructure.
Scientific names of organisms: attribution, rights, and licensing
2014-01-01
Background: As biological disciplines extend into the ‘big data’ world, they will need a names-based infrastructure to index and interconnect distributed data. The infrastructure must have access to all names of all organisms if it is to manage all information. Those who compile lists of species hold different views as to the intellectual property rights that apply to the lists. This creates uncertainty that impedes the development of a much-needed infrastructure for sharing biological data in the digital world. Findings: The laws in the United States of America and European Union are consistent with the position that scientific names of organisms and their compilation in checklists, classifications or taxonomic revisions are not subject to copyright. Compilations of names, such as classifications or checklists, are not creative in the sense of copyright law. Many content providers desire credit for their efforts. Conclusions: A ‘blue list’ identifies elements of checklists, classifications and monographs to which intellectual property rights do not apply. To promote sharing, authors of taxonomic content, compilers, intermediaries, and aggregators should receive citable recognition for their contributions, with the greatest recognition being given to the originating authors. Mechanisms for achieving this are discussed.
Scalability Issues for Remote Sensing Infrastructure: A Case Study
Liu, Yang; Picard, Sean; Williamson, Carey
2017-01-01
For the past decade, a team of University of Calgary researchers has operated a large “sensor Web” to collect, analyze, and share scientific data from remote measurement instruments across northern Canada. This sensor Web receives real-time data streams from over a thousand Internet-connected sensors, with a particular emphasis on environmental data (e.g., space weather, auroral phenomena, atmospheric imaging). Through research collaborations, we had the opportunity to evaluate the performance and scalability of their remote sensing infrastructure. This article reports the lessons learned from our study, which considered both data collection and data dissemination aspects of their system. On the data collection front, we used benchmarking techniques to identify and fix a performance bottleneck in the system’s memory management for TCP data streams, while also improving system efficiency on multi-core architectures. On the data dissemination front, we used passive and active network traffic measurements to identify and reduce excessive network traffic from the Web robots and JavaScript techniques used for data sharing. While our results are from one specific sensor Web system, the lessons learned may apply to other scientific Web sites with remote sensing infrastructure.
Quality assessment concept of the World Data Center for Climate and its application to CMIP5 data
NASA Astrophysics Data System (ADS)
Stockhause, M.; Höck, H.; Toussaint, F.; Lautenschlager, M.
2012-08-01
The preservation of data in a high state of quality which is suitable for interdisciplinary use is one of the most pressing and challenging current issues in long-term archiving. For high-volume data such as climate model data, the data and their replicas are no longer stored centrally but are distributed over several local data repositories, e.g. the data of the Climate Model Intercomparison Project Phase 5 (CMIP5). The most important part of the data is to be archived, assigned a DOI, and published according to the World Data Center for Climate's (WDCC) application of the DataCite regulations. The data quality assessment, an integral part of WDCC's data publication process, was adapted to the requirements of a federated data infrastructure. A concept of a distributed and federated quality assessment procedure was developed, in which the workload and responsibility for quality control are shared between the three primary CMIP5 data centers: the Program for Climate Model Diagnosis and Intercomparison (PCMDI), the British Atmospheric Data Centre (BADC), and the WDCC. This distributed quality control concept, its pilot implementation for CMIP5, and first experiences are presented. The distributed quality control approach is capable of identifying data inconsistencies and of making quality results immediately available to data creators, data users and data infrastructure managers. Continuous publication of new data versions and slow data replication prevent the quality control checks from being completed. This, together with ongoing development of the data and metadata infrastructure, requires adaptations in the code and concept of the distributed quality control approach.
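One building block of such a shared procedure is an automatable metadata check that any of the three centers can run identically. The sketch below shows the flavor of such a check, not WDCC's actual code: it verifies that a file's global attributes include metadata required by the CMIP5 conventions, with a plain dict standing in for, e.g., NetCDF global attributes.

```python
# Required CMIP5-style global attributes (a representative subset).
REQUIRED_ATTRIBUTES = {"institution", "experiment_id", "frequency",
                       "model_id", "tracking_id"}

def check_required_metadata(global_attrs: dict) -> list:
    """Return the sorted list of missing required attributes (empty = pass)."""
    return sorted(REQUIRED_ATTRIBUTES - set(global_attrs))

attrs = {"institution": "MPI-M", "experiment_id": "historical",
         "frequency": "mon", "model_id": "MPI-ESM-LR"}
missing = check_required_metadata(attrs)
print("QC pass" if not missing else f"QC fail, missing: {missing}")
# -> QC fail, missing: ['tracking_id']
```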
Healthcare Information Technology Infrastructures in Turkey
Yuksel, M.; Ertürkmen, G. L.; Kabak, Y.; Namli, T.; Yıldız, M. H.; Ay, Y.; Ceyhan, B.; Hülür, Ü.; Öztürk, H.; Atbakan, E.
2014-01-01
Objectives: The objective of this paper is to describe some of the major healthcare information technology (IT) infrastructures in Turkey, namely, Sağlık-Net (Turkish for “Health-Net”), the Centralized Hospital Appointment System, the Basic Health Statistics Module, the Core Resources Management System, and the e-prescription system of the Social Security Institution. International collaboration projects that are integrated with Sağlık-Net are also briefly summarized. Methods: The authors provide a survey of some of the major healthcare IT infrastructures in Turkey. Results: Sağlık-Net has two main components: the National Health Information System (NHIS) and the Family Medicine Information System (FMIS). The NHIS is a nation-wide infrastructure for sharing patients’ Electronic Health Records (EHRs). So far, EHRs of 78.9 million people have been created in the NHIS. Similarly, family medicine is operational in the whole country via the FMIS. The Centralized Hospital Appointment System enables citizens to easily make appointments with healthcare providers. The Basic Health Statistics Module is used for collecting information about health status, risks and indicators across the country. The Core Resources Management System speeds up the flow of information between the headquarters and Provincial Health Directorates. The e-prescription system is linked with Sağlık-Net and seamlessly integrated with healthcare provider information systems. Finally, Turkey is involved in several international projects for experience sharing and disseminating national developments. Conclusion: With the introduction of the “Health Transformation Program” in 2003, a number of successful healthcare IT infrastructures have been developed in Turkey. Currently, work is going on to enhance and further improve their functionality.
Next generation terminology infrastructure to support interprofessional care planning.
Collins, Sarah; Klinkenberg-Ramirez, Stephanie; Tsivkin, Kira; Mar, Perry L; Iskhakova, Dina; Nandigam, Hari; Samal, Lipika; Rocha, Roberto A
2017-11-01
Develop a prototype of an interprofessional terminology and information model infrastructure that can enable care planning applications to facilitate patient-centered care, learn care plan linkages and associations, provide decision support, and enable automated, prospective analytics. The study followed a three-step approach: (1) process model and clinical scenario development; (2) requirements analysis; and (3) development and validation of information and terminology models. Components of the terminology model include: Health Concerns, Goals, Decisions, Interventions, Assessments, and Evaluations. A terminology infrastructure should: (A) include discrete care plan concepts; (B) include sets of profession-specific concerns, decisions, and interventions; (C) communicate rationales, anticipatory guidance, and guidelines that inform decisions among the care team; (D) define semantic linkages across clinical events and professions; (E) define sets of shared patient goals and sub-goals, including patient-stated goals; and (F) capture evaluation toward achievement of goals. These requirements were mapped to the AHRQ Care Coordination Measures Framework. This study used a constrained set of clinician-validated clinical scenarios. Terminology models for goals and decisions are unavailable in SNOMED CT, limiting the ability to evaluate these aspects of the proposed infrastructure. Defining and linking subsets of care planning concepts appears to be feasible, but also essential to model interprofessional care planning for common co-occurring conditions and chronic diseases. We recommend the creation of goal dynamics and decision concepts in SNOMED CT to further enable the necessary models. Systems with a flexible terminology management infrastructure may enable intelligent decision support to identify conflicting and aligned concerns, goals, decisions, and interventions in shared care plans, ultimately decreasing documentation effort and cognitive burden for clinicians and patients. Copyright © 2017 Elsevier Inc. All rights reserved.
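To make the component list concrete, here is a hedged sketch of the six care-plan components and their semantic linkages as plain data structures. The field names and example codes are illustrative only, not the authors' information model.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Concept:
    code: str        # e.g. a SNOMED CT code where one exists (illustrative)
    display: str
    profession: str  # supports profession-specific subsets (requirement B)

@dataclass
class Goal(Concept):
    patient_stated: bool = False                          # requirement (E)
    evaluations: List[str] = field(default_factory=list)  # requirement (F)

@dataclass
class HealthConcern(Concept):
    # Semantic linkages across components and professions (requirement D).
    goals: List[Goal] = field(default_factory=list)
    decisions: List[Concept] = field(default_factory=list)
    interventions: List[Concept] = field(default_factory=list)

concern = HealthConcern(
    "C001", "Tobacco use", "medicine",
    goals=[Goal("G001", "Smoking cessation", "nursing", patient_stated=True)],
)
print(concern.goals[0].display)  # -> Smoking cessation
```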
EVER-EST: a virtual research environment for Earth Sciences
NASA Astrophysics Data System (ADS)
Marelli, Fulvio; Albani, Mirko; Glaves, Helen
2016-04-01
There is an increasing requirement for researchers to work collaboratively using common resources whilst being geographically dispersed. By creating a virtual research environment (VRE) using a service oriented architecture (SOA) tailored to the needs of Earth Science (ES) communities, the EVER-EST project will provide a range of both generic and domain-specific data management services to support a dynamic approach to collaborative research. EVER-EST will provide the means to overcome existing barriers to the sharing of Earth Science data and information, allowing research teams to discover, access, share and process heterogeneous data, algorithms, results and experiences within and across their communities, including domains beyond Earth Science. Researchers will be able to seamlessly manage both the data involved in their computationally intensive disciplines and the scientific methods applied in their observations and modelling, which lead to the specific results that need to be attributable, validated and shared both within the community and more widely, e.g. in the form of scholarly communications. Central to the EVER-EST approach is the concept of the Research Object (RO), which provides a semantically rich mechanism to aggregate related resources about a scientific investigation so that they can be shared together using a single unique identifier. Although several e-laboratories are incorporating the research object concept in their infrastructure, the EVER-EST VRE will be the first infrastructure to leverage the concept of Research Objects and their application in observational rather than experimental disciplines. Development of the EVER-EST VRE will leverage the results of several previous projects which have produced state-of-the-art technologies for scientific data management and curation, as well as those which have developed models, techniques and tools for the preservation of scientific methods and their implementation in computational forms such as scientific workflows. The EVER-EST data processing infrastructure will be based on a Cloud Computing approach, in which new applications can be integrated using "virtual machines" that have their own specifications (disk size, processor speed, operating system etc.) and run on shared private (physical deployment over local hardware) or commercial Cloud infrastructures. The EVER-EST e-infrastructure will be validated by four virtual research communities (VRC) covering different multidisciplinary Earth Science domains including: ocean monitoring, natural hazards, land monitoring and risk management (volcanoes and seismicity). Each VRC will use the virtual research environment according to its own specific requirements for data, software, best practice and community engagement. This user-centric approach will allow an assessment to be made of the capability of the proposed solution to satisfy the heterogeneous needs of a variety of Earth Science communities for more effective collaboration, and higher efficiency and creativity in research. EVER-EST is funded by the European Commission's H2020 programme for three years starting in October 2015. The project is led by the European Space Agency (ESA) and involves some of the major European Earth Science data providers/users including NERC, DLR, INGV, CNR and SatCEN.
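The Research Object idea is essentially a manifest: one identifier that aggregates a study's resources together with annotations about them. The sketch below is loosely modelled on RO-style manifests and is illustrative only; it is not EVER-EST's actual schema, and the community and file names are hypothetical.

```python
import json
import uuid

research_object = {
    "@id": f"urn:uuid:{uuid.uuid4()}",    # the single unique identifier
    "createdBy": "ocean-monitoring VRC",  # hypothetical community
    "aggregates": [
        {"uri": "data/sst_2015.nc", "type": "Dataset"},
        {"uri": "workflows/upwelling.cwl", "type": "Workflow"},
        {"uri": "papers/preprint.pdf", "type": "Publication"},
    ],
    "annotations": [
        {"about": "data/sst_2015.nc",
         "content": "Daily sea-surface temperature, quality-controlled"},
    ],
}
print(json.dumps(research_object, indent=2))
```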
50 CFR 86.13 - What is boating infrastructure?
Code of Federal Regulations, 2011 CFR
2011-10-01
... (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE SPORT FISH RESTORATION PROGRAM BOATING INFRASTRUCTURE GRANT (BIG...) Floating docks and fixed piers; (g) Floating and fixed breakwaters; (h) Dinghy docks (floating or fixed...
EarthCube - A Community-led, Interdisciplinary Collaboration for Geoscience Cyberinfrastructure
NASA Astrophysics Data System (ADS)
Allison, M. L.; Keane, C. M.; Robinson, E.
2015-12-01
The EarthCube Test Enterprise Governance Project completed its initial two-year long process to engage the community and test a demonstration governing organization with the goal of facilitating a community-led process on designing and developing a geoscience cyberinfrastructure. Conclusions are that EarthCube is viable, has engaged a broad spectrum of end-users and contributors, and has begun to foster a sense of urgency around the importance of open and shared data. Levels of trust among participants are growing. At the same time, the active participants in EarthCube represent a very small sub-set of the larger population of geoscientists. Results from Stage I of this project have impacted NSF decisions on the direction of the EarthCube program. The overall tone of EarthCube events has had a constructive, problem-solving orientation. The technical and organizational elements of EarthCube are poised to support a functional infrastructure for the geosciences community. The process for establishing shared technological standards has notable progress but there is a continuing need to expand technological and cultural alignment. Increasing emphasis is being given to the interdependencies among EarthCube funded projects. The newly developed EarthCube Technology Plan highlights important progress in this area by five working groups focusing on: 1. Use cases; 2. Funded project gap analysis; 3. Testbed development; 4. Standards; and 5. Architecture. There is ample justification to continue running a community-led governance framework that facilitates agreement on a system architecture, guides EarthCube activities, and plays an increasing role in making the EarthCube vision of cyberinfrastructure for the geosciences operational. There is widespread community expectation for support of a multiyear EarthCube governing effort to put into practice the science, technical, and organizational plans that have and are continuing to emerge.
A novel technique for stiffening steel structures.
DOT National Transportation Integrated Search
2009-03-01
The use of composite materials for strengthening the ailing infrastructure has been steadily gaining acceptance and market share. One can state that this strengthening technique has become mainstream in some applications such as in strengthening of ...
The OSG open facility: A sharing ecosystem
Jayatilaka, B.; Levshina, T.; Rynge, M.; ...
2015-12-23
The Open Science Grid (OSG) ties together individual experiments' computing power, connecting their resources to create a large, robust computing grid. This computing infrastructure started primarily as a collection of sites associated with large HEP experiments such as ATLAS, CDF, CMS, and DZero. In the years since, the OSG has broadened its focus to also address the needs of other US researchers and increased delivery of Distributed High Throughput Computing (DHTC) to users from a wide variety of disciplines via the OSG Open Facility. Presently, the Open Facility delivers about 100 million computing wall hours per year to researchers who are not already associated with the owners of the computing sites. This is primarily accomplished by harvesting and organizing the temporarily unused capacity (i.e. opportunistic cycles) from the sites in the OSG. Using these methods, OSG resource providers and scientists share computing hours with researchers in many other fields to enable their science, striving to make sure that this computing power is used with maximal efficiency. Furthermore, we believe that expanded access to DHTC is an essential tool for scientific innovation, and work continues on expanding this service.
Swimming Upstream: Creating a Culture of High-Value Care.
Gupta, Reshma; Moriates, Christopher
2017-05-01
As health system leaders strategize the best ways to encourage the transition toward value-based health care, the underlying culture-defined as a system of shared assumptions, values, beliefs, and norms existing within an environment-continues to shape clinician practice patterns. The current prevailing medical culture contributes to overtesting, overtreatment, and health care waste. Choosing Wisely lists, appropriateness criteria, and guidelines codify best practices, but academic medicine as a whole must recognize that faculty and trainees are all largely still operating within the same cultural climate. Addressing this culture, on both local and national levels, is imperative for engaging clinicians in reforms and creating sustained changes that will deliver on the promise of better health care value. This Perspective outlines four steps for health system leaders to understand, cultivate, and maintain cultural changes toward value-based care: (1) Build the will for change through engaging frontline providers and communicating patient-centered motivations for health care value; (2) create necessary infrastructure to support value improvement efforts; (3) expose physicians to value-based payment structures; and (4) demonstrate leadership commitment and visibility to shared goals. The authors support their recommendations with concrete examples from emerging models and leaders across the country.
Community for Data Integration 2013 Annual Report
Chang, Michelle Y.; Carlino, Jennifer; Barnes, Christopher; Blodgett, David L.; Bock, Andrew R.; Everette, Anthony L.; Fernette, Gregory L.; Flint, Lorraine E.; Gordon, Janice M.; Govoni, David L.; Hay, Lauren E.; Henkel, Heather S.; Hines, Megan K.; Holl, Sally L.; Homer, Collin G.; Hutchison, Vivian B.; Ignizio, Drew A.; Kern, Tim J.; Lightsom, Frances L.; Markstrom, Steven L.; O'Donnell, Michael S.; Schei, Jacquelyn L.; Schmid, Lorna A.; Schoephoester, Kathryn M.; Schweitzer, Peter N.; Skagen, Susan K.; Sullivan, Daniel J.; Talbert, Colin; Warren, Meredith Pavlick
2015-01-01
grow overall USGS capabilities with data and information by increasing visibility of the work of many people throughout the USGS and the CDI community. To achieve these goals, the CDI operates within four applied areas: monthly forums, an annual workshop/webinar series, working groups, and projects. The monthly forums, also known as the Opportunity/Challenge of the Month, provide an open dialogue to share and learn about data integration efforts or to present problems that invite the Community to offer solutions, advice, and support. Since 2010, the CDI has also sponsored annual workshops/webinar series to encourage the exchange of ideas, sharing of activities, presentations of current projects, and networking among members. Stemming from common interests, the working groups are focused on efforts to address data management and technical challenges, including the development of standards and tools, improving interoperability and information infrastructure, and data preservation within USGS and its partners. The growing support for the activities of the working groups led to the CDI's first formal request for proposals (RFP) process in 2013 to fund projects that produced tangible products. Today the CDI continues to hold an annual RFP to fund projects that create data management tools and practices, collaboration tools, and training in support of data integration and delivery.
Developing standards for a national spatial data infrastructure
Wortman, Kathryn C.
1994-01-01
The concept of a framework for data and information linkages among producers and users, known as a National Spatial Data Infrastructure (NSDI), is built upon four corners: data, technology, institutions, and standards. Standards are paramount to increase the efficiency and effectiveness of the NSDI. Historically, data standards and specifications have been developed with a very limited scope - they were parochial, and even competitive in nature, and promoted the sharing of data and information within only a small community at the expense of more open sharing across many communities. Today, an approach is needed to grow and evolve standards to support open systems and provide consistency and uniformity among data producers. There are several significant ongoing activities in geospatial data standards: transfer or exchange, metadata, and data content. In addition, standards in other areas are under discussion, including data quality, data models, and data collection.
The European Network of Analytical and Experimental Laboratories for Geosciences
NASA Astrophysics Data System (ADS)
Freda, Carmela; Funiciello, Francesca; Meredith, Phil; Sagnotti, Leonardo; Scarlato, Piergiorgio; Troll, Valentin R.; Willingshofer, Ernst
2013-04-01
Integrating Earth Sciences infrastructures in Europe is the mission of the European Plate Observing System (EPOS). The integration of European analytical, experimental, and analogue laboratories plays a key role in this context and is the task of the EPOS Working Group 6 (WG6). Despite the presence in Europe of high-performance infrastructures dedicated to geosciences, there is still limited collaboration in sharing facilities and best practices. The EPOS WG6 aims to overcome this limitation by pushing towards national and trans-national coordination, efficient use of current laboratory infrastructures, and future aggregation of facilities not yet included. This will be attained through the creation of common access and interoperability policies to foster and simplify personnel mobility. The EPOS ambition is to orchestrate European laboratory infrastructures with diverse, complementary tasks and competences into a single, but geographically distributed, infrastructure for rock physics, palaeomagnetism, analytical and experimental petrology and volcanology, and tectonic modeling. The WG6 is presently organizing its thematic core services within the EPOS distributed research infrastructure with the goal of joining the other EPOS communities (geologists, seismologists, volcanologists, etc.) and stakeholders (engineers, risk managers and other geosciences investigators) to: 1) develop tools and services to enhance visitor programs that will mutually benefit visitors and hosts (transnational access); 2) improve support and training activities to make facilities equally accessible to students, young researchers, and experienced users (training and dissemination); 3) collaborate in sharing technological and scientific know-how (transfer of knowledge); 4) optimize interoperability of distributed instrumentation by standardizing data collection, archiving, and quality control (data preservation and interoperability); 5) implement a unified e-Infrastructure for data analysis, numerical modelling, and joint development and standardization of numerical tools (e-science implementation); 6) collect and store data in a flexible inventory database accessible within and beyond the Earth Sciences community (open access and outreach); 7) connect to environmental and hazard protection agencies, stakeholders, and the public to raise awareness of geo-hazards and geo-resources (innovation for society). We will inform scientists and industrial stakeholders about the most recent WG6 achievements in EPOS and show how our community is proceeding to design the thematic core services.
SFB754 - data management in large interdisciplinary collaborative research projects: what matters?
NASA Astrophysics Data System (ADS)
Mehrtens, Hela; Springer, Pina; Schirnick, Carsten; Schelten, Christiane K.
2016-04-01
Data management for the SFB 754 is an integral part of the joint data management team at GEOMAR Helmholtz Centre for Ocean Research Kiel, a cooperation of the Cluster of Excellence "Future Ocean", the SFB 754, and other current and former nationally and EU-funded projects. The coalition successfully established one common data management infrastructure for marine sciences in Kiel. It aims to help researchers better document the data lifecycle from acquisition to publication and share their results already during the project phase. The infrastructure is continuously improved by integrating standard tools and developing extensions in close cooperation with scientists, data centres and other research institutions. Open and frequent discussion of data management topics during SFB 754 meetings and seminars, and efficient cooperation with its coordination office, allowed the gradual establishment of better data management practices. Furthermore, a data policy was agreed upon that ensures proper usage of data sets, even unpublished ones, schedules data upload and dissemination, and enforces long-term public availability of the research outcome. Acceptance of the infrastructure is also backed by easy usage of the web-based platform for data set documentation and exchange among all research disciplines of the SFB 754. Members of the data management team act as data curators and assist in data publication in World Data Centres (e.g. PANGAEA). Cooperation with World Data Centres then makes the research data globally searchable and accessible, while links to the data producers ensure citability and provide points of contact for the scientific community. A complete record of SFB 754 publications is maintained within the institutional repository for full-text print publications by the GEOMAR library. This repository is strongly linked with the data management information system, providing dynamic and up-to-date overviews of the various ties between publications and available data sets, expeditions and projects. Such views are also frequently used for the website and reports by the SFB 754 scientific community. The concept of a joint approach initiated by large-scale projects and participating institutions in order to establish a single data management infrastructure has proven to be very successful. We have experienced a snowball-like propagation among marine researchers at GEOMAR and Kiel University, who continue to engage the data management services they know from collaboration with the SFB 754. But we also observe an ongoing demand for training of new junior (and senior) scientists and a continuous need for adaptation to new methods and techniques. Only standardized and consistent data management ensures completeness and integrity of published research data related to their peer-reviewed journal publications in the long run. Based on our daily experience, this is best achieved by skilled and experienced staff in a dedicated data management team which persists beyond the funding period of research projects. Such a team can effectively carry on and have impact through continuous personal contact, consultation and training of researchers on-site. (This poster is linked to the presentation by Dr. Christiane K. Schelten.)
Reviewing innovative Earth observation solutions for filling science-policy gaps in hydrology
NASA Astrophysics Data System (ADS)
Lehmann, Anthony; Giuliani, Gregory; Ray, Nicolas; Rahman, Kazi; Abbaspour, Karim C.; Nativi, Stefano; Craglia, Massimo; Cripe, Douglas; Quevauviller, Philippe; Beniston, Martin
2014-10-01
Improved data sharing is needed for hydrological modeling and water management, which require better integration of data, information, and models. Technological advances in Earth observation and Web technologies have allowed the development of Spatial Data Infrastructures (SDIs) for improved data sharing at various scales. International initiatives catalyze data sharing by promoting interoperability standards to maximize the use of data and by supporting easy access to and utilization of geospatial data. A series of recent European projects are contributing to the promotion of innovative Earth observation solutions and the uptake of scientific outcomes in policy. Several success stories involving different hydrologists' communities can be reported around the world. Gaps still exist in hydrological, agricultural, meteorological, and climatological data access for a variety of reasons. While many sources of data exist at all scales, it remains difficult and time-consuming to assemble hydrological information for most projects. Furthermore, data formats and sharing practices remain very heterogeneous. Improvements require implementing and endorsing commonly agreed standards and documenting data with adequate metadata. The brokering approach allows binding heterogeneous resources published by different data providers and adapting them to the tools and interfaces commonly used by consumers of these resources. The challenge is to provide decision-makers with reliable information, based on integrated data and tools derived from both Earth observations and scientific models. Successful SDIs therefore rely on several factors: a shared vision between all participants, the necessity to solve a common problem, adequate data policies, incentives, and sufficient resources. New data streams from remote sensing or crowdsourcing are also producing valuable information to improve our understanding of the water cycle, while field sensors are developing rapidly and becoming less costly. More recent data standards are enhancing interoperability between hydrology and other scientific disciplines, while solutions exist to communicate the uncertainty of data and models, an essential prerequisite for decision-making. Distributed computing infrastructures can handle complex and large hydrological data and models, while Web Processing Services bring the flexibility to develop and execute simple to complex workflows over the Internet. The need for capacity building at human, infrastructure, and institutional levels is also a major driver for reinforcing the commitment to SDI concepts.
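One building block mentioned above, the OGC Web Processing Service (WPS), lets clients run analysis steps over plain HTTP. The sketch below shows how a client might discover and invoke a WPS 1.0.0 process with Python's requests library; the endpoint URL, process identifier, and inputs are hypothetical placeholders, not a reference to any real deployment.

```python
# A minimal sketch, assuming a hypothetical WPS endpoint and process name,
# of how a client calls an OGC Web Processing Service over HTTP.
import requests

WPS_ENDPOINT = "https://example.org/wps"  # hypothetical endpoint

# Discover the processes a server offers (standard WPS 1.0.0 KVP request).
caps = requests.get(WPS_ENDPOINT, params={
    "service": "WPS",
    "request": "GetCapabilities",
    "version": "1.0.0",
})
caps.raise_for_status()

# Execute a hypothetical runoff-aggregation process; DataInputs uses the
# WPS key-value encoding "name=value;name=value".
resp = requests.get(WPS_ENDPOINT, params={
    "service": "WPS",
    "request": "Execute",
    "version": "1.0.0",
    "identifier": "aggregateRunoff",            # hypothetical process
    "datainputs": "basinId=42;interval=daily",  # hypothetical inputs
})
print(resp.text[:500])  # the server replies with an XML ExecuteResponse
```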
Raising Virtual Laboratories in Australia onto global platforms
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Barker, M.; Fraser, R.; Evans, B. J. K.; Moloney, G.; Proctor, R.; Moise, A. F.; Hamish, H.
2016-12-01
Across the globe, Virtual Laboratories (VLs), Science Gateways (SGs), and Virtual Research Environments (VREs) are being developed that enable users who are not co-located to actively work together at various scales to share data, models, tools, software, workflows, best practices, etc. Outcomes range from enabling 'long tail' researchers to more easily access specific data collections, to facilitating complex workflows on powerful supercomputers. In Australia, government funding has facilitated the development of a range of VLs through the National eResearch Collaborative Tools and Resources (NeCTAR) program. The VLs provide highly collaborative, research-domain-oriented, integrated software infrastructures that meet user community needs. Twelve VLs have been funded since 2012, including the Virtual Geophysics Laboratory (VGL); Virtual Hazards, Impact and Risk Laboratory (VHIRL); Climate and Weather Science Laboratory (CWSLab); Marine Virtual Laboratory (MarVL); and Biodiversity and Climate Change Virtual Laboratory (BCCVL). These VLs share similar technical challenges, with common issues emerging around the integration of tools and applications and access to data collections via both cloud-based environments and other distributed resources. While each VL began with a focus on a specific research domain, communities of practice have now formed across the VLs around these common issues, facilitating the identification of best-practice case studies and new standards. As a result, tools are now being shared, and the VLs access data via data services using international standards such as those of ISO, OGC, and W3C. The sharing of these approaches is starting to facilitate re-usability of infrastructure and is a step towards supporting interdisciplinary research. Whilst the focus of the VLs is Australia-centric, the use of standards means these environments can be extended to analyses of other international datasets. Many VL datasets are subsets of global datasets, so extension to global coverage is a small (and often requested) step. Similarly, most of the tools, software, and other technologies could be shared across infrastructures globally. It is therefore now time to better connect the Australian VLs with similar initiatives elsewhere to create international platforms that can contribute to global research challenges.
A synthesis study on collecting, managing, and sharing road construction asset data.
DOT National Transportation Integrated Search
2015-09-01
Accurate and complete construction records and as-built data are the key prerequisites to the effective management of transportation infrastructure assets throughout their life cycle. The construction phase is the best time to collect such data. ...
Development and evaluation of infrastructure strategies for safer cycling.
DOT National Transportation Integrated Search
2017-01-01
In recent years there has been an increasing number of recreational and bicycle commuters in the United States. Although bicycle users still represent a very small mode share, municipalities have been attempting to further encourage the health, e...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Eric W; Rames, Clement L; Bedir, Abdulkadir
This report analyzes plug-in electric vehicle (PEV) infrastructure needs in California from 2017 to 2025 in a scenario where the State's zero-emission vehicle (ZEV) deployment goals are achieved by household vehicles. The statewide infrastructure needs are evaluated using the Electric Vehicle Infrastructure Projection tool, which incorporates representative statewide travel data from the 2012 California Household Travel Survey. The infrastructure solution presented in this assessment addresses two primary objectives: (1) enabling travel for battery electric vehicles and (2) maximizing the electric vehicle-miles traveled for plug-in hybrid electric vehicles. The analysis is performed at the county level for each year between 2017 and 2025 while considering potential technology improvements. The results from this study present an infrastructure solution that can facilitate market growth for PEVs to reach the State's ZEV goals by 2025. The overall results show a need for 99k-130k destination chargers, including workplaces and public locations, and 9k-25k fast chargers. The results also show a need for dedicated or shared residential charging solutions at multi-family dwellings, which are expected to host about 120k PEVs by 2025. Improving on the existing literature, this analysis demonstrates the significance of infrastructure reliability and accessibility in quantifying charger demand.
Factors affecting willingness to share electronic health data among California consumers.
Kim, Katherine K; Sankar, Pamela; Wilson, Machelle D; Haynes, Sarah C
2017-04-04
Robust technology infrastructure is needed to enable learning health care systems to improve quality, access, and cost. Such infrastructure relies on the trust and confidence of individuals to share their health data for healthcare and research. Few studies have addressed consumers' views on electronic data sharing, and fewer still have explored the dual purposes of healthcare and research together. The objective of this study is to explore factors that affect consumers' willingness to share electronic health information for healthcare and research. The study involved a random-digit-dial telephone survey of 800 adult Californians conducted in English and Spanish. Logistic regression was performed using backward selection to test for significant (p-value ≤ 0.05) associations of each explanatory variable with the outcome variable. The odds of consent to electronic data sharing for healthcare decreased as Likert-scale ratings of EHR impact worsened for privacy, odds ratio (OR) = 0.74, 95% CI [0.60, 0.90]; security, OR = 0.80, 95% CI [0.66, 0.98]; and quality, OR = 0.59, 95% CI [0.46, 0.75]. The odds of consent to sharing for research were greater for those who think EHRs will improve research quality, OR = 11.26, 95% CI [4.13, 30.73], and those who value research benefit over privacy, OR = 2.72, 95% CI [1.55, 4.78], and lower for those who value control over research benefit, OR = 0.49, 95% CI [0.26, 0.94]. Consumers' choices about electronically sharing health information are affected by their attitudes toward EHRs as well as beliefs about research benefit and individual control. The design of person-centered interventions utilizing electronically collected health information, and of policies regarding data sharing, should address these values. Understanding these perspectives is critical for leveraging health data to support learning health care systems.
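For readers unfamiliar with the method, the sketch below reproduces the general shape of this analysis: fit a logistic regression, then exponentiate its coefficients into odds ratios with 95% confidence intervals. The data are simulated and the variable names illustrative; this is not the study's actual dataset or model specification.

```python
# A minimal sketch of logistic regression with odds ratios, on fake data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey data: Likert ratings (1-5) of EHR impact and a
# binary consent outcome generated from an assumed underlying model.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "privacy_rating": rng.integers(1, 6, 500),
    "security_rating": rng.integers(1, 6, 500),
})
logit_p = -0.5 - 0.3 * df["privacy_rating"] + 0.2 * df["security_rating"]
df["consent"] = (rng.random(500) < 1 / (1 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(df[["privacy_rating", "security_rating"]])
result = sm.Logit(df["consent"], X).fit(disp=False)

# Odds ratios and their 95% CIs are the exponentiated coefficients/bounds.
ci = result.conf_int()
print(pd.DataFrame({
    "OR": np.exp(result.params),
    "CI_low": np.exp(ci[0]),
    "CI_high": np.exp(ci[1]),
}))
```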
NASA Astrophysics Data System (ADS)
Albeke, S. E.; Perkins, D. G.; Ewers, S. L.; Ewers, B. E.; Holbrook, W. S.; Miller, S. N.
2015-12-01
The sharing of data and results is paramount for advancing scientific research. The Wyoming Center for Environmental Hydrology and Geophysics (WyCEHG) is a multidisciplinary group that is driving scientific breakthroughs to help manage water resources in the Western United States. WyCEHG is mandated by the National Science Foundation (NSF) to share its data. However, the infrastructure from which to share such diverse, complex, and massive amounts of data did not exist within the University of Wyoming. We developed an innovative framework to meet the data organization, sharing, and discovery requirements of WyCEHG by integrating both open and closed source software, embedded metadata tags, semantic web technologies, and a web-mapping application. The infrastructure uses a Relational Database Management System as the foundation, providing a versatile platform to store, organize, and query myriad datasets, taking advantage of both structured and unstructured formats. Detailed metadata are fundamental to the utility of datasets. We tag data with Uniform Resource Identifiers (URIs) to specify concepts with formal descriptions (i.e., semantic ontologies), thus allowing users to search metadata based on the intended context rather than relying on conventional keyword searches. Additionally, WyCEHG data are geographically referenced. Using the ArcGIS API for JavaScript, we developed a web mapping application leveraging database-linked spatial data services, providing a means to visualize and spatially query available data in an intuitive map environment. Using server-side scripting (PHP), the mapping application, in conjunction with semantic search modules, dynamically communicates with the database and file system, providing access to available datasets. Our approach provides a flexible, comprehensive infrastructure from which to store and serve WyCEHG's highly diverse research-based data. This framework has not only allowed WyCEHG to meet its data stewardship requirements but can also serve as a template for others to follow.
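As an illustration of this kind of semantic tagging, the sketch below attaches a formal concept URI to a dataset record using the rdflib library. The dataset namespace and identifiers are hypothetical, and the ENVO term shown is assumed to denote "soil"; it stands in for whatever ontology a project actually adopts.

```python
# A minimal sketch, with hypothetical dataset identifiers, of tagging
# metadata with URIs that resolve to formally described concepts.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import DCAT, DCTERMS, RDF

ENVO = Namespace("http://purl.obolibrary.org/obo/")      # OBO ontology namespace
DATA = Namespace("https://example.edu/wycehg/dataset/")  # hypothetical

g = Graph()
dataset = DATA["soil-moisture-2015"]
g.add((dataset, RDF.type, DCAT.Dataset))
g.add((dataset, DCTERMS.title, Literal("Soil moisture, Snowy Range, 2015")))
# Tag with a concept URI instead of a free-text keyword so that searches
# match on meaning (the ontology term) rather than on spelling.
g.add((dataset, DCTERMS.subject, ENVO["ENVO_00001998"]))  # assumed ENVO term for soil

print(g.serialize(format="turtle"))
```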
Dynamic VM Provisioning for TORQUE in a Cloud Environment
NASA Astrophysics Data System (ADS)
Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.
2014-06-01
Cloud computing in the form of Infrastructure-as-a-Service (IaaS) is attracting growing interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high-throughput computing clusters.
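A minimal sketch of that provisioning step is shown below, assuming an openstacksdk cloud profile plus image, flavor, and network names that are purely illustrative; tooling of the kind the paper describes would also handle node contextualisation, health checks, and tear-down, which are omitted here.

```python
# A minimal sketch: boot a worker VM on OpenStack, then register it with
# TORQUE so the Maui scheduler can place jobs on it. Cloud, image, flavor,
# and network names are hypothetical placeholders.
import subprocess
import openstack  # openstacksdk

conn = openstack.connect(cloud="mycloud")  # assumed entry in clouds.yaml

server = conn.create_server(
    name="torque-worker-01",
    image="worker-image",   # hypothetical image with pbs_mom preinstalled
    flavor="m1.large",
    network="cluster-net",  # hypothetical cluster network
    wait=True,              # block until the VM is ACTIVE
)

# Add the new VM to TORQUE's node list; pbs_server must be able to
# resolve the node's hostname for jobs to start there.
subprocess.run(["qmgr", "-c", f"create node {server.name}"], check=True)
print(f"{server.name} registered with TORQUE")
```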
NASA Astrophysics Data System (ADS)
Othman, Raha binti; Bakar, Muhamad Shahbani Abu; Mahamud, Ku Ruhana Ku
2017-10-01
While a Spatial Data Infrastructure (SDI) has been established in Malaysia, its full potential has yet to be realized. To a large degree, geospatial industry users hope that they can easily get access to the system and start utilizing the data. Some users expect the SDI to provide them with readily available data, without the steps of requesting the data from the data providers and of processing and preparing the data for their use. Some further argue that the usability of the system could be improved if an appropriate combination of data sharing and focused applications were found within the services. In order to address the current challenges and enhance the effectiveness of the SDI in Malaysia, there is the possibility of establishing a collaborative business venture between public and private entities, which could help address these issues and expectations. In this paper, we discuss the possibility of collaboration between these two kinds of entities. Interviews with seven entities were held to collect information on exposure, acceptance, and platform sharing. The outcomes indicate that although the growth of GIS technology and the high level of technology acceptance provide a solid basis for utilizing geospatial data, the absence of a concrete policy on data sharing, of quality geospatial data, and of an authoritative coordinating agency leaves a vacuum in the successful implementation of the SDI initiative.
IsoMAP (Isoscape Modeling, Analysis, and Prediction)
NASA Astrophysics Data System (ADS)
Miller, C. C.; Bowen, G. J.; Zhang, T.; Zhao, L.; West, J. B.; Liu, Z.; Rapolu, N.
2009-12-01
IsoMAP is a TeraGrid-based web portal aimed at building the infrastructure that brings together distributed multi-scale and multi-format geospatial datasets to enable statistical analysis and modeling of environmental isotopes. A typical workflow enabled by the portal includes (1) data source exploration and selection; (2) statistical analysis and model development; (3) predictive simulation of isotope distributions using models developed in (1) and (2); and (4) analysis and interpretation of simulated spatial isotope distributions (e.g., comparison with independent observations, pattern analysis). The gridded models and data products created by one user can be shared and reused among users within the portal, enabling collaboration and knowledge transfer. This infrastructure and the research it fosters can lead to fundamental changes in our knowledge of the water cycle and of ecological and biogeochemical processes through analysis of network-based isotope data, but it will be important (A) that those with whom the data and models are shared can be sure of the origin, quality, inputs, and processing history of these products, and (B) that the system is agile and intuitive enough to facilitate this sharing (rather than just 'allow' it). IsoMAP researchers are therefore building into the portal's architecture several components meant to increase the amount of metadata about users' products and to repurpose those metadata to make sharing and discovery more intuitive and robust for expected, professional users as well as unforeseen user populations from other sectors.
Infrastructure Analysis Tools: A Focus on Cash Flow Analysis (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Melaina, M.; Penev, M.
2012-09-01
NREL has developed and maintains a variety of infrastructure analysis models for the U.S. Department of Energy. Business case analysis has recently been added to this tool set. This presentation focuses on cash flow analysis. Cash flows depend upon infrastructure costs, optimized spatially and temporally, and assumptions about financing and revenue. NREL has incorporated detailed metrics on financing and incentives into the models. Next steps in modeling include continuing to collect feedback on regional/local infrastructure development activities and 'roadmap' dynamics, and incorporating consumer preference assumptions on infrastructure to provide direct feedback between vehicles and station rollout.
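To make the mechanics concrete, here is a minimal sketch of the kind of discounted cash flow calculation such tools automate. All numbers (capital cost, revenue, operating cost, discount rate) are invented for illustration and are not drawn from the NREL models.

```python
# A minimal sketch of a net present value (NPV) calculation for a
# hypothetical fueling/charging station; all figures are illustrative.
capex = 1_200_000          # year-0 construction cost ($)
annual_revenue = 300_000   # fuel/electricity sales ($/yr)
annual_opex = 120_000      # operations and maintenance ($/yr)
discount_rate = 0.08
years = 15

# Year 0 carries the capital outlay; years 1..N carry net operating cash.
cash_flows = [-capex] + [annual_revenue - annual_opex] * years

# Discount each year's net cash flow back to year 0 and sum.
npv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows))
print(f"NPV over {years} years: ${npv:,.0f}")
```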
Enabling cross-disciplinary research by linking data to Open Access publications
NASA Astrophysics Data System (ADS)
Rettberg, N.
2012-04-01
OpenAIREplus focuses on the linking of research data to associated publications. The interlinking of research objects has implications for optimising the research process, allowing the sharing, enrichment and reuse of data, and ultimately serving to make open data an essential part of first class research. The growing call for more concrete data management and sharing plans, apparent at funder and national level, is complemented by the increasing support for a scientific infrastructure that supports the seamless access to a range of research materials. This paper will describe the recently launched OpenAIREplus and will detail how it plans to achieve its goals of developing an Open Access participatory infrastructure for scientific information. OpenAIREplus extends the current collaborative OpenAIRE project, which provides European researchers with a service network for the deposit of peer-reviewed FP7 grant-funded Open Access publications. This new project will focus on opening up the infrastructure to data sources from subject-specific communities to provide metadata about research data and publications, facilitating the linking between these objects. The ability to link within a publication out to a citable database, or other research data material, is fairly innovative and this project will enable users to search, browse, view, and create relationships between different information objects. In this regard, OpenAIREplus will build on prototypes of so-called "Enhanced Publications", originally conceived in the DRIVER-II project. OpenAIREplus recognizes the importance of representing the context of publications and datasets, thus linking to resources about the authors, their affiliation, location, project data and funding. The project will explore how links between text-based publications and research data are managed in different scientific fields. This complements a previous study in OpenAIRE on current disciplinary practices and future needs for infrastructural Open Access services, taking into account the variety within research approaches. Adopting Linked Data mechanisms on top of citation and content mining, it will approach the interchange of data between generic infrastructures such as OpenAIREplus and subject specific service providers. The paper will also touch on the other challenges envisaged in the project with regard to the culture of sharing data, as well as IPR, licensing and organisational issues.
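Infrastructures of this kind typically exchange publication metadata over OAI-PMH. The sketch below harvests simple Dublin Core records and prints the dc:relation elements, one common carrier for links from a publication to an external dataset; the base URL is a hypothetical placeholder for a repository endpoint.

```python
# A minimal sketch, assuming a hypothetical OAI-PMH base URL, of harvesting
# Dublin Core records and extracting publication-to-dataset links.
import requests
import xml.etree.ElementTree as ET

BASE_URL = "https://example.org/oai"  # hypothetical repository endpoint

resp = requests.get(BASE_URL, params={
    "verb": "ListRecords",
    "metadataPrefix": "oai_dc",  # simple Dublin Core, mandatory in OAI-PMH
})
resp.raise_for_status()
root = ET.fromstring(resp.content)

# dc:relation often holds identifiers (e.g. dataset DOIs) linking a
# publication to related research objects.
DC = "{http://purl.org/dc/elements/1.1/}"
for rel in root.iter(DC + "relation"):
    print(rel.text)
```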
Cyber Compendium, Professional Continuing Education Course Papers. Volume 2, Issue 1, Spring 2015
2015-01-01
250,000 mobile devices.23 Cutting just 20% of this infrastructure would translate to a potential savings of over $7 Billion per year. Air...BYOD device extends beyond the device itself. There is the supporting infrastructure such as the Mobile Application Store (MAS), which provides access...96 Prioritizing Cyber Capabilities: to Protect U.S. Critical Infrastructure
30 CFR 219.410 - What does this subpart contain?
Code of Federal Regulations, 2010 CFR
2010-07-01
... coastal protection, including conservation, coastal restoration, hurricane protection, and infrastructure directly affected by coastal wetland losses. (2) Mitigation of damage to fish, wildlife, or natural... coastal political subdivisions within those States; and to the Land and Water Conservation Fund. Shared...
DOT National Transportation Integrated Search
1997-01-01
Intelligent transportation systems (ITS) use advances in communications, computer and information systems to create technologies that can improve traffic, transit and commercial vehicle operations. Essentially, ITS provides the right people in the tr...
Instinctive analytics for coalition operations (Conference Presentation)
NASA Astrophysics Data System (ADS)
de Mel, Geeth R.; La Porta, Thomas; Pham, Tien; Pearson, Gavin
2017-05-01
The success of future military coalition operations—be they combat or humanitarian—will increasingly depend on a system's ability to share data and processing services (e.g. aggregation, summarization, fusion) and to automatically compose services in support of complex tasks at the network edge. We call such an infrastructure instinctive—i.e., an infrastructure that reacts instinctively to address the analytics task at hand. However, developing such an infrastructure is made complex in the coalition environment by its dynamism, both in terms of user requirements and service availability. In order to address this challenge, in this paper we highlight our research vision and sketch some initial solutions in the problem domain. Specifically, we propose means to (1) automatically infer formal task requirements from mission specifications; (2) discover data, services, and their features automatically to satisfy the identified requirements; (3) create and augment shared domain models automatically; (4) efficiently offload services to the network edge and across coalition boundaries, adhering to their computational properties and costs; and (5) optimally allocate and adjust services while respecting the constraints of the operating environment and service fit. We envision that the research will result in a framework that enables self-description, discovery, and assembly capabilities for both data and services in support of coalition mission goals.
NASA Astrophysics Data System (ADS)
Garschagen, Matthias; Sandholz, Simone
2018-04-01
Increased attention has lately been given to the resilience of critical infrastructure in the context of natural hazards and disasters. The major focus therein is on the sensitivity of critical infrastructure technologies and their management contingencies. However, strikingly little attention has been given to assessing and mitigating social vulnerabilities towards the failure of critical infrastructure and to the development, design and implementation of minimum supply standards in situations of major infrastructure failure. Addressing this gap and contributing to a more integrative perspective on critical infrastructure resilience is the objective of this paper. It asks what role social vulnerability assessments and minimum supply considerations can, should and do - or do not - play in the management and governance of critical infrastructure failure. In its first part, the paper provides a structured review of achievements and remaining gaps in the management of critical infrastructure and in the understanding of social vulnerabilities towards disaster-related infrastructure failures. Special attention is given to the current state of minimum supply concepts, with a regional focus on policies in Germany and the EU. In its second part, the paper responds to the identified gaps by developing a heuristic model of the linkages between critical infrastructure management, social vulnerability and minimum supply. This framework helps to inform a vision of a future research agenda, which is presented in the paper's third part. Overall, the analysis suggests that the assessment of socially differentiated vulnerabilities towards critical infrastructure failure needs to be undertaken more stringently to inform the scientifically and politically difficult debate about minimum supply standards and the shared responsibilities for securing them.
Building the Synergy between Public Sector and Research Data Infrastructures
NASA Astrophysics Data System (ADS)
Craglia, Massimo; Friis-Christensen, Anders; Ostländer, Nicole; Perego, Andrea
2014-05-01
INSPIRE is a European Directive aiming to establish an EU-wide spatial data infrastructure giving cross-border access to information that can be used to support EU environmental policies, as well as other policies and activities having an impact on the environment. In order to ensure cross-border interoperability of data infrastructures operated by EU Member States, INSPIRE sets out a framework based on common specifications for metadata, data, network services, data and service sharing, and monitoring and reporting. The implementation of INSPIRE has reached important milestones: the INSPIRE Geoportal was launched in 2011, providing a single access point for the discovery of INSPIRE data and services across EU Member States (currently about 300K), and all the technical specifications for the interoperability of data across the 34 INSPIRE themes were adopted at the end of 2013. During this period a number of EU and international initiatives have been launched concerning cross-domain interoperability and (Linked) Open Data. In particular, the EU Open Data Portal, launched in December 2012, made provisions to access government and scientific data from EU institutions and bodies, and the EU ISA Programme (Interoperability Solutions for European Public Administrations) promotes cross-sector interoperability by sharing and re-using EU-wide and national standards and components. Moreover, the Research Data Alliance (RDA), an initiative jointly funded by the European Commission, the US National Science Foundation and the Australian Research Council, was launched in March 2013 to promote scientific data sharing and interoperability. The Joint Research Centre of the European Commission (JRC), besides being the technical coordinator of the implementation of INSPIRE, is also actively involved in the initiatives promoting cross-sector re-use in INSPIRE and in sustainable approaches to the evolution of technologies - in particular, how to support Linked Data in INSPIRE and the use of global persistent identifiers. It is evident that government and scientific data infrastructures are currently facing a number of issues that have already been addressed in INSPIRE. Sharing experiences and competencies will avoid re-inventing the wheel and help promote the cross-domain adoption of consistent solutions. Indeed, one of the lessons learnt from INSPIRE and the initiatives in which JRC is involved is that government and research data are not two separate worlds. Government data are commonly used as a basis to create scientific data, and vice versa. Consequently, it is fundamental to adopt a consistent approach to the interoperability and data management issues shared by both government and scientific data. The presentation illustrates some of the lessons learnt during the implementation of INSPIRE and in work on data and service interoperability coordinated with European and international initiatives. We describe a number of critical interoperability issues and barriers affecting both scientific and government data, concerning, e.g., data terminologies, quality and licensing, and propose how these problems could be effectively addressed by a closer collaboration of the government and scientific communities and the sharing of experiences and practices.
DataSync - sharing data via filesystem
NASA Astrophysics Data System (ADS)
Ulbricht, Damian; Klump, Jens
2014-05-01
Research work usually follows a cycle: hypothesize, collect data, corroborate the hypothesis, and finally publish the results. At several points in this sequence researchers can build on the work of others. Candidate physical samples may already be listed in the IGSN registry, removing the need for a field excursion to acquire new ones. The DataCite catalogue may already list metadata of datasets that meet the constraints of the hypothesis and are open for reappraisal. After all, working with the measured data to corroborate the hypothesis involves new methods, proven methods, and a variety of software tools. A body of intermediate data is created that can be shared with colleagues to discuss the research progress and receive a first evaluation. Consequently, the intermediate data should be versioned, so that it is easy to return to the last valid state when an analysis turns out to be on the wrong track. Things look different for project managers: they want to know what is currently being done, what has been done, and what the last valid data are in case somebody has to continue the work. To make life easier for members of small science projects, we developed DataSync [1], a software tool for sharing and versioning data. DataSync is designed to synchronize directory trees between different computers of a research team over the internet. The software is a Java application that watches a local directory tree for changes, which are replicated as eSciDoc-objects into an eSciDoc-infrastructure [2] using the eSciDoc REST API. Modifications to the local filesystem automatically create a new version of an eSciDoc-object inside the eSciDoc-infrastructure. In this way individual folders can be shared between team members, while project managers can get a general idea of the current status by synchronizing whole project inventories. Additionally, XML metadata from separate files can be managed together with data files inside the eSciDoc-objects. While DataSync's major task is to distribute directory trees, we complement its functionality with the PHP-based application panMetaDocs [3]. panMetaDocs is the successor to panMetaWorks [4] and inherits most of its functionality. Through an internet browser, panMetaDocs provides a web-based overview of the datasets inside the eSciDoc-infrastructure. The software allows users to upload further data, to add and edit metadata using the metadata editor, and to disseminate metadata through various channels. In addition, previous versions of a file can be downloaded, and access rights can be defined on files and folders to control the visibility of files for users of both panMetaDocs and DataSync. panMetaDocs serves as a publication agent for datasets and as a registration agent for dataset DOIs. The application stack presented here allows sharing, versioning, and central storage of data from the very beginning of project activities by using the file synchronization service DataSync. The web application panMetaDocs complements the functionality of DataSync by providing a dataset publication agent and other tools to handle administrative tasks on the data. [1] http://github.com/ulbricht/datasync [2] http://github.com/escidoc [3] http://panmetadocs.sf.net [4] http://metaworks.pangaea.de
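The directory-watching behaviour described above can be sketched in a few lines. The example below uses Python's watchdog library rather than DataSync's actual Java implementation, and the upload step is a hypothetical stub standing in for the eSciDoc REST call.

```python
# A minimal sketch of the directory-watching half of a tool like DataSync.
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

def push_new_version(path):
    # Stub: a real client would PUT the file to the repository via its
    # REST API, creating a new version of the corresponding object.
    print(f"would upload new version of {path}")

class SyncHandler(FileSystemEventHandler):
    def on_modified(self, event):
        if not event.is_directory:
            push_new_version(event.src_path)

observer = Observer()
observer.schedule(SyncHandler(), path="project-data/", recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)  # keep the process alive while events arrive
except KeyboardInterrupt:
    observer.stop()
observer.join()
```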
The involvement of parents in the healthcare provided to hospitalized children.
Melo, Elsa Maria de Oliveira Pinheiro de; Ferreira, Pedro Lopes; Lima, Regina Aparecida Garcia de; Mello, Débora Falleiros de
2014-01-01
To analyze the answers of parents and health care professionals concerning the involvement of parents in the care provided to hospitalized children. An exploratory study based on the conceptual framework of pediatric healthcare, with qualitative data analysis. Three dimensions of involvement were highlighted: daily care provided to children, opinions concerning the involvement of parents, and continuity of care, with aspects related to the presence and participation of parents, benefits to the child and family, information needs, responsibility, right to healthcare, hospital infrastructure, care delivery, communication between the parents and health services, shared learning, and follow-up after discharge. The involvement of parents in the care provided to their children has many meanings for parents, nurses and doctors. Specific strategies need to be developed with and for parents in order to mobilize parental competencies and contribute to increasing their autonomy and decision-making concerning the care provided to children.
Genomics-Based Security Protocols: From Plaintext to Cipherprotein
NASA Technical Reports Server (NTRS)
Shaw, Harry; Hussein, Sayed; Helgert, Hermann
2011-01-01
The evolving nature of the internet will require continual advances in authentication and confidentiality protocols. Nature provides some clues as to how this can be accomplished in a distributed manner through molecular biology. Cryptography and molecular biology share certain aspects and operations that allow for a set of unified principles to be applied to problems in either venue. A concept for developing security protocols that can be instantiated at the genomics level is presented. A DNA (Deoxyribonucleic acid) inspired hash code system is presented that utilizes concepts from molecular biology. It is a keyed-Hash Message Authentication Code (HMAC) capable of being used in secure mobile Ad hoc networks. It is targeted for applications without an available public key infrastructure. Mechanics of creating the HMAC are presented as well as a prototype HMAC protocol architecture. Security concepts related to the implementation differences between electronic domain security and genomics domain security are discussed.
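For orientation, the sketch below shows a conventional keyed HMAC built from Python's standard library, i.e. the primitive whose role the DNA-inspired construction is designed to fill; the genomic encoding itself is not reproduced here.

```python
# A minimal sketch of a conventional keyed HMAC using the standard library.
import hmac
import hashlib

key = b"shared-secret-key"        # distributed out of band (no PKI assumed)
message = b"sensor reading: 42.7"

# Sender computes an authentication tag over the message.
tag = hmac.new(key, message, hashlib.sha256).hexdigest()
print(tag)

# Receiver recomputes the tag and compares in constant time to
# authenticate the sender and detect tampering.
expected = hmac.new(key, message, hashlib.sha256).hexdigest()
assert hmac.compare_digest(tag, expected)
```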
Biotechnology software in the digital age: are you winning?
Scheitz, Cornelia Johanna Franziska; Peck, Lawrence J; Groban, Eli S
2018-01-16
There is a digital revolution taking place and biotechnology companies are slow to adapt. Many pharmaceutical, biotechnology, and industrial bio-production companies believe that software must be developed and maintained in-house and that data are more secure on internal servers than on the cloud. In fact, most companies in this space continue to employ large IT and software teams and acquire computational infrastructure in the form of in-house servers. This is due to a fear of the cloud not sufficiently protecting in-house resources and the belief that their software is valuable IP. Over the next decade, the ability to quickly adapt to changing market conditions, with agile software teams, will quickly become a compelling competitive advantage. Biotechnology companies that do not adopt the new regime may lose on key business metrics such as return on invested capital, revenue, profitability, and eventually market share.
SMART Cougars: Development and Feasibility of a Campus-based HIV Prevention Intervention.
Ali, Samira; Rawwad, Tamara Al; Leal, Roberta M; Wilson, Maria I; Mancillas, Alberto; Keo-Meier, Becca; Torres, Luis R
2017-01-01
University campuses are promising sites for service implementation because they have the infrastructure to support services, offer access to an otherwise difficult to reach population, and prioritize knowledge sharing among all entities. As HIV rates continue to rise among minority young adults, the need to implement innovative programs at the university level also increases. The University of Houston's (UH) Substance Use, Mental Health, and HIV/AIDS Risk Assessment and Testing (SMART Cougars) program provides HIV testing and education, mental health, and substance abuse services and referrals to students on campus and in surrounding communities. The aim of this paper is to describe development and examine feasibility of SMART Cougars (SC). Using Bowen's feasibility framework, we found that SC produced a demand, was acceptable and appropriate, implemented without many challenges, and integrated among university and community settings. Combined, these factors and processes changed social norms around sexual health messages on campus.
EMSO ERIC - Ocean Consortium Facility for Europe and the World
NASA Astrophysics Data System (ADS)
Best, Mairi
2017-04-01
EMSO is forging ahead through the next challenge in Earth-Ocean science: how to co-ordinate ocean data acquisition, analysis and response across provincial, national, regional, and global scales. EMSO provides power, communications, sensors, and data infrastructure for continuous, high-resolution, real-time, interactive ocean observations across a truly multi- and interdisciplinary range of research areas including biology, geology, chemistry, physics, engineering, and computer science; from polar to tropical environments, through the water column down to the abyss. Eleven deep-sea and four shallow nodes span from the Arctic through the Atlantic and Mediterranean to the Black Sea. The EMSO Preparatory Phase (FP7) project led to the interim phase (involving 13 countries) of forming the legal entity, the EMSO European Research Infrastructure Consortium (EMSO-ERIC), officially created by the EC in 2016. The open user community, originally developed through ESONET (European Seafloor Observatory NETwork), follows on from the scientific community planning contributions of the ESONET-NoE (FP6) project. Further progress made through the FixO3 project (FP7) also contributes to this shared infrastructure. Coordination among nodes is being strengthened through the EMSOdev project (H2020), which is producing the EMSO Generic Instrument Module (EGIM) for standardised observations of temperature, pressure, salinity, dissolved oxygen, turbidity, chlorophyll fluorescence, currents, passive acoustics, pH, pCO2, and nutrients. Early installations are now being upgraded: in October 2015 EMSO-France deployed a second cable and junction box serving the Ligurian Sea node in order to monitor slope stability offshore Nice; in 2016 the EMSO Azores node received a major upgrade to double its observing capacity; and for EMSO-Italia the Capo Passero site is being installed and the Catania site is being upgraded. EMSOLINK will continue the expansion work. EMSO is a key player in international coordination projects such as CoopEUS/Coop+, ENVRI/ENVRIplus, and GOOS/EOOS. As such, EMSO not only brings together countries and disciplines, but allows the pooling of resources and coordination to assemble harmonised data into a comprehensive regional ocean picture, which it will then make available to researchers and stakeholders worldwide on an open and interoperable access basis.
Update on CERN Search based on SharePoint 2013
NASA Astrophysics Data System (ADS)
Alvarez, E.; Fernandez, S.; Lossent, A.; Posada, I.; Silva, B.; Wagner, A.
2017-10-01
CERN’s enterprise search solution “CERN Search” provides a central search facility for users and CERN service providers. A total of about 20 million public and protected documents from a wide range of document collections is indexed, including Indico, TWiki, Drupal, SharePoint, JACOW, E-group archives, EDMS, and CERN Web pages. In spring 2015, CERN Search was migrated to a new infrastructure based on SharePoint 2013. In the context of this upgrade, the document pre-processing and indexing process was redesigned and generalised. The new data feeding framework makes it possible to profit from new functionality and facilitates the long-term maintenance of the system.
CloudMan as a platform for tool, data, and analysis distribution.
Afgan, Enis; Chapman, Brad; Taylor, James
2012-11-27
Cloud computing provides an infrastructure that facilitates large-scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. With this customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves the accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions.
A data management and publication workflow for a large-scale, heterogeneous sensor network.
Jones, Amber Spackman; Horsburgh, Jeffery S; Reeder, Stephanie L; Ramírez, Maurier; Caraballo, Juan
2015-06-01
It is common for hydrology researchers to collect data using in situ sensors at high frequencies, for extended durations, and with spatial distributions that produce data volumes requiring infrastructure for data storage, management, and sharing. The availability and utility of these data in addressing scientific questions related to water availability, water quality, and natural disasters relies on effective cyberinfrastructure that facilitates transformation of raw sensor data into usable data products. It also depends on the ability of researchers to share and access the data in useable formats. In this paper, we describe a data management and publication workflow and software tools for research groups and sites conducting long-term monitoring using in situ sensors. Functionality includes the ability to track monitoring equipment inventory and events related to field maintenance. Linking this information to the observational data is imperative in ensuring the quality of sensor-based data products. We present these tools in the context of a case study for the innovative Urban Transitions and Aridregion Hydrosustainability (iUTAH) sensor network. The iUTAH monitoring network includes sensors at aquatic and terrestrial sites for continuous monitoring of common meteorological variables, snow accumulation and melt, soil moisture, surface water flow, and surface water quality. We present the overall workflow we have developed for effectively transferring data from field monitoring sites to ultimate end-users and describe the software tools we have deployed for storing, managing, and sharing the sensor data. These tools are all open source and available for others to use.
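A small example of one transformation in such a workflow is sketched below: flagging implausible raw sensor values with pandas before they enter a published data product. Column names, thresholds, and the error code are illustrative assumptions, not details of the iUTAH tools.

```python
# A minimal sketch of a range-check quality-control step on sensor data.
import pandas as pd

raw = pd.DataFrame({
    "timestamp": pd.date_range("2015-06-01", periods=5, freq="15min"),
    "water_temp_C": [14.2, 14.3, -999.0, 14.5, 51.0],  # -999: assumed error code
})

# Values outside an assumed plausible range (0-40 degC) are flagged rather
# than deleted, so the raw record survives alongside the QC'd product.
raw["qc_flag"] = "good"
raw.loc[~raw["water_temp_C"].between(0, 40), "qc_flag"] = "bad"
raw["water_temp_qc"] = raw["water_temp_C"].where(raw["qc_flag"] == "good")

print(raw)
```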
NASA Astrophysics Data System (ADS)
Sindrilaru, Elvin A.; Peters, Andreas J.; Adde, Geoffray M.; Duellmann, Dirk
2017-10-01
CERN has been developing and operating EOS as a disk storage solution successfully for over six years. The CERN deployment provides 135 PB and stores 1.2 billion replicas distributed over two computer centres. The deployment includes four LHC instances, a shared instance for smaller experiments, and, since last year, an instance for individual user data as well. The user instance represents the backbone of the CERNBOX file sharing service. New use cases like synchronisation and sharing, the planned migration to reduce AFS usage at CERN, and continuous growth have brought EOS new challenges. Recent developments include the integration and evaluation of various technologies for the transition from a single active in-memory namespace to a scale-out implementation distributed over many metadata servers. The new architecture aims to separate the data from the application logic and user interface code, thus providing flexibility and scalability to the namespace component. Another important goal is to provide EOS as a CERN-wide mounted filesystem with strong authentication, making it a single storage repository accessible via various services and front-ends (the /eos initiative). This required new developments in the security infrastructure of the EOS FUSE implementation. Furthermore, there was a series of improvements targeting the end-user experience, such as tighter consistency and latency optimisations. In collaboration with Seagate as an Openlab partner, EOS has complete integration of the OpenKinetic object drive cluster as a high-throughput, high-availability, low-cost storage solution. This contribution will discuss these three main development projects and present new performance metrics.
NASA Technical Reports Server (NTRS)
2005-01-01
The Transformational Concept of Operations (CONOPS) provides a long-term, sustainable vision for future U.S. space transportation infrastructure and operations. This vision presents an interagency concept, developed cooperatively by the Department of Defense (DoD), the Federal Aviation Administration (FAA), and the National Aeronautics and Space Administration (NASA), for the upgrade, integration, and improved operation of major infrastructure elements of the nation's space access systems. The interagency vision described in the Transformational CONOPS would transform today's space launch infrastructure into a shared system that supports worldwide operations for a variety of users. The system concept is sufficiently flexible and adaptable to support new types of missions for exploration, commercial enterprise, and national security, as well as to endure further into the future when space transportation technology may be sufficiently advanced to enable routine public space travel as part of the global transportation system. The vision for future space transportation operations is based on a system-of-systems architecture that integrates the major elements of the future space transportation system - transportation nodes (spaceports), flight vehicles and payloads, tracking and communications assets, and flight traffic coordination centers - into a transportation network that concurrently accommodates multiple types of mission operators, payloads, and vehicle fleets. This system concept also establishes a common framework for defining a detailed CONOPS for the major elements of the future space transportation system. The resulting set of four CONOPS describes the common vision for a shared future space transportation system (FSTS) infrastructure from a variety of perspectives.
Sharing Data and Analytical Resources Securely in a Biomedical Research Grid Environment
Langella, Stephen; Hastings, Shannon; Oster, Scott; Pan, Tony; Sharma, Ashish; Permar, Justin; Ervin, David; Cambazoglu, B. Barla; Kurc, Tahsin; Saltz, Joel
2008-01-01
Objectives To develop a security infrastructure to support controlled and secure access to data and analytical resources in a biomedical research Grid environment, while facilitating resource sharing among collaborators. Design A Grid security infrastructure, called Grid Authentication and Authorization with Reliably Distributed Services (GAARDS), is developed as a key architecture component of the NCI-funded cancer Biomedical Informatics Grid (caBIG™). GAARDS is designed to support, in a distributed environment, 1) efficient provisioning and federation of user identities and credentials; 2) group-based access control, with which resource providers can enforce policies based on community-accepted groups and local groups; and 3) management of a trust fabric so that policies can be enforced based on required levels of assurance. Measurements GAARDS is implemented as a suite of Grid services and administrative tools. It provides three core services: Dorian for management and federation of user identities, the Grid Trust Service for maintaining and provisioning a federated trust fabric within the Grid environment, and Grid Grouper for enforcing authorization policies based on both local and Grid-level groups. Results The GAARDS infrastructure is available as a stand-alone system and as a component of the caGrid infrastructure. More information about GAARDS can be accessed at http://www.cagrid.org. Conclusions GAARDS provides a comprehensive system to address the security challenges associated with environments in which resources may be located at different sites, requests to access the resources may cross institutional boundaries, and user credentials are created, managed, and revoked dynamically in a decentralized manner. PMID:18308979
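As a toy illustration of the group-based model that Grid Grouper enforces, the sketch below checks a user's access to a resource against a policy that names both Grid-level and local groups; all group, user, and resource names are invented for illustration.

```python
# A minimal sketch of group-based authorization: a resource policy names
# groups, and access is granted only if the user belongs to one of them.
GROUPS = {
    "grid:imaging-researchers": {"alice", "bob"},   # hypothetical Grid-level group
    "local:admins": {"carol"},                      # hypothetical local group
}

POLICY = {  # resource -> groups allowed to read it
    "/datasets/tumor-images": {"grid:imaging-researchers", "local:admins"},
}

def can_read(user: str, resource: str) -> bool:
    allowed_groups = POLICY.get(resource, set())
    return any(user in GROUPS.get(g, set()) for g in allowed_groups)

assert can_read("alice", "/datasets/tumor-images")
assert not can_read("mallory", "/datasets/tumor-images")
```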
NASA Astrophysics Data System (ADS)
Ross, A.; Little, M. M.
2013-12-01
NASA's Atmospheric Science Data Center (ASDC) is piloting the use of Geographic Information System (GIS) technology that can be leveraged for crisis planning, emergency response, and disaster management/awareness. Many different organizations currently use GIS tools and geospatial data during a disaster event. ASDC datasets have not been fully utilized by this community in the past because of the incompatible data formats in which ASDC holdings are archived. Through the successful implementation of this pilot effort and continued collaboration with the larger Homeland Defense and Department of Defense emergency management community through the Homeland Infrastructure Foundation-Level Data Working Group (HIFLD WG), our data will be easily accessible to those using GIS, increasing the ability to plan, respond, manage, and provide awareness during disasters. The HIFLD WG Partnership has expanded to include more than 5,900 mission partners representing the 14 executive departments, 98 agencies, 50 states (and 3 territories), and more than 700 private sector organizations to directly enhance the federal, state, and local governments' ability to support domestic infrastructure data gathering, sharing and protection, visualization, and spatial knowledge management. The HIFLD WG Executive Membership is led by representatives from the Department of Defense (DoD) Office of the Assistant Secretary of Defense for Homeland Defense and Americas' Security Affairs - OASD (HD&ASA); the Department of Homeland Security (DHS), National Protection and Programs Directorate's Office of Infrastructure Protection (NPPD IP); the National Geospatial-Intelligence Agency (NGA) Integrated Working Group - Readiness, Response and Recovery (IWG-R3); the Department of Interior (DOI) United States Geological Survey (USGS) National Geospatial Program (NGP); and the DHS Federal Emergency Management Agency (FEMA).
NASA Astrophysics Data System (ADS)
Bauer, J. R.; Rose, K.; Romeo, L.; Barkhurst, A.; Nelson, J.; Duran-Sesin, R.; Vielma, J.
2016-12-01
Efforts to prepare for and reduce the risk of hazards, from both natural and anthropogenic sources, that threaten our oceans and coasts require an understanding of the dynamics of and interactions between the physical, ecological, and socio-economic systems. Understanding these coupled dynamics is essential as offshore oil & gas exploration and production continue to push into harsher, more extreme environments where risks and uncertainty increase. However, working with these large, complex data from various sources and scales to assess risks and potential impacts associated with offshore energy exploration and production poses several challenges to researchers. In order to address these challenges, an integrated assessment model (IAM) was developed at the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) that combines a spatial data infrastructure and an online research platform to manage, process, analyze, and share these large, multidimensional datasets, research products, and the tools and models used to evaluate risk and reduce uncertainty for the entire offshore system, from the subsurface, through the water column, to coastal ecosystems and communities. Here, we will discuss the spatial data infrastructure and online research platform, NETL's Energy Data eXchange (EDX), that underpin the offshore IAM, providing information on how the framework combines multidimensional spatial data and spatio-temporal tools to evaluate risks across the complex matrix of potential environmental, social, and economic impacts stemming from modeled offshore hazard scenarios, such as oil spills or hurricanes. In addition, we will discuss the online analytics, tools, and visualization methods integrated into this framework that support availability of and access to data, and that allow for the rapid analysis and effective communication of analytical results to aid a range of decision-making needs.
Data hosting infrastructure for primary biodiversity data
2011-01-01
Background Today, an unprecedented volume of primary biodiversity data are being generated worldwide, yet significant amounts of these data have been and will continue to be lost after the conclusion of the projects tasked with collecting them. To get the most value out of these data it is imperative to seek a solution whereby these data are rescued, archived and made available to the biodiversity community. To this end, the biodiversity informatics community requires investment in processes and infrastructure to mitigate data loss and provide solutions for long-term hosting and sharing of biodiversity data. Discussion We review the current state of biodiversity data hosting and investigate the technological and sociological barriers to proper data management. We further explore the rescuing and re-hosting of legacy data, the state of existing toolsets and propose a future direction for the development of new discovery tools. We also explore the role of data standards and licensing in the context of data hosting and preservation. We provide five recommendations for the biodiversity community that will foster better data preservation and access: (1) encourage the community's use of data standards, (2) promote the public domain licensing of data, (3) establish a community of those involved in data hosting and archival, (4) establish hosting centers for biodiversity data, and (5) develop tools for data discovery. Conclusion The community's adoption of standards and development of tools to enable data discovery is essential to sustainable data preservation. Furthermore, the increased adoption of open content licensing, the establishment of data hosting infrastructure and the creation of a data hosting and archiving community are all necessary steps towards the community ensuring that data archival policies become standardized. PMID:22373257
An open and transparent process to select ELIXIR Node Services as implemented by ELIXIR-UK
Hancock, John M.; Game, Alf; Ponting, Chris P.; Goble, Carole A.
2017-01-01
ELIXIR is the European infrastructure established specifically for the sharing and sustainability of life science data. To provide up-to-date resources and services, ELIXIR needs to undergo a continuous process of refreshing the services provided by its national Nodes. Here we present the approach taken by ELIXIR-UK to address the advice by the ELIXIR Scientific Advisory Board that Nodes need to develop “mechanisms to ensure that each Node continues to be representative of the Bioinformatics efforts within the country”. ELIXIR-UK put in place an open and transparent process to identify potential ELIXIR resources within the UK during late 2015 and early to mid-2016. Areas of strategic strength were identified and Expressions of Interest in these priority areas were requested from the UK community. Criteria were established, in discussion with the ELIXIR Hub, and prospective ELIXIR-UK resources were assessed by an independent committee set up by the Node for this purpose. Of 19 resources considered, 14 were judged to be immediately ready to be included in the UK ELIXIR Node’s portfolio. A further five were placed on the Node’s roadmap for future consideration for inclusion. ELIXIR-UK expects to repeat this process regularly to ensure its portfolio continues to reflect its community’s strengths. PMID:28149502
ERIC Educational Resources Information Center
Training, 2012
2012-01-01
"Training" magazine taps 2012 Training Top 125 winners and Top 10 Hall of Famers to provide their learning and development best practices in each issue. In this article, Jamie Leitch (American Infrastructure) and Mary Beth Alexander (The Economical Insurance Group) share strategies for tuition reimbursement and professional designations.
ERIC Educational Resources Information Center
Olsen, Florence
2003-01-01
Colleges and universities are beginning to consider collaborating on open-source-code projects as a way to meet critical software and computing needs. Points out the attractive features of noncommercial open-source software and describes some examples in use now, especially for the creation of Web infrastructure. (SLD)
Rapid deployment of internet-connected environmental monitoring devices
USDA-ARS?s Scientific Manuscript database
Advances in electronic sensing and monitoring systems and the growth of the communications infrastructure have enabled users to gain immediate access to information and interaction with physical devices. To facilitate the uploading, viewing, and sharing of data via the internet, while avoiding the ...
DOT National Transportation Integrated Search
2016-07-14
This report describes the system requirements specifications (SyRS) for the use of mobile devices in a connected vehicle environment. Specifically, it defines the different types of requirements (functional, interface, performance, security, data, an...
Sepucha, Karen R; Simmons, Leigh H; Barry, Michael J; Edgman-Levitan, Susan; Licurse, Adam M; Chaguturu, Sreekanth K
2016-04-01
Shared decision making is a core component of population health strategies aimed at improving patient engagement. Massachusetts General Hospital's integration of shared decision making into practice has focused on the following three elements: developing a culture receptive to, and health care providers skilled in, shared decision making conversations; using patient decision aids to help inform and engage patients; and providing infrastructure and resources to support the implementation of shared decision making in practice. In the period 2005-15, more than 900 clinicians and other staff members were trained in shared decision making, and more than 28,000 orders for one of about forty patient decision aids were placed to support informed patient-centered decisions. We profile two different implementation initiatives that increased the use of patient decision aids at the hospital's eighteen adult primary care practices, and we summarize key elements of the shared decision making program. Project HOPE—The People-to-People Health Foundation, Inc.
DOT National Transportation Integrated Search
2012-10-01
Florida's transportation infrastructure must continually evolve to meet the demands of its growing population. Many jurisdictions are moving toward multimodal transportation systems that utilize existing infrastructure more efficiently, providing u...
Parallel digital forensics infrastructure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebrock, Lorie M.; Duggan, David Patrick
2009-10-01
This report documents the architecture and implementation of a parallel digital forensics (PDF) infrastructure. This infrastructure is necessary for supporting the design, implementation, and testing of new classes of parallel digital forensics tools. Digital forensics has become extremely difficult with data sets of one terabyte and larger. The only way to overcome the processing time of these large sets is to identify and develop new parallel algorithms for performing the analysis. To support algorithm research, a flexible base infrastructure is required. A candidate architecture for this base infrastructure was designed, instantiated, and tested by this project, in collaboration with New Mexico Tech. Previous infrastructures were not designed and built specifically for the development and testing of parallel algorithms. With the size of forensics data sets expected only to increase significantly, this type of infrastructure support is necessary for continued research in parallel digital forensics.
Operational models of infrastructure resilience.
Alderson, David L; Brown, Gerald G; Carlyle, W Matthew
2015-04-01
We propose a definition of infrastructure resilience that is tied to the operation (or function) of an infrastructure as a system of interacting components and that can be objectively evaluated using quantitative models. Specifically, for any particular system, we use quantitative models of system operation to represent the decisions of an infrastructure operator who guides the behavior of the system as a whole, even in the presence of disruptions. Modeling infrastructure operation in this way makes it possible to systematically evaluate the consequences associated with the loss of infrastructure components, and leads to a precise notion of "operational resilience" that facilitates model verification, validation, and reproducible results. Using a simple example of a notional infrastructure, we demonstrate how to use these models for (1) assessing the operational resilience of an infrastructure system, (2) identifying critical vulnerabilities that threaten its continued function, and (3) advising policymakers on investments to improve resilience. © 2014 Society for Risk Analysis.
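The paper's operational notion of resilience lends itself to a small worked illustration. The sketch below (Python with the networkx library; the topology, capacities, and component names are invented here, not taken from the paper) scores resilience as the fraction of baseline system function an operator retains after a component is lost:

```python
import networkx as nx

# Notional water network: two pumps feed a city from one source.
G = nx.DiGraph()
for u, v, cap in [
    ("source", "pump_A", 10), ("source", "pump_B", 8),
    ("pump_A", "city", 9), ("pump_B", "city", 8),
]:
    G.add_edge(u, v, capacity=cap)

baseline, _ = nx.maximum_flow(G, "source", "city")   # function when intact

G.remove_node("pump_A")                              # disruption: lose a component
degraded, _ = nx.maximum_flow(G, "source", "city")   # operator re-routes what it can

print(f"retained function after losing pump_A: {degraded / baseline:.0%}")
```

Re-running the last three lines for each component in turn is one simple way to rank critical vulnerabilities, in the spirit of the paper's second use case.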
O'Connor, Brian D.; Yuen, Denis; Chung, Vincent; Duncan, Andrew G.; Liu, Xiang Kun; Patricia, Janice; Paten, Benedict; Stein, Lincoln; Ferretti, Vincent
2017-01-01
As genomic datasets continue to grow, the feasibility of downloading data to a local organization and running analysis on a traditional compute environment is becoming increasingly problematic. Current large-scale projects, such as the ICGC PanCancer Analysis of Whole Genomes (PCAWG), the Data Platform for the U.S. Precision Medicine Initiative, and the NIH Big Data to Knowledge Center for Translational Genomics, are using cloud-based infrastructure to both host and perform analysis across large data sets. In PCAWG, over 5,800 whole human genomes were aligned and variant called across 14 cloud and HPC environments; the processed data was then made available on the cloud for further analysis and sharing. If run locally, an operation at this scale would have monopolized a typical academic data centre for many months, and would have presented major challenges for data storage and distribution. However, this scale is increasingly typical for genomics projects and necessitates a rethink of how analytical tools are packaged and moved to the data. For PCAWG, we embraced the use of highly portable Docker images for encapsulating and sharing complex alignment and variant calling workflows across highly variable environments. While successful, this endeavor revealed a limitation in Docker containers, namely the lack of a standardized way to describe and execute the tools encapsulated inside the container. As a result, we created the Dockstore (https://dockstore.org), a project that brings together Docker images with standardized, machine-readable ways of describing and running the tools contained within. This service greatly improves the sharing and reuse of genomics tools and promotes interoperability with similar projects through emerging web service standards developed by the Global Alliance for Genomics and Health (GA4GH). PMID:28344774
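As a rough illustration of the gap Dockstore fills (a hedged sketch, not Dockstore's actual API or descriptor format, which builds on standards such as CWL), the snippet below pairs a Docker image with a machine-readable description of how to invoke the tool inside it; the image name, descriptor fields, and helper function are hypothetical:

```python
import shlex

# Hypothetical descriptor: the real service uses standardized formats,
# not this ad hoc dictionary.
descriptor = {
    "image": "quay.io/example/bwa:0.7.17",    # illustrative image name
    "command": "bwa mem ref.fa reads.fq",     # invocation inside the container
}

def docker_invocation(desc, workdir="/data"):
    """Translate the descriptor into a concrete 'docker run' command line."""
    return (["docker", "run", "--rm",
             "-v", f"{workdir}:{workdir}", "-w", workdir,
             desc["image"]]
            + shlex.split(desc["command"]))

cmd = docker_invocation(descriptor)
print(" ".join(shlex.quote(part) for part in cmd))
# A runner with Docker installed could execute this via subprocess.run(cmd).
```

The point of standardizing the descriptor is that any compliant runner, on any of the 14 environments mentioned above, can construct the same invocation without tool-specific knowledge.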
Surjaningrum, Endang R; Minas, Harry; Jorm, Anthony F; Kakuma, Ritsuko
2018-01-01
Indonesian maternal health policies state that community health workers (CHWs) are responsible for detection and referral of pregnant women and postpartum mothers who might suffer from mental health problems (task-sharing). Although these policies have been in place for some time, reports on their implementation are scarce, possibly reflecting feasibility issues within the health system. This study examines the feasibility of task-sharing in integrated mental health care to identify perinatal depression in Surabaya, Indonesia. Semi-structured interviews were conducted with 62 participants representing four stakeholder groups in primary health care: program managers from the health office and the community, health workers and CHWs, mental health specialists, and service users. Questions on the feasibility were supported by vignettes about perinatal depression. WHO's health systems framework was applied to analyse the data using framework analysis. Findings indicated the policy initiative is feasible within the district health system. A strong basis within the health system for task-sharing in maternal mental health rests on health leadership and governance that open an opportunity for training and supervision, financing, and intersectoral collaboration. The infrastructure and resources in the city provide potential for a continuity of care. Nevertheless, feasibility is challenged by gaps between policy and practice, inadequate technological and information-system support, unresolved questions about workforce assignment and implementation strategies, and the lack of practical guidelines. The health system and resources in Surabaya provide opportunities for task-sharing to detect and refer cases of perinatal depression in an integrated mental health care system. Participation of the informal workforce might help close the gap in the provision of information on perinatal mental health.
Assessing the vulnerability of infrastructure to climate change on the Islands of Samoa
NASA Astrophysics Data System (ADS)
Fakhruddin, S. H. M.
2015-03-01
Pacific Islanders have been exposed to risks associated with climate change. Samoa, as one of the Pacific Islands, is prone to climatic hazards that will likely increase in coming decades, affecting coastal communities and infrastructure around the islands. Climate models do not predict a reduction of such disaster events in the future in Samoa; indeed, most predict an increase in such events. This paper identifies key infrastructure and their functions and status in order to provide an overall picture of relative vulnerability to climate-related stresses of such infrastructure on the island. By reviewing existing reports as well as holding a series of consultation meetings, a list of critical infrastructure was developed and shared with stakeholders for their consideration. An indicator-based vulnerability model (SIVM) was developed in collaboration with stakeholders to assess the vulnerability of selected infrastructure systems on the Samoan Islands. Damage costs were extracted from the Cyclone Evan recovery needs document. Criticality and capacity-to-repair data, on the other hand, were collected from stakeholders. Having stakeholder perspectives on these two issues was important because (a) criticality of a given infrastructure could be viewed differently among different stakeholders, and (b) stakeholders were the best available source (in this study) to estimate the capacity to repair non-physical damage to such infrastructure. Analysis of the results suggested a ranking of sectors from most vulnerable to least vulnerable: the transportation sector, the power sector, the water supply sector and the sewerage system.
Assessing the vulnerability of infrastructure to climate change on the Islands of Samoa
NASA Astrophysics Data System (ADS)
Fakhruddin, S. H. M.; Babel, M. S.; Kawasaki, A.
2015-06-01
Pacific Islanders have been exposed to risks associated with climate change. Samoa, as one of the Pacific Islands, is prone to climatic hazards that will likely increase in the coming decades, affecting coastal communities and infrastructure around the islands. Climate models do not predict a reduction of such disaster events in the future in Samoa; indeed, most predict an increase. This paper identifies key infrastructure and their functions and status in order to provide an overall picture of relative vulnerability to climate-related stresses of such infrastructure on the island. By reviewing existing reports as well as holding a series of consultation meetings, a list of critical infrastructure was developed and shared with stakeholders for their consideration. An indicator-based vulnerability model (SIVM) was developed in collaboration with stakeholders to assess the vulnerability of selected infrastructure systems on the Samoan Islands. Damage costs were extracted from the Cyclone Evan recovery needs document. Additionally, data on criticality and capacity to repair damage were collected from stakeholders. Having stakeholder perspectives on these two issues was important because (a) criticality of a given infrastructure could be viewed differently among different stakeholders, and (b) stakeholders were the best available source (in this study) to estimate the capacity to repair non-physical damage to such infrastructure. Analysis of the results suggested a ranking of sectors from most vulnerable to least vulnerable: the transportation sector, the power sector, the water supply sector and the sewerage system.
Rapid Arctic Changes due to Infrastructure and Climate (RATIC) in the Russian North
NASA Astrophysics Data System (ADS)
Walker, D. A.; Kofinas, G.; Raynolds, M. K.; Kanevskiy, M. Z.; Shur, Y.; Ambrosius, K.; Matyshak, G. V.; Romanovsky, V. E.; Kumpula, T.; Forbes, B. C.; Khukmotov, A.; Leibman, M. O.; Khitun, O.; Lemay, M.; Allard, M.; Lamoureux, S. F.; Bell, T.; Forbes, D. L.; Vincent, W. F.; Kuznetsova, E.; Streletskiy, D. A.; Shiklomanov, N. I.; Fondahl, G.; Petrov, A.; Roy, L. P.; Schweitzer, P.; Buchhorn, M.
2015-12-01
The Rapid Arctic Transitions due to Infrastructure and Climate (RATIC) initiative is a forum developed by the International Arctic Science Committee (IASC) Terrestrial, Cryosphere, and Social & Human working groups for developing and sharing new ideas and methods to facilitate the best practices for assessing, responding to, and adaptively managing the cumulative effects of Arctic infrastructure and climate change. An IASC white paper summarizes the activities of two RATIC workshops at the Arctic Change 2014 Conference in Ottawa, Canada and the 2015 Third International Conference on Arctic Research Planning (ICARP III) meeting in Toyama, Japan (Walker & Pierce, ed. 2015). Here we present an overview of the recommendations from several key papers and posters presented at these conferences with a focus on oil and gas infrastructure in the Russian north and comparison with oil development infrastructure in Alaska. These analyses include: (1) the effects of gas- and oilfield activities on the landscapes and the Nenets indigenous reindeer herders of the Yamal Peninsula, Russia; (2) a study of urban infrastructure in the vicinity of Norilsk, Russia; (3) an analysis of the effects of pipeline-related soil warming on trace-gas fluxes in the vicinity of Nadym, Russia; (4) two Canadian initiatives that address multiple aspects of Arctic infrastructure, called Arctic Development and Adaptation to Permafrost in Transition (ADAPT) and the ArcticNet Integrated Regional Impact Studies (IRIS); and (5) the effects of oilfield infrastructure on landscapes and permafrost in the Prudhoe Bay region, Alaska.
NASA Astrophysics Data System (ADS)
Asmi, A.; Sorvari, S.; Kutsch, W. L.; Laj, P.
2017-12-01
European long-term environmental research infrastructures (often referred to as ESFRI RIs) are the core facilities providing services for scientists in their quest to understand and predict the complex Earth system and its functioning, a quest that requires long-term efforts to identify environmental changes (trends, thresholds and resilience, interactions and feedbacks). Many of these research infrastructures were originally developed to respond to the needs of their specific research communities; however, strong collaboration among research infrastructures is clearly needed to serve trans-boundary research, which requires exploring scientific questions at the intersection of different scientific fields, conducting joint research projects and developing concepts, devices, and methods that can be used to integrate knowledge. European environmental research infrastructures have already worked together successfully for many years and have established a cluster - the ENVRI cluster - for their collaborative work. The ENVRI cluster acts as a collaborative platform where the RIs can jointly agree on common solutions for their operations, draft strategies and policies, and share best practices and knowledge. The supporting project for the ENVRI cluster, the ENVRIplus project, brings together 21 European research infrastructures and infrastructure networks to work on joint technical solutions, data interoperability, access management, training, strategies and dissemination efforts. The ENVRI cluster acts as a one-stop shop for multidisciplinary RI users, other collaborative initiatives, projects and programmes, and coordinates and implements jointly agreed RI strategies.
A game theory analysis of green infrastructure stormwater management policies
NASA Astrophysics Data System (ADS)
William, Reshmina; Garg, Jugal; Stillwell, Ashlynn S.
2017-09-01
Green stormwater infrastructure has been demonstrated as an innovative water resources management approach that addresses multiple challenges facing urban environments. However, there is little consensus on what policy strategies can be used to best incentivize green infrastructure adoption by private landowners. Game theory, an analysis framework that has historically been under-utilized within the context of stormwater management, is uniquely suited to address this policy question. We used a cooperative game theory framework to investigate the potential impacts of different policy strategies used to incentivize green infrastructure installation. The results indicate that municipal regulation leads to the greatest reduction in pollutant loading. However, the choice of the "best" regulatory approach will depend on a variety of different factors including politics and financial considerations. Large, downstream agents have a disproportionate share of bargaining power. Results also reveal that policy impacts are highly dependent on agents' spatial position within the stormwater network, leading to important questions of social equity and environmental justice.
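For readers unfamiliar with the cooperative-game framing, the toy sketch below computes Shapley values, one standard cooperative solution concept, for three hypothetical landowners; the players, coalition values, and units are invented for illustration and do not reproduce the authors' model:

```python
# Toy cooperative game: v(S) is the pollutant-load reduction (kg/yr) a
# coalition S of landowners achieves by coordinating green infrastructure.
from itertools import permutations

players = ["upstream", "midstream", "downstream"]
v = {
    frozenset(): 0,
    frozenset({"upstream"}): 10,
    frozenset({"midstream"}): 12,
    frozenset({"downstream"}): 20,
    frozenset({"upstream", "midstream"}): 30,
    frozenset({"upstream", "downstream"}): 38,
    frozenset({"midstream", "downstream"}): 40,
    frozenset({"upstream", "midstream", "downstream"}): 60,
}

def shapley(players, v):
    """Average each player's marginal contribution over all join orders."""
    phi = dict.fromkeys(players, 0.0)
    orders = list(permutations(players))
    for order in orders:
        coalition = frozenset()
        for p in order:
            phi[p] += v[coalition | {p}] - v[coalition]
            coalition = coalition | {p}
    return {p: total / len(orders) for p, total in phi.items()}

print(shapley(players, v))
```

In this invented game the downstream player's larger marginal contributions translate into the largest Shapley share, echoing the paper's finding that large downstream agents hold disproportionate bargaining power.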
Lindberg, D A; Humphreys, B L
1995-01-01
The High-Performance Computing and Communications (HPCC) program is a multiagency federal effort to advance the state of computing and communications and to provide the technologic platform on which the National Information Infrastructure (NII) can be built. The HPCC program supports the development of high-speed computers, high-speed telecommunications, related software and algorithms, education and training, and information infrastructure technology and applications. The vision of the NII is to extend access to high-performance computing and communications to virtually every U.S. citizen so that the technology can be used to improve the civil infrastructure, lifelong learning, energy management, health care, etc. Development of the NII will require resolution of complex economic and social issues, including information privacy. Health-related applications supported under the HPCC program and NII initiatives include connection of health care institutions to the Internet; enhanced access to gene sequence data; the "Visible Human" Project; and test-bed projects in telemedicine, electronic patient records, shared informatics tool development, and image systems. PMID:7614116
NASA Astrophysics Data System (ADS)
Alpi, Danielle Marie
The 16 sectors of critical infrastructure in the US are susceptible to cyber-attacks. Potential attacks come from internal and external threats. These attacks target the industrial control systems (ICS) of companies within critical infrastructure. Weakness in the energy sector's ICS, specifically the oil and gas industry, can result in economic and ecological disaster. The purpose of this study was to establish means for oil companies to identify and stop cyber-attacks, specifically advanced persistent threats (APTs). This research reviewed current cyber vulnerabilities and ways in which a cyber-attack may be deterred. This research found that there are insecure devices within ICS that are not regularly updated; as a result, security issues have amassed. Safety procedures and training thereof are often neglected. Jurisdiction is unclear in regard to critical infrastructure. The recommendations this research offers are further examination of information sharing methods, development of analytic platforms, and better methods for the implementation of defense-in-depth security measures.
The GEOSS solution for enabling data interoperability and integrative research.
Nativi, Stefano; Mazzetti, Paolo; Craglia, Max; Pirrone, Nicola
2014-03-01
Global sustainability research requires an integrative research effort underpinned by digital infrastructures (systems) able to harness data and heterogeneous information across disciplines. Digital data and information sharing across systems and applications is achieved by implementing interoperability: a property of a product or system to work with other products or systems, present or future. There are at least three main interoperability challenges a digital infrastructure must address: technological, semantic, and organizational. In recent years, important international programs and initiatives have focused on this ambitious objective. This manuscript presents and combines the studies and experiences carried out by three relevant projects focusing on the heavy metal domain: the Global Mercury Observation System, the Global Earth Observation System of Systems (GEOSS), and INSPIRE. This research work identified a valuable interoperability service bus (i.e., a set of standards, models, interfaces, and good practices) proposed to characterize the integrative research cyber-infrastructure of the heavy metal research community. The paper discusses how the GEOSS common infrastructure implements a multidisciplinary and participatory research infrastructure and introduces a possible roadmap for the heavy metal pollution research community to join GEOSS as a new Group on Earth Observation community of practice and develop a research infrastructure for carrying out integrative research in its specific domain.
Ports of Delaware Bay: Industry And Public Sector Cooperation For Information Sharing
2010-12-01
…high-density tunnels, stations, and bridges. Another $20 million was provided for intercity passenger rail to protect infrastructure and the traveling public. This commitment of funds at a time of shrinking budgets demonstrates the essential…
Integrating WEPP into the WEPS infrastructure
USDA-ARS?s Scientific Manuscript database
The Wind Erosion Prediction System (WEPS) and the Water Erosion Prediction Project (WEPP) share a common modeling philosophy, that of moving away from primarily empirically based models based on indices or "average conditions", and toward a more process based approach which can be evaluated using ac...
Building a School District's Wide Area Network.
ERIC Educational Resources Information Center
Mastel, Vern L.
1996-01-01
Describes the development of a wide area network (WAN) in the Bismarck Public School District (North Dakota). Topics include design goals, network infrastructure, implementing library access, sharing resources across platforms, electronic mail, dial-in access, Internet access, adhering to software licenses, shareware and freeware, and monitoring…
Optimized Infrastructure for the Earth System Prediction Capability
2013-09-30
for referencing memory between its native coupling datatype (MCT Attribute Vectors) and ESMF Arrays. This will reduce the copies required and will… The introduced ability within CESM to share memory between ESMF and MCT datatypes makes using both tools together much easier. Using both is appealing
Networking. New Opportunities for Partnering, CAUSE94. Track IV.
ERIC Educational Resources Information Center
CAUSE, Boulder, CO.
Seven papers are presented from the 1994 CAUSE conference track on networking and information sharing among higher education institutions. The papers include: (1) "Integrated Statewide Infrastructure of Learning Technologies," focusing on the University of Wisconsin System (Lee Alley); (2) "Designing and Implementing a Network…
Community-driven computational biology with Debian Linux.
Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles
2010-12-21
The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.
Integration of a neuroimaging processing pipeline into a pan-Canadian computing grid
NASA Astrophysics Data System (ADS)
Lavoie-Courchesne, S.; Rioux, P.; Chouinard-Decorte, F.; Sherif, T.; Rousseau, M.-E.; Das, S.; Adalat, R.; Doyon, J.; Craddock, C.; Margulies, D.; Chu, C.; Lyttelton, O.; Evans, A. C.; Bellec, P.
2012-02-01
The ethos of the neuroimaging field is quickly moving towards the open sharing of resources, including both imaging databases and processing tools. As a neuroimaging database represents a large volume of datasets and as neuroimaging processing pipelines are composed of heterogeneous, computationally intensive tools, such open sharing raises specific computational challenges. This motivates the design of novel dedicated computing infrastructures. This paper describes an interface between PSOM, a code-oriented pipeline development framework, and CBRAIN, a web-oriented platform for grid computing. This interface was used to integrate a PSOM-compliant pipeline for preprocessing of structural and functional magnetic resonance imaging into CBRAIN. We further tested the capacity of our infrastructure to handle a real large-scale project. A neuroimaging database including close to 1000 subjects was preprocessed using our interface and publicly released to help the participants of the ADHD-200 international competition. This successful experiment demonstrated that our integrated grid-computing platform is a powerful solution for high-throughput pipeline analysis in the field of neuroimaging.
Board on Research Data and Information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sztein, A. Ester; Boright, John
2015-08-14
The Board on Research Data and Information (BRDI) has planned and undertaken numerous activities regarding data citation, attribution, management, policy, publishing, centers, access, curation, sharing, and infrastructure, as well as international collaboration and cooperation. Some of these activities resulted in National Research Council reports (For Attribution: Developing Data Attribution and Citation Practices and Standards, 2012; The Case for International Scientific Data Sharing: A Focus on Developing Countries, 2012; and The Future of Scientific Knowledge Discovery in Open Networked Environments, 2012) and a peer-reviewed paper (Out of Cite, Out of Mind: The Current State of Practice, Policy, and Technology for the Citation of Data, 2013). BRDI held symposia, workshops and sessions in the U.S. and abroad on diverse topics such as global scientific data infrastructures, discovery of data online, privacy in a big data world, and data citation principles, among other timely data-related subjects. In addition, BRDI represents the United States before the International Council for Science's International Committee on Data for Science and Technology (CODATA).
Operating and Managing a Backup Control Center
NASA Technical Reports Server (NTRS)
Marsh, Angela L.; Pirani, Joseph L.; Bornas, Nicholas
2010-01-01
Due to the criticality of continuous mission operations, some control centers must plan for alternate locations in the event an emergency shuts down the primary control center. Johnson Space Center (JSC) in Houston, Texas is the Mission Control Center (MCC) for the International Space Station (ISS). Due to Houston's proximity to the Gulf of Mexico, JSC is prone to threats from hurricanes, which could cause flooding, wind damage, and electrical outages to the buildings supporting the MCC. Marshall Space Flight Center (MSFC) has the capability to serve as the Backup Control Center (BCC) for the ISS if the situation requires it. While the MSFC Huntsville Operations Support Center (HOSC) does house the BCC, the prime customer and operator of the ISS is still the JSC flight operations team. To satisfy the customer and maintain continuous mission operations, the BCC has critical infrastructure that hosts ISS ground systems and flight operations equipment that mirrors the prime mission control facility. However, a complete duplicate of the Mission Control Center in another remote location is very expensive to recreate. The HOSC has infrastructure and services that the MCC utilized for its backup control center, reducing the costs of a somewhat redundant service. While labor talents are equivalent, experiences are not. Certain operations are maintained in a redundant mode, while others are simply maintained as single string with adequate sparing levels of equipment. Personnel at the BCC facility must be trained and certified to an adequate level on primary MCC systems. Negotiations with the customer were done to match requirements with existing capabilities, and to prioritize resources for an appropriate level of service. Because some of these systems are shared, an activation of the backup control center will cause a suspension of scheduled HOSC activities that may share resources needed by the BCC. For example, suppose the MCC is monitoring a hurricane in the Gulf of Mexico. As the threat to the MCC increases, the HOSC must begin a phased activation of the BCC, while working resource conflicts with normal HOSC activities. In a long-duration outage of the MCC, this could cause serious impacts to the BCC host facility's primary mission support activities. This management of a BCC is worked based on customer expectations and negotiations done before emergencies occur.
Risk-based asset management methodology for highway infrastructure systems.
DOT National Transportation Integrated Search
2004-01-01
Maintaining the infrastructure of roads, highways, and bridges is paramount to ensuring that these assets will remain safe and reliable in the future. If maintenance costs remain the same or continue to escalate, and additional funding is not made av...
Commitment to Cybersecurity and Information Technology Governance: A Case Study and Leadership Model
ERIC Educational Resources Information Center
Curtis, Scipiaruth Kendall
2012-01-01
The continual emergence of new technologies has infiltrated government and industry business infrastructures, forcing the reform of organizations and of fragile network infrastructures. Emerging technologies necessitate countermeasures and a commitment to cybersecurity and information technology governance for an organization's survivability and sustainability.…
NASA Astrophysics Data System (ADS)
Kutsch, Werner Leo; Asmi, Ari; Laj, Paolo; Brus, Magdalena; Sorvari, Sanna
2016-04-01
ENVRIplus is a Horizon 2020 project bringing Environmental and Earth System Research Infrastructures, projects and networks together with technical specialist partners to create a more coherent, interdisciplinary and interoperable cluster of Environmental Research Infrastructures (RIs) across Europe. The objective of ENVRIplus is to provide common solutions to shared challenges for these RIs in their efforts to deliver new services for science and society. To reach this overall goal, ENVRIplus brings together the current ESFRI roadmap environmental and associated fields RIs, leading I3 projects, key developing RI networks and specific technical specialist partners to build common synergic solutions for pressing issues in RI construction and implementation. ENVRIplus is organized along 6 main objectives, referred to below as "Themes": 1) Improve the RIs' abilities to observe the Earth System, particularly in developing and testing new sensor technologies, harmonizing observation methodologies and developing methods to overcome common problems associated with distributed remote observation networks; 2) Generate common solutions for shared information technology and data-related challenges of the environmental RIs in data and service discovery and use, workflow documentation, data citation methodologies, service virtualization, and user characterization and interaction; 3) Develop harmonized policies for access (physical and virtual) for the environmental RIs, including access services for multidisciplinary users; 4) Investigate the interactions between RIs and society: find common approaches and methodologies to assess the RIs' ability to answer economic and societal challenges, develop ethics guidelines for RIs and investigate the possibility of enhancing the use of Citizen Science approaches in RI products and services; 5) Ensure the cross-fertilisation and knowledge transfer of new technologies, best practices, approaches and policies of the RIs by generating training material for RI personnel on the new observational, technological and computational tools and by facilitating inter-RI knowledge transfer via a staff exchange program; and 6) Create an RI communication and cooperation framework to coordinate activities of the environmental RIs towards common strategic development, improved user interaction and interdisciplinary cross-RI products and services. The produced solutions, services, systems and other project results are made available to all environmental research infrastructure initiatives.
NASA Astrophysics Data System (ADS)
Pinheiro da Silva, P.; CyberShARE Center of Excellence
2011-12-01
Scientists today face the challenge of rethinking the manner in which they document and make available their processes and data in an international cyber-infrastructure of shared resources. Some relevant examples of new scientific practices in the realm of computational and data extraction sciences include: large scale data discovery; data integration; data sharing across distinct scientific domains, systematic management of trust and uncertainty; and comprehensive support for explaining processes and results. This talk introduces CI-Miner - an innovative hands-on, open-source, community-driven methodology to integrate these new scientific practices. It has been developed in collaboration with scientists, with the purpose of capturing, storing and retrieving knowledge about scientific processes and their products, thereby further supporting a new generation of science techniques based on data exploration. CI-Miner uses semantic annotations in the form of W3C Web Ontology Language (OWL)-based ontologies and Proof Markup Language (PML)-based provenance to represent knowledge. This methodology specializes in general-purpose ontologies, projected into workflow-driven ontologies (WDOs) and into semantic abstract workflows (SAWs). Provenance in PML is CI-Miner's integrative component, which allows scientists to retrieve and reason with the knowledge represented in these new semantic documents. It serves additionally as a platform to share such collected knowledge with the scientific community participating in the international cyber-infrastructure. The integrated semantic documents that are tailored for the use of human epistemic agents may also be utilized by machine epistemic agents, since the documents are based on W3C Resource Description Framework (RDF) notation. This talk is grounded upon interdisciplinary lessons learned through the use of CI-Miner in support of government-funded national and international cyber-infrastructure initiatives in the areas of geo-sciences (NSF-GEON and NSF-EarthScope), environmental sciences (CEON, NSF NEON, NSF-LTER and DOE-AmeriFlux), and solar physics (VSTO and NSF-SPCDIS). The discussion on provenance is based on the use of PML in support of projects in collaboration with government organizations (DARPA, ARDA, NSF, DHS and DOE), research organizations (NCAR and PNNL), and industries (IBM and SRI International).
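The flavor of such semantic annotation can be conveyed with a small generic RDF example built with the rdflib Python library; the namespace, predicate names, and dataset identifiers below are illustrative placeholders, not the actual PML, WDO, or SAW vocabularies:

```python
from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical namespace for illustration only.
EX = Namespace("http://example.org/provenance#")

g = Graph()
g.bind("ex", EX)

# "gravity_map_v2 was derived from raw_gravity_readings by a
# grid_interpolation step" -- the kind of statement a provenance
# trace records so that results can later be explained.
g.add((EX.gravity_map_v2, RDF.type, EX.Dataset))
g.add((EX.gravity_map_v2, EX.wasDerivedFrom, EX.raw_gravity_readings))
g.add((EX.gravity_map_v2, EX.wasGeneratedBy, EX.grid_interpolation))
g.add((EX.grid_interpolation, EX.usedParameter, Literal("cell_size=0.5deg")))

print(g.serialize(format="turtle"))
```

Because the statements are plain RDF triples, both human readers (via rendered explanations) and machine agents (via SPARQL or graph traversal) can consume the same provenance record, which is the point made above about human and machine epistemic agents.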
Adapting Digital Libraries to Continual Evolution
NASA Technical Reports Server (NTRS)
Barkstrom, Bruce R.; Finch, Melinda; Ferebee, Michelle; Mackey, Calvin
2002-01-01
In this paper, we describe five investment streams (data storage infrastructure, knowledge management, data production control, data transport and security, and personnel skill mix) that need to be balanced against short-term operating demands in order to maximize the probability of long-term viability of a digital library. Because of the rapid pace of information technology change, a digital library cannot be a static institution. Rather, it has to become a flexible organization adapted to continuous evolution of its infrastructure.
Framework for Shared Drinking Water Risk Assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowry, Thomas Stephen; Tidwell, Vincent C.; Peplinski, William John
Central to protecting our nation's critical infrastructure is the development of methodologies for prioritizing action and supporting resource allocation decisions associated with risk-reduction initiatives. Toward this need, a web-based risk assessment framework that promotes the anonymous sharing of results among water utilities is demonstrated. Anonymous sharing of results offers a number of potential advantages, such as assistance in recognizing and correcting bias, identification of 'unknown unknowns', self-assessment and benchmarking for the local utility, treatment of shared assets and/or threats across multiple utilities, and prioritization of actions beyond the scale of a single utility. The constructed framework was demonstrated for three water utilities. Demonstration results were then compared to risk assessment results developed using a different risk assessment application by a different set of analysts.
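A minimal sketch of the anonymous-benchmarking idea follows; the likelihood-times-consequence scoring, utility names, and numbers are invented assumptions, not the framework's actual model:

```python
import statistics

# Each utility submits risk scores (here: likelihood x consequence,
# an assumed scheme) for threat categories shared across utilities.
submissions = {
    "utility_1": {"contamination": 12, "cyber": 20, "earthquake": 6},
    "utility_2": {"contamination": 9,  "cyber": 25, "earthquake": 4},
    "utility_3": {"contamination": 15, "cyber": 18, "earthquake": 9},
}

threats = sorted({t for scores in submissions.values() for t in scores})
for threat in threats:
    scores = [s[threat] for s in submissions.values()]
    print(f"{threat}: median={statistics.median(scores)}, "
          f"range={min(scores)}-{max(scores)}")
# Only the anonymized distribution is shared back; each utility compares
# its own score against the median to spot possible bias or blind spots.
```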
Data Storage and sharing for the long tail of science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, B.; Pouchard, L.; Smith, P. M.
Research data infrastructure such as storage must now accommodate new requirements resulting from trends in research data management that require researchers to store their data for the long term and make it available to other researchers. We propose Data Depot, a system and service that provides capabilities for shared space within a group, shared applications, flexible access patterns and ease of transfer at Purdue University. We evaluate Depot as a solution for storing and sharing multi-terabytes of data produced in the long tail of science with a use case in soundscape ecology studies from the Human-Environment Modeling and Analysis Laboratory. We observe that with the capabilities enabled by Data Depot, researchers can easily deploy fine-grained data access control, manage data transfer and sharing, as well as integrate their workflows into a High Performance Computing environment.
Enhancing the many-to-many relations across IHE document sharing communities.
Ribeiro, Luís S; Costa, Carlos; Oliveira, José Luís
2012-01-01
The Integrating the Healthcare Enterprise (IHE) initiative is an ongoing project aiming to enable true inter-site interoperability in the health IT field. IHE is a work in progress, and many challenges need to be overcome before healthcare institutions can share patient clinical records transparently and effortlessly. Configuring, deploying and testing an IHE document sharing community requires a significant effort to plan and maintain the supporting IT infrastructure. With the new paradigm of cloud computing, it is now possible to launch software devices on demand and pay according to usage. This paper presents a framework designed with the purpose of expediting the creation of IHE document sharing communities. It provides semi-ready templates of sharing communities that are customized according to the community's needs. The framework is a meeting point for healthcare institutions, creating a favourable environment that might converge in new inter-institutional professional relationships and, eventually, the creation of new Affinity Domains.
McLoughlin, Clodagh; Patel, Kunal D; O'Callaghan, Tom; Reeves, Scott
2018-03-01
The recent growth in online technology has led to a rapid increase in the sharing of health-related information globally. Health and social care professionals are now using a wide range of virtual communities of practice (VCoPs) for learning, support, continuing professional education, knowledge management and information sharing. In this article, we report the findings from a review of the literature that explored the use of VCoPs by health and social care professionals to determine their potential for interprofessional education and collaboration. We employed integrated review methods to search and identify relevant VCoP articles. We undertook searches of PubMed and Google Scholar from 2000, which after screening, resulted in the inclusion of 19 articles. A thematic analysis generated the following key issues related to the use of VCoPs: 'definitions and approaches', 'technological infrastructure', 'reported benefits', 'participation issues', 'trust and privacy', and 'technical ability'. Based on the findings from this review, there is some evidence that VCoPs can offer an informal method of professional and interprofessional development for clinicians, and can decrease social and professional isolation. However, for VCoPs to be successful, issues of privacy, trust, encouragement and technology need to be addressed.
Re-inventing Data Libraries: Ensuring Continuing Access To Curated (Value-added) Data
NASA Astrophysics Data System (ADS)
Burnhill, P.; Medyckyj-Scott, D.
2008-12-01
How many years of inexperience do we need in using, and in particular sharing, digital data generated by others? That history pre-dates, but must also gain leverage from, the emergence of the digital library. Much of this sharing was done within research groups, but recent attention to spatial data infrastructure highlights the importance of achieving several 'right mixes': * between Internet standards, geo-specific referencing, and domain-specific vocabulary (cf ontology); * between attention to user-focused services and machine-to-machine interoperability; * between the demands of current high-quality services, the practice of data curation, and the need for long-term preservation. This presentation will draw upon ideas and experience from data library services in research universities, a national (UK) academic data centre, and developments in digital curation. It will be argued that the 1980s term 'data library' has some polemic value in that we have yet to learn what it means to 'do library' for data: more than "a bit like inter-galactic library loan", perhaps. Illustration will be drawn from a multi-faceted database of digitized boundaries (UKBORDERS), through the first Internet map delivery of national mapping agency data (Digimap), to strategic positioning to help geo-enable academic and scientific data and so enhance research (in the UK, in Europe, and beyond).
Des Jardins, Terrisca R.
2014-01-01
Community-based health information exchanges (HIEs) and efforts to consolidate and house data are growing, given the advent of Accountable Care Organizations (ACOs) under the Affordable Care Act and other similar population health focused initiatives. The Southeast Michigan Beacon Community (SEMBC) can be looked to as one case study that offers lessons learned, insights on challenges faced and accompanying workarounds related to governance and stakeholder engagement. The SEMBC case study employs an established Data Warehouse Governance Framework to identify and explain the necessary governance and stakeholder engagement components, particularly as they relate to community-wide data sharing and data warehouses or repositories. Perhaps the biggest lesson learned through the SEMBC experience is that community-based work is hard. It requires a great deal of community leadership, collaboration and resources. SEMBC found that organizational structure and guiding principles needed to be continually revisited and nurtured in order to build the relationships and trust needed among stakeholder organizations. SEMBC also found that risks and risk mitigation tactics presented challenges and opportunities at the outset and through the duration of the three year pilot period. Other communities across the country embarking on similar efforts need to consider realistic expectations about community data sharing infrastructures and the accompanying and necessary governance and stakeholder engagement fundamentals. PMID:25848612
Measuring household consumption and waste in unmetered, intermittent piped water systems
NASA Astrophysics Data System (ADS)
Kumpel, Emily; Woelfle-Erskine, Cleo; Ray, Isha; Nelson, Kara L.
2017-01-01
Measurements of household water consumption are extremely difficult in intermittent water supply (IWS) regimes in low- and middle-income countries, where water is delivered for short durations, taps are shared, metering is limited, and household storage infrastructure varies widely. Nonetheless, consumption estimates are necessary for utilities to improve water delivery. We estimated household water use in Hubli-Dharwad, India, with a mixed-methods approach combining (limited) metered data, storage container inventories, and structured observations. We developed a typology of household water access according to infrastructure conditions based on the presence of an overhead storage tank and a shared tap. For households with overhead tanks, container measurements and metered data produced statistically similar consumption volumes; for households without overhead tanks, stored volumes underestimated consumption because of significant water use directly from the tap during delivery periods. Households that shared taps consumed much less water than those that did not. We used our water use calculations to estimate waste at the household level and in the distribution system. Very few households used as much as 135 L/person/d, the Government of India design standard for urban systems. Most wasted little water even when unmetered; however, unaccounted-for water in the neighborhood distribution systems was around 50%. Thus, conservation efforts should target loss reduction in the network rather than at households.
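To make the container-inventory method concrete, the sketch below runs the basic arithmetic on invented numbers (container volumes, refill counts, household size, and delivery schedule are all hypothetical):

```python
# Estimating per-capita daily consumption from a storage-container
# inventory, the approach combined above with metered data and
# structured observations. All figures are illustrative.
containers = [  # (volume_litres, fills_per_delivery)
    (500, 1),   # overhead tank filled once per delivery
    (15, 4),    # buckets refilled four times during the delivery window
]
household_size = 5
deliveries_per_week = 3

litres_per_delivery = sum(vol * fills for vol, fills in containers)
lpcd = litres_per_delivery * deliveries_per_week / 7 / household_size
print(f"{lpcd:.0f} L/person/day vs the 135 L/person/day design standard")
```

With these numbers the household consumes about 48 L/person/d, roughly a third of the design standard, which illustrates how most households can fall well below the standard even as network losses stay high.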
31 CFR 800.208 - Critical infrastructure.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 31 Money and Finance: Treasury 3 2010-07-01 2010-07-01 false Critical infrastructure. 800.208 Section 800.208 Money and Finance: Treasury Regulations Relating to Money and Finance (Continued) OFFICE OF INVESTMENT SECURITY, DEPARTMENT OF THE TREASURY REGULATIONS PERTAINING TO MERGERS, ACQUISITIONS...
Bicycling for transportation and health: the role of infrastructure.
Dill, Jennifer
2009-01-01
This paper aims to provide insight on whether bicycling for everyday travel can help US adults meet the recommended levels of physical activity and what role public infrastructure may play in encouraging this activity. The study collected data on bicycling behavior from 166 regular cyclists in the Portland, Oregon metropolitan area using global positioning system (GPS) devices. Sixty percent of the cyclists rode for more than 150 minutes per week during the study and nearly all of the bicycling was for utilitarian purposes, not exercise. A disproportionate share of the bicycling occurred on streets with bicycle lanes, separate paths, or bicycle boulevards. The data support the need for well-connected neighborhood streets and a network of bicycle-specific infrastructure to encourage more bicycling among adults. This can be accomplished through comprehensive planning, regulation, and funding.
FIRST Robotics, Gulfport High, StenniSphere, Bo Clarke, mentor
NASA Technical Reports Server (NTRS)
2006-01-01
Bo Clarke, mentor for Gulfport High School's Team Fusion, offers strategy tips to students and coaches during the FIRST Robotics Competition kickoff held at StenniSphere on Jan. 7. Clarke is the lead building and infrastructure specialist for NASA's Shared Services Center at Stennis Space Center.
Open Informational Ecosystems: The Missing Link for Sharing Educational Resources
ERIC Educational Resources Information Center
Kerres, Michael; Heinen, Richard
2015-01-01
Open educational resources are not available "as such". Their provision relies on a technological infrastructure of related services that can be described as an informational ecosystem. A closed informational ecosystem keeps educational resources within its boundary. An open informational ecosystem relies on the concurrence of…
DOT National Transportation Integrated Search
2017-10-30
The Task 6 Prototype Acceptance Test Summary Report summarizes the results of Acceptance Testing carried out at Battelle facilities in accordance with the Task 6 Acceptance Test Plan. The Acceptance Tests were designed to verify that the prototype sy...
2009-05-12
…sharing, infrastructure improvements, improvement of compatible immigration databases, visa policy coordination, common biometric identifiers in…
802.11 Wireless Infrastructure To Enhance Medical Response to Disasters
Arisoylu, Mustafa; Mishra, Rajesh; Rao, Ramesh; Lenert, Leslie A.
2005-01-01
802.11 (WiFi) is a well established network communications protocol that has wide applicability in civil infrastructure. This paper describes research that explores the design of 802.11 networks enhanced to support data communications in disaster environments. The focus of these efforts is to create network infrastructure to support operations by Metropolitan Medical Response System (MMRS) units and Federally-sponsored regional teams that respond to mass casualty events caused by a terrorist attack with chemical, biological, nuclear or radiological weapons or by a hazardous materials spill. In this paper, we describe an advanced WiFi-based network architecture designed to meet the needs of MMRS operations. This architecture combines a Wireless Distribution Systems for peer-to-peer multihop connectivity between access points with flexible and shared access to multiple cellular backhauls for robust connectivity to the Internet. The architecture offers a high bandwidth data communications infrastructure that can penetrate into buildings and structures while also supporting commercial off-the-shelf end-user equipment such as PDAs. It is self-configuring and is self-healing in the event of a loss of a portion of the infrastructure. Testing of prototype units is ongoing. PMID:16778990
NASA Astrophysics Data System (ADS)
Farooq, Umer; Schank, Patricia; Harris, Alexandra; Fusco, Judith; Schlager, Mark
Community computing has recently grown to become a major research area in human-computer interaction. One of the objectives of community computing is to support computer-supported cooperative work among distributed collaborators working toward shared professional goals in online communities of practice. A core issue in designing and developing community computing infrastructures (the underlying sociotechnical layer that supports communitarian activities) is sustainability. Many community computing initiatives fail because the underlying infrastructure does not meet end user requirements; the community is unable to maintain a critical mass of users consistently over time; it generates insufficient social capital to support significant contributions by members of the community; or, as typically happens with funded initiatives, financial and human capital resources become unavailable to further maintain the infrastructure. On the basis of more than 9 years of design experience with Tapped In, an online community of practice for education professionals, we present a case study that discusses four design interventions that have sustained the Tapped In infrastructure and its community to date. These interventions represent broader design strategies for developing online environments for professional communities of practice.
Kaggal, Vinod C.; Elayavilli, Ravikumar Komandur; Mehrabi, Saeed; Pankratz, Joshua J.; Sohn, Sunghwan; Wang, Yanshan; Li, Dingcheng; Rastegar, Majid Mojarad; Murphy, Sean P.; Ross, Jason L.; Chaudhry, Rajeev; Buntrock, James D.; Liu, Hongfang
2016-01-01
The concept of optimizing health care by understanding and generating knowledge from previous evidence, ie, the Learning Health-care System (LHS), has gained momentum and now has national prominence. Meanwhile, the rapid adoption of electronic health records (EHRs) enables the data collection required to form the basis for facilitating LHS. A prerequisite for using EHR data within the LHS is an infrastructure that enables access to EHR data longitudinally for health-care analytics and real time for knowledge delivery. Additionally, significant clinical information is embedded in the free text, making natural language processing (NLP) an essential component in implementing an LHS. Herein, we share our institutional implementation of a big data-empowered clinical NLP infrastructure, which not only enables health-care analytics but also has real-time NLP processing capability. The infrastructure has been utilized for multiple institutional projects including the MayoExpertAdvisor, an individualized care recommendation solution for clinical care. We compared the advantages of big data over two other environments. Big data infrastructure significantly outperformed other infrastructure in terms of computing speed, demonstrating its value in making the LHS a possibility in the near future. PMID:27385912
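As a generic illustration of why a distributed back end speeds up such analytics (this is not the Mayo NLP stack; the keyword-spotting function is a trivial stand-in for a real clinical NLP engine), the sketch below fans clinical notes out across worker processes:

```python
from concurrent.futures import ProcessPoolExecutor

def extract_concepts(note: str) -> set[str]:
    """Stand-in for an NLP pipeline: naive keyword spotting."""
    vocabulary = {"diabetes", "hypertension", "metformin"}
    return {w for w in note.lower().split() if w in vocabulary}

notes = [
    "Patient with diabetes started on metformin",
    "History of hypertension, well controlled",
] * 1000  # synthetic corpus standing in for millions of notes

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(extract_concepts, notes, chunksize=100))
    print(sum(bool(r) for r in results), "notes contained target concepts")
```

Scaling the same map-style decomposition from one machine's processes to a cluster is, in outline, what allows a big-data back end to serve both retrospective analytics over the full EHR and near-real-time processing for knowledge delivery.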
A National Strategy to Develop Pragmatic Clinical Trials Infrastructure
Guise, Jeanne‐Marie; Dolor, Rowena J.; Meissner, Paul; Tunis, Sean; Krishnan, Jerry A.; Pace, Wilson D.; Saltz, Joel; Hersh, William R.; Michener, Lloyd; Carey, Timothy S.
2014-01-01
Abstract An important challenge in comparative effectiveness research is the lack of infrastructure to support pragmatic clinical trials, which compare interventions in usual practice settings and subjects. These trials present challenges that differ from those of classical efficacy trials, which are conducted under ideal circumstances, in patients selected for their suitability, and with highly controlled protocols. In 2012, we launched a 1‐year learning network to identify high‐priority pragmatic clinical trials and to deploy research infrastructure through the NIH Clinical and Translational Science Awards Consortium that could be used to launch and sustain them. The network and infrastructure were initiated as a learning ground and shared resource for investigators and communities interested in developing pragmatic clinical trials. We followed a three‐stage process of developing the network, prioritizing proposed trials, and implementing learning exercises that culminated in a 1‐day network meeting at the end of the year. The year‐long project resulted in five recommendations related to developing the network, enhancing community engagement, addressing regulatory challenges, advancing information technology, and developing research methods. The recommendations can be implemented within 24 months and are designed to lead toward a sustained national infrastructure for pragmatic trials. PMID:24472114
Institutional shared resources and translational cancer research
De Paoli, Paolo
2009-01-01
The development and maintenance of adequate shared infrastructures is considered a major goal for academic centers promoting translational research programs. Among infrastructures favoring translational research, centralized facilities characterized by shared, multidisciplinary use of expensive laboratory instrumentation, or by complex computer hardware and software and/or by high professional skills are necessary to maintain or improve institutional scientific competitiveness. The success or failure of a shared resource program also depends on the choice of appropriate institutional policies and requires an effective institutional governance regarding decisions on staffing, existence and composition of advisory committees, policies and of defined mechanisms of reporting, budgeting and financial support of each resource. Shared Resources represent a widely diffused model to sustain cancer research; in fact, web sites from an impressive number of research Institutes and Universities in the U.S. contain pages dedicated to the SR that have been established in each Center, making a complete view of the situation impossible. However, a nation-wide overview of how Cancer Centers develop SR programs is available on the web site for NCI-designated Cancer Centers in the U.S., while in Europe, information is available for individual Cancer centers. This article will briefly summarize the institutional policies, the organizational needs, the characteristics, scientific aims, and future developments of SRs necessary to develop effective translational research programs in oncology. In fact, the physical build-up of SRs per se is not sufficient for the successful translation of biomedical research. Appropriate policies to improve the academic culture in collaboration, the availability of educational programs for translational investigators, the existence of administrative facilitations for translational research and an efficient organization supporting clinical trial recruitment and management represent essential tools, providing solutions to overcome existing barriers in the development of translational research in biomedical research centers. PMID:19563639
Privacy protection for personal health information and shared care records.
Neame, Roderick L B
2014-01-01
The protection of personal information privacy has become one of the most pressing security concerns for record keepers: this will become more onerous with the introduction of the European General Data Protection Regulation (GDPR) in mid-2014. Many institutions, both large and small, have yet to implement the essential infrastructure for data privacy protection and patient consent and control when accessing and sharing data; even more have failed to instil a privacy and security awareness mindset and culture amongst their staff. Increased regulation, together with better compliance monitoring, has led to the imposition of increasingly significant monetary penalties for failure to protect privacy: these too are set to become more onerous under the GDPR, increasing to a maximum of 2% of annual turnover. There is growing pressure in clinical environments to deliver shared patient care and to support this with integrated information. This demands that more information passes between institutions and care providers without breaching patient privacy or autonomy. This can be achieved with relatively minor enhancements of existing infrastructures and does not require extensive investment in inter-operating electronic records: indeed such investments to date have been shown not to materially improve data sharing. REQUIREMENTS FOR PRIVACY: There is an ethical duty as well as a legal obligation on the part of care providers (and record keepers) to keep patient information confidential and to share it only with the authorisation of the patient. To achieve this, information storage and retrieval and communication systems must be appropriately configured. There are many components of this, which are discussed in this paper. Patients may consult clinicians anywhere and at any time: therefore, their data must be available for recipient-driven retrieval (i.e. like the World Wide Web) under patient control and kept private: a method for delivering this is outlined.
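As a rough illustration of recipient-driven retrieval under patient control, the sketch below releases a record only when an explicit consent registry authorises the requester; the registry layout, identifiers, and data are invented for illustration and are not the paper's design.

```python
# Hypothetical consent registry: patient id -> set of authorised requesters.
CONSENT = {"patient-001": {"gp-clinic", "cardiology-dept"}}

RECORDS = {"patient-001": {"allergies": ["penicillin"], "medications": ["atenolol"]}}

def retrieve_record(patient_id: str, requester: str):
    """Release a record only when the patient has authorised the requester."""
    if requester not in CONSENT.get(patient_id, set()):
        raise PermissionError(f"{requester} lacks consent for {patient_id}")
    return RECORDS[patient_id]

print(retrieve_record("patient-001", "gp-clinic"))   # permitted
# retrieve_record("patient-001", "insurer") would raise PermissionError
```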
An i2b2-based, generalizable, open source, self-scaling chronic disease registry
Natter, Marc D; Quan, Justin; Ortiz, David M; Bousvaros, Athos; Ilowite, Norman T; Inman, Christi J; Marsolo, Keith; McMurry, Andrew J; Sandborg, Christy I; Schanberg, Laura E; Wallace, Carol A; Warren, Robert W; Weber, Griffin M; Mandl, Kenneth D
2013-01-01
Objective Registries are a well-established mechanism for obtaining high quality, disease-specific data, but are often highly project-specific in their design, implementation, and policies for data use. In contrast to the conventional model of centralized data contribution, warehousing, and control, we design a self-scaling registry technology for collaborative data sharing, based upon the widely adopted Integrating Biology & the Bedside (i2b2) data warehousing framework and the Shared Health Research Information Network (SHRINE) peer-to-peer networking software. Materials and methods Focusing our design around creation of a scalable solution for collaboration within multi-site disease registries, we leverage the i2b2 and SHRINE open source software to create a modular, ontology-based, federated infrastructure that provides research investigators full ownership and access to their contributed data while supporting permissioned yet robust data sharing. We accomplish these objectives via web services supporting peer-group overlays, group-aware data aggregation, and administrative functions. Results The 56-site Childhood Arthritis & Rheumatology Research Alliance (CARRA) Registry and 3-site Harvard Inflammatory Bowel Diseases Longitudinal Data Repository now utilize i2b2 self-scaling registry technology (i2b2-SSR). This platform, extensible to federation of multiple projects within and between research networks, encompasses >6000 subjects at sites throughout the USA. Discussion We utilize the i2b2-SSR platform to minimize technical barriers to collaboration while enabling fine-grained control over data sharing. Conclusions The implementation of i2b2-SSR for the multi-site, multi-stakeholder CARRA Registry has established a digital infrastructure for community-driven research data sharing in pediatric rheumatology in the USA. We envision i2b2-SSR as a scalable, reusable solution facilitating interdisciplinary research across diseases. PMID:22733975
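The abstract mentions group-aware data aggregation across federated sites. One such operation can be sketched as summing per-site counts with small-cell suppression before sharing; the site names, counts, and suppression floor below are hypothetical and are not SHRINE's actual policy.

```python
# Hypothetical per-site counts returned by a federated aggregate query.
site_counts = {"site_a": 142, "site_b": 87, "site_c": 9}

SUPPRESS_BELOW = 10  # small cells are masked before leaving a site

def aggregate(counts, floor=SUPPRESS_BELOW):
    """Sum per-site counts, masking any site total below the floor."""
    shared = {s: (c if c >= floor else None) for s, c in counts.items()}
    total = sum(c for c in shared.values() if c is not None)
    return shared, total

shared, total = aggregate(site_counts)
print(shared)  # site_c reported as None (suppressed)
print(total)   # 229
```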
On leadership organizational intelligence/organizational stupidity: the leader's challenge.
Kerfoot, Karlene
2003-01-01
Creating organizations with a high IQ, or without the necessary intelligence, determines the success or failure of the organization. Without structures such as shared leadership and other forms of participative management, the organization or unit cannot access and use the available information and wisdom in the organization. When nurses and other health care professionals do not feel like they have a shared stake and do not feel like citizens of the organization, they lack passion for the organization's work. When nurses feel a sense of shared ownership and autonomy for their clinical practice, terrific outcomes are achieved. Leaders must accept the challenge to build the infrastructure that leads to excellence in organizational IQ.
CloudMan as a platform for tool, data, and analysis distribution
2012-01-01
Background Cloud computing provides an infrastructure that facilitates large scale computational analysis in a scalable, democratized fashion. However, in this context it is difficult to ensure sharing of an analysis environment and associated data in a scalable and precisely reproducible way. Results CloudMan (usecloudman.org) enables individual researchers to easily deploy, customize, and share their entire cloud analysis environment, including data, tools, and configurations. Conclusions With the enabled customization and sharing of instances, CloudMan can be used as a platform for collaboration. The presented solution improves accessibility of cloud resources, tools, and data to the level of an individual researcher and contributes toward reproducibility and transparency of research solutions. PMID:23181507
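CloudMan environments were launched as cloud machine images configured at boot. The boto3 sketch below shows the general launch pattern on an EC2-compatible cloud; the AMI id, key pair name, and user-data keys are placeholders and do not reflect CloudMan's documented interface.

```python
import boto3

# Placeholder boot-time configuration; CloudMan's real user-data schema differs.
user_data = """\
cluster_name: demo-analysis
password: change-me
"""

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.run_instances(
    ImageId="ami-00000000",   # hypothetical analysis-environment image
    InstanceType="m5.large",
    MinCount=1,
    MaxCount=1,
    KeyName="my-keypair",     # assumed to already exist in the account
    UserData=user_data,       # handed to the instance at boot
)
print(resp["Instances"][0]["InstanceId"])
```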
2016-04-01
infrastructure. The work is motivated by the fact that today's clouds are very static, uniform, and predictable, allowing attackers who identify a ... vulnerability in one of the services or infrastructure components to spread their effect to other, mission-critical services. Our goal is to integrate into ... clouds by elevating continuous change, evolution, and misinformation as first-rate design principles of the cloud's infrastructure. Our work is
Infrastructure resources for clinical research in amyotrophic lateral sclerosis.
Sherman, Alexander V; Gubitz, Amelie K; Al-Chalabi, Ammar; Bedlack, Richard; Berry, James; Conwit, Robin; Harris, Brent T; Horton, D Kevin; Kaufmann, Petra; Leitner, Melanie L; Miller, Robert; Shefner, Jeremy; Vonsattel, Jean Paul; Mitsumoto, Hiroshi
2013-05-01
Clinical trial networks, shared clinical databases, and human biospecimen repositories are examples of infrastructure resources aimed at enhancing and expediting clinical and/or patient oriented research to uncover the etiology and pathogenesis of amyotrophic lateral sclerosis (ALS), a rapidly progressive neurodegenerative disease that leads to the paralysis of voluntary muscles. The current status of such infrastructure resources, as well as opportunities and impediments, was discussed at the second Tarrytown ALS meeting held in September 2011. The discussion focused on resources developed and maintained by ALS clinics and centers in North America and Europe, various clinical trial networks, U.S. government federal agencies including the National Institutes of Health (NIH), the Agency for Toxic Substances and Disease Registry (ATSDR) and the Centers for Disease Control and Prevention (CDC), and several voluntary disease organizations that support ALS research activities. Key recommendations included 1) the establishment of shared databases among individual ALS clinics to enhance the coordination of resources and data analyses; 2) the expansion of quality-controlled human biospecimen banks; and 3) the adoption of uniform data standards, such as the recently developed Common Data Elements (CDEs) for ALS clinical research. The value of clinical trial networks such as the Northeast ALS (NEALS) Consortium and the Western ALS (WALS) Consortium was recognized, and strategies to further enhance and complement these networks and their research resources were discussed.
Clinical Bioinformatics: challenges and opportunities
2012-01-01
Background Network Tools and Applications in Biology (NETTAB) Workshops are a series of meetings focused on the most promising and innovative ICT tools and to their usefulness in Bioinformatics. The NETTAB 2011 workshop, held in Pavia, Italy, in October 2011 was aimed at presenting some of the most relevant methods, tools and infrastructures that are nowadays available for Clinical Bioinformatics (CBI), the research field that deals with clinical applications of bioinformatics. Methods In this editorial, the viewpoints and opinions of three world CBI leaders, who have been invited to participate in a panel discussion of the NETTAB workshop on the next challenges and future opportunities of this field, are reported. These include the development of data warehouses and ICT infrastructures for data sharing, the definition of standards for sharing phenotypic data and the implementation of novel tools to implement efficient search computing solutions. Results Some of the most important design features of a CBI-ICT infrastructure are presented, including data warehousing, modularity and flexibility, open-source development, semantic interoperability, integrated search and retrieval of -omics information. Conclusions Clinical Bioinformatics goals are ambitious. Many factors, including the availability of high-throughput "-omics" technologies and equipment, the widespread availability of clinical data warehouses and the noteworthy increase in data storage and computational power of the most recent ICT systems, justify research and efforts in this domain, which promises to be a crucial leveraging factor for biomedical research. PMID:23095472
NASA Astrophysics Data System (ADS)
Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats
2014-06-01
Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible to achieve several decades ago. For the past decade several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows to effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
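The paper frames cloud bursting as provisioning cloud pilots when grid capacity cannot absorb demand. The decision itself reduces to a simple rule, sketched below with invented slot counts and pilot sizes that are not GlideinWMS parameters.

```python
GRID_SLOTS_FREE = 120   # free grid job slots (illustrative)
CLOUD_PILOT_SIZE = 8    # job slots contributed by one cloud pilot VM (illustrative)

def plan_pilots(idle_jobs: int, grid_free: int, pilot_size: int):
    """Return (jobs sent to the grid, cloud pilots to request for the overflow)."""
    to_grid = min(idle_jobs, grid_free)
    overflow = idle_jobs - to_grid
    pilots = -(-overflow // pilot_size) if overflow else 0  # ceiling division
    return to_grid, pilots

print(plan_pilots(idle_jobs=500, grid_free=GRID_SLOTS_FREE,
                  pilot_size=CLOUD_PILOT_SIZE))  # (120, 48)
```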
DOT National Transportation Integrated Search
2014-12-01
The intent of this report is to provide (1) an initial assessment of National Airspace System (NAS) infrastructure affected by continuing development and deployment of unmanned aircraft systems into the NAS, and (2) a description of process challenge...
Mazhitova, Galina; Karstkarel, Nanka; Oberman, Naum; Romanovsky, Vladimir; Kuhry, Peter
2004-08-01
The relationship between permafrost conditions and the distribution of infrastructure in the Usa Basin, Northeast European Russia, is analyzed. About 75% of the Basin is underlain by permafrost terrain with various degrees of continuity (isolated patches to continuous permafrost). The region has a high level of urban and industrial development (e.g., towns, coal mines, hydrocarbon extraction sites, railway, pipelines). GIS-analyses indicate that about 60% of all infrastructure is located in the 'high risk' permafrost area, here defined as the zones of isolated to discontinuous permafrost (3-90% coverage) with 'warm' ground temperatures (0 to -2 degrees C). Ground monitoring, aerial photo interpretation, and permafrost modeling suggest a differential response to future global warming. Most of the permafrost-affected terrain will likely start to thaw within a few decades to a century. This forecast poses serious challenges to permafrost engineering and calls for long-term investments in adequate infrastructure that will pay back over time.
Damage assessment of bridge infrastructure subjected to flood-related hazards
NASA Astrophysics Data System (ADS)
Michalis, Panagiotis; Cahill, Paul; Bekić, Damir; Kerin, Igor; Pakrashi, Vikram; Lapthorne, John; Morais, João Gonçalo Martins Paulo; McKeogh, Eamon
2017-04-01
Transportation assets represent a critical component of society's infrastructure systems. Flood-related hazards are considered one of the main climate change impacts on highway and railway infrastructure, threatening the security and functionality of transportation systems. Of such hazards, flood-induced scour is a primary cause of bridge collapses worldwide and one of the most complex and challenging water flow and erosion phenomena, leading to structural instability and ultimately catastrophic failures. Evaluation of scour risk under severe flood events is a particularly challenging issue considering that the depth of foundations is very difficult to evaluate in a water environment. The continual inspection, assessment and maintenance of bridges and other hydraulic structures under extreme flood events requires a multidisciplinary approach, including knowledge and expertise of hydraulics, hydrology, structural engineering, geotechnics and infrastructure management. The large number of bridges under a single management unit also highlights the need for efficient management, information sharing and self-informing systems to provide reliable, cost-effective flood and scour risk management. The "Intelligent Bridge Assessment Maintenance and Management System" (BRIDGE SMS) is an EU/FP7 funded project which aims to couple state-of-the-art scientific expertise in multidisciplinary engineering sectors with industrial knowledge in infrastructure management. This involves the application of integrated low-cost structural health monitoring systems to provide real-time information towards the development of an intelligent decision support tool and a web-based platform to assess and efficiently manage bridge assets. This study documents the technological experience and presents results obtained from the application of sensing systems focusing on the damage assessment of water-hazards at bridges over watercourses in Ireland. The applied instrumentation is interfaced with an open-source platform that can offer a more economical remote monitoring solution. The results presented in this investigation provide an important guide for a multidisciplinary approach to bridge monitoring and can be used as a benchmark for the field application of cost-effective and robust sensing methods. This will deliver key information regarding the impact of water-related hazards at bridge structures through an integrated structural health monitoring and management system. Acknowledgement: The authors wish to acknowledge the financial support of the European Commission, through the Marie Curie action Industry-Academia Partnership and Pathways Network BRIDGE SMS (Intelligent Bridge Assessment Maintenance and Management System) - FP7-People-2013-IAPP- 612517.
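At its simplest, the continuous scour monitoring described reduces to smoothing sensor readings and alerting on a threshold. A minimal sketch, assuming hypothetical sonar scour-depth readings and an invented alert level:

```python
from statistics import mean

SCOUR_ALERT_M = 1.2  # hypothetical alert depth at a pier, in metres

def smoothed(readings, window=5):
    """Moving average over the last `window` depth readings."""
    return mean(readings[-window:])

# Simulated sonar scour-depth series during a flood event (metres).
readings = [0.4, 0.5, 0.7, 1.1, 1.4, 1.6, 1.8]
for i in range(5, len(readings) + 1):
    level = smoothed(readings[:i])
    if level > SCOUR_ALERT_M:
        print(f"ALERT after reading {i}: smoothed scour {level:.2f} m")
```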
Antenna Beam Pattern Characteristics of HAPS User Terminal
NASA Astrophysics Data System (ADS)
Ku, Bon-Jun; Oh, Dae Sub; Kim, Nam; Ahn, Do-Seob
High Altitude Platform Stations (HAPS) have recently been considered as a green infrastructure to provide high-speed multimedia services. The critical issue for HAPS is frequency sharing with satellite systems. Regulating the antenna beam pattern using adaptive antenna schemes is one means of facilitating sharing with a space receiver for fixed satellite services on the uplink of a HAPS system operating in U bands. In this letter, we investigate the antenna beam pattern characteristics of HAPS user terminals with various values of main-beam scan angles, null position angles, and null widths.
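A standard way to protect a satellite receiver is to force a pattern null toward it by projecting the null-direction steering vector out of the main-beam weights. The numpy sketch below does this for a uniform linear array; the element count, spacing, and angles are assumed for illustration, not taken from the letter.

```python
import numpy as np

N = 8                          # array elements, half-wavelength spacing (assumed)
theta_main = np.deg2rad(0)     # main-beam direction (assumed)
theta_null = np.deg2rad(30)    # direction of the satellite receiver to protect

def steer(theta, n=N):
    """Steering vector of a uniform linear array with d = lambda/2."""
    return np.exp(1j * np.pi * np.arange(n) * np.sin(theta))

s_m, s_n = steer(theta_main), steer(theta_null)
# Remove the null-direction component from the main-beam weights,
# which places an exact pattern null toward the satellite.
w = s_m - (s_n.conj() @ s_m / N) * s_n

for deg in (0, 30):
    print(f"{deg:>3} deg: |AF| = {abs(w.conj() @ steer(np.deg2rad(deg))):.3e}")
```

The response at 30 degrees comes out at numerical zero while the main beam is preserved, which is the null-steering effect the letter studies.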
EUDAT: A New Cross-Disciplinary Data Infrastructure For Science
NASA Astrophysics Data System (ADS)
Lecarpentier, Damien; Michelini, Alberto; Wittenburg, Peter
2013-04-01
In recent years significant investments have been made by the European Commission and European member states to create a pan-European e-Infrastructure supporting multiple research communities. As a result, a European e-Infrastructure ecosystem is currently taking shape, with communication networks, distributed grids and HPC facilities providing European researchers from all fields with state-of-the-art instruments and services that support the deployment of new research facilities on a pan-European level. However, the accelerated proliferation of data - newly available from powerful new scientific instruments, simulations and the digitization of existing resources - has created a new impetus for increasing efforts and investments in order to tackle the specific challenges of data management, and to ensure a coherent approach to research data access and preservation. EUDAT is a pan-European initiative that started in October 2011 and which aims to help overcome these challenges by laying out the foundations of a Collaborative Data Infrastructure (CDI) in which centres offering community-specific support services to their users could rely on a set of common data services shared between different research communities. Although research communities from different disciplines have different ambitions and approaches - particularly with respect to data organization and content - they also share many basic service requirements. This commonality makes it possible for EUDAT to establish common data services, designed to support multiple research communities, as part of this CDI. During the first year, EUDAT has been reviewing the approaches and requirements of a first subset of communities from linguistics (CLARIN), solid earth sciences (EPOS), climate sciences (ENES), environmental sciences (LIFEWATCH), and biological and medical sciences (VPH), and shortlisted four generic services to be deployed as shared services on the EUDAT infrastructure. These services are data replication from site to site, data staging to compute facilities, metadata, and easy storage. A number of enabling services such as distributed authentication and authorization, persistent identifiers, hosting of services, workspaces and centre registry were also discussed. The services being designed in EUDAT will thus be of interest to a broad range of communities that lack their own robust data infrastructures, or that are simply looking for additional storage and/or computing capacities to better access, use, re-use, and preserve their data. The first pilots were completed in 2012 and a pre-production ready operational infrastructure, comprised of five sites (RZG, CINECA, SARA, CSC, FZJ), offering 480TB of online storage and 4PB of near-line (tape) storage, initially serving four user communities (ENES, EPOS, CLARIN, VPH) was established. These services shall be available to all communities in a production environment by 2014. Although EUDAT has initially focused on a subset of research communities, it aims to engage with other communities interested in adapting their solutions or contributing to the design of the infrastructure. Discussions with other research communities - belonging to the fields of environmental sciences, biomedical science, physics, social sciences and humanities - have already begun and are following a pattern similar to the one we adopted with the initial communities. 
The next step will consist of integrating representatives from these communities into the existing pilots and task forces so as to include them in the process of designing the services and, ultimately, shaping the future CDI.
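Two of the shortlisted services, data replication and persistent identification, can be illustrated together: copy a file to a second site, verify fixity, and register a resolvable identifier. The handle prefix and in-memory registry below are toy stand-ins, not EUDAT's actual services.

```python
import hashlib
import shutil
import uuid
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 fixity value used to verify a replica against its source."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def replicate(src: Path, dst_dir: Path, registry: dict) -> str:
    """Copy a file to a second 'site', check fixity, and mint a toy identifier."""
    dst_dir.mkdir(parents=True, exist_ok=True)
    dst = dst_dir / src.name
    shutil.copy2(src, dst)
    if checksum(src) != checksum(dst):
        raise IOError("replica failed fixity check")
    pid = f"hdl:demo/{uuid.uuid4().hex[:8]}"   # placeholder identifier scheme
    registry[pid] = [str(src), str(dst)]       # identifier resolves to both copies
    return pid

registry = {}
src = Path("observation.dat")
src.write_bytes(b"example payload")
print(replicate(src, Path("site_b"), registry), registry)
```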
IDEAL-NM Annual Report: School Year 2013-2014
ERIC Educational Resources Information Center
New Mexico Public Education Department, 2015
2015-01-01
Innovative Digital Education and Learning-New Mexico (IDEAL-NM) was created in response to the 2005 Performance and Accountability Contract, "Making Schools Work" to leverage technology. On October 27, 2006, the statewide e-learning program that would implement a shared e-learning infrastructure using a single statewide learning…
Multi-Dimensional Optimization for Cloud Based Multi-Tier Applications
ERIC Educational Resources Information Center
Jung, Gueyoung
2010-01-01
Emerging trends toward cloud computing and virtualization have been opening new avenues to meet enormous demands of space, resource utilization, and energy efficiency in modern data centers. By being allowed to host many multi-tier applications in consolidated environments, cloud infrastructure providers enable resources to be shared among these…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-10
... infrastructure, and disruption to social networks. Amendment 16 to the Multispecies (groundfish) Fishery... Collection; Comment Request; Social Impacts of the Implementation of a Catch Shares Program in the Mid... . SUPPLEMENTARY INFORMATION: I. Abstract This request is for new information collection. Social Impact Assessment...
DOT National Transportation Integrated Search
2017-10-27
This report describes the system architecture and design of the Experimental Prototype System (EPS) for the demonstration of the use of mobile devices in a connected vehicle environment. Specifically, it defines the system structure and behavior, the...
DOT National Transportation Integrated Search
2016-07-13
This report describes the concept of operation for the use of mobile devices in a connected vehicle environment. Specifically, it identifies the needs, conceptual system, and potential scenarios that serve as the basis for demonstrating both safety a...
Leaks in the National Information Infrastructure Dam: Who Should Protect It?
2004-04-01
have paid off cyber criminals who threatened to attack their computer systems and destroy their data unless a 'ransom' was paid. These cyber ... sharing information with law enforcement and appropriate industry groups will we be able to identify and prosecute cyber criminals, identify new
Capturing Teachers' Experience of Learning Design through Case Studies
ERIC Educational Resources Information Center
Masterman, Elizabeth; Jameson, Jill; Walker, Simon
2009-01-01
This article distinguishes three dimensions to learning design: a technological infrastructure, a conceptual framework for practice that focuses on the creation of structured sequences of learning activities, and a way to represent and share practice through the use of mediating artefacts. Focusing initially on the second of these dimensions, the…
Grossman, Robert L.; Heath, Allison; Murphy, Mark; Patterson, Maria; Wells, Walt
2017-01-01
Data commons collocate data, storage, and computing infrastructure with core services and commonly used tools and applications for managing, analyzing, and sharing data to create an interoperable resource for the research community. An architecture for data commons is described, as well as some lessons learned from operating several large-scale data commons. PMID:29033693
Extending the ARIADNE Web-Based Learning Environment.
ERIC Educational Resources Information Center
Van Durm, Rafael; Duval, Erik; Verhoeven, Bart; Cardinaels, Kris; Olivie, Henk
One of the central notions of the ARIADNE learning platform is a share-and-reuse approach toward the development of digital course material. The ARIADNE infrastructure includes a distributed database called the Knowledge Pool System (KPS), which acts as a repository of pedagogical material, described with standardized IEEE LTSC Learning Object…
DOT National Transportation Integrated Search
2013-11-01
The United States (US) Department of Transportation (USDOT) and the Road Bureau of Ministry of Land, Infrastructure, Transport, and Tourism (MLIT) of Japan have a long history of sharing information on ITS (Intelligent Transportation Systems) activit...
USDA-ARS?s Scientific Manuscript database
The fungal genus Fusarium includes many plant and/or animal pathogenic species and produces diverse toxins. Although accurate identification is critical for managing such threats, it is difficult to identify Fusarium morphologically. Fortunately, extensive molecular phylogenetic studies, founded on ...
FOREWORD: Structural Health Monitoring and Intelligent Infrastructure
NASA Astrophysics Data System (ADS)
Wu, Zhishen; Fujino, Yozo
2005-06-01
This special issue collects together 19 papers that were originally presented at the First International Conference on Structural Health Monitoring and Intelligent Infrastructure (SHMII-1'2003), held in Tokyo, Japan, on 13-15 November 2003. This conference was organized by the Japan Society of Civil Engineers (JSCE) with partial financial support from the Japan Society for the Promotion of Science (JSPS) and the Ministry of Education, Culture, Sport, Science and Technology, Japan. Many related organizations supported the conference. A total of 16 keynote papers including six state-of-the-art reports from different countries, six invited papers and 154 contributed papers were presented at the conference. The conference was attended by a diverse group of about 300 people from a variety of disciplines in academia, industry and government from all over the world. Structural health monitoring (SHM) and intelligent materials, structures and systems have been the subject of intense research and development in the last two decades and, in recent years, an increasing range of applications in infrastructure have been discovered both for existing structures and for new constructions. SHMII-1'2003 addressed progress in the development of building, transportation, marine, underground and energy-generating structures, and other civilian infrastructures that are periodically, continuously and/or actively monitored where there is a need to optimize their performance. In order to focus the current needs on SHM and intelligent technologies, the conference theme was set as 'Structures/Infrastructures Sustainability'. We are pleased to have the privilege to edit this special issue on SHM and intelligent infrastructure based on SHMII-1'2003. We invited some of the presenters to submit a revised/extended version of their paper that was included in the SHMII-1'2003 proceedings for possible publication in the special issue. Each paper included in this special issue was edited with the same quality standards as for any paper in a regular issue. The papers cover a wide spectrum of topics including smart and effective sensing technologies, reliable approaches to signal processing, rational data gathering and interpretation methods, advanced damage characterization, modeling feature selection and diagnosis methods, and system integration technologies, etc. This special issue contains the most up-to-date achievements in SHM and intelligent technologies and provides information pertaining to their current and potential applications in infrastructure. It is our hope that this special issue makes a significant contribution in advancing awareness and acceptance of SHM and intelligent technologies for the maintenance and construction of different kinds of infrastructure. We would like to express our sincere thanks to Professor Varadan (Editor-in-Chief), Professor Matsuzaki (Regional Editor), the Editorial Assistants and the staff at Institute of Physics Publishing for their great support and advice in publishing this special issue. Special thanks are due to all the reviewers for their willingness to share their time and expertise. Final but important thanks go to Ms Suzhen Li (Doctorate Candidate at Ibaraki University) for her assistance in editing this special issue.
NASA Astrophysics Data System (ADS)
Schaap, Dick M. A.; Fichaut, Michele
2013-04-01
The second phase of the project SeaDataNet started in October 2011 for another 4 years with the aim to upgrade the SeaDataNet infrastructure built during previous years. The numbers of the project are quite impressive: 59 institutions from 35 different countries are involved. In particular, 45 data centers are sharing human and financial resources in a common effort to sustain an operationally robust and state-of-the-art Pan-European infrastructure for providing up-to-date and high quality access to ocean and marine metadata, data and data products. The main objective of SeaDataNet II is to improve operations and to progress towards an efficient data management infrastructure able to handle the diversity and large volume of data collected via the Pan-European oceanographic fleet and the new observation systems, both in real-time and delayed mode. The infrastructure is based on a semi-distributed system that incorporates and enhances the existing NODCs network. SeaDataNet aims at serving users from science, environmental management, policy making, and economical sectors. Better integrated data systems are vital for these users to achieve improved scientific research and results, to support marine environmental and integrated coastal zone management, to establish indicators of Good Environmental Status for sea basins, and to support offshore industry developments, shipping, fisheries, and other economic activities. The recent EU communication "MARINE KNOWLEDGE 2020 - marine data and observation for smart and sustainable growth" states that the creation of marine knowledge begins with observation of the seas and oceans. In addition, directives, policies and science programmes require reporting of the state of the seas and oceans in an integrated pan-European manner: of particular note are INSPIRE, MSFD, WISE-Marine and GMES Marine Core Service. These underpin the importance of a well functioning marine and ocean data management infrastructure. SeaDataNet is now one of the major players in informatics in oceanography and collaborative relationships have been created with other EU and non-EU projects. In particular, SeaDataNet has recognised roles in the continuous serving of common vocabularies, the provision of tools for data management, as well as giving access to metadata, data sets and data products of importance for society. The SeaDataNet infrastructure comprises a network of interconnected data centres and a central SeaDataNet portal. The portal provides users not only background information about SeaDataNet and the various SeaDataNet standards and tools, but also a unified and transparent overview of the metadata and controlled access to the large collections of data sets managed by the interconnected data centres. The presentation will give information on present services of the SeaDataNet infrastructure and highlight a number of key achievements in SeaDataNet II so far.
Unidata: A geoscience e-infrastructure for International Data Sharing
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan
2017-04-01
The Internet and its myriad manifestations, including the World Wide Web, have amply demonstrated the compounding benefits of a global cyberinfrastructure and the power of networked communities as institutions and people exchange knowledge, ideas, and resources. The Unidata Program recognizes those benefits, and over the past several years it has developed a growing portfolio of international data distribution activities, conducted in close collaboration with academic, research and operational institutions on several continents, to advance earth system science education and research. The portfolio includes provision of data, tools, support and training as well as outreach activities that bring various stakeholders together to address important issues, all toward the goal of building a community with a shared vision. The overarching goals of Unidata's international data sharing activities include: • democratization of access to and use of data that describe the dynamic earth system, by facilitating access to a broad spectrum of observations and forecasts • building capacity and empowering geoscientists and educators worldwide, by encouraging local communities where data, tools, and best practices in education and research are shared • strengthening international science partnerships for exchanging knowledge and expertise • supporting faculty and students at research and educational institutions in the use of Unidata systems • building regional and global communities around specific geoscientific themes. In this presentation, I will present Unidata's ongoing data sharing activities in Latin America, Europe, Africa and Antarctica that are enabling linkages to existing and emergent e-infrastructures and operational networks, including recent advances to develop interoperable data systems, tools, and services that benefit the geosciences. Particular emphasis will be placed on examples of the use of Unidata's International Data Distribution Network, Local Data Manager, and THREDDS in various settings, as well as on experiences and lessons learned with the implementation and benefits of the myriad data sharing efforts.
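Unidata's THREDDS servers publish XML catalogs that clients walk to discover datasets. A minimal sketch using requests and the standard library; the catalog URL is a placeholder for whichever server a site actually runs.

```python
import requests
import xml.etree.ElementTree as ET

# THREDDS catalogs use this XML namespace; the URL below is a placeholder.
NS = "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"
CATALOG_URL = "https://example.edu/thredds/catalog.xml"

def list_datasets(url: str):
    """Fetch a THREDDS catalog and return the names of the datasets it lists."""
    root = ET.fromstring(requests.get(url, timeout=30).content)
    return [d.attrib.get("name", "?") for d in root.iter(f"{{{NS}}}dataset")]

print(list_datasets(CATALOG_URL))
```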
Planning for Bike Share Connectivity to Rail Transit
Griffin, Greg Phillip; Sener, Ipek Nese
2016-01-01
Bike sharing can play a role in providing access to transit stations and then to final destinations, but early implementation of these systems in North America has been opportunistic rather than strategic. This study evaluates local intermodal plan goals using trip data and associated infrastructure such as transit stops and bike share station locations in Austin, Texas, and Chicago, Illinois. Bike sharing use data from both cities suggest a weak relationship with existing rail stations that could be strengthened through collaborative, intermodal planning. The study suggests a planning framework and example language that could be tailored to help address the linkage between bike sharing and transit. Rather than an exhaustive study of the practice, this study provides evidence from these two cities that identifies opportunities to improve intermodal planning. Cities that are planning or expanding a bike sharing system should consider carefully how to leverage this mode with existing modes of transport. Regardless of a city's status in implementing a bike sharing system, planners can leverage information on existing transport systems for planning at regional and local levels. PMID:27872554
Educational technology infrastructure and services in North American medical schools.
Kamin, Carol; Souza, Kevin H; Heestand, Diane; Moses, Anna; O'Sullivan, Patricia
2006-07-01
To describe the current educational technology infrastructure and services provided by North American allopathic medical schools that are members of the Association of American Medical Colleges (AAMC), to present information needed for institutional benchmarking. A Web-based survey instrument was developed and administered in the fall of 2004 by the authors, sent to representatives of 137 medical schools and completed by representatives of 88, a response rate of 64%. Schools were given scores for infrastructure and services provided. Data were analyzed with one-way analyses of variance, chi-square, and correlation coefficients. There was no difference in the number of infrastructure features or services offered based on region of the country, public versus private schools, or size of graduating class. Schools implemented 3.0 (SD = 1.5) of 6 infrastructure items and offered 11.6 (SD = 4.1) of 22 services. Over 90% of schools had wireless access (97%), used online course materials for undergraduate medical education (97%), a course management system for graduate medical education (95%) and online teaching evaluations (90%). Use of services differed across the undergraduate, graduate, and continuing medical education continuum. Outside of e-portfolios for undergraduates, the least-offered services were those for graduate and continuing medical education. The results of this survey provide a benchmark for the level of services and infrastructure currently supporting educational technology at AAMC-member allopathic medical schools.
Infrastructure Commons in Economic Perspective
NASA Astrophysics Data System (ADS)
Frischmann, Brett M.
This chapter briefly summarizes a theory (developed in substantial detail elsewhere) that explains why there are strong economic arguments for managing and sustaining infrastructure resources in an openly accessible manner. This theory facilitates a better understanding of two related issues: how society benefits from infrastructure resources and how decisions about how to manage or govern infrastructure resources affect a wide variety of public and private interests. The key insights from this analysis are that infrastructure resources generate value as inputs into a wide range of productive processes and that the outputs from these processes are often public goods and nonmarket goods that generate positive externalities that benefit society as a whole. Managing such resources in an openly accessible manner may be socially desirable from an economic perspective because doing so facilitates these downstream productive activities. For example, managing the Internet infrastructure in an openly accessible manner facilitates active citizen involvement in the production and sharing of many different public and nonmarket goods. Over the last decade, this has led to increased opportunities for a wide range of citizens to engage in entrepreneurship, political discourse, social network formation, and community building, among many other activities. The chapter applies these insights to the network neutrality debate and suggests how the debate might be reframed to better account for the wide range of private and public interests at stake.
Decision analysis and risk models for land development affecting infrastructure systems.
Thekdi, Shital A; Lambert, James H
2012-07-01
Coordination and layering of models to identify risks in complex systems such as large-scale infrastructure of energy, water, and transportation is of current interest across application domains. Such infrastructures are increasingly vulnerable to adjacent commercial and residential land development. Land development can compromise the performance of essential infrastructure systems and increase the costs of maintaining or increasing performance. A risk-informed approach to this topic would be useful to avoid surprise, regret, and the need for costly remedies. This article develops a layering and coordination of models for risk management of land development affecting infrastructure systems. The layers are: system identification, expert elicitation, predictive modeling, comparison of investment alternatives, and implications of current decisions for future options. The modeling layers share a focus on observable factors that most contribute to volatility of land development and land use. The relevant data and expert evidence include current and forecasted growth in population and employment, conservation and preservation rules, land topography and geometries, real estate assessments, market and economic conditions, and other factors. The approach integrates to a decision framework of strategic considerations based on assessing risk, cost, and opportunity in order to prioritize needs and potential remedies that mitigate impacts of land development to the infrastructure systems. The approach is demonstrated for a 5,700-mile multimodal transportation system adjacent to 60,000 tracts of potential land development. © 2011 Society for Risk Analysis.
26 CFR 1.401-1 - Qualified pension, profit-sharing, and stock bonus plans.
Code of Federal Regulations, 2012 CFR
2012-04-01
... employees or their beneficiaries to participate in the profits of the employer's trade or business, or in... 26 Internal Revenue 5 2012-04-01 2011-04-01 true Qualified pension, profit-sharing, and stock... TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Pension, Profit-Sharing, Stock Bonus...
26 CFR 1.401-1 - Qualified pension, profit-sharing, and stock bonus plans.
Code of Federal Regulations, 2011 CFR
2011-04-01
... employees or their beneficiaries to participate in the profits of the employer's trade or business, or in... 26 Internal Revenue 5 2011-04-01 2011-04-01 false Qualified pension, profit-sharing, and stock... TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Pension, Profit-Sharing, Stock Bonus...
26 CFR 1.401-1 - Qualified pension, profit-sharing, and stock bonus plans.
Code of Federal Regulations, 2014 CFR
2014-04-01
... employees or their beneficiaries to participate in the profits of the employer's trade or business, or in... 26 Internal Revenue 5 2014-04-01 2014-04-01 false Qualified pension, profit-sharing, and stock... TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Pension, Profit-Sharing, Stock Bonus...
26 CFR 1.401-1 - Qualified pension, profit-sharing, and stock bonus plans.
Code of Federal Regulations, 2013 CFR
2013-04-01
... employees or their beneficiaries to participate in the profits of the employer's trade or business, or in... 26 Internal Revenue 5 2013-04-01 2013-04-01 false Qualified pension, profit-sharing, and stock... TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Pension, Profit-Sharing, Stock Bonus...
26 CFR 1.704-1 - Partner's distributive share.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 26 Internal Revenue 8 2014-04-01 2014-04-01 false Partner's distributive share. 1.704-1 Section 1.704-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Partners and Partnerships § 1.704-1 Partner's distributive share. (a) Effect of partnership agreement. A partner'...
26 CFR 1.704-1 - Partner's distributive share.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 26 Internal Revenue 8 2013-04-01 2013-04-01 false Partner's distributive share. 1.704-1 Section 1.704-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Partners and Partnerships § 1.704-1 Partner's distributive share. (a) Effect of partnership agreement. A partner'...
26 CFR 1.704-1 - Partner's distributive share.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 26 Internal Revenue 8 2012-04-01 2012-04-01 false Partner's distributive share. 1.704-1 Section 1.704-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Partners and Partnerships § 1.704-1 Partner's distributive share. (a) Effect of partnership agreement. A partner'...
26 CFR 1.704-1 - Partner's distributive share.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 26 Internal Revenue 8 2011-04-01 2011-04-01 false Partner's distributive share. 1.704-1 Section 1.704-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Partners and Partnerships § 1.704-1 Partner's distributive share. (a) Effect of partnership agreement. A partner'...
The p-medicine portal—a collaboration platform for research in personalised medicine
Schera, Fatima; Weiler, Gabriele; Neri, Elias; Kiefer, Stephan; Graf, Norbert
2014-01-01
The European project p-medicine creates an information technology infrastructure that facilitates the transition from current medical practice to personalised medicine. The main access point to this infrastructure is the p-medicine portal that provides clinicians, patients, and researchers a platform to collaborate, share data and expertise, and use tools and services to improve personalised treatments of patients. In this document, we describe the community-based structure of the p-medicine portal and provide information about the p-medicine security framework implemented in the portal. Finally, we show the user interface and describe the p-medicine tools and services integrated in the portal. PMID:24567755
NASA Astrophysics Data System (ADS)
Herman, J. D.; Zeff, H. B.; Reed, P. M.; Characklis, G. W.
2013-12-01
In the Eastern United States, water infrastructure and institutional frameworks have evolved in a historically water-rich environment. However, large regional droughts over the past decade combined with continuing population growth have marked a transition to a state of water scarcity, for which current planning paradigms are ill-suited. Significant opportunities exist to improve the efficiency of water infrastructure via regional coordination, namely, regional 'portfolios' of water-related assets such as reservoirs, conveyance, conservation measures, and transfer agreements. Regional coordination offers the potential to improve reliability, cost, and environmental impact in the expected future state of the world, and, with informed planning, to improve robustness to future uncertainty. In support of this challenge, this study advances a multi-agent many-objective robust decision making (multi-agent MORDM) framework that blends novel computational search and uncertainty analysis tools to discover flexible, robust regional portfolios. Our multi-agent MORDM framework is demonstrated for four water utilities in the Research Triangle region of North Carolina, USA. The utilities supply nearly two million customers and have the ability to interact with one another via transfer agreements and shared infrastructure. We show that strategies for this region which are Pareto-optimal in the expected future state of the world remain vulnerable to performance degradation under alternative scenarios of deeply uncertain hydrologic and economic factors. We then apply the Patient Rule Induction Method (PRIM) to identify which of these uncertain factors drives the individual and collective vulnerabilities for the four cooperating utilities. Our results indicate that clear multi-agent tradeoffs emerge for attaining robustness across the utilities. Furthermore, the key factor identified for improving the robustness of the region's water supply is cooperative demand reduction. This type of approach is critically important given the risks and challenges posed by rising supply development costs, limits on new infrastructure, growing water demands and the underlying uncertainties associated with climate change. The proposed framework serves as a planning template for other historically water-rich regions which must now confront the reality of impending water scarcity.
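The scenario-discovery step named in this abstract (PRIM) can be sketched compactly. Below is a minimal, illustrative PRIM-style peeling routine over sampled uncertain factors; the function and variable names, thresholds, and stopping rule are assumptions for illustration, not the study's actual code.

```python
import numpy as np

def prim_peel(X, y, peel_alpha=0.05, mass_min=0.05):
    """Greedy PRIM-style peeling: shrink an axis-aligned box over the
    sampled uncertain factors X (n_samples x n_factors) so that the
    failure indicator y (1 = performance degradation) is concentrated
    inside the box."""
    box = np.column_stack([X.min(axis=0), X.max(axis=0)])  # [lo, hi] per factor
    inside = np.ones(len(y), dtype=bool)
    while inside.mean() > mass_min:
        best = None
        for j in range(X.shape[1]):
            for side, q in ((0, peel_alpha), (1, 1.0 - peel_alpha)):
                trial = box.copy()
                trial[j, side] = np.quantile(X[inside, j], q)
                sel = inside & (X[:, j] >= trial[j, 0]) & (X[:, j] <= trial[j, 1])
                if sel.any() and (best is None or y[sel].mean() > best[0]):
                    best = (y[sel].mean(), trial, sel)
        if best is None or best[0] <= y[inside].mean():
            break  # no single peel further concentrates the failures
        _, box, inside = best
    return box, inside  # bounds of the vulnerability region + its members
```

Running a routine of this kind on sampled regional performance outcomes is one way a factor such as demand growth can surface as the driver of collective vulnerability, as the abstract reports.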
Geospatial data infrastructure: The development of metadata for geo-information in China
NASA Astrophysics Data System (ADS)
Xu, Baiquan; Yan, Shiqiang; Wang, Qianju; Lian, Jian; Wu, Xiaoping; Ding, Keyong
2014-03-01
Stores of geoscience records are in constant flux. These stores are continually added to by new information, ideas and data, which are frequently revised. The geoscience record is constrained by human thought and by the technology available for handling information. Conventional methods strive, with limited success, to keep geoscience records readily accessible and renewable. The information system must adapt to the diversity of ideas and data in geoscience and their changes through time. In China, more than 400,000 types of important geological data have been collected and produced in geological work during the last two decades, including oil, natural gas and marine data, mine exploration, geophysical, geochemical and remote sensing data, and important local geological survey and research reports. Numerous geospatial databases have been formed and stored in the National Geological Archives (NGA) in formats including MapGIS, ArcGIS, ArcINFO, Metafile, Raster, SQL Server, Access and JPEG. But there is no effective way to warrant that the quality of information is adequate, in theory and practice, for decision making. The need for fast, reliable, accurate and up-to-date information provided to the Geographic Information System (GIS) communities is becoming insistent for all geoinformation producers and users in China. Since 2010, a series of geoinformation projects have been carried out under the leadership of the Ministry of Land and Resources (MLR), including (1) integration, update and maintenance of geoinformation databases; (2) standards research on clusterization and industrialization of information services; (3) platform construction for geological data sharing; (4) construction of key borehole databases; and (5) product development for information services. A "Nine-System" basic framework has been proposed for the development and improvement of the geospatial data infrastructure, focused on the construction of the cluster organization, cluster service, convergence, database, product, policy, technology, standard and infrastructure systems. The development of geoinformation stores and services puts forward a need for a Geospatial Data Infrastructure (GDI) in China. In this paper, some of the ideas envisaged in the development of metadata in China are discussed.
Continuity of operations/continuity of government for state-level transportation organizations.
DOT National Transportation Integrated Search
2011-09-01
The Homeland Security Presidential Directive 20 (HSPD-20) requires all local, state, tribal and territorial government agencies, : and private sector owners of critical infrastructure and key resources (CI/KR) to create a Continuity of Operations/Con...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, Jing; Lin, Zhenhong
2012-01-01
This paper studies the role of public charging infrastructure in increasing PHEVs' share of driving on electricity and the resulting reduction in petroleum use. Using vehicle activity data obtained from the GPS-tracked household travel survey in Austin, Texas, the gasoline and electricity consumption of PHEVs in a real-world driving context is estimated. Drivers' within-day recharging behavior, constrained by travel activities and the public charger network, is modeled as a boundedly rational decision and incorporated in the energy use estimation. The key findings from the Austin dataset include: (1) public charging infrastructure makes PHEVs a competitive vehicle choice for consumers without a home charger; (2) providing sufficient public charging service is expected to significantly reduce the petroleum consumption of PHEVs; and (3) public charging opportunities offer greater benefits for PHEVs with a smaller battery pack, as within-day recharges compensate for battery capacity.
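As a rough illustration of the energy-accounting logic described above, the sketch below splits a day's trip chain into electric and gasoline miles under charge-depleting-first operation with opportunistic public recharging; all field names and parameter values are assumptions, not the paper's calibrated model.

```python
from dataclasses import dataclass

@dataclass
class Trip:
    miles: float
    dwell_hours: float    # parking time at the destination
    public_charger: bool  # charger available where the vehicle parks

def daily_energy_split(trips, battery_kwh=9.0, kwh_per_mile=0.3,
                       charger_kw=3.3, home_charger=True):
    """Split one day's trip chain into electric and gasoline miles,
    assuming charge-depleting-first operation and opportunistic
    recharging whenever a public charger is present (all parameter
    values are illustrative)."""
    soc = battery_kwh if home_charger else 0.0  # overnight charge at home
    electric = gasoline = 0.0
    for t in trips:
        e_miles = min(t.miles, soc / kwh_per_mile)
        electric += e_miles
        gasoline += t.miles - e_miles
        soc -= e_miles * kwh_per_mile
        if t.public_charger:  # boundedly rational: plug in when parked
            soc = min(battery_kwh, soc + charger_kw * t.dwell_hours)
    return electric, gasoline

# Example day: commute, lunch errand with a public charger, commute home.
day = [Trip(15, 4.0, False), Trip(5, 1.0, True), Trip(15, 12.0, False)]
print(daily_energy_split(day))
```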
Current and future flood risk to railway infrastructure in Europe
NASA Astrophysics Data System (ADS)
Bubeck, Philip; Kellermann, Patric; Alfieri, Lorenzo; Feyen, Luc; Dillenardt, Lisa; Thieken, Annegret H.
2017-04-01
Railway infrastructure plays an important role in the transportation of freight and passengers across the European Union. According to Eurostat, more than four billion passenger-kilometres were travelled on national and international railway lines of the EU28 in 2014. To further strengthen transport infrastructure in Europe, the European Commission will invest another € 24.05 billion in the transnational transport network until 2020 as part of its new transport infrastructure policy (TEN-T), including railway infrastructure. Floods pose a significant risk to infrastructure elements. Damage data of recent flood events in Europe show that infrastructure losses can make up a considerable share of overall losses. For example, damage to state and municipal infrastructure in the federal state of Saxony (Germany) accounted for nearly 60% of overall losses during the large-scale event in June 2013. Especially in mountainous areas with little usable space available, roads and railway lines often follow floodplains or are located along steep and unsteady slopes. In Austria, for instance, the flood of 2013 caused € 75 million of direct damage to railway infrastructure. Despite the importance of railway infrastructure and its exposure to flooding, assessments of potential damage and risk (i.e. probability * damage) are still in their infancy compared with other sectors, such as the residential or industrial sector. Infrastructure-specific assessments at the regional scale are largely lacking. Regional assessment of potential damage to railway infrastructure has been hampered by a lack of infrastructure-specific damage models and data availability. The few available regional approaches have used damage models that assess damage to various infrastructure elements (e.g. roads, railway, airports and harbours) using one aggregated damage function and cost estimate. Moreover, infrastructure elements are often considerably underrepresented in regional land cover data, such as CORINE, due to their line shapes. To assess current and future damage and risk to railway infrastructure in Europe, we apply the damage model RAIL ('RAilway Infrastructure Loss'), which was specifically developed for railway infrastructure using empirical damage data. To adequately and comprehensively capture the line-shaped features of railway infrastructure, the assessment makes use of the open-access data set of openrailway.org. Current and future flood hazard in Europe is obtained with the LISFLOOD-based pan-European flood hazard mapping procedure combined with ensemble projections of extreme streamflow for the current century based on EURO-CORDEX RCP 8.5 climate scenarios. The presentation shows first results of the combination of the hazard data and the model RAIL for Europe.
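The "probability * damage" notion of risk used above is commonly operationalised as expected annual damage (EAD), i.e., damage integrated over annual exceedance probability. A minimal sketch with purely illustrative numbers:

```python
import numpy as np

def expected_annual_damage(return_periods, damages):
    """EAD: integrate damage over annual exceedance probability
    p = 1 / return period, using the trapezoidal rule."""
    p = 1.0 / np.asarray(return_periods, dtype=float)
    d = np.asarray(damages, dtype=float)
    order = np.argsort(p)          # sort by increasing exceedance probability
    p, d = p[order], d[order]
    return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

# Illustrative only: losses for 10-, 50-, 100- and 500-year floods.
print(expected_annual_damage([10, 50, 100, 500], [5e6, 40e6, 75e6, 200e6]))
```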
Research Infrastructure and Scientific Collections: The Supply and Demand of Scientific Research
NASA Astrophysics Data System (ADS)
Graham, E.; Schindel, D. E.
2016-12-01
Research infrastructure is essential in both experimental and observational sciences and is commonly thought of as single-sited facilities. In contrast, object-based scientific collections are distributed in nearly every way, including by location, taxonomy, geologic epoch, discipline, collecting processes, benefits sharing rules, and many others. These diffused collections may have been amassed for a particular discipline, but their potential for use and impact in other fields needs to be explored. Through a series of cross-disciplinary activities, Scientific Collections International (SciColl) has explored and developed new ways in which the supply of scientific collections can meet the demand of researchers in unanticipated ways. From cross-cutting workshops on emerging infectious diseases and food security, to an online portal of collections, SciColl aims to illustrate the scope and value of object-based scientific research infrastructure. As distributed infrastructure, the full impact of scientific collections to the research community is a result of discovering, utilizing, and networking these resources. Examples and case studies from infectious disease research, food security topics, and digital connectivity will be explored.
Examining Willingness to Attack Critical Infrastructure Online and Offline
ERIC Educational Resources Information Center
Holt, Thomas J.; Kilger, Max
2012-01-01
The continuing adoption of technologies by the general public coupled with the expanding reliance of critical infrastructures connected through the Internet has created unique opportunities for attacks by civilians and nation-states alike. Although governments are increasingly focusing on policies to deter nation-state level attacks, it is unclear…
50 CFR 86.10 - What does this regulation do?
Code of Federal Regulations, 2010 CFR
2010-10-01
... 50 Wildlife and Fisheries 6 2010-10-01 2010-10-01 false What does this regulation do? 86.10... (CONTINUED) FINANCIAL ASSISTANCE-WILDLIFE SPORT FISH RESTORATION PROGRAM BOATING INFRASTRUCTURE GRANT (BIG... Boating Infrastructure Grant (BIG) Program. “We” and “us” refers to the Fish and Wildlife Service. This...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-27
... Loan Programs Office into the process. The FRN also identifies the principles Western will continue using to ensure (1) that the Program is separate and distinct from Western's power marketing functions... obtain project funding. Table of Contents I. Definitions II. Principles III. Project Evaluation Criteria...
Danville Community College Information Technology General Plan, 1998-99.
ERIC Educational Resources Information Center
Danville Community Coll., VA.
This document describes technology usage, infrastructure and planning for Danville Community College. The Plan is divided into four sections: Introduction, Vision and Mission, Applications, and Infrastructure. The four major goals identified in Vision and Mission are: (1) to ensure the successful use of all technologies through continued training…
Boué, Stéphanie; Exner, Thomas; Ghosh, Samik; Belcastro, Vincenzo; Dokler, Joh; Page, David; Boda, Akash; Bonjour, Filipe; Hardy, Barry; Vanscheeuwijck, Patrick; Hoeng, Julia; Peitsch, Manuel
2017-01-01
The US FDA defines modified risk tobacco products (MRTPs) as products that aim to reduce harm or the risk of tobacco-related disease associated with commercially marketed tobacco products. Establishing a product’s potential as an MRTP requires scientific substantiation including toxicity studies and measures of disease risk relative to those of cigarette smoking. Best practices encourage verification of the data from such studies through sharing and open standards. Building on the experience gained from the OpenTox project, a proof-of-concept database and website (INTERVALS) have been developed to share results from both in vivo inhalation studies and in vitro studies conducted by Philip Morris International R&D to assess candidate MRTPs. As datasets are often generated by diverse methods and standards, they need to be traceable, curated, and the methods used well described so that knowledge can be gained using data science principles and tools. The data-management framework described here accounts for the latest standards of data sharing and research reproducibility. Curated data and methods descriptions have been prepared in ISA-Tab format and stored in a database accessible via a search portal on the INTERVALS website. The portal allows users to browse the data by study or mechanism (e.g., inflammation, oxidative stress) and obtain information relevant to study design, methods, and the most important results. Given the successful development of the initial infrastructure, the goal is to grow this initiative and establish a public repository for 21st-century preclinical systems toxicology MRTP assessment data and results that supports open data principles. PMID:29123642
A moral compass for management decision making: a healthcare CEO's reflections.
Donnellan, John J
2013-01-01
Ethical behavior is good for business in any organization. In healthcare, it results in better patient care, a more committed and satisfied staff, more efficient care delivery, and increased market share. But it requires leaders who have a broad view of the role that ethics programs--and an effective, sustained ethical culture--play. Ethical organizations have integrated and shared ethical values and practices, an effective ethics infrastructure, ongoing ethics education for staff at every level, ethical and morally courageous leaders, and a culture that is consistent with the organization's values. The mission, vision, and values statements of these organizations have been successfully translated into a set of shared values--a moral compass that guides behavior and decision making.
Accelerating Exploration Through the Sharing of Best Practices in Research Partnerships
NASA Technical Reports Server (NTRS)
Nall, Mark; Casas, Joseph
2004-01-01
This paper proposes the formation of an international panel of space related public/private partnerships for the purposes of sharing best practices among members. The exploration and development of space is too costly to be conducted by governments alone. Private industry has a significant role in creating needed technologies, and developing commercial space infrastructure, thereby allowing sustainable exploration to take place. Public/private partnerships between government and industry are key to fostering industrial participation in space. The spacefaring nations have, or are developing these partnerships. Those organizations forming these partnerships can benefit from sharing among each other best practices and lessons learned. In this way the common goal of space exploration and development can be more effectively pursued.
Leveraging EHR Data for Outcomes and Comparative Effectiveness Research in Oncology
Harris, Marcelline R.; Buyuktur, Ayse G.; Clark, Patricia M.; An, Lawrence C.; Hanauer, David A.
2012-01-01
Along with the increasing adoption of electronic health records (EHRs) are expectations that data collected within EHRs will be readily available for outcomes and comparative effectiveness research. Yet the ability to effectively share and reuse data depends on implementing and configuring EHRs with these goals in mind from the beginning. Data sharing and integration must be planned both locally as well as nationally. The rich data transmission and semantic infrastructure developed by the National Cancer Institute (NCI) for research provides an excellent example of moving beyond paper-based paradigms and exploiting the power of semantically robust, network-based systems, and engaging both domain and informatics expertise. Similar efforts are required to address current challenges in sharing EHR data. PMID:22948276
A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system
NASA Astrophysics Data System (ADS)
Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.
2014-06-01
The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.
Evolving models for medical physics education and training: a global perspective
Sprawls, P
2008-01-01
There is a significant need for high-quality medical physics education and training in all countries to support effective and safe use of modern medical technology for both diagnostic and treatment purposes. This is, and will continue to be, achieved using appropriate technology to increase both the effectiveness and efficiency of educational activities everywhere in the world. While the applications of technology to education and training are relatively new, the successful applications are based on theories and principles of the learning process developed by two pioneers in the field, Robert Gagne and Edgar Dale. The work of Gagne defines the different levels of learning that can occur and is used to show the types and levels of learning that are required for the application of physics and engineering principles to achieve appropriate diagnostic and therapeutic results from modern technology. The learning outcomes are determined by the effectiveness of the learning activity or experience. The extensive work of Dale as formulated in his Cone of Experience relates the effectiveness to the efficiency of educational activities. A major challenge in education is the development and conduction of learning activities (classroom discussions, laboratory and applied experiences, individual study, etc) that provide an optimum balance between effectiveness and efficiency. New and evolving models of the educational process use technology as the infrastructure to support education that is both more effective and efficient. The goal is to use technology to enhance human performance for both learners (students) and learning facilitators (teachers). A major contribution to global education is the trend in the development of shared educational resources. Two models of programs to support this effort with open and free shared resources are Physical Principles of Medical Imaging Online (http://www.sprawls.org/resources) and AAPM Continuing Education Courses (http://www.aapm.org/international). PMID:21614309
DOT National Transportation Integrated Search
2011-08-01
The Homeland Security Presidential Directive 20 (HSPD-20) requires all local, state, tribal and territorial government agencies, and private sector owners of critical infrastructure and key resources (CI/KR) to create a Continuity of Operations/Conti...
Time to consider sharing data extracted from trials included in systematic reviews.
Wolfenden, Luke; Grimshaw, Jeremy; Williams, Christopher M; Yoong, Sze Lin
2016-11-03
While the debate regarding shared clinical trial data has shifted from whether such data should be shared to how this is best achieved, the sharing of data collected as part of systematic reviews has received little attention. In this commentary, we discuss the potential benefits of coordinated efforts to share data collected as part of systematic reviews. There are a number of potential benefits of systematic review data sharing. Shared information and data obtained as part of the systematic review process may reduce unnecessary duplication, reduce demands on trialists to service repeated requests from reviewers for data, and improve the quality and efficiency of future reviews. Sharing also facilitates research to improve clinical trial and systematic review methods and supports additional analyses to address secondary research questions. While concerns regarding appropriate use of data, costs, or the academic return for original review authors may impede more open access to information extracted as part of systematic reviews, many of these issues are being addressed, and infrastructure to enable greater access to such information is being developed. Embracing systems to enable more open access to systematic review data has considerable potential to maximise the benefits of research investment in undertaking systematic reviews.
Data Management in Astrobiology: Challenges and Opportunities for an Interdisciplinary Community
Suomela, Todd; Malone, Jim
2014-01-01
Data management and sharing are growing concerns for scientists and funding organizations throughout the world. Funding organizations are implementing requirements for data management plans, while scientists are establishing new infrastructures for data sharing. One of the difficulties is sharing data among a diverse set of research disciplines. Astrobiology is a unique community of researchers, containing over 110 different disciplines. The current study reports the results of a survey of data management practices among scientists involved in the astrobiology community and the NASA Astrobiology Institute (NAI) in particular. The survey was administered over a 2-month period in the first half of 2013. Fifteen percent of the NAI community responded (n=114), and additional responses (n=80) were collected from members of an astrobiology Listserv. The results of the survey show that the astrobiology community shares many of the same concerns for data sharing as other groups. The benefits of data sharing are acknowledged by many respondents, but barriers to data sharing remain, including lack of acknowledgement, citation, time, and institutional rewards. Overcoming technical, institutional, and social barriers to data sharing will be a challenge into the future. Key Words: Data management—Data sharing—Data preservation. Astrobiology 14, 451–461. PMID:24840364
Kuchinke, Wolfgang; Krauth, Christian; Bergmann, René; Karakoyun, Töresin; Woollard, Astrid; Schluender, Irene; Braasch, Benjamin; Eckert, Martin; Ohmann, Christian
2016-07-07
At an unprecedented rate, data in the life sciences are generated and stored in many different databases. An ever-increasing part of these data is human health data and therefore falls under data protected by legal regulations. As part of the BioMedBridges project, which created infrastructures that connect more than 10 ESFRI research infrastructures (RI), the legal and ethical prerequisites of data sharing were examined employing a novel and pragmatic approach. We employed concepts from computer science to create legal requirement clusters that enable legal interoperability between databases for the areas of data protection, data security, Intellectual Property (IP) and security of biosample data. We analysed and extracted access rules and constraints from all data providers (databases) involved in the building of data bridges, covering many of Europe's most important databases. These requirement clusters were applied to five usage scenarios representing the data flow in different data bridges: Image bridge, Phenotype data bridge, Personalised medicine data bridge, Structural data bridge, and Biosample data bridge. A matrix was built to relate the important concepts from data protection regulations (e.g. pseudonymisation, identifiability, access control, consent management) to the results of the requirement clusters. An interactive user interface for querying the matrix for the requirements necessary for compliant data sharing was created. To guide researchers through legal requirements without the need for expert legal knowledge, an interactive tool, the Legal Assessment Tool (LAT), was developed. LAT interactively provides researchers with a selection process to characterise the types of data and databases involved and supplies suitable requirements and recommendations for concrete data access and sharing situations. The results provided by LAT are based on an analysis of the data access and sharing conditions for different kinds of data in major databases in Europe. Data sharing for research purposes must be enabled for human health data, and LAT is one of the means to achieve this aim. In summary, LAT interactively provides requirements for compliant data access and sharing, with appropriate safeguards, restrictions and responsibilities, introducing a culture of responsibility and data governance when dealing with human data.
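A requirement-cluster matrix of the kind described can be represented very simply. The sketch below is an illustrative mock-up only; the cluster names and rules are assumptions, not LAT's actual rule base.

```python
# Illustrative lookup of requirement clusters keyed by (data kind,
# sharing scope); real LAT rules are far richer than this toy table.
REQUIREMENT_MATRIX = {
    ("personal_health_data", "cross_border"): [
        "pseudonymise before transfer",
        "document legal basis and consent",
        "access control with audit trail",
    ],
    ("personal_health_data", "within_institution"): [
        "role-based access control",
    ],
    ("anonymised_data", "cross_border"): [
        "verify anonymisation is irreversible",
    ],
}

def requirements(data_kind: str, sharing_scope: str):
    """Return the requirement cluster for a data access/sharing case."""
    return REQUIREMENT_MATRIX.get((data_kind, sharing_scope),
                                  ["refer to legal expert review"])

print(requirements("personal_health_data", "cross_border"))
```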
ERIC Educational Resources Information Center
Lynch, Clifford A.
1997-01-01
Union catalogs and distributed search systems are two ways users can locate materials in print and electronic formats. This article examines the advantages and limitations of both approaches and argues that they should be considered complementary rather than competitive. Discusses technologies creating linkage between catalogs and databases and…
Crowdteaching: Supporting Teaching as Designing in Collective Intelligence Communities
ERIC Educational Resources Information Center
Recker, Mimi; Yuan, Min; Ye, Lei
2014-01-01
The widespread availability of high-quality Web-based content offers new potential for supporting teachers as designers of curricula and classroom activities. When coupled with a participatory Web culture and infrastructure, teachers can share their creations as well as leverage from the best that their peers have to offer to support a collective…
"CrowdTeaching": Supporting Teacher Collective Intelligence Communities
ERIC Educational Resources Information Center
Recker, Mimi M.; Yuan, Min; Ye, Lei
2013-01-01
The widespread availability of high-quality Web-based tools and content offers new promise and potential for supporting teachers as creators of instructional activities. When coupled with a participatory Web culture and infrastructure, teachers can share their creations as well as leverage from the best that their peers have to offer to support a…
ICT and Information Strategies for a Knowledge Economy: The Indian Experience
ERIC Educational Resources Information Center
Ghosh, Maitrayee; Ghosh, Ipsheet
2009-01-01
Purpose: The purpose of this paper is to describe the progress India has made in its move towards a knowledge-based economy with details of how the Indian Government has demonstrated its commitment to the development of fundamental pillars of knowledge sharing infrastructure, knowledge workers and a knowledge innovation system. Libraries are…
ERIC Educational Resources Information Center
Whyman, Wynne
2003-01-01
A camp maintenance survey was completed by maintenance personnel from 99 camps. Results highlighted several important considerations: ensuring sufficient maintenance funds for aging infrastructure, including camp/property personnel in decision making, publicizing completed maintenance projects, examining long-term needs of the land, and adopting…
Documentary with Ephemeral Media: Curation Practices in Online Social Spaces
ERIC Educational Resources Information Center
Erickson, Ingrid
2010-01-01
New hardware such as mobile handheld devices and digital cameras; new online social venues such as social networking, microblogging, and online photo sharing sites; and new infrastructures such as the global positioning system are beginning to establish new practices--what the author refers to as "sociolocative"--that combine data about a physical…
Moving pathogen genomics out of the lab and into the clinic: what will it take?
Luheshi, Leila M; Raza, Sobia; Peacock, Sharon J
2015-12-30
Pathogen genomic analysis is a potentially transformative new approach to the clinical and public-health management of infectious diseases. Health systems investing in this technology will need to build infrastructure and develop policies that ensure genomic information can be generated, shared and acted upon in a timely manner.
The multiple resource inventory decision-making process
Victor A. Rudis
1993-01-01
A model of the multiple resource inventory decision-making process is presented that identifies steps in conducting inventories, describes the infrastructure, and points out knowledge gaps that are common to many interdisciplinary studies. Successful efforts to date suggest the need to bridge the gaps by sharing elements, maintain dialogue among stakeholders in multiple...
DOT National Transportation Integrated Search
2016-11-23
A growing demand for passenger and freight transportation, combined with limited capital to expand : the United States (U.S.) rail infrastructure, are creating pressure for a more efficient use of the current : line capacity. This is further exacerba...
DOT National Transportation Integrated Search
2017-10-25
The Task 8 D2X Hub Proof-of-Concept Test Evaluation Report provides results of the experimental data analysis performed in accordance with the experimental plan for the proof-of-concept version of the prototype system. The data set analyzed includes ...
Financing Higher Education in Kenya: Student Perceptions and Experiences
ERIC Educational Resources Information Center
Ngolovoi, Mary S.
2008-01-01
In response to declining governmental funding, cost-sharing in higher education and dual-track tuition policies were introduced in the 1990s in Kenya. The decline of government funding in higher education was a result of slow economic growth, competing public needs (such as health, elementary education, and infrastructure), and pressure to reduce…
Community-driven computational biology with Debian Linux
2010-01-01
Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984
Implicitly Coordinated Detect and Avoid Capability for Safe Autonomous Operation of Small UAS
NASA Technical Reports Server (NTRS)
Balachandran, Swee; Munoz, Cesar A.; Consiglio, Maria C.
2017-01-01
As the airspace becomes increasingly shared by autonomous small Unmanned Aerial Systems (UAS), there will be a pressing need for coordination strategies so that aircraft can safely and independently maneuver around obstacles, geofences, and traffic aircraft. Explicitly coordinating resolution strategies for small UAS would require additional components such as a reliable vehicle-to-vehicle communication infrastructure and standardized protocols for information exchange that could significantly increase the cost of deploying small UAS in a shared airspace. This paper explores a novel approach that enables multiple aircraft to implicitly coordinate their resolution maneuvers. By requiring all aircraft to execute the proposed approach deterministically, it is possible for all of them to implicitly agree on the region of airspace each will be occupying in a given time interval. The proposed approach lends itself to the construction of a suitable feedback mechanism that enables the real-time execution of an implicitly conflict-free path in a closed-loop manner, dealing with uncertainties in aircraft speed. If a network infrastructure is available, the proposed approach can also exploit the benefits of explicit information.
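A minimal sketch of the implicit-coordination idea, assuming a shared surveillance picture and a lexicographic tie-break on aircraft identifiers (the rule, names, and parameters are illustrative, not the paper's algorithm):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    uid: str    # globally unique aircraft identifier
    x: float
    y: float
    vx: float
    vy: float

def planned_cells(s, horizon=10, dt=1.0, cell=50.0):
    """Airspace cells the aircraft expects to occupy at each time step,
    assuming straight-line flight."""
    return {k: (round((s.x + s.vx * k * dt) / cell),
                round((s.y + s.vy * k * dt) / cell))
            for k in range(horizon)}

def implicit_resolution(own, others):
    """If two plans claim the same cell at the same step, the
    lexicographically smaller uid keeps its plan and the other yields.
    Every aircraft runs this same deterministic rule on the same
    inputs, so the maneuvers are compatible without any
    vehicle-to-vehicle negotiation."""
    mine = planned_cells(own)
    for other in others:
        theirs = planned_cells(other)
        if any(mine[k] == theirs[k] for k in mine) and own.uid > other.uid:
            return "yield: delay or turn right"
    return "keep plan"
```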
Mougin, Christian; Azam, Didier; Caquet, Thierry; Cheviron, Nathalie; Dequiedt, Samuel; Le Galliard, Jean-François; Guillaume, Olivier; Houot, Sabine; Lacroix, Gérard; Lafolie, François; Maron, Pierre-Alain; Michniewicz, Radika; Pichot, Christian; Ranjard, Lionel; Roy, Jacques; Zeller, Bernd; Clobert, Jean; Chanzy, André
2015-10-01
The infrastructure for Analysis and Experimentation on Ecosystems (AnaEE-France) is an integrated network of the major French experimental, analytical, and modeling platforms dedicated to the biological study of continental ecosystems (aquatic and terrestrial). This infrastructure aims at understanding and predicting ecosystem dynamics under global change. AnaEE-France comprises complementary nodes offering access to the best experimental facilities and associated biological resources and data: Ecotrons, seminatural experimental platforms to manipulate terrestrial and aquatic ecosystems, in natura sites equipped for large-scale and long-term experiments. AnaEE-France also provides shared instruments and analytical platforms dedicated to environmental (micro) biology. Finally, AnaEE-France provides users with data bases and modeling tools designed to represent ecosystem dynamics and to go further in coupling ecological, agronomical, and evolutionary approaches. In particular, AnaEE-France offers adequate services to tackle the new challenges of research in ecotoxicology, positioning its various types of platforms in an ecologically advanced ecotoxicology approach. AnaEE-France is a leading international infrastructure, and it is pioneering the construction of AnaEE (Europe) infrastructure in the field of ecosystem research. AnaEE-France infrastructure is already open to the international community of scientists in the field of continental ecotoxicology.
Building a multidisciplinary e-infrastructure for the NextData Community
NASA Astrophysics Data System (ADS)
Nativi, Stefano; Rorro, Marco; Mazzetti, Paolo; Fiameni, Giuseppe; Papeschi, Fabrizio; Carpenè, Michele
2014-05-01
In 2012, Italy decided to launch a national initiative called NextData (http://www.nextdataproject.it/): a national system for the retrieval, storage, access and diffusion of environmental and climate data from mountain and marine areas. NextData is funded by the Research and University Ministry, as a "Project of Interest". In 2013, NextData funded a "special project", the NextData System of Systems Infrastructure project (ND-SoS-Ina). The main objective is to design, build and operate in production the NextData multidisciplinary and multi-organizational e-infrastructure for the publication and sharing of its resources (e.g. data, services, vocabularies, models). SoS-Ina realizes the NextData general portal implementing the interoperability among the data archives carried out by NextData. The Florentine Division of the Institute of Atmospheric Pollution Research of CNR (CNR-IIA) and CINECA run the project. SoS-Ina (http://essi-lab.eu/nextdata/sosina/) decided to adopt a "System of Systems" (SoS) approach based on a brokering architecture. This has been pursued by applying the brokering technology first developed by the EC-FP7 EuroGEOSS project (http://www.eurogeoss.eu/broker/Pages/AbouttheEuroGEOSSBroker.aspx) and more recently consolidated by the international programme GEOSS (Global Earth Observation System of Systems) of GEO (Group on Earth Observations); see http://www.earthobservations.org/documents/geo_ix/20111122_geoss_implementation_highlights.pdf. The NextData general Portal architecture definition will proceed in accordance with the requirements elicited by user communities. The portal will rely on services and interfaces offered by the brokering middleware and will be based on Liferay (http://www.liferay.com/). Liferay is free and open source and provides many built-in applications for social collaboration, content and document management. Liferay is also configurable for high availability. The project considers three distinct phases and related milestones: (a) the first prototype of the NextData SoS infrastructure, implementing the core functionalities; (b) the consolidated version of the NextData SoS infrastructure, implementing advanced functionalities; (c) the final and operative NextData SoS infrastructure for data and information sharing and publication. An important outcome of the project will be the performance and scalability advancement of the current brokering and portal technologies, exploiting resources and middleware services provided by CINECA.
Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.
Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor
2016-01-01
In Quantum Chemistry, many tasks are reoccurring frequently, e.g., geometry optimizations, benchmarking series, etc. Here, workflows can help to reduce the time of manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant efforts and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined research domain specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows in the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.
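The formal specification described above is, at its core, a typed composition structure. A minimal sketch of such data structures, with illustrative field names rather than the SHIWA repository's actual schema:

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class AtomicWorkflow:
    """One well-defined domain function (e.g. a geometry optimization),
    as it might be catalogued in a workflow repository."""
    name: str
    inputs: Dict[str, str]   # port name -> data type
    outputs: Dict[str, str]

@dataclass
class MetaWorkflow:
    """Orchestration of atomic workflows: each edge wires an output
    port of one stage to an input port of another."""
    name: str
    stages: List[AtomicWorkflow] = field(default_factory=list)
    edges: List[Tuple[str, str, str, str]] = field(default_factory=list)

    def validate(self):
        by_name = {s.name: s for s in self.stages}
        for src, out_port, dst, in_port in self.edges:
            assert src in by_name and dst in by_name, "unknown stage"
            assert (by_name[src].outputs.get(out_port)
                    == by_name[dst].inputs.get(in_port)), "port type mismatch"

opt = AtomicWorkflow("geometry_opt", {"structure": "xyz"}, {"optimized": "xyz"})
freq = AtomicWorkflow("frequencies", {"structure": "xyz"}, {"spectrum": "csv"})
meta = MetaWorkflow("opt_then_freq", [opt, freq],
                    [("geometry_opt", "optimized", "frequencies", "structure")])
meta.validate()
```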
Comparing New Zealand's 'Middle Out' health information technology strategy with other OECD nations.
Bowden, Tom; Coiera, Enrico
2013-05-01
Implementation of efficient, universally applied, computer to computer communications is a high priority for many national health systems. As a consequence, much effort has been channelled into finding ways in which a patient's previous medical history can be made accessible when needed. A number of countries have attempted to share patients' records, with varying degrees of success. While most efforts to create record-sharing architectures have relied upon government-provided strategy and funding, New Zealand has taken a different approach. Like most British Commonwealth nations, New Zealand has a 'hybrid' publicly/privately funded health system. However its information technology infrastructure and automation has largely been developed by the private sector, working closely with regional and central government agencies. Currently the sector is focused on finding ways in which patient records can be shared amongst providers across three different regions. New Zealand's healthcare IT model combines government contributed funding, core infrastructure, facilitation and leadership with private sector investment and skills and is being delivered via a set of controlled experiments. The net result is a 'Middle Out' approach to healthcare automation. 'Middle Out' relies upon having a clear, well-articulated health-reform strategy and a determination by both public and private sector organisations to implement useful healthcare IT solutions by working closely together. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Timpka, Toomas; Nordqvist, Cecilia; Lindqvist, Kent
2009-03-09
Safety promotion is planned and practised not only by public health organizations, but also by other welfare state agencies, private companies and non-governmental organizations. The term 'infrastructure' originally denoted the underlying resources needed for warfare, e.g. roads, industries, and an industrial workforce. Today, 'infrastructure' refers to the physical elements, organizations and people needed to run projects in different societal arenas. The aim of this study was to examine associations between infrastructure and local implementation of safety policies in injury prevention and safety promotion programs. Qualitative data on municipalities in Sweden designated as Safe Communities were collected from focus group interviews with municipal politicians and administrators, as well as from policy documents, and materials published on the Internet. Actor network theory was used to identify weaknesses in the present infrastructure and determine strategies that can be used to resolve these. The weakness identification analysis revealed that the factual infrastructure available for effectuating national strategies varied between safety areas and approaches, basically reflecting differences between bureaucratic and network-based organizational models. At the local level, a contradiction between safety promotion and the existence of quasi-markets for local public service providers was found to predispose to a poor local infrastructure, diminishing the interest in integrated inter-agency activities. The weakness resolution analysis showed that development of an adequate infrastructure for safety promotion would require adjustment of the legal framework regulating injury data exchange, and would also require rational financial models for multi-party investments in local infrastructures. We found that the "silo" structure of government organization and assignment of resources was a barrier to collaborative action for safety at a community level. It may therefore be overly optimistic to take for granted that different approaches to injury control, such as injury prevention and safety promotion, can share infrastructure. Similarly, it may be unrealistic to presuppose that safety promotion can reach its potential in terms of injury rate reductions unless the critical infrastructure for this is in place. Such an alignment of the infrastructure to organizational processes requires more than financial investments.
Big data from small data: data-sharing in the ‘long tail’ of neuroscience
Ferguson, Adam R; Nielson, Jessica L; Cragin, Melissa H; Bandrowski, Anita E; Martone, Maryann E
2016-01-01
The launch of the US BRAIN and European Human Brain Projects coincides with growing international efforts toward transparency and increased access to publicly funded research in the neurosciences. The need for data-sharing standards and neuroinformatics infrastructure is more pressing than ever. However, ‘big science’ efforts are not the only drivers of data-sharing needs, as neuroscientists across the full spectrum of research grapple with the overwhelming volume of data being generated daily and a scientific environment that is increasingly focused on collaboration. In this commentary, we consider the issue of sharing of the richly diverse and heterogeneous small data sets produced by individual neuroscientists, so-called long-tail data. We consider the utility of these data, the diversity of repositories and options available for sharing such data, and emerging best practices. We provide use cases in which aggregating and mining diverse long-tail data convert numerous small data sources into big data for improved knowledge about neuroscience-related disorders. PMID:25349910
A web-portal for interactive data exploration, visualization, and hypothesis testing
Bartsch, Hauke; Thompson, Wesley K.; Jernigan, Terry L.; Dale, Anders M.
2014-01-01
Clinical research studies generate data that need to be shared and statistically analyzed by their participating institutions. The distributed nature of research and the different domains involved present major challenges to data sharing, exploration, and visualization. The Data Portal infrastructure was developed to support ongoing research in the areas of neurocognition, imaging, and genetics. Researchers benefit from the integration of data sources across domains, the explicit representation of knowledge from domain experts, and user interfaces providing convenient access to project specific data resources and algorithms. The system provides an interactive approach to statistical analysis, data mining, and hypothesis testing over the lifetime of a study and fulfills a mandate of public sharing by integrating data sharing into a system built for active data exploration. The web-based platform removes barriers for research and supports the ongoing exploration of data. PMID:24723882
A secure EHR system based on hybrid clouds.
Chen, Yu-Yi; Lu, Jun-Chao; Jan, Jinn-Ke
2012-10-01
Application services rendering remote medical services and electronic health records (EHR) have become a hot topic, stimulating increased interest in this subject in recent years. Information and communication technologies have been applied to the medical services and healthcare area for a number of years to resolve problems in medical management. Sharing EHR information to provide professional medical programs with consultancy, evaluation, and tracing services can certainly improve accessibility to medical services and medical information for the public at remote sites. With the widespread use of EHRs, building a secure EHR sharing environment has attracted a lot of attention in both the healthcare industry and the academic community. The cloud computing paradigm is one of the popular health IT infrastructures for facilitating EHR sharing and integration. In this paper, we propose an EHR sharing and integration system in healthcare clouds and analyze the arising security and privacy issues in the access and management of EHRs.
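One common hybrid-cloud pattern consistent with this goal keeps identifying data in the private cloud and stores clinical payloads in the public cloud under keyed pseudonyms. A minimal sketch assuming that split (not necessarily the authors' exact protocol; the key name and record layout are illustrative):

```python
import hashlib
import hmac
import os

# Keyed pseudonyms: only the private cloud holds the key needed to
# link a public-cloud record back to a patient (key name is assumed).
SECRET = os.environ.get("PSEUDONYM_KEY", "demo-only-key").encode()

def pseudonym(patient_id: str) -> str:
    return hmac.new(SECRET, patient_id.encode(), hashlib.sha256).hexdigest()

private_cloud = {}  # identifying data, kept in-house
public_cloud = {}   # de-identified clinical payloads, shareable

def store_record(patient_id, name, clinical_payload):
    pid = pseudonym(patient_id)
    private_cloud[patient_id] = {"name": name, "pid": pid}
    public_cloud.setdefault(pid, []).append(clinical_payload)

store_record("P-001", "Jane Doe", {"obs": "HbA1c 6.1%"})
```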
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-19
... market participants that the program is experimental and that NASDAQ may choose not to continue the... not only the costs of the data distribution infrastructure, but also the costs of designing... infrastructure--that have risen. The same holds true for execution services; despite numerous enhancements to...
Quality Support Infrastructure in Early Childhood: Still (Mostly) Missing
ERIC Educational Resources Information Center
Azzi-Lessing, Lenette
2009-01-01
Support for early care and education among policy makers and the public is at an unprecedented high. As investments in early care and education programs in the United States continue to rise, the issue of quality becomes increasingly critical. This article addresses the need for adequate infrastructure to support high-quality early care and…
Capacity for research in minority health: the need for infrastructure plus will.
Pearson, T A
2001-11-01
Cardiovascular mortality has continued to decline, but racial disparities in cardiovascular diseases (CVD) continue to grow. To build the capacity to address these racial disparities, two things will be required. First, a research and policy infrastructure must be in place to provide guidance on what to do and how to do it. Second, the will to implement and activate this infrastructure must be present at the community and policy-making levels. The Jackson Heart Study is an example of a research infrastructure with the economic resources, scientific expertise, and technical manpower required to monitor, organize, assess, and follow a cohort of individuals over time to study the burden, natural history, predictive factors, and level of care for CVD in an African American community. The creation of will within the community for CVD research may require additional strategies than in the majority community, such as community organization and local policy development. These additional efforts at the community level should create a fertile environment to develop research and, ultimately, test strategies for reducing national disparities in cardiovascular health.
Interoperability Assets for Patient Summary Components: A Gap Analysis.
Heitmann, Kai U; Cangioli, Giorgio; Melgara, Marcello; Chronaki, Catherine
2018-01-01
The International Patient Summary (IPS) standards aim to define the specifications for a minimal and non-exhaustive Patient Summary, which is specialty-agnostic and condition-independent, but still clinically relevant. Meanwhile, health systems are developing and implementing their own variations of a patient summary, while the eHealth Digital Services Infrastructure (eHDSI) initiative is deploying patient summary services across countries in Europe. In the spirit of co-creation, flexible governance, and continuous alignment advocated by eStandards, the Trillium-II initiative promotes adoption of the patient summary by engaging standards organizations and interoperability practitioners in a community of practice for digital health to share best practices, tools, data, specifications, and experiences. This paper compares operational aspects of patient summaries in 14 case studies in Europe, the United States, and across the world, focusing on how patient summary components are used in practice, to promote alignment and a joint understanding that will improve the quality of standards and lower the costs of interoperability.
Krümpel, Johannes Hagen; Illi, Lukas; Lemmer, Andreas
2018-03-01
As a consequence of a growing share of solar and wind power, recent research on biogas production has highlighted a need for demand-orientated, flexible gas production to provide grid services and enable a decentralized stabilization of the electricity infrastructure. Two-staged anaerobic digestion is particularly suitable for shifting methane production into times of higher demand due to the spatio-temporal separation of hydrolysis and methanogenesis. To provide a basis for predicting gas production in an anaerobic filter, kinetic parameters of gas production were determined experimentally in this study. A new methodology is used, enabling their determination during continuous operation. An order in methane production rate could be established by comparing the half-lives of methane production, beginning with the fastest: acetic acid > ethanol > butyric acid > iso-butyric acid > valeric acid > propionic acid > 1,2-propanediol > lactic acid. However, the mixture of a natural hydrolysate from the acidification tank appeared to produce methane faster than all single components tested.
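One way to turn the reported half-lives into production curves is a simple first-order model, V(t) = Vmax(1 - exp(-kt)) with k = ln(2)/t_half; the model choice and numbers below are assumptions for illustration, not the study's fitted parameters.

```python
import math

def methane_volume(t_hours, v_max, half_life_hours):
    """First-order production curve V(t) = Vmax * (1 - exp(-k t)),
    with rate constant k = ln(2) / t_half."""
    k = math.log(2) / half_life_hours
    return v_max * (1.0 - math.exp(-k * t_hours))

# A substrate with a short half-life (e.g. acetic acid) reaches a given
# fraction of Vmax sooner than one with a long half-life (e.g. lactic acid):
fast = methane_volume(5, 100, 2)    # ~82% of Vmax after 5 h
slow = methane_volume(5, 100, 20)   # ~16% of Vmax after 5 h
assert fast > slow
```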
Wilson, Danyell S.; Fang, Bin; Dalton, William S.; Meade, Cathy; Koomen, John M.
2012-01-01
The National Cancer Institute’s Center to Reduce Cancer Health Disparities has created pilot training opportunities under the “Continuing Umbrella of Research Experiences” (CURE) program that focus on emerging technologies (ET). In this pilot project, an eighteen month cancer biology research internship was reinforced with: instruction in an emerging technology (proteomics), a transition from the undergraduate laboratory to a research setting, education in cancer health disparities, and community outreach activities. A major goal was to provide underrepresented undergraduates with hands-on research experiences that are rarely encountered at the undergraduate level, including mentoring, research presentations, and participation in local and national meetings. These opportunities provided education and career development for the undergraduates, and they have given each student the opportunity to transition from learning to sharing their knowledge and from being mentored to mentoring others. Here, we present the concepts, curriculum, infrastructure, and challenges for this training program along with evaluations by both the students and their mentors. PMID:22528637
Democratizing molecular diagnostics for the developing world.
Abou Tayoun, Ahmad N; Burchard, Paul R; Malik, Imran; Scherer, Axel; Tsongalis, Gregory J
2014-01-01
Infectious diseases that are largely treatable continue to pose a tremendous burden on the developing world despite the availability of highly potent drugs. The high mortality and morbidity rates of these diseases are largely due to a lack of affordable diagnostics that are accessible to resource-limited areas and that can deliver high-quality results. In fact, modified molecular diagnostics for infectious diseases were rated as the top biotechnology to improve health in developing countries. In this review, we describe the characteristics of accessible molecular diagnostic tools and discuss the challenges associated with implementing such tools at low-infrastructure sites. We highlight our experience as part of the "Grand Challenge" project supported by the Gates Foundation for addressing global health inequities and describe issues and solutions associated with developing adequate technologies or molecular assays needed for broad access in the developing world. We believe that sharing this knowledge will facilitate the development of new molecular technologies that are extremely valuable for improving global health.
Mapping the organization: a bibliometric analysis of nurses' contributions to the literature.
Goode, Colleen J; McCarty, Lauren B; Fink, Regina M; Oman, Kathleen S; Makic, MaryBeth Flynn; Krugman, Mary E; Traditi, Lisa
2013-09-01
The aim of this study was to map an academic hospital's nursing contributions to the literature using bibliometric methods. Nurse executives continue to search for ways to share knowledge gained in the clinical setting. Manuscripts from clinical nurses must increase to advance the science of nursing practice and nursing administration. A search of electronic databases and curriculum vitae provided bibliographic data for University of Colorado Hospital (UCH) nurses from 1990 to 2012. Bibliometric techniques were used for publication counts and citation analysis. A review of the infrastructure supporting scholarly work was undertaken. A total of 191 journal articles, 9 books, 103 book chapters, 5 manuals, and 46 manual chapters were published by UCH nurses. Author productivity steadily increased. Citation analysis indicated that the works published were used by others. The h-index for UCH authors was 25. The hospital culture, interdisciplinary practice, and the role of the research nurse scientists had an impact on study results.
Incorporating Human Movement Behavior into the Analysis of Spatially Distributed Infrastructure.
Wu, Lihua; Leung, Henry; Jiang, Hao; Zheng, Hong; Ma, Li
2016-01-01
For the first time in human history, the majority of the world's population resides in urban areas. City managers are therefore faced with new challenges related to the efficiency, equity and quality of the supply of resources such as water, food and energy. Infrastructure in a city can be viewed as service points providing resources; these service points function together as a spatially collaborative system to serve an increasing population. To study the spatial collaboration among service points, we propose a shared network based on humans' collective movement and resource usage, derived from data usage detail records (UDRs) from the cellular network in a city in western China. This network is shown not to be scale-free; instead, it exhibits an interesting triangular property governed by two types of nodes with very different link patterns. Surprisingly, this feature is consistent with the urban-rural dualistic context of the city. Another feature of the shared network is that it consists of several spatially separated communities that characterize local people's active zones but do not completely overlap with administrative areas. Based on these features, we propose incorporating human movement into infrastructure classification. The presence of well-defined, spatially separated clusters confirms the effectiveness of this approach. Our findings reveal the spatial structure inside a city, and the proposed approach provides a new perspective on integrating human movement into the study of a spatially distributed system.
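The two structural analyses the abstract mentions, testing the degree distribution against a scale-free pattern and extracting spatially separated communities, can be sketched with standard network tooling. A minimal illustration using networkx, with a hypothetical edge list standing in for the UDR-derived graph:

```python
# Sketch of the two analyses described above: checking whether a shared-usage
# network looks scale-free (via its degree distribution) and finding
# communities. Edges and weights are illustrative, not from the paper.
import networkx as nx
from networkx.algorithms import community
from collections import Counter

G = nx.Graph()
# Hypothetical edges: (service_point_a, service_point_b, shared_user_count)
G.add_weighted_edges_from([
    ("sp1", "sp2", 120), ("sp2", "sp3", 80),
    ("sp3", "sp4", 45),  ("sp1", "sp4", 30), ("sp4", "sp5", 10),
])

# Degree distribution: a scale-free network would show a power-law tail.
degree_counts = Counter(d for _, d in G.degree())
print("degree histogram:", dict(degree_counts))

# Community structure via modularity maximization (one of several options).
for i, c in enumerate(community.greedy_modularity_communities(G, weight="weight")):
    print(f"community {i}: {sorted(c)}")
```

The detected communities could then be compared against administrative boundaries, as the paper does, to see where people's active zones diverge from them.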
Haimes, Yacov Y
2012-11-01
Natural and human-induced disasters affect organizations in myriad ways because of the inherent interconnectedness and interdependencies among human, cyber, and physical infrastructures, but more importantly, because organizations depend on the effectiveness of people and on the leadership they provide to the organizations they serve and represent. These human-organizational-cyber-physical infrastructure entities are termed systems of systems. Given the multiple perspectives that characterize them, they cannot be modeled effectively with a single model. The focus of this article is: (i) the centrality of the states of a system in modeling; (ii) the efficacious role of shared states in modeling systems of systems, in identification, and in the meta-modeling of systems of systems; and (iii) the contributions of the above to strategic preparedness for, response to, and recovery from catastrophic risk to such systems. Strategic preparedness connotes a decision-making process and its associated actions, which must be implemented in advance of a natural or human-induced disaster, aimed at reducing consequences (e.g., recovery time, community suffering, and cost) and/or controlling their likelihood to a level considered acceptable (through the decision makers' implicit and explicit acceptance of various risks and tradeoffs). The inoperability input-output model (IIM), which is grounded in Leontief's input-output model, has enabled the modeling of interdependent subsystems. Two separate modeling structures are introduced: phantom system models (PSM), where shared states constitute the essence of modeling coupled systems; and the IIM, where interdependencies among sectors of the economy are manifested by the Leontief matrix of technological coefficients. This article demonstrates the potential contributions of these two models to each other, and thus to more informative modeling of systems-of-systems schemata. The contributions of shared states to this modeling and to systems identification are presented with case studies. © 2012 Society for Risk Analysis.
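For readers unfamiliar with the IIM, its standard formulation in the literature (e.g., Haimes & Jiang, 2001) expresses sector inoperability as a Leontief-type fixed point; the notation below is the conventional one and is not quoted from this article:

```latex
% Standard inoperability input-output model (IIM) formulation.
% q: vector of sector inoperabilities (0 = fully operable, 1 = inoperable)
% A*: interdependency matrix derived from Leontief technical coefficients
% c*: demand-side perturbation vector
\begin{align}
  q &= A^{*} q + c^{*} \\
  \Rightarrow\quad q &= (I - A^{*})^{-1} c^{*}
\end{align}
```

The entries of $A^{*}$ encode how inoperability propagates between interdependent sectors, so even a small demand perturbation $c^{*}$ can produce disproportionate system-wide inoperability when $(I - A^{*})^{-1}$ has large off-diagonal terms.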
The Internet of Samples in the Earth Sciences (iSamples)
NASA Astrophysics Data System (ADS)
Carter, M. R.; Lehnert, K. A.
2015-12-01
Across most Earth Science disciplines, research depends on the availability of samples collected above, at, and beneath Earth's surface, on the moon and in space, or generated in experiments. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). The Internet of Samples in the Earth Sciences (iSamples) is an initiative funded as a Research Coordination Network (RCN) within the EarthCube program to address this need. iSamples aims to advance the use of innovative cyberinfrastructure to connect physical samples and sample collections across the Earth Sciences with digital data infrastructures to revolutionize their utility for science. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture of a shared cyberinfrastructure for collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical samples. Creating awareness of the need to include physical samples in discussions of reproducible science is another priority of the iSamples RCN.
IEDA: Making Small Data BIG Through Interdisciplinary Partnerships Among Long-tail Domains
NASA Astrophysics Data System (ADS)
Lehnert, K. A.; Carbotte, S. M.; Arko, R. A.; Ferrini, V. L.; Hsu, L.; Song, L.; Ghiorso, M. S.; Walker, D. J.
2014-12-01
The Big Data world in the Earth Sciences so far exists primarily for disciplines that generate massive volumes of observational or computed data using large-scale, shared instrumentation such as global sensor networks, satellites, or high-performance computing facilities. These data are typically managed and curated by well-supported community data facilities that also provide the tools for exploring the data through visualization or statistical analysis. In many other domains, especially those where data are primarily acquired by individual investigators or small teams (known as 'Long-tail data'), data are poorly shared and integrated, lacking a community-based data infrastructure that ensures persistent access, quality control, standardization, and integration of data, as well as appropriate tools to fully explore and mine the data within the context of broader Earth Science datasets. IEDA (Integrated Earth Data Applications, www.iedadata.org) is a data facility funded by the US NSF to develop and operate data services that support data stewardship throughout the full life cycle of observational data in the solid earth sciences, with a focus on the data management needs of individual researchers. IEDA builds on a strong foundation of mature disciplinary data systems for marine geology and geophysics, geochemistry, and geochronology. These systems have dramatically advanced data resources in those long-tail Earth science domains. IEDA has strengthened these resources by establishing a consolidated, enterprise-grade infrastructure that is shared by the domain-specific data systems, and implementing joint data curation and data publication services that follow community standards. In recent years, other domain-specific data efforts have partnered with IEDA to take advantage of this infrastructure and improve data services to their respective communities with formal data publication, long-term preservation of data holdings, and better sustainability. IEDA hopes to foster such partnerships with streamlined data services, including user-friendly, single-point interfaces for data submission, discovery, and access across the partner systems to support interdisciplinary science.
Management of Knowledge Representation Standards Activities
NASA Technical Reports Server (NTRS)
Patil, Ramesh S. (Principal Investigator)
1993-01-01
This report describes the efforts undertaken over the last two years to identify the issues underlying the current difficulties in sharing and reuse, and a community wide initiative to overcome them. First, we discuss four bottlenecks to sharing and reuse, present a vision of a future in which these bottlenecks have been ameliorated, and describe the efforts of the initiative's four working groups to address these bottlenecks. We then address the supporting technology and infrastructure that is critical to enabling the vision of the future. Finally, we consider topics of longer-range interest by reviewing some of the research issues raised by our vision.
NASA Technical Reports Server (NTRS)
Buquo, Lynn E.; Johnson-Throop, Kathy A.
2011-01-01
An information architecture facilitates the understanding, and hence harnessing, of the human system risk-related data supply chain, which enhances the ability to securely collect, integrate, and share data assets that improve human system research and operations. By mapping the risk-related data flow from raw data to usable information and knowledge (think of it as a data supply chain), the Human Research Program (HRP) and Space Life Science Directorate (SLSD) are building an information architecture plan to leverage their existing, and often shared, IT infrastructure.
Geographic Hotspots of Critical National Infrastructure.
Thacker, Scott; Barr, Stuart; Pant, Raghav; Hall, Jim W; Alderson, David
2017-12-01
Failure of critical national infrastructures can result in major disruptions to society and the economy. Understanding the criticality of individual assets and the geographic areas in which they are located is essential for targeting investments to reduce risks and enhance system resilience. Within this study we provide new insights into the criticality of real-life critical infrastructure networks by integrating high-resolution data on infrastructure location, connectivity, interdependence, and usage. We propose a metric of infrastructure criticality in terms of the number of users who may be directly or indirectly disrupted by the failure of physically interdependent infrastructures. Kernel density estimation is used to integrate spatially discrete criticality values associated with individual infrastructure assets, producing a continuous surface from which statistically significant infrastructure criticality hotspots are identified. We develop a comprehensive and unique national-scale demonstration for England and Wales that utilizes previously unavailable data from the energy, transport, water, waste, and digital communications sectors. Testing of 200,000 failure scenarios identifies that hotspots are typically located around the periphery of urban areas, where there are large facilities upon which many users depend or where several critical infrastructures are concentrated in one location. © 2017 Society for Risk Analysis.
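The hotspot construction described above, turning per-asset criticality scores into a continuous surface, can be approximated with a weighted kernel density estimate. A minimal sketch with placeholder coordinates and user counts, omitting the paper's significance testing:

```python
# Sketch of the hotspot step: spatially discrete criticality values (users
# disrupted per asset) smoothed into a continuous surface via KDE.
# Coordinates and counts are placeholders, not the study's data.
import numpy as np
from scipy.stats import gaussian_kde

# Asset locations (x, y) and criticality = number of disrupted users.
xy = np.array([[0.1, 0.2], [0.15, 0.22], [0.8, 0.9], [0.82, 0.88], [0.5, 0.5]]).T
criticality = np.array([5000, 12000, 300, 450, 2000], dtype=float)

# Criticality-weighted KDE (bandwidth choice matters; default shown here).
kde = gaussian_kde(xy, weights=criticality / criticality.sum())

# Evaluate on a grid; high-density cells are candidate hotspots, which the
# paper then tests for statistical significance (not reproduced here).
gx, gy = np.mgrid[0:1:50j, 0:1:50j]
surface = kde(np.vstack([gx.ravel(), gy.ravel()])).reshape(gx.shape)
print("peak density cell:", np.unravel_index(surface.argmax(), surface.shape))
```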
NASA Astrophysics Data System (ADS)
Foglini, Federica; Grande, Valentina; De Leo, Francesco; Mantovani, Simone; Ferraresi, Sergio
2017-04-01
EVER-EST offers a framework based on advanced services delivered both at the e-infrastructure and domain-specific level, with the objective of supporting each phase of the Earth Science research and information lifecycle. It provides innovative e-research services to Earth Science user communities for communication, cross-validation and the sharing of knowledge and science outputs. The project follows a user-centric approach: real use cases taken from pre-selected Virtual Research Communities (VRC) covering different Earth Science research scenarios drive the implementation of the Virtual Research Environment (VRE) services and capabilities. The Sea Monitoring community is involved in the evaluation of the EVER-EST infrastructure. The community of potential users is wide and heterogeneous, including both multi-disciplinary scientists and national/international agencies and authorities (e.g., MPA directors, technicians from regional agencies such as ARPA in Italy, and staff of the Ministry of the Environment) concerned with better ways of measuring the quality of the environment. The scientific community has the main role of assessing the best criteria and indicators for defining Good Environmental Status (GES) in their own sub-regions, and of implementing methods, protocols and tools for monitoring the GES descriptors. According to the Marine Strategy Framework Directive (MSFD), the environmental status of marine waters is defined by 11 descriptors, with a proposed set of 29 associated criteria and 56 indicators. The objective of the Sea Monitoring VRC is to provide useful and applicable contributions to the evaluation of the descriptors D1 (Biodiversity), D2 (Non-indigenous species) and D6 (Seafloor Integrity) (http://ec.europa.eu/environment/marine/good-environmental-status/index_en.htm). The main challenges for the community members are: 1. discovery of existing data and products distributed among different infrastructures; 2. sharing methodologies for GES evaluation and monitoring; 3. working on the same workflows and data; 4. adopting shared, powerful tools for data processing (e.g., software and servers). The Sea Monitoring portal provides the VRC users with tools and services aimed at enhancing their ability to interoperate and share knowledge, experience and methods for GES assessment and monitoring, such as: • digital information services for data management, exploitation and preservation (accessibility of heterogeneous data sources including associated documentation); • e-collaboration services to communicate and share knowledge, ideas, protocols and workflows; • e-learning services to facilitate the use of common workflows for assessing GES indicators; • e-research services for workflow management, validation and verification, as well as visualization and interactive services. The current study is co-financed by the European Union's Horizon 2020 research and innovation programme under the EVER-EST project (Grant Agreement No. 674907).
Electronic manufacturing and packaging in Japan
NASA Technical Reports Server (NTRS)
Kelly, Michael J.; Boulton, William R. (Editor); Kukowski, John A.; Meieran, Eugene S.; Pecht, Michael; Peeples, John W.; Tummala, Rao R.
1995-01-01
This report summarizes the status of electronic manufacturing and packaging technology in Japan in comparison to that in the United States, and its impact on competition in electronic manufacturing in general. In addition to electronic manufacturing technologies, the report covers technology and manufacturing infrastructure, electronics manufacturing and assembly, quality assurance and reliability in the Japanese electronics industry, and successful product realization strategies. The panel found that Japan leads the United States in almost every electronics packaging technology. Japan clearly has achieved a strategic advantage in electronics production and process technologies. Panel members believe that Japanese competitors could be leading U.S. firms by as much as a decade in some electronics process technologies. Japan has established this marked competitive advantage in electronics as a consequence of developing low-cost, high-volume consumer products. Japan's infrastructure, and the remarkable cohesiveness of vision and purpose in government and industry, are key factors in the success of Japan's electronics industry. Although Japan will continue to dominate consumer electronics in the foreseeable future, opportunities exist for the United States and other industrial countries to capture an increasingly large part of the market. The JTEC panel has identified no insurmountable barriers that would prevent the United States from regaining a significant share of the consumer electronics market; in fact, there is ample evidence that the United States needs to aggressively pursue high-volume, low-cost electronic assembly, because it is a critical path leading to high-performance electronic systems.
Cleland, Verity; Hughes, Clarissa; Thornton, Lukar; Squibb, Kathryn; Venn, Alison; Ball, Kylie
2015-08-01
Social-ecological models of health behaviour acknowledge environmental influences, but research examining how the environment shapes physical activity in rural settings is limited. This study aimed to explore the environmental factors that act as barriers or facilitators to physical activity participation among rural adults. Forty-nine adults from three regions of rural Tasmania, Australia, participated in semi-structured interviews that explored features of the environment that supported or hindered physical activity. Interviews were digitally recorded, transcribed verbatim and analysed thematically. Four key themes emerged: functionality, diversity, spaces and places for all, and realistic expectations. 'Functionality' included connectivity with other destinations, distance, safety, continuity, supporting infrastructure and surfacing. While there was limited 'diversity' of structured activities and recreational facilities, the importance of easy and convenient access to a natural environment that accommodated physical activity was highlighted. 'Spaces and places for all' highlighted the importance of shared-use areas, particularly those that were family- and dog-friendly. Despite desires for more physical activity opportunities, many participants had 'realistic expectations' of what was feasible in rural settings. Functionality, diversity, spaces and places for all, and realistic expectations were identified as important considerations for physical activity among rural adults. Further research using quantitative approaches in larger samples is needed to confirm these findings. SO WHAT? Urban-centric views of environmental influences on physical activity are unlikely to be entirely appropriate for rural areas. Evidence-based recommendations are provided for creating new or modifying existing infrastructure to support active living in rural settings.
17 CFR 23.603 - Business continuity and disaster recovery.
Code of Federal Regulations, 2013 CFR
2013-04-01
..., facilities, infrastructure, personnel and competencies essential to the continued operations of the swap.... The individuals identified shall be authorized to make key decisions on behalf of the swap dealer or...
17 CFR 23.603 - Business continuity and disaster recovery.
Code of Federal Regulations, 2014 CFR
2014-04-01
..., facilities, infrastructure, personnel and competencies essential to the continued operations of the swap.... The individuals identified shall be authorized to make key decisions on behalf of the swap dealer or...
Information Technology and Community Restoration Studies/Task 1: Information Technology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Upton, Jaki F.; Lesperance, Ann M.; Stein, Steven L.
2009-11-19
Executive Summary: The Interagency Biological Restoration Demonstration—a program jointly funded by the Department of Defense's Defense Threat Reduction Agency and the Department of Homeland Security's (DHS's) Science and Technology Directorate—is developing policies, methods, plans, and applied technologies to restore large urban areas, critical infrastructures, and Department of Defense installations following the intentional release of a biological agent (anthrax) by terrorists. There is a perception that there should be a common system that can share information both vertically and horizontally amongst participating organizations as well as support analyses. A key question is: "How far away from this are we?" As part of this program, Pacific Northwest National Laboratory conducted research to identify the current information technology tools that would be used by organizations in the greater Seattle urban area in such a scenario, to define criteria for use in evaluating information technology tools, and to identify current gaps. Researchers interviewed 28 individuals representing 25 agencies in civilian and military organizations to identify the tools they currently use to capture data needed to support operations and decision making. The organizations can be grouped into five broad categories: defense (Department of Defense), environmental/ecological (Environmental Protection Agency/Ecology), public health and medical services, emergency management, and critical infrastructure. The types of information that would be communicated in a biological terrorism incident include critical infrastructure and resource status, safety and protection information, laboratory test results, and general emergency information. The most commonly used tools are WebEOC (web-enabled crisis information management systems with real-time information sharing), mass notification software, resource tracking software, and NW WARN (web-based information to protect critical infrastructure systems). It appears that the current information management tools are used primarily for information gathering and sharing—not decision making. Respondents identified the following criteria for a future software system. It is easy to learn, updates information in real time, works with all agencies, is secure, uses a visualization or geographic information system feature, enables varying permission levels, flows information from one stage to another, works with other databases, feeds decision support tools, is compliant with appropriate standards, and is reasonably priced. Current tools have security issues, lack visual/mapping functions and critical infrastructure status, and do not integrate with other tools. It is clear that there is a need for an integrated, common operating system. The system would need to be accessible by all the organizations that would have a role in managing an anthrax incident to enable regional decision making. The most useful tool would feature a GIS visualization that would allow for a common operating picture that is updated in real time. To capitalize on information gained from the interviews, the following activities are recommended: • Rate emergency management decision tools against the criteria specified by the interviewees. • Identify and analyze other current activities focused on information sharing in the greater Seattle urban area. • Identify and analyze information sharing systems/tools used in other regions.
NASA Astrophysics Data System (ADS)
Curdt, C.; Hoffmeister, D.; Bareth, G.; Lang, U.
2017-12-01
Science conducted in collaborative, cross-institutional research projects requires active sharing of research ideas, data, documents and further information in a well-managed, controlled and structured manner. Thus, it is important to establish corresponding infrastructures and services for the scientists. Regular project meetings and joint field campaigns support the exchange of research ideas. Technical infrastructures facilitate storage, documentation, exchange and re-use of data as results of scientific output. Additionally, publications, conference contributions, reports, pictures etc. should also be managed. Both knowledge and data sharing are essential to create synergies. Within the coordinated programme "Collaborative Research Center" (CRC), the German Research Foundation offers funding to establish research data management (RDM) infrastructures and services. CRCs are large-scale, interdisciplinary, multi-institutional, long-term (up to 12 years), university-based research institutions (up to 25 sub-projects). These CRCs address complex and scientifically challenging research questions. This poster presents the RDM services and infrastructures that have been established for two CRCs, both focusing on environmental sciences. Since 2007, an RDM support infrastructure and associated services have been set up for the CRC/Transregio 32 (CRC/TR32) "Patterns in Soil-Vegetation-Atmosphere-Systems: Monitoring, Modelling and Data Assimilation" (www.tr32.de). The experiences gained have been used to arrange RDM services for the CRC1211 "Earth - Evolution at the Dry Limit" (www.crc1211.de), funded since 2016. In both projects, scientists from various disciplines collect heterogeneous data in field campaigns or through modelling approaches. To manage the scientific output, the TR32DB data repository (www.tr32db.de) was designed and implemented for the CRC/TR32. This system was transferred and adapted to the CRC1211's needs (www.crc1211db.uni-koeln.de) in 2016. Both repositories support secure and sustainable data storage, backup, documentation, publication with DOIs, search, download and statistics, as well as web mapping features. Moreover, RDM consulting and support services as well as training sessions are carried out regularly.
SensorWeb Hub infrastructure for open access to scientific research data
NASA Astrophysics Data System (ADS)
de Filippis, Tiziana; Rocchi, Leandro; Rapisardi, Elena
2015-04-01
The sharing of research data is a new challenge for the scientific community, which may benefit from a large amount of information to address environmental and sustainability issues in agricultural and urban contexts. Prerequisites for this challenge are an infrastructure that ensures access, management and preservation of data, and technical support for coordinated, harmonious data management that, within the framework of Open Data policies, encourages re-use and collaboration. The neogeography and citizens-as-sensors approaches highlight that new data sources require a new set of tools and practices to collect, validate, categorize, and access these "crowdsourced" data, which complement the data sets produced in the scientific field, thus "feeding" the overall data available for analysis and research. When the scientific community embraces collaboration and sharing, access and re-use, adopting an open innovation approach, it should redesign and reshape its data management processes: the challenges of technological and cultural innovation, enabled by web 2.0 technologies, lead to a scenario where the sharing of structured and interoperable data becomes the unavoidable building block of a new paradigm of scientific research. In this perspective, the Institute of Biometeorology, CNR, whose aim is to contribute to the sharing and development of research data, has developed the "SensorWebHub" (SWH) infrastructure to support the scientific activities carried out in several research projects at national and international level. It is designed to manage both mobile and fixed open source meteorological and environmental sensors, in order to integrate the existing agro-meteorological and urban monitoring networks. The proposed architecture uses open source tools to ensure sustainability in the development and deployment of web applications with geographic features and custom analyses, as requested by the different research projects. The SWH components are organized in a typical client-server architecture and interact from the sensing process through to the presentation of results to end users. The web application enables users to view and analyse the data stored in the GeoDB. The interface is designed to standard Internet browser specifications, allowing visualization of collected data in different formats (tabular, chart and geographic map). The services for the dissemination of geo-referenced information adopt the OGC specifications. SWH is a bottom-up collaborative initiative to share real-time research data and pave the way for an open innovation approach in scientific research. To date, this framework has been used for several WebGIS applications and web apps for environmental monitoring at different temporal and spatial scales.
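As an illustration of the OGC-based dissemination layer described above, a client might retrieve observations with a Sensor Observation Service (SOS) request; the endpoint, offering and property names below are hypothetical, and the actual SWH service layout may differ:

```python
# Illustrative client-side call against an OGC Sensor Observation Service
# (SOS) endpoint, of the kind an OGC-compliant dissemination layer exposes.
# The endpoint URL, offering, and property names are hypothetical.
import requests

params = {
    "service": "SOS",
    "version": "2.0.0",
    "request": "GetObservation",
    "offering": "urn:swh:offering:meteo",    # hypothetical offering id
    "observedProperty": "air_temperature",   # hypothetical property name
    "responseFormat": "application/json",
}
resp = requests.get("https://example.org/swh/sos", params=params, timeout=30)
resp.raise_for_status()
print(resp.json())  # observations ready for tabular, chart, or map rendering
```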
NASA Astrophysics Data System (ADS)
Robinson, E.
2015-12-01
The Federal Government has a long history of cross-community coordination between the scientific research community and the Earth observations and data provider communities. Since 1998, the Federation of Earth Science Information Partners (ESIP) has organically organized this coordination using a collective impact approach, fostering interactions primarily around Earth science interoperability problems. Unlike most collaborations, collective impact initiatives, named in 2011 by the Stanford Social Innovation Review, involve a backbone infrastructure, a dedicated staff, and a structured process that leads to a common agenda, shared measurement, continuous communication, and mutually reinforcing activities among all participants. Over the last ten years, the Foundation for Earth Science (FES) has built a proven track record of providing backbone support to ESIP. This presentation will cover FES's general approach to providing backbone support that enables communities to define a shared agenda, and will then illustrate these practices with two case studies: (1) ESIP at large as a mature network of developed partnerships and (2) a new project, the Local Community Resilience cluster. This new cluster aims to extend the established ESIP network to engage local communities, in order to equip citizens, professionals, and other decision makers with the scientific underpinning necessary to make informed decisions (bounce forward) for society, by leveraging the strong existing ESIP community, the backbone capabilities of FES, and Federal Earth science, technology and innovation investments.
Setting the Stage for Harmonized Risk Assessment by Seismic Hazard Harmonization in Europe (SHARE)
NASA Astrophysics Data System (ADS)
Woessner, Jochen; Giardini, Domenico; SHARE Consortium
2010-05-01
Probabilistic seismic hazard assessment (PSHA) is arguably one of the most useful products that seismology can offer to society. PSHA characterizes the best available knowledge of the seismic hazard of a study area, ideally taking into account all sources of uncertainty. Results form the baseline for informed decision making, such as building codes or insurance rates, and provide essential input to every risk assessment application. Several large-scale national and international projects have recently been launched aimed at improving and harmonizing PSHA standards around the globe. SHARE (www.share-eu.org) is a European Commission-funded project in Framework Programme 7 (FP7) that will create an updated, living seismic hazard model for the Euro-Mediterranean region. SHARE is a regional component of the Global Earthquake Model (GEM, www.globalquakemodel.org), a public/private partnership initiated and approved by the Global Science Forum of the OECD-GSF. GEM aims to be the uniform, independent and open-access standard for calculating and communicating earthquake hazard and risk worldwide. SHARE itself will deliver measurable progress in all steps leading to a harmonized assessment of seismic hazard - in the definition of engineering requirements, in the collection of input data, in procedures for hazard assessment, and in engineering applications. SHARE scientists will create a unified framework and computational infrastructure for seismic hazard assessment and produce an integrated European probabilistic seismic hazard assessment (PSHA) model and specific scenario-based modeling tools. The results will deliver long-lasting structural impact in areas of societal and economic relevance, serving as the reference for Eurocode 8 (EC8) application and providing homogeneous input for correct seismic safety assessment of critical industry, such as energy infrastructures and the re-insurance sector. SHARE will cover the whole European territory, the Maghreb countries in the southern Mediterranean and Turkey in the eastern Mediterranean. By strongly involving the seismic engineering community, the project maintains a direct connection to Eurocode 8 applications and the definition of the Nationally Determined Parameters, through the participation of the CEN/TC250/SC8 committee in defining the output specification requirements and in hazard validation. SHARE will thus produce direct outputs for risk assessment. With this contribution, we provide an overview of the goals and current achievements of the project.
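As background to the harmonization effort, the quantity such PSHA models compute is the classical (Cornell-type) hazard integral; the notation below is the conventional one from the PSHA literature, not taken from SHARE deliverables:

```latex
% Classical PSHA hazard integral (Cornell-type formulation).
% \nu_i: activity rate of seismic source i
% f_{M_i}, f_{R_i}: magnitude and distance probability densities for source i
% P(IM > x \mid m, r): exceedance probability from a ground-motion model
\begin{equation}
  \lambda(IM > x) \;=\; \sum_{i} \nu_i
    \int_{m} \int_{r} P(IM > x \mid m, r)\, f_{M_i}(m)\, f_{R_i}(r)\, dr\, dm
\end{equation}
```

Harmonization across borders amounts to agreeing on the source models ($\nu_i$, $f_{M_i}$, $f_{R_i}$) and ground-motion models entering this integral, so that hazard values do not jump artificially at national boundaries.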
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kain, J.F.; Gittell, R.; Daniere, A.
1992-01-01
The report surveys the growing use of bus and carpool priority measures to increase the productivity of the nation's transportation infrastructure. While it identifies a wide variety of priority measures, the report principally focuses on the planning and operation of exclusive and shared busways and high-occupancy vehicle (HOV) facilities. It presents a variety of case studies describing the implementation of busways and transitways. The document also compares the cost-effectiveness of exclusive busways and bus-HOV facilities with the cost-effectiveness of recently completed light and heavy rail lines. It also explores the options and problems in serving large downtown areas.
BIM Methodology Approach to Infrastructure Design: Case Study of Paniga Tunnel
NASA Astrophysics Data System (ADS)
Osello, Anna; Rapetti, Niccolò; Semeraro, Francesco
2017-10-01
Nowadays, the implementation of Building Information Modelling (BIM) in civil design represents a new challenge for the AECO (Architecture, Engineering, Construction, Owner and Operator) world, one that will attract the interest of many researchers in the coming years. This is driven by the incentives of public administrations and European directives that aim to improve efficiency and to enable better management of the complexity of infrastructure projects. For these reasons, the goal of this research is to propose a methodology for the use of BIM in a tunnel project, analysing the definition of a correct level of detail (LOD) and the possibility of sharing information via interoperability for FEM analysis.
Quantum metropolitan optical network based on wavelength division multiplexing.
Ciurana, A; Martínez-Mateo, J; Peev, M; Poppe, A; Walenta, N; Zbinden, H; Martín, V
2014-01-27
Quantum Key Distribution (QKD) is maturing quickly. However, the current approaches to its application in optical networks make it an expensive technology. QKD networks deployed to date are designed as a collection of point-to-point, dedicated QKD links in which non-neighboring nodes communicate using the trusted-repeater paradigm. We propose a novel optical network model in which QKD systems share the communication infrastructure by wavelength-multiplexing their quantum and classical signals. The routing is done using optical components within a metropolitan area, which allows for a dynamic any-to-any communication scheme. Moreover, it resembles a commercial telecom network, takes advantage of existing infrastructure and utilizes commercial components, allowing for an easy, cost-effective and reliable deployment.
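A toy illustration of the central constraint in such shared-fiber designs: classical channels must be assigned wavelengths with enough spectral separation from the quantum channel to limit noise (e.g., Raman scattering) leaking into it. The grid spacing and guard band below are illustrative values, not the paper's parameters:

```python
# Simplified wavelength-assignment sketch for a shared quantum/classical
# fiber. Grid values, the quantum channel, and the minimum separation are
# illustrative assumptions, not taken from the proposed network model.
ITU_GRID_NM = [1530 + 0.8 * i for i in range(20)]   # coarse DWDM grid
QUANTUM_CH_NM = 1538.0                              # assumed quantum channel
MIN_SEPARATION_NM = 3.2                             # assumed guard band

def assign_classical_channels(n_needed):
    """Pick classical wavelengths outside the guard band around the quantum channel."""
    usable = [w for w in ITU_GRID_NM
              if abs(w - QUANTUM_CH_NM) >= MIN_SEPARATION_NM]
    if len(usable) < n_needed:
        raise ValueError("not enough wavelengths on this grid")
    return usable[:n_needed]

print(assign_classical_channels(4))
```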
A Grid Infrastructure for Supporting Space-based Science Operations
NASA Technical Reports Server (NTRS)
Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)
2002-01-01
Emerging technologies for computational grid infrastructures have the potential to revolutionize the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing at lower cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids give researchers, scientists and engineers their first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
PANDOLFI, RONALD; KUMAR, DINESH; VENKATAKRISHNAN, SINGANALLUR
Xi-CAM aims to provide a community-driven platform for multimodal analysis in synchrotron science. The platform core provides a robust plugin infrastructure for extensibility, allowing continuing development to simply add further functionality. Current modules include tools for characterization with (GI)SAXS, tomography, and XAS. This will continue to serve as a development base as algorithms for multimodal analysis develop. Seamless remote data access, visualization and analysis are key elements of Xi-CAM, and will become critical to synchrotron data infrastructure as expectations for future data volume and acquisition rates rise with continuously increasing throughputs. The highly interactive design elements of Xi-CAM will similarly support a generation of users who depend on immediate data-quality feedback during high-throughput or burst acquisition modes.
HUGO urges genetic benefit-sharing.
2000-01-01
In view of the fact that for-profit enterprise exceeds public expenditures on genetic research and that benefits from the Human Genome Project may accrue only to rich people in rich nations, the HUGO Ethics Committee discussed the necessity of benefit-sharing. Discussions involved case examples ranging from single-gene to multi-factorial disorders and included the difficulties of defining community, especially when multifactorial diseases are involved. The Committee discussed arguments for benefit-sharing, including common heritage, the genome as a common resource, and three types of justice: compensatory, procedural, and distributive. The Committee also discussed the importance of community participation in defining benefit, agreed that companies involved in health have special obligations beyond paying taxes, and recommended they devote 1-3% of net profits to healthcare infrastructure or humanitarian efforts.
Towards the cyber security paradigm of ehealth: Resilience and design aspects
NASA Astrophysics Data System (ADS)
Rajamäki, Jyri; Pirinen, Rauno
2017-06-01
Digital technologies have significantly changed the role of healthcare clients in seeking and receiving medical help, and have raised new cooperative policy issues in cross-border healthcare services. Citizens continue to take a more co-creative role in decisions about their own healthcare, and new technologies can enable and facilitate this emergent trend. In this study, healthcare services are treated as a critical societal sector, and healthcare systems are therefore regarded as critical infrastructures that ought to be protected from all types of hazards, including cyber security threats and attacks. Despite continual progress in the systemic risk management of the cyber domain, it is clear that anticipation and prevention of all possible types of attack and malfunction are not achievable for current or future cyber infrastructures. This study focuses on the investigation of a cyber security paradigm, adaptive systems and the sense of resilience in a healthcare critical information infrastructure.
2013-12-01
First, any subproject that involved an implementation shared some implementation infrastructure with other subprojects. For example, the Plaid backend ...very same language. We followed this advice in Plaid, and we therefore implemented the compiler backend in Plaid (code generation, type checker, Æminim...programming language aimed at enforcing security properties in web and mobile applications [Nistor et al., 2013]. Wyvern therefore provides an excellent
ERIC Educational Resources Information Center
Shen, Yi
2016-01-01
The data landscape study at Virginia Tech addresses the changing modes of faculty scholarship and supports the development of a user-centric data infrastructure, management, and curation system. The study investigates faculty researchers' current practices in organizing, describing, and preserving data and the emerging needs for services and…
DOT National Transportation Integrated Search
2017-10-27
This Devices to Everything (D2X) Acceptance Test Plan (ATP) and Summary Report provides the plan, test cases, and test procedures that were used to verify Prototype System (version 2.0) system requirements, as well as a summary of results of the test...
As the Economic Crisis Hits Home, Colleges Seek Help from Congress
ERIC Educational Resources Information Center
Field, Kelly
2008-01-01
Congress is crafting a second economic-stimulus bill, and the nation's colleges, hit by the deepening fiscal crisis, want a share of the money. Over the last few weeks, colleges and their lobbyists have bombarded members of Congress with letters and phone calls seeking money for research, student aid, and infrastructure. However, Congress is…
Establishing a Nation Wide Infrastructure for Systematic Use of Patient Reported Information.
Jensen, Sanne; Lyng, Karen Marie
2018-01-01
In Denmark, we have set up a program to establish a nationwide infrastructure for Patient Reported Outcome (PRO) questionnaires. The effort is divided into an IT infrastructure part and a questionnaire development part. This paper describes how development and evaluation are closely knit together in the two tracks: complexity in the PRO field is high, and IT infrastructure, legal issues, varied clinical workflows and numerous stakeholders have to be taken into account concurrently. In the design process, we have thus used a participatory design approach to ensure a high level of active stakeholder involvement and the capability to address all the relevant issues. In the next phases, we will apply the IT infrastructure in the planned full-scale evaluation of the questionnaires developed in the first phase, while we continue to develop new national questionnaires.
Does a House Divided Stand? Kinship and the Continuity of Shared Living Arrangements
Glick, Jennifer E.; Van Hook, Jennifer
2011-01-01
Shared living arrangements can provide housing, economies of scale, and other instrumental support and may become an important resource in times of economic constraint. But the extent to which such living arrangements experience continuity or rapid change in composition is unclear. Previous research on extended-family households tended to focus on factors that trigger the onset of coresidence, including life course events or changes in health status and related economic needs. Relying on longitudinal data from 9,932 households in the Survey of Income and Program Participation (SIPP), the analyses demonstrate that the distribution of economic resources in the household also influences the continuity of shared living arrangements. The results suggest that multigenerational households of parents and adult children experience greater continuity in composition when one individual or couple has a disproportionate share of the economic resources in the household. Other coresidential households, those shared by other kin or nonkin, experience greater continuity when resources are more evenly distributed. PMID:22259218
Towards a single seismological service infrastructure in Europe
NASA Astrophysics Data System (ADS)
Spinuso, A.; Trani, L.; Frobert, L.; Van Eck, T.
2012-04-01
In the last five years, services and data providers within the seismological community in Europe have focused their efforts on migrating their open archives towards a Service-Oriented Architecture (SOA). This process pragmatically follows technological trends and available solutions, aiming to effectively improve all data stewardship activities. These advancements are possible thanks to the cooperation and follow-up of several EC infrastructural projects that, by adopting general-purpose techniques, combine their developments while envisioning a multidisciplinary platform for Earth observation as the final common objective (EPOS, the European Plate Observing System). One of the first results of this effort is the Earthquake Data Portal (http://www.seismicportal.eu), which provides a collection of tools to discover, visualize and access a variety of seismological data sets such as seismic waveforms, accelerometric data, earthquake catalogs and parameters. The Portal offers a cohesive distributed search environment, linking data search and access across multiple data providers through interactive web services, map-based tools and diverse command-line clients. Our work continues under other EU FP7 projects; here we address initiatives in two of them. The NERA (Network of European Research Infrastructures for Earthquake Risk Assessment and Mitigation) project will implement a Common Services Architecture based on OGC service APIs, in order to provide resource-oriented common interfaces across the data access and processing services. This will improve interoperability between tools and across projects, enabling the development of higher-level applications that can uniformly access the data and processing services of all participants. This effort will be conducted jointly with the VERCE project (Virtual Earthquake and Seismology Research Community for Europe). VERCE aims to enable seismologists to exploit the wealth of seismic data within a data-intensive computation framework tailored to the specific needs of the community. It will provide a new interoperable infrastructure as the computational backbone behind the publicly available interfaces. VERCE will face the challenges of implementing a service-oriented architecture that provides an efficient layer between the data and Grid infrastructures, coupling HPC data analysis and HPC data modeling applications through the execution of workflows and data-sharing mechanisms. Online registries of interoperable workflow components, storage of intermediate results, and data provenance are the aspects currently under investigation to make the VERCE facilities usable by a wide range of users, data and service providers. For these purposes, the adoption of a Digital Object Architecture to create online catalogs that reference and semantically describe all these distributed resources, such as datasets, computational processes and derivative products, is seen as a viable solution to monitor and steer the usage of the infrastructure, increasing its efficiency and cooperation within the community.
Sustaining Research Networks: the Twenty-Year Experience of the HMO Research Network
Steiner, John F.; Paolino, Andrea R.; Thompson, Ella E.; Larson, Eric B.
2014-01-01
Purpose: As multi-institutional research networks assume a central role in clinical research, they must address the challenge of sustainability. Despite its importance, the concept of network sustainability has received little attention in the literature, and the sustainability strategies of durable scientific networks have not been described. Innovation: The Health Maintenance Organization Research Network (HMORN) is a consortium of 18 research departments in integrated health care delivery systems with over 15 million members in the United States and Israel. The HMORN has coordinated federally funded scientific networks and studies since 1994. This case study describes the HMORN approach to sustainability, proposes an operational definition of network sustainability, and identifies 10 essential elements that can enhance sustainability. Credibility: The sustainability framework proposed here is drawn from prior publications on organizational issues by HMORN investigators and from the experience of recent HMORN leaders and senior staff. Conclusion and Discussion: Network sustainability can be defined as (1) the development and enhancement of shared research assets to facilitate a sequence of research studies in a specific content area or multiple areas, and (2) a community of researchers and other stakeholders who reuse and develop those assets. Essential elements needed to develop the shared assets of a network include: network governance; trustworthy data and processes for sharing data; shared knowledge about research tools; administrative efficiency; physical infrastructure; and infrastructure funding. The community of researchers within a network is enhanced by: a clearly defined mission, vision and values; protection of human subjects; a culture of collaboration; and strong relationships with host organizations. While the importance of these elements varies based on the membership and goals of a network, this framework for sustainability can enhance strategic planning within the network and can guide relationships with external stakeholders. PMID:25848605
NASA Technical Reports Server (NTRS)
Maluf, David A.; Shetye, Sandeep D.; Chilukuri, Sri; Sturken, Ian
2012-01-01
Cloud computing can reduce cost significantly because businesses can share computing resources. In recent years, small and medium businesses (SMBs) have used the Cloud effectively for cost saving and for sharing IT expenses. With the success of SMBs, many perceive that larger enterprises ought to move into the Cloud environment as well. Government agencies' stove-piped environments are being considered as candidates for potential use of the Cloud, either as an enterprise entity or as pockets of small communities. Cloud computing is the delivery of computing as a service rather than as a product, whereby shared resources, software, and information are provided to computers and other devices as a utility over a network. Underneath the offered services there exists a modern infrastructure, the cost of which is often spread across its services or its investors. As NASA is considered an enterprise-class organization, like other enterprises a shift has been occurring in perceiving its IT services as candidates for Cloud services. This paper discusses market trends in cloud computing from an enterprise angle and then addresses the topic of cloud computing for NASA in two possible forms. First, in the form of a public Cloud to support it as an enterprise, as well as to share with the commercial sector and public at large. Second, as a private Cloud wherein the infrastructure is operated solely for NASA, whether managed internally or by a third party and hosted internally or externally. The paper addresses the strengths and weaknesses of both paradigms of public and private Clouds, in both internally and externally operated settings. The content of the paper is from a NASA perspective but is applicable to any large enterprise with thousands of employees and contractors.
Hydrogen Vehicles: Impacts of DOE Technical Targets on Market Acceptance and Societal Benefits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Zhenhong; Dong, Jing; Greene, David L
2013-01-01
Hydrogen vehicles (H2V), including H2 internal combustion engine, fuel cell and fuel cell plug-in hybrid vehicles, could greatly reduce petroleum consumption and greenhouse gas (GHG) emissions in the transportation sector. The U.S. Department of Energy has adopted targets for vehicle component technologies to address key technical barriers to widespread commercialization of H2Vs. This study estimates the market acceptance of H2Vs and the resulting societal benefits and subsidy in 41 scenarios that reflect a wide range of progress in meeting these technical targets. Important results include: (1) H2Vs could reach 20-70% market shares by 2050, depending on progress in achieving the technical targets. With a basic hydrogen infrastructure (~5% hydrogen availability), the H2V market share is estimated to be 2-8%. Fuel cell and hydrogen costs are the most important factors affecting the long-term market shares of H2Vs. (2) Meeting all technical targets on time could result in about an 80% cut in petroleum use and a 62% (or 72% with aggressive electricity de-carbonization) reduction in GHG in 2050. (3) The required hydrogen infrastructure subsidy is estimated to range from $22 to $47 billion and the vehicle subsidy from $4 to $17 billion. (4) Long-term H2V market shares, societal benefits and hydrogen subsidies appear to be highly robust against delay in one target, if all other targets are met on time. R&D diversification could provide insurance for greater societal benefits. (5) Both H2Vs and plug-in electric vehicles could exceed 50% market shares by 2050, if all targets are met on time. The overlapping technology, the fuel cell plug-in hybrid electric vehicle, appears attractive both in the short and long runs, but for different reasons.
NASA Technical Reports Server (NTRS)
Swenson, Paul
2017-01-01
Satellite/payload ground systems are typically highly customized to a specific mission's use cases and utilize hundreds (or thousands) of specialized point-to-point interfaces for data flows and file transfers. Documentation and tracking of these complex interfaces require extensive development time and extremely high staffing costs; implementation and testing are even more cost-prohibitive, and documentation often lags behind implementation, resulting in inconsistencies down the road. With expanding threat vectors, IT security, information assurance, and operational security have become key ground system architecture drivers. New Federal security-related directives are generated on a daily basis, imposing new requirements on existing ground systems, and these mandated activities and data calls typically carry little or no additional funding for implementation. As a result, ground system sustaining engineering groups and information technology staff continually struggle to keep up with the rolling tide of security. Advancing security concerns and shrinking budgets are pushing these large stove-piped ground systems to begin sharing resources: operational and sysadmin staff, IT security baselines, architecture decisions, or even networks and hosting infrastructure. Refactoring these existing ground systems into multi-mission assets proves extremely challenging because of the typically very tight coupling between legacy components; as a result, many "multi-mission" operations environments end up simply sharing compute resources and networks. Utilizing continuous integration and rapid system deployment technologies in conjunction with an open-architecture messaging approach allows system engineers and architects to worry less about the low-level details of interfaces between components and system configuration. GMSEC messaging is inherently designed to support multi-mission requirements and allows components to aggregate data across multiple homogeneous or heterogeneous satellites or payloads; the highly successful Goddard Science and Planetary Operations Control Center (SPOCC) utilizes GMSEC as the hub for its automation and situational awareness capability. This shifts focus towards getting the ground system to a final configuration-managed baseline, as well as towards multi-mission, big-picture capabilities that increase situational awareness, promote cross-mission sharing, and establish enhanced fleet management capabilities across all levels of the enterprise.
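To make the message-bus idea concrete, a minimal in-process publish/subscribe sketch follows. It is a generic stand-in for the pattern, not the GMSEC API itself; the subject naming is merely GMSEC-flavored and all names are illustrative:

```python
# Minimal publish/subscribe sketch of the message-bus pattern that
# GMSEC-style architectures use to decouple ground-system components.
# Generic in-process stand-in, not the GMSEC API; names are illustrative.
from collections import defaultdict
from typing import Callable

class MessageBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, subject: str, handler: Callable[[dict], None]):
        """Register a handler for all messages published on a subject."""
        self._subs[subject].append(handler)

    def publish(self, subject: str, message: dict):
        """Deliver a message to every subscriber of the subject."""
        for handler in self._subs[subject]:
            handler(message)

bus = MessageBus()
# A situational-awareness component listens for telemetry from any mission.
bus.subscribe("GMSEC.MISSION1.SAT1.TLM", lambda m: print("TLM received:", m))
bus.publish("GMSEC.MISSION1.SAT1.TLM", {"battery_v": 28.1, "mode": "NOMINAL"})
```

Because components address each other only by subject, a new mission or monitoring tool can be attached by subscribing to the bus rather than by building another point-to-point interface.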
NASA Astrophysics Data System (ADS)
Klima, K.
2013-12-01
Today's environmental problems stretch beyond the bounds of most academic disciplines, and thus solutions require an interdisciplinary approach. For instance, the scientific consensus is that the frequency and severity of many types of extreme weather events are increasing (IPCC 2012). Yet despite our efforts to reduce greenhouse gases, we continue to experience severe weather events such as Superstorm Sandy, record heat and blizzards, and droughts. These natural hazards, combined with increased vulnerability and exposure, result in longer-lasting disruptions to critical infrastructure and business continuity throughout the world. In order to protect both our lives and the economy, we must think beyond the bounds of any one discipline and draw on an integrated assessment of relevant work. In the wake of recent events, New York City, Washington, DC, Chicago, and a myriad of other cities have turned to their academic powerhouses for assistance in better understanding their vulnerabilities. This talk will share a case study of the state of integrated assessments and vulnerability studies of energy, transportation, water, real estate, and other main sectors in Pittsburgh, PA. The talk will then use integrated assessment models and other vulnerability studies to create coordinated sets of climate projections for use by the many public agencies and private-sector organizations in the region.
Buzzelli, Michelle M; Morgan, Paula; Muschek, Alexander G; Macgregor-Skinner, Gavin
2014-01-01
Lack of success in disaster recovery occurs for many reasons, with one predominant catalyst for catastrophic failure being flawed and inefficient communication systems. Devastating environmental hazards and human-caused disasters will continue to increase in frequency throughout the United States and around the globe as continuous, intensive urbanization forces human populations into more concentrated and interconnected societies. With the rapid evolution of technology and the advent of information and communication technology (ICT) interfaces such as Facebook, Twitter, Flickr, Myspace, and smartphone technology, communication is no longer a unidirectional flow of information traveling from the newsroom to the public. In the event of a disaster, time-critical information can be exchanged to and from any person or organization simultaneously, with the capability to receive feedback. A literature review of current information regarding the use of ICT as information infrastructure in disaster management during human-caused and natural disasters will be conducted. This article asserts that the integrated use of ICTs as multidirectional information sharing tools throughout the disaster cycle will increase a community's resiliency and supplement the capabilities of first responders and emergency management officials by providing real-time updates and the information needed to respond to and recover from a disaster.
Geospatial-enabled Data Exploration and Computation through Data Infrastructure Building Blocks
NASA Astrophysics Data System (ADS)
Song, C. X.; Biehl, L. L.; Merwade, V.; Villoria, N.
2015-12-01
Geospatial data are present everywhere today with the proliferation of location-aware computing devices and sensors. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. The GABBs project aims at enabling broader access to geospatial data exploration and computation by developing spatial data infrastructure building blocks that leverage the capabilities of the end-to-end application service and virtualized computing framework in HUBzero. Funded by the NSF Data Infrastructure Building Blocks (DIBBS) initiative, GABBs provides a geospatial data architecture that integrates spatial data management, mapping and visualization, and will make it available as open source. The outcome of the project will enable users to rapidly create tools and share geospatial data and tools on the web for interactive exploration of data, without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the development of the geospatial data infrastructure building blocks and the scientific use cases that help drive the software development, as well as seek feedback from the user communities.
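A minimal sketch of the kind of low-barrier raster exploration such building blocks aim to enable, assuming a local GeoTIFF and the open-source rasterio and matplotlib packages (neither is part of GABBs itself; the file name is invented for illustration):

```python
# Quick-look raster exploration without desktop GIS software.
# Assumes a local GeoTIFF (e.g. a DEM); rasterio and matplotlib are
# open-source packages used here only to illustrate the idea.
import rasterio
import matplotlib.pyplot as plt

with rasterio.open("elevation.tif") as src:   # hypothetical file name
    band = src.read(1)                        # first raster band as a numpy array
    bounds = src.bounds                       # georeferenced extent

plt.imshow(band, cmap="terrain",
           extent=(bounds.left, bounds.right, bounds.bottom, bounds.top))
plt.colorbar(label="elevation (m)")
plt.title("Quick-look raster exploration")
plt.show()
```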
A Spatial Data Infrastructure to Share Earth and Space Science Data
NASA Astrophysics Data System (ADS)
Nativi, S.; Mazzetti, P.; Bigagli, L.; Cuomo, V.
2006-05-01
Spatial Data Infrastructure (SDI), also known as Geospatial Data Infrastructure, is fundamentally a mechanism to facilitate the sharing and exchange of geospatial data. An SDI is a scheme necessary for the effective collection, management, access, delivery and utilization of geospatial data; it is important for objective decision making and sound land-based policy, for supporting economic development, and for encouraging socially and environmentally sustainable development. As far as data models and semantics are concerned, a valuable and effective SDI should be able to cross the boundaries between the Geographic Information System/Science (GIS) and Earth and Space Science (ESS) communities. Hence, an SDI should be able to discover, access and share information and data produced and managed by both the GIS and ESS communities, in an integrated way. In other words, an SDI must be built on a conceptual and technological framework which abstracts the nature and structure of the shared datasets: feature-based data or Imagery, Gridded and Coverage Data (IGCD). ISO TC211 and the Open Geospatial Consortium have provided important artifacts for building up this framework. In particular, the OGC Web Services (OWS) initiatives and several Interoperability Experiments (e.g. the GALEON IE) are extremely useful for this purpose. We present an SDI solution which is able to manage both GIS and ESS datasets. It is based on OWS and other well-accepted or promising technologies, such as UNIDATA netCDF and CDM, ncML and ncML-GML. Moreover, it uses a specific technology, GI-Cat, to implement a distributed and federated system of catalogues. This technology performs data model mediation and protocol adaptation tasks. It is used to implement a metadata clearinghouse service exposing a common (federal) catalogue model based on the ISO 19115 core metadata for geo-datasets. Nevertheless, other well-accepted or standard catalogue data models can easily be implemented as the common view (e.g. OGC CS-W, the forthcoming INSPIRE discovery metadata model, etc.). The proposed solution has been conceived and developed for building up the "Lucan SDI", the SDI of the Italian Basilicata Region. It aims to connect the following data providers and users: the National River Basin Authority of Basilicata, the Regional Environmental Agency, the Land Management & Cadastre Regional Authorities, the Prefecture, the Regional Civil Protection Centers, the National Research Council Institutes in Basilicata, academia, and several SMEs.
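The clearinghouse pattern described above (one federated catalogue view over heterogeneous providers) can be exercised from a client with a standard CSW query. The sketch below uses the open-source OWSLib package; the endpoint URL and search term are assumptions for illustration, not the actual Lucan SDI address:

```python
# Querying an ISO 19115-based catalogue through the OGC CSW interface.
# OWSLib is an open-source client library; the endpoint URL below is a
# hypothetical GI-Cat clearinghouse address, used only for illustration.
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("http://example.org/gi-cat/services/cswiso")  # hypothetical
query = PropertyIsLike("csw:AnyText", "%river basin%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec_id, "-", rec.title)
```

Because the mediation happens server-side, the same client code works whether the records originate from GIS feature catalogues or ESS coverage archives.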
Hidden concerns of sharing research data by low/middle-income country scientists.
Bezuidenhout, Louise; Chakauya, Ereck
2018-01-01
There has been considerable interest in bringing low/middle-income country (LMIC) scientists into discussions on Open Data - both as contributors and as users. The establishment of in situ data sharing practices within LMIC research institutions is vital for the development of an Open Data landscape in the Global South. Nonetheless, many LMICs face significant challenges in resource provision, research support and extra-laboratory infrastructure. These low-resourced environments shape data sharing activities, but are rarely examined within Open Data discourse. In particular, little attention is given to how these research environments shape scientists' perceptions of data sharing (dis)incentives. This paper expands on these issues of incentivizing data sharing, using data from a quantitative survey disseminated to life scientists in 13 countries in sub-Saharan Africa. The survey interrogated not only perceptions of data sharing amongst LMIC scientists, but also how these are connected to the research environments and daily challenges they experience. The paper offers a series of analyses of commonly cited (dis)incentives, such as data sharing as a means of improving research visibility, sharing and funding, and online connectivity. It identifies key areas that the Open Data community needs to consider if true openness in research is to be established in the Global South.
Landscape trajectory of natural boreal forest loss as an impediment to green infrastructure.
Svensson, Johan; Andersson, Jon; Sandström, Per; Mikusiński, Grzegorz; Jonsson, Bengt-Gunnar
2018-06-08
Loss of natural forests has been identified as a critical conservation challenge worldwide. This loss impedes the establishment of a functional green infrastructure as a spatiotemporally connected, landscape-scale network of habitats enhancing biodiversity, favorable conservation status and ecosystem services. In many regions this loss is caused by forest clearcutting. Through retrospective analysis of satellite images we assessed a 50-60 year spatiotemporal trajectory of clearcutting impact on natural and near-natural boreal forests across a sizable and representative region stretching from the Gulf of Bothnia to the Scandinavian Mountain Range in northern Fennoscandia. Our analysis broadly covers the whole forest clearcutting period, and thus our study approach and results can be applied for comprehensive impact assessment of industrial forest management. Our results demonstrate profound disturbance of the natural forest landscape configuration. The whole forest landscape is in a late phase of a transition from a natural or near-natural to a land-use-modified state. Our results provide evidence of natural forest loss and spatial polarization at the regional scale, with a predominant share of valuable habitats left in the mountain area, whereas the inland area has been more severely impacted. We highlight the importance of interior forest areas as the most valuable biodiversity hotspots and the central axis of green infrastructure. When the effects of edge disturbance are superimposed on forest fragmentation, the loss of interior forest entities further aggravates the conditions for conservation. Our results also show a loss of large contiguous forest patches and indicate patch-size homogenization. The current share of protected forest in the region is low and geographically imbalanced, as the absolute majority is located in remote, low-productivity sites in the mountain area. Our approach provides possibilities to identify forest areas for directed conservation actions in the form of new protection, restoration and nature-conservation-oriented forest management, towards implementing a functional green infrastructure. This article is protected by copyright. All rights reserved.
Italian Polar Metadata System
NASA Astrophysics Data System (ADS)
Longo, S.; Nativi, S.; Leone, C.; Migliorini, S.; Mazari Villanova, L.
2012-04-01
The Italian Antarctic Research Programme (PNRA) is a government initiative funding and coordinating scientific research activities in polar regions. PNRA manages two scientific stations in Antarctica - Concordia (Dome C), jointly operated with the French Polar Institute "Paul Emile Victor", and Mario Zucchelli (Terra Nova Bay, Southern Victoria Land). In addition, the National Research Council of Italy (CNR) manages one scientific station in the Arctic (Ny-Alesund, Svalbard Islands), named Dirigibile Italia. PNRA started in 1985 with the first Italian expedition to Antarctica. Since then each research group has collected data regarding biology and medicine, geodetic observatories, geophysics, geology, glaciology, physics and atmospheric chemistry, earth-sun relationships and astrophysics, oceanography and the marine environment, chemical contamination, law and geographic science, technology, and multi- and interdisciplinary research, autonomously and in different formats. In 2010 the Italian Ministry of Research assigned the scientific coordination of the Programme to CNR, which is in charge of managing and sharing the scientific results produced in the framework of the PNRA. CNR is therefore establishing a new distributed cyber(e)-infrastructure to collect, manage, publish and share polar research results. This is a service-based infrastructure building on Web technologies to implement resource (i.e. data, services and documents) discovery, access and visualization; in addition, semantic-enabled functionalities will be provided. The architecture applies "System of Systems" principles to build incrementally on the existing systems by supplementing, but not supplanting, their mandates and governance arrangements. This allows the existing capacities to remain as autonomous as possible. The cyber(e)-infrastructure implements multi-disciplinary interoperability following a brokering approach and supporting the relevant standards recognized by European and international initiatives, including GEO/GEOSS, INSPIRE and SCAR. The brokering approach is empowered by a technology developed by CNR, advanced by the FP7 EuroGEOSS project, and recently adopted by the GEOSS Common Infrastructure (GCI).
Towards an integrated European strong motion data distribution
NASA Astrophysics Data System (ADS)
Luzi, Lucia; Clinton, John; Cauzzi, Carlo; Puglia, Rodolfo; Michelini, Alberto; Van Eck, Torild; Sleeman, Reinhoud; Akkar, Sinan
2013-04-01
Recent decades have seen a significant increase in the quality and quantity of strong motion data collected in Europe, as dense, often real-time and continuously monitored broadband strong motion networks have been constructed in many nations. There has been a concurrent increase in demand for access to strong motion data, not only from researchers for engineering and seismological studies, but also from civil authorities and seismic networks for the rapid assessment of ground motion and shaking intensity following significant earthquakes (e.g. ShakeMaps). Aside from a few notable exceptions on the national scale, databases providing access to strong motion data have not kept pace with these developments. In the framework of the EC infrastructure project NERA (2010-2014), which integrates key research infrastructures in Europe for monitoring earthquakes and assessing their hazard and risk, the network activity NA3 deals with the networking of acceleration networks and strong motion data. Within the NA3 activity two infrastructures are being constructed: i) a Rapid Response Strong Motion (RRSM) database, which, following a strong event, automatically parameterises all available on-scale waveform data within the European Integrated waveform Data Archives (EIDA) and makes the waveforms easily available to the seismological community within minutes of an event; and ii) a European Strong Motion (ESM) database of accelerometric records, with associated metadata relevant to the earthquake engineering and seismology research communities, using standard, manual processing that reflects the state of the art and research needs in these fields. These two separate repositories form the core infrastructures being built to distribute strong motion data in Europe, in order to guarantee rapid and long-term availability of high-quality waveform data to both the international scientific community and the hazard mitigation communities. These infrastructures will provide access to strong motion data in an eventual EPOS seismological service. A working group on strong motion data is being created at ORFEUS in 2013. This body, consisting of experts in strong motion data collection, processing and research from across Europe, will provide the umbrella organisation that will 1) have the political clout to negotiate data sharing agreements with strong motion data providers and 2) manage the software during the transition from the end of NERA to the EPOS community. We expect the community providing data to the RRSM and ESM will gradually grow, under the supervision of ORFEUS, and eventually include strong motion data from networks in all European countries that can have an open data policy.
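As an illustration of the programmatic waveform access such repositories enable, the sketch below uses the open-source ObsPy package to request data from an FDSN web service. The ORFEUS data-centre shortcut is a real ObsPy option, but the station code and time window are chosen only for illustration:

```python
# Fetching waveforms over standard FDSN web services with ObsPy.
# ObsPy is an open-source seismology package; the station code and
# time window below are hypothetical, chosen only to illustrate the call.
from obspy import UTCDateTime
from obspy.clients.fdsn import Client

client = Client("ORFEUS")  # ORFEUS hosts European waveform services
t0 = UTCDateTime("2012-05-20T02:03:52")  # example origin time
st = client.get_waveforms(network="NL", station="HGN", location="*",
                          channel="HH?", starttime=t0, endtime=t0 + 300)
st.plot()  # quick-look plot of the retrieved traces
```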
FixO3: Early progress towards Open Ocean observatory Data Management Harmonisation
NASA Astrophysics Data System (ADS)
Pagnani, Maureen; Huber, Robert; Lampitt, Richard
2014-05-01
Since 2002 there has been a sustained effort, supported by European framework projects, to harmonise both the technology and the data management of open ocean fixed observatories run by European nations. FixO3 started in September 2013 and will, for 4 years, coordinate the convergence of data management best practice across a constellation of moorings in the Atlantic, in both hemispheres, and in the Mediterranean. To ensure the continued existence of these unique sources of oceanographic data as sustained observatories, it is vital to improve access to the data collected, in terms of methods of presentation, real-time availability, long-term archiving and quality assurance. The data management component of FixO3 will improve access to marine observatory data by harmonizing data management standards and workflows covering the complete life cycle of data, from real-time acquisition to long-term archiving. Legal and data policy aspects will be examined to identify transnational barriers to open access to marine observatory data. A harmonised FixO3 data policy is being synthesised from the partners' existing policies; it will overcome the identified barriers and provide a formal basis for data exchange between FixO3 infrastructures. Presently, interpretations and implementations of accepted standards have considerable incompatibilities within the observatory community, and these different approaches will be unified into the FixO3 approach. Further, FixO3 aims to harmonise its data management and standardisation efforts with other European and international marine data and observatory infrastructures. The FixO3 synthesis will build on the standards established in other European infrastructures such as EMODnet, SeaDataNet, PANGAEA, EuroSITES (the European contribution to the JCOMM OceanSITES programme) and MyOcean (the Marine Core Service for GMES), as well as relevant international infrastructures and data centres such as the ICOS Ocean Thematic Centre. The data management efforts are central to FixO3. Combined with the procedural and technological harmonisation tackled in separate work packages, the FixO3 network of observatories will efficiently and cost-effectively provide a consistent resource of quality-controlled, accessible oceanographic data. The project website www.fixo3.eu is being developed as both a data showcase and a single distribution point, and with database-driven tools it will enable the sharing of information between the observatories in the smartest and most cost-effective way. The network of knowledge built throughout the project will become a legacy resource that will ensure access to the unique ensemble data sets only achievable at these key observatories.
The Contribution of the Geodetic Community (WG4) to EPOS
NASA Astrophysics Data System (ADS)
Fernandes, R. M. S.; Bastos, L. C.; Bruyninx, C.; D'Agostino, N.; Dousa, J.; Ganas, A.; Lidberg, M.; Nocquet, J.-M.
2012-04-01
WG4 - "EPOS Geodetic Data and Infrastructure" is the working group of the EPOS project responsible for defining and preparing the integration of the existing pan-European geodetic infrastructures into a single, consistent future infrastructure supporting the European geosciences, which is the ultimate goal of the EPOS project. WG4 is formed of representatives of the participating EPOS countries and of EUREF (European Reference Frame), which also ensures the inclusion of, and contact with, countries that are not formally part of the current phase of EPOS. In reality, the fact that Europe comprises many countries (with different laws and policies) and lacks an infrastructure similar to UNAVCO (which concentrates the efforts of the local geoscience community) increases the difficulty of creating a common geodetic infrastructure serving not only the entire geoscience community but also many other areas of great socio-economic impact. The benefits of creating such an infrastructure (shared and easily accessed by all) for optimizing existing and future geodetic resources are evident. This presentation details the work being carried out within WG4 on defining strategies and implementing the solutions that will permit end-users, in particular geoscientists, to access geodetic data, derived solutions, and associated metadata through transparent and uniform processes. Discussed issues include access to high-rate data in near real time, storage and backup of historical and future data, the sustainability of the networks in order to achieve long-term stability in the observation infrastructure, seamless access to the data, open data policies, and processing tools.
Pan, Jeng-Jong; Nahm, Meredith; Wakim, Paul; Cushing, Carol; Poole, Lori; Tai, Betty; Pieper, Carl F
2009-02-01
Clinical trial networks (CTNs) were created to provide a sustaining infrastructure for the conduct of multisite clinical trials. As such, they must withstand changes in membership. Centralization of infrastructure, including knowledge management, portfolio management, information management, process automation, work policies, and procedures, in clinical research networks facilitates consistency and ultimately research. In 2005, the National Institute on Drug Abuse (NIDA) CTN transitioned from a distributed data management model to a centralized informatics infrastructure to support the network's trial activities and administration. We describe the centralized informatics infrastructure and discuss our challenges to inform others considering such an endeavor. During the migration of a clinical trial network from a decentralized to a centralized data center model, descriptive data were captured and are presented here to assess the impact of centralization. We present the framework for the informatics infrastructure and evaluative metrics. The network has decreased the time from last patient-last visit to database lock from an average of 7.6 months to 2.8 months. The average database error rate decreased from 0.8% to 0.2%, with a corresponding decrease in the interquartile range from 0.04-1.0% before centralization to 0.01-0.27% after centralization. Centralization has provided the CTN with integrated trial status reporting and the first standards-based public data share. A preliminary cost-benefit analysis showed a 50% reduction in data management cost per study participant over the life of a trial. A single clinical trial network comprising addiction researchers and community treatment programs was assessed. The findings may not be applicable to other research settings. The identified informatics components provide the information and infrastructure needed for our clinical trial network. Post-centralization data management operations are more efficient and less costly, with higher data quality.
Experience Differences and Continuance Intention of Blog Sharing
ERIC Educational Resources Information Center
Lu, Hsi-Peng; Lee, Ming-Ren
2012-01-01
Although many studies focus on information sharing in communities and organisations, little research has been carried out on the antecedents of continuance intention of blog sharing. This study focuses on amateur blogs, which are the major customers for blog service providers (BSPs). The purposes are to investigate the antecedents of continuous…
Managing the water-energy-food nexus: Opportunities in Central Asia
NASA Astrophysics Data System (ADS)
Jalilov, Shokhrukh-Mirzo; Amer, Saud A.; Ward, Frank A.
2018-02-01
This article examines the impacts of infrastructure development and climate variability on economic outcomes for the Amu Darya Basin in Central Asia. It aims to identify the most economically productive mix of expanded reservoir storage for economic benefit sharing to occur, in which the economic welfare of all riparians is improved. The policies examined include four combinations of storage infrastructure for each of two climate futures. An empirical optimization model is developed and applied to identify opportunities for improving the welfare of Tajikistan, Uzbekistan, Afghanistan, and Turkmenistan. The analysis (1) characterizes politically constrained and economically optimized water-use patterns for these combinations of expanded reservoir storage capacity, (2) describes Pareto-improving packages of expanded storage capacity that could raise economic welfare for all four riparians, and (3) accounts for impacts under each of the two climate scenarios. Results indicate that a combination of targeted water storage infrastructure and efficient water allocation could produce outcomes for which the discounted net present value of benefits is favorable for each riparian. The results identify a framework to provide economic motivation for all riparians to cooperate through the development of water storage infrastructure. Our findings illustrate the principle that development of water infrastructure can expand the negotiation space by which all communities can gain economic benefits in the face of limited water supply. Still, despite our optimistic findings, patient and deliberate negotiation will be required to transform potential improvements into actual gains.
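The optimization at the heart of such an analysis can be illustrated with a toy allocation model. The sketch below uses scipy's linear programming solver to split a fixed water supply among four hypothetical riparians so as to maximize total benefit; the benefit coefficients and minimum-allocation bounds are invented and far simpler than the paper's empirical model:

```python
# Toy water-allocation LP: maximize total economic benefit of a shared
# supply subject to minimum allocations. Benefit coefficients ($/unit)
# and bounds are invented; the paper's empirical model is far richer.
from scipy.optimize import linprog

benefits = [1.8, 1.5, 1.2, 1.0]      # marginal benefit per riparian
total_supply = 100.0                  # units of water available

# linprog minimizes, so negate the benefits to maximize them.
res = linprog(
    c=[-b for b in benefits],
    A_ub=[[1, 1, 1, 1]], b_ub=[total_supply],   # allocations cannot exceed supply
    bounds=[(10, None)] * 4,                    # each riparian receives >= 10
    method="highs",
)
print("allocations:", res.x, "total benefit:", -res.fun)
```

Pareto-improving packages correspond to comparing such optima across infrastructure scenarios and checking that every riparian's benefit rises relative to the status quo.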
Analysis of CERN computing infrastructure and monitoring data
NASA Astrophysics Data System (ADS)
Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.
2015-12-01
Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments collect a multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing together data sources from different services and at different abstraction levels, and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting a storage format efficient for both MapReduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between the CPU/wall fraction, the latency/throughput constraints of network and disk, and the effective job throughput. In this contribution we first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
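A minimal sketch of the kind of aggregation such a repository supports, assuming job-monitoring records have already been landed on Hadoop as Parquet. The path and column names are invented for illustration; the actual repository's schema is not described here:

```python
# Aggregating job-monitoring logs on Hadoop with PySpark: average
# CPU-time / wall-time fraction per service. The path and column names
# are hypothetical; the real repository's schema may differ.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cpu-wall-fraction").getOrCreate()

jobs = spark.read.parquet("hdfs:///analytics/job_monitoring")  # hypothetical path
(jobs
 .withColumn("cpu_wall_fraction", F.col("cpu_seconds") / F.col("wall_seconds"))
 .groupBy("service")
 .agg(F.avg("cpu_wall_fraction").alias("avg_cpu_wall"),
      F.count("*").alias("n_jobs"))
 .orderBy(F.desc("n_jobs"))
 .show())
```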
NASA Astrophysics Data System (ADS)
Paudyal, D. R.; McDougall, K.; Apan, A.
2014-12-01
Spatial information plays an important role in many social, environmental and economic decisions and is increasingly acknowledged as a national resource essential for wider societal and environmental benefits. Natural resource management (NRM) is one area where spatial information can be used for improved planning and decision-making processes. In Australia, state government organisations are the custodians of the spatial information necessary for natural resource management, and regional NRM bodies are responsible for the regional delivery of NRM activities. The access to and sharing of spatial information between government agencies and regional NRM bodies is therefore an important issue for improving natural resource management outcomes. The aim of this paper is to evaluate the current status of spatial information access, sharing and use under varying statutory arrangements, and its impacts on spatial data infrastructure (SDI) development in the catchment management sector in Australia. Further, it critically examines whether trends and significant variations exist due to different institutional arrangements (statutory versus non-statutory). A survey method was used to collect primary data from 56 regional natural resource management (NRM) bodies responsible for catchment management in Australia. Descriptive statistics were used to show the similarities and differences between statutory and non-statutory arrangements. The key factors which influence sharing of and access to spatial information are also explored. The results show that the current statutory and administrative arrangements and the regional focus for natural resource management are reasonable from a spatial information management perspective and provide an opportunity for building SDI at the catchment scale. However, effective institutional arrangements should align catchment SDI development activities with sub-national and national SDI development activities to address catchment management issues. We found minor differences in spatial information access, use and sharing due to the varying institutional environment (statutory versus non-statutory). The non-statutory group appears to be more flexible and self-sufficient, whilst statutory regional NRM bodies may lack flexibility in their spatial information management practices. We found that spatial information access, use and sharing have significant impacts on spatial data infrastructure development in the catchment management sector in Australia.
Open source system OpenVPN in a function of Virtual Private Network
NASA Astrophysics Data System (ADS)
Skendzic, A.; Kovacic, B.
2017-05-01
The use of Virtual Private Networks (VPNs) can establish a high level of security in network communication. VPN technology enables highly secure networking over distributed or public network infrastructure. A VPN applies its own security and management rules inside the network, and can be set up over different communication channels, such as the Internet or separate ISP communication infrastructure. A VPN creates a secure communication channel over a public network between two endpoints (computers). OpenVPN is an open source software product under the GNU General Public License (GPL) that can be used to establish VPN communication between two computers inside a business local network over public communication infrastructure. It uses dedicated security protocols and 256-bit encryption, and it is capable of traversing network address translators (NATs) and firewalls. It allows computers to authenticate each other using a pre-shared secret key, certificates, or a username and password. This work gives a review of VPN technology with a special accent on OpenVPN, and also presents a comparison and the financial benefits of using open source VPN software in a business environment.
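A minimal sketch of driving a point-to-point OpenVPN tunnel from Python, using the pre-shared static key mode described above. The peer address, tunnel IPs and key file name are hypothetical; the config directives (dev, remote, ifconfig, secret, cipher) are standard OpenVPN options:

```python
# Minimal sketch: launching a point-to-point OpenVPN tunnel from Python.
# The peer address, tunnel IPs and key file are hypothetical; the config
# directives used are standard OpenVPN. Requires root and an openvpn
# binary on PATH; the static key must be generated and shared out of band.
import subprocess
import tempfile

CLIENT_CONF = """\
dev tun
remote vpn.example.org          # hypothetical peer
ifconfig 10.8.0.2 10.8.0.1      # local / remote tunnel addresses
secret static.key               # pre-shared key, exchanged out of band
cipher AES-256-CBC              # 256-bit symmetric encryption
"""

with tempfile.NamedTemporaryFile("w", suffix=".conf", delete=False) as f:
    f.write(CLIENT_CONF)
    conf_path = f.name

subprocess.run(["openvpn", "--config", conf_path], check=True)
```

For production use, certificate-based authentication (TLS mode) is generally preferred over static keys, since it supports per-client revocation and forward secrecy.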
Towards usable and interdisciplinary e-infrastructure (Invited)
NASA Astrophysics Data System (ADS)
de Roure, D.
2010-12-01
e-Science and cyberinfrastructure at their outset tended to focus on ‘big science’ and cross-organisational infrastructures, demonstrating complex engineering with the promise of high returns. It soon became evident that the key to researchers harnessing new technology for everyday use is a user-centric approach which empowers the user - both from a developer and an end-user viewpoint. For example, this philosophy is demonstrated in workflow systems for systematic data processing and in the Web 2.0 approach as exemplified by the myExperiment social web site for sharing workflows, methods and ‘research objects’. Hence the most disruptive aspect of Cloud and virtualisation is perhaps that they make new computational resources and applications usable, creating a flourishing ecosystem for routine processing and innovation alike - and in this we must consider software sustainability. This talk will discuss the changing nature of the e-Science digital ecosystem, focus on e-infrastructure for cross-disciplinary work, and highlight issues in sustainable software development in this context.
A Real-Time Web of Things Framework with Customizable Openness Considering Legacy Devices
Zhao, Shuai; Yu, Le; Cheng, Bo
2016-01-01
With the development of the Internet of Things (IoT), resources and applications based on it have emerged on a large scale. However, most efforts are “silo” solutions where devices and applications are tightly coupled. Infrastructures are needed to connect sensors to the Internet, open up and break the current application silos and move to a horizontal application mode. Based on the concept of Web of Things (WoT), many infrastructures have been proposed to integrate the physical world with the Web. However, issues such as no real-time guarantee, lack of fine-grained control of data, and the absence of explicit solutions for integrating heterogeneous legacy devices, hinder their widespread and practical use. To address these issues, this paper proposes a WoT resource framework that provides the infrastructures for the customizable openness and sharing of users’ data and resources under the premise of ensuring the real-time behavior of their own applications. The proposed framework is validated by actual systems and experimental evaluations. PMID:27690038
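The core idea of customizable openness, exposing a device as a Web resource while the owner keeps fine-grained control over what is shared, can be sketched in a few lines. The example below uses the open-source Flask framework; the device, token table and field-level policy are invented for illustration and are not the paper's actual framework:

```python
# Sketch of a WoT-style resource with customizable openness: the owner
# decides, per access token, which fields of a sensor reading are shared.
# Flask is an open-source web framework; tokens and fields are hypothetical.
from flask import Flask, jsonify, request, abort

app = Flask(__name__)

READING = {"temperature": 21.4, "humidity": 0.52, "location": "lab-3"}

# Owner-defined sharing policy: which fields each access token may see.
POLICY = {
    "public-token":  {"temperature"},
    "partner-token": {"temperature", "humidity"},
    "owner-token":   set(READING),
}

@app.route("/things/sensor1")
def sensor1():
    allowed = POLICY.get(request.headers.get("Authorization", ""))
    if allowed is None:
        abort(403)  # unknown token: no access at all
    return jsonify({k: v for k, v in READING.items() if k in allowed})

if __name__ == "__main__":
    app.run(port=8080)
```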
EUFAR: the unique portal for airborne research in Europe
NASA Astrophysics Data System (ADS)
Gérard, Elisabeth; Brown, Philip
2016-04-01
Created in 2000 and supported by the EU Framework Programmes since then, EUFAR was born out of the necessity to create a central network and access point for the airborne research community in Europe. With the aim of supporting researchers by granting them access to research infrastructures not accessible in their home countries, EUFAR also provides technical support and training in the field of airborne research for the environmental and geo-sciences. Today, EUFAR2 (2014-2018) coordinates and facilitates transnational access to 18 instrumented aircraft and 3 remote-sensing instruments through the 13 operators who are part of EUFAR's current 24-partner European consortium. In addition, the current project supports networking and research activities focused on providing an enabling environment for, and promoting, airborne research. The EUFAR2 activities cover three objectives, supported by the internet website www.eufar.net: (i - Institutional) improvement of access to the research infrastructures and development of the future fleet according to the strategic advisory committee (SAC) recommendations; (ii - Innovation) improvement of scientific knowledge and promotion of innovative instruments, processes and services for the emergence of new industrial technologies, with an identification of industrial needs by the SAC; (iii - Service) optimisation and harmonisation of the use of the research infrastructures through the development of the community of young researchers in airborne science, of standards and protocols, and of the airborne central database. With the launch of a brand new website (www.eufar.net) in mid-November 2015, EUFAR aims to improve the user experience on the website, which serves as a source of information and a hub where users are able to collaborate, learn, share expertise and best practices, and apply for transnational access and for funded education and training opportunities within the network. With its newly designed, eye-catching interface, the website offers easy navigation and user-friendly functionalities. New features also include a section on news and airborne research stories to keep users up to date on EUFAR's activities, a career section, photo galleries, and much more. By elaborating new solutions for the web portal, EUFAR continues to serve as an interactive and dynamic platform bringing together experts, early-stage researchers, operators, data users, industry and other stakeholders in the airborne research community. A main focus of the current project is the establishment of a sustainable legal structure for EUFAR. This is critical to ensuring the continuity of EUFAR and securing, at the least, partial financial independence from the European Commission, which has been funding the project since its start. After carefully examining different legal forms relevant for EUFAR, the arguments are strongly in favour of establishing an international non-profit association under Belgian law (AISBL). Together with the implementation of an Open Access scheme by means of resource-sharing to support the mobility of personnel across countries, envisaged in 2016, such a sustainable structure would contribute substantially toward broadening the user base of existing airborne research facilities in Europe and mobilising additional resources to this end. In essence, this would cement EUFAR's position as the key portal for airborne research in Europe.
Weiler, Gabriele; Schröder, Christina; Schera, Fatima; Dobkowicz, Matthias; Kiefer, Stephan; Heidtke, Karsten R; Hänold, Stefanie; Nwankwo, Iheanyi; Forgó, Nikolaus; Stanulla, Martin; Eckert, Cornelia; Graf, Norbert
2014-01-01
Biobanks represent key resources for clinico-genomic research and are needed to pave the way to personalised medicine. To achieve this goal, it is crucial that scientists can securely access and share high-quality biomaterial and related data. Therefore, there is a growing interest in integrating biobanks into larger biomedical information and communication technology (ICT) infrastructures. The European project p-medicine is currently building an innovative ICT infrastructure to meet this need. This platform provides tools and services for conducting research and clinical trials in personalised medicine. In this paper, we describe one of its main components, the biobank access framework p-BioSPRE (p-medicine Biospecimen Search and Project Request Engine). This generic framework enables and simplifies access to existing biobanks, but also allows researchers to offer their own biomaterial collections to research communities and to manage biobank specimens and related clinical data through the ObTiMA Trial Biomaterial Manager. p-BioSPRE takes into consideration all relevant ethical and legal standards, e.g., safeguarding donors' personal rights and enabling biobanks to keep control over the donated material and related data. The framework thus enables secure sharing of biomaterial within open and closed research communities, while flexibly integrating related clinical and omics data. Although the development of the framework is mainly driven by user scenarios from the cancer domain, in this case acute lymphoblastic leukaemia and Wilms tumour, it can be extended to further disease entities. PMID: 24567758
A Shared Infrastructure for Federated Search Across Distributed Scientific Metadata Catalogs
NASA Astrophysics Data System (ADS)
Reed, S. A.; Truslove, I.; Billingsley, B. W.; Grauch, A.; Harper, D.; Kovarik, J.; Lopez, L.; Liu, M.; Brandt, M.
2013-12-01
The vast amount of science metadata can be overwhelming and highly complex. Comprehensive analysis and sharing of metadata is difficult since institutions often publish to their own repositories. There are many disjoint standards used for publishing scientific data, making it difficult to discover and share information from different sources. Services that publish metadata catalogs often have different protocols, formats, and semantics. The research community is limited by the exclusivity of separate metadata catalogs, and thus it is desirable to have federated search interfaces capable of unified search queries across multiple sources. Aggregation of metadata catalogs also enables users to critique metadata more rigorously. With these motivations in mind, the National Snow and Ice Data Center (NSIDC) and the Advanced Cooperative Arctic Data and Information Service (ACADIS) implemented two search interfaces for the community. Both the NSIDC Search and the ACADIS Arctic Data Explorer (ADE) use a common infrastructure, which keeps maintenance costs low. The search clients are designed to make OpenSearch requests against Solr, an open source search platform. Solr applies indexes to specific fields of the metadata, which in this instance optimizes queries containing keywords, spatial bounds and temporal ranges. NSIDC metadata is reused by both search interfaces, but the ADE also brokers additional sources. Users can quickly find relevant metadata with minimal effort, which ultimately lowers costs for research. This presentation will highlight the reuse of data and code between NSIDC and ACADIS, discuss challenges and milestones for each project, and identify the creation and use of Open Source libraries.
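A minimal sketch of the kind of query such an infrastructure serves, assuming a Solr core with keyword, temporal and spatial fields. The endpoint URL and field names are invented; the production NSIDC/ADE schemas are not described here:

```python
# Querying a Solr search core with keyword, temporal and spatial filters,
# in the spirit of an OpenSearch front end. The endpoint URL and field
# names are hypothetical; the real NSIDC/ADE schemas may differ.
import requests

params = {
    "q": "sea ice extent",                                  # keyword query
    "fq": [
        "temporal_start:[2010-01-01T00:00:00Z TO *]",       # temporal range filter
        "spatial:[60,-180 TO 90,180]",                      # bounding box (lat,lon)
    ],
    "rows": 10,
    "wt": "json",
}
resp = requests.get("http://example.org/solr/metadata/select", params=params)
for doc in resp.json()["response"]["docs"]:
    print(doc.get("title"))
```

A brokering layer like the ADE's would translate each upstream catalog's records into this common indexed schema before they ever reach the search client.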
National Infrastructure Protection Plan
2006-01-01
effective and efficient CI/KR protection; and • Provide a system for continuous measurement and improvement of CI/KR... information-based core processes, a top-down system-, network-, or function-based approach may be more appropriate. A bottom-up approach normally... e-commerce, e-mail, and R&D systems. • Control Systems: Cyber systems used within many infrastructure and industries to monitor and
Alloni, Anna; Lanzola, Giordano; Triulzi, Fabio; Bellazzi, Riccardo; Reni, Gianluigi
2015-08-01
The Colibri project is introduced, whose aim is to set up a shared database of magnetic resonance images of pediatric patients affected by rare neurological disorders. The project involves 19 Italian centers of excellence in pediatric neuro-radiology and is supported by the nationwide coordinating center for the Information and Communication Technology research infrastructure. After a first year devoted to design and implementation, the system went into service in November 2014 at the centers involved in the project. This paper illustrates the initial assessment of user perception and provides some preliminary statistics about its use.
An authentication infrastructure for today and tomorrow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engert, D.E.
1996-06-01
The Open Software Foundation's Distributed Computing Environment (OSF/DCE) was originally designed to provide a secure environment for distributed applications. By combining it with Kerberos Version 5 from MIT, it can be extended to provide network security as well. This combination can be used to build both an inter- and intra-organizational infrastructure while providing single sign-on for the user with overall improved security. The ESnet community of the Department of Energy is building just such an infrastructure. ESnet has modified these systems to improve their interoperability, while encouraging the developers to incorporate these changes and work more closely together to continue to improve the interoperability. The success of this infrastructure depends on its flexibility to meet the needs of many applications and network security requirements. The open nature of Kerberos, combined with the vendor support of OSF/DCE, provides the infrastructure for today and tomorrow.
SEE-GRID eInfrastructure for Regional eScience
NASA Astrophysics Data System (ADS)
Prnjat, Ognjen; Balaz, Antun; Vudragovic, Dusan; Liabotis, Ioannis; Sener, Cevat; Marovic, Branko; Kozlovszky, Miklos; Neagu, Gabriel
In the past 6 years, a number of targeted initiatives, funded by the European Commission via its information society and RTD programmes and by Greek infrastructure development actions, have articulated successful regional development actions in South East Europe that can be used as a role model for other international developments. The SEEREN (South-East European Research and Education Networking) initiative, through its two phases, established the SEE segment of the pan-European GÉANT network and successfully connected the research and scientific communities in the region. Currently, the SEE-LIGHT project is working towards establishing a dark-fibre backbone that will interconnect most national research and education networks in the region. On the distributed computing and storage provisioning (i.e. Grid) plane, the SEE-GRID (South-East European GRID e-Infrastructure Development) project, similarly through its two phases, has established a strong human network in the area of scientific computing, has set up a powerful regional Grid infrastructure, and has attracted a number of applications from different fields from countries throughout South-East Europe. The current SEE-GRID-SCI project, ending in April 2010, empowers regional user communities from the fields of meteorology, seismology and environmental protection in the common use and sharing of the regional e-Infrastructure. Technical initiatives currently in formulation are focusing on a set of coordinated actions in the area of HPC and on application fields making use of HPC initiatives. Finally, the current SEERA-EI project brings together policy makers - programme managers from 10 countries in the region. The project aims to establish a communication platform between programme managers, pave the way towards a common e-Infrastructure strategy and vision, and implement concrete actions for the common funding of electronic infrastructures at the regional level. The regional vision of establishing an e-Infrastructure compatible with European developments, and of empowering the scientists in the region to participate equally in the use of pan-European infrastructures, is materializing through the above initiatives. This model has a number of concrete operational and organizational guidelines which can be adapted to help e-Infrastructure developments in other world regions. In this paper we review the most important developments and contributions of the SEE-GRID-SCI project.
NASA Astrophysics Data System (ADS)
Marshall, Eric
2009-03-01
Science centers, professional associations, corporations and university research centers share the same mission of education and outreach, yet come from "different worlds." This gap may be bridged by working together to leverage unique strengths in partnership. Front-end evaluation results for the development of new resources to support these (mostly volunteer-based) partnerships elucidate the factors which lead to a successful relationship. Maintaining a science museum-scientific community partnership requires that all partners devote adequate resources (time, money, etc.). In general, scientists/engineers and science museum professionals often approach relationships with different assumptions and expectations. The culture of science centers is distinctly different from the culture of science. Scientists/engineers prefer to select how they will ultimately share their expertise from an array of choices. Successful partnerships stem from clearly defined roles and responsibilities. Scientists/engineers are somewhat resistant to the idea of traditional, formal training. Instead of developing new expertise, many prefer to offer their existing strengths and expertise. Maintaining a healthy relationship requires the routine recognition of the contributions of scientists/engineers. As professional societies, university research centers and corporations increasingly engage in education and outreach, a need for a supportive infrastructure becomes evident. The work of TryScience.org/VolTS (Volunteers TryScience), the MRS NISE Net (Nanoscale Informal Science Education Network) subcommittee, NRCEN (NSF Research Center Education Network), the IBM On Demand Community, and IEEE Educational Activities exemplifies some of the pieces of this evolving infrastructure.
Social Media Principles Applied to Critical Infrastructure Information Sharing
2013-12-01
shooters. The DHS works throughout the year to build partnerships with industries across a wide spectrum, to include commercial facilities. They... security professionals, industry associations and security organizations, emergency managers, and planners and architects. Each of these stakeholder sets... Project Report. The DARPA SCP fellows identified 14 factors that affected the performance of any one team. Notable among the collection were
ERIC Educational Resources Information Center
Gladney, Henry M.; Andreoni, Antonella; Baldacci, Maria Bruna; Biagioni, Stefania; Carlesi, Carlo; Castelli, Donatella; Pagano, Pasquale; Peters, Carol; Pisani, Serena; Dempsey, Lorcan; Gardner, Tracy; Day, Michael; van der Werf, Titia; Bacsich, Paul; Heath, Andy; Lefrere, Paul; Miller, Paul; Riley, Kevin
1999-01-01
Includes four articles that discuss the impact of the emerging digital information infrastructure on intellectual property; the implementation of a digital library for a European consortium of national research institutions; an international information gateway collaboration; and developing standards for the description and sharing of educational…
2011 Defense Industrial Base Critical Infrastructure Protection Conference (DIBCIP)
2011-08-25
Office of the Program Manager, Information Sharing Environment • Mr. Vince Jarvie, Vice President, Corporate Security, L-3 Communications... National Defense University IRM College, and in 2008 he obtained the Certified Information System Security Professional certificate. MR. VINCE JARVIE, Vice President, Corporate Security, L-3 Communications Corporation. Mr. Vincent (Vince) Jarvie is the Vice President, Corporate Security for L-3
ERIC Educational Resources Information Center
Newton, Warren P.; Lefebvre, Ann; Donahue, Katrina E.; Bacon, Thomas; Dobson, Allen
2010-01-01
Introduction: Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods: Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3)…
Innovation leadership: new perspectives for new work.
Malloch, Kathy
2010-03-01
The industrial-age command and control leadership style and its supporting infrastructure are ineffective in meeting the challenges of the increased availability and sharing of information, the media used for knowledge transfer, the changing range and types of relationships between individuals, and the time required to transfer and share information. What has not changed is the need for effective personal relationships in the evaluation and selection of new technologies, and for human-to-human sensitivity, acknowledgment, and respect for the patient care experience. As individuals embrace these new technologies, the essence of the innovation leader emerges to purposefully guide, assess, integrate, and synthesize technology into the human work of patient care. Building organizational infrastructures open to technology and innovations that enhance effective patient care relationships now requires an innovation skill set that understands and integrates human needs with the best of technology. In this article a brief description of innovation leadership is presented as the backdrop for change, along with four significant changes in work processes that have irreversibly altered health care work; the trimodal organizational structure that accommodates operations, innovation, and the transition between the two; and, finally, individual and team behaviors that emphasize the work of innovation. Copyright 2010 Elsevier Inc. All rights reserved.
Thibault, J. C.; Roe, D. R.; Eilbeck, K.; Cheatham, T. E.; Facelli, J. C.
2015-01-01
Biomolecular simulations aim to simulate the structure, dynamics, interactions, and energetics of complex biomolecular systems. With the recent advances in hardware, it is now possible to use more complex and accurate models, and also to reach time scales that are biologically significant. Molecular simulations have become a standard tool for toxicology and pharmacology research, but organizing and sharing data – both within the same organization and among different ones – remains a substantial challenge. In this paper we review our recent work leading to the development of a comprehensive informatics infrastructure to facilitate the organization and exchange of biomolecular simulation data. Our efforts include the design of data models and dictionary tools that allow the standardization of the metadata used to describe the biomedical simulations, the development of a thesaurus and an ontology for computational reasoning when searching for biomolecular simulations in distributed environments, and the development of systems based on these models to manage and share the data at large scale (iBIOMES) and within smaller groups of researchers at laboratory scale (iBIOMES Lite), which take advantage of the standardization of the metadata used to describe biomolecular simulations. PMID: 26387907
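A minimal illustration of the metadata-standardization idea, using the open-source jsonschema package to validate a simulation record against a shared dictionary. The schema fields below are invented for illustration and are not the actual iBIOMES data model:

```python
# Validating a biomolecular-simulation metadata record against a shared
# schema, in the spirit of standardized data dictionaries. The fields
# below are hypothetical, not the actual iBIOMES data model.
import jsonschema

SIMULATION_SCHEMA = {
    "type": "object",
    "required": ["system_name", "force_field", "timestep_fs", "length_ns"],
    "properties": {
        "system_name": {"type": "string"},
        "force_field": {"type": "string"},
        "timestep_fs": {"type": "number", "minimum": 0},
        "length_ns":   {"type": "number", "minimum": 0},
    },
}

record = {
    "system_name": "ubiquitin in TIP3P water",
    "force_field": "AMBER ff14SB",
    "timestep_fs": 2.0,
    "length_ns": 100.0,
}

jsonschema.validate(instance=record, schema=SIMULATION_SCHEMA)  # raises on error
print("record conforms to the shared dictionary")
```

Records that pass such validation can be indexed and searched uniformly across repositories, which is what enables sharing both at large scale and within laboratory-scale deployments.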