Sample records for case projected source

  1. Teaching Discrete Mathematics Entirely from Primary Historical Sources

    ERIC Educational Resources Information Center

    Barnett, Janet Heine; Bezhanishvili, Guram; Lodder, Jerry; Pengelley, David

    2016-01-01

    We describe teaching an introductory discrete mathematics course entirely from student projects based on primary historical sources. We present case studies of four projects that cover the content of a one-semester course, and mention various other courses that we have taught with primary source projects.

  2. Open Source Drug Discovery in Practice: A Case Study

    PubMed Central

    Årdal, Christine; Røttingen, John-Arne

    2012-01-01

    Background Open source drug discovery offers potential for developing new and inexpensive drugs to combat diseases that disproportionally affect the poor. The concept borrows two principal aspects from open source computing (i.e., collaboration and open access) and applies them to pharmaceutical innovation. By opening a project to external contributors, its research capacity may increase significantly. To date there are only a handful of open source R&D projects focusing on neglected diseases. We wanted to learn from these first movers, their successes and failures, in order to generate a better understanding of how a much-discussed theoretical concept works in practice and may be implemented. Methodology/Principal Findings A descriptive case study was performed, evaluating two specific R&D projects focused on neglected diseases: CSIR Team India Consortium's Open Source Drug Discovery project (CSIR OSDD) and The Synaptic Leap's Schistosomiasis project (TSLS). Data were gathered from four sources: interviews of participating members (n = 14), a survey of potential members (n = 61), an analysis of the websites and a literature review. Both cases have made significant achievements; however, they have done so in very different ways. CSIR OSDD encourages international collaboration, but its process facilitates contributions from mostly Indian researchers and students. Its processes are formal, with each task being reviewed by a mentor (almost always offline) before a result is made public. TSLS, on the other hand, has attracted contributors internationally, albeit significantly fewer than CSIR OSDD. Both have obtained funding used to pay for access to facilities, physical resources and, at times, labor costs. TSLS releases its results into the public domain, whereas CSIR OSDD asserts ownership over its results. Conclusions/Significance Technically, TSLS is an open source project, whereas CSIR OSDD is a crowdsourced project.
However, both have enabled high quality research at low cost. The critical success factors appear to be clearly defined entry points, transparency and funding to cover core material costs. PMID:23029588

  3. RadioSource.NET: Case-Study of a Collaborative Land-Grant Internet Audio Project.

    ERIC Educational Resources Information Center

    Sohar, Kathleen; Wood, Ashley M.; Ramirez, Roberto

    2002-01-01

    Provides a case study of RadioSource.NET, an Internet broadcasting venture developed collaboratively by land-grant university communication departments to share resources, increase online distribution, and promote access to agricultural and natural and life science research. Describes planning, marketing, and implementation processes. (Contains 18…

  4. SolarPILOT Feature Requests and Collaboration | Concentrating Solar Power |

    Science.gov Websites

    As of March 2018, SolarPILOT is also available as an open source project. While not every project benefits from an open source approach, several factors influenced this decision; lack of availability has, in some cases, prevented widespread adoption of a common platform.

  5. Assessing the Financial Benefits of Faster Development Times: The Case of Single-source Versus Multi-vendor Outsourced Biopharmaceutical Manufacturing.

    PubMed

    DiMasi, Joseph A; Smith, Zachary; Getz, Kenneth A

    2018-05-10

    The extent to which new drug developers can benefit financially from shorter development times has implications for development efficiency and innovation incentives. We provided a real-world example of such gains by using recent estimates of drug development costs and returns. Time and fee data were obtained on 5 single-source manufacturing projects. Time and fees were modeled for these projects as if the drug substance and drug product processes had been contracted separately from 2 vendors. The multi-vendor model was taken as the base case, and financial impacts from single-source contracting were determined relative to the base case. The mean and median after-tax financial benefits of shorter development times from single-source contracting were $44.7 million and $34.9 million, respectively (2016 dollars). The after-tax increases in sponsor fees from single-source contracting were small in comparison (mean and median of $0.65 million and $0.25 million). For the data we examined, single-source contracting yielded substantial financial benefits over multi-source contracting, even after accounting for somewhat higher sponsor fees. Copyright © 2018 Elsevier HS Journals, Inc. All rights reserved.

  6. Project-based physics labs using low-cost open-source hardware

    NASA Astrophysics Data System (ADS)

    Bouquet, F.; Bobroff, J.; Fuchs-Gallezot, M.; Maurines, L.

    2017-03-01

    We describe project-based physics labs that we proposed to third-year university students. These labs are based on new open-source, low-cost equipment (Arduino microcontrollers and compatible sensors). Students are given complete autonomy: they develop their own experimental setup and study the physics topic of their choice. The goal of these projects is to let students discover the reality of experimental physics. Technical specifications of the acquisition material and case studies are presented for practical implementation in other universities.
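
For readers implementing a similar Arduino-based acquisition chain, here is a minimal sketch (ours, not from the paper) of the first data-handling step such labs need: converting raw 10-bit ADC counts, as typically logged over the serial port, into physical voltages. The log format and constants are illustrative assumptions.

```python
# Classic Arduino boards have a 10-bit ADC (0..1023) referenced to 5 V
# by default; both values below are assumptions for this sketch.
ADC_MAX = 1023
V_REF = 5.0

def parse_log(lines):
    """Turn hypothetical 'time_ms,counts' log lines into (seconds, volts) tuples."""
    samples = []
    for line in lines:
        t_ms, counts = line.strip().split(",")
        samples.append((int(t_ms) / 1000.0, int(counts) * V_REF / ADC_MAX))
    return samples

log = ["0,0", "100,512", "200,1023"]   # synthetic example data
print(parse_log(log))
```

Students would typically read such lines live with a serial library instead of from a list, but the count-to-voltage conversion is the same.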

  7. Heavy oil reservoirs recoverable by thermal technology

    NASA Astrophysics Data System (ADS)

    Kujawa, P.

    1981-02-01

    Reservoir, production, and project data for target reservoirs which contain heavy oil in the 8 to 25 degree API gravity range and are susceptible to recovery by in situ combustion and steam drive are presented. The reservoirs for steam recovery are less than 2500 feet deep to comply with state-of-the-art technology. In cases where one reservoir would be a target for in situ combustion or steam drive, that reservoir is reported in both sections. Data were collected from three source types: hands-on, once removed, and twice removed. In all cases, data were sought depicting and characterizing individual reservoirs as opposed to data covering an entire field with more than one producing interval or reservoir. The data sources are listed at the end of each case. A complete listing of operators and projects is included as well as a bibliography of source material.

  8. A Survey of Research Projects in Schools and Colleges of Optometry.

    ERIC Educational Resources Information Center

    Whitener, John C.

    1981-01-01

    A survey undertaken by the American Optometric Association reveals research projects, investigators, and in some cases, funding sources for research in the areas of low vision, ophthalmic lenses, pharmacology, anatomy and pathology, and sensory and motor functions. A total of 205 projects are charted. (MSE)

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mosher, J.C.; Leahy, R.M.

    A new method for source localization is described that is based on a modification of the well known multiple signal classification (MUSIC) algorithm. In classical MUSIC, the array manifold vector is projected onto an estimate of the signal subspace, but errors in the estimate can make location of multiple sources difficult. Recursively applied and projected (RAP) MUSIC uses each successively located source to form an intermediate array gain matrix, and projects both the array manifold and the signal subspace estimate into its orthogonal complement. The MUSIC projection is then performed in this reduced subspace. Using the metric of principal angles, the authors describe a general form of the RAP-MUSIC algorithm for the case of diversely polarized sources. Through a uniform linear array simulation, the authors demonstrate the improved Monte Carlo performance of RAP-MUSIC relative to MUSIC and two other sequential subspace methods, S- and IES-MUSIC.
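
The recursion described in the abstract can be illustrated numerically. Below is a minimal, hedged Python/NumPy sketch for a uniform linear array with scalar (unpolarized) sources; the array geometry, source angles, noise level, and all names are our own illustrative choices, not the authors'.

```python
import numpy as np

rng = np.random.default_rng(0)
M = 8                                   # elements, half-wavelength spacing
true_deg = np.array([-20.0, 25.0])      # two source directions
grid = np.linspace(-90.0, 90.0, 3601)   # search grid, 0.05 degree steps

def steer(deg):
    """ULA steering vectors (one column per angle), half-wavelength spacing."""
    rad = np.deg2rad(np.atleast_1d(deg))
    return np.exp(1j * np.pi * np.outer(np.arange(M), np.sin(rad)))

# Simulated snapshots: two uncorrelated sources plus sensor noise.
N = 500
S = rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))
X = steer(true_deg) @ S
X += 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))

# Signal-subspace estimate: eigenvectors of the two largest eigenvalues.
_, V = np.linalg.eigh(X @ X.conj().T / N)
Us = V[:, -2:]

def subspace_corr(A, Us, tol=0.1):
    """Cosine of the principal angle between each column of A and span(Us).
    The SVD truncation drops directions whose singular value collapsed
    because an already-found source was projected out."""
    U, s, _ = np.linalg.svd(Us, full_matrices=False)
    Q = U[:, s > tol * s[0]]
    num = np.linalg.norm(Q.conj().T @ A, axis=0)
    den = np.maximum(np.linalg.norm(A, axis=0), 1e-12)
    return num / den

# RAP-MUSIC loop: locate a source, then project both the array manifold and
# the signal-subspace estimate into the orthogonal complement of the sources
# found so far, and repeat the MUSIC scan in that reduced subspace.
A_grid = steer(grid)
found = []
P = np.eye(M)
for _ in range(2):
    c = subspace_corr(P @ A_grid, P @ Us)
    found.append(grid[np.argmax(c)])
    G = steer(np.array(found))
    P = np.eye(M) - G @ np.linalg.pinv(G)

print(sorted(found))                    # close to [-20.0, 25.0]
```

The projection step is what distinguishes RAP-MUSIC from plain sequential MUSIC: each located source is removed from both the manifold and the subspace estimate, so later scans cannot re-find it.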

  10. Resilience in Utility Technologies

    NASA Astrophysics Data System (ADS)

    Seaton, Roger

    The following sections are included: * Scope of paper * Preamble * Background to the case-study projects * Source projects * Resilience * Case study 1: Electricity generation * Context * Model * Case study 2: Water recycling * Context * Model * Case study 3: Ecotechnology and water treatment * Context * The problem of classification: Finding a classificatory solution * Application of the new taxonomy to water treatment * Concluding comments and questions * Conclusions * Questions and issues * Purposive or Purposeful? * Resilience: Flexibility and adaptivity? * Resilience: With respect to what? * Risk, uncertainty, surprise, emergence - What sort of shock, and who says so? * Co-evolutionary friction * References

  11. Curriculum Adaptation in Special Schools for Students with Intellectual Disabilities (SID): A Case Study of Project Learning in One SID School in Hong Kong

    ERIC Educational Resources Information Center

    Zhang, Jia-Wei; Wong, Lam; Chan, Tak-Hang; Chiu, Chi-Shing

    2014-01-01

    Using a qualitative case study approach, the authors analyzed the curriculum adaptation process for one project learning activity in School K, which is a SID school in the context of school-university collaboration. Multiple sources of data were collected for triangulation, including interviews, documents and observations. Curriculum adaptation…

  12. Harvard University: Green Loan Fund. Green Revolving Funds in Action: Case Study Series

    ERIC Educational Resources Information Center

    Foley, Robert

    2011-01-01

    The Green Loan Fund at Harvard University has been an active source of capital for energy efficiency and waste reduction projects for almost a decade. This case study examines the revolving fund's history from its inception as a pilot project in the 1990s to its regeneration in the early 2000s to its current operations today. The green revolving…

  13. Case studies for GSHP demonstration projects in the US

    DOE PAGES

    Liu, Xiaobing; Malhotra, Mini; Im, Piljae

    2015-07-01

    Under the American Recovery and Reinvestment Act, twenty-six ground source heat pump (GSHP) projects were competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This article gives an overview of the case studies for six of the systems. These case studies evaluated efficiencies, energy savings, and costs of the demonstrated systems. In addition, it was found that more energy savings could be achieved if the controls of GSHP systems were improved.

  14. An Embedded Systems Course for Engineering Students Using Open-Source Platforms in Wireless Scenarios

    ERIC Educational Resources Information Center

    Rodriguez-Sanchez, M. C.; Torrado-Carvajal, Angel; Vaquero, Joaquin; Borromeo, Susana; Hernandez-Tamames, Juan A.

    2016-01-01

    This paper presents a case study analyzing the advantages and disadvantages of using project-based learning (PBL) combined with collaborative learning (CL) and industry best practices, integrated with information communication technologies, open-source software, and open-source hardware tools, in a specialized microcontroller and embedded systems…

  15. Energy Sources and Systems Analysis: 40 South Lincoln Redevelopment District (Short Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2011-08-01

    This report presents a brief overview of the results of a case study to analyze district energy systems for their potential use in a project that involves redeveloping 270 units of existing public housing, along with other nearby sites. When complete, the redevelopment project will encompass more than 900 mixed-income residential units, commercial and retail properties, and open space. The analysis estimated the hourly heating, cooling, domestic hot water, and electric loads required by the community; investigated potential district system technologies to meet those needs; and researched available fuel sources to power such systems. A full report of this case study is also available.

  16. Government Accountability Office Bid Protests in Air Force Source Selections: Evidence and Options

    DTIC Science & Technology

    2012-01-01

    chapter, we focus on the sustained protests and lessons that can be learned from them. This chapter does not offer complete case histories of these...resulting research project, "Air Force Source Selections: Lessons Learned and Best Practices," which was conducted within the Resource Management...Program of PAF in fiscal year (FY) 2009. This project studied the Air Force's recent experience with bid protests before GAO and documented lessons that

  17. A Survey of Usability Practices in Free/Libre/Open Source Software

    NASA Astrophysics Data System (ADS)

    Paul, Celeste Lyn

    A review of case studies about usability in eight Free/Libre/Open Source Software (FLOSS) projects showed that an important issue regarding a usability initiative in the project was the lack of user research. User research is a key component in the user-centered design (UCD) process and a necessary step for creating usable products. Reasons why FLOSS projects suffered from a lack of user research included poor or unclear project leadership, cultural differences between developers and designers, and a lack of usability engineers. By identifying these critical issues, the FLOSS usability community can begin addressing problems in the efficacy of usability activities and work towards creating more usable FLOSS products.

  18. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    ERIC Educational Resources Information Center

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  19. Local government funding and financing of roads : Virginia case studies and examples from other states.

    DOT National Transportation Integrated Search

    2014-10-01

    Several Virginia localities have used local funding and financing sources to build new roads or complete major street : improvement projects when state and/or federal funding was not available. Many others have combined local funding sources : with s...

  20. Cogeneration technology alternatives study. Volume 4: Heat Sources, balance of plant and auxiliary systems

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Data and information established for heat sources, balance of plant items, thermal energy storage, and heat pumps are presented. Design case descriptions are given along with projected performance values. Capital cost estimates for representative cogeneration plants are also presented.

  1. Refinement of Methods for Evaluation of Near-Hypersingular Integrals in BEM Formulations

    NASA Technical Reports Server (NTRS)

    Fink, Patricia W.; Khayat, Michael A.; Wilton, Donald R.

    2006-01-01

    In this paper, we present advances in singularity cancellation techniques applied to integrals in BEM formulations that are nearly hypersingular. Significant advances have been made recently in singularity cancellation techniques applied to 1/R type kernels [M. Khayat, D. Wilton, IEEE Trans. Antennas and Prop., 53, pp. 3180-3190, 2005], as well as to the gradients of these kernels [P. Fink, D. Wilton, and M. Khayat, Proc. ICEAA, pp. 861-864, Torino, Italy, 2005] on curved subdomains. In these approaches, the source triangle is divided into three tangent subtriangles with a common vertex at the normal projection of the observation point onto the source element or the extended surface containing it. The geometry of a typical tangent subtriangle and its local rectangular coordinate system with origin at the projected observation point is shown in Fig. 1. Whereas singularity cancellation techniques for 1/R type kernels are now nearing maturity, the efficient handling of near-hypersingular kernels still needs attention. For example, in the gradient reference above, techniques are presented for computing the normal component of the gradient relative to the plane containing the tangent subtriangle. These techniques, summarized in the transformations in Table 1, are applied at the sub-triangle level and correspond particularly to the case in which the normal projection of the observation point lies within the boundary of the source element. They are found to be highly efficient as z approaches zero. Here, we extend the approach to cover two instances not previously addressed. First, we consider the case in which the normal projection of the observation point lies external to the source element. For such cases, we find that simple modifications to the transformations of Table 1 permit significant savings in computational cost.
Second, we present techniques that permit accurate computation of the tangential components of the gradient; i.e., tangent to the plane containing the source element.
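
As a concrete illustration of the singularity cancellation idea for 1/R kernels, consider a generic arcsinh-type change of variable of the kind used in this literature (this is not the specific set of transformations in the paper's Table 1). In the local rectangular coordinates of a tangent subtriangle, with the observation point at height z above the projected origin:

```latex
% With R = \sqrt{x^2 + y^2 + z^2}, substitute an arcsinh-type variable:
u = \operatorname{arcsinh}\!\left(\frac{y}{\sqrt{x^2 + z^2}}\right),
\qquad
y = \sqrt{x^2 + z^2}\,\sinh u,
\qquad
R = \sqrt{x^2 + z^2}\,\cosh u .
% Then dy = \sqrt{x^2 + z^2}\,\cosh u \, du = R \, du, so the 1/R kernel
% is removed analytically from the radial integration:
\int \frac{f(x,y)}{R}\,\mathrm{d}y \;=\; \int f\bigl(x,\,y(u)\bigr)\,\mathrm{d}u .
```

The singular behavior is thus absorbed into the Jacobian rather than resolved by brute-force quadrature, which is why such schemes remain efficient as z approaches zero.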

  2. 6. Photographic copy of photograph. No date. Photographer unknown. (Source: ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Photographic copy of photograph. No date. Photographer unknown. (Source: SCIP office, Coolidge, AZ) CHINA WASH FLUME UNDER CONSTRUCTION - San Carlos Irrigation Project, China Wash Flume, Main (Florence-Casa Grande) Canal at Station 137+00, T4S, R10E, S14, Coolidge, Pinal County, AZ

  3. Case study: molasses as the primary energy source on an organic grazing dairy

    USDA-ARS?s Scientific Manuscript database

    Organic dairies face many challenges, one of which is the high cost of purchased organic grains. Molasses may be a less expensive energy alternative. However, anecdotal results have been mixed for farms that used molasses as the sole energy source. This research project quantified animal performance...

  4. Mathematically trivial control of sound using a parametric beam focusing source.

    PubMed

    Tanaka, Nobuo; Tanaka, Motoki

    2011-01-01

    By exploiting a case regarded as trivial, this paper presents global active noise control using a parametric beam focusing source (PBFS). In a dipole model, where one source serves as the primary sound source and the other as a control sound source, the control effect in minimizing the total acoustic power depends on the distance between the two. When the distance becomes zero, the total acoustic power becomes null, hence nothing less than a trivial case. Because of practical constraints, it is difficult to place a control source close enough to a primary source. However, by projecting the sound beam of a parametric array loudspeaker onto the target sound source (primary source), a virtual sound source may be created on the target sound source, thereby enabling the collocation of the sources. To further ensure the feasibility of the trivial case, a PBFS is then introduced in an effort to match the size of the two sources. The reflected sound wave of the PBFS, which is tantamount to the virtual sound source output, aims to suppress the primary sound. Finally, a numerical analysis as well as an experiment is conducted, verifying the validity of the proposed methodology.
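
The "trivial case" being exploited is the classical free-field result for a primary monopole of radiated power W_p and an optimally driven control monopole at separation d (a standard active-noise-control identity, not a formula taken from this paper):

```latex
W_{\min} \;=\; W_p \left[\, 1 - \left( \frac{\sin kd}{kd} \right)^{2} \right],
\qquad
\lim_{kd \to 0} W_{\min} = 0 ,
```

so collocating the control source (kd → 0) drives the minimized total acoustic power to zero, which is precisely what projecting the parametric beam onto the primary source emulates.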

  5. Generic project definitions for improvement of health care delivery: a case-based approach.

    PubMed

    Niemeijer, Gerard C; Does, Ronald J M M; de Mast, Jeroen; Trip, Albert; van den Heuvel, Jaap

    2011-01-01

    The purpose of this article is to create actionable knowledge, making the definition of process improvement projects in health care delivery more effective. This study is a retrospective analysis of process improvement projects in hospitals, facilitating a case-based reasoning approach to project definition. Data sources were project documentation and hospital-performance statistics of 271 Lean Six Sigma health care projects from 2002 to 2009 of general, teaching, and academic hospitals in the Netherlands and Belgium. Objectives and operational definitions of improvement projects in the sample, analyzed and structured in a uniform format and terminology. Extraction of reusable elements of earlier project definitions, presented in the form of 9 templates called generic project definitions. These templates function as exemplars for future process improvement projects, making the selection, definition, and operationalization of similar projects more efficient. Each template includes an explicated rationale, an operationalization in the form of metrics, and a prototypical example. Thus, a process of incremental and sustained learning based on case-based reasoning is facilitated. The quality of project definitions is a crucial success factor in pursuits to improve health care delivery. We offer 9 tried and tested improvement themes related to patient safety, patient satisfaction, and business-economic performance of hospitals.

  6. OER Use in Intermediate Language Instruction: A Case Study

    ERIC Educational Resources Information Center

    Godwin-Jones, Robert

    2017-01-01

    This paper reports on a case study in the experimental use of Open Educational Resources (OERs) in intermediate level language instruction. The resources come from three sources: the instructor, the students, and open content repositories. The objective of this action research project was to provide student-centered learning materials, enhance…

  7. Accelerator-based BNCT.

    PubMed

    Kreiner, A J; Baldo, M; Bergueiro, J R; Cartelli, D; Castell, W; Thatar Vento, V; Gomez Asoia, J; Mercuri, D; Padulo, J; Suarez Sandin, J C; Erhardt, J; Kesque, J M; Valda, A A; Debray, M E; Somacal, H R; Igarzabal, M; Minsky, D M; Herrera, M S; Capoulat, M E; Gonzalez, S J; del Grosso, M F; Gagetti, L; Suarez Anzorena, M; Gun, M; Carranza, O

    2014-06-01

    The activity in accelerator development for accelerator-based BNCT (AB-BNCT) both worldwide and in Argentina is described. Projects in Russia, UK, Italy, Japan, Israel, and Argentina to develop AB-BNCT around different types of accelerators are briefly presented. In particular, the present status and recent progress of the Argentine project will be reviewed. The topics will cover: intense ion sources, accelerator tubes, transport of intense beams, beam diagnostics, the (9)Be(d,n) reaction as a possible neutron source, Beam Shaping Assemblies (BSA), a treatment room, and treatment planning in realistic cases. © 2013 Elsevier Ltd. All rights reserved.

  8. Energy Sources and Systems Analysis: 40 South Lincoln Redevelopment District (Full Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2011-08-01

    This report presents the results of a case study to analyze district energy systems for their potential use in a project that involves redeveloping 270 units of existing public housing, along with other nearby sites. When complete, the redevelopment project will encompass more than 900 mixed-income residential units, commercial and retail properties, and open space. The analysis estimated the hourly heating, cooling, domestic hot water, and electric loads required by the community; investigated potential district system technologies to meet those needs; and researched available fuel sources to power such systems.

  9. Studying the laws of software evolution in a long-lived FLOSS project.

    PubMed

    Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe

    2014-07-01

    Some free, open-source software projects have been around for quite a long time, the longest living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one such project, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found how some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd.

  10. Studying the laws of software evolution in a long-lived FLOSS project

    PubMed Central

    Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe

    2014-01-01

    Some free, open-source software projects have been around for quite a long time, the longest living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one such project, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found how some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd. PMID:25893093

  11. Environmental economics reality check: a case study of the Abanico Medicinal Plant and Organic Agriculture Microenterprise Project.

    PubMed

    Isla, Ana; Thompson, Shirley

    2003-01-01

    This paper presents a case study of the Abanico Medicinal Plant and Organic Agriculture Microenterprise Project in the Arenal Conservation Area, Costa Rica. Microenterprise is the Sustainable Development and the Women in Development model for gender equity and environment of the World Bank, International Monetary Fund and large non-government organizations, like the World Wildlife Fund-Canada. The authors of this paper argue that debt-for-nature investment in microenterprise and ecological economic models are not distinct from the neoclassical economic and development models that created the environmental, social and cultural crises in the first place. This case study shows that the world market accommodates only one model of development: unsustainable export-oriented production based on flexible labour markets, low wages, indebtedness and low cost production. Working standards in those micro-enterprises are eroded due to many factors, including indebtedness. What happened at a national level in non-industrial countries with the international debt crisis is now mirrored in individual indebtedness through microenterprise. Is current development policy creating a new form of indentured servitude? Medicinal plants, prior to commodification, were a source of women's power and, upon commodification in international development projects, are the source of their exploitation.

  12. Improving tuberculosis control through public-private collaboration in India: literature review.

    PubMed

    Dewan, Puneet K; Lal, S S; Lonnroth, Knut; Wares, Fraser; Uplekar, Mukund; Sahu, Suvanand; Granich, Reuben; Chauhan, Lakhbir Singh

    2006-03-11

    To review the characteristics of public-private mix projects in India and their effect on case notification and treatment outcomes for tuberculosis. Literature review. Review of surveillance records from Indian tuberculosis programme project, evaluation reports, and medical literature for public-private mix projects in India. Project characteristics, tuberculosis case notification of new patients with sputum smear results positive for acid fast bacilli, and treatment outcome. Of 24 identified public-private mix projects, data were available from 14 (58%), involving private practitioners, corporations, and non-governmental organisations. In all reviewed projects, the public sector tuberculosis programme provided training and supervision of private providers. Among the five projects with available data on historical controls, case notification rates were higher after implementation of a public-private mix project. Among seven projects involving private practitioners, 2796 of 12 147 (23%) new patients positive for acid fast bacilli were attributed to private providers. Corporate based and non-governmental organisations served as the main source for tuberculosis programme services in seven project areas, detecting 9967 new patients positive for acid fast bacilli. In nine of 12 projects with data on treatment outcomes, private providers exceeded the programme target of 85% treatment success for new patients positive for acid fast bacilli. Public-private mix activities were associated with increased case notification, while maintaining acceptable treatment outcomes. Collaborations between public and private providers of health care hold considerable potential to improve tuberculosis control in India.

  13. Evaluation on uncertainty sources in projecting hydrological changes over the Xijiang River basin in South China

    NASA Astrophysics Data System (ADS)

    Yuan, Fei; Zhao, Chongxu; Jiang, Yong; Ren, Liliang; Shan, Hongcui; Zhang, Limin; Zhu, Yonghua; Chen, Tao; Jiang, Shanhu; Yang, Xiaoli; Shen, Hongren

    2017-11-01

    Projections of hydrological changes are associated with large uncertainties from different sources, which should be quantified for an effective implementation of water management policies adaptive to future climate change. In this study, a modeling chain framework to project future hydrological changes and the associated uncertainties in the Xijiang River basin, South China, was established. The framework consists of three emission scenarios (ESs), four climate models (CMs), four statistical downscaling (SD) methods, four hydrological modeling (HM) schemes, and four probability distributions (PDs) for extreme flow frequency analyses. Direct variance method was adopted to analyze the manner by which uncertainty sources such as ES, CM, SD, and HM affect the estimates of future evapotranspiration (ET) and streamflow, and to quantify the uncertainties of PDs in future flood and drought risk assessment. Results show that ES is one of the least important uncertainty sources in most situations. CM, in general, is the dominant uncertainty source for the projections of monthly ET and monthly streamflow during most of the annual cycle, daily streamflow below the 99.6% quantile level, and extreme low flow. SD is the most predominant uncertainty source in the projections of extreme high flow, and has a considerable percentage of uncertainty contribution in monthly streamflow projections in July-September. The effects of SD in other cases are negligible. HM is a non-ignorable uncertainty source that has the potential to produce much larger uncertainties for the projections of low flow and ET in warm and wet seasons than for the projections of high flow. PD contributes a larger percentage of uncertainty in extreme flood projections than it does in extreme low flow estimates. 
Despite the large uncertainties in hydrological projections, this work found that future extreme low flow would undergo a considerable reduction, and a noticeable increase in drought risk in the Xijiang River basin would be expected. Thus, the necessity of employing effective water-saving techniques and adaptive water resources management strategies for drought disaster mitigation should be addressed.
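
    The variance-based attribution described in this abstract can be illustrated with a minimal sketch. The ensemble values, factor sizes, and the simple main-effect decomposition below are assumptions for illustration only, not the study's actual direct variance method:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    # Hypothetical ensemble of projected streamflow changes (%), one value
    # per modeling chain: 3 emission scenarios x 4 climate models
    # x 4 downscaling methods x 4 hydrological models.
    proj = rng.normal(loc=5.0, scale=2.0, size=(3, 4, 4, 4))

    def variance_fractions(ens, names=("ES", "CM", "SD", "HM")):
        """Fraction of total ensemble variance attributable to each factor,
        estimated from the variance of per-factor means (main effects only;
        interactions are ignored in this simplified sketch)."""
        total = ens.var()
        fracs = {}
        for axis, name in enumerate(names):
            other_axes = tuple(i for i in range(ens.ndim) if i != axis)
            # Average out all other factors, then measure the spread
            # left between this factor's levels.
            fracs[name] = ens.mean(axis=other_axes).var() / total
        return fracs

    fractions = variance_fractions(proj)
    ```

    Ranking the resulting fractions reproduces the kind of statement the abstract makes, e.g. that CM dominates while ES contributes little.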

  14. Shaping Software Engineering Curricula Using Open Source Communities: A Case Study

    ERIC Educational Resources Information Center

    Bowring, James; Burke, Quinn

    2016-01-01

    This paper documents four years of a novel approach to teaching a two-course sequence in software engineering as part of the ABET-accredited computer science curriculum at the College of Charleston. This approach is team-based and centers on learning software engineering in the context of open source software projects. In the first course, teams…

  15. The CHT2 Project: Diachronic 3d Reconstruction of Historic Sites

    NASA Astrophysics Data System (ADS)

    Guidi, G.; Micoli, L.; Gonizzi Barsanti, S.; Malik, U.

    2017-08-01

Digital modelling of archaeological and architectural monuments, both in their current state and in their presumed past aspect, has been recognized not only as a way of explaining the genesis of a historical site to the public, but also as an effective tool for research. The search for historical sources, their proper analysis, and an interdisciplinary relationship between technological disciplines and the humanities are fundamental for obtaining reliable hypothetical reconstructions. This paper presents an experimental activity defined by the project Cultural Heritage Through Time - CHT2 (http://cht2-project.eu), funded in the framework of the Joint Programming Initiative on Cultural Heritage (JPI-CH) of the European Commission. Its goal is to develop time-varying 3D products, from landscape to architectural scale. The work presented here deals with the implementation of the methodology on one of the case studies: the late Roman circus of Milan, built in the era when the city was the capital of the Western Roman Empire (286-402 A.D.). It covers one of the cases in which the physical evidence has now almost entirely disappeared. The diachronic reconstruction is based on a proper mix of quantitative data originating from 3D surveys at the present time and historical sources such as ancient maps, drawings, archaeological reports, archaeological restriction decrees, and old photographs. These heterogeneous sources were first georeferenced and then properly integrated according to the methodology defined in the framework of the CHT2 project, to hypothesize a reliable reconstruction of the area in different historical periods.

  16. Locating People Diagnosed With HIV for Public Health Action: Utility of HIV Case Surveillance and Other Data Sources.

    PubMed

    Padilla, Mabel; Mattson, Christine L; Scheer, Susan; Udeagu, Chi-Chi N; Buskin, Susan E; Hughes, Alison J; Jaenicke, Thomas; Wohl, Amy Rock; Prejean, Joseph; Wei, Stanley C

Human immunodeficiency virus (HIV) case surveillance and other health care databases are increasingly being used for public health action, which has the potential to optimize the health outcomes of people living with HIV (PLWH). However, PLWH often cannot be located based on the contact information available in these data sources. We assessed the accuracy of contact information for PLWH in HIV case surveillance and additional data sources and whether time since diagnosis was associated with accurate contact information in HIV case surveillance and successful contact. The Case Surveillance-Based Sampling (CSBS) project was a pilot HIV surveillance system that selected a random population-based sample of people diagnosed with HIV from HIV case surveillance registries in 5 state and metropolitan areas. From November 2012 through June 2014, CSBS staff members attempted to locate and interview 1800 sampled people and used 22 data sources to search for contact information. Among 1063 contacted PLWH, HIV case surveillance data provided accurate telephone number, address, or HIV care facility information for 239 (22%), 412 (39%), and 827 (78%) sampled people, respectively. CSBS staff members used additional data sources, such as support services and commercial people-search databases, to locate and contact PLWH with insufficient contact information in HIV case surveillance. PLWH diagnosed <1 year ago were more likely to have accurate contact information in HIV case surveillance than were PLWH diagnosed ≥1 year ago (P = .002), and the benefit from using additional data sources was greater for PLWH with more longstanding HIV infection (P < .001). When HIV case surveillance cannot provide accurate contact information, health departments can prioritize searching additional data sources, especially for people with more longstanding HIV infection.

  17. Emittance study of a 28 GHz electron cyclotron resonance ion source for the Rare Isotope Science Project superconducting linear accelerator.

    PubMed

    Park, Bum-Sik; Hong, In-Seok; Jang, Ji-Ho; Jin, Hyunchang; Choi, Sukjin; Kim, Yonghwan

    2016-02-01

A 28 GHz electron cyclotron resonance (ECR) ion source is being developed for use as an injector for the superconducting linear accelerator of the Rare Isotope Science Project. Beam extraction from the ECR ion source has been simulated using the KOBRA3-INP software. The simulation software can calculate charged particle trajectories in three-dimensional complex magnetic field structures, which in this case are formed by the arrangement of five superconducting magnets. In this study, the beam emittance is simulated to understand the effects of plasma potential, mass-to-charge ratio, and spatial distribution. The results of these simulations and their comparison to experimental results are presented in this paper.
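
    The abstract gives no formulas, but the RMS emittance that such beam-extraction simulations typically report has a standard statistical definition. A sketch follows, assuming a simple particle ensemble of positions x and divergences x' (the Gaussian test beam is invented for illustration):

    ```python
    import numpy as np

    def rms_emittance(x, xp):
        """Statistical (RMS) emittance of a beam from particle coordinates.

        x  -- transverse positions (e.g., mm)
        xp -- transverse divergences x' = dx/dz (e.g., mrad)

        epsilon_rms = sqrt(<x^2><x'^2> - <x x'>^2), using centered moments.
        """
        x = np.asarray(x) - np.mean(x)
        xp = np.asarray(xp) - np.mean(xp)
        return np.sqrt(np.mean(x**2) * np.mean(xp**2) - np.mean(x * xp)**2)

    # Hypothetical uncorrelated Gaussian beam: the emittance approaches
    # sigma_x * sigma_x' as the sample size grows.
    rng = np.random.default_rng(0)
    eps = rms_emittance(rng.normal(0.0, 1.0, 100_000),
                        rng.normal(0.0, 0.5, 100_000))
    ```

    For a correlated (converging or diverging) beam, the cross term <x x'> reduces the result, which is why the centered covariance must be subtracted.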

  18. Case Study of a Participatory Health-Promotion Intervention in School

    ERIC Educational Resources Information Center

    Simovska, Venka

    2012-01-01

    This article discusses the findings from a case study focusing on processes involving pupils to bring about health-promotion changes. The study is related to an EU intervention project aiming to promote health and well-being among children (4-16 years). Qualitative research was carried out in a school in the Netherlands. Data sources include…

  19. Reaching High-Risk Youth through Model AIDS Education Programs: A Case by Case Study.

    ERIC Educational Resources Information Center

    Center for Population Options, Washington, DC.

    This report evaluates the High Risk Youth Demonstration Project, which is predicated on the idea that youth-serving agencies (YSAs) can be key sources for adolescent AIDS education. When the Center for Population Options (CPO) conceptualized a strategy for bringing AIDS education to underserved youth, it was responding to the following three areas…

  20. Open Source Software and Design-Based Research Symbiosis in Developing 3D Virtual Learning Environments: Examples from the iSocial Project

    ERIC Educational Resources Information Center

    Schmidt, Matthew; Galyen, Krista; Laffey, James; Babiuch, Ryan; Schmidt, Carla

    2014-01-01

    Design-based research (DBR) and open source software are both acknowledged as potentially productive ways for advancing learning technologies. These approaches have practical benefits for the design and development process and for building and leveraging community to augment and sustain design and development. This report presents a case study of…

  1. University of Colorado at Boulder: Energy and Climate Revolving Fund. Green Revolving Funds in Action: Case Study Series

    ERIC Educational Resources Information Center

    Caine, Rebecca

    2012-01-01

    The University of Colorado at Boulder's student run Environmental Center leads the campus' sustainability efforts. The Center created the Energy and Climate Revolving Fund (ECRF) in 2007 to finance energy-efficiency upgrades. The ECRF functions as a source of funding for project loans and provides a method of financing projects that seeks to save…

  2. Taking a Giant Step: A Case Study of New York City's Efforts to Implement Universal Pre-Kindergarten Services. Working Paper Series.

    ERIC Educational Resources Information Center

    Gatenio, Shirley

    This working paper examines why Project Giant Step, a well-received policy initiative for universal preschool for 4-year-olds in New York City, was discontinued after overcoming many of the challenges it faced. Project Giant Step forced collaboration between large public agencies differing in their institutional structures, funding sources, and…

  3. Newly invented biobased materials from low-carbon, diverted waste fibers: research methods, testing, and full-scale application in a case study structure

    Treesearch

    Julee A Herdt; John Hunt; Kellen Schauermann

    2016-01-01

This project demonstrates newly invented, biobased construction materials developed by applying low-carbon biomass waste sources through the authors' engineered fiber processes and technology. If manufactured and applied at large scale, the project inventions can divert large volumes of cellulose waste into high-performance, low-embodied-energy, environmental construction...

  4. Technical Guidance Manual: Contaminant Flux Reduction Barriers for Managing Difficult-to-Treat Source Zones in Unconsolidated Media

    DTIC Science & Technology

    2017-06-20

The overall objective of this project was to evaluate whether inexpensive flow reduction agents delivered via permeation grouting technology could help manage difficult-to-treat chlorinated...

  5. A Process Study of the Development of Virtual Research Environments

    NASA Astrophysics Data System (ADS)

    Ahmed, I.; Cooper, K.; McGrath, R.; Griego, G.; Poole, M. S.; Hanisch, R. J.

    2014-05-01

In recent years, cyberinfrastructures have been deployed to create virtual research environments (VREs) - such as the Virtual Astronomical Observatory (VAO) - to enhance the quality and speed of scientific research and to foster global scientific communities. Our study utilizes process methodology to study the evolution of VREs. This approach focuses on a series of events that bring about or lead to some outcome, and attempts to specify the generative mechanism that could produce the event series. This paper briefly outlines our approach and describes initial results of a case study of the VAO, one of the participating VREs. The case study is based on interviews with seven individuals participating in the VAO and on analysis of project documents and online resources. These sources were hand-tagged to identify events related to the thematic tracks, yielding a narrative of the project. Results demonstrate the event series of an organization through traditional methods augmented by virtual sources.

  6. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools, or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses for, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into the risks associated with this approach.

  7. Implementing an Open Source Electronic Health Record System in Kenyan Health Care Facilities: Case Study

    PubMed Central

    Magare, Steve; Monda, Jonathan; Kamau, Onesmus; Houston, Stuart; Fraser, Hamish; Powell, John; English, Mike; Paton, Chris

    2018-01-01

    Background The Kenyan government, working with international partners and local organizations, has developed an eHealth strategy, specified standards, and guidelines for electronic health record adoption in public hospitals and implemented two major health information technology projects: District Health Information Software Version 2, for collating national health care indicators and a rollout of the KenyaEMR and International Quality Care Health Management Information Systems, for managing 600 HIV clinics across the country. Following these projects, a modified version of the Open Medical Record System electronic health record was specified and developed to fulfill the clinical and administrative requirements of health care facilities operated by devolved counties in Kenya and to automate the process of collating health care indicators and entering them into the District Health Information Software Version 2 system. Objective We aimed to present a descriptive case study of the implementation of an open source electronic health record system in public health care facilities in Kenya. Methods We conducted a landscape review of existing literature concerning eHealth policies and electronic health record development in Kenya. Following initial discussions with the Ministry of Health, the World Health Organization, and implementing partners, we conducted a series of visits to implementing sites to conduct semistructured individual interviews and group discussions with stakeholders to produce a historical case study of the implementation. Results This case study describes how consultants based in Kenya, working with developers in India and project stakeholders, implemented the new system into several public hospitals in a county in rural Kenya. The implementation process included upgrading the hospital information technology infrastructure, training users, and attempting to garner administrative and clinical buy-in for adoption of the system. 
The initial deployment was ultimately scaled back due to a complex mix of sociotechnical and administrative issues. Learning from these early challenges, the system is now being redesigned and prepared for deployment in 6 new counties across Kenya. Conclusions Implementing electronic health record systems is a challenging process in high-income settings. In low-income settings, such as Kenya, open source software may offer some respite from the high costs of software licensing, but the familiar challenges of clinical and administration buy-in, the need to adequately train users, and the need for the provision of ongoing technical support are common across the North-South divide. Strategies such as creating local support teams, using local development resources, ensuring end user buy-in, and rolling out in smaller facilities before larger hospitals are being incorporated into the project. These are positive developments to help maintain momentum as the project continues. Further integration with existing open source communities could help ongoing development and implementations of the project. We hope this case study will provide some lessons and guidance for other challenging implementations of electronic health record systems as they continue across Africa. PMID:29669709

  8. Optimization of subcutaneous vein contrast enhancement

    NASA Astrophysics Data System (ADS)

    Zeman, Herbert D.; Lovhoiden, Gunnar; Deshmukh, Harshal

    2000-05-01

A technique for enhancing the contrast of subcutaneous veins has been demonstrated. This technique uses a near-IR light source and one or more IR-sensitive CCD TV cameras to produce a contrast-enhanced image of the subcutaneous veins. This video image of the veins is projected back onto the patient's skin using an LCD video projector. The use of an IR-transmitting filter in front of the video cameras prevents the visible light from the video projector from causing positive-feedback instabilities in the projected image. The demonstration contrast-enhancing illuminator has been tested on adults and children, both Caucasian and African-American, and it enhances veins quite well in all cases. The most difficult cases are those where significant deposits of subcutaneous fat are present, which make the veins invisible under normal room illumination. Recent attempts to see through fat using different IR wavelength bands and both linearly and circularly polarized light were unsuccessful. The key to seeing through fat turns out to be a very diffuse source of IR light. Results on adult and pediatric subjects are shown with this new IR light source.

  9. Summary of Carbon Storage Incentives and Potential Legislation: East Sub-Basin Project Task 3.1 Business and Financial Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trabucchi, Chiara

    The CarbonSAFE Illinois – East Sub-Basin project is conducting a pre-feasibility assessment for commercial-scale CO2 geological storage complexes. The project aims to identify sites capable of storing more than 50 million tons of industrially-sourced CO2. To support the business development assessment of the economic viability of potential sites in the East Sub-Basin and explore conditions under which a carbon capture and storage (CCS) project therein might be revenue positive, this document provides a summary of carbon storage incentives and legislation of potential relevance to the project.

  10. A Pedagogical Trebuchet: A Case Study in Experimental History and History Pedagogy

    ERIC Educational Resources Information Center

    Brice, Lee L.; Catania, Steven

    2012-01-01

    A common problem history teachers face regardless of their field of specialization is how to help students find answers to the most difficult historical questions, those for which the sources are unavailable or inaccessible, and teach them to do so in a methodologically valid manner. This article presents a case study which shows how a project in…

  11. One Science Teacher's Professional Development Experience: A Case Study Exploring Changes in Students' Perceptions of Their Fluency with Innovative Technologies

    ERIC Educational Resources Information Center

    Ebenezer, Jazlin; Columbus, Russell; Kaya, Osman Nafiz; Zhang, Lin; Ebenezer, Devairakkam Luke

    2012-01-01

    The purpose of this case-study is to narrate a secondary science teacher's experience of his professional development (PD) education and training in innovative technologies (IT) in the context of engaging students in environmental research projects. The sources from which the narrative is derived include (1) the science teacher's reflective…

  12. Cyanide poisoning in Thailand before and after establishment of the National Antidote Project.

    PubMed

    Srisuma, Sahaphume; Pradoo, Aimon; Rittilert, Panee; Wongvisavakorn, Sunun; Tongpoo, Achara; Sriapha, Charuwan; Krairojananan, Wannapa; Suchonwanich, Netnapis; Khomvilai, Sumana; Wananukul, Winai

    2018-04-01

Antidote shortage is a global problem. In Thailand, the National Antidote Project (NAP) has operated since November 2010 to manage the national antidote stockpile, educate healthcare providers on appropriate antidote use, and evaluate antidote usage. This study evaluated the effect of NAP implementation on mortality rate and antidote use in cyanide poisoning cases arising from ingestion of cyanide or cyanogenic glycoside. This is a retrospective cohort of poisoning cases involving cyanide or cyanogenic glycoside ingestion reported to Ramathibodi Poison Center from 1 January 2007 to 31 December 2015. Mortality rate, antidote use, and appropriateness of antidote use (defined as correct indication, proper dosing regimen, and administration within 90 min) before and after NAP implementation were compared. Association between parameters and fatal outcomes was analyzed. A total of 343 cases involving cyanide or cyanogenic glycoside ingestion were reported to Ramathibodi Poison Center. There were 213 cases (62.1%) during NAP (Project group) and 130 cases (37.9%) pre-NAP implementation (Before group). Implementation of NAP led to increased antidote use (39.9% in Project group versus 24.6% in Before group) and a higher rate of appropriate antidote use (74.1% in Project group versus 50.0% in Before group). All 30 patients who died had presented with initial severe symptoms. Cyanide chemical source and self-harm intent were associated with death (OR: 12.919, 95% CI: 4.863-39.761 and OR: 10.747, 95% CI: 3.884-28.514, respectively). No difference in overall mortality rate (13 [10.0%] deaths before versus 17 [8.0%] deaths after NAP) was found. In subgroup analysis of 80 cases with initial severe symptoms, NAP and appropriate antidote use reduced mortality (OR: 0.327, 95% CI: 0.106-0.997 and OR: 0.024, 95% CI: 0.004-0.122, respectively). 
In the multivariate analysis of the cases with initial severe symptoms, presence of the NAP and appropriate antidote use independently reduced the risk of death (OR: 0.122, 95% CI: 0.023-0.633 and OR: 0.034, 95% CI: 0.007-0.167, respectively), adjusted for intent of exposure, cyanide source, age, and sex. After NAP implementation, both antidote use and appropriate antidote use increased. In cases presenting with severe symptoms, presence of the NAP and appropriate antidote use independently reduced the risk of mortality.
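
    The odds ratios and 95% CIs reported above come from standard 2x2-table and multivariate logistic-regression methods. A minimal sketch of an unadjusted odds ratio with a Wald confidence interval follows; the counts are made up for illustration and are not the study's data:

    ```python
    import math

    def odds_ratio_wald_ci(a, b, c, d, z=1.96):
        """Unadjusted odds ratio for a 2x2 table with a Wald 95% CI.

        a, b -- exposed group: with outcome, without outcome
        c, d -- unexposed group: with outcome, without outcome
        """
        or_ = (a * d) / (b * c)
        # Standard error of log(OR) for the Wald interval.
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(or_) - z * se_log_or)
        hi = math.exp(math.log(or_) + z * se_log_or)
        return or_, lo, hi

    # Hypothetical counts: 10 deaths / 20 survivors among exposed,
    # 5 deaths / 40 survivors among unexposed.
    or_, lo, hi = odds_ratio_wald_ci(10, 20, 5, 40)  # OR = 4.0
    ```

    An interval that excludes 1 (as here) corresponds to a statistically significant association, which is how the abstract's ORs should be read.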

  13. Indian energy sources in 1980's

    NASA Astrophysics Data System (ADS)

    Chaturvedi, A. C.

    Indian energy sources for electrical power generation are surveyed with a view to the development of the available hydroelectric resources. The capital-intensive nature of hydroelectric projects and their long gestation periods have impeded the rapid exploitation of the hydroelectric resources in the country, which are expected to provide 37% of the 16,200 MW capacity anticipated by 2001. Alternative sources of power such as solar and wind energy, biogas conversion and the use of industrial waste heat to produce electricity are discussed with case studies presented.

  14. Organizing and Presenting Program Outcome Data.

    ERIC Educational Resources Information Center

    Anema, Marion G.; Brown, Barbara E.; Stringfield, Yvonne N.

    2003-01-01

    Data collection and assessment processes used by a nursing school are described. Sources include student achievement data from tests, projects, journals, case studies, community service, and clinical practicums. The ways in which data are organized, presented, and used are discussed. (SK)

  15. Workflows and the Role of Images for Virtual 3d Reconstruction of no Longer Extant Historic Objects

    NASA Astrophysics Data System (ADS)

    Münster, S.

    2013-07-01

3D reconstruction technologies have gained importance as tools for the research and visualization of no longer extant historic objects during the last decade. Within such reconstruction processes, visual media assumes several important roles: as the most important source, especially for the reconstruction of no longer extant objects; as a tool for communication and cooperation within the production process; and as a means of communicating and visualizing results. While there are many discourses about theoretical issues of depiction as a source and as a visualization outcome of such projects, there is no systematic, empirically grounded research on the importance of depiction during the 3D reconstruction process. Moreover, from a methodological perspective, it is necessary to understand which role visual media plays during the production process and how it is affected by disciplinary boundaries and by challenges specific to historic topics. The research includes an analysis of published work and case studies investigating reconstruction projects. This study uses methods taken from the social sciences to gain a grounded view of how production processes take place in practice and which functions and roles images play within them. For the investigation of these topics, a content analysis of 452 conference proceedings and journal articles related to 3D reconstruction modeling in the field of the humanities has been completed. Most of the projects described in those publications dealt with data acquisition and model building for existing objects. Only a small number of projects focused on structures that no longer, or never, existed physically. That type of project in particular seems interesting for a study of the importance of pictures as sources and as tools for interdisciplinary cooperation during the production process. 
In the course of the examination, the authors applied a qualitative content analysis to a sample of 26 previously published project reports to identify strategies and types, and used three case studies of 3D reconstruction projects to evaluate evolutionary processes during such projects. The research showed that reconstructions of no longer existing historic structures are most commonly used for presentation or research purposes involving large buildings or city models. Additionally, they are often realized by interdisciplinary workgroups using images as the most important source for reconstruction, as well as important media for communication and quality control during the reconstruction process.

  16. Cost-effectiveness of diabetes case management for low-income populations.

    PubMed

    Gilmer, Todd P; Roze, Stéphane; Valentine, William J; Emy-Albrecht, Katrina; Ray, Joshua A; Cobden, David; Nicklasson, Lars; Philis-Tsimikas, Athena; Palmer, Andrew J

    2007-10-01

To evaluate the cost-effectiveness of Project Dulce, a culturally specific diabetes case management and self-management training program, in four cohorts defined by insurance status. Clinical and cost data on 3,893 persons with diabetes participating in Project Dulce were used as inputs into a diabetes simulation model. The Center for Outcomes Research Diabetes Model, a published, peer-reviewed, and validated simulation model of diabetes, was used to evaluate life expectancy, quality-adjusted life expectancy (QALY), cumulative incidence of complications, and direct medical costs over patient lifetimes (40-year time horizon) from a third-party payer perspective. Cohort characteristics, treatment effects, and case management costs were derived using a difference-in-differences design comparing data from the Project Dulce program to a cohort of historical controls. Long-term costs were derived from published U.S. sources. Costs and clinical benefits were discounted at 3.0 percent per annum. Sensitivity analyses were performed. Incremental cost-effectiveness ratios of $10,141, $24,584, $44,941, and $69,587 per QALY gained were estimated for Project Dulce participants versus control in the uninsured, County Medical Services, Medi-Cal, and commercial insurance cohorts, respectively. The Project Dulce diabetes case management program was associated with cost-effective improvements in quality-adjusted life expectancy and a decreased incidence of diabetes-related complications over patient lifetimes. Diabetes case management may be particularly cost-effective for low-income populations.
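
    The incremental cost-effectiveness ratios above follow the standard ICER definition, with costs and benefits discounted at 3% per annum. A minimal sketch using invented cost and QALY figures (not the study's model outputs):

    ```python
    def discounted_total(annual_values, rate=0.03):
        """Present value of a stream of annual values discounted at `rate`,
        with the first year undiscounted (t = 0)."""
        return sum(v / (1 + rate) ** t for t, v in enumerate(annual_values))

    def icer(cost_intervention, qalys_intervention, cost_control, qalys_control):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        return (cost_intervention - cost_control) / (qalys_intervention - qalys_control)

    # Hypothetical discounted lifetime totals for one cohort:
    # intervention costs $60,000 and yields 10.5 QALYs; control costs
    # $50,000 and yields 10.0 QALYs.
    ratio = icer(60_000.0, 10.5, 50_000.0, 10.0)  # $20,000 per QALY gained
    ```

    The resulting dollars-per-QALY figure is then compared against a willingness-to-pay threshold, which is how the abstract's cohort-level ratios are interpreted.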

  17. Risk Management in Complex Construction Projects that Apply Renewable Energy Sources: A Case Study of the Realization Phase of the Energis Educational and Research Intelligent Building

    NASA Astrophysics Data System (ADS)

    Krechowicz, Maria

    2017-10-01

Nowadays, one of the characteristic features of the construction industry is the increased complexity of a growing number of projects. Almost every construction project is unique, with its project-specific purpose, its own structural complexity, owner's expectations, ground conditions unique to a certain location, and its own dynamics. Failure costs and costs resulting from unforeseen problems in complex construction projects are very high. Project complexity drivers pose many vulnerabilities to the successful completion of a number of projects. This paper discusses the process of effective risk management in complex construction projects in which renewable energy sources were used, on the example of the realization phase of the ENERGIS teaching-laboratory building, from the point of view of DORBUD S.A., its general contractor. The paper suggests a new approach to risk management for complex construction projects in which renewable energy sources are applied. The risk management process was divided into six stages: gathering information; identification of the top critical project risks resulting from the project complexity; construction of a fault tree for each top critical risk; logical analysis of the fault tree; quantitative risk assessment applying fuzzy logic; and development of a risk response strategy. A new methodology for the qualitative and quantitative assessment of top critical risks in complex construction projects was developed. Risk assessment was carried out applying fuzzy fault tree analysis on the example of one top critical risk. Application of fuzzy set theory to the proposed model reduced uncertainty and eliminated problems with obtaining crisp values for basic event probabilities, which are common during expert risk assessment aimed at giving an exact risk score for each unwanted event.
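
    The fuzzy fault tree step can be illustrated with a minimal sketch using triangular fuzzy numbers for basic-event probabilities. The gate formulas below assume independent events and follow a common textbook treatment, not necessarily the paper's exact formulation, and the expert estimates are invented:

    ```python
    # A triangular fuzzy number (TFN) is represented as (low, mode, high).

    def fuzzy_and(a, b):
        """AND gate: probability that both independent basic events occur."""
        return tuple(x * y for x, y in zip(a, b))

    def fuzzy_or(a, b):
        """OR gate: probability that at least one of two independent
        basic events occurs."""
        return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

    def defuzzify(tfn):
        """Centroid defuzzification of a TFN into a crisp risk score."""
        return sum(tfn) / 3.0

    # Hypothetical expert estimates for two basic events feeding one
    # top critical risk through an OR gate:
    e1 = (0.10, 0.20, 0.30)
    e2 = (0.05, 0.10, 0.20)
    top = fuzzy_or(e1, e2)
    crisp = defuzzify(top)
    ```

    Working with TFNs instead of single probabilities is what lets experts express vague judgments ("around 0.2, at most 0.3") directly, which is the problem with crisp values that the abstract describes.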

  18. The Unified Medical Language System

    PubMed Central

    Humphreys, Betsy L.; Lindberg, Donald A. B.; Schoolman, Harold M.; Barnett, G. Octo

    1998-01-01

    In 1986, the National Library of Medicine (NLM) assembled a large multidisciplinary, multisite team to work on the Unified Medical Language System (UMLS), a collaborative research project aimed at reducing fundamental barriers to the application of computers to medicine. Beyond its tangible products, the UMLS Knowledge Sources, and its influence on the field of informatics, the UMLS project is an interesting case study in collaborative research and development. It illustrates the strengths and challenges of substantive collaboration among widely distributed research groups. Over the past decade, advances in computing and communications have minimized the technical difficulties associated with UMLS collaboration and also facilitated the development, dissemination, and use of the UMLS Knowledge Sources. The spread of the World Wide Web has increased the visibility of the information access problems caused by multiple vocabularies and many information sources which are the focus of UMLS work. The time is propitious for building on UMLS accomplishments and making more progress on the informatics research issues first highlighted by the UMLS project more than 10 years ago. PMID:9452981

  19. The Unified Medical Language System: an informatics research collaboration.

    PubMed

    Humphreys, B L; Lindberg, D A; Schoolman, H M; Barnett, G O

    1998-01-01

    In 1986, the National Library of Medicine (NLM) assembled a large multidisciplinary, multisite team to work on the Unified Medical Language System (UMLS), a collaborative research project aimed at reducing fundamental barriers to the application of computers to medicine. Beyond its tangible products, the UMLS Knowledge Sources, and its influence on the field of informatics, the UMLS project is an interesting case study in collaborative research and development. It illustrates the strengths and challenges of substantive collaboration among widely distributed research groups. Over the past decade, advances in computing and communications have minimized the technical difficulties associated with UMLS collaboration and also facilitated the development, dissemination, and use of the UMLS Knowledge Sources. The spread of the World Wide Web has increased the visibility of the information access problems caused by multiple vocabularies and many information sources which are the focus of UMLS work. The time is propitious for building on UMLS accomplishments and making more progress on the informatics research issues first highlighted by the UMLS project more than 10 years ago.

  20. [Three patients with pneumonia due to Legionella associated with a sauna, a cooling tower and a caravan in The Netherlands].

    PubMed

    Bencini, M A; IJzerman, E P F; Bruin, J P; den Boer, J W

    2005-09-03

    In three male patients with lower respiratory disease, aged 51, 32 and 63 years, Legionnaires' disease was diagnosed by urinary antigen test and culture of the respiratory-tract fluid. In the second patient, the bronchoalveolar fluid also contained Streptococcus pneumoniae and Haemophilus influenzae. All three patients recovered after treatment: with azithromycin in the first, cefotaxime, vancomycin and levofloxacin in the second, and erythromycin and ciprofloxacin in the third. Legionella pneumophila pneumonia is clinically not clearly distinct from other pneumonias and has a high mortality rate when not treated with the proper antibiotics. For that reason, adequate and swift diagnosis is of great importance. The urinary antigen test meets both of these criteria. Still, it is advisable to use culture and serology as well if Legionnaires' disease is suspected in a patient, since the urinary antigen test has limitations. In addition, patient isolates are of epidemiological importance for public health. By comparing available patient isolates with Legionella strains from water sources, it is possible to identify sources of infection. In 2002, based on this principle, a project was started in The Netherlands aimed at identifying sources of infection, thereby preventing outbreaks of Legionnaires' disease by swift elimination of the source. Since the start of the project, 29 sources have been identified. In the cases described above these were a sauna, a cooling tower and a caravan, respectively. In suspected cases, respiratory-tract fluid must be collected to make possible such a source investigation.

  1. Repair, Evaluation, Maintenance, and Rehabilitation Research Program. Rehabilitation of Navigation Lock Walls: Case Histories.

    DTIC Science & Technology

    1987-12-01

    the 50-year design service life. Since these structures were built prior to 1940, the concrete does not contain intentionally entrained air and is...with which designers and contractors are familiar from past experience on new construction. However, there is increasing evidence that rehabilitation...with designers and contractors. Although the information obtained from the various sources varied widely from project to project, attempts were made to

  2. The IASLC Lung Cancer Staging Project: A Renewed Call to Participation.

    PubMed

    Giroux, Dorothy J; Van Schil, Paul; Asamura, Hisao; Rami-Porta, Ramón; Chansky, Kari; Crowley, John J; Rusch, Valerie W; Kernstine, Kemp

    2018-06-01

    Over the past two decades, the International Association for the Study of Lung Cancer (IASLC) Staging Project has been a steady source of evidence-based recommendations for the TNM classification for lung cancer published by the Union for International Cancer Control and the American Joint Committee on Cancer. The Staging and Prognostic Factors Committee of the IASLC is now issuing a call for participation in the next phase of the project, which is designed to inform the ninth edition of the TNM classification for lung cancer. Following the case recruitment model for the eighth edition database, volunteer site participants are asked to submit data on patients whose lung cancer was diagnosed between January 1, 2011, and December 31, 2019, to the project by means of a secure, electronic data capture system provided by Cancer Research And Biostatistics in Seattle, Washington. Alternatively, participants may transfer existing data sets. The continued success of the IASLC Staging Project in achieving its objectives will depend on the extent of international participation, the degree to which cases are entered directly into the electronic data capture system, and how closely externally submitted cases conform to the data elements for the project. Copyright © 2018 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.

  3. The Source Inversion Validation (SIV) Initiative: A Collaborative Study on Uncertainty Quantification in Earthquake Source Inversions

    NASA Astrophysics Data System (ADS)

    Mai, P. M.; Schorlemmer, D.; Page, M.

    2012-04-01

    Earthquake source inversions image the spatio-temporal rupture evolution on one or more fault planes using seismic and/or geodetic data. Such studies are critically important for earthquake seismology in general, and for advancing seismic hazard analysis in particular, as they reveal earthquake source complexity and help (i) to investigate earthquake mechanics; (ii) to develop spontaneous dynamic rupture models; (iii) to build models for generating rupture realizations for ground-motion simulations. In applications (i)-(iii), the underlying finite-fault source models are regarded as "data" (input information), but their uncertainties are essentially unknown. After all, source models are obtained by solving an inherently ill-posed inverse problem to which many a priori assumptions and uncertain observations are applied. The Source Inversion Validation (SIV) project is a collaborative effort to better understand the variability between rupture models for a single earthquake (as manifested in the finite-source rupture model database) and to develop robust uncertainty quantification for earthquake source inversions. The SIV project highlights the need to develop a long-standing and rigorous testing platform to examine the current state of the art in earthquake source inversion, and to develop and test novel source inversion approaches. We will review the current status of the SIV project, and report the findings and conclusions of the recent workshops. We will briefly discuss several source-inversion methods, how they treat uncertainties in data, and how they assess the posterior model uncertainty. Case studies include initial forward-modeling tests on Green's function calculations, and inversion results for synthetic data from a spontaneous dynamic crack-like strike-slip earthquake on a steeply dipping fault, embedded in a layered crustal velocity-density structure.
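    The ill-posedness described above can be made concrete with a linear finite-fault inversion posed as regularized least squares. The sketch below is purely synthetic and illustrative, not an SIV benchmark: the forward operator, noise level, and regularization weight are all assumptions.

    ```python
    import numpy as np

    # Minimal sketch: Tikhonov-regularized linear slip inversion.
    # G maps slip on fault patches to observed data samples; both the
    # operator and the data are synthetic stand-ins.
    rng = np.random.default_rng(0)
    n_obs, n_patches = 50, 20
    G = rng.normal(size=(n_obs, n_patches))        # forward operator (Green's functions)
    true_slip = np.zeros(n_patches)
    true_slip[5:10] = 1.0                          # compact rupture patch
    d = G @ true_slip + 0.05 * rng.normal(size=n_obs)  # noisy observations

    lam = 0.1                                      # regularization weight (assumed)
    # Solve min ||G m - d||^2 + lam^2 ||m||^2 via the normal equations;
    # different choices of lam yield different, equally "plausible" models,
    # which is one source of the model variability discussed above.
    m_est = np.linalg.solve(G.T @ G + lam**2 * np.eye(n_patches), G.T @ d)

    misfit = np.linalg.norm(G @ m_est - d) / np.linalg.norm(d)
    ```

    Varying `lam` (or the assumed noise model) traces out a family of source models with comparable data misfit, which is exactly the variability the SIV exercises set out to quantify.
    
    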

  4. Implementing an Open Source Electronic Health Record System in Kenyan Health Care Facilities: Case Study.

    PubMed

    Muinga, Naomi; Magare, Steve; Monda, Jonathan; Kamau, Onesmus; Houston, Stuart; Fraser, Hamish; Powell, John; English, Mike; Paton, Chris

    2018-04-18

    The Kenyan government, working with international partners and local organizations, has developed an eHealth strategy, specified standards and guidelines for electronic health record adoption in public hospitals, and implemented two major health information technology projects: District Health Information Software Version 2, for collating national health care indicators, and a rollout of the KenyaEMR and International Quality Care Health Management Information Systems, for managing 600 HIV clinics across the country. Following these projects, a modified version of the Open Medical Record System electronic health record was specified and developed to fulfill the clinical and administrative requirements of health care facilities operated by devolved counties in Kenya and to automate the process of collating health care indicators and entering them into the District Health Information Software Version 2 system. We aimed to present a descriptive case study of the implementation of an open source electronic health record system in public health care facilities in Kenya. We conducted a landscape review of existing literature concerning eHealth policies and electronic health record development in Kenya. Following initial discussions with the Ministry of Health, the World Health Organization, and implementing partners, we conducted a series of visits to implementing sites to conduct semistructured individual interviews and group discussions with stakeholders to produce a historical case study of the implementation. This case study describes how consultants based in Kenya, working with developers in India and project stakeholders, implemented the new system into several public hospitals in a county in rural Kenya. The implementation process included upgrading the hospital information technology infrastructure, training users, and attempting to garner administrative and clinical buy-in for adoption of the system. 
The initial deployment was ultimately scaled back due to a complex mix of sociotechnical and administrative issues. Learning from these early challenges, the system is now being redesigned and prepared for deployment in 6 new counties across Kenya. Implementing electronic health record systems is a challenging process in high-income settings. In low-income settings, such as Kenya, open source software may offer some respite from the high costs of software licensing, but the familiar challenges of clinical and administrative buy-in, the need to adequately train users, and the need for the provision of ongoing technical support are common across the North-South divide. Strategies such as creating local support teams, using local development resources, ensuring end user buy-in, and rolling out in smaller facilities before larger hospitals are being incorporated into the project. These are positive developments to help maintain momentum as the project continues. Further integration with existing open source communities could help ongoing development and implementations of the project. We hope this case study will provide some lessons and guidance for other challenging implementations of electronic health record systems as they continue across Africa. ©Naomi Muinga, Steve Magare, Jonathan Monda, Onesmus Kamau, Stuart Houston, Hamish Fraser, John Powell, Mike English, Chris Paton. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 18.04.2018.

  5. Vertical amplitude phase structure of a low-frequency acoustic field in shallow water

    NASA Astrophysics Data System (ADS)

    Kuznetsov, G. N.; Lebedev, O. V.; Stepanov, A. N.

    2016-11-01

    We obtain in integral and analytic form the relations for calculating the amplitude and phase characteristics of an interference structure of orthogonal projections of the oscillation velocity vector in shallow water. For different frequencies and receiver depths, we numerically study the source depth dependences of the effective phase velocities of an equivalent plane wave, the orthogonal projections of the sound pressure phase gradient, and the projections of the oscillation velocity vector. We establish that at low frequencies in zones of interference maxima, independently of source depth, weakly varying effective phase velocity values are observed, which exceed the sound velocity in water by 5-12%. We show that the angles of arrival of the equivalent plane wave and the oscillation velocity vector in the general case differ; however, they virtually coincide in the zone of the interference maximum of the sound pressure under the condition that the horizontal projections of the oscillation velocity appreciably exceed the value of the vertical projection. We give recommendations on using the sound field characteristics in zones with maximum values for solving rangefinding and signal-detection problems.

  6. Planning, implementation, and history of the first 5 years of operation of the Craig, Alaska, pool and school biomass heating system—a case study

    Treesearch

    Allen M. Brackley; K. Petersen

    2016-01-01

    A wood-based energy project in Craig, Alaska, to heat the community's aquatic center and two of its schools was the first such installation in Alaska to convert from fossil fuels to a renewable energy source. Initial interest in the project started in 2004. The system came online in April 2008. This report provides an overview of the new heating system's...

  7. Generation of Electrical Power from Stimulated Muscle Contractions Evaluated

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth; Kilgore, Kevin; Ercegovic, David B.

    2004-01-01

    This project is a collaborative effort between NASA Glenn Research Center's Revolutionary Aeropropulsion Concepts (RAC) Project, part of the NASA Aerospace Propulsion and Power Program of the Aerospace Technology Enterprise, and Case Western Reserve University's Cleveland Functional Electrical Stimulation (FES) Center. The RAC Project foresees implantable power requirements for future applications such as organically based sensor platforms and robotics that can interface with the human senses. One of the goals of the FES Center is to develop a totally implantable neural prosthesis. This goal is based on feedback from patients who would prefer a system with an internal power source over the currently used system with an external power source. The conversion system under investigation would transform the energy produced from a stimulated muscle contraction into electrical energy. We hypothesize that the output power of the system will be greater than the input power necessary to initiate, sustain, and control the electrical conversion system because of the stored potential energy of the muscle. If the system can be made biocompatible, durable, and with the potential for sustained use, then the biological power source will be a viable solution.

  8. General Mission Analysis Tool (GMAT): Mission, Vision, and Business Case

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.

    2007-01-01

    The goal of the GMAT project is to develop new space trajectory optimization and mission design technology by working inclusively with ordinary people, universities, businesses, and other government organizations, and to share that technology in an open and unhindered way. GMAT is a free and open source software system: free for anyone to use in the development of new mission concepts or to improve current missions, and freely available in source code form for enhancement or future technology development.

  9. ME 5620 Fracture Mechanics in Engineering Design. Case Study Project

    DTIC Science & Technology

    2011-04-03

    References: 1. A First Course in the Finite Element Method, 4th edition, by D.L. Logan, Thomson Engineering, 2006. 2. Altair ...

  10. 77 FR 38637 - Announcement of the Award of Single-Source Cooperative Agreement to Rubicon Programs, Inc., in...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-28

    ...-Prisoner Reentry activities to promote responsible fatherhood, family reunification, and economic stability... economic stability. The project will implement a program that includes comprehensive case management to... eliminate barriers to social and economic self-sufficiency for individuals preparing to reenter their...

  11. Persistent Teaching Practices after Geospatial Technology Professional Development

    ERIC Educational Resources Information Center

    Rubino-Hare, Lori A.; Whitworth, Brooke A.; Bloom, Nena E.; Claesgens, Jennifer M.; Fredrickson, Kristi M.; Sample, James C.

    2016-01-01

    This case study described teachers with varying technology skills who were implementing the use of geospatial technology (GST) within project-based instruction (PBI) at varying grade levels and contexts 1 to 2 years following professional development. The sample consisted of 10 fifth- to ninth-grade teachers. Data sources included artifacts,…

  12. An Experimental Comparison of Two Different Technetium Source Activities Which Can Imitate Thyroid Scintigraphy in Case of Thyroid Toxic Nodule

    PubMed Central

    Miftari, Ramë; Fejza, Ferki; Bicaj, Xhavit; Nura, Adem; Topciu, Valdete; Bajrami, Ismet

    2014-01-01

    Purpose: In cases of a toxic autonomous thyroid nodule, the anterior projection of a Tc-99m pertechnetate image shows a hot nodule that occupies most or all of the thyroid lobe, with near-total or total suppression of the contralateral lobe. In such cases it is very difficult to distinguish a toxic nodule from lobe agenesis. Our aim was to determine the activity ratio at which a high-activity source produces total suppression of a second, low-activity source under the same conditions as thyroid scintigraphy procedures. Material and methodology: Thyroid scintigraphy was performed with technetium-99m pertechnetate. A parallel-hole, high-resolution, low-energy collimator was used with an energy setting at the 140 keV photopeak of Tc-99m. Images were acquired at 200 kilocounts in the anterior projection with the collimator positioned close to the patient's extended neck (at a distance of approximately 18 cm). Scintigraphy of the thyroid gland was performed 15 minutes after intravenous administration of 1.5 mCi of Tc-99m pertechnetate. Tc-99m radioactive sources of different activities were used for two scintigraphy studies performed with the same acquisition procedures. In the first study, a standard high-activity source (A = 11.2 mCi) was compared with sources of variable activities (B = 1.33, 1.03, 0.7, 0.36, and 0.16 mCi) placed at a distance of 1.5 cm from each other, which is approximately the distance between the two thyroid lobes. In the second study, low-activity sources in a 70:1 proportion were compared (source A = 1.5 mCi and source B = 0.021 mCi). As clinical studies we selected two patients with different thyroid disorders: one with a toxic thyroid nodule in the right lobe, and a second with left thyroid lobe agenesis. 
Results: During our examination, we determined that two radioactive sources in a 70:1 proportion are displayed as a single source, with complete suppression of the low-activity source. We also found that covering the toxic nodule with a lead plaque allows visualization of activity in the suppressed lobe. Conclusion: Our study concluded that total lobe suppression in patients with a toxic thyroid nodule occurs when the toxic nodule has accumulated seventy times more radioactivity than the normal lobe. We also concluded that covering the toxic nodule with a lead plaque may permit visualization of radioactivity in the suppressed lobe. PMID:24825932

  13. Test-Case Generation using an Explicit State Model Checker Final Report

    NASA Technical Reports Server (NTRS)

    Heimdahl, Mats P. E.; Gao, Jimin

    2003-01-01

    In the project 'Test-Case Generation using an Explicit State Model Checker' we have extended an existing tools infrastructure for formal modeling to export Java code so that we can use the NASA Ames tool Java Pathfinder (JPF) for test case generation. We have completed a translator from our source language RSML(exp -e) to Java and conducted initial studies of how JPF can be used as a testing tool. In this final report, we provide a detailed description of the translation approach as implemented in our tools.

  14. Estimating the cost to U.S. health departments to conduct HIV surveillance.

    PubMed

    Shrestha, Ram K; Sansom, Stephanie L; Laffoon, Benjamin T; Farnham, Paul G; Shouse, R Luke; MacMaster, Karen; Hall, H Irene

    2014-01-01

    HIV case surveillance is a primary source of information for monitoring HIV burden in the United States and guiding the allocation of prevention and treatment funds. While the number of people living with HIV and the need for surveillance data have increased, little is known about the cost of surveillance. We estimated the economic cost to health departments of conducting high-quality HIV case surveillance. We collected primary data on the unit cost and quantity of resources used to operate the HIV case surveillance program in Michigan, where HIV burden (i.e., the number of HIV cases) is moderate to high (n=14,864 cases). Based on Michigan's data, we projected the expected annual HIV surveillance cost for U.S., state, local, and territorial health departments. We based our cost projection on the variation in the number of new and established cases, area-specific wages, and potential economies of scale. We estimated the annual total HIV surveillance cost to the Michigan health department to be $1,286,524 ($87/case), the annual total cost of new cases to be $108,657 ($133/case), and the annual total cost of established cases to be $1,177,867 ($84/case). Our projected median annual HIV surveillance cost per health department ranged from $210,600 in low-HIV burden sites to $1,835,000 in high-HIV burden sites. Our analysis shows that a systematic approach to costing HIV surveillance at the health department level is feasible. For HIV surveillance, a substantial portion of total surveillance costs is attributable to maintaining established cases.
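    The per-case figure reported above follows directly from dividing the annual program cost by the caseload. A minimal arithmetic check using the Michigan totals from the abstract (the rounding convention, nearest dollar, is an assumption):

    ```python
    # Reproduce the Michigan per-case surveillance cost from the totals above.
    total_cost = 1_286_524    # annual total HIV surveillance cost, dollars
    total_cases = 14_864      # HIV cases under surveillance in Michigan

    cost_per_case = total_cost / total_cases
    print(round(cost_per_case))   # → 87, the $87/case reported in the abstract
    ```

    The $133/case (new) and $84/case (established) figures follow the same division applied to the corresponding cost subtotals and case counts.
    
    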

  15. Sharing Lessons-Learned on Effective Open Data, Open-Source Practices from OpenAQ, a Global Open Air Quality Community.

    NASA Astrophysics Data System (ADS)

    Hasenkopf, C. A.

    2017-12-01

    Increasingly, open data, open-source projects are unearthing rich datasets and tools, previously impossible for more traditional avenues to generate. These projects are possible, in part, because of the emergence of online collaborative and code-sharing tools, decreasing costs of cloud-based services to fetch, store, and serve data, and increasing interest of individuals to contribute their time and skills to 'open projects.' While such projects have generated palpable enthusiasm from many sectors, many of these projects face uncharted paths for sustainability, visibility, and acceptance. Our project, OpenAQ, is an example of an open-source, open data community that is currently forging its own uncharted path. OpenAQ is an open air quality data platform that aggregates and universally formats government and research-grade air quality data from 50 countries across the world. To date, we make available more than 76 million air quality (PM2.5, PM10, SO2, NO2, O3, CO and black carbon) data points through an open Application Programming Interface (API) and a user-customizable download interface at https://openaq.org. The goal of the platform is to enable an ecosystem of users to advance air pollution efforts from science to policy to the private sector. The platform is also an open-source project (https://github.com/openaq) and has only been made possible through the coding and data contributions of individuals around the world. In our first two years of existence, we have seen requests for data to our API skyrocket to more than 6 million datapoints per month, and use-cases as varied as ingesting data aggregated from our system into real-time models of wildfires to building open-source statistical packages (e.g. ropenaq and py-openaq) on top of the platform to creating public-friendly apps and chatbots. 
We will share a whirlwind tour through our evolution and the many lessons learned so far related to platform structure, community engagement, organizational model, and sustainability.
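    The core of the platform described above is aggregation plus universal formatting of heterogeneous feeds. A minimal sketch of that normalization step is shown below; the input field names and the output schema are hypothetical illustrations, not OpenAQ's actual formats or API.

    ```python
    from datetime import datetime, timezone

    # Sketch of the "universal formatting" idea: map records from
    # heterogeneous sources onto one common schema. Field names and the
    # schema are hypothetical.
    def normalize(record, source):
        if source == "gov_csv":          # e.g. an hourly government feed
            return {
                "parameter": record["pollutant"].lower(),
                "value": float(record["conc"]),
                "unit": record["units"],
                "utc": record["timestamp_utc"],
            }
        if source == "research_json":    # e.g. a research-grade sensor feed
            return {
                "parameter": record["species"],
                "value": record["mean"],
                "unit": "µg/m³",
                "utc": datetime.fromtimestamp(record["epoch"],
                                              tz=timezone.utc).isoformat(),
            }
        raise ValueError(f"unknown source: {source}")

    row = normalize({"pollutant": "PM2.5", "conc": "12.4", "units": "µg/m³",
                     "timestamp_utc": "2017-07-01T00:00:00Z"}, "gov_csv")
    row2 = normalize({"species": "o3", "mean": 31.0,
                      "epoch": 1498867200}, "research_json")
    ```

    Once every source is reduced to one schema, serving the data through a single API and download interface becomes straightforward, which is the design choice the platform's ecosystem of downstream tools depends on.
    
    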

  16. The Competition Between a Localised and Distributed Source of Buoyancy

    NASA Astrophysics Data System (ADS)

    Partridge, Jamie; Linden, Paul

    2012-11-01

    We propose a new mathematical model to study the competition between localised and distributed sources of buoyancy within a naturally ventilated filling box. The main controlling parameters in this configuration are the buoyancy fluxes of the distributed and local source, specifically their ratio Ψ. The steady state dynamics of the flow are heavily dependent on this parameter. For large Ψ, where the distributed source dominates, we find the space becomes well mixed, as expected if driven by a distributed source alone. Conversely, for small Ψ we find the space reaches a stable two layer stratification. This is analogous to the classical case of a purely local source, but here the lower layer is buoyant compared to the ambient, due to the constant flux of buoyancy emanating from the distributed source. The ventilation flow rate, buoyancy of the layers and also the location of the interface height, which separates the two layer stratification, are obtainable from the model. To validate the theoretical model, small scale laboratory experiments were carried out. Water was used as the working medium with buoyancy being driven directly by temperature differences. Theoretical results were compared with experimental data and overall good agreement was found. A CASE award project with Arup.

  17. EWork in Southern Europe. IES Report.

    ERIC Educational Resources Information Center

    Altieri, G.; Birindelli, L.; Bracaglia, P.; Tartaglione, C.; Albarracin, D.; Vaquero, J.; Fissamber, V.

    Part of the EMERGENCE project to measure and map employment relocation in a global economy in the new communications environment, this report on eWork in southern Europe (SE) combines results of a European employer survey, case studies, and data from other sources. Chapter 1 analyzes national and sector dimensions. Chapter 2 studies eWork practice…

  18. Journeys to the Self: Using Movie Directors in the Classroom

    ERIC Educational Resources Information Center

    Alvarez, Jose Luis; Miller, Paddy; Levy, Jan; Svejenova, Silviya

    2004-01-01

    This article suggests that temporary (project based) filmmaking organizations, and film directors as their leaders, lend themselves to examining a plethora of leadership issues, from social sources of power to competencies in network organizations. It advances for classroom discussion and teaching the cases of Almodovar and Coppola as examples of…

  19. The generation of entangled states from independent particle sources

    NASA Technical Reports Server (NTRS)

    Rubin, Morton H.; Shih, Yan-Hua

    1994-01-01

    The generation of entangled states of two systems from product states is discussed for the case in which the paths of the two systems do not overlap. A particular method of measuring allows one to project out the nonlocal entangled state. An application to the production of four photon entangled states is outlined.

  20. Rendering of Foreign Language Inclusions in the Russian Translations of the Novels by Graham Greene

    ERIC Educational Resources Information Center

    Valeeva, Roza A.; Martynova, Irina N.

    2016-01-01

    The importance of the problem under discussion stems from the scientific search for the best ways of translating foreign language inclusions so that they suit the original narration stylistically, emotionally, and conceptually, and fully convey the author's communicative intention in every particular case. The…

  1. An alternative subspace approach to EEG dipole source localization

    NASA Astrophysics Data System (ADS)

    Xu, Xiao-Liang; Xu, Bobby; He, Bin

    2004-01-01

    In the present study, we investigate a new approach to electroencephalography (EEG) three-dimensional (3D) dipole source localization by using a non-recursive subspace algorithm called FINES. In estimating source dipole locations, the present approach employs projections onto a subspace spanned by a small set of particular vectors (FINES vector set) in the estimated noise-only subspace instead of the entire estimated noise-only subspace in the case of classic MUSIC. The subspace spanned by this vector set is, in the sense of principal angle, closest to the subspace spanned by the array manifold associated with a particular brain region. By incorporating knowledge of the array manifold in identifying FINES vector sets in the estimated noise-only subspace for different brain regions, the present approach is able to estimate sources with enhanced accuracy and spatial resolution, thus enhancing the capability of resolving closely spaced sources and reducing estimation errors. The present computer simulations show, in EEG 3D dipole source localization, that compared to classic MUSIC, FINES has (1) better resolvability of two closely spaced dipolar sources and (2) better estimation accuracy of source locations. In comparison with RAP-MUSIC, FINES' performance is also better for the cases studied when the noise level is high and/or correlations among dipole sources exist.
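    The subspace idea underlying both MUSIC and FINES, scanning candidate sources against an estimated noise-only subspace, can be illustrated with a toy classic-MUSIC scan. The sketch below uses a uniform linear array as a stand-in for the EEG lead field; the geometry, noise level, and source directions are illustrative assumptions, and this is plain MUSIC, not the FINES refinement.

    ```python
    import numpy as np

    # Toy MUSIC-style scan: project steering vectors onto the estimated
    # noise-only subspace and look for directions where the projection is
    # small (peaks of the pseudospectrum).
    rng = np.random.default_rng(1)
    n_sensors, n_snapshots, n_sources = 8, 200, 2
    true_angles = np.deg2rad([20.0, 60.0])         # true source directions

    def steering(theta, n=n_sensors):
        # uniform linear array, half-wavelength element spacing
        return np.exp(1j * np.pi * np.arange(n) * np.sin(theta))

    A = np.column_stack([steering(t) for t in true_angles])
    S = (rng.normal(size=(n_sources, n_snapshots))
         + 1j * rng.normal(size=(n_sources, n_snapshots)))
    noise = 0.1 * (rng.normal(size=(n_sensors, n_snapshots))
                   + 1j * rng.normal(size=(n_sensors, n_snapshots)))
    X = A @ S + noise

    R = X @ X.conj().T / n_snapshots          # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)      # eigenvalues in ascending order
    En = eigvecs[:, :-n_sources]              # estimated noise-only subspace

    grid = np.deg2rad(np.linspace(0.0, 90.0, 181))
    spectrum = np.array([1.0 / np.linalg.norm(En.conj().T @ steering(t))**2
                         for t in grid])
    best_deg = np.rad2deg(grid[np.argmax(spectrum)])
    ```

    FINES differs from this classic scan in that, instead of projecting onto the entire estimated noise-only subspace `En`, it projects onto a small vector set within it chosen (by principal angle) for a particular source region, which is what yields the improved resolution reported above.
    
    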

  2. Free and Open Source Software for Geospatial in the field of planetary science

    NASA Astrophysics Data System (ADS)

    Frigeri, A.

    2012-12-01

    Information technology applied to geospatial analyses has spread quickly in the last ten years. The availability of OpenData and data from collaborative mapping projects has increased interest in tools, procedures, and methods to handle spatially related information. Free Open Source Software projects devoted to geospatial data handling are achieving considerable success, as the use of interoperable formats and protocols allows the user to choose the pipeline of tools and libraries needed to solve a particular task, adapting the software scene to a specific problem. In particular, the Free Open Source model of development mimics the scientific method very well, and researchers should be naturally encouraged to take part in the development process of these software projects, as this represents a very agile way to interact among several institutions. When it comes to planetary sciences, geospatial Free Open Source Software is gaining a key role in projects that commonly involve different subjects in an international scenario. Very popular software suites for processing scientific mission data (for example, ISIS) and for navigation/planning (SPICE) are distributed along with their source code, and the interaction between user and developer is often very close, creating a continuum between these two roles. A widely used library for handling geospatial data (GDAL) has started to support planetary data from the Planetary Data System, and recent contributions have enabled support for other popular data formats used in planetary science, such as VICAR. The use of Geographic Information Systems in planetary science is now widespread, and Free Open Source GIS, open GIS formats, and network protocols make it possible to extend existing tools and methods developed to solve Earth-based problems to the study of solar system bodies. 
A day in the working life of a researcher using Free Open Source Software for geospatial work will be presented, along with the benefits of, and solutions to possible drawbacks arising from, the effort required to use, support, and contribute to these projects.

  3. BASINs and WEPP Climate Assessment Tools (CAT): Case ...

    EPA Pesticide Factsheets

    EPA announced the release of the final report, BASINs and WEPP Climate Assessment Tools (CAT): Case Study Guide to Potential Applications. This report supports application of two recently developed water modeling tools, the Better Assessment Science Integrating point & Non-point Sources (BASINS) model and the Water Erosion Prediction Project Climate Assessment Tool (WEPPCAT). The report presents a series of short, illustrative case studies designed to demonstrate the capabilities of these tools for conducting scenario-based assessments of the potential effects of climate change on streamflow and water quality.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Liu, Xiaobing

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects have been competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This paper highlights the findings of a case study of one of the ARRA-funded GSHP demonstration projects, a ground-source variable refrigerant flow (GS-VRF) system installed at the Human Health Building at Oakland University in Rochester, Michigan. This case study is based on the analysis of measured performance data, maintenance records, construction costs, and simulations of the energy consumption of conventional central heating, ventilation, and air-conditioning (HVAC) systems providing the same level of space conditioning as the demonstrated GS-VRF system. The evaluated performance metrics include the energy efficiency of the heat pump equipment and the overall GS-VRF system, pumping performance, energy savings, carbon emission reductions, and cost-effectiveness of the GS-VRF system compared with conventional HVAC systems. This case study also identified opportunities for reducing uncertainties in the performance evaluation, improving the operational efficiency, and reducing the installed cost of similar GSHP systems in the future.
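    The efficiency and savings metrics evaluated in such case studies reduce to simple ratios. A sketch with placeholder numbers follows; none of these values are measurements from the Oakland University project.

    ```python
    # Illustrative seasonal performance calculation for a heat pump system.
    # COP = useful heat delivered / electricity consumed. All figures below
    # are made-up placeholders for demonstration only.
    heat_delivered_kwh = 120_000.0   # thermal energy supplied over a season
    electricity_kwh = 30_000.0       # compressor + pump electricity input

    seasonal_cop = heat_delivered_kwh / electricity_kwh   # 4.0 for these inputs

    # Energy savings relative to a conventional system serving the same load:
    baseline_kwh = 55_000.0          # assumed conventional HVAC consumption
    savings_pct = 100 * (baseline_kwh - electricity_kwh) / baseline_kwh
    ```

    In an actual evaluation like the one above, the numerator and denominator come from metered performance data, and the baseline from simulations of a conventional HVAC system serving the same load.
    
    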

  5. Guidance and Control Software Project Data - Volume 2: Development Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the development documents from the GCS project. Volume 2 contains three appendices: A. Guidance and Control Software Development Specification; B. Design Description for the Pluto Implementation of the Guidance and Control Software; and C. Source Code for the Pluto Implementation of the Guidance and Control Software

  6. Disaster Risk Reduction through Innovative Uses of Crowd Sourcing (Invited)

    NASA Astrophysics Data System (ADS)

    Berger, J.; Greene, M.

    2010-12-01

    Crowd sourcing can be described as a method of distributed problem-solving. It takes advantage of the power of the crowd, which can in some cases be a community of experts and in other cases the collective insight of a broader range of contributors with varying degrees of domain knowledge. The term crowd sourcing was first used by Jeff Howe in a June 2006 Wired magazine article “The Rise of Crowdsourcing,” and is a combination of the terms “crowd” and “outsourcing.” Some commonly known examples of crowd sourcing, in its broadest sense, include Wikipedia, distributed participatory design projects, and consumer websites such as Yelp and Angie’s List. The popularity and success of early large-scale crowd sourcing activities are made possible by leveraging Web 2.0 technologies that allow for mass participation from distributed individuals. The Earthquake Engineering Research Institute (EERI) in Oakland, California recently participated in two crowd sourcing projects. One was initiated and coordinated by EERI, while in the second case EERI was invited to contribute once the crowd sourcing activity was underway. In both projects there was: 1) the determination of a problem or set of tasks that could benefit immediately from the engagement of an informed volunteer group of professionals; 2) a segmenting of the problem into discrete pieces that could be completed in a short period of time (from ten minutes to four hours); 3) a call to action, where an interested community was made aware of the project; and 4) the collection, aggregation, vetting and ultimately distribution of the results in a relatively short period of time. The first EERI crowd sourcing example was the use of practicing engineers and engineering students in California to help estimate the number of pre-1980 concrete buildings in the high seismic risk counties in the state. 
This building type is known to perform poorly in earthquakes, and state officials were interested in understanding more about the size of the problem—how many buildings, which jurisdictions. Volunteers signed up for individual jurisdictions and used a variety of techniques to estimate the count. They shared their techniques at meetings and posted their results online. Over 100 volunteers also came together to walk the streets of downtown San Francisco, a city with a particularly large number of these buildings, gathering more data on each building that will be used in a later phase to identify possible mitigation strategies. The second example was EERI’s participation in a response network, GEO-CAN, created in support of the World Bank’s responsibility in the damage assessment of buildings in Port-au-Prince immediately after the January 12, 2010 earthquake. EERI members, primarily earthquake engineers, were invited to speed up critical damage assessment using pre- and post-event aerial imagery. An area of 300 sq km was divided into grids, and grids were then allocated to knowledgeable individuals for analysis. The initial analysis was completed within 96 hours through the participation of over 300 volunteers. Ultimately, over 600 volunteers completed damage assessments for about 30,000 buildings.

  7. Future trends of global atmospheric antimony emissions from anthropogenic activities until 2050

    NASA Astrophysics Data System (ADS)

    Zhou, Junrui; Tian, Hezhong; Zhu, Chuanyong; Hao, Jiming; Gao, Jiajia; Wang, Yong; Xue, Yifeng; Hua, Shenbin; Wang, Kun

    2015-11-01

    This paper presents a scenario forecast of global atmospheric antimony (Sb) emissions from anthropogenic activities until 2050. The projection scenarios are built on the comprehensive global antimony emission inventory for the period 1995-2010 reported in our previous study. Three scenarios are set up to investigate future changes in global antimony emissions as well as their source and regional contribution characteristics. Trends in activity levels, specified for five primary source categories, are projected by combining historical trend extrapolation with the EIA International Energy Outlook 2013, while source-specific dynamic emission factors are determined by applying transformed normal distribution functions. If no major changes in the efficiency of emission control are introduced and current air-quality legislation is maintained (Current Legislation scenario), global antimony emissions will increase by a factor of 2 between 2010 and 2050. The largest increase in Sb emissions is projected for Asia, due to the large volume of nonferrous metals production and waste incineration. If pollutant emission standards are enforced (Strengthened Control scenario), global antimony emissions in 2050 will stabilize at the 2010 level. Moreover, further declines in Sb emissions can be anticipated for all continents under the best emission control performance (Maximum Feasible Technological Reduction scenario). Future antimony emissions from the top 10 largest emitting countries have also been calculated, and the source-category contributions to these countries' increasing emissions differ significantly. Furthermore, global emission projections for 2050 are distributed on a 1° × 1° latitude/longitude grid. East Asia, Western Europe and North America show remarkable differences in emission intensity under the three scenarios, which implies that source- and country-specific control measures need to be implemented to abate Sb emissions from the various continents and countries in the future.
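The scenario logic summarized above (extrapolated activity levels multiplied by dynamically tightening emission factors, with scenarios differing only in control stringency) can be sketched as follows. All numeric values below are illustrative placeholders chosen to reproduce the three qualitative outcomes, not the study's data:

```python
# Illustrative sketch of scenario-based emission projection:
# emissions = activity level x dynamic emission factor, per source category.
# Growth rates and factor-decline rates are hypothetical placeholders.

def project_emissions(base_activity, growth_rate, base_ef, ef_decline, years):
    """Project annual emissions over `years` from a base-year value."""
    results = []
    for t in range(years + 1):
        activity = base_activity * (1 + growth_rate) ** t  # extrapolated activity
        ef = base_ef * (1 - ef_decline) ** t               # tightening emission factor
        results.append(activity * ef)
    return results

# Stylized control scenarios differ only in how fast emission factors
# decline: Current Legislation (CLE), Strengthened Control (SC),
# Maximum Feasible Technological Reduction (MFTR).
scenarios = {"CLE": 0.000, "SC": 0.017, "MFTR": 0.035}
for name, decline in scenarios.items():
    series = project_emissions(base_activity=100.0, growth_rate=0.018,
                               base_ef=1.0, ef_decline=decline, years=40)
    print(name, round(series[-1], 1))
```

With these placeholder rates the 40-year horizon reproduces the abstract's pattern: roughly a doubling under CLE, near-stabilization under SC, and a decline under MFTR.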

  8. The importance of using open source technologies and common standards for interoperability within eHealth: Perspectives from the Millennium Villages Project

    PubMed Central

    Borland, Rob; Barasa, Mourice; Iiams-Hauser, Casey; Velez, Olivia; Kaonga, Nadi Nina; Berg, Matt

    2013-01-01

    The purpose of this paper is to illustrate the importance of using open source technologies and common standards for interoperability when implementing eHealth systems and illustrate this through case studies, where possible. The sources used to inform this paper draw from the implementation and evaluation of the eHealth Program in the context of the Millennium Villages Project (MVP). As the eHealth Team was tasked to deploy an eHealth architecture, the Millennium Villages Global-Network (MVG-Net), across all fourteen of the MVP sites in Sub-Saharan Africa, the team recognized the need for standards and uniformity but also realized that context would be an important factor. Therefore, the team decided to utilize open source solutions. The MVP implementation of MVG-Net provides a model for those looking to implement informatics solutions across disciplines and countries. Furthermore, there are valuable lessons learned that the eHealth community can benefit from. By sharing lessons learned and developing an accessible, open-source eHealth platform, we believe that we can more efficiently and rapidly achieve the health-related and collaborative Millennium Development Goals (MDGs). PMID:22894051

  9. Studies on the Effects of High Renewable Penetrations on Driving Point Impedance and Voltage Regulator Performance: National Renewable Energy Laboratory/Sacramento Municipal Utility District Load Tap Changer Driving Point Impedance Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nagarajan, Adarsh; Coddington, Michael H.; Brown, David

    Voltage regulators perform as desired when regulating from the source to the load and when regulating from a strong source (utility) to a weak source (distributed generation). (See the glossary for definitions of a strong source and weak source.) Even when the control is provisioned for reverse operation, it has been observed that tap-changing voltage regulators do not perform as desired in reverse when attempting regulation from the weak source to the strong source. The region of performance that is not as well understood is the regulation between sources that are approaching equal strength. As part of this study, we explored all three scenarios: regulator control from a strong source to a weak source (classic case), control from a weak source to a strong source (during reverse power flow), and control between equivalent sources.

  10. CoopEUS Case Study: Tsunami Modelling and Early Warning Systems for Near Source Areas (Mediterranean, Juan de Fuca).

    NASA Astrophysics Data System (ADS)

    Beranzoli, Laura; Best, Mairi; Chierici, Francesco; Embriaco, Davide; Galbraith, Nan; Heeseman, Martin; Kelley, Deborah; Pirenne, Benoit; Scofield, Oscar; Weller, Robert

    2015-04-01

    There is a need for tsunami modeling and early warning systems for near-source areas. Tsunamis are a common public-safety threat, for example, in the Mediterranean and along the Juan de Fuca/NE Pacific coast of North America, regions covered by the EMSO, OOI, and ONC ocean observatories. Through the CoopEUS international cooperation project, a number of environmental research infrastructures have come together to coordinate efforts on environmental challenges; this tsunami case study tackles one such challenge. There is a mutual need for tsunami event field data and modeling to deepen our experience in testing methodology and developing real-time data processing. Tsunami field data are already available for past events; part of this use case compares these data for compatibility, gap analysis, and model groundtruthing. It also reviews the sensors needed and harmonizes instrument settings. Sensor metadata and registries are compared, harmonized, and aligned. Data policies and access are also compared and assessed for gaps. Modelling algorithms are compared and tested against archived and real-time data. This case study will then be extended to other related tsunami data and model sources globally with similar geographic and seismic scenarios.

  11. Defense Small Business Innovation Research Program (SBIR). Volume 4. Defense Agency Projects, Abstracts of Phase 1 Awards from FY 1989 SBIR Solicitation

    DTIC Science & Technology

    1990-04-01

    ...explosive activity. Findings and measurements from each image will be combined in a geographic information data base. Various image and map projects will be... A proposal of land mine detection by a nuclear activation method is based on a new, extremely intense, compact pulsed source of 14.1 MeV neutrons (with a... ...conventional knowledge-based systems. TOPIC# 38 OFFICE: PM/SBIR IDENT#: 33862. Case-based reasoning (CBR) represents a powerful new paradigm for building expert...

  12. Investigating Causality Between Interacting Brain Areas with Multivariate Autoregressive Models of MEG Sensor Data

    PubMed Central

    Michalareas, George; Schoffelen, Jan-Mathijs; Paterson, Gavin; Gross, Joachim

    2013-01-01

    Abstract In this work, we investigate the feasibility of estimating causal interactions between brain regions based on multivariate autoregressive (MAR) models fitted to magnetoencephalographic (MEG) sensor measurements. We first demonstrate the theoretical feasibility of estimating source-level causal interactions after projection of the sensor-level model coefficients onto the locations of the neural sources. Next, we show with simulated MEG data that causality, as measured by partial directed coherence (PDC), can be correctly reconstructed if the locations of the interacting brain areas are known. We further demonstrate that, if a very large number of brain voxels is considered as potential activation sources, PDC is a less accurate measure for reconstructing causal interactions; in that case the MAR model coefficients alone contain meaningful causality information. The proposed method overcomes the problems of model non-robustness and long computation times encountered during causality analysis by existing methods, which first project MEG sensor time-series onto a large number of brain locations and then build the MAR model on this large number of source-level time-series. Instead, we demonstrate that by building the MAR model at the sensor level and then projecting only the MAR coefficients into source space, the true causal pathways are recovered even when a very large number of locations are considered as sources. The main contribution of this work is that with this methodology entire-brain causality maps can be efficiently derived without any a priori selection of regions of interest. Hum Brain Mapp, 2013. © 2012 Wiley Periodicals, Inc. PMID:22328419
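The core idea, fitting the MAR model once on sensor time-series and projecting only the coefficient matrices into source space, can be sketched with ordinary least squares. The projection matrix W below is a random stand-in for a real inverse operator (e.g. a beamformer or minimum-norm solution), and all dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated sensor data: n_sensors channels, n_samples time points.
n_sensors, n_sources, n_samples, order = 8, 4, 1000, 2
X = rng.standard_normal((n_sensors, n_samples))

def fit_mar(X, order):
    """Fit a sensor-level MAR model X[:, t] ~ sum_k A_k @ X[:, t-k]
    by least squares; returns the list of lag coefficient matrices A_k."""
    n, T = X.shape
    # Lagged regressor matrix, stacked lag-1 first: (n*order, T-order)
    Z = np.vstack([X[:, order - k:T - k] for k in range(1, order + 1)])
    Y = X[:, order:]
    A = Y @ Z.T @ np.linalg.pinv(Z @ Z.T)  # stacked coefficients (n, n*order)
    return [A[:, k * n:(k + 1) * n] for k in range(order)]

A_sensor = fit_mar(X, order)

# Project only the coefficients into source space: with s = W @ x,
# the source-level dynamics are approximately W @ A_k @ pinv(W).
W = rng.standard_normal((n_sources, n_sensors))  # hypothetical inverse operator
A_source = [W @ A @ np.linalg.pinv(W) for A in A_sensor]

print([A.shape for A in A_source])  # source-space matrices, one per lag
```

Note the computational point the abstract makes: the model is fitted once in the (small) sensor space, and only the order-many coefficient matrices are projected, rather than projecting the full time-series to thousands of voxels first.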

  13. Management of Globally Distributed Software Development Projects in Multiple-Vendor Constellations

    NASA Astrophysics Data System (ADS)

    Schott, Katharina; Beck, Roman; Gregory, Robert Wayne

    Global information systems development outsourcing is an apparent trend that is expected to continue in the foreseeable future. IS-related services are increasingly provided not only from different geographical sites simultaneously but also by multiple service providers based in different countries. The purpose of this paper is to understand how the involvement of multiple service providers affects the management of globally distributed information systems development projects. As research on this topic is scarce, we applied an exploratory, in-depth single-case study design as our research approach. The case we analyzed comprises a global software development outsourcing project initiated by a German bank together with several globally distributed vendors. For data collection and analysis we adopted techniques suggested by the grounded theory method. Whereas the extant literature points out the increased management overhead associated with multi-sourcing, the analysis of our case suggests that the effort required to manage global outsourcing projects with multiple vendors depends, among other things, on the maturity of the cooperation within the vendor portfolio. Furthermore, our data indicate that this interplay maturity is positively influenced by knowledge about the client derived from already existing client-vendor relationships. The paper concludes by offering theoretical and practical implications.

  14. Policy and practice impacts of applied research: a case study analysis of the New South Wales Health Promotion Demonstration Research Grants Scheme 2000–2006

    PubMed Central

    2013-01-01

    Background Intervention research provides important information regarding feasible and effective interventions for health policy makers, but few empirical studies have explored the mechanisms by which these studies influence policy and practice. This study provides an exploratory case series analysis of the policy, practice and other related impacts of the 15 research projects funded through the New South Wales Health Promotion Demonstration Research Grants Scheme during the period 2000 to 2006, and explored the factors mediating impacts. Methods Data collection included semi-structured interviews with the chief investigators (n = 17) and end-users (n = 29) of each of the 15 projects to explore if, how and under what circumstances the findings had been used, as well as bibliometric analysis and verification using documentary evidence. Data analysis involved thematic coding of interview data and triangulation with other data sources to produce case summaries of impacts for each project. Case summaries were then individually assessed against four impact criteria and discussed at a verification panel meeting where final group assessments of the impact of research projects were made and key influences of research impact identified. Results Funded projects had variable impacts on policy and practice. Project findings were used for agenda setting (raising awareness of issues), identifying areas and target groups for interventions, informing new policies, and supporting and justifying existing policies and programs across sectors. Reported factors influencing the use of findings were: i) nature of the intervention; ii) leadership and champions; iii) research quality; iv) effective partnerships; v) dissemination strategies used; and, vi) contextual factors. Conclusions The case series analysis provides new insights into how and under what circumstances intervention research is used to influence real world policy and practice. 
The findings highlight that intervention research projects can achieve the greatest policy and practice impacts if they address proximal needs of the policy context by engaging end-users from the inception of projects, utilizing existing policy networks and structures, and using a range of strategies to disseminate findings that go beyond traditional peer-reviewed publications. PMID:23374280

  15. A Child Abuse Assessment Center: Alternative Investigative Approaches.

    ERIC Educational Resources Information Center

    Hiester, Douglas S.

    A child abuse assessment center was created in Dade County, Florida, and was funded by state and local government sources. Staff includes a project director, two clinical social workers, a follow-up case monitor, clerical support, and a psychologist. The center attempts to minimize trauma to the child victim of sexual and physical abuse by a…

  16. Inclusive Work at a European Level: A Case Study

    ERIC Educational Resources Information Center

    Stephenson, Paul; Rumley, Glynis

    2005-01-01

    In this article, Paul Stephenson and Glynis Rumley describe the way in which educators in Kent have developed strong links with their colleagues and neighbours from Nord Pas de Calais in France. From a variety of projects undertaken, some of which were assisted by funding from European sources, children of all abilities and needs have been able to…

  17. Projecting the Water Footprint Associated with Shale Resource Production: Eagle Ford Shale Case Study.

    PubMed

    Ikonnikova, Svetlana A; Male, Frank; Scanlon, Bridget R; Reedy, Robert C; McDaid, Guinevere

    2017-12-19

    Production of oil from shale and tight reservoirs accounted for almost 50% of 2016 total U.S. production and is projected to continue growing. The objective of our analysis was to quantify the water outlook for future shale oil development using the Eagle Ford Shale as a case study. We developed a water outlook model that projects water use for hydraulic fracturing (HF) and flowback and produced water (FP) volumes based on expected energy prices; historical oil, natural gas, and water-production decline data per well; projected well spacing; and well economics. The number of wells projected to be drilled in the Eagle Ford through 2045 is almost linearly related to oil price, ranging from 20 000 wells at $30/barrel (bbl) oil to 97 000 wells at $100/bbl oil. Projected FP water volumes range from 20% to 40% of HF across the play. Our base reference oil price of $50/bbl would result in 40 000 additional wells and related HF of 265 × 10⁹ gal and FP of 85 × 10⁹ gal. The presented water outlooks for HF and FP water volumes can be used to assess future water sourcing and wastewater disposal or reuse, and to inform policy discussions.
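The price-to-water arithmetic reported in the abstract can be approximated by linear interpolation between the stated endpoints (20 000 wells at $30/bbl, 97 000 wells at $100/bbl; the relationship is only "almost" linear, so the $50/bbl interpolation lands near, not exactly on, the reported 40 000 wells). The per-well HF volume and FP fraction below are back-calculated assumptions from the abstract's totals, not values from the model itself:

```python
# Hedged sketch of the play-level water arithmetic in the abstract.
# Per-well HF volume and FP fraction are back-calculated assumptions.

GAL_PER_WELL_HF = 265e9 / 40_000   # ~6.6 million gal/well, from the $50/bbl case
FP_FRACTION = 85e9 / 265e9         # ~0.32, within the stated 20-40% range

def wells_for_price(price):
    """Linear interpolation of projected well count vs oil price ($/bbl)."""
    return 20_000 + (97_000 - 20_000) * (price - 30) / (100 - 30)

def water_outlook(price):
    wells = wells_for_price(price)
    hf = wells * GAL_PER_WELL_HF   # hydraulic fracturing water demand (gal)
    fp = hf * FP_FRACTION          # flowback/produced water volume (gal)
    return wells, hf, fp

wells, hf, fp = water_outlook(50)
print(f"{wells:.0f} wells, HF {hf / 1e9:.0f}e9 gal, FP {fp / 1e9:.0f}e9 gal")
```

Such a back-of-the-envelope outlook is useful for sanity-checking the reported totals; the actual model also accounts for per-well decline curves, well spacing, and well economics.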

  18. STS Case Study Development Support

    NASA Technical Reports Server (NTRS)

    Rosa de Jesus, Dan A.; Johnson, Grace K.

    2013-01-01

    The Shuttle Case Study Collection (SCSC) has been developed using lessons learned documented by NASA engineers, analysts, and contractors. The SCSC provides educators with a new tool to teach real-world engineering processes with the goal of providing unique educational materials that enhance critical thinking, decision-making and problem-solving skills. During this third phase of the project, responsibilities included the revision of the Hyper Text Markup Language (HTML) source code to ensure all pages follow World Wide Web Consortium (W3C) standards, and the addition and editing of website content, including text, documents, and images. Basic HTML knowledge was required, as was basic knowledge of photo editing software, and training to learn how to use NASA's Content Management System for website design. The outcome of this project was its release to the public.

  19. Experimental evaluation of the ring focus test for X-ray telescopes using AXAF's technology mirror assembly, MSFC CDDF Project No. H20

    NASA Technical Reports Server (NTRS)

    Zissa, D. E.; Korsch, D.

    1986-01-01

    A test method particularly suited for X-ray telescopes was evaluated experimentally. The method makes use of a focused ring formed by an annular aperture when using a point source at a finite distance. This would supplement measurements of the best focus image, which is blurred when the test source is at a finite distance. The telescope used was the Technology Mirror Assembly of the Advanced X-ray Astrophysics Facility (AXAF) program. Observed ring image defects could be related to the azimuthal location of their sources in the telescope even though in this case the predicted sharp ring was obscured by scattering, finite source size, and residual figure errors.

  20. Meteor Beliefs Project: The Palladium in ancient and early Medieval sources

    NASA Astrophysics Data System (ADS)

    McBeath, A. Alistair; Gheorghe, A. D.

    2004-08-01

    An examination of the, apparently meteoritic, object, anciently called the Palladium after the Greek goddess Pallas Athene, is presented, as discussed in various ancient and early medieval sources. Although made of wood, the Palladium was believed to have fallen from the sky. In myths, it was a powerful totemic object, first at the legendary city of Troy, then later at Rome, and had magically protective properties associated with it. Despite its implausibly meteoritic nature, the Palladium can be suggested as supporting the case for ancient meteorite worship.

  1. A Business Case Study of Open Source Software

    DTIC Science & Technology

    2001-07-01

    LinuxPPC (www.linuxppc.com); MandrakeSoft Linux-Mandrake (www.linux-mandrake.com/en/); CLE Project CLE (cle.linux.org.tw/CLE/e_index.shtml); Red Hat...; Coyote Linux (www2.vortech.net/coyte/coyte.htm); MNIS (www.mnis.fr); Data-Portal (www.data-portal.com); Mr O’s Linux Emporium (www.ouin.com); DLX Linux (www.wu... [Figure 11: Worldwide New Linux Shipments (Client and Server), shipments in millions, 1998-1999. Source: IDC, 2000.] 3.2.2 Market

  2. 4. Photographic copy of photograph. (Source: U.S. Department of Interior. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    4. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. MAIN (TITLED FLORENCE) CANAL, CHINA WASH FLUME, 5/13/25 - San Carlos Irrigation Project, China Wash Flume, Main (Florence-Casa Grande) Canal at Station 137+00, T4S, R10E, S14, Coolidge, Pinal County, AZ

  3. 5. Photographic copy of photograph. (Source: U.S. Department of Interior. ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Photographic copy of photograph. (Source: U.S. Department of Interior. Office of Indian Affairs. Indian Irrigation Service. Annual Report, Fiscal Year 1925. Vol. I, Narrative and Photographs, Irrigation District #4, California and Southern Arizona, RG 75, Entry 655, Box 28, National Archives, Washington, DC.) Photographer unknown. MAIN (TITLED FLORENCE) CANAL, CHINA WASH FLUME, 5/13/25 - San Carlos Irrigation Project, China Wash Flume, Main (Florence-Casa Grande) Canal at Station 137+00, T4S, R10E, S14, Coolidge, Pinal County, AZ

  4. From climate-change spaghetti to climate-change distributions for 21st Century California

    USGS Publications Warehouse

    Dettinger, M.D.

    2005-01-01

    The uncertainties associated with climate-change projections for California are unlikely to disappear any time soon, and yet important long-term decisions will be needed to accommodate those potential changes. Projection uncertainties have typically been addressed by analysis of a few scenarios, chosen based on availability or to capture the extreme cases among available projections. However, by focusing on more common projections rather than the most extreme projections (using a new resampling method), new insights into current projections emerge: (1) uncertainties associated with future greenhouse-gas emissions are comparable with the differences among climate models, so that neither source of uncertainties should be neglected or underrepresented; (2) twenty-first century temperature projections spread more, overall, than do precipitation scenarios; (3) projections of extremely wet futures for California are true outliers among current projections; and (4) current projections that are warmest tend, overall, to yield a moderately drier California, while the cooler projections yield a somewhat wetter future. The resampling approach applied in this paper also provides a natural opportunity to objectively incorporate measures of model skill and the likelihoods of various emission scenarios into future assessments.
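The resampling idea described above, characterizing the distribution of projected outcomes rather than a few hand-picked extreme scenarios, can be sketched as follows. The ensemble values and (uniform) skill weights are hypothetical placeholders; a real assessment could weight projections by hindcast skill or emission-scenario likelihood, as the paper suggests:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble of 21st-century projections for California:
# each row is (temperature change in degC, precipitation change in %).
ensemble = np.array([
    [1.5,  -2.0], [2.2, -5.0], [3.1, -8.0], [2.8,  1.0],
    [4.0, -12.0], [1.9,  3.0], [2.5, -4.0], [3.4, -6.0],
])

# Skill/likelihood weights (uniform here, purely for illustration).
weights = np.ones(len(ensemble)) / len(ensemble)

# Resample the ensemble to estimate the distribution of outcomes.
idx = rng.choice(len(ensemble), size=10_000, p=weights)
samples = ensemble[idx]

t_med = np.percentile(samples[:, 0], 50)
p_range = np.percentile(samples[:, 1], [10, 90])
print(f"median warming ~{t_med:.1f} degC; 10-90% precip change {p_range}")
```

Reading off central percentiles of the resampled distribution, instead of the ensemble's minimum and maximum, is what lets the analysis flag extremely wet futures as outliers rather than treating them as bounding scenarios.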

  5. Case Study for the ARRA-funded Ground Source Heat Pump (GSHP) Demonstration at Wilders Grove Solid Waste Service Center in Raleigh, NC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Xiaobing; Malhotra, Mini; Xiong, Zeyu

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects have been competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This paper highlights the findings of a case study of one of the ARRA-funded GSHP demonstration projects, a distributed GSHP system for providing all the space conditioning, outdoor air ventilation, and 100% domestic hot water tomore » the Wilders Grove Solid Waste Service Center of City of Raleigh, North Carolina. This case study is based on the analysis of measured performance data, construction costs, and simulations of the energy consumption of conventional central heating, ventilation, and air-conditioning (HVAC) systems providing the same level of space conditioning and outdoor air ventilation as the demonstrated GSHP system. The evaluated performance metrics include the energy efficiency of the heat pump equipment and the overall GSHP system, pumping performance, energy savings, carbon emission reductions, and cost-effectiveness of the GSHP system compared with conventional HVAC systems. This case study also identified opportunities for reducing uncertainties in the performance evaluation and improving the operational efficiency of the demonstrated GSHP system.« less

  6. Exploring English Language Learners (ELL) experiences with scientific language and inquiry within a real life context

    NASA Astrophysics Data System (ADS)

    Algee, Lisa M.

    English Language Learners (ELL) are often at a distinct disadvantage in receiving authentic science learning opportunities. This study explored ELL learning experiences with scientific language and inquiry within a real-life context. This research was theoretically informed by sociocultural theory and literature on student learning and science teaching for ELL. A qualitative case study was used to explore students' learning experiences. Data from multiple sources were collected: student interviews, science letters, an assessment in another context, field notes, student presentations, inquiry assessment, instructional group conversations, parent interviews, parent letters, parent homework, teacher-researcher evaluation, teacher-researcher reflective journal, and student ratings of learning activities. These data sources informed the following research questions: (1) Does participation in an out-of-school contextualized inquiry science project increase ELL use of scientific language? (2) Does participation in an out-of-school contextualized inquiry science project increase ELL understanding of scientific inquiry and their motivation to learn? (3) What are parents' funds of knowledge about the local ecology, and do these inform students' experiences in the science project? All data sources concerning students were analyzed for similar patterns and trends, and triangulation was sought through the use of these data sources. The remaining data sources concerning the teacher-researcher were used to inform and assess whether the pedagogical and research practices were in alignment with the proposed theoretical framework. Data sources concerning parental participation accessed funds of knowledge, which informed the curriculum in order to create continuity and connections between home and school. To ensure accuracy in the researcher's interpretations of student and parent responses during interviews, member checking was employed. The findings suggest that participation in an out-of-school contextualized inquiry science project increased ELL use of scientific language, understanding of scientific inquiry, and motivation to learn. In addition, parents' funds of knowledge informed students' experiences in the science project. These findings suggest that the learning and teaching practices and the real-life experiential learning contexts served as an effective means of increasing students' understanding and motivation to learn.

  7. The State of Open Source Electronic Health Record Projects: A Software Anthropology Study

    PubMed Central

    2017-01-01

    Background Electronic health records (EHR) are a key tool in managing and storing patients’ information. Currently, there are over 50 open source EHR systems available. Functionality and usability are important factors for determining the success of any system. These factors are often a direct reflection of the domain knowledge and developers’ motivations. However, few published studies have focused on the characteristics of free and open source software (F/OSS) EHR systems, and none to date have discussed the motivation, knowledge background, and demographic characteristics of the developers involved in open source EHR projects. Objective This study analyzed the characteristics of prevailing F/OSS EHR systems and aimed to provide an understanding of the motivation, knowledge background, and characteristics of the developers. Methods This study identified F/OSS EHR projects on SourceForge and other websites from May to July 2014. Projects were classified and characterized by license type, downloads, programming languages, spoken languages, project age, development status, supporting materials, top downloads by country, and whether they were “certified” EHRs. Health care F/OSS developers were also surveyed online. Results At the time of the assessment, we uncovered 54 open source EHR projects, but only four of them had been successfully certified under the Office of the National Coordinator for Health Information Technology (ONC Health IT) Certification Program. In the majority of cases, the open source EHR software was downloaded by users in the United States (64.07%, 148,666/232,034), underscoring the significant interest in EHR open source applications in the United States. A total of 103 developers responded to the online questionnaire.
The majority of EHR F/OSS developers (65.3%, 66/101) participate in F/OSS projects as part of a paid activity, and only 25.7% (26/101) of EHR F/OSS developers are, or have been, health care providers in their careers. In addition, 45% (45/99) of developers do not work in the health care field. Conclusion The research presented in this study highlights some challenges that may be hindering the future of health care F/OSS. A minority of developers have been health care professionals, and only 55% (54/99) work in the health care field. This undoubtedly limits the ability of the functional design of F/OSS EHR systems to serve as a competitive advantage over prevailing commercial EHR systems. Open source software appears to be of significant interest to many; however, given that only four F/OSS EHR systems are ONC-certified, this interest is unlikely to yield significant adoption of these systems in the United States. Although the Health Information Technology for Economic and Clinical Health (HITECH) Act was responsible for a substantial infusion of capital into the EHR marketplace, the lack of a corporate entity behind most F/OSS EHR projects translates to a marginal capacity to market the respective F/OSS system and to navigate certification. This has likely further disadvantaged F/OSS EHR adoption in the United States. PMID:28235750

  8. Provision of pandemic disease information by health sciences librarians: a multisite comparative case series.

    PubMed

    Featherstone, Robin M; Boldt, R Gabriel; Torabi, Nazi; Konrad, Shauna-Lee

    2012-04-01

    The research provides an understanding of pandemic information needs and informs professional development initiatives for librarians in disaster medicine. Utilizing a multisite, comparative case series design, the researchers conducted semi-structured interviews and examined supplementary materials in the form of organizational documents, correspondence, and websites to create a complete picture of each case. The rigor of the case series was ensured through data and investigator triangulation. Interview transcripts were coded using NVivo to identify common themes and points of comparison. Comparison of the four cases revealed a distinct difference between "client-initiated" and "librarian-initiated" provision of pandemic information. Librarian-initiated projects utilized social software to "push" information, whereas client-initiated projects operated within patron-determined parameters to deliver information. Health care administrators were identified as a key audience for pandemic information, and news agencies were utilized as essential information sources. Librarians' skills at evaluating available information proved crucial for selecting best-quality evidence to support administrative decision making. Qualitative analysis resulted in increased understanding of pandemic information needs and identified best practices for disseminating information during periods of high organizational stress caused by an influx of new cases of an unknown infectious disease.

  9. Greater-than-Class C low-level waste characterization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Piscitella, R.R.

    1991-12-31

    In 1985, Public Law 99-240 (Low-Level Radioactive Waste Policy Amendments Act of 1985) made the Department of Energy (DOE) responsible for the disposal of greater-than-Class C low-level radioactive waste (GTCC LLW). DOE strategies for storage and disposal of GTCC LLW required characterization of volumes, radionuclide activities, and waste forms. Data from existing literature, disposal records, and original research were used to estimate characteristics, project volumes, and determine radionuclide activities to the years 2035 and 2055. Twenty-year life extensions for 70% of the operating nuclear reactors were assumed to calculate the GTCC LLW available in 2055. The following categories of GTCC LLW were addressed: Nuclear Utilities Waste; Potential Sealed Sources GTCC LLW; DOE-Held Potential GTCC LLW; and Other Generator Waste. It was determined that the largest volume of these wastes, approximately 57%, is generated by nuclear utilities. The Other Generator Waste category contributes approximately 10% of the total GTCC LLW volume projected to the year 2035. DOE-Held Potential GTCC LLW accounts for nearly 33% of all waste projected to the year 2035. Potential Sealed Sources GTCC LLW is less than 0.2% of the total projected volume. The base case total projected volume of GTCC LLW for all categories was 3,250 cubic meters. This was substantially less than previous estimates.

  10. Toxicological database of soil and derived products (BDT).

    PubMed

    Uricchio, Vito Felice

    2008-01-01

    The Toxicological database of soil and derived products (BDT) is a project first proposed by the Regional Environmental Authority of Apulia. The project aims to provide comprehensive and updated information on regional environmental characteristics, on the pollution state of regional soils, on the main pollutants, and on the reclamation techniques to be used in cases of both non-point (agricultural activities) and point (industrial activities) sources of pollution. The project focuses on soil pollution because of the fundamental role played by soil in supporting the biological cycle. Furthermore, the project is motivated both by the reduction of human health risks due to ingestion of toxic substances (which are present in some links of the food chain) and by the recognized importance of safeguarding groundwater quality (the primary source of fresh water in many Mediterranean regions). The essential requirements of the data entry system are speed and simplicity of data entry; reliability and stability of the database structures; and speed, ease, and flexibility of queries. Free consultation of the database is one of the most remarkable advantages of using an "open" system.

  11. Cost-effectiveness of alternate strategies for childhood immunization against meningococcal disease with monovalent and quadrivalent conjugate vaccines in Canada.

    PubMed

    Delea, Thomas E; Weycker, Derek; Atwood, Mark; Neame, Dion; Alvarez, Fabián P; Forget, Evelyn; Langley, Joanne M; Chit, Ayman

    2017-01-01

    Public health programs to prevent invasive meningococcal disease (IMD) with monovalent serogroup C meningococcal conjugate vaccine (MCV-C) and quadrivalent meningococcal conjugate vaccines (MCV-4) in infancy and adolescence vary across Canadian provinces. This study evaluated the cost-effectiveness of various vaccination strategies against IMD using current and anticipated future pricing and recent epidemiology. A cohort model was developed to estimate the clinical burden and costs (CAN$2014) of IMD in the Canadian population over a 100-year time horizon for three strategies: (1) MCV-C in infants and adolescents (MCV-C/C); (2) MCV-C in infants and MCV-4 in adolescents (MCV-C/4); and (3) MCV-4 in infants (2 doses) and adolescents (MCV-4/4). The source for IMD incidence was Canadian surveillance data. The effectiveness of MCV-C was based on published literature. The effectiveness of MCV-4 vaccination regimens against the serogroups they cover was assumed to be the same as that of MCV-C regimens against serogroup C. Herd effects were estimated by calibration to estimates reported in prior analyses. Costs were from published sources. Vaccine prices were projected to decline over time, reflecting historical procurement trends. Over the modeling horizon there are a projected 11,438 IMD cases and 1,195 IMD deaths with MCV-C/C; expected total costs are $597.5 million. MCV-C/4 is projected to reduce cases of IMD by 1,826 (16%) and IMD deaths by 161 (13%). Vaccination costs are increased by $32 million, but direct and indirect IMD costs are projected to be reduced by $46 million. MCV-C/4 is therefore dominant vs. MCV-C/C in the base case. The cost-effectiveness of MCV-4/4 was $111,286 per QALY gained versus MCV-C/4 (2,575 IMD cases and 206 deaths prevented; incremental costs of $68 million). If historical trends in Canadian vaccine prices continue, use of MCV-4 instead of MCV-C in adolescents may be cost-effective.
From an economic perspective, switching to MCV-4 as the adolescent booster should be considered.
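    The dominance and cost-per-QALY reasoning above follows the standard incremental cost-effectiveness ratio, ICER = incremental cost / incremental QALYs gained. A minimal sketch using the figures reported in the abstract; the QALY gain passed in for the MCV-C/4 comparison and the back-calculated QALY figure for MCV-4/4 are illustrative assumptions, not reported values.

```python
def icer(delta_cost, delta_qalys):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained.
    A strategy that costs less and gains QALYs is 'dominant'."""
    if delta_cost <= 0 and delta_qalys > 0:
        return "dominant"
    return delta_cost / delta_qalys

# MCV-C/4 vs. MCV-C/C: vaccination costs rise $32M but IMD costs fall $46M,
# so net cost is negative while outcomes improve -> dominant.
net_cost = 32e6 - 46e6
print(icer(net_cost, 1.0))  # any positive QALY gain; 1.0 is a placeholder

# MCV-4/4 vs. MCV-C/4: $68M incremental cost at $111,286 per QALY implies
# roughly 68e6 / 111286 ~ 611 QALYs gained (back-calculated, not reported).
print(round(68e6 / 111286))
```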

  12. A Case for Data and Service Fusions

    NASA Astrophysics Data System (ADS)

    Huang, T.; Boening, C.; Quach, N. T.; Gill, K.; Zlotnicki, V.; Moore, B.; Tsontos, V. M.

    2015-12-01

    In this distributed, data-intensive era, developing any solution that requires multi-disciplinary data and services demands careful review of interfaces with data and service providers. Information is stored in many different locations, and data services are distributed across the Internet. In designing and developing mash-up heterogeneous data systems, the challenge is not entirely technological; it is our ability to document the external interface specifications and to create a coherent environment for our users. While it is impressive to present a complex web of data, the true measure of our success is the quality of the data we are serving, the throughput of our systems, and the user experience. This presentation describes two currently funded NASA projects that require integration of heterogeneous data and services residing in different locations. The NASA Sea Level Change Portal is designed as a "one-stop" source for current sea level change information. Behind this portal is an architecture that integrates data and services from various sources, including PI-generated products, satellite products from the DAACs, metadata from the ESDIS Common Metadata Repository (CMR) and other sources, and services residing in data centers, universities, and ESDIS. The recently funded Distributed Oceanographic Matchup Service (DOMS) project is a project under the NASA Advanced Information Systems Technology (AIST) program. DOMS will integrate satellite products managed by the NASA Physical Oceanography Distributed Active Archive Center (PO.DAAC) with three in-situ projects located in different parts of the U.S. These projects are good examples of delivering content-rich solutions through mash-ups of heterogeneous data and systems.

  13. Regular Topologies for Gigabit Wide-Area Networks: Congestion Avoidance Testbed Experiments. Volume 3

    NASA Technical Reports Server (NTRS)

    Denny, Barbara A.; McKenney, Paul E., Sr.; Lee, Danny

    1994-01-01

    This document is Volume 3 of the final technical report on the work performed by SRI International (SRI) on SRI Project 8600. The document includes source listings for all software developed by SRI under this effort. Since some of our work involved the use of ST-II and the Sun Microsystems, Inc. (Sun) High-Speed Serial Interface (HSI/S) driver, we have included some of the source developed by LBL and BBN as well. In most cases, our decision to include source developed by other contractors depended on whether it was necessary to modify the original code. If we have modified the software in any way, it is included in this document. In the case of the Traffic Generator (TG), however, we have included all the ST-II software, even though BBN performed the integration, because the ST-II software is part of the standard TG release. It is important to note that all the code developed by other contractors is in the public domain, so that all software developed under this effort can be re-created from the source included here.

  14. SANDS: A Service-Oriented Architecture for Clinical Decision Support in a National Health Information Network

    PubMed Central

    Wright, Adam; Sittig, Dean F.

    2008-01-01

    In this paper we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. PMID:18434256

  15. Development and validation of a building design waste reduction model.

    PubMed

    Llatas, C; Osmani, M

    2016-10-01

    Reduction in construction waste is a pressing need in many countries. The design of building elements is considered a pivotal process for achieving waste reduction at source, as it enables an informed prediction of their wastage reduction levels. However, the lack of quantitative methods linking design strategies to waste reduction hinders designing-out-waste practice in building projects. This paper therefore addresses this knowledge gap through the design and validation of a Building Design Waste Reduction Strategies (Waste ReSt) model that investigates the relationships between design variables and their impact on onsite waste reduction. The Waste ReSt model was validated in a real-world case study involving 20 residential buildings in Spain. The validation process comprised three stages. First, design waste causes were analyzed. Second, design strategies were applied, leading to several alternative low-waste building elements. Finally, their potential source reduction levels were quantified and discussed within the context of the literature. The Waste ReSt model could serve as an instrumental tool to simulate designing-out-waste strategies in building projects. The knowledge provided by the model could help project stakeholders better understand the correlation between the design process and waste sources and subsequently implement design practices for low-waste buildings. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. A Study of Clinically Related Open Source Software Projects

    PubMed Central

    Hogarth, Michael A.; Turner, Stuart

    2005-01-01

    Open source software development has recently gained significant interest due to several successful mainstream open source projects. This methodology has been proposed as being similarly viable and beneficial in the clinical application domain as well. However, the clinical software development venue differs significantly from the mainstream software venue. Existing clinical open source projects have not been well characterized nor formally studied so the ‘fit’ of open source in this domain is largely unknown. In order to better understand the open source movement in the clinical application domain, we undertook a study of existing open source clinical projects. In this study we sought to characterize and classify existing clinical open source projects and to determine metrics for their viability. This study revealed several findings which we believe could guide the healthcare community in its quest for successful open source clinical software projects. PMID:16779056

  17. Cognitive Disequilibrium and Service-Learning in Physical Education Teacher Education: Perceptions of Pre-Service Teachers in a Study Abroad Experience

    ERIC Educational Resources Information Center

    Ward, Stephen; Pellet, Heidi Henschel; Perez, Mark I.

    2017-01-01

    Purpose: The purpose of this study was to explore preservice teachers' experiences of cognitive disequilibrium (CD) theory during a service-learning project in a study abroad experience. Method: A case study with 8 participants was used. Data sources consisted of: Formal interviews, videos of planning, videos of teaching, videos of reflection…

  18. A Case Study of Periodical Use by Library and Information Science Students

    ERIC Educational Resources Information Center

    Ivins, Tammy

    2013-01-01

    There is a lack of information in the literature about the sources used for research by modern Master of Library and Information Science students in the United States, and so the objective of this project is to understand the use of periodical articles by these students. Specifically: do articles play a major role in student research, how current…

  19. What Does a College Degree Cost? Comparing Approaches to Measuring "Cost Per Degree". Delta Cost Project White Paper Series

    ERIC Educational Resources Information Center

    Johnson, Nate

    2009-01-01

    What does it cost to provide a bachelor's-level education? This question arises with increasing frequency and urgency as pressure mounts on policymakers and education leaders to increase the education attainment level in the United States, to "Double the Numbers" in some cases. At the same time, the two traditional sources of…

  20. Creating Collaboration: Exploring the Development of a Baptist Digital Library and Archive. A Case Study

    ERIC Educational Resources Information Center

    Hall, Taffey

    2013-01-01

    The purpose of this study was to explore the construction of a collaborative Baptist digital library and archive on the Internet. The study investigated how a central electronic location of digitized Baptist primary source materials could look and work on the Internet and how such a project could benefit Baptist history professors, the primary…

  1. Contaminant Flux Reduction Barriers for Managing Difficult to Treat Source Zones in Unconsolidated Media

    DTIC Science & Technology

    2017-06-20

    Several novel silica gel/vegetable oil formulations were developed and tested as flux reduction materials in lab-scale batch and column studies. For the project demonstration, two grout mixtures were selected based on gel tests and a treatability study (the case study site is described in Table 7.4 of the report).

  2. Finding the forest in the trees. The challenge of combining diverse environmental data

    NASA Technical Reports Server (NTRS)

    1995-01-01

    Development of analytical and functional guidelines to help researchers and technicians engaged in interdisciplinary research better plan and implement their supporting data management activities is addressed. Emphasis is placed on projects that involve both geophysical and ecological issues. Six case studies were used to identify and understand problems associated with collecting, integrating, and analyzing environmental data from local to global spatial scales and over a range of temporal scales. These case studies were also used to elaborate on the common barriers to interfacing data of disparate sources and types. A number of lessons derived from the case studies are summarized and analyzed.

  3. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  4. Shared Voyage: Learning and Unlearning from Remarkable Projects

    NASA Technical Reports Server (NTRS)

    Laufer, Alexander; Post, Todd; Hoffman, Edward J.

    2005-01-01

    Shared Voyage is about four remarkable projects: the Advanced Composition Explorer (NASA), the Joint Air-to-Surface Standoff Missile (U.S. Air Force), the Pathfinder Solar-Powered Airplane (NASA), and the Advanced Medium Range Air-to-Air Missile (U.S. Air Force). Each project is presented as a case study comprising stories collected from key members of the project teams. The stories are included to provide an effective learning source for project management, to encourage the unlearning of outdated project management concepts, and to enhance awareness of the contexts surrounding different projects. Significantly different from the project concepts found in most project management literature, Shared Voyage highlights concepts such as a will to win, a results-oriented focus, and collaboration through trust. All four project teams researched in this study applied similar concepts; however, they applied them differently, tailoring them to fit the context of their own particular projects. It is clear that the "one best way" approach, which is still the prevailing paradigm in project management literature, should be replaced by a new paradigm: even though general project management principles exist, their successful application depends on the specifics of the situation.

  5. DBGC: A Database of Human Gastric Cancer

    PubMed Central

    Wang, Chao; Zhang, Jun; Cai, Mingdeng; Zhu, Zhenggang; Gu, Wenjie; Yu, Yingyan; Zhang, Xiaoyan

    2015-01-01

    The Database of Human Gastric Cancer (DBGC) is a comprehensive database that integrates various human gastric cancer-related data resources. Human gastric cancer-related transcriptomics projects, proteomics projects, mutations, biomarkers and drug-sensitive genes from different sources were collected and unified in this database. Moreover, epidemiological statistics of gastric cancer patients in China and clinicopathological information annotated with gastric cancer cases were also integrated into the DBGC. We believe that this database will greatly facilitate research regarding human gastric cancer in many fields. DBGC is freely available at http://bminfor.tongji.edu.cn/dbgc/index.do PMID:26566288

  6. An open library of CT patient projection data

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Leng, Shuai; Yu, Lifeng; Holmes, David; Fletcher, Joel; McCollough, Cynthia

    2016-03-01

    Lack of access to projection data from patient CT scans is a major limitation for development and validation of new reconstruction algorithms. To meet this critical need, we are building a library of CT patient projection data in an open and vendor-neutral format, DICOM-CT-PD, which is an extended DICOM format that contains sinogram data, acquisition geometry, patient information, and pathology identification. The library consists of scans of various types, including head scans, chest scans, abdomen scans, electrocardiogram (ECG)-gated scans, and dual-energy scans. For each scan, three types of data are provided, including DICOM-CT-PD projection data at various dose levels, reconstructed CT images, and a free-form text file. Several instructional documents are provided to help the users extract information from DICOM-CT-PD files, including a dictionary file for the DICOM-CT-PD format, a DICOM-CT-PD reader, and a user manual. Radiologist detection performance based on the reconstructed CT images is also provided. So far 328 head cases, 228 chest cases, and 228 abdomen cases have been collected for potential inclusion. The final library will include a selection of 50 head, chest, and abdomen scans each from at least two different manufacturers, and a few ECG-gated scans and dual-source, dual-energy scans. It will be freely available to academic researchers, and is expected to greatly facilitate the development and validation of CT reconstruction algorithms.

  7. The Grand Ethiopian Renaissance Dam: Source of cooperation or contention?

    USGS Publications Warehouse

    Teferi Taye, Meron; Tadesse, Tsegaye; Senay, Gabriel; Block, Paul

    2016-01-01

    This paper discusses the challenges and benefits of the Grand Ethiopian Renaissance Dam (GERD), which is under construction and expected to be operational on the Blue Nile River in Ethiopia in a few years. Like many large-scale projects on transboundary rivers, the GERD has been criticized for potentially jeopardizing downstream water security and livelihoods through upstream unilateral decision making. In spite of the contentious nature of the project, the authors argue that this project can provide substantial benefits for regional development. The GERD, like any major river infrastructure project, will undeniably bring about social, environmental, and economic change, and in this unique case has, on balance, the potential to achieve success on all fronts. It must be stressed, however, that strong partnerships between riparian countries are essential. National success is contingent on regional cooperation.

  8. Lessons from Fraxinus, a crowd-sourced citizen science game in genomics

    PubMed Central

    Rallapalli, Ghanasyam; Saunders, Diane GO; Yoshida, Kentaro; Edwards, Anne; Lugo, Carlos A; Collin, Steve; Clavijo, Bernardo; Corpas, Manuel; Swarbreck, David; Clark, Matthew; Downie, J Allan; Kamoun, Sophien

    2015-01-01

    In 2013, in response to an epidemic of ash dieback disease in England the previous year, we launched a Facebook-based game called Fraxinus to enable non-scientists to contribute to genomics studies of the pathogen that causes the disease and the ash trees that are devastated by it. Over a period of 51 weeks, players were able to match computational alignments of genetic sequences in 78% of cases, and to improve them in 15% of cases. We also found that most players were only transiently interested in the game, and that the majority of the work was performed by a small group of dedicated players. Based on our experiences, we have built a linear model for the length of time that contributors are likely to donate to a crowd-sourced citizen science project. This model could serve as a guide for the design and implementation of future crowd-sourced citizen science initiatives. DOI: http://dx.doi.org/10.7554/eLife.07460.001 PMID:26219214
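    The abstract does not specify the linear model's covariates or coefficients; the sketch below fits an ordinary least-squares line to invented engagement data simply to illustrate the general approach. All numbers and the choice of covariate are assumptions, not the paper's data.

```python
import numpy as np

# Invented data: the week each contributor joined vs. how many weeks they
# kept contributing. The real Fraxinus model and data are not reproduced here.
join_week = np.array([1, 2, 4, 6, 10, 15, 20, 30, 40, 50], dtype=float)
weeks_contributed = np.array([30, 25, 22, 18, 12, 9, 7, 4, 3, 2], dtype=float)

# Ordinary least-squares fit of a straight line.
slope, intercept = np.polyfit(join_week, weeks_contributed, 1)
print(f"weeks_contributed ~ {slope:.2f} * join_week + {intercept:.2f}")

# Predicted engagement length for a contributor joining in week 25.
print(round(float(np.polyval([slope, intercept], 25.0)), 1))
```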

  9. Informatics in radiology: An open-source and open-access cancer biomedical informatics grid annotation and image markup template builder.

    PubMed

    Mongkolwat, Pattanasak; Channin, David S; Kleper, Vladimir; Rubin, Daniel L

    2012-01-01

    In a routine clinical environment or clinical trial, a case report form or structured reporting template can be used to quickly generate uniform and consistent reports. Annotation and image markup (AIM), a project supported by the National Cancer Institute's cancer biomedical informatics grid, can be used to collect information for a case report form or structured reporting template. AIM is designed to store, in a single information source, (a) the description of pixel data with use of markups or graphical drawings placed on the image, (b) calculation results (which may or may not be directly related to the markups), and (c) supplemental information. To facilitate the creation of AIM annotations with data entry templates, an AIM template schema and an open-source template creation application were developed to assist clinicians, image researchers, and designers of clinical trials to quickly create a set of data collection items, thereby ultimately making image information more readily accessible.

  10. Informatics in Radiology: An Open-Source and Open-Access Cancer Biomedical Informatics Grid Annotation and Image Markup Template Builder

    PubMed Central

    Channin, David S.; Kleper, Vladimir; Rubin, Daniel L.

    2012-01-01

    In a routine clinical environment or clinical trial, a case report form or structured reporting template can be used to quickly generate uniform and consistent reports. Annotation and Image Markup (AIM), a project supported by the National Cancer Institute’s cancer Biomedical Informatics Grid, can be used to collect information for a case report form or structured reporting template. AIM is designed to store, in a single information source, (a) the description of pixel data with use of markups or graphical drawings placed on the image, (b) calculation results (which may or may not be directly related to the markups), and (c) supplemental information. To facilitate the creation of AIM annotations with data entry templates, an AIM template schema and an open-source template creation application were developed to assist clinicians, image researchers, and designers of clinical trials to quickly create a set of data collection items, thereby ultimately making image information more readily accessible. © RSNA, 2012 PMID:22556315

  11. Case Study for the ARRA-funded Ground Source Heat Pump Demonstration at Denver Museum of Nature & Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Liu, Xiaobing

    High initial costs and lack of public awareness of ground-source heat pump (GSHP) technology are the two major barriers preventing rapid deployment of this energy-saving technology in the United States. Under the American Recovery and Reinvestment Act (ARRA), 26 GSHP projects were competitively selected and carried out to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. This report highlights the findings of a case study of one such GSHP demonstration project, which uses a recycled water heat pump (RWHP) system installed at the Denver Museum of Nature & Science in Denver, Colorado. The RWHP system uses recycled water from the city's water system as the heat sink and source for a modular water-to-water heat pump (WWHP). This case study was conducted based on the available measured performance data from December 2014 through August 2015, utility bills of the building in 2014 and 2015, construction drawings, maintenance records, personal communications, and construction costs. The annual energy consumption of the RWHP system was calculated based on the available measured data and other related information. It was compared with the performance of a baseline scenario: a conventional VAV system using a water-cooled chiller and a natural gas fired boiler, both of which have the minimum energy efficiencies allowed by ASHRAE 90.1-2010. The comparison was made to determine energy savings, operating cost savings, and CO2 emission reductions achieved by the RWHP system. A cost analysis was performed to evaluate the simple payback of the RWHP system. Summarized below are the results of the performance analysis, the lessons learned, and recommended improvements in the operation of the RWHP system.
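    The simple-payback evaluation mentioned in the abstract reduces to dividing the extra capital cost of the RWHP system by its annual operating-cost savings over the baseline. A minimal sketch, with entirely hypothetical figures (the report's actual costs are not reproduced here):

    ```python
    def simple_payback(extra_capital_cost, annual_baseline_cost, annual_system_cost):
        """Years needed for annual operating-cost savings to repay the
        additional capital cost of a system over its baseline."""
        annual_savings = annual_baseline_cost - annual_system_cost
        if annual_savings <= 0:
            raise ValueError("system never pays back: no annual savings")
        return extra_capital_cost / annual_savings

    # Hypothetical example: $150,000 extra first cost, $60,000/yr baseline
    # utility costs vs. $45,000/yr with the heat pump system.
    years = simple_payback(150_000, 60_000, 45_000)
    print(round(years, 1))  # -> 10.0
    ```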

  12. Project analysis procedures for an OPEC country: case study of Qatar's Northwest Dome Gas Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, A.B.; Khalifah, H.

    1986-01-01

    The discovery of oil in most OPEC countries in the 1940s changed the economies of these countries from a state of capital shortage and stagnation to a state of capital surplus and economic growth. This growth, however, is lopsided. Oil production and export dominate the gross domestic products (GDPs) of those economies. Concern arising during the 1970s about overdependence on crude oil export as the main source of national income has resulted in the initiation of various industrial development programs in OPEC states aiming to diversify their economies. This study was conducted with two primary objectives: (1) to identify and understand the features of selected OPEC countries' development problems, strategies and plans, focusing on the role of oil and gas resources and opportunities for diversification, and (2) to suggest an appropriate development strategy, with project evaluation implications, for capital-abundant, labor-scarce OPEC countries in the Gulf region such as Qatar. The proposed approach is designed to evaluate a project in terms of its contribution to the national income, people's welfare, the expansion of the economy's absorptive capacity, and relief of the economy's dependence on nonrenewable resources. The Northwest Dome Gas Project in Qatar was selected as an illustrative case study for this approach.

  13. Is drinking water from 'improved sources' really safe? A case study in the Logone valley (Chad-Cameroon).

    PubMed

    Sorlini, S; Palazzini, D; Mbawala, A; Ngassoum, M B; Collivignarelli, M C

    2013-12-01

    Within a cooperation project coordinated by the Association for Rural Cooperation in Africa and Latin America (ACRA) Foundation, water supplies were sampled across the villages of the Logone valley (Chad-Cameroon), mostly from boreholes, open wells, rivers and lakes as well as from some piped water. Microbiological analyses and sanitary inspections were carried out at each source. The microbiological quality was determined by analysis of indicators of faecal contamination, Escherichia coli, Enterococci and Salmonellae, using the membrane filtration method. Sanitary inspections were done using WHO query forms. The assessment confirmed that there are several parameters of health concern in the studied area; bacteria of faecal origin are the most significant. Furthermore, this study demonstrated that the Joint Monitoring Programme (JMP) classification and E. coli measurement are not sufficient to establish water safety. In fact, in the studied area, JMP-defined 'improved sources' may provide unsafe water depending on their structure, and sources without E. coli may contain Enterococci and Salmonellae. Sanitary inspections also revealed high health risks for some boreholes. In other cases, sources with low sanitary risk and no E. coli were contaminated by Enterococci and Salmonellae. Better management and protection of the sources, hygiene improvement and domestic water treatment before consumption are possible solutions to reduce health risks in the Logone valley.

  14. Herschel Observations of Protostellar and Young Stellar Objects in Nearby Molecular Clouds: The DIGIT Open Time Key Project

    NASA Astrophysics Data System (ADS)

    Green, Joel D.; DIGIT OTKP Team

    2010-01-01

    The DIGIT (Dust, Ice, and Gas In Time) Open Time Key Project utilizes the PACS spectrometer (57-210 um) onboard the Herschel Space Observatory to study the colder regions of young stellar objects and protostellar cores, complementary to recent observations from Spitzer and ground-based observatories. DIGIT focuses on 30 embedded sources and 64 disk sources, and includes supporting photometry from PACS and SPIRE, as well as spectroscopy from HIFI, selected from nearby molecular clouds. For the embedded sources, PACS spectroscopy will allow us to address the origin of [CI] and high-J CO lines observed with ISO-LWS. Our observations are sensitive to the presence of cold crystalline water ice, diopside, and carbonates. Additionally, PACS scans are 5x5 maps of the embedded sources and their outflows. Observations of more evolved disk sources will sample low and intermediate mass objects as well as a variety of spectral types from A to M. Many of these sources are extremely rich in mid-IR crystalline dust features, enabling us to test whether similar features can be detected at larger radii, via colder dust emission at longer wavelengths. If processed grains are present only in the inner disk (in the case of full disks) or from the emitting wall surface which marks the outer edge of the gap (in the case of transitional disks), there must be short timescales for dust processing; if processed grains are detected in the outer disk, radial transport must be rapid and efficient. Weak bands of forsterite and clino- and ortho-enstatite in the 60-75 um range provide information about the conditions under which these materials were formed. For the Science Demonstration Phase we are observing an embedded protostar (DK Cha) and a Herbig Ae/Be star (HD 100546), exemplars of the kind of science that DIGIT will achieve over the full program.

  15. The State of Open Source Electronic Health Record Projects: A Software Anthropology Study.

    PubMed

    Alsaffar, Mona; Yellowlees, Peter; Odor, Alberto; Hogarth, Michael

    2017-02-24

    Electronic health records (EHR) are a key tool in managing and storing patients' information. Currently, there are over 50 open source EHR systems available. Functionality and usability are important factors for determining the success of any system. These factors are often a direct reflection of the domain knowledge and developers' motivations. However, few published studies have focused on the characteristics of free and open source software (F/OSS) EHR systems and none to date have discussed the motivation, knowledge background, and demographic characteristics of the developers involved in open source EHR projects. This study analyzed the characteristics of prevailing F/OSS EHR systems and aimed to provide an understanding of the motivation, knowledge background, and characteristics of the developers. This study identified F/OSS EHR projects on SourceForge and other websites from May to July 2014. Projects were classified and characterized by license type, downloads, programming languages, spoken languages, project age, development status, supporting materials, top downloads by country, and whether they were "certified" EHRs. Health care F/OSS developers were also surveyed using an online survey. At the time of the assessment, we uncovered 54 open source EHR projects, but only four of them had been successfully certified under the Office of the National Coordinator for Health Information Technology (ONC Health IT) Certification Program. In the majority of cases, the open source EHR software was downloaded by users in the United States (64.07%, 148,666/232,034), underscoring that there is a significant interest in EHR open source applications in the United States. A survey of EHR open source developers was conducted and a total of 103 developers responded to the online questionnaire. 
The majority of EHR F/OSS developers (65.3%, 66/101) are participating in F/OSS projects as part of a paid activity, and only 25.7% (26/101) of EHR F/OSS developers are, or have been, health care providers in their careers. In addition, 45% (45/99) of developers do not work in the health care field. The research presented in this study highlights some challenges that may be hindering the future of health care F/OSS. A minority of developers have been health care professionals, and only 55% (54/99) work in the health care field. This undoubtedly limits the ability of the functional design of F/OSS EHR systems to be a competitive advantage over prevailing commercial EHR systems. Open source software seems to be of significant interest to many; however, given that only four F/OSS EHR systems are ONC-certified, this interest is unlikely to yield significant adoption of these systems in the United States. Although the Health Information Technology for Economic and Clinical Health (HITECH) act was responsible for a substantial infusion of capital into the EHR marketplace, the lack of a corporate entity in most F/OSS EHR projects translates to a marginal capacity to market the respective F/OSS system and to navigate certification. This likely has further disadvantaged F/OSS EHR adoption in the United States. ©Mona Alsaffar, Peter Yellowlees, Alberto Odor, Michael Hogarth. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 24.02.2017.

  16. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    NASA Astrophysics Data System (ADS)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most previous studies have considered climate models and scenarios as the major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper, an attempt is made to segregate the contributions of (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes to the overall uncertainty in streamflow projections, using an analysis of variance (ANOVA) approach. Generally, most impact assessment studies are carried out with hydrologic model parameters held unchanged in the future. It is, however, necessary to address the nonstationarity in model parameters under changing land use and climate. In this paper, a regression-based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in the UGB under the nonstationary model condition is found to reduce in the future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and the GCMs, along with their interactions with emission scenarios, act as dominant sources of uncertainty. 
This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them for future streamflow projections and segregate the contribution of various sources to the uncertainty.
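    The ANOVA-based segregation of uncertainty described in this abstract can be illustrated with a toy two-factor decomposition. The array values below are invented, and the paper's five-source design (GCM, scenario, land use, stationarity, internal variability) is collapsed to two factors for brevity:

    ```python
    import numpy as np

    # Toy sketch: rows = GCMs, columns = emission scenarios; entries are
    # hypothetical mean streamflow projections, not results from the paper.
    proj = np.array([[100., 110., 120.],
                     [ 90., 100., 110.],
                     [110., 120., 130.]])

    grand = proj.mean()
    gcm_effect = proj.mean(axis=1) - grand    # main effect of each GCM
    scen_effect = proj.mean(axis=0) - grand   # main effect of each scenario
    interaction = proj - grand - gcm_effect[:, None] - scen_effect[None, :]

    ss_total = ((proj - grand) ** 2).sum()
    ss_gcm = proj.shape[1] * (gcm_effect ** 2).sum()
    ss_scen = proj.shape[0] * (scen_effect ** 2).sum()
    ss_int = (interaction ** 2).sum()

    # Fractional contribution of each source to the total spread
    print(round(ss_gcm / ss_total, 2), round(ss_scen / ss_total, 2))  # -> 0.5 0.5
    ```

    With these purely additive toy values the interaction term vanishes and GCM choice and scenario choice each explain half of the spread; real ensembles, as the paper shows, also carry interaction terms.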

  17. The use of a digital computer for calculation of acoustic fields of complex vibrating structures by the reciprocity principle

    NASA Technical Reports Server (NTRS)

    Rimskiy-Korsakov, A. V.; Belousov, Y. I.

    1973-01-01

    A program was compiled for calculating acoustical pressure levels, which might be created by vibrations of complex structures (an assembly of shells and rods), under the influence of a given force, for cases when these fields cannot be measured directly. The acoustical field is determined according to transition frequency and pulse characteristics of the structure in the projection mode. Projection characteristics are equal to the reception characteristics, for vibrating systems in which the reciprocity principle holds true. Characteristics in the receiving mode are calculated on the basis of experimental data on a point pulse space velocity source (input signal) and vibration response of the structure (output signal). The space velocity of a pulse source, set at a point in space r, where it is necessary to calculate the sound field of the structure p(r,t), is determined by measurements of acoustic pressure, created by a point source at a distance R. The vibration response is measured at the point where the forces F and f exciting the system should act.

  18. Projections of number of cancer cases in India (2010-2020) by cancer groups.

    PubMed

    Takiar, Ramnath; Nadayil, Deenu; Nandakumar, A

    2010-01-01

    Recently, NCRP (ICMR), Bangalore, has published a report on Time Trends in Cancer Incidence Rates. The report also provided projected numbers of cancer cases at the India country level for selected leading sites. In the present paper, an attempt has been made to project cancer cases for India by sex, year and cancer group. The incidence data generated by population-based cancer registries (PBCRs) at Bangalore, Barshi, Bhopal, Chennai, Delhi and Mumbai for the years 2001-2005 formed the sources of data. In addition, the latest incidence data of the North Eastern Registries for the year 2005-06 were utilized. The crude incidence rate (CR) was considered suitable for assessing the future load of cancer cases in the country. The linear regression method (IARC 1991) was used to assess the time trend and to project rates for the period 2010-2020. For sites where trends were not found to be significant, the latest rates were assumed to remain the same for the period 2010-2020. The total number of cancer cases is likely to go up from 979,786 in the year 2010 to 1,148,757 in the year 2020. Tobacco-related cancers in males are estimated to go up from 190,244 in the year 2010 to 225,241 in the year 2020; the corresponding female cases will go up from 75,289 to 93,563. For the year 2010, the numbers of cancer cases related to the digestive system are estimated to be 107,030 for males and 86,606 for females. For head and neck cancers, the estimates are 122,643 and 53,148 cases, respectively, and for the lymphoid and hematopoietic system (LHS) in 2010 they are 62,648 for males and 41,591 for females. Gynecological cancers are estimated to go up from 153,850 in 2010 to 182,602 in 2020. Breast cancer alone is expected to cross the figure of 100,000 by the year 2020.
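    The projection method used in this study amounts to fitting a straight line to yearly crude incidence rates and extrapolating to a target year. A minimal sketch with invented rates (not the registry data):

    ```python
    # Sketch of projecting a crude incidence rate by linear regression
    # on calendar year. All rates and populations below are hypothetical.
    def fit_line(xs, ys):
        """Ordinary least-squares fit; returns (slope, intercept)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                sum((x - mx) ** 2 for x in xs)
        return slope, my - slope * mx

    years = [2001, 2002, 2003, 2004, 2005]
    crude_rates = [80.0, 81.5, 83.0, 84.5, 86.0]   # cases per 100,000

    b, a = fit_line(years, crude_rates)
    rate_2020 = a + b * 2020                        # extrapolated crude rate

    population_2020 = 1_350_000_000                 # hypothetical projection
    cases_2020 = rate_2020 / 100_000 * population_2020
    print(round(rate_2020, 1))  # -> 108.5
    ```

    Multiplying the extrapolated rate by the projected population gives the expected case load, which is how a crude-rate trend translates into the case counts quoted in the abstract.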

  19. A new traffic model with a lane-changing viscosity term

    NASA Astrophysics Data System (ADS)

    Ko, Hung-Tang; Liu, Xiao-He; Guo, Ming-Min; Wu, Zheng

    2015-09-01

    In this paper, a new continuum traffic flow model is proposed, with a lane-changing source term in the continuity equation and a lane-changing viscosity term in the acceleration equation. Based on previous literature, the source term addresses the impact of speed difference and density difference between adjacent lanes, which provides better precision for free lane-changing simulation; the viscosity term turns lane-changing behavior into a “force” that may influence speed distribution. Using a flux-splitting scheme for the model discretization, two cases are investigated numerically. The case under a homogeneous initial condition shows that the numerical results by our model agree well with the analytical ones; the case with a small initial disturbance shows that our model can simulate the evolution of perturbation, including propagation, dissipation, cluster effect and stop-and-go phenomenon. Project supported by the National Natural Science Foundation of China (Grant Nos. 11002035 and 11372147) and Hui-Chun Chin and Tsung-Dao Lee Chinese Undergraduate Research Endowment (Grant No. CURE 14024).
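    The homogeneous test case described above can be reproduced with a few lines of numerical code. This sketch discretizes a continuity equation with a lane-changing source term, rho_t + (rho*v)_x = S, using a first-order upwind scheme rather than the paper's flux-splitting scheme, with illustrative numbers:

    ```python
    import numpy as np

    nx, dx, dt = 100, 10.0, 0.5
    rho = np.full(nx, 0.05)   # density (veh/m), homogeneous initial state
    v = 15.0                  # constant speed (m/s) for this sketch
    S = np.zeros(nx)          # no lane changes: the source term vanishes

    for _ in range(100):
        flux = rho * v
        # first-order upwind difference with a periodic boundary
        rho = rho - dt / dx * (flux - np.roll(flux, 1)) + dt * S

    # With homogeneous initial data and zero source, the state stays steady,
    # matching the analytical solution for this trivial case.
    print(float(rho.max() - rho.min()))  # -> 0.0
    ```

    The interesting behavior reported in the paper (cluster effect, stop-and-go waves) appears once the initial density is perturbed and the source and viscosity terms are switched on.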

  20. Project Ci-Nergy Towards an Integrated Energy Urban Planning System from a Data Modelling and System Architecture Perspective

    NASA Astrophysics Data System (ADS)

    Agugiaro, G.; Robineau, J.-L.; Rodrigues, P.

    2017-09-01

    Growing urbanisation, its related environmental impacts, and social inequalities in cities are challenges requiring a holistic urban planning perspective that takes into account the different aspects of sustainable development. One crucial point is to reconcile urban planning with environmental targets, which include decreasing energy demand and CO2 emissions and increasing the share of renewable energy. Within this context, the project CI-NERGY aims to develop urban energy modelling, simulation and optimisation methods and tools to support decision making in urban planning. However, there are several barriers to the implementation of such tools, such as fragmentation of the involved disciplines, different stakeholders, the multiplicity of scales in a city, and the extreme heterogeneity of data regarding all the processes to be addressed. Project CI-NERGY aims, among other goals, at overcoming these barriers, and focuses on two case study cities, Geneva in Switzerland and Vienna in Austria. In particular, project CI-NERGY faces several challenges arising from the different cities, heterogeneous data sources and simulation tools, and diverse user groups with their individual needs. This paper describes the experiences gathered during the project. After a brief overview of the project and the two case study cities, Geneva and Vienna, the focus shifts to the overall system architecture of the project, ranging from urban data modelling topics to the implementation of a Service-Oriented Architecture. Some of the challenges faced, the solutions found, as well as some plans for future improvements, are described and discussed.

  1. Building America Case Study: Simplified Air Distribution, Desuperheaters, and Sub-Slab Geothermal Heat Exchangers, Pittsburgh, Pennsylvania

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    This report presents a cold-climate project that examines an alternative approach to ground source heat pump (GSHP) ground loop design. The innovative ground loop design is an attempt to reduce the installed cost of the ground loop heat exchange portion of the system by containing the entire ground loop within the excavated location beneath the basement slab.

  2. Effects of DEM source and resolution on WEPP hydrologic and erosion simulation: A case study of two forest watersheds in northern Idaho

    Treesearch

    J. X. Zhang; J. Q. Wu; K. Chang; W. J. Elliot; S. Dun

    2009-01-01

    The recent modification of the Water Erosion Prediction Project (WEPP) model has improved its applicability to hydrology and erosion modeling in forest watersheds. To generate reliable topographic and hydrologic inputs for the WEPP model, carefully selecting digital elevation models (DEMs) with appropriate resolution and accuracy is essential because topography is a...

  3. The Web and Information Literacy: Scaffolding the use of Web Sources in a Project-Based Curriculum

    ERIC Educational Resources Information Center

    Walton, Marion; Archer, Arlene

    2004-01-01

    In this article we describe and discuss a three-year case study of a course in web literacy, part of the academic literacy curriculum for first-year engineering students at the University of Cape Town (UCT). Because they are seen as practical knowledge, not theoretical, information skills tend to be devalued at university and rendered invisible to…

  4. The main sources of pollution of the aquatic environment in Hellas

    NASA Astrophysics Data System (ADS)

    Koumantakis, J.; Dimitrakopoulos, D.; Markantonis, K.; Grigorakou, E.; Vassiliou, E.

    2003-04-01

    The research team of the Laboratory of Engineering Geology & Hydrogeology of NTUA and P.P.C. have carried out several research projects since 1990. The conclusions of these projects regarding the main sources of pollution of the aquatic environment in Hellas are the following. Human activities: a) urban and industrial wastes (solid and liquid) are disposed of or discharged to surface or groundwater bodies, causing degradation of their quality (case studies of the Athens Basin, Lavrio region, Atalanti plain); b) intensive use of pesticides and fertilizers in agriculture, through the process of percolation or leaching, causes the deterioration of aquifers and surface water (case studies of the Ptolemais Basin, Korinth region, Elassona Basin, Atalanti plain, Thrapsana Basin Iraklio); c) current exploitations and old or abandoned mining sites disturb the aquatic environment and create new hydraulic connections between clean and polluted aquifers or the sea (case studies of the Lavrio region, Ptolemais Basin, Megalopoli Basin); d) over-pumping of aquifers, mainly for irrigation but also in some cases for dewatering of mines, results in continuous drawdown of the groundwater level and intrusion of the sea (case studies of the Korinth region, Athens Basin, Naxos island, Nea Peramos Kavala, Marathon, Argolida Field, Atalanti plain, Achaia region, Stratoni area Chalkidiki, Gouves Iraklio). Geological environment: a) extensive karstification of the limestones that are spread all over the Greek region (33%) causes the intrusion of the sea far into the land (case studies of the Lavrio region, Kefalonia island, Hymettus mountain); b) the chemical composition of the geological formations, through the processes of ion exchange and dissolution, pollutes the groundwater resources (case studies of the Vegoritis Basin, Katsika Chalkidiki, Florina region). 
The proposed measures to face these problems are: - rational management of the water resources, - artificial recharge of the aquifers, - proper management of the wastes generated by human activities, - systematic study of the karstic saline springs of Greece for their exploitation.

  5. A GIS-based atmospheric dispersion model for pollutants emitted by complex source areas.

    PubMed

    Teggi, Sergio; Costanzini, Sofia; Ghermandi, Grazia; Malagoli, Carlotta; Vinceti, Marco

    2018-01-01

    Gaussian dispersion models are widely used to simulate the concentrations and deposition fluxes of pollutants emitted by source areas. Very often, the calculation time limits the number of sources and receptors, and the geometry of the sources must be simple and without holes. This paper presents CAREA, a new GIS-based Gaussian model for complex source areas. CAREA was coded in the Python language and is largely based on a simplified formulation of the very popular and well-recognized AERMOD model. The model allows users to define, in a GIS environment, thousands of gridded or scattered receptors and thousands of complex sources with hundreds of vertices and holes. CAREA computes ground-level, or near-ground-level, concentrations and dry deposition fluxes of pollutants. The input/output and the runs of the model can be completely managed in a GIS environment (e.g. inside a GIS project). The paper presents the CAREA formulation and its application to very complex test cases. The tests show that processing times are satisfactory and that the definition of sources and receptors and the retrieval of output are quite easy in a GIS environment. CAREA and AERMOD are compared using simple and reproducible test cases. The comparison shows that CAREA satisfactorily reproduces AERMOD simulations and is considerably faster than AERMOD. Copyright © 2017 Elsevier B.V. All rights reserved.
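    At the core of models like CAREA and AERMOD lies the textbook Gaussian plume equation. The sketch below shows that generic equation (with a ground-reflection term), not CAREA's actual code; the dispersion parameters sigma_y and sigma_z, which in practice depend on downwind distance and atmospheric stability, are passed in precomputed:

    ```python
    import math

    def gaussian_plume_conc(q, u, y, z, h, sigma_y, sigma_z):
        """Textbook Gaussian plume concentration (mass/m^3) at a receptor
        offset y crosswind and z above ground, for emission rate q (mass/s),
        wind speed u (m/s) and effective release height h (m). Includes
        the image-source term for reflection at the ground."""
        lateral = math.exp(-y**2 / (2 * sigma_y**2))
        vertical = (math.exp(-(z - h)**2 / (2 * sigma_z**2)) +
                    math.exp(-(z + h)**2 / (2 * sigma_z**2)))
        return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

    # Centerline, ground-level receptor for a ground-level source
    # (hypothetical inputs chosen only to exercise the formula)
    c = gaussian_plume_conc(q=1.0, u=5.0, y=0.0, z=0.0,
                            h=0.0, sigma_y=8.0, sigma_z=5.0)
    ```

    An area source, as handled by CAREA, is then treated by integrating or summing such point-source contributions over the source polygon.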

  6. Current and future levels of mercury atmospheric pollution on a global scale

    NASA Astrophysics Data System (ADS)

    Pacyna, Jozef M.; Travnikov, Oleg; De Simone, Francesco; Hedgecock, Ian M.; Sundseth, Kyrre; Pacyna, Elisabeth G.; Steenhuisen, Frits; Pirrone, Nicola; Munthe, John; Kindbom, Karin

    2016-10-01

    An assessment of current and future emissions, air concentrations, and atmospheric deposition of mercury worldwide is presented on the basis of results obtained during the performance of the EU GMOS (Global Mercury Observation System) project. Emission estimates for mercury were prepared with the main goal of applying them in models to assess current (2013) and future (2035) air concentrations and atmospheric deposition of this contaminant. The combustion of fossil fuels (mainly coal) for energy and heat production in power plants and in industrial and residential boilers, as well as artisanal and small-scale gold mining, is one of the major anthropogenic sources of Hg emissions to the atmosphere at present. These sources account for about 37 and 25 % of the total anthropogenic Hg emissions globally, estimated to be about 2000 t. Emissions in Asian countries, particularly in China and India, dominate the total emissions of Hg. The current estimates of mercury emissions from natural processes (primary mercury emissions and re-emissions), including mercury depletion events, were estimated to be 5207 t year-1, which represents nearly 70 % of the global mercury emission budget. Oceans are the most important sources (36 %), followed by biomass burning (9 %). A comparison of the 2035 anthropogenic emissions estimated for three different scenarios with current anthropogenic emissions indicates a reduction of these emissions by up to 85 % in 2035 for the best-case scenario. Two global chemical transport models (GLEMOS and ECHMERIT) have been used for the evaluation of future mercury pollution levels considering future emission scenarios. Projections of future changes in mercury deposition on a global scale simulated by these models for three anthropogenic emission scenarios of 2035 indicate a decrease in deposition of up to 50 % in the Northern Hemisphere and up to 35 % in the Southern Hemisphere for the best-case scenario. 
The EU GMOS project has proved to be a very important research instrument for supporting the scientific justification for the Minamata Convention and monitoring of the implementation of targets of this convention, as well as the EU Mercury Strategy. This project provided the state of the art with regard to the development of the latest emission inventories for mercury, future emission scenarios, dispersion modelling of atmospheric mercury on a global and regional scale, and source-receptor techniques for mercury emission apportionment on a global scale.

  7. Background sampling and transferability of species distribution model ensembles under climate change

    NASA Astrophysics Data System (ADS)

    Iturbide, Maialen; Bedia, Joaquín; Gutiérrez, José Manuel

    2018-07-01

    Species Distribution Models (SDMs) constitute an important tool to assist decision-making in environmental conservation and planning. A popular application of these models is the projection of species distributions under climate change conditions. Yet there are still a range of methodological SDM factors which limit the transferability of these models, contributing significantly to the overall uncertainty of the resulting projections. An important source of uncertainty often neglected in climate change studies comes from the use of background data (a.k.a. pseudo-absences) for model calibration. Here, we study the sensitivity to pseudo-absence sampling as a determinant factor for SDM stability and transferability under climate change conditions, focusing on European wide projections of Quercus robur as an illustrative case study. We explore the uncertainty in future projections derived from ten pseudo-absence realizations and three popular SDMs (GLM, Random Forest and MARS). The contribution of the pseudo-absence realization to the uncertainty was higher in peripheral regions and clearly differed among the tested SDMs over the whole study domain, with MARS being the most sensitive (projections differing by up to 40% across realizations) and GLM the most stable. As a result, we conclude that parsimonious SDMs are preferable in this context, avoiding complex methods (such as MARS) which may exhibit poor model transferability. Accounting for this new source of SDM-dependent uncertainty is crucial when forming multi-model ensembles to undertake climate change projections.
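    The pseudo-absence sensitivity experiment can be caricatured in a few lines: re-sample the background several times, refit a (here deliberately simplistic) suitability model each time, and measure the per-cell spread of the resulting projections. Everything below is synthetic and the envelope-style score stands in for the paper's GLM/Random Forest/MARS models:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic landscape: one climate variable over 1000 cells, with
    # presences concentrated in a mid-range climatic niche.
    climate = rng.uniform(0.0, 30.0, size=1000)      # e.g. mean temperature
    presence = climate[(climate > 12) & (climate < 18)][:50]

    projections = []
    for _ in range(10):                              # ten background realizations
        background = rng.choice(climate, size=200, replace=False)
        # Toy envelope score: similarity to the presence mean, scaled by
        # the climatic spread seen in this particular background sample.
        scale = background.std()
        score = np.exp(-np.abs(climate - presence.mean()) / scale)
        projections.append(score)

    projections = np.array(projections)              # shape (10, 1000)
    # Per-cell spread across realizations = the pseudo-absence
    # contribution to projection uncertainty.
    per_cell_spread = projections.max(axis=0) - projections.min(axis=0)
    ```

    In the paper this spread is mapped spatially and compared across SDM algorithms; here it merely demonstrates that the background sample alone changes the fitted surface.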

  8. Provision of pandemic disease information by health sciences librarians: a multisite comparative case series*†‡§

    PubMed Central

    Featherstone, Robin M; Boldt, R. Gabriel; Torabi, Nazi; Konrad, Shauna-Lee

    2012-01-01

    Objective: The research provides an understanding of pandemic information needs and informs professional development initiatives for librarians in disaster medicine. Methods: Utilizing a multisite, comparative case series design, the researchers conducted semi-structured interviews and examined supplementary materials in the form of organizational documents, correspondence, and websites to create a complete picture of each case. The rigor of the case series was ensured through data and investigator triangulation. Interview transcripts were coded using NVivo to identify common themes and points of comparison. Results: Comparison of the four cases revealed a distinct difference between “client-initiated” and “librarian-initiated” provision of pandemic information. Librarian-initiated projects utilized social software to “push” information, whereas client-initiated projects operated within patron-determined parameters to deliver information. Health care administrators were identified as a key audience for pandemic information, and news agencies were utilized as essential information sources. Librarians' skills at evaluating available information proved crucial for selecting best-quality evidence to support administrative decision making. Conclusions: Qualitative analysis resulted in increased understanding of pandemic information needs and identified best practices for disseminating information during periods of high organizational stress caused by an influx of new cases of an unknown infectious disease. PMID:22514506

  9. A primer on theory-driven web scraping: Automatic extraction of big data from the Internet for use in psychological research.

    PubMed

    Landers, Richard N; Brusso, Robert C; Cavanaugh, Katelyn J; Collmus, Andrew B

    2016-12-01

    The term big data encompasses a wide range of approaches to collecting and analyzing data in ways that were not possible before the era of modern personal computing. One approach to big data with great potential for psychologists is web scraping, which involves the automated collection of information from webpages. Although web scraping can create massive big datasets with tens of thousands of variables, it can also be used to create modestly sized, more manageable datasets with tens of variables but hundreds of thousands of cases, well within the skillset of most psychologists to analyze, in a matter of hours. In this article, we demystify web scraping methods as currently used to examine research questions of interest to psychologists. First, we introduce an approach called theory-driven web scraping, in which the choice to use web-based big data must follow substantive theory. Second, we introduce data source theories, a term used to describe the assumptions a researcher must make about a prospective big data source in order to meaningfully scrape data from it. Critically, researchers must derive specific hypotheses to be tested based upon their data source theory, and if these hypotheses are not empirically supported, plans to use that data source should be changed or eliminated. Third, we provide a case study and sample code in Python demonstrating how web scraping can be conducted to collect big data, along with links to a web tutorial designed for psychologists. Fourth, we describe a 4-step process to be followed in web scraping projects. Fifth and finally, we discuss legal, practical and ethical concerns faced when conducting web scraping projects. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
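
The article's own Python materials are not reproduced here; as a minimal stand-alone illustration of the kind of scraper it describes, a stdlib sketch that extracts hyperlinks from a page (the HTML below is a stand-in for fetched content):

```python
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    """Collect href targets from anchor tags, e.g. to enumerate
    posting or profile pages before extracting study variables."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# In a real project the HTML would come from urllib.request.urlopen
# (observing robots.txt and the site's terms of service); here we
# feed a stand-in page directly.
html = '<p><a href="/post/1">one</a> <a href="/post/2">two</a></p>'
scraper = LinkScraper()
scraper.feed(html)
# scraper.links == ["/post/1", "/post/2"]
```

In the theory-driven framing, the variables extracted from each page would be chosen to test hypotheses derived from the researcher's data source theory, not collected indiscriminately.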

  10. A comprehensive Probabilistic Tsunami Hazard Assessment for the city of Naples (Italy)

    NASA Astrophysics Data System (ADS)

    Anita, G.; Tonini, R.; Selva, J.; Sandri, L.; Pierdominici, S.; Faenza, L.; Zaccarelli, L.

    2012-12-01

    A comprehensive Probabilistic Tsunami Hazard Assessment (PTHA) should consider different tsunamigenic sources (seismic events, slide failures, volcanic eruptions) to calculate the hazard at given target sites. This implies a multi-disciplinary analysis of all natural tsunamigenic sources, in a multi-hazard/risk framework, which also considers the effects of interaction/cascade events. Our approach shows the ongoing effort to analyze a comprehensive PTHA for the city of Naples (Italy) including all types of sources located in the Tyrrhenian Sea, as developed within the Italian project ByMuR (Bayesian Multi-Risk Assessment). The project combines a multi-hazard/risk approach to treat the interactions among different hazards, and a Bayesian approach to handle the uncertainties. The natural potential tsunamigenic sources analyzed are: 1) submarine seismic sources located on active faults in the Tyrrhenian Sea and close to the Southern Italian shoreline (we also consider the effects of inshore seismic sources and provide the rupture properties of the associated active faults), 2) mass failures and collapses around the target area (spatially identified on the basis of their propensity to failure), and 3) volcanic sources mainly identified by pyroclastic flows and collapses from the volcanoes in the Neapolitan area (Vesuvius, Campi Flegrei and Ischia). All these natural sources are here preliminarily analyzed and combined, in order to provide a complete picture of a PTHA for the city of Naples. In addition, the treatment of interaction/cascade effects is formally discussed in the case of significant temporary variations in the short-term PTHA due to an earthquake.

  11. A Novel Approach to model EPIC variable background

    NASA Astrophysics Data System (ADS)

    Marelli, M.; De Luca, A.; Salvetti, D.; Belfiore, A.

    2017-10-01

    One of the main aims of the EXTraS (Exploring the X-ray Transient and variable Sky) project is to characterise the variability of serendipitous XMM-Newton sources within each single observation. Unfortunately, 164 Ms out of the 774 Ms of cumulative exposure considered (21%) are badly affected by soft proton flares, hampering any classical analysis of field sources. De facto, the latest releases of the 3XMM catalog, as well as most analyses in the literature, simply exclude these 'high background' periods from analysis. We implemented a novel SAS-independent approach to produce background-subtracted light curves, which allows us to treat the case of very faint sources and very bright proton flares. EXTraS light curves of 3XMM-DR5 sources will soon be released to the community, together with new tools we are developing.
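
The EXTraS pipeline itself is more elaborate; the basic area-scaled background subtraction it refines can be sketched as follows (the counts, area ratio, and exposure below are illustrative, not EXTraS data):

```python
def background_subtracted_rate(src_counts, bkg_counts, area_ratio, exposure):
    """Net source count rate per time bin.
    src_counts, bkg_counts: counts in the source and background regions;
    area_ratio: source-region area / background-region area;
    exposure: live time per bin (s)."""
    return [(s - b * area_ratio) / exposure
            for s, b in zip(src_counts, bkg_counts)]

# A faint source seen through a proton flare: the background rises
# sharply in the middle bin, but the net source rate stays flat.
src = [12, 45, 13]    # total counts in the source region per bin
bkg = [20, 350, 30]   # counts in a 10x larger background region
net = background_subtracted_rate(src, bkg, 0.1, 100.0)
# net -> [0.1, 0.1, 0.1] counts/s (flat despite the flare)
```

This is the classical estimate that breaks down when the flare dominates the source counts; the abstract's point is that a more careful treatment recovers such bins instead of discarding them.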

  12. Integration of midwives into the Quebec health care system. L'Equipe d'Evaluation des Projets-Pilotes Sages-Femmes.

    PubMed

    Collin, J; Blais, R; White, D; Demers, A; Desbiens, F

    2000-01-01

    This paper reports on one aspect of the evaluation of the midwifery pilot projects in Quebec: the identification of the professional and organizational factors, as well as the mode of integrating midwives into the maternity care system, that would promote the best outcomes and the autonomy of midwives. The research strategy involved a multiple-case study, in which each midwifery pilot project represented a case. Based on a qualitative approach, the study employed various sources of data: individual interviews and focus groups with key informants, site observations and analyses of written documents. Results show that midwives were poorly integrated into the health care system during the evaluation. Four main reasons were identified: lack of knowledge about the practice of midwifery on the part of other health care providers; deficiencies in the legal and organizational structure of the pilot projects; competition over professional territories; and gaps between the midwives' and other providers' professional cultures. Recommendations are provided to facilitate the integration of midwives into the health care system.

  13. Impacts of biogas projects on agro-ecosystem in rural areas — A case study of Gongcheng

    NASA Astrophysics Data System (ADS)

    Yang, Jin; Chen, Weichao; Chen, Bin

    2011-09-01

    The rapid growth of the agro-ecosystem has been a focus of "New Rural Construction" in China due to intensive energy consumption and environmental pollution in rural areas. As a kind of renewable energy, biogas is helpful for new energy development and plays an important role in the sustainable development of the agro-ecosystem in China. To evaluate the effects of biogas on the agro-ecosystem from a systematic angle, we discuss the status quo of household biogas and identify the main factors that may have impacts on the agro-ecosystem. An indicator framework covering environmental, social and economic aspects was established to quantify the impacts exerted by a biogas project on the agro-ecosystem. A case study of Gongcheng was then conducted to evaluate the combined impact of the biogas project using the proposed indicator framework. Results showed that there was a notable positive effect brought by the application of biogas, with the integrated benefit improving significantly, by 60.36%, implying that biogas as a substitute energy source can promote the sustainable development of rural areas.

  14. How Is Open Source Special?

    ERIC Educational Resources Information Center

    Kapor, Mitchell

    2005-01-01

    Open source software projects involve the production of goods, but in software projects, the "goods" consist of information. The open source model is an alternative to the conventional centralized, command-and-control way in which things are usually made. In contrast, open source projects are genuinely decentralized and transparent. Transparent…

  15. ISONITRATE demonstration project: How isotopic monitoring can improve management of nitrate pollution in water

    NASA Astrophysics Data System (ADS)

    Widory, D.

    2008-12-01

    Nitrate is one of the major pollutants of drinking water resources worldwide. Recent European directives reduced inputs from intensive agriculture, but in most places NO3 levels are approaching the potable limit of 50 mg.l-1 in groundwater. Determining the source(s) of contamination in groundwater is an important first step for improving its quality by emission control. It is with this aim that we review here the benefit of using a multi-isotope approach (d15N, d18O, d11B), in addition to conventional hydrogeological analysis, to constrain the origin of NO3 pollution in water. The isotopic composition of the dissolved nitrogen species has been used extensively to better constrain the sources and fate of nitrate in groundwater. The possibility of quantifying both the origin and the secondary processes affecting N concentrations by means of a single tracer appears more limited, however. Nitrogen cannot be considered conservative because it is biologically modified through nitrification and denitrification reactions, both during infiltration of the water and in the groundwater body, causing isotopic fractionation that modifies the d15N signatures of the dissolved N species. Discriminating multiple NO3 sources by their N isotopic composition alone becomes impossible whenever heterotrophic or autotrophic denitrification occurs, raising the need for co-migrating discriminators of NO3 sources: the addition of d18O from NO3 and of d11B. This presentation relies strongly on our current European Life ISONITRATE project, which aims at showing policy makers how management of nitrate pollution in water can be greatly improved by the incorporation of multi-isotope monitoring. The pilot site is located in the Alsace region (France, bordering Germany), part of the Upper Rhine basin, a groundwater body considered one of the most important drinking water reservoirs in Europe.
The demonstration of the multi-isotope approach is based on 4 distinct scenarios: 1. Natural case: corresponds to the natural nitrification of the soil and represents the reference end-member. Samples with NO3 concentration levels higher than this end-member are considered polluted. 2. Denitrification case: groundwater samples are selected along an identified denitrification gradient in the "Appenweier-Rheinau" region (Germany): the uppermost samples being contaminated by mineral fertilizers used in vineyards (but not denitrified), and the downstream sample being (almost) totally denitrified. 3. Simple case: chosen as being under the influence of a single type of nitrate pollution source: mineral fertilisation from the "Orschwihr-Bergholtz" vineyards. 4. Complex case: where nitrates correspond to a mixing of different pollution sources (mineral and organic fertilisers), located within the "Dietwiller" area.
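
The source discrimination described rests on simple isotope mass balance; a minimal two end-member d15N mixing sketch (the end-member values are hypothetical, and the calculation is only valid where denitrification has not fractionated the signal):

```python
def mixing_fraction(delta_sample, delta_a, delta_b):
    """Fraction of source A in a two end-member mix, from a single
    isotope ratio (e.g. d15N): delta_sample = f*delta_a + (1-f)*delta_b."""
    return (delta_sample - delta_b) / (delta_a - delta_b)

# Hypothetical end-members: mineral fertilizer d15N ~ 0 permil,
# organic/manure nitrate d15N ~ +15 permil; a sample measured at +6.
f_mineral = mixing_fraction(6.0, 0.0, 15.0)
# f_mineral -> 0.6, i.e. ~60% mineral fertilizer, 40% organic
```

This is exactly the calculation that fails once denitrification shifts d15N, which is why the abstract argues for adding d18O and d11B as co-migrating discriminators.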

  16. Implementation of Biogas Stations into Smart Heating and Cooling Network

    NASA Astrophysics Data System (ADS)

    Milčák, P.; Konvička, J.; Jasenská, M.

    2016-10-01

    The paper describes the implementation of a biogas station in the software environment for "Smart Heating and Cooling Networks". The aim of this project is the creation of a software tool for planning the operation and optimizing the supply of heat/cooling in small regions. In this case, the biogas station represents a kind of renewable energy source, which, however, has its own operational specifics that need to be taken into account when creating an implementation project. For a specific biogas station, a detailed computational model was elaborated, parameterized in particular to optimize the total computation time.

  17. A Messaging Infrastructure for WLCG

    NASA Astrophysics Data System (ADS)

    Casey, James; Cons, Lionel; Lapka, Wojciech; Paladin, Massimo; Skaburskas, Konstantin

    2011-12-01

    During the EGEE-III project operational tools such as SAM, Nagios, Gridview, the regional Dashboard and GGUS moved to a communication architecture based on ActiveMQ, an open-source enterprise messaging solution. LHC experiments, in particular ATLAS, developed prototypes of systems using the same messaging infrastructure, validating the system for their use-cases. In this paper we describe the WLCG messaging use cases and outline an improved messaging architecture based on the experience gained during the EGEE-III period. We show how this provides a solid basis for many applications, including the grid middleware, to improve their resilience and reliability.
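
As an illustration of the messaging layer described, a minimal sketch of serializing a STOMP SEND frame, one of the text protocols ActiveMQ brokers accept alongside their native wire format (the destination name and body below are hypothetical):

```python
def stomp_send_frame(destination, body, headers=None):
    """Serialize a STOMP SEND frame: command line, header lines,
    blank line, body, then a NUL terminator."""
    lines = ["SEND", f"destination:{destination}"]
    for key, value in (headers or {}).items():
        lines.append(f"{key}:{value}")
    lines.append(f"content-length:{len(body.encode('utf-8'))}")
    return "\n".join(lines) + "\n\n" + body + "\x00"

# Publishing a (hypothetical) monitoring result to a broker topic;
# a real client would write this frame to an open broker connection.
frame = stomp_send_frame("/topic/grid.probe.results", "status=OK")
```

In production, clients would of course use an established messaging library rather than hand-built frames; the sketch only shows why a simple, open protocol made it easy for tools like SAM and Nagios to share one infrastructure.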

  18. The California Baseline Methane Survey

    NASA Astrophysics Data System (ADS)

    Duren, R. M.; Thorpe, A. K.; Hopkins, F. M.; Rafiq, T.; Bue, B. D.; Prasad, K.; Mccubbin, I.; Miller, C. E.

    2017-12-01

    The California Baseline Methane Survey is the first systematic, statewide assessment of methane point source emissions. The objectives are to reduce uncertainty in the state's methane budget and to identify emission mitigation priorities for state and local agencies, utilities and facility owners. The project combines remote sensing of large areas with airborne imaging spectroscopy and spatially resolved bottom-up data sets to detect, quantify and attribute emissions from diverse sectors including agriculture, waste management, oil and gas production and the natural gas supply chain. Phase 1 of the project surveyed nearly 180,000 individual facilities and infrastructure components across California in 2016 - achieving completeness rates ranging from 20% to 100% per emission sector at < 5 meters spatial resolution. Additionally, intensive studies of key areas and sectors were performed to assess source persistence and variability at time scales ranging from minutes to months. Phase 2 of the project continues with additional data collection in Spring and Fall 2017. We describe the survey design and measurement, modeling and analysis methods. We present initial findings regarding the spatial, temporal and sectoral distribution of methane point source emissions in California and their estimated contribution to the state's total methane budget. We provide case studies and lessons learned about key sectors, including examples where super-emitters were identified and mitigated. We summarize challenges and recommendations for future methane research, inventories and mitigation guidance within and beyond California.

  19. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    NASA Astrophysics Data System (ADS)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

    We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution, while in the case of areas, the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area, using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce a finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane, generating the peak ground motion. The appropriate ground motion prediction equations (GMPE) can then be applied for PSHA. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with the centroid fixed at the geometrical mean is discussed; in this model, hazard is reduced at the edges because the effective size is reduced. Nowadays there is a trend toward using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach, it is possible to add fault rupture models, separating geometrical and propagation effects.
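
The "area of a circle within a rectangle" step can be approximated with a simple Monte Carlo estimate; this is a sketch of the geometric idea only, not the authors' algorithm, and all parameters are illustrative:

```python
import random

def circle_in_rectangle_area(cx, cy, r, width, height, n=200_000, seed=1):
    """Monte Carlo estimate of the area of the disc of radius r centred
    at (cx, cy) that falls inside the rectangle [0, width] x [0, height]."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n):
        x = rng.uniform(0.0, width)
        y = rng.uniform(0.0, height)
        if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
            inside += 1
    # Fraction of hits times the rectangle area gives the disc area.
    return inside / n * width * height

# Disc of radius 1 fully inside a 4 x 4 rectangle: the estimate
# should approach pi * r^2 ~ 3.14159.
area = circle_in_rectangle_area(2.0, 2.0, 1.0, 4.0, 4.0)
```

Sweeping the radius r and differentiating the resulting cumulative areas yields a numerical PDF of distance, which is the role this step plays in the FFDD construction.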

  20. SANDS: a service-oriented architecture for clinical decision support in a National Health Information Network.

    PubMed

    Wright, Adam; Sittig, Dean F

    2008-12-01

    In this paper, we describe and evaluate a new distributed architecture for clinical decision support called SANDS (Service-oriented Architecture for NHIN Decision Support), which leverages current health information exchange efforts and is based on the principles of a service-oriented architecture. The architecture allows disparate clinical information systems and clinical decision support systems to be seamlessly integrated over a network according to a set of interfaces and protocols described in this paper. The architecture described is fully defined and developed, and six use cases have been developed and tested using a prototype electronic health record which links to one of the existing prototype National Health Information Networks (NHIN): drug interaction checking, syndromic surveillance, diagnostic decision support, inappropriate prescribing in older adults, information at the point of care and a simple personal health record. Some of these use cases utilize existing decision support systems, which are either commercially or freely available at present, and developed outside of the SANDS project, while other use cases are based on decision support systems developed specifically for the project. Open source code for many of these components is available, and an open source reference parser is also available for comparison and testing of other clinical information systems and clinical decision support systems that wish to implement the SANDS architecture. The SANDS architecture for decision support has several significant advantages over other architectures for clinical decision support. The most salient of these are:

  1. Translating Extreme Precipitation Data from Climate Change Projections into Resilient Engineering Applications

    NASA Astrophysics Data System (ADS)

    Cook, L. M.; Samaras, C.; Anderson, C.

    2016-12-01

    Engineers generally use historical precipitation trends to inform assumptions and parameters for long-lived infrastructure designs. However, resilient design calls for the adjustment of current engineering practice to incorporate a range of future climate conditions that are likely to be different than the past. Despite the availability of future projections from downscaled climate models, there remains a considerable mismatch between climate model outputs and the inputs needed in the engineering community to incorporate climate resiliency. These factors include differences in temporal and spatial scales, model uncertainties, and a lack of criteria for selection of an ensemble of models. This research addresses the limitations to working with climate data by providing a framework for the use of publicly available downscaled climate projections to inform engineering resiliency. The framework consists of five steps: 1) selecting the data source based on the engineering application, 2) extracting the data at a specific location, 3) validating for performance against observed data, 4) post-processing for bias or scale, and 5) selecting the ensemble and calculating statistics. The framework is illustrated with an example application to extreme precipitation-frequency statistics, the 25-year daily precipitation depth, using four publicly available climate data sources: NARCCAP, USGS, Reclamation, and MACA. The attached figure presents the results for step 5 of the framework, analyzing how the 24H25Y depth changes when the model ensemble is culled based on model performance against observed data, for both post-processing techniques: bias correction and change factor. Culling the model ensemble increases both the mean and median values for all data sources, and reduces the range for the NARCCAP and MACA ensembles due to the elimination of poorer-performing models and, in some cases, those that predict a decrease in future 24H25Y precipitation volumes.
This result is especially relevant to engineers who wish to reduce the range of the ensemble and remove contradicting models; however, this result is not generalizable for all cases. Finally, this research highlights the need for the formation of an intermediate entity that is able to translate climate projections into relevant engineering information.
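
Step 4's change-factor technique, one of the two post-processing approaches compared in the abstract, can be sketched as scaling the observed design statistic by the modeled future-to-historical ratio (all depths below are hypothetical):

```python
def change_factor_depth(observed_depth, model_future, model_hist):
    """Adjusted design depth via the change-factor (delta) method:
    the observed statistic is scaled by the ratio of the modeled
    future value to the modeled historical value, so model bias
    cancels to first order."""
    return observed_depth * (model_future / model_hist)

# Hypothetical 25-year, 24-hour depths (mm) for one climate model:
# observed 90 mm, model historical 80 mm, model future 96 mm.
adjusted = change_factor_depth(90.0, 96.0, 80.0)
# adjusted -> 108.0 mm: a 20% modeled increase applied to observations
```

Applying this per model and then summarizing across the (possibly culled) ensemble is what step 5 of the framework operates on.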

  2. MANAGING UNCERTAINTIES ASSOCIATED WITH RADIOACTIVE WASTE DISPOSAL: TASK GROUP 4 OF THE IAEA PRISM PROJECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seitz, R.

    2011-03-02

    It is widely recognized that the results of safety assessment calculations provide an important contribution to the safety arguments for a disposal facility, but cannot in themselves adequately demonstrate the safety of the disposal system. The safety assessment and a broader range of arguments and activities need to be considered holistically to justify radioactive waste disposal at any particular site. Many programs are therefore moving towards the production of what has become known as a Safety Case, which includes all of the different activities that are conducted to demonstrate the safety of a disposal concept. Recognizing the growing interest in the concept of a Safety Case, the International Atomic Energy Agency (IAEA) is undertaking an intercomparison and harmonization project called PRISM (Practical Illustration and use of the Safety Case Concept in the Management of Near-surface Disposal). The PRISM project is organized into four Task Groups that address key aspects of the Safety Case concept: Task Group 1 - Understanding the Safety Case; Task Group 2 - Disposal facility design; Task Group 3 - Managing waste acceptance; and Task Group 4 - Managing uncertainty. This paper addresses the work of Task Group 4, which is investigating approaches for managing the uncertainties associated with near-surface disposal of radioactive waste and their consideration in the context of the Safety Case. Emphasis is placed on identifying a wide variety of approaches that can and have been used to manage different types of uncertainties, especially non-quantitative approaches that have not received as much attention in previous IAEA projects. This paper includes discussions of the current results of work on the task on managing uncertainty, including: the different circumstances being considered, the sources/types of uncertainties being addressed and some initial proposals for approaches that can be used to manage different types of uncertainties.

  3. Can SIA empower communities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gagnon, C.; Hirsch, P.; Howitt, R.

    1993-07-01

    Public participation in social impact assessment (SIA) has been identified as a source of improved decision-making about resource development in several countries, with an implicit assumption that this sort of participation provides an avenue for empowerment of affected communities in these decision-making processes. This paper provides a critical discussion of the effectiveness of SIA as a means of local empowerment through case studies of resource projects in Australia, Canada, and Southeast Asia.

  4. The Case for Artificial Intelligence in Medicine

    PubMed Central

    Reggia, James A.

    1983-01-01

    Current artificial intelligence (AI) technology can be viewed as producing “systematic artifacts” onto which we project an interpretation of intelligent behavior. One major benefit this technology could bring to medicine is help with handling the tremendous and growing volume of medical knowledge. The reader is led to a vision of the medical library of tomorrow, an interactive, artificially intelligent knowledge source that is fully and directly integrated with daily patient care.

  5. Check-Cases for Verification of 6-Degree-of-Freedom Flight Vehicle Simulations. Volume 2; Appendices

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Jackson, E. Bruce; Shelton, Robert O.

    2015-01-01

    This NASA Engineering and Safety Center (NESC) assessment was established to develop a set of time histories for the flight behavior of increasingly complex example aerospacecraft that could be used to partially validate various simulation frameworks. The assessment was conducted by representatives from several NASA Centers and an open-source simulation project. This document contains details on models, implementation, and results.

  6. Open and Crowd-Sourced Data for Treaty Verification

    DTIC Science & Technology

    2014-10-01

    …the case of user-generated Internet content, such as Wikipedia or Amazon reviews. Another example is the Zooniverse citizen science project… The large number of potential observations can in many instances make up for relatively crude measurements made by the public. Users are motivated to…

  7. Building CHAOS: An Operating System for Livermore Linux Clusters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garlick, J E; Dunlap, C M

    2003-02-21

    The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some are commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.

  8. Cultural Heritage Through Time: a Case Study at Hadrian's Wall, United Kingdom

    NASA Astrophysics Data System (ADS)

    Fieber, K. D.; Mills, J. P.; Peppa, M. V.; Haynes, I.; Turner, S.; Turner, A.; Douglas, M.; Bryan, P. G.

    2017-02-01

    Diachronic studies are central to cultural heritage research for the investigation of change, from landscape to architectural scales. Temporal analyses and multi-temporal 3D reconstruction are fundamental for maintaining and safeguarding all forms of cultural heritage. Such studies form the basis for any kind of decision regarding intervention on cultural heritage, helping assess the risks and issues involved. This article introduces a European-wide project, entitled "Cultural Heritage Through Time", and the case study research carried out as a component of the project in the UK. The paper outlines the initial stages of the case study of landscape change at three locations on Hadrian's Wall, namely Beckfoot Roman Fort, Birdoswald Roman Fort and Corbridge Roman Station, all once part of the Roman Empire's north-west frontier. The main aim of the case study is to integrate heterogeneous information derived from a range of sources to help inform understanding of temporal aspects of landscape change. In particular, the study sites are at risk from natural hazards, notably erosion and flooding. The paper focuses on data collection and collation aspects, including an extensive archive search and field survey, as well as the methodology and preliminary data processing.

  9. Introduction of the 2nd Phase of the Integrated Hydrologic Model Intercomparison Project

    NASA Astrophysics Data System (ADS)

    Kollet, Stefan; Maxwell, Reed; Dages, Cecile; Mouche, Emmanuel; Mugler, Claude; Paniconi, Claudio; Park, Young-Jin; Putti, Mario; Shen, Chaopeng; Stisen, Simon; Sudicky, Edward; Sulis, Mauro; Ji, Xinye

    2015-04-01

    The 2nd Phase of the Integrated Hydrologic Model Intercomparison Project commenced in June 2013 with a workshop at Bonn University funded by the German Science Foundation and US National Science Foundation. Three test cases were defined and compared that are available online at www.hpsc-terrsys.de including a tilted v-catchment case; a case called superslab based on multiple slab-heterogeneities in the hydraulic conductivity along a hillslope; and the Borden site case, based on a published field experiment. The goal of this phase is to further interrogate the coupling of surface-subsurface flow implemented in various integrated hydrologic models; and to understand and quantify the impact of differences in the conceptual and technical implementations on the simulation results, which may constitute an additional source of uncertainty. The focus has been broadened considerably including e.g. saturated and unsaturated subsurface storages, saturated surface area, ponded surface storage in addition to discharge, and pressure/saturation profiles and cross-sections. Here, first results are presented and discussed demonstrating the conceptual and technical challenges in implementing essentially the same governing equations describing highly non-linear moisture redistribution processes and surface-groundwater interactions.

  10. Optimization and resilience in natural resources management

    USGS Publications Warehouse

    Williams, Byron K.; Johnson, Fred A.

    2015-01-01

    We consider the putative tradeoff between optimization and resilience in the management of natural resources, using a framework that incorporates different sources of uncertainty that are common in natural resources management. We address one-time decisions, and then expand the decision context to the more complex problem of iterative decision making. For both cases we focus on two key sources of uncertainty: partial observability of system state and uncertainty as to system dynamics. Optimal management strategies will vary considerably depending on the timeframe being considered and the amount and quality of information that is available to characterize system features and project the consequences of potential decisions. But in all cases an optimal decision making framework, if properly identified and focused, can be useful in recognizing sound decisions. We argue that under the conditions of deep uncertainty that characterize many resource systems, an optimal decision process that focuses on robustness does not automatically induce a loss of resilience.

  11. Asteroid models from photometry and complementary data sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaasalainen, Mikko

    I discuss inversion methods for asteroid shape and spin reconstruction with photometry (lightcurves) and complementary data sources such as adaptive optics or other images, occultation timings, interferometry, and range-Doppler radar data. These are essentially different sampling modes (generalized projections) of plane-of-sky images. An important concept in this approach is the optimal weighting of the various data modes. The maximum compatibility estimate, a multi-modal generalization of the maximum likelihood estimate, can be used for this purpose. I discuss the fundamental properties of lightcurve inversion by examining the two-dimensional case that, though not usable in our three-dimensional world, is simple to analyze, and it shares essentially the same uniqueness and stability properties as the 3-D case. After this, I review the main aspects of 3-D shape representations, lightcurve inversion, and the inclusion of complementary data.

  12. The Exercise: An Exercise Generator Tool for the SOURCe Project

    ERIC Educational Resources Information Center

    Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios

    2016-01-01

    The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…

  13. SolTrace | Concentrating Solar Power | NREL

    Science.gov Websites

    SolTrace is available as an NREL packaged distribution or as source code from the SolTrace open source project website. The code uses a Monte-Carlo ray-tracing methodology to model concentrating solar power optical systems.
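
    The Monte-Carlo ray-tracing idea behind codes like SolTrace can be illustrated with a toy sketch: sample ray landing positions subject to Gaussian optical errors and count the fraction intercepted by a circular receiver. The function name, error model, and parameters below are illustrative assumptions, not SolTrace's actual implementation.

```python
import math
import random

def intercept_factor(n_rays, beam_sigma, receiver_radius, seed=42):
    """Monte-Carlo estimate of the fraction of reflected rays landing on
    a circular receiver, assuming isotropic Gaussian optical errors.

    A toy stand-in for what full ray-tracing codes such as SolTrace
    compute with far more detailed optics.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_rays):
        # Sample the ray's landing offset in the receiver plane.
        x = rng.gauss(0.0, beam_sigma)
        y = rng.gauss(0.0, beam_sigma)
        if math.hypot(x, y) <= receiver_radius:
            hits += 1
    return hits / n_rays
```

    For a receiver much larger than the beam spread the estimate approaches 1; shrinking the receiver or increasing the optical error lowers the intercept factor.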

  14. Utah Southwest Regional Geothermal Development Operations Research Project. Appendix 10 of regional operations research program for development of geothermal energy in the Southwest United States. Final technical report, June 1977--August 1978

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Green, Stanley; Wagstaff, Lyle W.

    1979-01-01

    The Southwest Regional Geothermal Operations/Research project was initiated to investigate geothermal development in the five states within the region: Arizona, Colorado, Nevada, New Mexico, and Utah. Although the region changed during the first year to include Idaho, Montana, North Dakota, South Dakota, and Wyoming, the project objectives and procedures remained unchanged. The project was funded by the DOE/DGE and the Four Corners Regional Commission with participation by the New Mexico Energy Resources Board. The study was coordinated by the New Mexico Energy Institute at New Mexico State University, acting through a 'Core Team'. A 'state' team, assigned by the states, conducted the project within each state. This report details most of the findings of the first year's efforts by the Utah Operations/Research team. It is a conscientious effort to report the findings and activities of the Utah team, either explicitly or by reference. The results are neither comprehensive nor final and should be regarded as preliminary steps toward much of what the Operations/Research project was envisioned to accomplish. In some cases the report is probably too detailed, in others too vague; it is hoped, however, that the material in the report, combined with the Appendices, will serve as source material for others interested in geothermal development in Utah.

  15. Kinematic properties of solar coronal mass ejections: Correction for projection effects in spacecraft coronagraph measurements

    NASA Astrophysics Data System (ADS)

    Howard, T. A.; Nandy, D.; Koepke, A. C.

    2008-01-01

    One of the main sources of uncertainty in quantifying the kinematic properties of coronal mass ejections (CMEs) using coronagraphs is the fact that coronagraph images are projected onto the sky plane, resulting in measurements that can differ significantly from the actual values. By identifying solar surface source regions of CMEs using X-ray and Hα flare and disappearing filament data, and through considerations of CME trajectories in three-dimensional (3-D) geometry, we have devised a methodology to correct for the projection effect. We outline this method here. The methodology was automated and applied to over 10,000 CMEs in the Coordinated Data Analysis Workshop (CDAW) (SOHO Large Angle Spectroscopic Coronagraph) catalog spanning 1996-2005, in which we could associate 1961 CMEs with an appropriate surface event. For this subset, deprojected speeds, accelerations, and launch angles were determined to study CME kinematics. Our analysis of this subset of events reconfirms some important trends: notably, previously uncovered solar-cycle variations of CME properties are preserved, CMEs with greater width have higher speeds, and slower CMEs tend to accelerate while faster CMEs tend to decelerate. This indicates that statistical trends in CME properties recovered from plane-of-sky measurements may be preserved even in the face of more sophisticated 3-D measurements from spacecraft such as STEREO, if CME trajectories are predominantly radial. However, our results also show that the magnitude of corrected measurements can differ significantly from the projected plane-of-sky measurements on a case-by-case basis, and that acceleration is more sensitive to the deprojection process than speed. Average corrected speeds and accelerations are factors of 1.7 and 4.4 higher than their projected values, respectively, with mean corrected speed and acceleration magnitudes on the order of 1000 km/s and 50 m/s².
We conclude that while plane-of-sky measurements may be suitable for studies of general trends in a large sample of events, correcting for projection effects is mandatory for investigations that rely on a numerically precise determination of the properties of individual CMEs.
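
    For a purely radial trajectory, the plane-of-sky speed relates to the true radial speed through the angle between the trajectory and the sky plane. The simple cosine geometry sketched below is an illustration of that relationship, not the authors' full 3-D correction method; the function name and angle convention are assumptions.

```python
import math

def deproject_speed(v_projected, launch_angle_deg):
    """Approximate deprojection of a plane-of-sky CME speed.

    Assumes a purely radial trajectory inclined `launch_angle_deg`
    out of the plane of the sky (0 deg means the CME moves entirely
    in the sky plane, so no correction is needed). Illustrative only.
    """
    return v_projected / math.cos(math.radians(launch_angle_deg))
```

    Under this toy geometry, a CME launched 60 degrees out of the sky plane with a measured 600 km/s plane-of-sky speed would have a radial speed of roughly 1200 km/s.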

  16. Use of satellite images for the monitoring of water systems

    NASA Astrophysics Data System (ADS)

    Hillebrand, Gudrun; Winterscheid, Axel; Baschek, Björn; Wolf, Thomas

    2015-04-01

    Satellite images are a proven source of information for monitoring ecological indicators in coastal waters and inland river systems. This potential of remote sensing products was demonstrated by recent research projects (e.g. the EU-funded project Freshmon - www.freshmon.eu) and by other activities of national institutions. Among indicators for water quality, a particular focus was set on the temporal and spatial dynamics of suspended particulate matter (SPM) and Chlorophyll-a (Chl-a). The German Federal Institute of Hydrology (BfG) used the Weser and Elbe estuaries as test cases to compare in-situ measurements with results obtained from a temporal series of automatically generated maps of SPM distributions based on remote sensing data. Maps of SPM and Chl-a distributions in European inland rivers and alpine lakes were generated by the Freshmon Project. Earth observation based products are a valuable source of additional data that can well supplement in-situ monitoring. For 2015, the BfG and the Institute for Lake Research of the State Institute for the Environment, Measurements and Nature Conservation of Baden-Wuerttemberg, Germany (LUBW) are preparing to implement an operational service for monitoring SPM and Chl-a based on satellite images (Landsat 7 & 8, Sentinel 2 and, if required, other systems with higher spatial resolution, e.g. RapidEye). In this 2-year project, which is part of the European Copernicus Programme, the operational service will be set up for the inland rivers Rhine and Elbe; the North Sea estuaries of Elbe, Weser and Ems; and Lake Constance together with other lakes located within the Federal State of Baden-Wuerttemberg. In future, the service can be implemented for other rivers and lakes as well. A key feature of the project is a database that holds the stock of geo-referenced maps of SPM and Chl-a distributions. Via web-based portals (e.g. 
GGInA, the geo-portal of the BfG; UIS, the environmental information system of the Federal State of Baden-Wuerttemberg; BOWIS, the information system for Lake Constance) the maps will be made accessible to the public. The aim of the project is to implement a service that automatically recognizes new satellite images covering the area of selected water systems (lake, river or estuary) and is therefore able to continually update the database. Furthermore, the service includes a procedure to analyse newly available data with the highest possible degree of automatization. It is planned to add new maps of SPM and Chl-a distributions to the database within a couple of days after a satellite image is taken. A high degree of automatization is the essential condition for processing a large number of satellite images each year at reasonable cost. The Freshmon Project demonstrated that simplified but robust algorithms and procedures exist. For the successful implementation of the service, it is important to further validate the results obtained by the service line as well as the procedures and algorithms used. Therefore, several test cases will be set up. Each case will include an analysis of the uncertainties to describe the expected deviation between values derived from earth observation data and the in-situ data obtained from the BfG and LUBW monitoring networks. Furthermore, it will include a description of possible sources of error and of the boundary conditions to which the analysis is most sensitive. The test cases are planned to be made public with all necessary data. The scientific community is invited to use the data as a benchmark test case to develop their own algorithms and procedures.

  17. AKM in Open Source Communities

    NASA Astrophysics Data System (ADS)

    Stamelos, Ioannis; Kakarontzas, George

    Previous chapters in this book have dealt with Architecture Knowledge Management in traditional Closed Source Software (CSS) projects. This chapter will attempt to examine the ways that knowledge is shared among participants in Free/Libre Open Source Software (FLOSS) projects and how architectural knowledge is managed relative to CSS projects. FLOSS projects are organized and developed in a fundamentally different way than CSS projects; they simply do not develop code as CSS projects do. As a consequence, their knowledge management mechanisms are also based on different concepts and tools.

  18. Implementation of a community greenhouse in a remote, sub-Arctic First Nations community in Ontario, Canada: a descriptive case study.

    PubMed

    Skinner, K; Hanning, R M; Metatawabin, J; Tsuji, L J S

    2014-01-01

    Food insecurity is prevalent in northern communities in Canada, and there is a movement to improve food security both through the revitalization of traditional harvesting practices and through sustainable agriculture initiatives. Gardening in northern communities can be difficult and may be aided by a community greenhouse. The objective of this project was to conduct a descriptive case study of the context and process surrounding the implementation of a community greenhouse in a remote, sub-Arctic First Nations community in Ontario, Canada. Data sources included semi-directed interviews with a purposive and snowball sample of key informants (n=14), direct observations (n=32 days), written documentation (n=107), and photo-documentation (n=621 total). Digital photographs were taken both by a university investigator during community visits and by a community investigator throughout the entire project. The case study was carried out over 33 months, from early 2009 until October 2011. Thematic data analyses were conducted following a categorical aggregation approach. Categories emerging from the data were assigned gardening-related themes: seasons, fertile ground, sustainability, gardeners, ownership, participant growth, and sunshine. Local champions were critical to project success. Uncertainty was expressed by several participants regarding ownership of the greenhouse; the local community members who championed the project had to emphasize, repeatedly, that it was community owned. Positive outcomes included the involvement of many community members, a host of related activities, and the greenhouse serving as a learning opportunity to gain knowledge about growing plants in a northern greenhouse setting. A strength of the project was that many children participated in greenhouse activities. Community and school greenhouse projects require local champions to be successful.
It is important to establish guidelines around ownership of a greenhouse and suitable procedures for making the building accessible to everyone without compromising security. Implementing a greenhouse project can engage community members, including children, and provide a great learning opportunity for gardeners in a remote, northern community.

  19. Semantic eScience for Ecosystem Understanding and Monitoring: The Jefferson Project Case Study

    NASA Astrophysics Data System (ADS)

    McGuinness, D. L.; Pinheiro da Silva, P.; Patton, E. W.; Chastain, K.

    2014-12-01

    Monitoring and understanding ecosystems such as lakes and their watersheds is becoming increasingly important. Accelerated eutrophication threatens our drinking water sources. Many believe that the use of nutrients (e.g., road salts, fertilizers, etc.) near these sources may have negative impacts on animal and plant populations and water quality although it is unclear how to best balance broad community needs. The Jefferson Project is a joint effort between RPI, IBM and the Fund for Lake George aimed at creating an instrumented water ecosystem along with an appropriate cyberinfrastructure that can serve as a global model for ecosystem monitoring, exploration, understanding, and prediction. One goal is to help communities understand the potential impacts of actions such as road salting strategies so that they can make appropriate informed recommendations that serve broad community needs. Our semantic eScience team is creating a semantic infrastructure to support data integration and analysis to help trained scientists as well as the general public to better understand the lake today, and explore potential future scenarios. We are leveraging our RPI Tetherless World Semantic Web methodology that provides an agile process for describing use cases, identification of appropriate background ontologies and technologies, implementation, and evaluation. IBM is providing a state-of-the-art sensor network infrastructure along with a collection of tools to share, maintain, analyze and visualize the network data. 
In the context of this sensor infrastructure, we will discuss our semantic approach's contributions in three knowledge representation and reasoning areas: (a) human interventions on the deployment and maintenance of local sensor networks including the scientific knowledge to decide how and where sensors are deployed; (b) integration, interpretation and management of data coming from external sources used to complement the project's models; and (c) knowledge about simulation results including parameters, interpretation of results, and comparison of results against external data. We will also demonstrate some example queries highlighting the benefits of our semantic approach and will also identify reusable components.

  20. A Requirements-Based Exploration of Open-Source Software Development Projects--Towards a Natural Language Processing Software Analysis Framework

    ERIC Educational Resources Information Center

    Vlas, Radu Eduard

    2012-01-01

    Open source projects do have requirements; they are, however, mostly informal, text descriptions found in requests, forums, and other correspondence. Understanding such requirements provides insight into the nature of open source projects. Unfortunately, manual analysis of natural language requirements is time-consuming, and for large projects,…

  1. New era / new solutions: The role of alternative tariff structures in water supply projects.

    PubMed

    Pinto, F Silva; Marques, R Cunha

    2017-12-01

    Water utilities face different challenges that may force them to prioritize objectives. When doing so, particular projects may have to be developed, making it important to understand their impact on water tariffs and thus on customers. Such consequences bear increased relevance in contexts stressed by, e.g., resource scarcity, poverty, and the need for infrastructure investments. The resulting cost and revenue variability demands a comprehensive study: while the former may require stochastic modeling of major cost components to capture their inherent uncertainty, the latter needs to be modeled following context-specific objectives set by the relevant stakeholders. The solutions achieved will likely promote distinct revenue sources, as well as diversified water tariff structures. A multi-objective optimization model (i.e., a Framework for Suitable Prices) is built to deal with these diversified requirements (e.g., stochastic energy costs, affordability, cost recovery, or administrative simplicity). The model is solved through achievement scalarizing functions with several weighting coefficients for a reference point, so as to give decision makers a meaningful picture of possible revenue options and their impacts. The proposed method is applied to a case study, Boa Vista Island in Cabo Verde, in which the background characteristics, namely water source availability (e.g., the adoption of desalination technologies), economic development and other contextual factors, were considered. The key role of tariff structure selection is demonstrated, instead of assuming it a priori, giving important insights regarding project feasibility. Copyright © 2017 Elsevier Ltd. All rights reserved.
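
    The achievement scalarizing approach used to solve such multi-objective models can be sketched in its classic Wierzbicki form. The function below is a generic textbook illustration; the name, the augmentation coefficient rho, and the exact weighting scheme are assumptions, not the paper's precise formulation.

```python
def achievement_scalarizing(objectives, reference, weights, rho=1e-4):
    """Wierzbicki-style achievement scalarizing function (minimization).

    objectives: current objective values f_i
    reference:  aspiration levels z_i of the reference point
    weights:    positive weighting coefficients w_i
    rho:        small augmentation coefficient that discourages weakly
                Pareto-optimal solutions
    """
    terms = [w * (f - z) for f, z, w in zip(objectives, reference, weights)]
    return max(terms) + rho * sum(terms)
```

    Minimizing this scalar over the feasible set, for several choices of weights, yields the spread of Pareto-optimal revenue options the abstract describes.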

  2. Energy Upgrades at City-Owned Facilities: Understanding Accounting for Energy Efficiency Financing Options. City of Dubuque Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leventis, Greg; Schiller, Steve; Kramer, Chris

    The city of Dubuque, Iowa, aimed for a twofer: lower energy costs for public facilities and reduced air emissions. To achieve that goal, the city partnered with the Iowa Economic Development Authority to establish a revolving loan fund to finance energy efficiency and other energy projects at city facilities. But the city needed to understand approaches for financing energy projects that would achieve both goals in a manner not considered debt (in this case, obligations booked as a liability on the city's balance sheet). With funding from the U.S. Department of Energy's Climate Action Champions Initiative, Lawrence Berkeley National Laboratory (Berkeley Lab) provided technical assistance to the city to identify strategies to achieve these goals. Revolving loans use a source of money to fund initial cost-saving projects, such as energy efficiency investments, then use the repayments and interest from these loans to support subsequent projects. Berkeley Lab and the city examined two approaches to explore whether revolving loans could potentially be treated as non-debt: 1) financing arrangements containing a non-appropriation clause and 2) shared savings agreements. This fact sheet discusses both, including considerations that may factor into their treatment as debt from an accounting perspective.
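
    The revolving-fund mechanics described above (repayments and interest replenishing the fund for subsequent projects) can be sketched as a toy simulation. All parameters, the function name, and the assumption that each project repays in full within a year are hypothetical illustrations, not figures or terms from the Dubuque program.

```python
def simulate_revolving_fund(capital, project_cost, interest_rate, years):
    """Toy model of revolving-loan-fund mechanics (illustrative only).

    Each year the fund finances as many projects as its capital allows;
    borrowers are assumed to repay principal plus interest within the
    year out of their energy savings, replenishing (and slightly
    growing) the fund for subsequent rounds.
    """
    projects_funded = 0
    for _ in range(years):
        n = int(capital // project_cost)
        projects_funded += n
        # Principal returns in full; interest adds to the fund's capital.
        capital += n * project_cost * interest_rate
    return projects_funded, round(capital, 2)
```

    With $100,000 of seed capital, $40,000 projects, 2% interest, and a three-year horizon, this sketch funds six projects and ends with $104,800 of capital available for future rounds.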

  3. Community responses to government defunding of watershed projects: a comparative study in India and the USA.

    PubMed

    Koontz, Tomas M; Sen, Sucharita

    2013-03-01

    When central governments decentralize natural resource management (NRM), they often retain an interest in the local efforts and provide funding for them. Such outside investments can serve an important role in moving community-based efforts forward. At the same time, they can represent risks to the community if government resources are not stable over time. Our focus in this article is on the effects of withdrawal of government resources from community-based NRM. A critical question is how to build institutional capacity to carry on when the government funding runs out. This study compares institutional survival and coping strategies used by community-based project organizations in two different contexts, India and the United States. Despite higher links to livelihoods, community participation, and private benefits, efforts in the Indian cases exhibited lower survival rates than did those in the U.S. cases. Successful coping strategies in the U.S. context often involved tapping into existing institutions and resources. In the Indian context, successful coping strategies often involved building broad community support for the projects and creatively finding additional funding sources. On the other hand, the lack of local community interest, due to the top-down development approach and sometimes narrow benefit distribution, often challenged organizational survival and project maintenance.

  4. Community Responses to Government Defunding of Watershed Projects: A Comparative Study in India and the USA

    NASA Astrophysics Data System (ADS)

    Koontz, Tomas M.; Sen, Sucharita

    2013-03-01

    When central governments decentralize natural resource management (NRM), they often retain an interest in the local efforts and provide funding for them. Such outside investments can serve an important role in moving community-based efforts forward. At the same time, they can represent risks to the community if government resources are not stable over time. Our focus in this article is on the effects of withdrawal of government resources from community-based NRM. A critical question is how to build institutional capacity to carry on when the government funding runs out. This study compares institutional survival and coping strategies used by community-based project organizations in two different contexts, India and the United States. Despite higher links to livelihoods, community participation, and private benefits, efforts in the Indian cases exhibited lower survival rates than did those in the U.S. cases. Successful coping strategies in the U.S. context often involved tapping into existing institutions and resources. In the Indian context, successful coping strategies often involved building broad community support for the projects and creatively finding additional funding sources. On the other hand, the lack of local community interest, due to the top-down development approach and sometimes narrow benefit distribution, often challenged organizational survival and project maintenance.

  5. Integrated care networks and quality of life: linking research and practice

    PubMed Central

    Warner, Morton; Gould, Nicholas

    2003-01-01

    Abstract Purpose To report on the development of a project dedicated to improving the quality of life of older people through the creation of integrated networks. Context The project is set within a post-industrial community and against a backdrop of government re-organisation and devolution within Wales. The immediate research context is determined by utilising an approach to the structure of integration derived theoretically. Case description Project CHAIN (Community Health Alliances through Integrated Networks) adopts a network perspective as a means of addressing both the determinants of health and service delivery in health and social care. The Project partners are: healthcare commissioners and providers; local authority directorates including community services and transportation; the voluntary and private sectors; and a university institute. Co-opted participants include fora representing older people's interests. Data sources The Project incorporates an action research method. This paper highlights qualitative data elicited from interviews with health and social care managers and practitioners. Conclusions and discussion The Project is ongoing and we record progress in building five integrated networks. PMID:16896421

  6. Application of Open Source Software by the Lunar Mapping and Modeling Project

    NASA Astrophysics Data System (ADS)

    Ramirez, P.; Goodale, C. E.; Bui, B.; Chang, G.; Kim, R. M.; Law, E.; Malhotra, S.; Rodriguez, L.; Sadaqathullah, S.; Mattmann, C. A.; Crichton, D. J.

    2011-12-01

    The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is responsible for the development of an information system to support lunar exploration, decision analysis, and release of lunar data to the public. The data available through the lunar portal is predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). This project has created a gold source of data, models, and tools for lunar explorers to exercise and incorporate into their activities. At the Jet Propulsion Laboratory (JPL), we focused on engineering and building the infrastructure to support cataloging, archiving, accessing, and delivery of lunar data. We decided to use a RESTful service-oriented architecture to enable us to abstract from the underlying technology choices and focus on interfaces to be used internally and externally. This decision allowed us to leverage several open source software components and integrate them by either writing a thin REST service layer or relying on the API they provided; the approach chosen was dependent on the targeted consumer of a given interface. We will discuss our varying experience using open source products, namely Apache OODT, Oracle Berkeley DB XML, Apache Solr, and Oracle OpenSSO (now named OpenAM). Apache OODT, developed at NASA's Jet Propulsion Laboratory and recently migrated over to Apache, provided the means for ingestion and cataloguing of products within the infrastructure. Its usage was based upon team experience with the project and past benefit received on other projects internal and external to JPL. Berkeley DB XML, distributed by Oracle for both commercial and open source use, was the storage technology chosen for our metadata.
This decision was in part based on our use of Federal Geographic Data Committee (FGDC) Metadata, which is expressed in XML, and the desire to keep it in its native form and exploit other technologies built on top of XML. Apache Solr, an open source search engine, was used to drive our search interface and as a way to store references to metadata and data exposed via REST endpoints. As was the case with Apache OODT, there was team experience with this component that helped drive the choice. Lastly, OpenSSO, an open source single sign-on service, was used to secure and provide access constraints for our REST-based services. For this product there was little past experience, but given our service-based approach it seemed a natural fit. Given our exposure to open source, we will discuss the tradeoffs and benefits of the choices made. Moreover, we will dive into the context of how the software packages were used and the impact their design and extensibility had on the construction of the infrastructure. Finally, we will compare our experiences across the open source solutions and the attributes that can shape the impression one gets of them. This comprehensive account of our endeavor should aid others in their assessment and use of open source.

  7. Towards data integration automation for the French rare disease registry.

    PubMed

    Maaroufi, Meriem; Choquet, Rémy; Landais, Paul; Jaulent, Marie-Christine

    2015-01-01

    Building a medical registry upon an existing infrastructure and rooted practices is not an easy task. This is the case for the BNDMR project, the French rare disease registry, which aims to collect administrative and medical data on rare disease patients seen in different hospitals. To avoid duplicating data entry for health professionals, the project plans to deploy connectors to the existing systems to automatically retrieve data. Given the data heterogeneity and the large number of source systems, automating connector creation is required. In this context, we propose a methodology that optimizes the use of existing alignment approaches in the data integration process. The generated mappings are formalized in exploitable mapping expressions. Following this methodology, the process was tested on specific data types of a source system: Booleans and predefined lists. As a result, the effectiveness of the alignment approach was enhanced and more good mappings were detected. Nonetheless, further improvements could be made to deal with semantic issues and to process other data types.
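
    The idea of exploitable mapping expressions can be sketched for the two data types the abstract mentions, Booleans and predefined lists: each expression maps a source-system code to a target registry code. The function, the mapping tables, and the French labels below are hypothetical illustrations, not the BNDMR project's actual formalism, which is richer than a lookup table.

```python
def apply_mapping(value, mapping, default=None):
    """Apply a value-level mapping expression from a source-system code
    to a target registry code (illustrative sketch only).
    """
    # Normalize free-text source codes before lookup.
    key = value.strip().lower() if isinstance(value, str) else value
    return mapping.get(key, default)

# Hypothetical mapping for a Boolean field captured as French labels
# in a source hospital system.
BOOLEAN_MAP = {"oui": True, "non": False, True: True, False: False}

# Hypothetical mapping for a predefined list (sex labels to one-letter codes).
SEX_MAP = {"masculin": "M", "feminin": "F"}
```

    A connector generated from such expressions can then translate each retrieved field without any manual re-entry by health professionals.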

  8. Scaling of X pinches from 1 MA to 6 MA.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bland, Simon Nicholas; McBride, Ryan D.; Wenger, David Franklin

    This final report for Project 117863 summarizes progress made toward understanding how X-pinch load designs scale to high currents. The X-pinch load geometry was conceived in 1982 as a method to study the formation and properties of bright x-ray spots in z-pinch plasmas. X-pinch plasmas driven by 0.2 MA currents were found to have source sizes of 1 micron, temperatures >1 keV, lifetimes of 10-100 ps, and densities >0.1 times solid density. These conditions are believed to result from the direct magnetic compression of matter. Physical models that capture the behavior of 0.2 MA X pinches predict more extreme parameters at currents >1 MA. This project developed load designs for up to 6 MA on the SATURN facility and attempted to measure the resulting plasma parameters. Source sizes of 5-8 microns were observed in some cases along with evidence for high temperatures (several keV) and short time durations (<500 ps).

  9. Towards data integration automation for the French rare disease registry

    PubMed Central

    Maaroufi, Meriem; Choquet, Rémy; Landais, Paul; Jaulent, Marie-Christine

    2015-01-01

    Building a medical registry upon an existing infrastructure and rooted practices is not an easy task. This is the case for the BNDMR project, the French rare disease registry, which aims to collect administrative and medical data on rare disease patients seen in different hospitals. To avoid duplicating data entry for health professionals, the project plans to deploy connectors to the existing systems to automatically retrieve data. Given the data heterogeneity and the large number of source systems, automating connector creation is required. In this context, we propose a methodology that optimizes the use of existing alignment approaches in the data integration process. The generated mappings are formalized in exploitable mapping expressions. Following this methodology, the process was tested on specific data types of a source system: Booleans and predefined lists. As a result, the effectiveness of the alignment approach was enhanced and more good mappings were detected. Nonetheless, further improvements could be made to deal with semantic issues and to process other data types. PMID:26958224

  10. A Mission in the Desert: Albuquerque District, 1935-1985

    DTIC Science & Technology

    1985-01-01

    Engineers came into New Mexico in 1935 to construct its first project near Tucumcari, the Engineers began to develop a knowledge of the political...agency, as a local unit of the federal government in cases of civil emergency, and as a source of engineering knowledge for Southwest engineering...the magnitude of this book could reach completion without the involvement of many people at every stage of development. The assistance and knowledge

  11. The Demand for Scientific and Technical Manpower in Selected Energy-Related Industries, 1970-85: A Methodology Applied to a Selected Scenario of Energy Output. A Summary.

    ERIC Educational Resources Information Center

    Gutmanis, Ivars; And Others

    The primary purpose of the study was to develop and apply a methodology for estimating the need for scientists and engineers by specialty in energy and energy-related industries. The projections methodology was based on the Case 1 estimates by the National Petroleum Council of the results of "maximum efforts" to develop domestic fuel sources by…

  12. AMON: Transition to real-time operations

    NASA Astrophysics Data System (ADS)

    Cowen, D. F.; Keivani, A.; Tešić, G.

    2016-04-01

    The Astrophysical Multimessenger Observatory Network (AMON) will link the world's leading high-energy neutrino, cosmic-ray, gamma-ray and gravitational wave observatories by performing real-time coincidence searches for multimessenger sources across the observatories' subthreshold data streams. The resulting coincidences will be distributed to interested parties in the form of electronic alerts for real-time follow-up observation. We will present the science case, design elements, current and projected partner observatories, status of the AMON project, and an initial AMON-enabled analysis. The prototype of the AMON server has been online since August 2014, processing archival data. We are currently deploying new high-uptime servers and will be ready to start issuing alerts as early as winter 2015/16.
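
    The core of a real-time coincidence search of the kind AMON performs can be sketched as a windowed match over time-sorted event streams. This minimal version (the function name and window semantics are assumptions) deliberately ignores the sky-position and per-event significance weighting a production analysis applies.

```python
def coincident_pairs(times_a, times_b, window):
    """Return (t_a, t_b) pairs from two time-sorted event streams that
    lie within `window` seconds of each other.

    A toy two-stream temporal coincidence search; real multimessenger
    analyses also fold in angular separation and event significance.
    """
    pairs = []
    j = 0
    for ta in times_a:
        # Skip stream-B events that are too early to match ta.
        while j < len(times_b) and times_b[j] < ta - window:
            j += 1
        k = j
        while k < len(times_b) and times_b[k] <= ta + window:
            pairs.append((ta, times_b[k]))
            k += 1
    return pairs
```

    Because both streams are scanned once with advancing pointers, the search runs in time linear in the number of events plus matches, which is what makes subthreshold real-time operation feasible.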

  13. NOBLAST and JAMBLAST: New Options for BLAST and a Java Application Manager for BLAST results.

    PubMed

    Lagnel, Jacques; Tsigenopoulos, Costas S; Iliopoulos, Ioannis

    2009-03-15

    NOBLAST (New Options for BLAST) is an open source program that provides a new user-friendly tabular output format for various NCBI BLAST programs (Blastn, Blastp, Blastx, Tblastn, Tblastx, Mega BLAST and Psi BLAST) without any use of a parser, and provides E-value correction when a segmented BLAST database is used. JAMBLAST, using the NOBLAST output, allows the user to manage, view and filter the BLAST hits using a number of selection criteria. A distribution package of NOBLAST and JAMBLAST, including a detailed installation procedure, is freely available from http://sourceforge.net/projects/JAMBLAST/ and http://sourceforge.net/projects/NOBLAST. Supplementary data are available at Bioinformatics online.
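
    The E-value correction mentioned above addresses a known property of BLAST statistics: the expect value E = K·m·n·e^(−λS) scales linearly with the effective database size n, so a hit scored against one segment of a split database understates its E-value. A minimal sketch of such a rescaling (the function name and the linear-rescaling simplification are ours for illustration, not NOBLAST's actual implementation):

```python
def corrected_evalue(e_segment, segment_length, total_db_length):
    """Rescale a per-segment BLAST E-value to the full database.

    Because E = K*m*n*exp(-lambda*S) is linear in the search-space size n,
    multiplying by the ratio of residue counts gives the database-wide E-value.
    """
    return e_segment * (total_db_length / segment_length)

# A hit reported with E = 1e-6 against a 25-Mb segment of a 100-Mb database:
print(corrected_evalue(1e-6, 25e6, 100e6))  # 4e-06
```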

  14. The GenABEL Project for statistical genomics.

    PubMed

    Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.

  15. Climate downscaling effects on predictive ecological models: a case study for threatened and endangered vertebrates in the southeastern United States

    USGS Publications Warehouse

    Bucklin, David N.; Watling, James I.; Speroterra, Carolina; Brandt, Laura A.; Mazzotti, Frank J.; Romañach, Stephanie S.

    2013-01-01

    High-resolution (downscaled) projections of future climate conditions are critical inputs to a wide variety of ecological and socioeconomic models and are created using numerous different approaches. Here, we conduct a sensitivity analysis of spatial predictions from climate envelope models for threatened and endangered vertebrates in the southeastern United States to determine whether two different downscaling approaches (with and without the use of a regional climate model) affect climate envelope model predictions when all other sources of variation are held constant. We found that prediction maps differed spatially between downscaling approaches and that the variation attributable to downscaling technique was comparable to variation between maps generated using different general circulation models (GCMs). Precipitation variables tended to show greater discrepancies between downscaling techniques than temperature variables, and for one GCM, there was evidence that poorly resolved precipitation variables contributed more to model uncertainty than well-resolved variables. Our work suggests that ecological modelers requiring high-resolution climate projections should carefully consider the type of downscaling applied to the climate projections prior to their use in predictive ecological modeling. The uncertainty associated with alternative downscaling methods may rival that of other, more widely appreciated sources of variation, such as the general circulation model or emissions scenario with which future climate projections are created.
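
    The comparison described above, variation attributable to downscaling technique versus variation across GCMs, can be sketched as a cell-wise spread computation over a stack of prediction maps. The random maps below are placeholders for real model output; the paper's actual sensitivity analysis is considerably more involved:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical habitat-suitability maps indexed by (gcm, downscaling, cell).
n_gcm, n_down, n_cells = 3, 2, 1000
maps = rng.random((n_gcm, n_down, n_cells))

# Spread across downscaling techniques, averaged over GCMs and cells:
spread_downscaling = np.ptp(maps, axis=1).mean()
# Spread across GCMs, averaged over downscaling techniques and cells:
spread_gcm = np.ptp(maps, axis=0).mean()

# Comparable magnitudes would indicate downscaling choice matters as much
# as GCM choice for these predictions.
print(spread_downscaling, spread_gcm)
```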

  16. Prediction of Bicarbonate Requirements for Enhanced Reductive Bioremediation of Chlorinated Solvent-Contaminated Sites

    NASA Astrophysics Data System (ADS)

    Robinson, C.; Barry, D. A.

    2008-12-01

    Enhanced anaerobic dechlorination is a promising technology for in situ remediation of chlorinated ethene DNAPL source areas. However, the build-up of organic acids and HCl in the source zone can lead to significant groundwater acidification. The resulting pH drop inhibits the activity of the dechlorinating microorganisms and thus may stall the remediation process. Source zone remediation requires extensive dechlorination, such that it may be common for soil's natural buffering capacity to be exceeded, and for acidic conditions to develop. In these cases bicarbonate addition (e.g., NaHCO3, KHCO3) is required for pH control. As a design tool for treatment strategies, we have developed BUCHLORAC, a Windows Graphical User Interface based on an abiotic geochemical model that allows the user to predict the acidity generated during dechlorination and associated buffer requirements for their specific operating conditions. BUCHLORAC was motivated by the SABRE (Source Area BioREmediation) project, which aims to evaluate the effectiveness of enhanced reductive dechlorination in the treatment of chlorinated solvent source zones.
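
    The acidity bookkeeping behind such a buffer-dosing prediction can be sketched stoichiometrically: complete reductive dechlorination of PCE to ethene releases one mole of HCl per chlorine removed (four per mole of PCE), and bicarbonate neutralizes protons one-to-one. This is a simplified illustration of the mass balance, not BUCHLORAC's geochemical model, which also handles carbonate speciation and soil buffering:

```python
def bicarbonate_demand_mol(moles_pce_dechlorinated, extra_acidity_mol=0.0):
    """Moles of HCO3- needed to neutralize dechlorination acidity.

    PCE (C2Cl4) -> ethene liberates 4 mol H+ per mol PCE; HCO3- + H+ -> H2CO3
    consumes protons 1:1. extra_acidity_mol covers organic acids produced by
    fermentation of the electron donor.
    """
    hcl_mol = 4.0 * moles_pce_dechlorinated
    return hcl_mol + extra_acidity_mol

def nahco3_mass_kg(moles_pce_dechlorinated, extra_acidity_mol=0.0):
    """Convert the bicarbonate demand to a NaHCO3 mass (molar mass 84.01 g/mol)."""
    MW_NAHCO3 = 84.01
    return bicarbonate_demand_mol(moles_pce_dechlorinated, extra_acidity_mol) * MW_NAHCO3 / 1000.0

# Fully dechlorinating 100 mol of PCE, plus 50 mol of organic-acid acidity:
print(round(nahco3_mass_kg(100, 50), 2))  # 37.8
```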

  17. Managing research and surveillance projects in real-time with a novel open-source eManagement tool designed for under-resourced countries.

    PubMed

    Steiner, Andreas; Hella, Jerry; Grüninger, Servan; Mhalu, Grace; Mhimbira, Francis; Cercamondi, Colin I; Doulla, Basra; Maire, Nicolas; Fenner, Lukas

    2016-09-01

    A software tool is developed to facilitate data entry and to monitor research projects in under-resourced countries in real time. The eManagement tool "odk_planner" is written in the scripting languages PHP and Python. The odk_planner is lightweight and uses minimal internet resources. It was designed to be used with the open-source software Open Data Kit (ODK). Users can easily configure odk_planner to meet their needs, and the online interface displays data collected from ODK forms in a graphically informative way. The odk_planner also allows users to upload pictures and laboratory results and sends text messages automatically. User-defined access rights protect data and privacy. We present examples from four field applications in Tanzania successfully using the eManagement tool: 1) a clinical trial; 2) a longitudinal tuberculosis (TB) cohort study with a complex visit schedule, where it was used to graphically display missing case report forms, upload digitized X-rays, and send text message reminders to patients; 3) an intervention study to improve TB case detection, carried out at pharmacies: a tablet-based electronic referral system monitored referred patients and sent automated messages to remind pharmacy clients to visit a TB clinic; and 4) TB retreatment case monitoring designed to improve drug resistance surveillance: clinicians at four public TB clinics and lab technicians at the TB reference laboratory used a smartphone-based application that tracked sputum samples and collected clinical and laboratory data. The user-friendly, open-source odk_planner is a simple but multi-functional, Web-based eManagement tool with add-ons that helps researchers conduct studies in under-resourced countries.
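
    The missing-form monitoring described in example 2 amounts to comparing a visit schedule against submitted forms. A generic sketch of that logic (the schedule, function, and data layout are hypothetical, not odk_planner's configuration or API):

```python
from datetime import date

# Hypothetical visit schedule: days after enrollment at which a case report
# form (CRF) is due.
VISIT_SCHEDULE_DAYS = [0, 14, 28, 90]

def overdue_forms(enrollment: date, submitted_days: set, today: date):
    """Return scheduled visit days whose CRF is due but not yet submitted."""
    elapsed = (today - enrollment).days
    return [d for d in VISIT_SCHEDULE_DAYS
            if d <= elapsed and d not in submitted_days]

# A patient enrolled 2016-01-01 who submitted the day-0 and day-14 forms:
enrolled = date(2016, 1, 1)
print(overdue_forms(enrolled, {0, 14}, date(2016, 3, 1)))  # [28]  (day 90 not yet due)
```

    A monitoring dashboard would run a check like this per patient and flag the overdue visits, e.g. as cells in a patient-by-visit grid.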

  18. Transfer of training through a science education professional development program

    NASA Astrophysics Data System (ADS)

    Sowards, Alan Bosworth

    Educational research substantiates that effective professional development models must be developed in order for reform-based teaching strategies to be implemented in classrooms. This study examined the effectiveness of an established reform-based science education professional development program, Project LIFE. The study investigated what impact Project LIFE had on participants' implementation of reform-based instruction in their classrooms three years after participation in the science inservice program. Participants in the case studies described their use of reform-based instruction and the program factors that influenced transfer of training to their classrooms. Subjects of the study were 5th--10th grade teachers who participated in the 1997--98 Project LIFE professional development program. The study employed a mixed design including both qualitative and quantitative methodology. The qualitative data were collected from multiple sources, which included an open-ended survey, classroom observations, structured interviews, and artifacts. Three teachers were purposefully selected for case studies, with teacher approval and authorization from building principals. Interview responses from the three case studies were further analyzed qualitatively using the microcomputer software NUD*IST. Tables and figures generated from NUD*IST graphically represented the case study teachers' responses and cross-case comparisons across six established categories: (1) continued implementation of reform-based instruction, (2) use of reform-based instruction, (3) program factors supporting transfer of training, (4) professional development, (5) goals of Project LIFE, and (6) critical issues in science education. Paired t-tests were used to analyze the quantitative data collected from the Survey of Attitudes Toward Science and Science Teaching. The study concluded that the 1997--98 Project LIFE participants continued to implement reform-based instruction in their classrooms three years later. 
According to the teachers, the program factors having the most influence on transferring training to their classrooms were the positive responses from students; reflections with other teachers regarding instructional activities and strategies; the modeling of activities and strategies by Project LIFE staff while they participated in the program; and the teachers' commitment to reform-based instruction. These findings are important in advancing national science reform goals. In order for teachers to be able to implement science-reform-based instruction in their classrooms, they must experience effective professional development models. Designers of professional development programs must understand which factors in staff development programs contribute most to transfer of training.

  19. Tsunami hazard assessment for the island of Rhodes, Greece

    NASA Astrophysics Data System (ADS)

    Pagnoni, Gianluca; Armigliato, Alberto; Zaniboni, Filippo; Tinti, Stefano

    2013-04-01

    The island of Rhodes is part of the Dodecanese archipelago and is one of the many islands found in the Aegean Sea. The tectonics of the Rhodes area is rather complex, involving both strike-slip and dip-slip (mainly thrust) processes. Tsunami catalogues (e.g. Papadopoulos et al., 2007) show the relatively high frequency of tsunamis in this area, some of them destructive, in particular between the coasts of Rhodes and Turkey. The town of Rhodes, the capital and also the island's largest and most populated city, is located in this part of the island. Rhodes is historically famous for the Colossus of Rhodes, which collapsed following an earthquake, and is nowadays a popular tourist destination. This work focuses on tsunami hazard assessment, with research performed in the framework of the European project NearToWarn. The hazard is assessed using the worst-credible-case scenario, a method introduced and used to study local tsunami hazard in coastal towns such as Catania, Italy, and Alexandria, Egypt (Tinti et al., 2012). Three tsunami sources were chosen for building scenarios: two local sources located in the sea area off the Turkish coast, where events are most frequent, and one distant source. The first source is taken from Ebeling et al. (2012), modified by UNIBO, and models the earthquake and small tsunami that occurred on 25 April 1957. The second source is a landslide derived from the TRANSFER Project "Database of Tsunamigenic Non-Seismic Sources" and coincides with the so-called "Northern Rhodes Slide", possibly responsible for the 24 March 2002 tsunami. The last source is the fault located close to the island of Crete, believed to be responsible for the tsunami event of 1303 that was reported to have caused damage in the city of Rhodes. 
The simulations are carried out using the finite-difference code UBO-TSUFD, which solves the Navier-Stokes equations in the shallow-water approximation. To cover the entire basin, two nested grids (a coarse one with 30 arcsec resolution and a finer one with 200 m resolution) are used, constructed on bathymetry data provided by the TRANSFER database. The results, as fields of highest wave elevation, maximum flooding, maximum speed, arrival times and synthetic tide-gauge records, are provided and discussed both individually (i.e. separately for each source) and in the form of a single, aggregate result, as required by the worst-case scenario technique. References: Ebeling, C.W., Okal, E.A., Kalligeris, N., and Synolakis, C.E.: Modern seismological reassessment and tsunami simulation of historical Hellenic Arc earthquakes, Tectonophysics, 530-531, 225-239, 2012. Papadopoulos, G. A., Daskalaki, E., Fokaefs, A., and Giraleas, N.: Tsunami hazards in the Eastern Mediterranean: strong earthquakes and tsunamis in the East Hellenic Arc and Trench system, Nat. Hazards Earth Syst. Sci., 7, 57-64, doi:10.5194/nhess-7-57-2007, 2007. Tinti, S., Pagnoni, G., Armigliato, A., and Tonini, R.: Tsunami inundation scenarios and tsunami vulnerability assessment for the town of Alexandria, Egypt, Geophysical Research Abstracts Vol. 14, EGU2012-10325, EGU General Assembly 2012.
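
    The aggregation step required by the worst-case scenario technique, combining per-source result fields into a single envelope, is a cell-wise maximum over the scenario grids. A minimal sketch with made-up 2×2 elevation fields standing in for the actual model output:

```python
import numpy as np

# Hypothetical maximum-wave-elevation fields (m) from three tsunami scenarios
# computed on a common grid; the worst-case aggregate keeps the cell-wise maximum.
scenario_1957  = np.array([[0.2, 0.5], [1.1, 0.3]])
scenario_slide = np.array([[0.4, 0.1], [0.9, 0.8]])
scenario_1303  = np.array([[0.3, 0.6], [0.2, 0.4]])

worst_case = np.maximum.reduce([scenario_1957, scenario_slide, scenario_1303])
print(worst_case)  # [[0.4 0.6]
                   #  [1.1 0.8]]
```

    The same reduction applies to any of the listed result fields (flooding, speed), while arrival times would instead take the cell-wise minimum.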

  20. Finding the forest in the trees: The challenge of combining diverse environmental data

    NASA Technical Reports Server (NTRS)

    1995-01-01

    It has become increasingly important to conduct interdisciplinary environmental research assessments, both nationally and internationally. For this reason, the Committee for a Pilot Study on Database Interfaces was charged to review and advise on data interfacing activities. The committee used six case studies (1) to identify and understand the most important problems associated with collecting, integrating, and analyzing environmental data from local to global spatial scales and over a very wide range of temporal scales; and (2) to elaborate the common barriers to interfacing data of disparate sources and types. Consistent with the committee's charge, the primary focus was the interfacing of geophysical and ecological data. The case studies used by the committee were: The Impact Assessment Project for Drought Early Warning in the Sahel, The National Acid Precipitation Assessment Program, The H.J. Andrews Experimental Forest Long-Term Ecological Research Site, The Carbon Dioxide Information Analysis Center, The First International Satellite Land Surface Climatology Project (ISLSCP) Field Experiment, and The California Cooperative Oceanic Fisheries Investigation. The committee derived a number of lessons from the case studies, and these lessons are summarized at the end of each case study and analyzed in the last chapter. Some are generic in nature; others are more specific to a discipline or project. The conclusions and recommendations are based on the committee's analysis of the case studies and additional research. They are organized according to four major areas of barriers or challenges to the effective interfacing of diverse environmental data. In the final section the committee offers a set of broadly applicable principles (Ten Keys to Success) that can be used by scientists and data managers in planning and conducting interfacing activities.

  1. Systematic study of target localization for bioluminescence tomography guided radiation therapy

    PubMed Central

    Yu, Jingjing; Zhang, Bin; Iordachita, Iulian I.; Reyes, Juvenal; Lu, Zhihao; Brock, Malcolm V.; Patterson, Michael S.; Wong, John W.

    2016-01-01

    Purpose: To overcome the limitation of CT/cone-beam CT (CBCT) in guiding radiation for soft-tissue targets, the authors developed a spectrally resolved bioluminescence tomography (BLT) system for the small animal radiation research platform. The authors systematically assessed the performance of the BLT system in terms of target localization and the ability to resolve two neighboring sources in simulations, a tissue-mimicking phantom, and in vivo environments. Methods: Multispectral measurements acquired in a single projection were used for the BLT reconstruction. The incomplete variables truncated conjugate gradient algorithm with an iterative permissible region shrinking strategy was employed as the optimization scheme to reconstruct source distributions. Simulation studies were conducted for single spherical sources with sizes from 0.5 to 3 mm radius at depths of 3–12 mm. The same configuration was also applied for the double-source simulations, with source separations varying from 3 to 9 mm. Experiments were performed in a standalone BLT/CBCT system. Two self-illuminated sources with 3 and 4.7 mm separations placed inside a tissue-mimicking phantom were chosen as the test cases. Live mice implanted with a single source at 6 and 9 mm depth, two sources at 3 and 5 mm separation at a depth of 5 mm, or three sources in the abdomen were also used to illustrate the localization capability of the BLT system for multiple targets in vivo. Results: For the simulation study, approximately 1 mm accuracy can be achieved in localizing the center of mass (CoM) for single-source cases and the grouped CoM for double-source cases. For the case of a 1.5 mm radius source, a tumor size common in preclinical studies, the simulations show that for all the source separations considered, except for the 3 mm separation at 9 and 12 mm depth, the two neighboring sources can be resolved at depths from 3 to 12 mm. 
Phantom experiments illustrated that 2D bioluminescence imaging failed to distinguish the two sources, but BLT can provide 3D source localization with approximately 1 mm accuracy. The in vivo results are encouraging: 1 and 1.7 mm accuracy was attained for the single-source case at 6 and 9 mm depth, respectively. For the two-source in vivo study, both sources could be distinguished at 3 and 5 mm separations, and approximately 1 mm localization accuracy was also achieved. Conclusions: This study demonstrated that the multispectral BLT/CBCT system could potentially be applied to localize and resolve multiple sources over a wide range of source sizes, depths, and separations. The average accuracy of localizing the CoM for single sources and the grouped CoM for double sources is approximately 1 mm, except for deep-seated targets. The information provided in this study can be instructive in devising treatment margins for BLT-guided irradiation. These results also suggest that the 3D BLT system could guide radiation in situations with multiple targets, such as metastatic tumor models. PMID:27147371
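
    The localization accuracy reported above is the distance between the reconstructed source's center of mass and the known target position. A minimal sketch of that metric (the voxel coordinates and intensities are fabricated for illustration):

```python
import numpy as np

def center_of_mass(intensity, coords):
    """Intensity-weighted centroid of a reconstructed source distribution."""
    w = intensity / intensity.sum()
    return (coords * w[:, None]).sum(axis=0)

def localization_error(intensity, coords, true_center):
    """Euclidean distance between the reconstructed CoM and the true position."""
    return float(np.linalg.norm(center_of_mass(intensity, coords) - true_center))

# Hypothetical reconstruction: four equal-intensity voxels (mm coordinates)
# placed symmetrically around a true source center at (5, 5, 5) mm.
coords = np.array([[4.5, 5.0, 5.0], [5.5, 5.0, 5.0],
                   [5.0, 4.0, 5.0], [5.0, 6.0, 5.0]])
intensity = np.array([1.0, 1.0, 1.0, 1.0])
print(localization_error(intensity, coords, np.array([5.0, 5.0, 5.0])))  # 0.0
```

    For double-source cases, the same computation on the grouped voxels of both reconstructions yields the grouped-CoM accuracy quoted in the abstract.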

  2. ISOGAL: A deep survey of the obscured inner Milky Way with ISO at 7 mu m and 15 mu m and with DENIS in the near-infrared

    NASA Astrophysics Data System (ADS)

    Omont, A.; Gilmore, G. F.; Alard, C.; Aracil, B.; August, T.; Baliyan, K.; Beaulieu, S.; Bégon, S.; Bertou, X.; Blommaert, J. A. D. L.; Borsenberger, J.; Burgdorf, M.; Caillaud, B.; Cesarsky, C.; Chitre, A.; Copet, E.; de Batz, B.; Egan, M. P.; Egret, D.; Epchtein, N.; Felli, M.; Fouqué, P.; Ganesh, S.; Genzel, R.; Glass, I. S.; Gredel, R.; Groenewegen, M. A. T.; Guglielmo, F.; Habing, H. J.; Hennebelle, P.; Jiang, B.; Joshi, U. C.; Kimeswenger, S.; Messineo, M.; Miville-Deschênes, M. A.; Moneti, A.; Morris, M.; Ojha, D. K.; Ortiz, R.; Ott, S.; Parthasarathy, M.; Pérault, M.; Price, S. D.; Robin, A. C.; Schultheis, M.; Schuller, F.; Simon, G.; Soive, A.; Testi, L.; Teyssier, D.; Tiphène, D.; Unavane, M.; van Loon, J. T.; Wyse, R.

    2003-06-01

    The ISOGAL project is an infrared survey of specific regions sampling the Galactic Plane, selected to provide information on Galactic structure, stellar populations, stellar mass-loss and the recent star formation history of the inner disk and Bulge of the Galaxy. ISOGAL combines 7 and 15 μm ISOCAM observations -- with a resolution of 6 arcsec at worst -- with DENIS IJKs data to determine the nature of the sources and the interstellar extinction. We have observed about 16 square degrees with a sensitivity approaching 10-20 mJy, detecting ~10^5 sources, mostly AGB stars, red giants and young stars. The main features of the ISOGAL survey and the observations are summarized in this paper, together with a brief discussion of data processing and quality. The primary ISOGAL products are described briefly (a full description is given in Schuller et al. 2003): viz. the images and the ISOGAL-DENIS five-wavelength point source catalogue. The main scientific results already derived or in progress are summarized. These include astrometrically calibrated 7 and 15 μm images, determining structures of resolved sources; identification and properties of interstellar dark clouds; quantification of the infrared extinction law and source dereddening; analysis of red giant and (especially) AGB stellar populations in the central Bulge, determining luminosity, presence of circumstellar dust and mass-loss rate, and source classification, supplemented in some cases by ISO/CVF spectroscopy; detection of young stellar objects of diverse types, especially in the inner Bulge with information about the present and recent star formation rate; identification of foreground sources with mid-IR excess. These results are the subject of about 25 refereed papers published or in preparation. This is paper No. 20 in a refereed journal based on data from the ISOGAL project. 
Based on observations with ISO, an ESA project with instruments funded by ESA Member States (especially the PI countries: France, Germany, The Netherlands and the UK) and with the participation of ISAS and NASA. Based on observations collected at the European Southern Observatory, La Silla, Chile.

  3. CUBES: cassegrain U-band Brazil-ESO spectrograph

    NASA Astrophysics Data System (ADS)

    Barbuy, B.; Bawden Macanhan, V.; Bristow, P.; Castilho, B.; Dekker, H.; Delabre, B.; Diaz, M.; Gneiding, C.; Kerber, F.; Kuntschner, H.; La Mura, G.; Maciel, W.; Meléndez, J.; Pasquini, L.; Pereira, C. B.; Petitjean, P.; Reiss, R.; Siqueira-Mello, C.; Smiljanic, R.; Vernet, J.

    2014-11-01

    CUBES is a high-efficiency, medium-resolution (R ~ 20,000) ground-based UV (300-400 nm) spectrograph, to be installed at the Cassegrain focus of one of ESO's VLT unit telescopes in 2017/18. The CUBES project is a joint venture among ESO, IAG/USP, and LNA/MCTI. CUBES will provide access to a wealth of new and relevant information for stellar as well as extragalactic sources. Main science cases include the study of beryllium and heavy elements in metal-poor stars, the direct determination of carbon, nitrogen and oxygen abundances through the study of molecular bands in the UV range, and the study of active galactic nuclei and quasar absorption lines. With a streamlined modern instrument design, high-efficiency dispersing elements and UV-sensitive detectors, it will give a significant gain in sensitivity over existing ground-based medium-high resolution spectrographs, enabling vastly increased sample sizes accessible to the astronomical community. We present here a brief overview of the project, including its status, science cases and a discussion of the design options.
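
    The quoted resolving power fixes the smallest resolvable wavelength interval via Δλ = λ/R, which across the CUBES band works out to roughly 0.015-0.02 nm. A one-line sketch of that relation:

```python
def delta_lambda_nm(wavelength_nm, resolving_power):
    """Smallest resolvable wavelength interval for a spectrograph with
    resolving power R = lambda / delta_lambda."""
    return wavelength_nm / resolving_power

# At R ~ 20,000 across the 300-400 nm CUBES band:
print(delta_lambda_nm(300, 20_000), delta_lambda_nm(400, 20_000))  # 0.015 0.02
```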

  4. Historical Temporal Shipping (HITS)

    DTIC Science & Technology

    1978-06-28

    (List-of-figures residue: Figure 4-3, Projection of Area onto Route Perpendicular; Figure 4-4, Single Column Cut of Route Envelope; Figure 4-5, Histogram of...) ..."Super" Bulk Carriers, and Deepwater Port Development, Naval Postgraduate School, June 1974; Gulland, J.A., The Fish Resources of the Ocean...sailing reports from the various harbour masters. The completeness of the data thus depends in most cases upon the diligence of a single reporting source

  5. Building a Business Case for Compressed Natural Gas in Fleet Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, G.

    2015-03-19

    Natural gas is a clean-burning, abundant, and domestically produced source of energy. Compressed natural gas (CNG) has recently garnered interest as a transportation fuel because of these attributes and because of its cost savings and price stability compared to conventional petroleum fuels. The National Renewable Energy Laboratory (NREL) developed the Vehicle Infrastructure and Cash-Flow Evaluation (VICE) model to help businesses and fleets evaluate the financial soundness of CNG vehicle and CNG fueling infrastructure projects.
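
    A cash-flow evaluation of the kind VICE performs can be previewed with a simple-payback screening calculation. All figures below are illustrative assumptions, not NREL data, and the sketch omits the financing, incentive, and maintenance terms the VICE model includes:

```python
def simple_payback_years(station_cost, vehicle_premium, n_vehicles,
                         miles_per_vehicle, mpg, gasoline_price, cng_price_gge):
    """Years to recover CNG station and vehicle-premium costs from fuel savings.

    A rough screening metric only; a full cash-flow model (like NREL's VICE)
    adds financing, incentives, fuel-price escalation, and maintenance.
    """
    capital = station_cost + vehicle_premium * n_vehicles
    gge_per_year = n_vehicles * miles_per_vehicle / mpg      # gasoline-gallon equivalents
    annual_savings = gge_per_year * (gasoline_price - cng_price_gge)
    return capital / annual_savings

# Illustrative (assumed) figures: $700k station, $10k premium x 20 trucks,
# 25,000 mi/yr at 10 mpg, $3.50/gal gasoline vs. $2.00/GGE CNG:
print(simple_payback_years(700_000, 10_000, 20, 25_000, 10, 3.50, 2.00))  # 12.0
```

    A fleet would compare such a payback period against the expected service life of the station and vehicles before committing capital.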

  6. Disadvantaged persons' participation in health promotion projects: some structural dimensions.

    PubMed

    Boyce, W F

    2001-05-01

    A structural perspective was used in studying community participation of disadvantaged groups (poor women, street youth, and disabled persons) in health promotion projects. Five community projects in the Canadian Health Promotion Contribution Program were examined in a comparative case study utilizing in-depth interviews, documents, and secondary sources. Analysis revealed relatively low numbers and a restricted range of participants, difficulties in recruiting and maintaining participants, declining rates of active participation over time, and limited target group influence and power. This paper reports on the relationship between various dimensions of structure (social-cultural, organizational, political-legal-economic) and the community participation process. Participation was influenced by structural factors such as bureaucratic rules and regulations, perceived minority group rights and relations, agency reputations and responsibilities, available resources, and organizational roles. Control of projects by target group members, rather than by service agencies, was an important overall organizational structural factor which allowed community members to achieve influence in projects. The study concludes that a conceptual model based on structural factors is useful in explaining how key factors at the federal and local levels can restrict or facilitate the community participation process.

  7. Modeling Tool to Quantify Metal Sources in Stormwater Discharges at Naval Facilities (NESDI Project 455)

    DTIC Science & Technology

    2014-06-01

    TECHNICAL REPORT 2077, June 2014. Modeling Tool to Quantify Metal Sources in Stormwater Discharges at Naval Facilities (NESDI Project 455): Final Report and Guidance. C. Katz, K. Sorensen, E. Arias (SSC Pacific); R. Pitt, L. Talebi...demonstration/validation project to assess the use of the urban stormwater model Windows Source Loading and Management Model (WinSLAMM) to characterize

  8. Incidence of inadvertent intra-articular lumbar facet joint injection during fluoroscopically guided interlaminar epidural steroid injection.

    PubMed

    Huang, Ambrose J; Palmer, William E

    2012-02-01

    To determine the incidence of inadvertent lumbar facet joint injection during an interlaminar epidural steroid injection (ESI). A total of 686 interlaminar lumbar ESIs were performed from January 1, 2009 to December 31, 2009. Archived images from these cases were retrospectively reviewed on the PACS. Positive cases of inadvertent lumbar facet joint injection were identified by the characteristic sigmoid-shaped contrast pattern projecting over the posterior elements on the lateral view and/or ovoid contrast projecting over the facet joints on the anteroposterior (AP) view. Eight positive events were identified (1.2%). There was no statistically significant gender or lumbar level predilection. In 3/8 of the positive cases (37.5%), the inadvertent facet joint injection was recognized by the operator. The needle was repositioned as a result, and contrast within the posterior epidural space was documented by the end of the procedure. In 5/8 of the positive cases (62.5%), the patients reported an immediate decrease in the presenting pain. The incidence of inadvertent lumbar facet joint injection during an interlaminar epidural steroid injection is low. Recognizing the imaging features of this event permits the operator to redirect the needle tip into the epidural space and/or identify the facet joint(s) as a source of the patient's presenting pain.
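
    The reported incidence is 8 events in 686 procedures; attaching a confidence interval (not given in the abstract) makes the precision of that 1.2% figure explicit. A sketch using the standard Wilson score interval for a binomial proportion:

```python
import math

def wilson_ci(k, n, z=1.96):
    """Wilson score interval for a binomial proportion (z = 1.96 gives ~95%)."""
    p = k / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# 8 inadvertent facet joint injections observed in 686 procedures:
p = 8 / 686
lo, hi = wilson_ci(8, 686)
print(f"{100*p:.1f}% (95% CI {100*lo:.2f}-{100*hi:.2f}%)")  # 1.2% (95% CI 0.59-2.28%)
```

    The Wilson interval is preferred over the simple normal approximation when the event count is small, as it is here.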

  9. Good governance and sustainability: a case study from Pakistan.

    PubMed

    Israr, Syed Muhammad; Islam, Anwar

    2006-01-01

    On the basis of a case study in Pakistan, the paper argues that good governance, characterized by transparency, accountability and meaningful community participation, plays a critical role in the sustainability of donor-funded health systems projects in the public health sector. The Family Health Project (FHP) (1992-1999), funded by the World Bank, has been used as a case study. Critical analysis of secondary data mainly obtained from the Department of Health (DoH) in the province of Sindh in Pakistan is the major tool used for the study. Data from other sources including the World Bank have also been used. The analysis reveals that the existing health care system could not fully absorb and sustain major "sociopolitical" thrusts of the project, meaningful community participation and "democratic" decision-making processes being the most important ones. The hierarchical structure and management process made it difficult to produce a sense of ownership of the project among all managers and the rank and file staff. The Provincial Health Development Center (PHDC) and District Health Development Centers (DHDCs) established by the FHP did not receive adequate financial and political support from DoH and the Ministry of Health to have much control of the project at the local level. Consequently, these Centers largely failed to institutionalize a continuing training program for district level health officials/professionals. Due to lack of political support, the District Health Management Teams (DHMTs) could not be institutionalized. Community participation in the DHMTs was symbolic rather than forceful. Improved coordination among all stakeholders, more stable and competent leadership, more meaningful community participation, greater devolution of project management to the district level, and better management of resources would have resulted in more effective and efficient implementation of the project. 
Based on these findings, the paper introduces a Sustainable Management Approach (SMA) as a tool that can be used to ensure the sustainability of health systems projects, particularly those funded by international organizations in developing countries. Good governance and a conducive organizational culture are important prerequisites for incorporating any new project within an existing system. This includes prior consensus building among all stakeholders, a meaningful and inclusive participatory planning, implementation and evaluation process involving communities, political commitment, and the identification and use of appropriate leadership for project management.

  10. Funding knowledgebases: Towards a sustainable funding model for the UniProt use case

    PubMed Central

    Gabella, Chiara; Durinx, Christine; Appel, Ron

    2018-01-01

    Millions of life scientists across the world rely on bioinformatics data resources for their research projects. Data resources can be very expensive, especially those with high added value such as expert-curated knowledgebases. Despite the increasing need for such highly accurate and reliable sources of scientific information, most of them do not have secure funding over the near future and often depend on short-term grants that are much shorter than their planning horizon. Additionally, they are often evaluated as research projects rather than as research infrastructure components. In this work, twelve funding models for data resources are described and applied to the case study of the Universal Protein Resource (UniProt), a key resource for protein sequences and functional information. We show that most of the models present inconsistencies with open access or equity policies, and that while some models do not allow the total costs to be covered, they could potentially be used as a complementary income source. We propose the Infrastructure Model as a sustainable and equitable model for all core data resources in the life sciences. With this model, funding agencies would set aside a fixed percentage of their research grant volumes, which would subsequently be redistributed to core data resources according to well-defined selection criteria. This model, compatible with the principles of open science, is in agreement with several international initiatives such as the Human Frontiers Science Program Organisation (HFSPO) and the OECD Global Science Forum (GSF) project. Here, we have estimated that less than 1% of the total amount dedicated to research grants in the life sciences would be sufficient to cover the costs of the core data resources worldwide, including both knowledgebases and deposition databases. PMID:29333230

  11. Funding knowledgebases: Towards a sustainable funding model for the UniProt use case.

    PubMed

    Gabella, Chiara; Durinx, Christine; Appel, Ron

    2017-01-01

    Millions of life scientists across the world rely on bioinformatics data resources for their research projects. Data resources can be very expensive, especially those with high added value such as expert-curated knowledgebases. Despite the increasing need for such highly accurate and reliable sources of scientific information, most of them do not have secure funding over the near future and often depend on short-term grants that are much shorter than their planning horizon. Additionally, they are often evaluated as research projects rather than as research infrastructure components. In this work, twelve funding models for data resources are described and applied to the case study of the Universal Protein Resource (UniProt), a key resource for protein sequences and functional information. We show that most of the models present inconsistencies with open access or equity policies, and that while some models do not allow the total costs to be covered, they could potentially be used as a complementary income source. We propose the Infrastructure Model as a sustainable and equitable model for all core data resources in the life sciences. With this model, funding agencies would set aside a fixed percentage of their research grant volumes, which would subsequently be redistributed to core data resources according to well-defined selection criteria. This model, compatible with the principles of open science, is in agreement with several international initiatives such as the Human Frontiers Science Program Organisation (HFSPO) and the OECD Global Science Forum (GSF) project. Here, we have estimated that less than 1% of the total amount dedicated to research grants in the life sciences would be sufficient to cover the costs of the core data resources worldwide, including both knowledgebases and deposition databases.
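
    The set-aside arithmetic behind the Infrastructure Model can be sketched numerically. All agency volumes, resource names and costs below are invented for illustration; the abstract's only quantitative claim is that under 1% of life-science grant volume would suffice.

```python
# Hypothetical sketch of the Infrastructure Model: agencies set aside a
# fixed fraction of their research grant volume, and the pooled amount is
# redistributed to core data resources. All figures are invented.

def infrastructure_setaside(grant_volumes, setaside_fraction):
    """Total pool collected from each agency's research grant volume."""
    return sum(v * setaside_fraction for v in grant_volumes)

def redistribute(pool, resource_costs):
    """Pay each resource its cost if the pool suffices; scale down otherwise."""
    total_cost = sum(resource_costs.values())
    scale = min(1.0, pool / total_cost)
    return {name: cost * scale for name, cost in resource_costs.items()}

# Invented example: three agencies, a 1% set-aside, two data resources.
pool = infrastructure_setaside([4.0e9, 2.5e9, 1.5e9], 0.01)
allocations = redistribute(pool, {"knowledgebase_A": 3.0e7,
                                  "deposition_db_B": 2.0e7})
```

    Here the pooled 1% set-aside comfortably covers both invented resource budgets, mirroring the abstract's estimate that a small fixed fraction would cover core resources worldwide.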

  12. SMART characterisation of New Zealand's aquifers using fast and passive methods

    NASA Astrophysics Data System (ADS)

    Klug, H.; Daughney, C.; Verhagen, F.; Westerhoff, R.; Ward, N. Dudley

    2012-04-01

    Groundwater resources account for about half of New Zealand's abstractive water needs and supply about eighty per cent of all water used in the agricultural sector. Despite the importance of New Zealand's groundwater resources, we still lack essential information on their basic properties such as volume, hydraulic properties, interaction with surface water, and water age. These measures are required to ensure sustainable management, in order to avoid overexploitation of water resources and to circumvent water scarcity situations in which people and the economy are stressed by insufficient water supply. A newly established research collaboration between New Zealand and Europe aims to provide a methodological framework to characterise New Zealand's groundwater aquifers. The SMART project (www.smart-project.info) will rely on existing data sources of regional councils and research institutes and will develop novel measurement techniques that can be applied to large areas with little effort, little acquisition time, and minimal cost. The project aims to synthesise in situ measurements from sensor observation services, ambient noise seismic tomography, real-time fibre optic temperature sensing, novel age tracers, airborne geophysical surveying and satellite remote sensing techniques. Validation of direct and indirect groundwater information will be achieved through use of multiple methods in case study areas and by "ground-truthing" the new methods against existing data obtained from traditional methods (e.g. drilling, aquifer pump testing, river gauging). An important overarching part of the project is the quantification of uncertainty associated with all techniques to be employed. An online Sensor WebGIS prototype will provide the project results and other case study observations (e.g. temperature, precipitation, soil moisture) in as near real-time as possible. 
These datasets serve as a validation source for the satellite monitoring results and present a current view of the status of the environment. The web portal will not only visualise near real-time (station-based) point measurements but also process these datasets into spatially distributed maps of climatological parameters. The OGC-compliant and open-source-based portal will be developed towards a 3D groundwater interface and inventory. This inventory will be tailored to stakeholder needs (e.g. open access, ease of use, and interoperability with existing systems), which have already been identified through stakeholder consultation processes. The portal prototype runs in a platform-independent web browser, ensuring access and visibility for all stakeholders and decision makers at regional and national level.

  13. Formal Methods Tool Qualification

    NASA Technical Reports Server (NTRS)

    Wagner, Lucas G.; Cofer, Darren; Slind, Konrad; Tinelli, Cesare; Mebsout, Alain

    2017-01-01

    Formal methods tools have been shown to be effective at finding defects in safety-critical digital systems including avionics systems. The publication of DO-178C and the accompanying formal methods supplement DO-333 allows applicants to obtain certification credit for the use of formal methods without providing justification for them as an alternative method. This project conducted an extensive study of existing formal methods tools, identifying obstacles to their qualification and proposing mitigations for those obstacles. Further, it interprets the qualification guidance for existing formal methods tools and provides case study examples for open source tools. This project also investigates the feasibility of verifying formal methods tools by generating proof certificates which capture proof of the formal methods tool's claim, which can be checked by an independent, proof certificate checking tool. Finally, the project investigates the feasibility of qualifying this proof certificate checker, in the DO-330 framework, in lieu of qualifying the model checker itself.

  14. Conducting a FERC environmental assessment: a case study and recommendations from the Terror Lake Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olive, S.W.; Lamb, B.L.

    This paper is an account of the process that evolved during acquisition of the license to operate the Terror Lake hydro-electric power project under the auspices of the Federal Energy Regulatory Commission (FERC). The Terror River is located on Kodiak Island in Alaska. The river is within the Kodiak National Wildlife Refuge; it supports excellent runs of several species of Pacific Salmon which are both commercially important and a prime source of nutrition for the Kodiak brown bear. This paper discusses both the fish and wildlife questions, but concentrates on instream uses and how protection of these uses was decided. In this focus the paper explains the FERC process, gives a history of the Terror Lake Project, and, ultimately, makes recommendations for improved management of controversies within the context of FERC licensing procedures. 64 references.

  15. Hierarchy Software Development Framework (h-dp-fwk) project

    NASA Astrophysics Data System (ADS)

    Zaytsev, A.

    2010-04-01

    Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in batch mode. Design and development activities on the project began in March 2005, from the very beginning targeting the case of building experimental data processing applications for the CMD-3 experiment, which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). Its design addresses the generic case of a modular data processing application operating within a well-defined distributed computing environment. The main features of the framework are modularity, built-in message and data exchange mechanisms, XInclude and XML Schema enabled XML configuration management tools, dedicated log management tools, internal debugging tools, support for both dynamic and static module chains, internal DSO version and consistency checking, and a well-defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and is planned to be ported to other platforms as well. The project is provided with a comprehensive set of technical documentation and users' guides. The licensing schema for the source code, binaries and documentation implies that the product is free for non-commercial use. Although the development phase is not over and many features are yet to be implemented, the project is considered ready for public use and for creating applications in various fields, including the development of event reconstruction software for small and moderate scale HEP experiments.

  16. Social Networks and Community-Based Natural Resource Management

    NASA Astrophysics Data System (ADS)

    Lauber, T. Bruce; Decker, Daniel J.; Knuth, Barbara A.

    2008-10-01

    We conducted case studies of three successful examples of collaborative, community-based natural resource conservation and development. Our purpose was to: (1) identify the functions served by interactions within the social networks of involved stakeholders; (2) describe key structural properties of these social networks; and (3) determine how these structural properties varied when the networks were serving different functions. The case studies relied on semi-structured, in-depth interviews of 8 to 11 key stakeholders at each site who had played a significant role in the collaborative projects. Interview questions focused on the roles played by key stakeholders and the functions of interactions between them. Interactions allowed the exchange of ideas, provided access to funding, and enabled some stakeholders to influence others. The exchange of ideas involved the largest number of stakeholders, the highest percentage of local stakeholders, and the highest density of interactions. Our findings demonstrated the value of tailoring strategies for involving stakeholders to meet different needs during a collaborative, community-based natural resource management project. Widespread involvement of local stakeholders may be most appropriate when ideas for a project are being developed. During efforts to exert influence to secure project approvals or funding, however, involving specific individuals with political connections or influence on possible sources of funds may be critical. Our findings are consistent with past work that has postulated that social networks may require specific characteristics to meet different needs in community-based environmental management.
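
    The "density of interactions" reported for the idea-exchange network is a standard social-network measure: observed ties divided by the maximum possible ties among the stakeholders. A minimal sketch (the stakeholder names are hypothetical, not from the study):

```python
# Toy illustration of network density for an undirected stakeholder
# interaction network: observed ties / maximum possible ties.
def network_density(n_stakeholders, ties):
    """ties: a set of frozenset pairs {a, b} of interacting stakeholders."""
    possible = n_stakeholders * (n_stakeholders - 1) / 2
    return len(ties) / possible

# Hypothetical three-stakeholder network with two observed ties.
ties = {frozenset({"council", "landowner"}), frozenset({"landowner", "agency"})}
density = network_density(3, ties)
```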

  17. Transuranic sealed source recovery project.

    PubMed

    Tompkins, J A; Pearson, M W

    2001-11-01

    If you have transuranic sealed sources (239Pu, 238Pu, or 241Am) that have no potential for recycle or commercial disposal, the Off Site Source Recovery Project at LANL can assist in recovering the sealed sources from your facility to a DOE storage site.

  18. Predicting the Impacts of Climate Change on Runoff and Sediment Processes in Agricultural Watersheds: A Case Study from the Sunflower Watershed in the Lower Mississippi Basin

    NASA Astrophysics Data System (ADS)

    Elkadiri, R.; Momm, H.; Yasarer, L.; Armour, G. L.

    2017-12-01

    Climatic conditions play a major role in the physical processes governing the detachment and transport of soil and agrochemicals in agricultural watersheds. In addition, these climatic conditions are projected to vary significantly, spatially and temporally, in the 21st century, leading to vast uncertainties about the future of sediment and non-point source pollution transport in agricultural watersheds. In this study, we selected the Sunflower basin in the lower Mississippi River basin, USA, to contribute to the understanding of how climate change affects watershed processes and the transport of pollutant loads. The climate projections used in this study were retrieved from the archive of the World Climate Research Programme's (WCRP) Coupled Model Intercomparison Project Phase 5 (CMIP5). The CMIP5 dataset was selected because it contains the most up-to-date spatially downscaled and bias-corrected climate projections. A subset of ten GCMs representing a range of projected climates was spatially downscaled for the Sunflower watershed. Statistics derived from downscaled GCM output representing the 2011-2040, 2041-2070 and 2071-2100 time periods were used to generate maximum/minimum temperature and precipitation on a daily time step using the USDA synthetic weather generator SYNTOR. These downscaled climate data were then used as inputs to the Annualized Agricultural Non-Point Source (AnnAGNPS) pollution watershed model to estimate time series of runoff, sediment, and nutrient loads produced from the watershed. For baseline conditions, a simulation of the watershed was created and validated using historical data from 2000 to 2015.
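
    The step of deriving statistics from downscaled daily GCM output as input to a weather generator can be sketched as below. The tuple layout and function name are illustrative only, not SYNTOR's actual input format.

```python
# Sketch: aggregate daily downscaled GCM output into per-month statistics
# (mean tmax, mean tmin, mean daily precipitation) of the kind a synthetic
# weather generator consumes. Record layout is an assumption.
from collections import defaultdict
from statistics import mean

def monthly_climate_stats(daily_records):
    """daily_records: iterable of (month, tmax, tmin, precip) tuples."""
    by_month = defaultdict(list)
    for month, tmax, tmin, precip in daily_records:
        by_month[month].append((tmax, tmin, precip))
    return {m: {"tmax": mean(r[0] for r in rows),
                "tmin": mean(r[1] for r in rows),
                "precip": mean(r[2] for r in rows)}
            for m, rows in by_month.items()}
```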

  19. 76 FR 22038 - Revision to the South Coast Portion of the California State Implementation Plan, CPV Sentinel...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-20

    ... California SIP. This source-specific SIP revision is known as the CPV Sentinel Energy Project AB 1318... list of those emissions credits. The Sentinel Energy Project is a source that is not authorized to... District to transfer certain emissions credits to one stationary source, the Sentinel Energy Project. The...

  20. A model for quantifying construction waste in projects according to the European waste list.

    PubMed

    Llatas, C

    2011-06-01

    The new EU challenge is to recover 70% by weight of construction and demolition (C&D) waste by 2020. The literature reveals that one major barrier is the lack of data. Therefore, this paper presents a model which allows technicians to estimate C&D waste during the design stage in order to promote prevention and recovery. The types and quantities of construction waste (CW) are estimated and managed according to EU guidelines, by building element and specifically for each project. The model allows detection of the source of the waste and the adoption of alternative procedures which eliminate hazardous waste and reduce CW. Likewise, it develops a systematic structure of the construction process, a waste classification system and analytical expressions based on factors. These factors depend on the technology used and represent a standard on site. This would allow a database of waste to be developed anywhere. A Spanish case study is covered. Factors were obtained by studying over 20 dwellings. The source and types of packaging waste, remains, soil and hazardous waste were estimated in detail and compared with other studies. Results reveal that the model can be implemented in projects and that the chances of reducing and recovering C&D waste could be increased, well above the EU challenge. Copyright © 2011 Elsevier Ltd. All rights reserved.
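
    The factor-based estimation the abstract describes (waste per building element = measured quantity times a technology-dependent factor, tallied by European Waste List code) can be sketched as follows. The factors and waste codes shown are illustrative placeholders, not values from the paper.

```python
# Sketch of factor-based construction waste estimation: each building
# element in the bill of quantities contributes quantity * factor to its
# European Waste List (EWL) code. Factors and codes are invented.

def estimate_waste(elements, factors):
    """elements: list of (element_type, ewl_code, quantity) tuples;
    factors: waste factor per element type (waste units per quantity unit)."""
    totals = {}
    for element_type, ewl_code, quantity in elements:
        totals[ewl_code] = totals.get(ewl_code, 0.0) + quantity * factors[element_type]
    return totals

# Invented project data: two element types, two (placeholder) EWL codes.
factors = {"brick_wall": 0.05, "concrete_slab": 0.02}
elements = [("brick_wall", "17 01 02", 100.0),
            ("concrete_slab", "17 01 01", 200.0),
            ("brick_wall", "17 01 02", 40.0)]
totals = estimate_waste(elements, factors)
```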

  1. Large Crater Clustering tool

    NASA Astrophysics Data System (ADS)

    Laura, Jason; Skinner, James A.; Hunter, Marc A.

    2017-08-01

    In this paper we present the Large Crater Clustering (LCC) tool set, an ArcGIS plugin that supports the quantitative approximation of a primary impact location from user-identified locations of possible secondary impact craters or the long axes of clustered secondary craters. The identification of primary impact craters directly supports planetary geologic mapping and topical science studies where the chronostratigraphic age of some geologic units may be known, but more distant features have questionable geologic ages. Previous works (e.g., McEwen et al., 2005; Dundas and McEwen, 2007) have shown that the source of secondary impact craters can be estimated from the secondaries themselves. This work adapts those methods into a statistically robust tool set. We describe the four individual tools within the LCC tool set, which support: (1) processing individually digitized point observations (craters); (2) estimating the directional distribution of a clustered set of craters; (3) back-projecting the potential flight paths (crater clusters or linearly approximated catenae or lineaments); and (4) intersecting the back-projected trajectories to approximate the location of potential source primary craters. We present two case studies using secondary impact features mapped in two regions of Mars. We demonstrate that the tool is able to quantitatively identify primary impacts and supports improved qualitative interpretation of potential secondary crater flight trajectories.
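
    The geometric core of steps (3) and (4) is line intersection: each secondary cluster yields a back-projected trajectory line (a position plus a direction), and pairs of lines are intersected to approximate the primary's location. A planar sketch (the real tool operates on projected map coordinates inside ArcGIS and aggregates many intersections statistically):

```python
# Sketch: intersect two back-projected trajectory lines p + t*d in 2D to
# approximate a common source location. Coordinates are illustrative.
def intersect_lines(p1, d1, p2, d2):
    """Intersect lines p1 + t*d1 and p2 + s*d2; returns a point, or None
    for (near-)parallel trajectories."""
    det = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(det) < 1e-12:          # parallel trajectories: no unique solution
        return None
    t = ((p2[0] - p1[0]) * d2[1] - (p2[1] - p1[1]) * d2[0]) / det
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```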

  2. SU-E-I-20: Dead Time Count Loss Compensation in SPECT/CT: Projection Versus Global Correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siman, W; Kappadath, S

    Purpose: To compare projection-based versus global correction to compensate for deadtime count loss in SPECT/CT images. Methods: SPECT/CT images of an IEC phantom (2.3GBq 99mTc) with ∼10% deadtime loss containing the 37mm (uptake 3), 28 and 22mm (uptake 6) spheres were acquired using a 2-detector SPECT/CT system with 64 projections/detector and 15 s/projection. The deadtime Ti and the true count rate Ni at each projection i were calculated using the monitor-source method. Deadtime-corrected SPECT images were reconstructed twice: (1) with projections individually corrected for deadtime losses; and (2) with the original projections with losses, then correcting the reconstructed SPECT images using a scaling factor equal to the inverse of the average fractional loss for 5 projections/detector. In both cases, the SPECT images were reconstructed using OSEM with attenuation and scatter corrections. The two SPECT datasets were assessed by comparing line profiles in the xy-plane and along the z-axis, evaluating the count recoveries, and comparing ROI statistics. Higher deadtime losses (up to 50%) were also simulated in the individually corrected projections by multiplying each projection i by exp(-a*Ni*Ti), where a is a scalar. Additionally, deadtime corrections in phantoms with different geometries and deadtime losses were also explored. The same two correction methods were carried out for all these datasets. Results: Averaging the deadtime losses over 5 projections/detector suffices to recover >99% of the lost counts in most clinical cases. The line profiles (xy-plane and z-axis) and the statistics in ROIs drawn in the SPECT images corrected using both methods showed agreement within the statistical noise. The count-loss recoveries of the two methods also agree to >99%. Conclusion: The projection-based and global corrections yield visually indistinguishable SPECT images. The global correction, based on sparse sampling of projection losses, allows accurate SPECT deadtime loss correction while keeping the study duration reasonable.
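
    The two compensation schemes can be sketched on per-projection count totals. The counts and loss fractions below are invented; the exp(-a*N*T) factor follows the loss simulation described in the abstract.

```python
import math

# Sketch of the two deadtime compensation schemes compared in the abstract.
# All numbers are invented for illustration.

def projection_correction(measured, loss_fractions):
    """(1) Correct each projection individually for its own fractional loss."""
    return [m / (1.0 - f) for m, f in zip(measured, loss_fractions)]

def global_correction(measured, sampled_loss_fractions):
    """(2) Scale every projection by the inverse of the average fractional
    loss estimated from a few sampled projections per detector."""
    avg_loss = sum(sampled_loss_fractions) / len(sampled_loss_fractions)
    return [m / (1.0 - avg_loss) for m in measured]

def simulated_loss_fraction(true_rate, deadtime, a=1.0):
    """Fractional loss under the exp(-a*N*T) factor the abstract uses to
    simulate higher deadtime losses."""
    return 1.0 - math.exp(-a * true_rate * deadtime)
```

    When the per-projection losses are nearly uniform, the two corrections coincide, which is consistent with the reported >99% agreement in count recovery.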

  3. Image processing in biodosimetry: A proposal of a generic free software platform.

    PubMed

    Dumpelmann, Matthias; Cadena da Matta, Mariel; Pereira de Lemos Pinto, Marcela Maria; de Salazar E Fernandes, Thiago; Borges da Silva, Edvane; Amaral, Ademir

    2015-08-01

    The scoring of chromosome aberrations is the most reliable biological method for evaluating individual exposure to ionizing radiation. However, microscopic analysis of human chromosome metaphases, generally employed to identify aberrations, mainly dicentrics (chromosomes with two centromeres), is a laborious task. The method is time consuming and its application in biological dosimetry would be almost impossible in the case of a large-scale radiation incident. In this project, generic software was enhanced for automatic chromosome image processing from a framework originally developed for the European Union Framework V project Simbio for applications in the area of source localization from electroencephalographic signals. The platform's capability is demonstrated by a study comparing automatic segmentation strategies for chromosomes in microscopic images.
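
    A minimal sketch of one segmentation strategy of the kind such a platform might compare: global thresholding followed by connected-component labelling to isolate chromosome candidates. It runs on a toy intensity grid in pure Python; a real pipeline would add preprocessing and shape filtering on actual microscope images.

```python
# Sketch: threshold an intensity grid and label 4-connected foreground
# components (candidate chromosomes). Toy data, not a production segmenter.
def segment(image, threshold):
    """Return (labels, component_count) for pixels with intensity > threshold."""
    rows, cols = len(image), len(image[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and labels[r][c] == 0:
                count += 1                      # start a new component
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and image[y][x] > threshold and labels[y][x] == 0:
                        labels[y][x] = count
                        stack.extend([(y + 1, x), (y - 1, x),
                                      (y, x + 1), (y, x - 1)])
    return labels, count
```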

  4. Near-Infrared Scintillation of Liquid Argon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tilly, Elizabeth; Escobar, Carlos

    2017-01-01

    Liquid argon is well known to scintillate in the vacuum ultraviolet (VUV) range, which is inherently difficult to detect. There has been recent evidence to suggest that it also emits near-infrared (NIR) light. If this is the case, many large-scale time projection chambers and other similar detectors will be able to maximize light collection while minimizing cost. The goal of this project is to confirm and quantify this NIR emission. In order to accomplish this, an α-source was placed in a volume of highly purified liquid argon and observed using an infrared PMT with a filter excluding light with wavelength <715 nm. Performing a simple counting experiment, there were indications of NIR scintillation. Further analysis is in progress.

  5. Electronic Engineering Notebook: A software environment for research execution, documentation and dissemination

    NASA Technical Reports Server (NTRS)

    Moerder, Dan

    1994-01-01

    The electronic engineering notebook (EEN) consists of a free-form research notebook, implemented in a commercial package for distributed hypermedia, which includes utilities for graphics capture, formatting and display of LaTeX constructs, and interfaces to the host operating system. The latter capability consists of a computer-aided software engineering (CASE) tool and a means to associate executable scripts with source objects. The EEN runs on Sun and HP workstations. In day-to-day use, the EEN can be used in much the same manner as the research notes most researchers keep during the development of projects. Graphics can be pasted in, equations can be entered via LaTeX, etc. In addition, the fact that the EEN is hypermedia permits easy management of "context", e.g., derivations and data can contain easily formed links to other supporting derivations and data. The CASE tool also permits development and maintenance of source code directly in the notebook, with access to its derivations and data.

  6. Child Maltreatment Surveillance Improvement Opportunities: A Wake County, North Carolina Pilot Project.

    PubMed

    Shanahan, Meghan E; Fliss, Mike D; Proescholdbell, Scott K

    2018-01-01

    BACKGROUND As child maltreatment often occurs in private, child welfare numbers underestimate its true prevalence. Child maltreatment surveillance systems have been used to ascertain more accurate counts of children who experience maltreatment. This manuscript describes the results from a pilot child maltreatment surveillance system in Wake County, North Carolina. METHODS We linked 2010 and 2011 data from 3 sources (Child Protective Services, Raleigh Police Department, and Office of the Chief Medical Examiner) to obtain rates of definite and possible child maltreatment. We separately analyzed emergency department visits from 2010 and 2011 to obtain counts of definite and possible child maltreatment. We then compared the results from the surveillance systems to those obtained from Child Protective Services (CPS) data alone. RESULTS In 2010 and 2011, rates of definite child maltreatment were 11.7 and 11.3 per 1,000 children, respectively, when using the linked data, compared to 10.0 and 9.5 per 1,000 children using CPS data alone. The rates of possible maltreatment were 25.3 and 23.8 per 1,000, respectively. In the 2010 and 2011 emergency department data, there were 68 visits and 84 visits, respectively, that met the case definition for maltreatment. LIMITATIONS While 4 data sources were analyzed, only 3 were linked in the current surveillance system. It is likely that we would have identified more cases of maltreatment had more sources been included. CONCLUSION While the surveillance system identified more children who met the case definition of maltreatment than CPS data alone, the rates of definite child maltreatment were not considerably higher than official reports. Rates of possible child maltreatment were much higher than both the definite case definition and child welfare records. Tracking both definite and possible case definitions and using a variety of data sources provides a more complete picture of child maltreatment in North Carolina. 
©2018 by the North Carolina Institute of Medicine and The Duke Endowment. All rights reserved.

  7. Problem-based learning in optical engineering studies

    NASA Astrophysics Data System (ADS)

    Voznesenskaya, Anna

    2016-09-01

    Nowadays, Problem-Based Learning (PBL) is one of the most promising educational technologies. PBL is based on evaluation of a student's learning outcomes, both professional and personal, instead of the traditional evaluation of theoretical knowledge and selected practical skills. Such an approach requires changes in curriculum development: projects (cases) imitating real tasks from professional life should be introduced. These cases should include a problem summary with the necessary theoretical description, charts, graphs, information sources, etc., a task to implement, and evaluation indicators and criteria. Often these cases are evaluated with the assessment-center method. To motivate students, they can be divided into groups and have a contest on the given task. While PBL looks easy to implement in social, economic or teaching fields, it is considerably more complicated in engineering studies. Examples of cases in first-cycle optical engineering studies are shown in this paper. Procedures for PBL implementation and evaluation are described.

  8. DNAPL Remediation: Selected Projects Approaching Regulatory Closure

    EPA Pesticide Factsheets

    This paper is a status update on the use of DNAPL source reduction remedial technologies, and provides information about recent projects where regulatory closure has been reached or projects are approaching regulatory closure, following source reduction.

  9. 40 CFR 149.111 - Funding to redesigned projects.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROGRAMS (CONTINUED) SOLE SOURCE AQUIFERS Review of Projects Affecting the Edwards Underground Reservoir, A Designated Sole Source Aquifer in the San Antonio, Texas Area § 149.111 Funding to redesigned projects. After... 40 Protection of Environment 22 2010-07-01 2010-07-01 false Funding to redesigned projects. 149...

  10. Detecting the permafrost carbon feedback: talik formation and increased cold-season respiration as precursors to sink-to-source transitions

    NASA Astrophysics Data System (ADS)

    Parazoo, Nicholas C.; Koven, Charles D.; Lawrence, David M.; Romanovsky, Vladimir; Miller, Charles E.

    2018-01-01

    Thaw and release of permafrost carbon (C) due to climate change is likely to offset increased vegetation C uptake in northern high-latitude (NHL) terrestrial ecosystems. Models project that this permafrost C feedback may act as a slow leak, in which case detection and attribution of the feedback may be difficult. The formation of talik, a subsurface layer of perennially thawed soil, can accelerate permafrost degradation and soil respiration, ultimately shifting the C balance of permafrost-affected ecosystems from long-term C sinks to long-term C sources. It is imperative to understand and characterize mechanistic links between talik, permafrost thaw, and respiration of deep soil C to detect and quantify the permafrost C feedback. Here, we use the Community Land Model (CLM) version 4.5, a permafrost and biogeochemistry model, in comparison to long-term deep borehole data along North American and Siberian transects, to investigate thaw-driven C sources in NHL (>55° N) from 2000 to 2300. Widespread talik at depth is projected across most of the NHL permafrost region (14 million km2) by 2300, 6.2 million km2 of which is projected to become a long-term C source, emitting 10 Pg C by 2100, 50 Pg C by 2200, and 120 Pg C by 2300, with few signs of slowing. Roughly half of the projected C source region is in predominantly warm sub-Arctic permafrost following talik onset. This region emits only 20 Pg C by 2300, but the CLM4.5 estimate may be biased low by not accounting for deep C in yedoma. Accelerated decomposition of deep soil C following talik onset shifts the ecosystem C balance away from surface-dominant processes (photosynthesis and litter respiration), but sink-to-source transition dates are delayed by 20-200 years by high ecosystem productivity, such that talik peaks early (~2050s, although borehole data suggest sooner) and the C source transition peaks late (~2150-2200). 
The remaining C source region in cold northern Arctic permafrost, which shifts to a net source early (late 21st century), emits 5 times more C (95 Pg C) by 2300, and prior to talik formation due to the high decomposition rates of shallow, young C in organic-rich soils coupled with low productivity. Our results provide important clues signaling imminent talik onset and C source transition, including (1) late cold-season (January-February) soil warming at depth ( ˜ 2 m), (2) increasing cold-season emissions (November-April), and (3) enhanced respiration of deep, old C in warm permafrost and young, shallow C in organic-rich cold permafrost soils. Our results suggest a mosaic of processes that govern carbon source-to-sink transitions at high latitudes and emphasize the urgency of monitoring soil thermal profiles, organic C age and content, cold-season CO2 emissions, and atmospheric 14CO2 as key indicators of the permafrost C feedback.

  11. Quantifying the sources of ozone, fine particulate matter, and regional haze in the Southeastern United States.

    PubMed

    Odman, M Talat; Hu, Yongtao; Russell, Armistead G; Hanedar, Asude; Boylan, James W; Brewer, Patricia F

    2009-07-01

    A detailed sensitivity analysis was conducted to quantify the contributions of various emission sources to ozone (O3), fine particulate matter (PM2.5), and regional haze in the Southeastern United States. O3 and particulate matter (PM) levels were estimated using the Community Multiscale Air Quality (CMAQ) modeling system, and light extinction values were calculated from modeled PM concentrations. First, the base case was established using the emission projections for the year 2009. Then, in each model run, SO2, primary carbon (PC), NH3, NOx, or VOC emissions from a particular source category in a certain geographic area were reduced by 30%, and the responses were determined by calculating the difference between the results of the reduced-emission case and the base case. The sensitivity of summertime O3 to VOC emissions is small in the Southeast, and ground-level NOx controls are generally more beneficial than elevated NOx controls (per unit mass of emissions reduced). SO2 emission reduction is the most beneficial control strategy for reducing summertime PM2.5 levels and improving visibility in the Southeast, and electric generating utilities are the single largest source of SO2. Controlling PC emissions can be very effective locally, especially in winter. Reducing NH3 emissions is an effective strategy to reduce wintertime ammonium nitrate (NH4NO3) levels and improve visibility; NOx emissions reductions are not as effective. The results presented here will help the development of specific emission control strategies for future attainment of the National Ambient Air Quality Standards in the region.
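    The run-minus-base bookkeeping described above can be sketched in a few lines (an illustrative toy with hypothetical names and numbers, not the CMAQ workflow itself): each control run perturbs one species/source/region combination by 30%, and its response is the concentration change relative to the base case.

```python
def responses(base, runs):
    """base: {pollutant: concentration}.
    runs: {(species, source, region): {pollutant: concentration}} from
    control runs in which that category's emissions were cut by 30%.
    Returns the base-minus-reduced concentration change for every run."""
    return {key: {p: base[p] - conc[p] for p in base}
            for key, conc in runs.items()}

# Hypothetical example: one run cutting utility SO2 emissions in the Southeast
base = {"PM2.5": 14.0, "O3": 70.0}
runs = {("SO2", "EGU", "Southeast"): {"PM2.5": 12.9, "O3": 70.0}}
delta = responses(base, runs)[("SO2", "EGU", "Southeast")]
```

    A positive response means the 30% cut lowered the modeled concentration; dividing by the mass of emissions removed would give the per-unit-mass benefit the abstract uses to compare, for example, ground-level and elevated NOx controls.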

  12. Application of Multi-Frequency Modulation (MFM) for High-Speed Data Communications to a Voice Frequency Channel

    DTIC Science & Technology

    1990-06-01

    reader is cautioned that computer programs developed in this research may not have been exercised for all cases of interest. While every effort has been... ...formats. Previous applications of these encoding formats were on industry standard computers (PC) over a 16-20 kHz channel. This report discusses the

  13. Report on the Feasibility of Three Data Bases as Sources for the Ambulatory Resource Analysis Project,

    DTIC Science & Technology

    1991-01-08

    type (250.00) * Otitis media, unspecified (382.9). This latter diagnosis was confined largely to children. It should be noted that one of the top ten... Otitis media, unspecified (382.9) * Routine infant or child care (V20.2). Essential hypertension was either the first or second most common diagnosis in... 401) * Suppurative and unspecified otitis media (382.0) * Sprains and strains, ankle (845.0) * Acute nasopharyngitis (460). As was the case for the

  14. Observation of valley-selective microwave transport in photonic crystals

    NASA Astrophysics Data System (ADS)

    Ye, Liping; Yang, Yuting; Hang, Zhi Hong; Qiu, Chunyin; Liu, Zhengyou

    2017-12-01

    Recently, the discrete valley degree of freedom has attracted extensive attention in condensed matter physics. Here, we present an experimental observation of the intriguing valley transport for microwaves in photonic crystals, including the bulk valley transport and the valley-projected edge modes along the interface separating different photonic insulating phases. For both cases, valley-selective excitations are realized by a point-like chiral source located at proper locations inside the samples. Our results are promising for exploring unprecedented routes to manipulate microwaves.

  15. Technology commercialization cost model and component case study

    NASA Astrophysics Data System (ADS)

    1991-12-01

    Fuel cells seem poised to emerge as a clean, efficient, and cost-competitive source of fossil-fuel-based electric power and thermal energy. Sponsors of fuel cell technology development need to determine the validity and the attractiveness of a technology to the market in terms of meeting requirements and providing value that exceeds the total cost of ownership. Sponsors of fuel cell development have addressed this issue by requiring the developers to prepare projections of the future production cost of their fuel cells in commercial quantities. These projected costs, together with performance and life projections, provide a preliminary measure of the total value and cost of the product to the customer. Booz-Allen & Hamilton Inc. and Michael A. Cobb & Company have been retained in several assignments over the years to audit these cost projections. The audits have gone well beyond a simple review of the numbers. They have probed the underlying technical and financial assumptions and the sources of data on material and equipment costs, and explored issues such as the realistic manufacturing yields that can be expected in various processes. Based on the experience gained from these audits, DOE tasked Booz-Allen and Michael A. Cobb & Company with developing criteria to be used in the execution of future fuel cell manufacturing cost studies. It was thought that such criteria would make future studies easier to execute, as well as more understandable and comparable.

  16. The GenABEL Project for statistical genomics

    PubMed Central

    Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.

    2016-01-01

    Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices, including use of public version control, code review, and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381

  17. User Participation in Coproduction of Health Innovation: Proposal for a Synergy Project

    PubMed Central

    Zukauskaite, Elena; Westberg, Niklas

    2018-01-01

    Background This project concerns advancing knowledge, methods, and logic for user participation in coproduction of health innovations. Such advancement is vital for several reasons. From a user perspective, participation in coproduction provides an opportunity to gain real influence over goal definition, design, and implementation of health innovations, ensuring that the solution developed solves real problems in the right ways. From a societal perspective, it is a means to improve the efficiency of health care and the implementation of the Patient Act. As for industry, frameworks and knowledge of coproduction offer tools to operate in a complex sector, with great potential for innovation of services and products. Objective The fundamental objective of this project is to advance knowledge and methods of how user participation in the coproduction of health innovations can be applied in order to benefit users, industry, and the public sector. Methods This project is a synergy project, which means that the objective will be accomplished through collaboration and meta-analysis across three subprojects that address different user groups, apply different strategies to promote human health, and relate to different parts of the health sector. Furthermore, the subprojects focus on distinctive stages in the spectrum of innovation, with the objective to generate knowledge of the innovation process as a whole. The project is organized around three work packages related to three challenges—coproduction, positioning, and realization. Each subproject is designed such that it has its own field of study with clearly identified objectives but also targets work packages to contribute to the project as a whole. The work on the work packages will use case methodology for data collection and analysis based on the subprojects as data sources.
    More concretely, the logic of multiple case studies will be applied, with each subproject representing a separate case; the cases are similar to one another in their attention to user participation in coproduction but differ regarding, for example, context and target groups. At the synergy level, the framework methodology will be used to handle and analyze the vast amount of information generated within the subprojects. Results The project period is from July 1, 2018, to June 30, 2022. Conclusions By addressing the objective of this project, we will create new knowledge on how to manage challenges to health innovation associated with the coproduction process, the positioning of solutions, and realization. PMID:29743159

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shanks, Katherine S.; Philipp, Hugh T.; Weiss, Joel T.

    Experiments at storage ring light sources as well as at next-generation light sources increasingly require detectors capable of high dynamic range operation, combining low-noise detection of single photons with large pixel well depth. XFEL sources in particular provide pulse intensities sufficiently high that a purely photon-counting approach is impractical. The High Dynamic Range Pixel Array Detector (HDR-PAD) project aims to provide a dynamic range extending from single-photon sensitivity to 10^6 photons/pixel in a single XFEL pulse while maintaining the ability to tolerate a sustained flux of 10^11 ph/s/pixel at a storage ring source. Achieving these goals involves the development of fast pixel front-end electronics as well as, in the XFEL case, leveraging the delayed charge collection due to plasma effects in the sensor. A first prototype of essential electronic components of the HDR-PAD readout ASIC, exploring different options for the pixel front-end, has been fabricated. Here, the HDR-PAD concept and preliminary design will be described.

  19. Helicon Wave Physics Impacts on Electrodeless Thruster Design

    NASA Technical Reports Server (NTRS)

    Gilland, James H.

    2007-01-01

    Effective generation of helicon waves for high-density plasma sources is determined by the dispersion relation and plasma power balance. Helicon wave plasma sources inherently require an applied magnetic field of 0.01-0.1 T, an antenna properly designed to couple to the helicon wave in the plasma, and an rf power source in the 10s to 100s of MHz, depending on propellant choice. For a plasma thruster, particularly one with a high specific impulse (>2000 s), the physics of the discharge would also have to address the use of electron cyclotron resonance (ECR) heating and magnetic expansion. In all cases the system design includes an optimized magnetic field coil, plasma source chamber, and antenna. A preliminary analysis of such a system, calling on experimental data where applicable and calculations where required, has been initiated at Glenn Research Center. Analysis results showing the mass scaling of various components, as well as thruster performance projections and their impact on thruster size, are discussed.

  1. Sources of magnetic fields in recurrent interplanetary streams

    NASA Technical Reports Server (NTRS)

    Burlaga, L. F.; Behannon, K. W.; Hansen, S. F.; Pneuman, G. W.; Feldman, W. C.

    1977-01-01

    The sources of magnetic fields in recurrent streams were examined. Most fields and plasmas at 1 AU were related to coronal holes, and the magnetic field lines were open in those holes. Some of the magnetic fields and plasmas were related to open field line regions on the sun which were not associated with known coronal holes, indicating that open field lines are more basic than coronal holes as sources of the solar wind. Magnetic field intensities in five equatorial coronal holes ranged from 2 G to 18 G. Average measured photospheric magnetic fields along the footprints of the corresponding unipolar fields on circular equatorial arcs at 2.5 solar radii had a similar range and average, but in two cases the intensities were approximately three times higher than the projected intensities. The coronal footprints of the sector boundaries on the source surface at 2.5 solar radii meandered between -45 deg and +45 deg latitude, and their inclination ranged from near zero to near ninety degrees.

  2. Local SPTHA through tsunami inundation simulations: a test case for two coastal critical infrastructures in the Mediterranean

    NASA Astrophysics Data System (ADS)

    Volpe, M.; Selva, J.; Tonini, R.; Romano, F.; Lorito, S.; Brizuela, B.; Argyroudis, S.; Salzano, E.; Piatanesi, A.

    2016-12-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) is a methodology to assess the exceedance probability for different thresholds of tsunami hazard intensity, at a specific site or region in a given time period, due to a seismic source. A large number of high-resolution inundation simulations is typically required to take into account the full variability of potential seismic sources and their slip distributions. Starting from regional SPTHA offshore results, the computational cost can be reduced by considering only a subset of 'important' scenarios for inundation calculations. Here we use a method based on an event tree for the treatment of the seismic source aleatory variability; a cluster analysis on the offshore results to define the important sources; and epistemic uncertainty treatment through an ensemble modeling approach. We consider two target sites in the Mediterranean (Milazzo, Italy, and Thessaloniki, Greece) where coastal (non-nuclear) critical infrastructures (CIs) are located. After performing a regional SPTHA covering the whole Mediterranean, for each target site a few hundred representative scenarios are filtered out of all the potential seismic sources and the tsunami inundation is explicitly modeled, obtaining a site-specific SPTHA with a complete characterization of the tsunami hazard in terms of flow depth and velocity time histories. Moreover, we also explore the variability of SPTHA at the target site accounting for coseismic deformation (i.e., uplift or subsidence) due to near-field sources located in very shallow water. The results are suitable for, and will be applied to, subsequent multi-hazard risk analysis for the CIs. These applications have been developed in the framework of the Italian Flagship Project RITMARE, the EC FP7 ASTARTE (Grant agreement 603839) and STREST (Grant agreement 603389) projects, and the INGV-DPC Agreement.
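    The exceedance-probability bookkeeping at the core of SPTHA can be sketched as follows (an assumed Poisson-process form with hypothetical scenario rates and depths, not the authors' implementation): each representative scenario contributes its annual rate whenever its modeled intensity at the site exceeds the threshold.

```python
import math

def exceedance_prob(scenarios, threshold, years):
    """scenarios: iterable of (annual_rate, flow_depth_at_site) pairs.
    Sums the rates of threshold-exceeding scenarios and converts the total
    to the probability of at least one exceedance in `years`, assuming
    Poisson occurrence of the seismic sources."""
    rate = sum(r for r, depth in scenarios if depth > threshold)
    return 1.0 - math.exp(-rate * years)

# Hypothetical representative scenarios after cluster-analysis filtering:
# (annual rate in 1/yr, flow depth at the site in m)
scenarios = [(0.01, 2.0), (0.001, 5.0), (0.02, 0.5)]
p50 = exceedance_prob(scenarios, threshold=1.0, years=50)
```

    Evaluating the function over a range of thresholds yields the site's hazard curve; repeating it over alternative model branches gives the ensemble spread used for epistemic uncertainty.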

  3. A preliminary probabilistic analysis of tsunami sources of seismic and non-seismic origin applied to the city of Naples, Italy

    NASA Astrophysics Data System (ADS)

    Tonini, R.; Anita, G.

    2011-12-01

    In both worldwide and regional historical catalogues, most tsunamis are caused by earthquakes, and only a minor percentage is represented by all the other non-seismic sources. On the other hand, tsunami hazard and risk studies are often applied to very specific areas, where this global trend can be different or even inverted, depending on the kind of potential tsunamigenic sources which characterize the case study. So far, few probabilistic approaches consider the contribution of landslides and/or phenomena derived from volcanic activity, i.e., pyroclastic flows and flank collapses, as predominant in the PTHA, partly because of the difficulty of estimating the corresponding recurrence times. These considerations are valid, for example, for the city of Naples, Italy, which is surrounded by a complex active volcanic system (Vesuvio, Campi Flegrei, Ischia) that presents a significant number of potential tsunami sources of non-seismic origin compared to the seismic ones. In this work we present the preliminary results of a probabilistic multi-source tsunami hazard assessment applied to Naples. The method to estimate the uncertainties will be based on Bayesian inference. This is the first step towards a more comprehensive task which will provide a tsunami risk quantification for this town in the frame of the Italian national project ByMuR (http://bymur.bo.ingv.it). This ongoing three-year project has the final objective of developing a Bayesian multi-risk methodology to quantify the risk related to different natural hazards (volcanoes, earthquakes, and tsunamis) applied to the city of Naples.

  4. A Voyage of Mathematical and Cultural Awareness for Students of Upper Secondary School

    NASA Astrophysics Data System (ADS)

    Panagiotou, Evangelos N.

    2014-01-01

    Many papers have emphasized the need for and importance of particular examples and the underlying rationale for introducing a historical dimension in mathematics education. This article presents the development and implementation of a project, based on original sources, in a situation where the existing curriculum does not include history. The subject was conic sections: the motivating problems and the original work that eventually found resolution in modern concepts. The project was carried out during the school year 2006-2007 with 18 students of the 2nd class (11th grade) of a Greek experimental high school. It was devised as a series of worksheets, separate readings, oral presentations, and written essays, so that students might appreciate that mathematics evolves under the influence of factors intrinsic and extrinsic to it. Both epistemological and disciplinary issues are taken into account. Even though this work is just one case study, we have found that exposing students directly to primary sources in mathematics contributes greatly to motivation and understanding, and illustrates the nature of mathematics as a discipline and as a human endeavour.

  5. Research on cross-project software defect prediction based on transfer learning

    NASA Astrophysics Data System (ADS)

    Chen, Ya; Ding, Xiaoming

    2018-04-01

    To address two challenges in cross-project software defect prediction, namely the distribution differences between the source project and target project datasets and the class imbalance in the datasets, we propose a cross-project software defect prediction method based on transfer learning, named NTrA. Firstly, the class imbalance of the source project data is resolved using the Augmented Neighborhood Cleaning Algorithm. Secondly, the data gravitation method is used to assign different weights on the basis of the attribute similarity of the source project and target project data. Finally, a defect prediction model is constructed using the TrAdaBoost algorithm. Experiments were conducted using data from NASA and SOFTLAB, taken from the published PROMISE dataset. The results show that the method achieved good values of recall and F-measure, and good prediction results overall.
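    The boosting algorithm the abstract names is usually rendered TrAdaBoost; a minimal sketch of TrAdaBoost-style transfer boosting with decision stumps follows (hypothetical toy data and learner, assuming binary labels and a single numeric feature, not the paper's NTrA implementation). The core transfer idea: source-project instances are down-weighted when misclassified, while target-project instances are up-weighted, so later rounds focus on data matching the target distribution.

```python
import math

def stump(x, t):
    # threshold classifier: predict defective (1) when the metric x >= t
    return 1 if x >= t else 0

def best_stump(xs, ys, w):
    # pick the threshold with the lowest weighted error
    best_t, best_err = None, float("inf")
    for t in sorted(set(xs)):
        err = sum(wi for xi, yi, wi in zip(xs, ys, w) if stump(xi, t) != yi)
        if err < best_err:
            best_t, best_err = t, err
    return best_t

def tradaboost(src, tgt, rounds=5):
    # src, tgt: lists of (feature, label) pairs; labels in {0, 1}
    n, m = len(src), len(tgt)
    xs = [x for x, _ in src + tgt]
    ys = [y for _, y in src + tgt]
    w = [1.0 / (n + m)] * (n + m)
    beta_src = 1.0 / (1.0 + math.sqrt(2.0 * math.log(max(n, 2)) / rounds))
    learners = []
    for _ in range(rounds):
        total = sum(w)
        wn = [wi / total for wi in w]
        t = best_stump(xs, ys, wn)
        # error is measured on the target-project portion only
        tgt_w = sum(wn[n:])
        eps = sum(wn[n + j] for j in range(m)
                  if stump(xs[n + j], t) != ys[n + j]) / tgt_w
        eps = min(max(eps, 1e-10), 0.499)
        beta_t = eps / (1.0 - eps)
        for i in range(n):                  # source: shrink weight on mistakes
            if stump(xs[i], t) != ys[i]:
                w[i] *= beta_src
        for j in range(m):                  # target: grow weight on mistakes
            if stump(xs[n + j], t) != ys[n + j]:
                w[n + j] /= beta_t
        learners.append((t, beta_t))
    return learners

def predict(learners, x):
    # TrAdaBoost votes with the second half of the learners
    half = learners[len(learners) // 2:]
    score = sum(-math.log(b) * stump(x, t) for t, b in half)
    return 1 if score >= 0.5 * sum(-math.log(b) for _, b in half) else 0

# Hypothetical data: plentiful source-project and scarce target-project pairs
src = [(0.0, 0), (1.0, 0), (2.0, 1), (3.0, 1)]
tgt = [(0.5, 0), (2.5, 1)]
model = tradaboost(src, tgt)
```

    In the paper's pipeline, the initial weights would additionally be scaled by the data-gravitation similarity between each source instance and the target project before boosting begins.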

  6. An open source toolkit for medical imaging de-identification.

    PubMed

    González, David Rodríguez; Carpenter, Trevor; van Hemert, Jano I; Wardlaw, Joanna

    2010-08-01

    Medical imaging acquired for clinical purposes can have several legitimate secondary uses in research projects and teaching libraries. No commonly accepted solution for anonymising these images exists because the amount of personal data that should be preserved varies case by case. Our objective is to provide a flexible mechanism for anonymising Digital Imaging and Communications in Medicine (DICOM) data that meets the requirements for deployment in multicentre trials. We reviewed our current de-identification practices and defined the relevant use cases to extract the requirements for the de-identification process. We then used these requirements in the design and implementation of the toolkit. Finally, we tested the toolkit taking as a reference those requirements, including a multicentre deployment. The toolkit successfully anonymised DICOM data from various sources. Furthermore, it was shown that it could forward anonymous data to remote destinations, remove burned-in annotations, and add tracking information to the header. The toolkit also implements the DICOM standard confidentiality mechanism. A DICOM de-identification toolkit that facilitates the enforcement of privacy policies was developed. It is highly extensible, provides the necessary flexibility to account for different de-identification requirements and has a low adoption barrier for new users.
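    The case-by-case flexibility described above can be illustrated with a rule-driven scrubber over a header dictionary (a hypothetical sketch, not the toolkit's actual API; the tag names follow DICOM keyword conventions, and the profile contents are assumptions): each attribute is removed, replaced, or kept according to a configurable profile, with unknown attributes removed by default, and tracking information can be appended for multicentre trials.

```python
REMOVE, REPLACE, KEEP = "remove", "replace", "keep"

# Illustrative profile: the amount of personal data preserved varies per study
DEFAULT_PROFILE = {
    "PatientName":      (REPLACE, "ANONYMOUS"),
    "PatientID":        (REPLACE, "ID-REMOVED"),
    "PatientBirthDate": (REMOVE, None),
    "StudyDate":        (KEEP, None),  # often needed for longitudinal trials
}

def deidentify(header, profile=DEFAULT_PROFILE, trial_id=None):
    """Return a scrubbed copy of a DICOM-like header dict."""
    out = {}
    for tag, value in header.items():
        action, repl = profile.get(tag, (REMOVE, None))  # default-deny
        if action == KEEP:
            out[tag] = value
        elif action == REPLACE:
            out[tag] = repl
    if trial_id is not None:             # add tracking info to the header
        out["ClinicalTrialSubjectID"] = trial_id
    return out
```

    Swapping in a different profile per deployment is what gives the de-identification process the per-trial flexibility the abstract emphasises.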

  7. The dialog between health and foreign policy in Brazilian cooperation in human milk banks.

    PubMed

    Pittas, Tiago Mocellin; Dri, Clarissa Franzoi

    2017-07-01

    Mother's milk is the primary source of nourishment in early infancy. When this source is unavailable, secondary sources may be used, such as human milk banks. The first milk bank in Brazil was created in 1943, and milk banks have been in use ever since. A national model was developed through a number of phases, culminating in the Brazilian Network of Human Milk Banks. This gave rise to a number of international cooperation projects, with the Brazilian model particularly relevant for developing nations. The main objective of this analysis is to understand what drives Brazil to promote milk banks internationally. To do this, we examined the relationship between health and foreign policy, expressed here as soft power, since in this case the two areas are in dialog with one another. The results include gains in both areas and the affirmation of health as a central goal of the national interest in this case.

  8. EXPERIENCES FROM THE SOURCE-TERM ANALYSIS OF A LOW AND INTERMEDIATE LEVEL RADWASTE DISPOSAL FACILITY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jin Beak; Park, Joo-Wan; Lee, Eun-Young

    2003-02-27

    Enhancement of the computer code SAGE for evaluation of the Korean concept for a LILW waste disposal facility is discussed. Several features of source-term analysis are embedded in SAGE to analyze: (1) effects of the degradation mode of an engineered barrier, (2) effects of dispersion phenomena in the unsaturated zone, and (3) effects of a time-dependent sorption coefficient in the unsaturated zone. IAEA's Vault Safety Case (VSC) approach is used to demonstrate the ability of this assessment code. Results of MASCOT are used for comparison purposes. These enhancements of the safety assessment code SAGE can contribute to a realistic evaluation of the Korean concept of the LILW disposal project in the near future.

  9. Probabilities for gravitational lensing by point masses in a locally inhomogeneous universe

    NASA Technical Reports Server (NTRS)

    Isaacson, Jeffrey A.; Canizares, Claude R.

    1989-01-01

    Probability functions for gravitational lensing by point masses that incorporate Poisson statistics and flux conservation are formulated in the Dyer-Roeder construction. Optical depths to lensing for distant sources are calculated using both the method of Press and Gunn (1973) which counts lenses in an otherwise empty cone, and the method of Ehlers and Schneider (1986) which projects lensing cross sections onto the source sphere. These are then used as parameters of the probability density for lensing in the case of a critical (q0 = 1/2) Friedmann universe. A comparison of the probability functions indicates that the effects of angle-averaging can be well approximated by adjusting the average magnification along a random line of sight so as to conserve flux.

  10. Climate model uncertainty in impact assessments for agriculture: A multi-ensemble case study on maize in sub-Saharan Africa

    NASA Astrophysics Data System (ADS)

    Dale, Amy; Fant, Charles; Strzepek, Kenneth; Lickley, Megan; Solomon, Susan

    2017-03-01

    We present maize production in sub-Saharan Africa as a case study in the exploration of how uncertainties in global climate change, as reflected in projections from a range of climate model ensembles, influence climate impact assessments for agriculture. The crop model AquaCrop-OS (Food and Agriculture Organization of the United Nations) was modified to run on a 2° × 2° grid and coupled to 122 climate model projections from multi-model ensembles for three emission scenarios (Coupled Model Intercomparison Project Phase 3 [CMIP3] SRES A1B and CMIP5 Representative Concentration Pathway [RCP] scenarios 4.5 and 8.5) as well as two "within-model" ensembles (NCAR CCSM3 and ECHAM5/MPI-OM) designed to capture internal variability (i.e., uncertainty due to chaos in the climate system). In spite of high uncertainty, most notably in the high-producing semi-arid zones, we observed robust regional and sub-regional trends across all ensembles. In agreement with previous work, we project widespread yield losses in the Sahel region and Southern Africa, resilience in Central Africa, and sub-regional increases in East Africa and at the southern tip of the continent. Spatial patterns of yield losses corresponded with spatial patterns of aridity increases, which were explicitly evaluated. Internal variability was a major source of uncertainty in both within-model and between-model ensembles and explained the majority of the spatial distribution of uncertainty in yield projections. Projected climate change impacts on maize production in different regions and nations ranged from near-zero or positive (upper quartile estimates) to substantially negative (lower quartile estimates), highlighting a need for risk management strategies that are adaptive and robust to uncertainty.

  11. The Chandra Source Catalog: Background Determination and Source Detection

    NASA Astrophysics Data System (ADS)

    McCollough, Michael L.; Rots, Arnold; Primini, Francis A.; Evans, Ian N.; Glotfelty, Kenny J.; Hain, Roger; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hall, Diane M.; Harbo, Peter N.; He, Xiang Qun (Helen); Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Siemiginowska, Aneta L.; Sundheim, Beth A.; Tibbetts, Michael S.; van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2009-09-01

    The Chandra Source Catalog (CSC) is a major project in which all of the pointed imaging observations taken by the Chandra X-Ray Observatory are used to generate one of the most extensive X-ray source catalogs produced to date. Early in the development of the CSC it was recognized that the ability to estimate local background levels in an automated fashion would be critical for essential CSC tasks such as source detection, photometry, sensitivity estimates, and source characterization. We present a discussion of how such background maps are created directly from the Chandra data and how they are used in source detection. The general background for Chandra observations is rather smoothly varying, containing only low spatial frequency components. However, in the case of ACIS data, a high spatial frequency component is added that is due to the readout streaks of the CCD chips. We discuss how these components can be estimated reliably using the Chandra data and what limitations and caveats should be considered in their use. We will discuss the source detection algorithm used for the CSC and the effects of the background images on the detection results. We will also touch on some of the Catalog Inclusion and Quality Assurance criteria applied to the source detection results. This work is supported by NASA contract NAS8-03060 (CXC).

  12. Chandra Source Catalog: Background Determination and Source Detection

    NASA Astrophysics Data System (ADS)

    McCollough, Michael L.; Rots, A. H.; Primini, F. A.; Evans, I. N.; Glotfelty, K. J.; Hain, R.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Evans, J. D.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Refsdal, B. L.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    The Chandra Source Catalog (CSC) is a major project in which all of the pointed imaging observations taken by the Chandra X-Ray Observatory will be used to generate the most extensive X-ray source catalog produced to date. Early in the development of the CSC it was recognized that the ability to estimate local background levels in an automated fashion would be critical for essential CSC tasks such as source detection, photometry, sensitivity estimates, and source characterization. We present a discussion of how such background maps are created directly from the Chandra data and how they are used in source detection. The general background for Chandra observations is rather smoothly varying, containing only low spatial frequency components. However, in the case of ACIS data, a high spatial frequency component is added that is due to the readout streaks of the CCD chips. We discuss how these components can be estimated reliably using the Chandra data and what limitations and caveats should be considered in their use. We will discuss the source detection algorithm used for the CSC and the effects of the background images on the detection results. We will also touch on some of the Catalog Inclusion and Quality Assurance criteria applied to the source detection results. This work is supported by NASA contract NAS8-03060 (CXC).

  13. The Impact of Pollution Prevention on Toxic Environmental Releases from U.S. Manufacturing Facilities.

    PubMed

    Ranson, Matthew; Cox, Brendan; Keenan, Cheryl; Teitelbaum, Daniel

    2015-11-03

    Between 1991 and 2012, the facilities that reported to the U.S. Environmental Protection Agency's Toxic Release Inventory (TRI) Program conducted 370,000 source reduction projects. We use this data set to conduct the first quasi-experimental retrospective evaluation of how implementing a source reduction (pollution prevention) project affects the quantity of toxic chemicals released to the environment by an average industrial facility. We use a differences-in-differences methodology, which measures how implementing a source reduction project affects a facility's releases of targeted chemicals, relative to releases of (a) other untargeted chemicals from the same facility, or (b) the same chemical from other facilities in the same industry. We find that the average source reduction project causes a 9-16% decrease in releases of targeted chemicals in the year of implementation. Source reduction techniques vary in effectiveness: for example, raw material modification causes a large decrease in releases, while inventory control has no detectable effect. Our analysis suggests that in aggregate, the source reduction projects carried out in the U.S. since 1991 have prevented between 5 and 14 billion pounds of toxic releases.
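The differences-in-differences estimator described in this abstract can be sketched in a few lines; the numbers below are illustrative, not TRI data:

```python
import numpy as np

def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Differences-in-differences: the change in the treated series minus
    the change in the comparison series."""
    return (np.mean(treated_post) - np.mean(treated_pre)) - \
           (np.mean(control_post) - np.mean(control_pre))

# Toy releases (pounds): the targeted chemical drops after the source
# reduction project, while the untargeted comparison chemical stays flat.
treated_pre, treated_post = [100.0, 102.0, 98.0], [88.0, 90.0, 86.0]
control_pre, control_post = [100.0, 101.0, 99.0], [100.0, 99.0, 101.0]

effect = did_estimate(treated_pre, treated_post, control_pre, control_post)
print(effect)  # -12.0: a 12-unit reduction attributable to the project
```

The paper's comparison series are either other, untargeted chemicals at the same facility or the same chemical at other facilities in the same industry; the toy pair above simply stands in for one such comparison.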

  14. The CSAICLAWPS project: a multi-scalar, multi-data source approach to providing climate services for both modelling of climate change impacts on crop yields and development of community-level adaptive capacity for sustainable food security

    NASA Astrophysics Data System (ADS)

    Forsythe, N. D.; Fowler, H. J.

    2017-12-01

The "Climate-smart agriculture implementation through community-focused pursuit of land and water productivity in South Asia" (CSAICLAWPS) project is a research initiative funded by the (UK) Royal Society through its Challenge Grants programme, which is part of the broader UK Global Challenges Research Fund (GCRF). CSAICLAWPS has three objectives: a) development of "added-value" - bias assessed, statistically down-scaled - climate projections for selected case study sites across South Asia; b) investigation of crop failure modes under both present (observed) and future (projected) conditions; and c) facilitation of local adaptive capacity and resilience through stakeholder engagement. At AGU we will be presenting both next steps and progress to date toward these three objectives: [A] We have carried out bias assessments of a substantial multi-model RCM ensemble (MME) from the CORDEX South Asia (CORDEX-SA) domain for case studies in three countries - Pakistan, India and Sri Lanka - and (stochastically) produced synthetic time-series for these sites from local observations using a Python-based implementation of the principles underlying the Climate Research Unit Weather Generator (CRU-WG), in order to enable probabilistic simulation of current crop yields. [B] We have characterised the present response of local crop yields to climate variability in key case study sites using AquaCrop simulations parameterised with input (agronomic practices, soil conditions, etc.) from smallholder farmers. [C] We have implemented community-based hydro-climatological monitoring in several case study "revenue villages" (panchayats) in the Nainital District of Uttarakhand. The purpose of this is not only to increase the availability of meteorological data, but also, over time, to build quantitative awareness of present climate variability and potential future conditions (as projected by RCMs).
Next steps in our work will include: 1) future crop yield simulations driven by "perturbation" of synthetic time-series using "change factors" from the CORDEX-SA MME; 2) stakeholder dialogues critically evaluating potential strategies at the grassroots (implementation) level to mitigate impacts of climate variability and change on crop yields.
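In its simplest multiplicative form, the "change factor" perturbation mentioned in step 1 rescales each day of a synthetic series by a per-month future/baseline ratio. A minimal sketch, with illustrative factors (not values from the CORDEX-SA MME):

```python
import numpy as np

def apply_change_factors(series, months, monthly_factors):
    """Multiply each daily value by the change factor for its calendar month
    (a future / baseline ratio taken from an RCM ensemble)."""
    f = np.array([monthly_factors[m] for m in months], dtype=float)
    return np.asarray(series, dtype=float) * f

precip = [5.0, 0.0, 12.0, 3.0]   # mm/day from the weather generator
months = [1, 1, 7, 7]            # calendar month of each day
factors = {1: 1.10, 7: 0.85}     # +10% in January, -15% in July (illustrative)

future = apply_change_factors(precip, months, factors)
print(future)
```

Additive (delta) change factors are handled analogously for temperature; the multiplicative form shown here is the usual choice for precipitation because it preserves dry days.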

  15. Systematic study of target localization for bioluminescence tomography guided radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Jingjing; Zhang, Bin; Reyes, Juvenal

Purpose: To overcome the limitation of CT/cone-beam CT (CBCT) in guiding radiation for soft tissue targets, the authors developed a spectrally resolved bioluminescence tomography (BLT) system for the small animal radiation research platform. The authors systematically assessed the performance of the BLT system in terms of target localization and the ability to resolve two neighboring sources in simulations, tissue-mimicking phantom, and in vivo environments. Methods: Multispectral measurements acquired in a single projection were used for the BLT reconstruction. The incomplete variables truncated conjugate gradient algorithm with an iterative permissible region shrinking strategy was employed as the optimization scheme to reconstruct source distributions. Simulation studies were conducted for single spherical sources with sizes from 0.5 to 3 mm radius at depths of 3–12 mm. The same configuration was also applied for the double-source simulation with source separations varying from 3 to 9 mm. Experiments were performed in a standalone BLT/CBCT system. Two self-illuminated sources with 3 and 4.7 mm separations placed inside a tissue-mimicking phantom were chosen as the test cases. Live mice implanted with a single source at 6 and 9 mm depth, two sources at 3 and 5 mm separation at a depth of 5 mm, or three sources in the abdomen were also used to illustrate the localization capability of the BLT system for multiple targets in vivo. Results: For the simulation study, approximately 1 mm accuracy can be achieved in localizing the center of mass (CoM) for single-source and the grouped CoM for double-source cases. For the case of a 1.5 mm radius source, a common tumor size used in preclinical studies, their simulations show that for all the source separations considered, except for the 3 mm separation at 9 and 12 mm depth, the two neighboring sources can be resolved at depths from 3 to 12 mm. 
Phantom experiments illustrated that 2D bioluminescence imaging failed to distinguish the two sources, but BLT can provide 3D source localization with approximately 1 mm accuracy. The in vivo results are encouraging: 1 and 1.7 mm accuracy can be attained for the single-source case at 6 and 9 mm depth, respectively. For the two-source in vivo study, both sources can be distinguished at 3 and 5 mm separations, and approximately 1 mm localization accuracy can also be achieved. Conclusions: This study demonstrated that their multispectral BLT/CBCT system could potentially be applied to localize and resolve multiple sources over a wide range of source sizes, depths, and separations. The average accuracy of localizing the CoM for single-source and the grouped CoM for double-source cases is approximately 1 mm, except for deep-seated targets. The information provided in this study can be instructive for devising treatment margins for BLT-guided irradiation. These results also suggest that the 3D BLT system could guide radiation in situations with multiple targets, such as metastatic tumor models.
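The center-of-mass (CoM) localization metric quoted throughout these results is an intensity-weighted mean of the reconstructed voxel positions; a sketch with hypothetical voxel values (not the study's reconstructions):

```python
import numpy as np

def center_of_mass(coords, weights):
    """Intensity-weighted center of mass of a reconstructed source."""
    coords = np.asarray(coords, dtype=float)
    w = np.asarray(weights, dtype=float)
    return (coords * w[:, None]).sum(axis=0) / w.sum()

# Hypothetical reconstruction: four voxels spread symmetrically around a
# true source at (5, 5, 6) mm, each with a reconstructed intensity.
voxels = [(4.5, 5.0, 6.0), (5.5, 5.0, 6.0), (5.0, 4.5, 6.0), (5.0, 5.5, 6.0)]
intensities = [1.0, 1.0, 1.0, 1.0]

com = center_of_mass(voxels, intensities)
error_mm = np.linalg.norm(com - np.array([5.0, 5.0, 6.0]))
print(com.tolist(), error_mm)  # CoM recovers (5, 5, 6); zero error here
```

For the double-source cases, the paper's "grouped CoM" is the same computation applied to the voxels assigned to each resolved source.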

  16. Removal of EOG Artifacts from EEG Recordings Using Stationary Subspace Analysis

    PubMed Central

    Zeng, Hong; Song, Aiguo

    2014-01-01

An effective approach is proposed in this paper to remove ocular artifacts from the raw EEG recording. The proposed approach first conducts blind source separation on the raw EEG recording using the stationary subspace analysis (SSA) algorithm. Unlike classic blind source separation algorithms, SSA is explicitly tailored to modeling distribution changes, taking both the mean and the covariance matrix into account. In addition, SSA requires neither independence nor uncorrelatedness among the sources. It can therefore concentrate artifacts in fewer components than representative blind source separation methods. Next, the components determined to be related to the ocular artifacts are projected back and subtracted from the EEG signals, eventually producing clean EEG data. The experimental results on both artificially contaminated EEG data and real EEG data demonstrate the effectiveness of the proposed method, in particular for cases where a limited number of electrodes is used for the recording, as well as when the artifact-contaminated signal is highly nonstationary and the underlying sources cannot be assumed to be independent or uncorrelated. PMID:24550696
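The final removal step, projecting the identified artifact components back to channel space and subtracting them, is common to all blind source separation pipelines. A minimal sketch with a known 2x2 mixing matrix standing in for the SSA decomposition (which is considerably more involved):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two underlying sources: a 10 Hz "brain" oscillation and a slow ocular drift.
t = np.linspace(0.0, 1.0, 500)
brain = np.sin(2 * np.pi * 10 * t)
eog = np.cumsum(rng.normal(size=t.size)) / 20.0
S = np.vstack([brain, eog])               # sources x samples

A = np.array([[1.0, 0.8],                 # mixing matrix: each channel is a
              [0.5, 1.2]])                # weighted sum of the two sources
X = A @ S                                 # observed two-channel "EEG"

# Assume the BSS step (SSA in the paper) recovered A and S. Removal:
# zero the ocular component, then project back to channel space.
S_clean = S.copy()
S_clean[1, :] = 0.0
X_clean = A @ S_clean

# Each cleaned channel now contains only its scaled brain contribution.
residual = np.max(np.abs(X_clean - np.outer(A[:, 0], brain)))
print(residual)
```

In practice A and S come from the decomposition algorithm itself and the artifact components must be identified (e.g. by correlation with an EOG reference); the back-projection step, however, is exactly as shown.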

  17. Water Resources Management in Turkey as a Case Study Southeastern Anatolia Project (gap)

    NASA Astrophysics Data System (ADS)

    Ačma, Bülent

    2010-05-01

The Southeastern Anatolia Project (GAP), one of the world's most important projects for developing remarkable natural resources, is regarded as a chance to benefit from the rich water and agricultural resources of the Southeastern Anatolia Region. The GAP Project was considered a regional development project for many years, but the dimensions of sustainability, environmental protection and participation have been added to the project's master plan in recent years. When the GAP Project is completed, Upper Mesopotamia, the center of many civilisations, will regain the importance it had in ancient times and will again be a living center of civilisation. Moreover, when the future problem of water shortages and water supplies in the world is kept in mind, the importance of Southeastern Anatolia's water supplies will be doubled. For this reason, the GAP Project, developed around the water and natural resources of the region, will have an important place in the world. The aim of this study is to introduce the region, with its rich natural resources, and the GAP Project. Accordingly, the natural potential of the region is introduced first. Second, the GAP Project is presented in detail. In the third stage, the projects under way for protecting natural resources and the environment are analyzed. In the last stage, strategies and policies to develop and protect the natural resources of the region in the short, mid and long terms are proposed.

  18. Empowerment in practice - insights from CITI-SENSE project in Ljubljana

    NASA Astrophysics Data System (ADS)

    Robinson, Johanna; Kocman, David; Smolnikar, Miha; Mohorčič, Miha; Horvat, Milena

    2014-05-01

We present specifics of the citizen empowerment and crowdsourced citizen science conducted in Ljubljana, Slovenia, one of the case study cities within the ongoing EU project CITI-SENSE. CITI-SENSE addresses urban air quality and rests on three pillars: technological platforms for distributed monitoring; novel information and communication technologies; and citizen participation. In the project, empowerment initiatives are conducted, enabling citizens to participate in various aspects of urban air quality, both outdoors and indoors at schools, affecting the everyday life of societal groups. Each participating country runs its own citizen empowerment campaign adapted to local circumstances. In addition to Ljubljana, local campaigns have been initiated in Barcelona, Belgrade, Edinburgh, Haifa, Oslo, Ostrava, Vienna and Vitoria. Poor air quality has been recognized as an important factor affecting the quality of life, especially in urban environments. In Ljubljana specifically, the main air pollution sources are traffic-related emissions, individual house heating devices (including the increased use of coal and biomass in recent years), and to a limited extent industrial point sources and waste disposal sites. Air quality can occasionally be very poor due to specific climatic conditions owing partially to the city's location in a basin and on marshes, resulting in a very complex circulation of air masses, temperature inversions and the formation of an urban heat island. Recognizing this, we established the main stakeholders in the city who are responsible for monitoring air quality in Ljubljana. Based on a full stakeholder analysis, we consider co-operation with local governmental and non-governmental institutions with already established means of communication with citizens as a tool for empowerment. Since we spend over 90% of our time indoors, indoor air quality is of great importance. 
That is why the CITI-SENSE project empowerment initiatives also cover this aspect. In Ljubljana we have identified and are involving three schools, which differ in location, house type and age of students. The project also gives children a unique approach to learning about air quality issues - by being involved. To evaluate the success of the empowerment initiatives after a pilot phase, key performance indicators (KPI) were defined that will enable performance improvement for the full implementation phase of the project. Acknowledgements: CITI-SENSE is a Collaborative Project partly funded by the EU FP7-ENV-2012 under grant agreement no 308524. www.citi-sense.eu.

  19. Translocation of threatened plants as a conservation measure in China.

    PubMed

    Liu, Hong; Ren, Hai; Liu, Qiang; Wen, XiangYing; Maunder, Michael; Gao, JiangYun

    2015-12-01

We assessed the current status of plant conservation translocation efforts in China, a topic poorly reported in the recent scientific literature. We identified 222 conservation translocation cases involving 154 species; of these, 87 were Chinese endemic species and 101 (78%) were listed as threatened on the Chinese Species Red List. We categorized the life form of each species and, when possible, determined for each case the translocation type, propagule source, propagule type, and survival and reproductive parameters. A surprisingly large proportion (26%) of the conservation translocations in China were conservation introductions, largely implemented in response to large-scale habitat destruction caused by the Three Gorges Dam and another hydropower project. Documentation and management of the translocations varied greatly. Less than half the cases had plant survival records. Statistical analyses showed that survival percentages were significantly correlated with plant life form and the type of planting materials. Thirty percent of the cases had records on whether or not individuals flowered or fruited. Results of information-theoretic model selection indicated that plant life form, translocation type, propagule type, propagule source, and time since planting significantly influenced the likelihood of flowering and fruiting at the project level. We suggest that the science-based application of species conservation translocations should be promoted as part of a commitment to species recovery management. In addition, we recommend that the common practice of within- and out-of-range introductions in nature reserves be regulated more carefully due to its potential ecological risks. We recommend the establishment of a national office and database to coordinate conservation translocations in China. 
Our review effort is timely considering the need for a comprehensive national guideline for the newly announced nation-wide conservation program on species with extremely small populations, which is expected to stimulate conservation translocations for many species in the near future. © 2015 Society for Conservation Biology.

  20. The Genesis Project: Science Cases for a Large Submm Telescope

    NASA Astrophysics Data System (ADS)

    Schneider, Nicola

    2018-01-01

The formation of stars is intimately linked to the structure and evolution of molecular clouds in the interstellar medium. In the context of the ANR/DFG project GENESIS (GENeration and Evolution of Structures in the Ism, http://www.astro.uni-koeln.de/node/965), we explore this link with a new approach by combining far-infrared maps and surveys of dust (Herschel) and cooling lines (CII, CI, CO, OI with SOFIA), with molecular line maps. Dedicated analysis tools are used to characterise molecular cloud structure, and we explore the coupling of turbulence with heating and cooling processes. The project gathers a large observational data set, from molecular line maps at (sub)-mm wavelengths from ground-based telescopes (e.g. IRAM) up to high-frequency airborne spectroscopic and continuum observations (SOFIA). Nevertheless, we identified the need for a large single-dish submm telescope operating in the southern hemisphere at high frequencies. Only such an instrument is able to observe important ISM cooling lines, like the CI lines at 490 and 810 GHz or high-J CO lines, shock tracers, or probes of turbulence dissipation with high angular resolution in Galactic and extragalactic sources. We will discuss possible science cases, demonstrate how these are addressed within GENESIS, and outline the science to be done with the new 6m Cologne-Cornell CCAT-prime submm telescope.

  1. The Privacy and Security Implications of Open Data in Healthcare.

    PubMed

    Kobayashi, Shinji; Kane, Thomas B; Paton, Chris

    2018-04-22

The International Medical Informatics Association (IMIA) Open Source Working Group (OSWG) initiated a group discussion to discuss current privacy and security issues in the open data movement in the healthcare domain from the perspective of the OSWG membership. Working group members independently reviewed the recent academic and grey literature and sampled a number of current large-scale open data projects to inform the working group discussion. This paper presents an overview of open data repositories and a series of short case reports to highlight relevant issues present in the recent literature concerning the adoption of open approaches to sharing healthcare datasets. Important themes that emerged included data standardisation, the inter-connected nature of the open source and open data movements, and how publishing open data can impact on the ethics, security, and privacy of informatics projects. The open data and open source movements in healthcare share many common philosophies and approaches including developing international collaborations across multiple organisations and domains of expertise. Both movements aim to reduce the costs of advancing scientific research and improving healthcare provision for people around the world by adopting open intellectual property licence agreements and codes of practice. Implications of the increased adoption of open data in healthcare include the need to balance the security and privacy challenges of opening data sources with the potential benefits of open data for improving research and healthcare delivery.

  2. The state and profile of open source software projects in health and medical informatics.

    PubMed

    Janamanchi, Balaji; Katsamakas, Evangelos; Raghupathi, Wullianallur; Gao, Wei

    2009-07-01

    Little has been published about the application profiles and development patterns of open source software (OSS) in health and medical informatics. This study explores these issues with an analysis of health and medical informatics related OSS projects on SourceForge, a large repository of open source projects. A search was conducted on the SourceForge website during the period from May 1 to 15, 2007, to identify health and medical informatics OSS projects. This search resulted in a sample of 174 projects. A Java-based parser was written to extract data for several of the key variables of each project. Several visually descriptive statistics were generated to analyze the profiles of the OSS projects. Many of the projects have sponsors, implying a growing interest in OSS among organizations. Sponsorship, we discovered, has a significant impact on project success metrics. Nearly two-thirds of the projects have a restrictive license type. Restrictive licensing may indicate tighter control over the development process. Our sample includes a wide range of projects that are at various stages of development (status). Projects targeted towards the advanced end user are primarily focused on bio-informatics, data formats, database and medical science applications. We conclude that there exists an active and thriving OSS development community that is focusing on health and medical informatics. A wide range of OSS applications are in development, from bio-informatics to hospital information systems. A profile of OSS in health and medical informatics emerges that is distinct and unique to the health care field. Future research can focus on OSS acceptance and diffusion and impact on cost, efficiency and quality of health care.

  3. Breaking Away

    ERIC Educational Resources Information Center

    Panettieri, Joseph C.

    2007-01-01

    This article discusses open source projects which may free universities from expensive, rigid commercial software. But will the rewards outweigh the potential risks? The Kuali Project involves multiple universities writing and sharing code for their financial and operational systems. Another, the Sakai Project, is a community source platform for…

  4. Earth-Observation based mapping and monitoring of exposure change in the megacity of Istanbul: open-source tools from the MARSITE project

    NASA Astrophysics Data System (ADS)

    De Vecchi, Daniele; Dell'Acqua, Fabio

    2016-04-01

The EU FP7 MARSITE project aims at assessing the "state of the art" of seismic risk evaluation and management at the European level, as a starting point to move a "step forward" towards new concepts of risk mitigation and management through long-term monitoring activities carried out both on land and at sea. Spaceborne Earth Observation (EO) is one of the means through which MARSITE is accomplishing this commitment, whose importance is growing as a consequence of the operational unfolding of the Copernicus initiative. Sentinel-2 data, with its open-data policy, represents an unprecedented opportunity to access global spaceborne multispectral data for various purposes including risk monitoring. In the framework of the EU FP7 projects MARSITE, RASOR and SENSUM, our group has developed a suite of geospatial software tools to automatically extract risk-related features from EO data, especially on the exposure and vulnerability side of the "risk equation" [1]. These are, for example, the extent of a built-up area or the distribution of building density. These tools are available open-source as QGIS plug-ins [2] and their source code can be freely downloaded from GitHub [3]. A test case on the risk-prone megacity of Istanbul has been set up, and preliminary results will be presented in this paper. The output of the algorithms can be incorporated into a risk modeling process, whose output is very useful to stakeholders and decision makers who intend to assess and mitigate the risk level across the giant urban agglomerate. Keywords - Remote Sensing, Copernicus, Istanbul megacity, seismic risk, multi-risk, exposure, open-source References [1] Harb, M.M.; De Vecchi, D.; Dell'Acqua, F., "Physical Vulnerability Proxies from Remote Sensing: Reviewing, Implementing and Disseminating Selected Techniques," Geoscience and Remote Sensing Magazine, IEEE, vol. 3, no. 1, pp. 20-33, March 2015. 
doi: 10.1109/MGRS.2015.2398672 [2] SENSUM QGIS plugin, 2016, available online at: https://plugins.qgis.org/plugins/sensum_eo_tools/ [3] SENSUM QGIS code repository, 2016, available online at: https://github.com/SENSUM-project/sensum_rs_qgis

  5. THe Case Method of Instruction (CMI) Project. Final Report.

    ERIC Educational Resources Information Center

    McWilliam, P. J.; And Others

    This final report describes the Case Method of Instruction (CMI) Project, a project to develop, field test, and disseminate training materials to facilitate the use of the Case Method of Instruction by inservice and preservice instructors in developmental disabilities. CMI project activities focused on developing a collection of case stories and…

  6. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of the probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic-hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g., fixing a site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during hazard calculation. 
Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude distribution [Youngs and Coppersmith, 1985]. Notably, the software can deal with uncertainty in the seismicity input parameters, such as the maximum magnitude value. CRISIS offers a set of built-in GMPEs, as well as the possibility of defining new ones by providing information in tabular format. Our study shows that, in the case of the Ajaristkali HPP study area, a significant contribution to the seismic hazard comes from local sources with quite low Mmax values; thus, the two attenuation laws considered give quite different PGA and SA values.
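Under the Poisson occurrence assumption above, the quantities on a hazard curve interconvert directly: an annual exceedance rate lambda implies an exceedance probability of 1 - exp(-lambda*T) over T years, and a return period of 1/lambda. A short sketch reproducing the familiar 10%-in-50-years design level:

```python
import math

def poisson_exceedance_prob(annual_rate, years):
    """Probability of at least one exceedance in `years` under a Poisson
    occurrence process."""
    return 1.0 - math.exp(-annual_rate * years)

def rate_for_design_prob(prob, years):
    """Annual exceedance rate (reciprocal of the return period) matching a
    target exceedance probability over a design life."""
    return -math.log(1.0 - prob) / years

rate = rate_for_design_prob(0.10, 50)     # 10% probability in 50 years
print(round(1.0 / rate))                  # return period of about 475 years
print(poisson_exceedance_prob(rate, 50))  # recovers ~0.10
```

This is only the occurrence-model arithmetic; a full PSHA, as performed in CRISIS2007, integrates these rates over source geometries, magnitude-frequency distributions and GMPE uncertainty.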

  7. Numerical modeling of gas mixing and bio-chemical transformations during underground hydrogen storage within the project H2STORE

    NASA Astrophysics Data System (ADS)

    Hagemann, B.; Feldmann, F.; Panfilov, M.; Ganzer, L.

    2015-12-01

The change from fossil to renewable energy sources demands an increasing amount of storage capacity for electrical energy. A promising technological solution is the storage of hydrogen in the subsurface. Hydrogen can be produced by electrolysis using excess electrical energy and subsequently converted back into electricity by fuel cells or engine generators. The development of this technology starts with adding small amounts of hydrogen to the high-pressure natural gas grid and continues with the creation of pure underground hydrogen storages. The feasibility of hydrogen storage in depleted gas reservoirs is investigated in the lighthouse project H2STORE, financed by the German Ministry for Education and Research. The joint research project has members from the University of Jena, the Clausthal University of Technology, the GFZ Potsdam and the French National Center for Scientific Research in Nancy. The six sub-projects are based on laboratory experiments, numerical simulations and analytical work, which cover the investigation of mineralogical, geochemical, physico-chemical, sedimentological, microbiological and gas mixing processes in reservoir and cap rocks. The focus of this presentation is on the numerical modeling of underground hydrogen storage. A mathematical model was developed that describes the coupled hydrodynamic and microbiological effects involved. Thereby, the bio-chemical reaction rates depend on the kinetics of microbial growth, which is induced by the injection of hydrogen. The model has been numerically implemented on the basis of the open source code DuMuX. A field case study based on a real German gas reservoir was performed to investigate the mixing of hydrogen with residual gases and to examine the consequences of bio-chemical reactions.

  8. Guidance and Control Software Project Data - Volume 1: Planning Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  9. Integrating HCI Specialists into Open Source Software Development Projects

    NASA Astrophysics Data System (ADS)

    Hedberg, Henrik; Iivari, Netta

Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too often neglected. Human-computer interaction (HCI) specialists, too, have encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority and responsibility for them. In this paper, based on the HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending to the human aspects of software. The proposed model makes the existence of HCI specialists visible in the projects, and promotes interaction between developers and HCI specialists in the course of a project.

  10. A logistic regression approach to model the willingness of consumers to adopt renewable energy sources

    NASA Astrophysics Data System (ADS)

    Ulkhaq, M. M.; Widodo, A. K.; Yulianto, M. F. A.; Widhiyaningrum; Mustikasari, A.; Akshinta, P. Y.

    2018-03-01

    The implementation of renewable energy in this era of globalization is inevitable, since non-renewable energy drives climate change and global warming and thus harms the environment and human life. In developing countries such as Indonesia, however, the implementation of renewable energy sources faces technical and social problems. For the latter, implementation is effective only if the public is aware of its benefits. This research identified the determinants that influence consumers' intention to adopt renewable energy sources and, using a logistic regression approach, predicted which consumers are willing to apply renewable energy sources in their homes. A case study was conducted in Semarang, Indonesia. The results showed that only eight of the fifteen variables are statistically significant: educational background, employment status, monthly income, average monthly electricity cost, certainty about the efficiency of the renewable energy project, relatives' influence on adopting renewable energy sources, energy tax deduction, and the price of non-renewable energy sources. The findings could serve as a basis for government policy towards the implementation of renewable energy sources.
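    The modelling step in this abstract can be illustrated with a minimal sketch. The code below fits a logistic regression by gradient ascent on invented survey data; the three predictors, the coefficients and the sample are hypothetical stand-ins for the study's fifteen survey variables, not the authors' actual data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical survey matrix: each row is a respondent; columns are
    # illustrative standardized predictors (e.g. income, electricity cost,
    # awareness of a tax deduction).
    n = 500
    X = rng.normal(size=(n, 3))
    true_w = np.array([1.5, -0.8, 1.0])
    p_true = 1.0 / (1.0 + np.exp(-(X @ true_w)))
    y = rng.binomial(1, p_true)  # 1 = willing to adopt renewables

    def fit_logistic(X, y, lr=0.1, steps=2000):
        """Fit weights by plain gradient ascent on the log-likelihood."""
        w = np.zeros(X.shape[1])
        for _ in range(steps):
            p = 1.0 / (1.0 + np.exp(-(X @ w)))
            w += lr * X.T @ (y - p) / len(y)
        return w

    w = fit_logistic(X, y)
    p_hat = 1.0 / (1.0 + np.exp(-(X @ w)))
    print(np.round(w, 2))        # recovered coefficients, close to true_w
    print((p_hat > 0.5).mean())  # share of respondents predicted willing
    ```

    In the actual study one would also inspect significance (e.g. Wald tests) to keep only the statistically significant predictors, as the authors did.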

  11. Environmental Management and the New Politics of Western Water: The Animas-La Plata Project and Implementation of the Endangered Species Act.

    PubMed

    Ellison

    1999-05-01

    / This paper explores the new politics of western water policy through an examination of the Animas-La Plata water project and implementation of the Endangered Species Act. It is suggested that the focus of western water programming has shifted from the source of distributed funds, the United States Congress, to the agencies originally created to deliver federal benefits because funding for new project construction has not been forthcoming. Under this new system, members of Congress continue to excite their constituents with promises of money for new project starts, while the administrative agencies perform the myriad duties needed to keep these projects alive. The result is that political objectives have replaced operational/management objectives in administrative processes. In this case, the author demonstrates how resource managers in the Bureau of Reclamation manipulated hydrological analysis to control administrative process, why their manipulation was unfair, and perhaps illegal, and why biologists from the US Fish and Wildlife Service accepted the analysis. While ostensibly protecting all interests, the result is that none of the objectives of federal water programming are achieved. KEY WORDS: Environmental management; Administrative politics; Water policy; Endangered Species Act; Animas-La Plata, Bureau of Reclamation

  12. Breathing motion compensated reconstruction for C-arm cone beam CT imaging: initial experience based on animal data

    NASA Astrophysics Data System (ADS)

    Schäfer, D.; Lin, M.; Rao, P. P.; Loffroy, R.; Liapi, E.; Noordhoek, N.; Eshuis, P.; Radaelli, A.; Grass, M.; Geschwind, J.-F. H.

    2012-03-01

    C-arm based tomographic 3D imaging is applied in an increasing number of minimally invasive procedures. Due to the limited acquisition speed for a complete projection data set required for tomographic reconstruction, breathing motion is a potential source of artifacts, particularly for patients who cannot comply with breathing commands (e.g., due to anesthesia). Intra-scan motion estimation and compensation is therefore required. Here, a scheme for projection-based local breathing motion estimation is combined with an anatomy-adapted interpolation strategy and subsequent motion-compensated filtered back projection. The breathing motion vector is measured as a displacement vector on the projections of a tomographic short-scan acquisition, using the diaphragm as a landmark. Scaling of the displacement to the acquisition iso-center and anatomy-adapted volumetric motion vector field interpolation deliver a 3D motion vector per voxel. Motion-compensated filtered back projection incorporates this motion vector field in the image reconstruction process. The approach is applied in animal experiments on a flat-panel C-arm system, delivering improved image quality (lower artifact levels, improved tumor delineation) in 3D liver tumor imaging.

  13. Iron-Nitride Alloy Magnets: Transformation Enabled Nitride Magnets Absent Rare Earths (TEN Mare)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2012-01-01

    REACT Project: Case Western is developing a highly magnetic iron-nitride alloy to use in the magnets that power electric motors found in EVs and renewable power generators. This would reduce the overall price of the motor by eliminating the expensive imported rare earth minerals typically found in today's best commercial magnets. The iron-nitride powder is sourced from abundant and inexpensive materials found in the U.S. The ultimate goal of this project is to demonstrate this new magnet system, which contains no rare earths, in a prototype electric motor. This could significantly reduce the amount of greenhouse gases emitted in the U.S. each year by encouraging the use of clean alternatives to oil and coal.

  14. 18 CFR 4.51 - Contents of application.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the total project as proposed specifying any projected changes in the costs (life-cycle costs) over the estimated financing or licensing period if the applicant takes such changes into account... lowest cost alternative source, specifying any projected changes in the cost of power from that source...

  15. 18 CFR 4.51 - Contents of application.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... the total project as proposed specifying any projected changes in the costs (life-cycle costs) over the estimated financing or licensing period if the applicant takes such changes into account... lowest cost alternative source, specifying any projected changes in the cost of power from that source...

  16. 18 CFR 4.51 - Contents of application.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... the total project as proposed specifying any projected changes in the costs (life-cycle costs) over the estimated financing or licensing period if the applicant takes such changes into account... lowest cost alternative source, specifying any projected changes in the cost of power from that source...

  17. 18 CFR 4.51 - Contents of application.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... the total project as proposed specifying any projected changes in the costs (life-cycle costs) over the estimated financing or licensing period if the applicant takes such changes into account... lowest cost alternative source, specifying any projected changes in the cost of power from that source...

  18. MINE WASTE TECHNOLOGY PROGRAM - UNDERGROUND MINE SOURCE CONTROL DEMONSTRATION PROJECT

    EPA Science Inventory

    This report presents results of the Mine Waste Technology Program Activity III, Project 8, Underground Mine Source Control Demonstration Project implemented and funded by the U. S. Environmental Protection Agency (EPA) and jointly administered by EPA and the U. S. Department of E...

  19. 18 CFR 4.51 - Contents of application.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... the total project as proposed specifying any projected changes in the costs (life-cycle costs) over the estimated financing or licensing period if the applicant takes such changes into account... lowest cost alternative source, specifying any projected changes in the cost of power from that source...

  20. Freeform array projection

    NASA Astrophysics Data System (ADS)

    Michaelis, D.; Schreiber, P.; Li, C.; Bräuer, A.; Gross, H.

    2015-09-01

    The concept of multichannel array projection is generalized in order to realize an ultraslim, highly efficient optical system for structured illumination with high lumen output, in which the Köhler illumination principle is additionally utilized and source-light homogenization occurs. The optical system consists of a multitude of neighboring optical channels. In each channel, two optical freeforms generate a real or a virtual spatial light pattern and, furthermore, modify the ray directions to enable Köhler illumination of a subsequent projection lens. The internal light pattern may additionally be influenced by absorbing apertures or slides. The projection lens transfers the resulting light pattern to a target, where the total target distribution is produced by superposition of all individual channel output patterns. Without absorbing apertures, the optical system can be regarded as a generalization of a fly's eye condenser for structured illumination; in this case the light pattern is generated exclusively by freeform light redistribution. The blurring that commonly occurs in freeform beam shaping is reduced by creating a virtual object light structure by means of the two freeform surfaces and imaging it towards the target, but the residual blurring still inhibits very high spatial frequencies at the target. To create target features with very high spatial resolution, the absorbing apertures can be utilized; in this case freeform beam shaping enhances light transmission through the absorbing apertures. The freeform surfaces are designed by a generalized approach based on a Cartesian oval representation.

  1. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 23 Highways 1 2011-04-01 2011-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt of...

  2. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 23 Highways 1 2013-04-01 2013-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt of...

  3. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 23 Highways 1 2014-04-01 2014-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt of...

  4. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 23 Highways 1 2012-04-01 2012-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY... PROGRAM § 661.43 Can other sources of funds be used to finance a queued project in advance of receipt of...

  5. Sun light European Project

    NASA Astrophysics Data System (ADS)

    Soubielle, Marie-Laure

    2015-04-01

    2015 has been declared the year of light. Sunlight plays a major role in the world. From the sunbeams that heat our planet and feed our plants to the optical analysis of the sun and the modern use of solar particles in technology, sunlight is everywhere and it is vital. This project aims at a better understanding of sunlight across a variety of fields. The experiments are carried out by students aged 15 to 20 in order to share their discoveries with Italian students from primary and secondary schools. The experiments will also be presented to a group of Danish students visiting our school in January. All experiments are carried out in English and involve teams of teachers. The project has three parts: part 1, a biological project (what are the mechanisms of photosynthesis?); part 2, an optical project (what are the components of sunlight and how can they be used?); part 3, a technical project (how can the energy of sunlight power modern devices?). Photosynthesis project (Biology and English). Context: photosynthesis is a process used by plants and other organisms to convert light energy, normally from the Sun, into chemical energy that can later fuel the organisms' activities. This chemical energy is stored in molecules synthesized from carbon dioxide and water. In most cases, oxygen is released as a waste product. Most plants perform photosynthesis. Photosynthesis maintains atmospheric oxygen levels and supplies all of the organic compounds and most of the energy necessary for life on Earth. Outcome: our project consists in understanding the various steps of photosynthesis. Students will shoot a DVD of the experiments, presenting the equipment required, the steps of the experiments and the results obtained, for a better understanding of photosynthesis. Digital pen project (Electricity, Optics and English). Context: sunlight is a complex source of light based on white light that can be decomposed to explain light radiation and colours.
    This light is a valuable source for creating innovative devices. Outcome: in this project students will carry out various experiments to gain a better optical understanding of sunlight. They will shoot tutorials and use these experiments to make a digital pen. Solar Impulse model aircraft project (Technology, Electricity and English). Context: Solar Impulse is a solar plane that flew around the world using only the energy of the solar cells situated on its wings. The plane requires an external energy source only for take-off, unlike gliders. The pilot in the cockpit is one of the design engineers. The plane can store enough energy for an 8-hour night flight. Outcome: this project will create a tutorial and a model aircraft of Solar Impulse, with solar cells providing energy for 4 engines, batteries, LED lighting and a voltage reader. This model will not fly.

  6. Alternative financing sources. ECRI. Emergency Care Research Institute.

    PubMed

    1987-01-01

    A number of new capital sources have been developed and used by health care institutions unable to finance high-tech projects with equity or conventional tax-exempt debt instruments; these include REITs, MLPs, per-use rentals, venture capital, and banks as brokers. However, there are no magic capital acquisition solutions. Institutions with good credit will continue to find a number of doors open to them; poorer credit risks will have fewer options, and those available will carry greater risk, allow for less provider control over projects, and limit potential return on investment to some extent. It is essential to examine carefully the drawbacks inherent in each type of alternative financing source. Venture capital in particular requires specific analysis because of the wide variety of possible arrangements that exist. If you cannot find either traditional or alternative sources of funding for a proposed project, you should reexamine the project and its underlying utilization projections and reimbursement assumptions.

  7. Metrics for Success in the Preservation of Scientific Data at the STFC Centre for Environmental Data Archival (CEDA).

    NASA Astrophysics Data System (ADS)

    Lawrence, B.; Pepler, S.

    2009-04-01

    CEDA (http://www.ceda.ac.uk) hosts three main data centres: the British Atmospheric Data Centre (http://badc.nerc.ac.uk), the NERC Earth Observation Data Centre (http://neodc.nerc.ac.uk), and the Intergovernmental Panel for Climate Change Dedicated Data Centre (http://ipcc-data.org), as well as components of many national and international projects. CEDA receives both core funding (from the UK Natural Environment Research Council) and per-project funding (from a variety of sources). However, all funders require metrics assessing success. In the case of preservation it is hard to measure success - usage alone is not enough, since next year someone may use currently unused data if it is well preserved, and so it is the act of preservation itself which marks success. Even where data is accessed, it is not necessarily used. Hence at CEDA we have three key foci in our approach to metrics: measuring direct website access, benchmarking procedures against best practice, and, hopefully soon, recording data citation. In this presentation we cover how we are addressing each of these three areas.

  8. Cancer Trends in Mexico: Essential Data for the Creation and Follow-Up of Public Policies

    PubMed Central

    Mohar-Betancourt, Alejandro; Armas-Texta, Daniel; Gutiérrez-Delgado, Cristina; Torres-Domínguez, Juan A.

    2017-01-01

    Purpose Cancer in a country like Mexico is a challenge for the current health system and for public health. However, the statistics about cancer in Mexico are scarce, so epidemiologic surveillance needs to be improved. The objectives of this article were to describe the extent of cancer and to estimate the national burden of cancer through 2020. Materials and Methods To meet this objective, an analysis of secondary official sources was performed. The cancer cases through 2020 were estimated on the basis of trends in mortality and the projection of incident cases reported by GLOBOCAN. Results In 2013, cancer was the cause of 12.84% of all deaths in Mexico. It is projected that the prevalence of cancer will be 904,581 by 2017 and will reach 1,262,861 by early in the next decade (ie, 2020). Conclusion Available data for cancer are incomplete. The development and implementation of population-based cancer registries in Mexico are essential. Assessment of the future outlook of cancer in Mexico will provide awareness of future challenges and can help health systems prepare to face them. PMID:29244991
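    The projection method in this abstract, extrapolating a mortality/prevalence trend forward to 2020, can be sketched as a least-squares linear fit. All annual counts below are invented for illustration; they are not the GLOBOCAN or Mexican registry figures.

    ```python
    import numpy as np

    # Hypothetical annual cancer prevalence (thousands), for illustration only.
    years = np.array([2010, 2011, 2012, 2013, 2014, 2015, 2016, 2017])
    prevalence = np.array([600, 640, 685, 725, 770, 810, 860, 905], float)

    # Fit a linear trend, then extrapolate the count to 2020.
    slope, intercept = np.polyfit(years, prevalence, 1)
    proj_2020 = slope * 2020 + intercept
    print(round(proj_2020))  # projected prevalence (thousands) in 2020
    ```

    Real projections of this kind would typically also adjust for projected population size and age structure rather than extrapolating raw counts alone.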

  9. AIR TOXICS ASSESSMENT REFINEMENT IN RAPCA'S JURISDICTION - DAYTON, OH AREA

    EPA Science Inventory

    RAPCA has received two grants to conduct this project. As part of the original project, RAPCA improved and expanded its point source inventory by converting the following area sources to point sources: dry cleaners, gasoline throughput processes and halogenated solvent clea...

  10. Magnetic properties of a new obsidian source, west Antelope Creek, Grant County, New Mexico

    NASA Astrophysics Data System (ADS)

    Sternberg, R. S.; Samuels, R.; Feinberg, J. M.; Shackley, M. S.

    2013-12-01

    This work is part of a Keck Geology Consortium project on characterizing obsidian sources in New Mexico using magnetic and geochemical properties. We collected over 3,000 samples, many of which were georeferenced, from 10 obsidian sources at three locales - Mule Creek, Mt. Taylor, and Valles Caldera. One of the Mule Creek sources, herein called the west Antelope Creek (wAC) source, was previously unknown. The 143 samples collected at this source covered about 1 km², but were not individually georeferenced. We plan to characterize the magnetic and chemical properties of this source to see if it is distinguishable from other nearby sources and useful for provenancing archaeological obsidians. Initial measurements on 34 specimens from 20 samples show that NRM values range from 1-80 Am²/kg, and low-field susceptibilities range from 1.2-96 × 10⁻⁸ mass-specific SI units. When two specimens came from the same sample, the results were in good agreement. The measurements define a rather broad field in NRM-susceptibility space compared to other Southwestern sources examined to date, and a considerably larger field than the nearby Antelope Creek (AC) source. The previously measured NRM and susceptibility values from AC all lie at the high end of the wAC field on both dimensions, so the fields overlap but in many cases could be distinguished.

  11. Comparative study of x ray and microwave emissions during solar flares

    NASA Technical Reports Server (NTRS)

    Winglee, Robert M.

    1993-01-01

    The work supported by the grant consisted of two projects. The first project involved making detailed case studies of two flares using SMM data in conjunction with ground based observations. The first flare occurred at 1454 UT on June 20, 1989 and involved the eruption of a prominence near the limb. In the study we used data from many wavelength regimes including the radio, H-alpha, hard X-rays, and soft X-rays. We used a full gyrosynchrotron code to model the apparent presence of a 1.4 GHz source early in the flare that was in the form of a large coronal loop. The model results led us to conclude that the initial acceleration occurs in small, dense loops which also produced the flare's hard X-ray emission. We also found evidence that a source at 1.4 GHz later in the event was due to second harmonic plasma emission. This source was adjacent to a leg of the prominence and comes from a dense column of material in the magnetic structure supporting the prominence. Finally, we investigated a source of microwaves and soft X-rays, occurring approximately 10 min after the hard X-ray peak, and calculated a lower limit for the density of the source. The second flare that was studied occurred at 2156 UT on June 20, 1989 and was observed with the VLA and the Owens Valley Radio Observatory (OVRO) Frequency Agile Array. We have developed a gyrosynchrotron model of the sources at flare peak using a new gyrosynchrotron approximation which is valid at very low harmonics of the gyrofrequency. We found that the accelerated particle densities of the sources decreased much more with radius from the source center than had been supposed in previous work, while the magnetic field varied less. We also used the available data to analyze a highly polarized source which appeared late in the flare.
The second project involved compiling a statistical base for the relative timing of the hard X-ray peak, the turbulent and blue-shift velocities inferred from soft X-ray line emissions observed by SMM and the microwave peak as determined from ground-based observations. This timing was then used to aid the testing of newly developed global models for flares that incorporate the global magnetic topology as well as the electron dynamics that are responsible for the hard X-rays and microwaves.

  12. In-country and lending institution environmental requirements for thermal power plants in the Philippines and India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehman, A.T.; Khanna, R.

    1996-11-01

    Diverse environmental reviews and approvals are required by both Government and non-government organizations (NGOs) for licensing or permitting of major thermal power plants in Asia; specifically, India and Philippines. The number and type of approvals required for a specific project vary depending on site characteristics, fuel source, project-specific design and operating parameters as well as type of project financing. A model 400 MW coal-fired project located in Asia is presented to illustrate the various lender and host country environmental guidelines. A case study of the environmental reviews and approvals for Ogden Quezon Power, Inc. Project (Quezon Province, Republic of the Philippines) is also included. A list of acronyms is provided at the paper's end. As independent power project (IPP) developers seek financing for these capital-intensive infrastructure projects, a number of international finance/lending institutions are likely to become involved. Each lender considers different environmental aspects of a project. This paper compares relevant environmental requirements of various lenders which finance IPPs and their interest in a project's environmental review. Finally, the authors of this paper believe that the environmental review process can bring together many parties involved with IPP development, including local and central governments, non government organizations, various lenders (such as multilateral and export credit agencies) as well as project proponents. Environmental review provides input opportunity for interested and affected parties. Airing environmental issues in open forums such as public hearings or meetings helps ensure projects are not evaluated without public input.

  13. The Consortium of Advanced Residential Buildings (CARB) - A Building America Energy Efficient Housing Partnership

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robb Aldrich; Lois Arena; Dianne Griffiths

    2010-12-31

    This final report summarizes the work conducted by the Consortium of Advanced Residential Buildings (CARB) (http://www.carb-swa.com/), one of the 'Building America Energy Efficient Housing Partnership' Industry Teams, for the period January 1, 2008 to December 31, 2010. The Building America Program (BAP) is part of the Department of Energy (DOE), Energy Efficiency and Renewable Energy, Building Technologies Program (BTP). The long term goal of the BAP is to develop cost effective, production ready systems in five major climate zones that will result in zero energy homes (ZEH) that produce as much energy as they use on an annual basis by 2020. CARB is led by Steven Winter Associates, Inc. with Davis Energy Group, Inc. (DEG), MaGrann Associates, and Johnson Research, LLC as team members. In partnership with our numerous builders and industry partners, work was performed in three primary areas - advanced systems research, prototype home development, and technical support for communities of high performance homes. Our advanced systems research work focuses on developing a better understanding of the installed performance of advanced technology systems when integrated in a whole-house scenario. Technology systems researched included: - High-R Wall Assemblies - Non-Ducted Air-Source Heat Pumps - Low-Load HVAC Systems - Solar Thermal Water Heating - Ventilation Systems - Cold-Climate Ground and Air Source Heat Pumps - Hot/Dry Climate Air-to-Water Heat Pump - Condensing Boilers - Evaporative condensers - Water Heating CARB continued to support several prototype home projects in the design and specification phase. These projects are located in all five program climate regions and most are targeting greater than 50% source energy savings over the Building America Benchmark home.
    CARB provided technical support and developed builder project case studies to be included in near-term Joule Milestone reports for the following community scale projects: - SBER Overlook at Clipper Mill (mixed, humid climate) - William Ryan Homes - Tampa (hot, humid climate).

  14. Future of family support: Projected living arrangements and income sources of older people in Hong Kong up to 2030.

    PubMed

    Ng, Kok-Hoe

    2016-06-01

    The study aims to project future trends in living arrangements and access to children's cash contributions and market income sources among older people in Hong Kong. A cell-based model was constructed by combining available population projections, labour force projections, an extrapolation of the historical trend in living arrangements based on national survey datasets and a regression model on income sources. Under certain assumptions, the proportion of older people living with their children may decline from 59 to 48% during 2006-2030. Although access to market income sources may improve slightly, up to 20% of older people may have no access to either children's financial support or market income sources, and will not live with their children by 2030. Family support is expected to contract in the next two decades. Public pensions should be expanded to protect financially vulnerable older people.
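    The cell-based modelling approach described in this abstract can be illustrated with a minimal sketch: the projected population is split into cells (here age band × sex), each cell is multiplied by an extrapolated rate, and the cells are summed. All cell populations and coresidence rates below are invented for illustration, not the Hong Kong figures.

    ```python
    # Projected 2030 older population per cell (invented numbers).
    cells = {
        ("65-74", "F"): 420_000,
        ("65-74", "M"): 380_000,
        ("75+",   "F"): 310_000,
        ("75+",   "M"): 240_000,
    }
    # Extrapolated share of each cell living with their children (invented).
    corate = {
        ("65-74", "F"): 0.50,
        ("65-74", "M"): 0.47,
        ("75+",   "F"): 0.52,
        ("75+",   "M"): 0.44,
    }

    # Aggregate the cells: expected number, and overall share, coresiding.
    living_with_children = sum(n * corate[c] for c, n in cells.items())
    total = sum(cells.values())
    print(f"{living_with_children / total:.1%}")  # prints "48.5%"
    ```

    The actual model additionally layers in labour-force projections and a regression on income sources per cell; the aggregation step is the same.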

  15. Identifying the institutional decision process to introduce decentralized sanitation in the city of Kunming (China).

    PubMed

    Medilanski, Edi; Chuan, Liang; Mosler, Hans-Joachim; Schertenleib, Roland; Larsen, Tove A

    2007-05-01

    We conducted a study of the institutional barriers to introducing urine source separation in the urban area of Kunming, China. On the basis of a stakeholder analysis, we constructed stakeholder diagrams showing the relative importance of decision-making power and (positive) interest in the topic. A hypothetical decision-making process for the urban case was derived based on a successful pilot project in a periurban area. All our results were evaluated by the stakeholders. We concluded that although a number of primary stakeholders have a large interest in testing urine source separation also in an urban context, most of the key stakeholders would be reluctant to this idea. However, the success in the periurban area showed that even a single, well-received pilot project can trigger the process of broad dissemination of new technologies. Whereas the institutional setting for such a pilot project is favorable in Kunming, a major challenge will be to adapt the technology to the demands of an urban population. Methodologically, we developed an approach to corroborate a stakeholder analysis with the perception of the stakeholders themselves. This is important not only in order to validate the analysis but also to bridge the theoretical gap between stakeholder analysis and stakeholder involvement. We also show that in disagreement with the assumption of most policy theories, local stakeholders consider informal decision pathways to be of great importance in actual policy-making.

  16. Identifying the Institutional Decision Process to Introduce Decentralized Sanitation in the City of Kunming (China)

    NASA Astrophysics Data System (ADS)

    Medilanski, Edi; Chuan, Liang; Mosler, Hans-Joachim; Schertenleib, Roland; Larsen, Tove A.

    2007-05-01

    We conducted a study of the institutional barriers to introducing urine source separation in the urban area of Kunming, China. On the basis of a stakeholder analysis, we constructed stakeholder diagrams showing the relative importance of decision-making power and (positive) interest in the topic. A hypothetical decision-making process for the urban case was derived based on a successful pilot project in a periurban area. All our results were evaluated by the stakeholders. We concluded that although a number of primary stakeholders have a large interest in testing urine source separation also in an urban context, most of the key stakeholders would be reluctant to this idea. However, the success in the periurban area showed that even a single, well-received pilot project can trigger the process of broad dissemination of new technologies. Whereas the institutional setting for such a pilot project is favorable in Kunming, a major challenge will be to adapt the technology to the demands of an urban population. Methodologically, we developed an approach to corroborate a stakeholder analysis with the perception of the stakeholders themselves. This is important not only in order to validate the analysis but also to bridge the theoretical gap between stakeholder analysis and stakeholder involvement. We also show that in disagreement with the assumption of most policy theories, local stakeholders consider informal decision pathways to be of great importance in actual policy-making.

  17. An open source software for fast grid-based data-mining in spatial epidemiology (FGBASE).

    PubMed

    Baker, David M; Valleron, Alain-Jacques

    2014-10-30

    Examining whether disease cases are clustered in space is an important part of epidemiological research. Another important part of spatial epidemiology is testing whether patients suffering from a disease are more, or less, exposed to environmental factors of interest than adequately defined controls. Both approaches involve determining the number of cases and controls (or population at risk) in specific zones. For cluster searches, this often must be done for millions of different zones, and doing it by calculating distances can lead to very lengthy computations. In this work we discuss the computational advantages of geographical grid-based methods and introduce open source software (FGBASE) that we have created for this purpose. Geographical grids based on the Lambert Azimuthal Equal Area projection are well suited for spatial epidemiology because they preserve area: each cell of the grid has the same area. We describe how data are projected onto such a grid, as well as grid-based algorithms for spatial epidemiological data-mining. The software program (FGBASE) that we have developed implements these grid-based methods. The grid-based algorithms perform extremely fast, particularly for cluster searches. When applied to a cohort of French Type 1 Diabetes (T1D) patients, as an example, the grid-based algorithms detected potential clusters in a few seconds on a modern laptop. This compares very favorably to an equivalent cluster search using distance calculations instead of a grid, which took over 4 hours on the same computer. In the case study we discovered 4 potential clusters of T1D cases near the cities of Le Havre, Dunkerque, Toulouse and Nantes. One example of environmental analysis with our software was to test whether a significant association could be found between cases and proximity to vineyards with heavy pesticide use; none was found. In both examples, the software facilitates the rapid testing of hypotheses. Grid-based algorithms for mining spatial epidemiological data reduce computational complexity and thus improve the speed of computations. We believe that these methods and this software tool (FGBASE) will lower the computational barriers to entry for those performing epidemiological research.
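    The computational advantage described above comes from binning cases into equal-area cells once, then answering zone queries by summing cell counts instead of recomputing pairwise distances for every zone. A minimal sketch of that idea (illustrative only; this is not FGBASE's code, and the cell size and coordinates are invented):

```python
from collections import Counter

def bin_points(points, cell_km=5.0):
    """Assign projected (x, y) points (in km) to grid cells in one O(n) pass."""
    return Counter((int(x // cell_km), int(y // cell_km)) for x, y in points)

def zone_count(counts, cx, cy, radius_cells):
    """Count cases inside a square zone of cells centred on cell (cx, cy)."""
    return sum(
        counts.get((cx + dx, cy + dy), 0)
        for dx in range(-radius_cells, radius_cells + 1)
        for dy in range(-radius_cells, radius_cells + 1)
    )

# Toy case locations in projected km coordinates.
cases = [(1.0, 1.0), (2.0, 3.0), (12.0, 1.0), (3.0, 4.0)]
counts = bin_points(cases, cell_km=5.0)
print(zone_count(counts, 0, 0, 0))  # → 3 (three cases fall in cell (0, 0))
```

    Millions of candidate zones then cost only dictionary lookups and small sums, which is the behaviour behind the seconds-versus-hours comparison reported in the abstract.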

  18. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  19. The Open Source Teaching Project (OSTP): Research Note.

    ERIC Educational Resources Information Center

    Hirst, Tony

    The Open Source Teaching Project (OSTP) is an attempt to apply a variant of the successful open source software approach to the development of educational materials. Open source software is software licensed in such a way as to allow anyone the right to modify and use it. From such a simple premise, a whole industry has arisen, most notably in the…

  20. Detecting the permafrost carbon feedback: talik formation and increased cold-season respiration as precursors to sink-to-source transitions

    DOE PAGES

    Parazoo, Nicholas C.; Koven, Charles D.; Lawrence, David M.; ...

    2018-01-12

    Thaw and release of permafrost carbon (C) due to climate change is likely to offset increased vegetation C uptake in northern high-latitude (NHL) terrestrial ecosystems. Models project that this permafrost C feedback may act as a slow leak, in which case detection and attribution of the feedback may be difficult. The formation of talik, a subsurface layer of perennially thawed soil, can accelerate permafrost degradation and soil respiration, ultimately shifting the C balance of permafrost-affected ecosystems from long-term C sinks to long-term C sources. It is imperative to understand and characterize mechanistic links between talik, permafrost thaw, and respiration of deep soil C to detect and quantify the permafrost C feedback. Here, we use the Community Land Model (CLM) version 4.5, a permafrost and biogeochemistry model, in comparison to long-term deep borehole data along North American and Siberian transects, to investigate thaw-driven C sources in NHL (>55°N) from 2000 to 2300. Widespread talik at depth is projected across most of the NHL permafrost region (14 million km²) by 2300, 6.2 million km² of which is projected to become a long-term C source, emitting 10 Pg C by 2100, 50 Pg C by 2200, and 120 Pg C by 2300, with few signs of slowing. Roughly half of the projected C source region is in predominantly warm sub-Arctic permafrost following talik onset. This region emits only 20 Pg C by 2300, but the CLM4.5 estimate may be biased low by not accounting for deep C in yedoma. Accelerated decomposition of deep soil C following talik onset shifts the ecosystem C balance away from surface-dominant processes (photosynthesis and litter respiration), but sink-to-source transition dates are delayed by 20–200 years by high ecosystem productivity, such that talik peaks early (~2050s, although borehole data suggest sooner) and C source transition peaks late (~2150–2200). The remaining C source region in cold northern Arctic permafrost, which shifts to a net source early (late 21st century), emits 5 times more C (95 Pg C) by 2300, and prior to talik formation, due to the high decomposition rates of shallow, young C in organic-rich soils coupled with low productivity. Our results provide important clues signaling imminent talik onset and C source transition, including (1) late cold-season (January–February) soil warming at depth (~2 m), (2) increasing cold-season emissions (November–April), and (3) enhanced respiration of deep, old C in warm permafrost and young, shallow C in organic-rich cold permafrost soils. Our results suggest a mosaic of processes that govern carbon source-to-sink transitions at high latitudes and emphasize the urgency of monitoring soil thermal profiles, organic C age and content, cold-season CO₂ emissions, and atmospheric ¹⁴CO₂ as key indicators of the permafrost C feedback.

  2. Student Ownership of Projects in an Upper-Division Optics Laboratory Course: A Multiple Case Study of Successful Experiences

    ERIC Educational Resources Information Center

    Dounas-Frazer, Dimitri R.; Stanley, Jacob T.; Lewandowski, H. J.

    2017-01-01

    We investigate students' sense of ownership of multiweek final projects in an upper-division optics lab course. Using a multiple case study approach, we describe three student projects in detail. Within-case analyses focused on identifying key issues in each project, and constructing chronological descriptions of those events. Cross-case analysis…

  3. Luminous Binary Supersoft X-Ray Sources

    NASA Technical Reports Server (NTRS)

    DiStefano, Rosanne; Oliversen, Ronald J. (Technical Monitor)

    2002-01-01

    This grant was for the study of Luminous Supersoft X-Ray Sources (SSSs). During the first year a number of projects were completed and new projects were started. The projects include: 1) Time variability of SSSs; 2) SSSs in M31; 3) Binary evolution scenarios; and 4) Acquiring new data.

  4. Teaching, Doing, and Sharing Project Management in a Studio Environment: The Development of an Instructional Design Open-Source Project Management Textbook

    ERIC Educational Resources Information Center

    Randall, Daniel L.; Johnson, Jacquelyn C.; West, Richard E.; Wiley, David A.

    2013-01-01

    In this article, the authors present an example of a project-based course within a studio environment that taught collaborative innovation skills and produced an open-source project management textbook for the field of instructional design and technology. While innovation plays an important role in our economy, and many have studied how to teach…

  5. Reviews on current carbon emission reduction technologies and projects and their feasibilities on ships

    NASA Astrophysics Data System (ADS)

    Wang, Haibin; Zhou, Peilin; Wang, Zhongcheng

    2017-06-01

    Concern about global climate change is growing, and many projects and researchers are committed to reducing greenhouse gases from all possible sources. The International Maritime Organization (IMO) has set a target of 20% CO2 reduction from shipping by 2020 and has also introduced a series of carbon emission reduction measures, known as the Energy Efficiency Design Index (EEDI) and the Energy Efficiency Operational Indicator (EEOI). Reviews of carbon emission reduction across industries indicate that Carbon Capture and Storage (CCS) is an excellent solution to global warming. In this paper, a comprehensive literature review of EEDI, EEOI, and CCS is conducted, covering current policies, common technologies, and their feasibility for marine activities, mainly shipping. Current projects are also presented, illustrating that carbon emission reduction has attracted attention worldwide. Two case studies of ships demonstrate the economic feasibility of carbon emission reduction and provide a guide for CCS system application and practical installation on ships.

  6. A pilot study of occupational envenomations in North American zoos and aquaria.

    PubMed

    Vohra, Rais; Clark, Rick; Shah, Nilofar

    2008-11-01

    To characterize occupational envenomations from exotic and native creatures, we surveyed North American zoos and aquaria. Survey questionnaires were mailed to curators at 216 zoos/aquaria which are accredited by the Association of Zoos and Aquariums (AZA) and listed on the AZA website. Reptile curators were asked to complete the zoo surveys. The questions addressed the number and types of bites, availability of antivenom (AV) on the premises, and sources of general information about envenoming. Responses were kept anonymous. A total of 216 surveys were mailed. The response rate was 58% for this pilot research project. Twenty-six (21%) of responding institutions replied that they had at least one incident of bite from a venomous species in the last 10 years. Species of animals included a variety of native and exotic terrestrial and marine species. There were no deaths or serious outcomes reported as complications of these incidents. Less than one-third of responding institutions reported having AVs on-site for medical use in case of envenomations. A variety of information sources, including internally developed protocols and poison center resources, were reported as sources of envenoming information for respondents. Clinicians and toxicologists should be prepared to care for cases of envenomations from exotic zoo or aquarium species such as the ones identified in this survey in their practice regions.

  7. Transportation Fuels and the Hydrogen Economy

    NASA Astrophysics Data System (ADS)

    Gabbard, Alex

    2004-11-01

    An energy analysis of transportation fuels is performed, comparing automobiles and fuels currently in the marketplace as real-world benchmarks for projected "hydrogen economy" requirements. Comparisons are made for ideal-case average energy values at Standard Temperature and Pressure (STP), 20°C and 1 atmosphere, with no losses. Real-world benchmarks currently in the marketplace illuminate the challenges to be met if an equivalent "hydrogen economy" is to become reality. The idea of a "hydrogen economy" is that, at some time in the future, world energy needs will be supplied partly or entirely from hydrogen: partly, by analogy with the current "petroleum economy," which is the source of most of the world's transportation fuels but only a portion of total energy use; or entirely, with hydrogen as the source of all energy consumption.
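    The scale of the challenge can be illustrated with a back-of-the-envelope comparison of volumetric energy density at roughly the stated conditions (20°C, 1 atm). The heating values and gasoline density below are round figures from standard references, not numbers taken from this analysis:

```python
# Ideal-gas density of H2 at 20 °C, 1 atm: rho = P*M / (R*T)
P, T, R = 101325.0, 293.15, 8.314      # Pa, K, J/(mol·K)
M_H2 = 2.016e-3                        # kg/mol
rho_h2 = P * M_H2 / (R * T)            # ≈ 0.084 kg/m³

LHV_H2 = 120e6                         # J/kg, lower heating value of hydrogen (round figure)
LHV_GASOLINE = 44e6                    # J/kg (round figure)
RHO_GASOLINE = 745.0                   # kg/m³ (round figure)

e_vol_h2 = LHV_H2 * rho_h2             # energy per m³ of H2 gas at these conditions
e_vol_gasoline = LHV_GASOLINE * RHO_GASOLINE
ratio = e_vol_gasoline / e_vol_h2
print(round(ratio))                    # gasoline stores on the order of 1000x more energy per litre
```

    Hydrogen wins handily per kilogram but loses by roughly three orders of magnitude per litre as an uncompressed gas, which is why storage and compression dominate the practical comparison.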

  8. Tsunamis from Tectonic Sources along Caribbean Plate Boundaries

    NASA Astrophysics Data System (ADS)

    Lopez, A. M.; Chacon, S.; Zamora, N.; Audemard, F. A.; Dondin, F. J. Y.; Clouard, V.; Løvholt, F.; Harbitz, C. B.; Vanacore, E. A.; Huerfano Moreno, V. A.

    2015-12-01

    The Working Group 2 (WG2) of the Intergovernmental Coordination Group for the Tsunami and Other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (ICG/CARIBE-EWS), in charge of tsunami hazard assessment, has generated a list of tsunami sources for the Caribbean region. Simulating these worst-case, most credible scenarios would provide an estimate of the resulting effects on coastal areas within the Caribbean. In the past few years, several publications have addressed this issue, resulting in a collection of potential tsunami sources and scenarios. These publications come from a wide variety of sources, from government agencies to academic institutions. Although they provide the scientific community with a list of sources and scenarios, the WG2 chose to evaluate what has been proposed and develop a comprehensive list of sources, leaving aside proposed scenarios. The Caribbean seismotectonics experts among the WG2 members were tasked with comprehensively evaluating which published sources are credible worst cases, and with considering other sources that have been omitted from available reports. Among the published sources are the GEM Faulted Earth Subduction Characterization Project and the LANTEX/Caribe Wave annual exercise publications (2009-2015). Caribbean tectonic features capable of generating tsunamis by seismic dislocation are located along the Northeastern Caribbean, the Lesser Antilles Trench, and the Panamá and Southern Caribbean Deformed Belts. The proposed sources have been evaluated on the basis of historical and instrumental seismicity as well as geological and geophysical studies. This paper presents the sources and their justification as most probable tsunami sources in the context of crustal deformation due to the Caribbean plate interacting with the neighboring North and South America plates. Simulations of these sources are part of a subsequent phase, in which the effects of these tectonically induced tsunamis are evaluated in the near field, and wave history snapshots are used to estimate arrival times and coastal effects at other locations within the Caribbean basin. This study is a contribution of the WG2 of ICG/CARIBE-EWS to UNESCO's Intergovernmental Oceanographic Commission.

  9. Affordances of students' using the World Wide Web as a publishing medium in project-based learning environments

    NASA Astrophysics Data System (ADS)

    Bos, Nathan Daniel

    This dissertation investigates the emerging affordance of the World Wide Web as a place for high school students to become authors and publishers of information. Two empirical studies lay groundwork for student publishing by examining learning issues related to audience adaptation in writing, motivation and engagement with hypermedia, design, problem-solving, and critical evaluation. Two models of student publishing on the World Wide Web were investigated over the course of two 11th-grade project-based science curricula. In the first curricular model, students worked in pairs to design informative hypermedia projects about infectious diseases that were published on the Web. Four case studies were written, drawing on both product- and process-related data sources. Four theoretically important findings are illustrated through these cases: (1) multimedia, especially graphics, seemed to catalyze some students' design processes by affecting the sequence of their design process and by providing a connection between the science content and their personal interest areas; (2) hypermedia design can demand high levels of analysis and synthesis of science content; (3) students can learn to think about science content representation through engagement with challenging design tasks; and (4) students' consideration of an outside audience can be facilitated by teacher-given design principles. The second Web-publishing model examines how students critically evaluate scientific resources on the Web, and how students can contribute to the Web's organization and usability by publishing critical reviews. Students critically evaluated Web resources using a four-part scheme: summarization of content, evaluation of credibility, evaluation of organizational structure, and evaluation of appearance.
Content analyses comparing students' reviews and reviewed Web documents showed that students were proficient at summarizing content of Web documents, identifying their publishing source, and evaluating their organizational features; however, students struggled to identify scientific evidence, bias, or sophisticated use of media in Web pages. Shortcomings were shown to be partly due to deficiencies in the Web pages themselves and partly due to students' inexperience with the medium or lack of critical evaluation skills. Future directions of this idea are discussed, including discussion of how students' reviews have been integrated into a current digital library development project.

  10. The economic value of remote sensing by satellite: An ERTS overview and the value of continuity of service. Volume 2: Source document

    NASA Technical Reports Server (NTRS)

    Andrews, J.; Donziger, A.; Hazelrigg, G. A., Jr.; Heiss, K. P.; Sand, F.; Stevenson, P.

    1974-01-01

    The economic value of an ERS system with a technical capability similar to ERTS, allowing for increased coverage through the use of multiple active satellites in orbit, is presented. A detailed breakdown of the benefits achievable from an ERS system is given, and a methodology for their estimation is established. The ECON case studies in agriculture, water use, and land cover are described, along with the current ERTS system. The cost of a projected ERS system is given.

  11. Flood and Weather Monitoring Using Real-time Twitter Data Streams

    NASA Astrophysics Data System (ADS)

    Demir, I.; Sit, M. A.; Sermet, M. Y.

    2016-12-01

    Social media data are a widely used source for making inferences during public crises and disaster events. In particular, since Twitter provides large-scale data publicly in real time, it is one of the most extensive resources with location information. This abstract provides an overview of a real-time Twitter analysis system to support flood preparedness and response using a comprehensive information-centric flood ontology and natural language processing. Within the scope of this project, we deal with the acquisition and processing of real-time Twitter data streams. The system fetches tweets with specified keywords and classifies them as related to flooding or heavy weather conditions. It uses machine learning algorithms to discover patterns using the correlation between tweets and the Iowa Flood Information System's (IFIS) extensive resources, and uses these patterns to forecast the formation and progress of a potential future flood event. While fetching tweets, predefined hashtags are used for filtering and for enhancing the relevancy of selected tweets. Tweets can also serve as an alternative data source where other data sources are not sufficient for specific tasks. During disasters, the photos that people upload alongside their tweets can be collected and placed at the appropriate locations on a mapping system. This allows decision-making authorities and communities to see the most recent outlook of the disaster interactively. In case of an emergency, the concentration of tweets can help the authorities determine a strategy for reaching people most efficiently while providing them the supplies they need. Thanks to the extendable nature of the flood ontology and framework, results from this project will serve as a guide for other natural disasters and will be shared with the community.
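    The keyword-filtering step described above can be sketched in a few lines. The terms below are invented stand-ins for the project's flood ontology, and the real system applies machine-learning classification on top of this kind of filter rather than a plain keyword match:

```python
# Illustrative flood-related terms; not the project's actual ontology.
FLOOD_TERMS = {"flood", "flooding", "flash flood", "river rising", "levee", "rainfall"}

def is_flood_related(tweet: str) -> bool:
    """Crude keyword filter standing in for the project's NLP classifier."""
    text = tweet.lower()
    return any(term in text for term in FLOOD_TERMS)

tweets = [
    "Heavy rainfall tonight, streets starting to flood near downtown",
    "Great game last night!",
]
print([is_flood_related(t) for t in tweets])  # → [True, False]
```

    In a streaming setting this predicate would sit between the Twitter firehose and the classifier, discarding clearly irrelevant tweets cheaply before any expensive analysis runs.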

  12. Development and Deployment of a Short Rotation Woody Crops Harvesting System Based on a Case New Holland Forage Harvester and SRC Woody Crop Header

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eisenbies, Mark; Volk, Timothy; Abrahamson, Lawrence

    Biomass for biofuels, bioproducts and bioenergy can be sourced from forests, agricultural crops, various residue streams, and dedicated woody or herbaceous crops. Short rotation woody crops (SRWC), like willow and hybrid poplar, are perennial cropping systems that produce a number of environmental and economic development benefits in addition to being a renewable source of biomass that can be produced on marginal land. Both hybrid poplar and willow have several characteristics that make them an ideal feedstock for biofuels, bioproducts, and bioenergy; these include high yields that can be obtained in three to four years, ease of cultivar propagation from dormant cuttings, a broad underutilized genetic base, ease of breeding, ability to resprout after multiple harvests, and feedstock composition similar to other sources of woody biomass. Despite the range of benefits associated with SRWC systems, their deployment has been restricted by high costs, low market acceptance associated with inconsistent chip quality (see below for further explanation), and misperceptions about other feedstock characteristics (see below for further explanation). Harvesting of SRWC is the largest single cost factor (~1/3 of the final delivered cost) in the feedstock supply system. Harvesting is also the second largest input of primary fossil energy in the system after commercial N fertilizer, accounting for about one third of the input. Therefore, improving the efficiency of the harvesting system has the potential to reduce both cost and environmental impact. At the start of this project, we projected that improving the overall efficiency of the harvesting system by 25% would reduce the delivered cost of SRWC by approximately $0.50/MMBtu (or about $7.50/dry ton). This goal was exceeded over the duration of this project, as noted below.

  13. Enhancing participatory approach in water resources management: development of a survey to evaluate stakeholders needs and priorities related to software capabilities

    NASA Astrophysics Data System (ADS)

    Foglia, L.; Rossetto, R.; Borsi, I.; Josef, S.; Boukalova, Z.; Triana, F.; Ghetta, M.; Sabbatini, T.; Bonari, E.; Cannata, M.; De Filippis, G.

    2016-12-01

    The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management) aims at simplifying the application of EU water-related Directives by developing an open-source, public-domain, GIS-integrated platform for planning and management of ground- and surface-water resources. The FREEWAT platform is conceived as a canvas where several distributed and physically based simulation codes are virtually integrated. The choice of such codes was supported by the results of a survey performed by means of questionnaires distributed to 14 case-study FREEWAT project partners and several stakeholders. This was done in the first phase of the project within WP 6 (Enhanced science and participatory approach evidence-based decision making), Task 6.1 (Definition of a "needs/tools" evaluation grid). About 30% of the invited entities and institutions from several EU and non-EU Countries expressed interest in contributing to the survey; most were research institutions, government and geoenvironmental companies, and river basin authorities. The results of the questionnaire provided a spectrum of needs and priorities of partners and stakeholders, which were addressed during the development phase of the FREEWAT platform. The main needs identified related to ground- and surface-water quality, sustainable water management, interaction between groundwater and surface-water bodies, and the design and management of Managed Aquifer Recharge schemes. Needs and priorities were then connected to the specific EU Directives and Regulations to be addressed. One of the main goals of the questionnaires was to collect information and suggestions regarding the use of existing commercial and open-source software tools to address these needs and priorities, and regarding the specific water-related processes and problems that must be addressed.

  14. Scaling Critical Zone analysis tasks from desktop to the cloud utilizing contemporary distributed computing and data management approaches: A case study for project based learning of Cyberinfrastructure concepts

    NASA Astrophysics Data System (ADS)

    Swetnam, T. L.; Pelletier, J. D.; Merchant, N.; Callahan, N.; Lyons, E.

    2015-12-01

    Earth science is making rapid advances through effective utilization of large-scale data repositories such as aerial LiDAR and access to NSF-funded cyberinfrastructure (e.g., the OpenTopography.org data portal, iPlant Collaborative, and XSEDE). Scaling analysis tasks that are traditionally developed on desktops, laptops, or computing clusters to effectively leverage national and regional cyberinfrastructure poses unique challenges and barriers to adoption. To address some of these challenges, in Fall 2014 the 'Applied Cyberinfrastructure Concepts' project-based learning course (ISTA 420/520) at the University of Arizona focused on developing scalable models of 'Effective Energy and Mass Transfer' (EEMT, MJ m⁻² yr⁻¹) for use by the NSF Critical Zone Observatories (CZO) project. EEMT is a quantitative measure of the flux of available energy to the critical zone, and its computation involves inputs that have broad applicability (e.g., solar insolation). The course comprised 25 students with varying levels of computational skill and no prior background in the geosciences, who collaborated with domain experts to develop the scalable workflow. The original workflow, relying on the open-source QGIS platform on a laptop, was scaled to effectively utilize cloud environments (OpenStack), UA campus HPC systems, iRODS, and other XSEDE and OSG resources. The project utilizes public data, e.g., DEMs produced by OpenTopography.org and climate data from Daymet, which are processed using GDAL, GRASS, and SAGA and the Makeflow and Work Queue task-management software packages. Students were placed into collaborative groups to develop the separate aspects of the project. They were allowed to change teams, alter workflows, and design and develop novel code.
The students were able to identify all necessary dependencies, recompile source onto the target execution platforms, and demonstrate a functional workflow, which was further improved upon by one of the group leaders over Spring 2015. All of the code, documentation and workflow description are currently available on GitHub and a public data portal is in development. We present a case study of how students reacted to the challenge of a real science problem, their interactions with end-users, what went right, and what could be done better in the future.
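    The scaling pattern described here, i.e., many independent per-tile computations distributed across workers and then merged, can be sketched with Python's standard library alone. The per-tile function and toy tiles below are placeholders, not the actual EEMT computation or Makeflow/Work Queue machinery:

```python
from concurrent.futures import ThreadPoolExecutor

def per_tile_stat(tile):
    # Placeholder standing in for a per-tile step of the EEMT workflow
    # (e.g., computing mean insolation over a DEM tile).
    return sum(tile) / len(tile)

# Toy "DEM tiles": each list is an independent unit of work.
tiles = [[1.0, 2.0], [3.0, 5.0], [8.0, 0.0]]

# Map the per-tile function across tiles in parallel, then collect results in order.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(per_tile_stat, tiles))
print(results)  # → [1.5, 4.0, 4.0]
```

    Because tiles share no state, the same map-then-merge structure scales from a laptop thread pool to cluster schedulers such as Work Queue, which is what made the course's desktop-to-cloud migration tractable.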

  15. Swept Source OCT Angiography of Neovascular Macular Telangiectasia Type 2

    PubMed Central

    Zhang, Qinqin; Wang, Ruikang K.; Chen, Chieh-Li; Legarreta, Andrew D.; Durbin, Mary K.; An, Lin; Sharma, Utkarsh; Stetson, Paul F.; Legarreta, John E.; Roisman, Luiz; Gregori, Giovanni; Rosenfeld, Philip J.

    2015-01-01

    Objective To image subretinal neovascularization in proliferative macular telangiectasia type 2 (MacTel2) using swept source optical coherence tomography based microangiography (OMAG). Study Design Patients with MacTel2 were enrolled in a prospective, observational study known as the MacTel Project and evaluated using a high-speed 1050nm swept-source OCT (SS-OCT) prototype system. The OMAG algorithm generated en face flow images from three retinal layers, as well as the region bounded by the outer retina and Bruch’s membrane, the choriocapillaris, and the remaining choroidal vasculature. The en face OMAG images were compared to images from fluorescein angiography (FA) and indocyanine green angiography (ICGA). Results Three eyes with neovascular MacTel2 were imaged. The neovascularization was best identified from the en face OMAG images that included a layer between the outer retinal boundary and Bruch’s membrane. OMAG images identified these abnormal vessels better than FA and were comparable to the images obtained using ICGA. In all three cases, OMAG identified choroidal vessels communicating with the neovascularization, and these choroidal vessels were evident in the two cases with ICGA imaging. In one case, monthly injections of bevacizumab reduced the microvascular complexity of the neovascularization, as well as the telangiectatic changes within the retinal microvasculature. In another case, less frequent bevacizumab therapy was associated with growth of the subretinal neovascular complex. Conclusions OMAG imaging provided detailed, depth-resolved information about subretinal neovascularization in MacTel2 eyes demonstrating superiority to FA imaging and similarities to ICGA imaging for documenting the retinal microvascular changes, the size and extent of the neovascular complex, the communications between the neovascular complex and the choroidal circulation, and the response to monthly bevacizumab therapy. PMID:26457402

  16. Integrating experimental and numerical methods for a scenario-based quantitative assessment of subsurface energy storage options

    NASA Astrophysics Data System (ADS)

    Kabuth, Alina; Dahmke, Andreas; Hagrey, Said Attia al; Berta, Márton; Dörr, Cordula; Koproch, Nicolas; Köber, Ralf; Köhn, Daniel; Nolde, Michael; Tilmann Pfeiffer, Wolf; Popp, Steffi; Schwanebeck, Malte; Bauer, Sebastian

    2016-04-01

    Within the framework of the transition to renewable energy sources ("Energiewende"), the German government defined the target of producing 60 % of the final energy consumption from renewable energy sources by the year 2050. However, renewable energies are subject to natural fluctuations. Energy storage can help to buffer the resulting time shifts between production and demand. Subsurface geological structures provide large potential capacities for energy stored in the form of heat or gas on daily to seasonal time scales. In order to explore this potential sustainably, the possible induced effects of energy storage operations have to be quantified for both specified normal operation and events of failure. The ANGUS+ project therefore integrates experimental laboratory studies with numerical approaches to assess subsurface energy storage scenarios and monitoring methods. Subsurface storage options for gas, i.e. hydrogen, synthetic methane and compressed air in salt caverns or porous structures, as well as subsurface heat storage are investigated with respect to site prerequisites, storage dimensions, induced effects, monitoring methods and integration into spatial planning schemes. The conceptual interdisciplinary approach of the ANGUS+ project towards the integration of subsurface energy storage into a sustainable subsurface planning scheme is presented here, and this approach is then demonstrated using the examples of two selected energy storage options: Firstly, the option of seasonal heat storage in a shallow aquifer is presented. Coupled thermal and hydraulic processes induced by periodic heat injection and extraction were simulated in the open-source numerical modelling package OpenGeoSys. Situations of specified normal operation as well as cases of failure in operational storage with leaking heat transfer fluid are considered. Bench-scale experiments provided parameterisations of temperature dependent changes in shallow groundwater hydrogeochemistry. 
As a second example, the option of seasonal hydrogen storage in a deep saline aquifer is considered. The induced thermal and hydraulic multiphase flow processes were simulated. Also, an integrative approach towards geophysical monitoring of gas presence was evaluated by applying these monitoring methods synthetically to realistically defined numerical storage scenarios. Laboratory experiments provided parameterisations of geochemical effects caused by storage gas leakage into shallow aquifers in cases of sealing failure. Ultimately, the analysis of realistically defined scenarios of subsurface energy storage within the ANGUS+ project allows a quantification of the subsurface space claimed by a storage operation and its induced effects. Acknowledgments: This work is part of the ANGUS+ project (www.angusplus.de) and funded by the German Federal Ministry of Education and Research (BMBF) as part of the energy storage initiative "Energiespeicher".

  17. Financing Renewable Energy Projects in Developing Countries: A Critical Review

    NASA Astrophysics Data System (ADS)

    Donastorg, A.; Renukappa, S.; Suresh, S.

    2017-08-01

    Access to clean and stable energy, meeting sustainable development goals, and fossil fuel dependency and depletion are some of the reasons that have pushed developing countries to transform the business-as-usual economy into a more sustainable economy. However, access to and availability of finance is a major challenge for many developing countries. Financing renewable energy projects requires access to significant resources, by multiple parties, at varying points in the project life cycle. This research aims to investigate sources and new trends in financing RE projects in developing countries. For this purpose, a detailed and in-depth literature review has been conducted to explore the sources and trends of current RE financial investment and projects, and to understand the gaps and limitations. This paper concludes that there are various internal and external sources of finance available for RE projects in developing countries.

  18. Vector tomography for reconstructing electric fields with non-zero divergence in bounded domains

    NASA Astrophysics Data System (ADS)

    Koulouri, Alexandra; Brookes, Mike; Rimpiläinen, Ville

    2017-01-01

    In vector tomography (VT), the aim is to reconstruct an unknown multi-dimensional vector field using line integral data. In the case of a 2-dimensional VT, two types of line integral data are usually required. These data correspond to integration of the parallel and perpendicular projection of the vector field along the integration lines and are called the longitudinal and transverse measurements, respectively. In most cases, however, the transverse measurements cannot be physically acquired. Therefore, the VT methods are typically used to reconstruct divergence-free (or source-free) velocity and flow fields that can be reconstructed solely from the longitudinal measurements. In this paper, we show how vector fields with non-zero divergence in a bounded domain can also be reconstructed from the longitudinal measurements without the need of explicitly evaluating the transverse measurements. To the best of our knowledge, VT has not previously been used for this purpose. In particular, we study low-frequency, time-harmonic electric fields generated by dipole sources in convex bounded domains which arise, for example, in electroencephalography (EEG) source imaging. We explain in detail the theoretical background, the derivation of the electric field inverse problem and the numerical approximation of the line integrals. We show that fields with non-zero divergence can be reconstructed from the longitudinal measurements with the help of two sparsity constraints that are constructed from the transverse measurements and the vector Laplace operator. As a comparison to EEG source imaging, we note that VT does not require mathematical modeling of the sources. By numerical simulations, we show that the pattern of the electric field can be correctly estimated using VT and the location of the source activity can be determined accurately from the reconstructed magnitudes of the field.
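    The longitudinal measurement described above, the integral of the parallel projection of the field along a line, can be approximated numerically in a few lines. This sketch assumes a 2-D field given as a Python callable; the function name and discretization are illustrative, not the authors' implementation:

    ```python
    import numpy as np

    def longitudinal_measurement(field, p0, p1, n=1001):
        """Longitudinal VT measurement: trapezoidal approximation of the
        line integral of F . t along the segment p0 -> p1, where t is the
        unit tangent. `field` maps (x, y) -> (Fx, Fy)."""
        p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
        length = np.linalg.norm(p1 - p0)
        t = (p1 - p0) / length                    # unit tangent
        s = np.linspace(0.0, 1.0, n)
        pts = p0 + s[:, None] * (p1 - p0)         # sample points on the line
        integrand = np.array([np.dot(field(x, y), t) for x, y in pts])
        ds = length / (n - 1)
        # trapezoidal rule on a uniform grid
        return ds * (integrand.sum() - 0.5 * (integrand[0] + integrand[-1]))

    # Constant field F = (1, 0) along the x-axis from (0, 0) to (2, 0):
    val = longitudinal_measurement(lambda x, y: (1.0, 0.0), (0, 0), (2, 0))
    print(round(val, 6))  # 2.0
    ```

    The transverse measurement would use the perpendicular projection instead of t; the paper's point is that, for the fields considered, it need not be physically acquired.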

  19. Two cases of intraocular infection with Alaria mesocercaria (Trematoda)

    PubMed

    McDonald, H R; Kazacos, K R; Schatz, H; Johnson, R N

    1994-04-15

    We encountered two cases of human intraocular infection with mesocercariae of Alaria (Trematoda), involving unrelated Asian men who had unilateral decreased vision. Both patients had pigmentary tracks in the retina, areas of active or healed retinitis, or both, and other signs of diffuse unilateral subacute neuroretinitis. Similar, nonnematode worms were seen in the patients' retinas and vitreous, respectively, several years after apparent infection. The worm in Case 1 was analyzed from projected fundus photographs and diagnosed as an Alaria mesocercaria on the basis of its shape, size (500 x 150 microns), and movement; it was successfully killed with laser. The worm in Case 2 was removed surgically from the vitreous and identified as A. mesocercaria, 555 x 190 microns, most likely A. americana. The probable source of infection in the patients was ingestion, in local restaurants, of undercooked frogs' legs containing A. mesocercaria. In addition to causing prolonged intraocular infection, A. mesocercaria was found to be a cause of diffuse unilateral subacute neuroretinitis, a condition previously attributed only to intraocular nematode larvae.

  20. Developing a Domain Ontology: the Case of Water Cycle and Hydrology

    NASA Astrophysics Data System (ADS)

    Gupta, H.; Pozzi, W.; Piasecki, M.; Imam, B.; Houser, P.; Raskin, R.; Ramachandran, R.; Martinez Baquero, G.

    2008-12-01

    A semantic web ontology enables semantic data integration and semantic smart searching. Several organizations have attempted to implement smart registration and integration or searching using ontologies. These are the NOESIS (NSF project: LEAD) and HydroSeek (NSF project: CUAHSI HIS) data discovery engines and the NSF project GEON. All three applications use ontologies to discover data from multiple sources and projects. The NASA WaterNet project was established to identify creative, innovative ways to bridge NASA research results to real world applications, linking decision support needs to available data, observations, and modeling capability. WaterNet utilized the smart query tool Noesis as a testbed to test whether different ontologies (and different catalog searches) could be combined to match resources with user needs. NOESIS contains the upper-level SWEET ontology, which accepts plug-in domain ontologies to refine user search queries, reducing the burden of multiple keyword searches. Another smart search interface is HydroSeek, developed for CUAHSI, which uses a multi-layered concept search ontology, tagging variable names from any number of data sources to specific leaf and higher-level concepts on which the search is executed. This approach has proven to be quite successful in mitigating semantic heterogeneity, as the user does not need to know the semantic specifics of each data source system but just uses a set of common keywords to discover the data for a specific temporal and geospatial domain. This presentation will show that tests with Noesis and HydroSeek lead to the conclusion that the construction of a complex and highly heterogeneous water cycle ontology requires multiple ontology modules. To illustrate the complexity and heterogeneity of a water cycle ontology, HydroSeek successfully utilizes WaterOneFlow to integrate data across multiple different data collections, such as USGS NWIS. 
However, different methodologies are employed by the Earth Science, Hydrological, and Hydraulic Engineering communities, and each community employs models that require different input data. If a sub-domain ontology is created for each of these, describing water balance calculations, then the resulting structure of the semantic network describing these various terms can be rather complex, heterogeneous, and overlapping, and will require "mapping" between equivalent terms in the ontologies, along with the development of an upper-level conceptual or domain ontology to utilize and link to those already in existence.
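The concept-tagging idea behind this kind of keyword search, mapping native variable names from heterogeneous catalogs onto shared concepts, can be sketched as follows. The index contents, catalog names and function are hypothetical, for illustration only, not HydroSeek's actual schema:

```python
# Hypothetical concept index: common keyword -> native variable names.
concept_index = {
    "discharge": {"streamflow", "Q_cfs", "flow_rate"},
    "precipitation": {"rainfall", "precip_mm", "PRCP"},
}

def find_sources(keyword, catalogs):
    """Return (catalog, variable) pairs whose native variable name is
    tagged to the requested concept; unknown keywords match literally."""
    tagged = concept_index.get(keyword, {keyword})
    hits = []
    for catalog, variables in catalogs.items():
        for var in variables:
            if var in tagged:
                hits.append((catalog, var))
    return hits

catalogs = {"NWIS": ["streamflow", "gage_height"],
            "Daymet": ["precip_mm", "tmax"]}
print(find_sources("discharge", catalogs))  # [('NWIS', 'streamflow')]
```

The user queries one common keyword and never needs to know each source system's native naming, which is the semantic-heterogeneity point the abstract makes.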

  1. Open Source Projects in Software Engineering Education: A Mapping Study

    ERIC Educational Resources Information Center

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  2. User Participation in Coproduction of Health Innovation: Proposal for a Synergy Project.

    PubMed

    Nygren, Jens; Zukauskaite, Elena; Westberg, Niklas

    2018-05-09

    This project concerns advancing knowledge, methods, and logic for user participation in coproduction of health innovations. Such advancement is vital for several reasons. From a user perspective, participation in coproduction provides an opportunity to gain real influence over goal definition, design, and implementation of health innovations, ensuring that the solution developed solves real problems in the right ways. From a societal perspective, it is a means to improve the efficiency of health care and the implementation of the Patient Act. As for industry, frameworks and knowledge of coproduction offer tools to operate in a complex sector, with great potential for innovation of services and products. The fundamental objective of this project is to advance knowledge and methods of how user participation in the coproduction of health innovations can be applied in order to benefit users, industry, and the public sector. This project is a synergy project, which means that the objective will be accomplished through collaboration and meta-analysis between three subprojects that address different user groups, apply different strategies to promote human health, and relate to different parts of the health sector. Furthermore, the subprojects focus on distinctive stages in the spectrum of innovation, with the objective of generating knowledge of the innovation process as a whole. The project is organized around three work packages related to three challenges: coproduction, positioning, and realization. Each subproject is designed such that it has its own field of study with clearly identified objectives but also targets work packages to contribute to the project as a whole. The work on the work packages will use case methodology for data collection and analysis based on the subprojects as data sources. 
More concretely, the logic of multiple case studies will be applied, with each subproject representing a separate case; the cases are similar in their attention to user participation in coproduction but differ regarding, for example, context and target groups. At the synergy level, the framework methodology will be used to handle and analyze the vast amount of information generated within the subprojects. The project period is from July 1, 2018 to June 30, 2022. By addressing the objective of this project, we will create new knowledge on how to manage challenges to health innovation associated with the coproduction process, the positioning of solutions, and realization. ©Jens Nygren, Elena Zukauskaite, Niklas Westberg. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.05.2018.

  3. Using Satellite Observations to Evaluate the AeroCOM Volcanic Emissions Inventory and the Dispersal of Volcanic SO2 Clouds in MERRA

    NASA Technical Reports Server (NTRS)

    Hughes, Eric J.; Krotkov, Nickolay; da Silva, Arlindo; Colarco, Peter

    2015-01-01

    Simulation of volcanic emissions in climate models requires information that describes the eruption of the emissions into the atmosphere. While the total amount of gases and aerosols released from a volcanic eruption can be readily estimated from satellite observations, information about the source parameters, like injection altitude, eruption time and duration, is often not directly known. The AeroCOM volcanic emissions inventory provides estimates of eruption source parameters and has been used to initialize volcanic emissions in reanalysis projects, like MERRA. The AeroCOM volcanic emission inventory provides an eruption's daily SO2 flux and plume top altitude, yet an eruption can be very short lived, lasting only a few hours, and emit clouds at multiple altitudes. Case studies comparing the satellite-observed dispersal of volcanic SO2 clouds to simulations in MERRA have shown mixed results. Some cases show good agreement with observations, e.g., Okmok (2008), while for other eruptions the observed initial SO2 mass is half of that in the simulations, e.g., Sierra Negra (2005). In other cases, the initial SO2 amount agrees with the observations but shows very different dispersal rates, e.g., Soufriere Hills (2006). In the aviation hazards community, deriving accurate source terms is crucial for monitoring and short-term forecasting (24-h) of volcanic clouds. Back trajectory methods have been developed which use satellite observations and transport models to estimate the injection altitude, eruption time, and eruption duration of observed volcanic clouds. These methods can provide eruption timing estimates on a 2-hour temporal resolution and estimate the altitude and depth of a volcanic cloud. To better understand the differences between MERRA simulations and volcanic SO2 observations, back trajectory methods are used to estimate the source term parameters for a few volcanic eruptions and compared to their corresponding entries in the AeroCOM volcanic emission inventory. 
The nature of these mixed results is discussed with respect to the source term estimates.

  4. WE-FG-BRA-06: Systematic Study of Target Localization for Bioluminescence Tomography Guided Radiation Therapy for Preclinical Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, B; Reyes, J; Wong, J

    Purpose: To overcome the limitation of CT/CBCT in guiding radiation for soft tissue targets, we developed a bioluminescence tomography (BLT) system for preclinical radiation research. We systematically assessed the system performance in target localization and the ability to resolve two sources in simulations, phantom and in vivo environments. Methods: Multispectral images acquired in a single projection were used for the BLT reconstruction. Simulation studies were conducted for a single spherical source of radius 0.5 to 3 mm at depths of 3 to 12 mm. The same configuration was also applied for the double-source simulation, with source separations varying from 3 to 9 mm. Experiments were performed in a standalone BLT/CBCT system. Two sources with 3 and 4.7 mm separations placed inside a tissue-mimicking phantom were chosen as the test cases. Live mice implanted with a single source at 6 and 9 mm depth, 2 sources with 3 and 5 mm separation at a depth of 5 mm, or 3 sources in the abdomen were also used to illustrate the in vivo localization capability of the BLT system. Results: Simulation and phantom results illustrate that our BLT can provide 3D source localization with approximately 1 mm accuracy. The in vivo results are encouraging in that 1 and 1.7 mm accuracy can be attained for the single source case at 6 and 9 mm depth, respectively. For the 2-source study, both sources can be distinguished at 3 and 5 mm separations at approximately 1 mm accuracy using 3D BLT but not 2D bioluminescence imaging. Conclusion: Our BLT/CBCT system can potentially be applied to localize and resolve targets at a wide range of target sizes, depths and separations. The information provided in this study can be instructive to devise margins for BLT-guided irradiation and suggests that the BLT could guide radiation for multiple targets, such as metastases. Drs. John W. Wong and Iulian I. Iordachita receive royalty payments from a licensing agreement between Xstrahl Ltd and Johns Hopkins University.

  5. The Energy Economics of Financial Structuring for Renewable Energy Projects

    NASA Astrophysics Data System (ADS)

    Rana, Vishwajeet

    2011-12-01

    This dissertation focuses on the various financial structuring options for the renewable energy sector. The projects in this sector are capital-intensive to build but have relatively low operating costs in the long run when compared to traditional energy resources. The large initial capital requirements tend to discourage investors. To encourage renewable investments, the government needs to provide financial incentives. Since these projects ultimately generate returns, the government's monetary incentives go to the sponsors and tax equity investors who build and operate such projects and invest capital in them. These incentives are usually in the form of ITCs, PTCs and accelerated depreciation benefits. Also, in some parts of the world, carbon credits are another form of incentive for the sponsors and equity investors to invest in such turnkey projects. The relative importance of these various considerations, however, differs from sponsor to sponsor, investor to investor and from project to project. This study focuses mainly on the US market and the federal tax benefits and incentives provided by the government. It examines the energy economics used for project decision-making and the parties involved in the transaction: the project developer/sponsor, tax equity investor, debt investor, energy buyer and tax regulator. The study fills the knowledge gap in the decision-making process that takes advantage of tax monetization in traditional after-tax analysis for renewable energy projects when the sponsors do not have the tax capacity to realize the total benefits of the project. A case study for a wind farm, using newly emerging financial structures, validates the hypothesis that these renewable energy sources can meet energy industry economic criteria. The case study also helps to validate the following hypotheses: a) The greater a sponsor's tax appetite, the lower the sponsor's equity dilution. 
b) The use of leverage increases the cost of equity financing and the financing fee. c) Capital contributions by the sponsor are not relevant to the rate of return (IRR) over the life of the project. The overall conclusion is that financial structures can have a major impact on renewable energy meeting energy demand in an economic manner. Finally, the dissertation lays down the foundation for future research that can be conducted in this field. Key words: renewable energy investments, structured finance, financial structuring
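The after-tax analysis described above turns on project IRR, which can be illustrated with a minimal calculation. The bisection helper and the cash flows below (including the production-tax-credit uplift) are hypothetical toy numbers, not figures from the dissertation's wind-farm case study:

```python
def irr(cash_flows, lo=-0.99, hi=1.0, tol=1e-9):
    """Internal rate of return by bisection; assumes NPV is monotonically
    decreasing in the discount rate (one sign change in the flows)."""
    def npv(r):
        return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Hypothetical wind farm: $100 upfront, $12/yr revenue for 20 years;
# a PTC worth $2/yr for the first 10 years raises the after-tax IRR.
base = [-100.0] + [12.0] * 20
with_ptc = [-100.0] + [14.0] * 10 + [12.0] * 10
print(irr(base) < irr(with_ptc))  # True
```

A sponsor without the tax capacity to absorb the credit cannot realize the IRR uplift directly, which is why the structures studied bring in tax equity investors.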

  6. Uncertainty Analyses for Back Projection Methods

    NASA Astrophysics Data System (ADS)

    Zeng, H.; Wei, S.; Wu, W.

    2017-12-01

    So far, few comprehensive error analyses for back projection methods have been conducted, although it is evident that high-frequency seismic waves can be easily affected by earthquake depth, focal mechanisms and the Earth's 3D structures. Here we perform 1D and 3D synthetic tests for two back projection methods, MUltiple SIgnal Classification (MUSIC) (Meng et al., 2011) and Compressive Sensing (CS) (Yao et al., 2011). We generate synthetics for both point sources and finite rupture sources with different depths and focal mechanisms, as well as 1D and 3D structures in the source region. The 3D synthetics are generated through a hybrid scheme combining the Direct Solution Method and the Spectral Element Method. We then back project the synthetic data using MUSIC and CS. The synthetic tests show that the depth phases can be back projected as artificial sources both in space and time. For instance, for a source depth of 10 km, back projection gives a strong signal 8 km away from the true source. Such bias increases with depth, e.g., the error in horizontal location could be larger than 20 km for a depth of 40 km. If the array is located around the nodal direction of direct P-waves, the teleseismic P-waves are dominated by the depth phases. Therefore, back projections are actually imaging the reflection points of depth phases more than the rupture front. Besides depth phases, the strong and long-lasting coda waves due to 3D effects near the trench can lead to additional complexities, which are also tested here. The strength contrast of different frequency contents in the rupture models also produces some variations in the back projection results. In the synthetic tests, MUSIC and CS derive consistent results. While MUSIC is more computationally efficient, CS works better for sparse arrays. In summary, our analyses indicate that the impact of the various factors mentioned above should be taken into consideration when interpreting back projection images, before we can use them to infer earthquake rupture physics.
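The delay-and-stack idea underlying back projection (which MUSIC and CS each refine considerably) can be sketched on a toy 1-D geometry; all names, numbers and the uniform-velocity assumption below are illustrative:

```python
import numpy as np

def back_project(traces, dt, station_x, cand_x, v):
    """Toy 1-D delay-and-stack back projection: for each candidate
    source position, align the station traces by predicted travel time
    and stack; the true source maximizes the stacked amplitude."""
    power = []
    for xs in cand_x:
        stack = np.zeros_like(traces[0])
        for tr, xr in zip(traces, station_x):
            shift = int(round(abs(xr - xs) / v / dt))  # samples of delay
            stack = stack + np.roll(tr, -shift)
        power.append(float(stack.max()))
    return np.array(power)

# Synthetic pulse emitted at x = 30, recorded by three stations.
dt, v = 0.01, 2.0
station_x = [0.0, 50.0, 100.0]
n = 4000
traces = []
for xr in station_x:
    tr = np.zeros(n)
    tr[int(round(abs(xr - 30.0) / v / dt))] = 1.0  # direct arrival only
    traces.append(tr)
cand_x = np.arange(0.0, 101.0, 1.0)
best = cand_x[int(np.argmax(back_project(traces, dt, station_x, cand_x, v)))]
print(best)  # 30.0
```

A mistimed extra arrival such as a depth phase would stack coherently at a shifted candidate position, which is exactly the kind of artificial source the synthetic tests quantify.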

  7. Source of polarised deuterons. (JINR accelerator complex)

    NASA Astrophysics Data System (ADS)

    Fimushkin, V. V.; Belov, A. S.; Kovalenko, A. D.; Kutuzova, L. V.; Prokofichev, Yu. V.; Shimanskiy, S. S.; Vadeev, V. P.

    2008-08-01

    The proposed project assumes the development of a universal high-intensity source of polarized deuterons (protons) using a charge-exchange plasma ionizer. The design output current of the source will be up to 10mA for ↑ D+(↑ H+) and polarization will be up to 90% of the maximal vector (±1) and tensor (+1,-2) polarization. The project is based on the equipment which was supplied within the framework of an agreement between JINR and IUCF (Bloomington, USA). The project will be realized in close cooperation with INR (Moscow, Russia). The source will be installed in the linac hall (LU-20) and polarization of beams will be measured at the output of LU-20. The main purpose of the project is to increase the intensity of the accelerated polarized beams at the JINR Accelerator Complex up to 1010 d/pulse. Calculations and first accelerator runs have shown that the depolarization resonances are absent for the deuteron beam in the entire energy range of the NUCLOTRON. The source could be transformed into a source of polarized negative ions if necessary. The period of reliable operation without participation of the personnel should be within 1000 hours. The project should be implemented within two to two and a half years from the start of funding.

  8. Nonpoint Source Tribal: Award Projects

    EPA Pesticide Factsheets

    Tribal CWA section 319 funding is awarded via base grants and competitive grants. To learn about current nonpoint source funded work in Indian Country, see the project summary descriptions of recent competitive grant awardees.

  9. Synthesis and Comparison of Baseline Avian and Bat Use, Raptor Nesting and Mortality Information from Proposed and Existing Wind Developments: Final Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Wallace P.

    2002-12-01

    Primarily due to concerns generated from observed raptor mortality at the Altamont Pass (CA) wind plant, one of the first commercial electricity generating wind plants in the U.S., new proposed wind projects both within and outside of California have received a great deal of scrutiny and environmental review. A large amount of baseline and operational monitoring data has been collected at proposed and existing U.S. wind plants. The primary use of the avian baseline data collected at wind developments has been to estimate the overall project impacts (e.g., very low, low, moderate, and high relative mortality) on birds, especially raptors and sensitive species (e.g., state and federally listed species). In a few cases, these data have also been used for guiding placement of turbines within a project boundary. This new information has strengthened our ability to accurately predict and mitigate impacts from new projects. This report should assist various stakeholders in the interpretation and use of this large information source in evaluating new projects. This report also suggests that the level of baseline data (e.g., avian use data) required to adequately assess expected impacts of some projects may be reduced. This report provides an evaluation of the ability to predict direct impacts on avian resources (primarily raptors and waterfowl/waterbirds) using less than an entire year of baseline avian use data (one season, two seasons, etc.). This evaluation is important because pre-construction wildlife surveys can be one of the most time-consuming aspects of permitting wind power projects. For baseline data, this study focuses primarily on standardized avian use data usually collected using point count survey methodology and on raptor nest survey data. In addition to avian use and raptor nest survey data, other baseline data are usually collected at a proposed project to further quantify potential impacts. These surveys often include vegetation mapping and state or federal sensitive-status wildlife and plant surveys if there is a likelihood of these species occurring in the vicinity of the project area. This report does not address these types of surveys; however, it is assumed in this document that those surveys are conducted when appropriate to help further quantify potential impacts. The amount and extent of ecological baseline data to collect at a wind project should be determined on a case-by-case basis. The decision should use information gained from this report, recent information from new projects (e.g., Stateline OR/WA), existing project site data from agencies and other knowledgeable groups/individuals, public scoping, and results of vegetation and habitat mapping. Other factors that should also be considered include the likelihood of the presence of sensitive species at the site and expected impacts to those species, project size and project layout.

  10. Multi-objective Optimization of Solar-driven Hollow-fiber Membrane Distillation Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nenoff, Tina M.; Moore, Sarah E.; Mirchandani, Sera

    Securing additional water sources remains a primary concern for arid regions in both the developed and developing world. Climate change is causing fluctuations in the frequency and duration of precipitation, which can be seen as prolonged droughts in some arid areas. Droughts decrease the reliability of surface water supplies, which forces communities to find alternate primary water sources. In many cases, ground water can supplement the use of surface supplies during periods of drought, reducing the need for above-ground storage without sacrificing reliability objectives. Unfortunately, accessible ground waters are often brackish, requiring desalination prior to use, and underdeveloped infrastructure and inconsistent electrical grid access can create obstacles to groundwater desalination in developing regions. The objectives of the proposed project are to (i) mathematically simulate the operation of hollow fiber membrane distillation systems and (ii) optimize system design for off-grid treatment of brackish water. It is anticipated that methods developed here can be used to supply potable water at many off-grid locations in semi-arid regions including parts of the Navajo Reservation. This research is a collaborative project between Sandia and the University of Arizona.

  11. Definition and test of the electromagnetic immunity of UAS for first responders

    NASA Astrophysics Data System (ADS)

    Adami, C.; Chmel, S.; Jöster, M.; Pusch, T.; Suhrke, M.

    2015-11-01

    Recent technological developments considerably lowered the barrier for unmanned aerial systems (UAS) to be employed in a variety of usage scenarios, comprising live video transmission from otherwise inaccessible vantage points. As an example, in the French-German ANCHORS project several UAS guided by swarm intelligence provide aerial views and environmental data of a disaster site while deploying an ad-hoc communication network for first responders. Since being able to operate in harsh environmental conditions is a key feature, the immunity of the UAS against radio frequency (RF) exposure has been studied. Conventional Electromagnetic Compatibility (EMC) applied to commercial and industrial electronics is not sufficient since UAS are airborne and can as such move beyond the bounds within which RF exposure is usually limited by regulatory measures. Therefore, the EMC requirements have been complemented by a set of specific RF test frequencies and parameters where strong sources are expected to interfere in the example project test case of an inland port environment. While no essential malfunctions could be observed up to field strengths of 30 V m⁻¹, a sophisticated, more exhaustive approach for testing against potential sources of interference in key scenarios of UAS usage should be derived from our present findings.

  12. A study of ecological sanitation as an integrated urban water supply system: case study of sustainable strategy for Kuching City, Sarawak, Malaysia.

    PubMed

    Seng, Darrien Mah Yau; Putuhena, Frederik Josep; Said, Salim; Ling, Law Puong

    2009-03-01

    A city consumes a large amount of water. Urban planning and development are becoming more compelling due to growing competition for water, which has led to increasing and conflicting demands. As such, investments in water supply, sanitation and water resources management hold strong potential for a solid return. A pilot project of greywater ecological treatment has been established in Kuching city since 2003. Such a treatment facility opens up an opportunity for wastewater reclamation and reuse as a secondary source of water for non-consumptive purposes. This paper aims to explore that potential in the newly developed ecological treatment project. The Wallingford Software model InfoWorks WS (Water Supply) is employed to carry out hydraulic modeling of a hypothetical greywater recycling system as an integrated part of the Kuching urban water supply, in which the greywater is treated, recycled and reused in the domestic environment. The modeling efforts have shown water savings of about 40% from the investigated system, indicating that the system presents an alternative water source worth exploring in an urban environment.

  13. Polycyclic Aromatic Hydrocarbons and Esophageal Squamous Cell Carcinoma-A Review

    PubMed Central

    Roshandel, Gholamreza; Semnani, Shahryar; Malekzadeh, Reza; Dawsey, Sanford M.

    2018-01-01

    Esophageal cancer (EC) is the 8th most common cancer and the 6th most frequent cause of cancer mortality worldwide. Esophageal squamous cell carcinoma (ESCC) is the most common type of EC. Exposure to polycyclic aromatic hydrocarbons (PAHs) has been suggested as a risk factor for developing ESCC. In this paper we review different aspects of the relationship between PAH exposure and ESCC. PAHs are a group of compounds that are formed by incomplete combustion of organic matter. Studies in humans have shown an association between PAH exposure and development of ESCC in many populations. The results of a recent case-control study in a high risk population in northeastern Iran showed a dramatic dose-response relationship between PAH content in non-tumor esophageal tissue (the target tissue for esophageal carcinogenesis) and ESCC case status, consistent with a causal role for PAH exposure in the pathogenesis of ESCC. Identifying the main sources of exposure to PAHs may be the first and most important step in designing appropriate PAH-reduction interventions for controlling ESCC, especially in high risk areas. Coal smoke and drinking mate have been suggested as important modifiable sources of PAH exposure in China and Brazil, respectively. But the primary source of exposure to PAHs in other high risk areas for ESCC, such as northeastern Iran, has not yet been identified. Thus, environmental studies to determine important sources of PAH exposure should be considered a high priority in future research projects in these areas. PMID:23102250

  14. 18 CFR 4.103 - General provisions for case-specific exemption.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ..., EXEMPTIONS, AND DETERMINATION OF PROJECT COSTS Exemption of Small Hydroelectric Power Projects of 5 Megawatts or Less § 4.103 General provisions for case-specific exemption. (a) Exemptible projects. Subject to... exempt on a case-specific basis any small hydroelectric power project from all or part of Part I of the...

  15. 18 CFR 4.103 - General provisions for case-specific exemption.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ..., EXEMPTIONS, AND DETERMINATION OF PROJECT COSTS Exemption of Small Hydroelectric Power Projects of 5 Megawatts or Less § 4.103 General provisions for case-specific exemption. (a) Exemptible projects. Subject to... exempt on a case-specific basis any small hydroelectric power project from all or part of Part I of the...

  16. 18 CFR 4.103 - General provisions for case-specific exemption.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., EXEMPTIONS, AND DETERMINATION OF PROJECT COSTS Exemption of Small Hydroelectric Power Projects of 5 Megawatts or Less § 4.103 General provisions for case-specific exemption. (a) Exemptible projects. Subject to... exempt on a case-specific basis any small hydroelectric power project from all or part of Part I of the...

  17. 18 CFR 4.103 - General provisions for case-specific exemption.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ..., EXEMPTIONS, AND DETERMINATION OF PROJECT COSTS Exemption of Small Hydroelectric Power Projects of 5 Megawatts or Less § 4.103 General provisions for case-specific exemption. (a) Exemptible projects. Subject to... exempt on a case-specific basis any small hydroelectric power project from all or part of Part I of the...

  18. 18 CFR 4.103 - General provisions for case-specific exemption.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ..., EXEMPTIONS, AND DETERMINATION OF PROJECT COSTS Exemption of Small Hydroelectric Power Projects of 5 Megawatts or Less § 4.103 General provisions for case-specific exemption. (a) Exemptible projects. Subject to... exempt on a case-specific basis any small hydroelectric power project from all or part of Part I of the...

  19. VISAGE Visualization for Integrated Satellite, Airborne and Ground-Based Data Exploration

    NASA Technical Reports Server (NTRS)

    Conover, Helen; Berendes, Todd; Naeger, Aaron; Maskey, Manil; Gatlin, Patrick; Wingo, Stephanie; Kulkarni, Ajinkya; Gupta, Shivangi; Nagaraj, Sriraksha; Wolff, David

    2017-01-01

    The primary goal of the VISAGE project is to facilitate more efficient Earth Science investigations via a tool that can provide visualization and analytic capabilities for diverse coincident datasets. This proof-of-concept project will be centered around the GPM Ground Validation program, which provides a valuable source of intensive, coincident observations of atmospheric phenomena. The data are from a wide variety of ground-based, airborne and satellite instruments, with a wide diversity in spatial and temporal scales, variables, and formats, which makes these data difficult to use together. VISAGE will focus on "golden cases" where most ground instruments were in operation and multiple research aircraft sampled a significant weather event, ideally while the GPM Core Observatory passed overhead. The resulting tools will support physical process studies as well as satellite and model validation.

  20. A Note on Compiling Fortran

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Busby, L. E.

    Fortran modules tend to serialize compilation of large Fortran projects by introducing dependencies among the source files. If file A depends on file B (A uses a module defined by B), you must finish compiling B before you can begin compiling A. Some Fortran compilers (Intel ifort, GNU gfortran and IBM xlf, at least) offer an option to "verify syntax", with the side effect of also producing any associated Fortran module files. As it happens, this option usually runs much faster than the object code generation and optimization phases. For some projects on some machines, it can be advantageous to compile in two passes: the first pass generates the module files, quickly; the second pass produces the object files, in parallel. We achieve a 3.8× speedup in the case study below.
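The two-pass scheme described in this note can be sketched as a small build driver. This is a sketch only: the file names are hypothetical, and gfortran's `-fsyntax-only` is used as one example of a "verify syntax" option that also writes module files.

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

def two_pass_commands(sources, fc="gfortran", flags=("-O2",)):
    """Build the command lines for a two-pass Fortran compile.

    Pass 1 (serial, in dependency order): syntax check only, which as a
    side effect writes the .mod files that dependent sources need.
    Pass 2 (any order, parallelizable): full object-code generation.
    """
    pass1 = [[fc, "-fsyntax-only", src] for src in sources]
    pass2 = [[fc, "-c", *flags, src] for src in sources]
    return pass1, pass2

def build(sources):
    pass1, pass2 = two_pass_commands(sources)
    for cmd in pass1:                  # quick serial pass: produces .mod files
        subprocess.run(cmd, check=True)
    with ThreadPoolExecutor() as pool:  # slow pass in parallel: produces .o files
        list(pool.map(lambda c: subprocess.run(c, check=True), pass2))
```

Pass 1 must still respect module dependency order, but since it is cheap, the serialization it imposes costs little; all of the expensive optimization work lands in the embarrassingly parallel second pass.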

  1. Corporate Mortality Files and Late Industrial Necropolitics.

    PubMed

    Little, Peter C

    2017-10-05

    This article critically examines the corporate production, archival politics, and socio-legal dimensions of corporate mortality files (CMFs), the largest corporate archive developed by IBM to systematically document industrial exposures and occupational health outcomes for electronics workers. I first provide a history of IBM's CMF project, which amounts to a comprehensive mortality record for IBM employees over the past 40 years. Next, I explore a recent case in Endicott, New York, birthplace of IBM, where the U.S. National Institute for Occupational Safety and Health studied IBM's CMFs for workers at IBM's former Endicott plant. Tracking the production of the IBM CMF, the strategic avoidance of this source of big data as evidence for determining a recent legal settlement, alongside local critiques of the IBM CMF project, the article develops what I call "late industrial necropolitics." © 2017 by the American Anthropological Association.

  2. Efficient management of high level XMM-Newton science data products

    NASA Astrophysics Data System (ADS)

    Zolotukhin, Ivan

    2015-12-01

    As is the case for many large projects, XMM-Newton data have been used by the community to produce many valuable higher-level data products. However, even after 15 years of successful mission operation, the potential of these data is not yet fully realized, mostly due to logistical and data management issues. We present a web application, http://xmm-catalog.irap.omp.eu, to highlight the idea that existing public high-level data collections generate significant added research value when organized and exposed properly. Several application features, such as access to the all-time XMM-Newton photon database and online fitting of extracted source spectra, were never available before. In this talk we share best practices we worked out during the development of this website and discuss their potential use for other large projects generating astrophysical data.

  3. Rapid kinematic finite source inversion for Tsunami Early Warning using high rate GNSS data

    NASA Astrophysics Data System (ADS)

    Chen, K.; Liu, Z.; Song, Y. T.

    2017-12-01

    Recently, the Global Navigation Satellite System (GNSS) has been used for rapid earthquake source inversion towards tsunami early warning. In practice, two approaches, i.e., static finite source inversion based on permanent co-seismic offsets and kinematic finite source inversion using high-rate (>= 1 Hz) co-seismic displacement waveforms, are often employed to fulfill the task. The static inversion is relatively easy to implement and does not require additional constraints on rupture velocity, duration, and temporal variation. However, since most GNSS receivers are deployed onshore, on one side of the subduction fault, static finite source inversion with GNSS has very limited resolution on near-trench fault slip. On the other hand, high-rate GNSS displacement waveforms, which contain the timing information of earthquake rupture explicitly and static offsets implicitly, have the potential to improve near-trench resolution by reconciling with the depth-dependent megathrust rupture behaviors. In this contribution, we assess the performance of rapid kinematic finite source inversion using high-rate GNSS for three selected historical tsunamigenic cases: the 2010 Mentawai, 2011 Tohoku and 2015 Illapel events. The 2010 Mentawai case is a typical tsunami earthquake, with most slip concentrated near the trench. The static inversion has little resolution there and incorrectly puts slip at greater depth (>10 km). In contrast, although the recorded GNSS displacement waveforms are deficient in high-frequency energy, the kinematic source inversion recovers a shallow slip patch (depth less than 6 km) and tsunami runups are predicted quite reasonably. For the other two events, slip from the kinematic and static inversions shows similar characteristics and comparable tsunami scenarios, which may be related to the density of the GNSS network and the behavior of the rupture.
    Acknowledging the complexity of kinematic source inversion in real time, we adopt the back-projection approach to provide constraints on rupture velocity.

  4. The Ensembl genome database project.

    PubMed

    Hubbard, T; Barker, D; Birney, E; Cameron, G; Chen, Y; Clark, L; Cox, T; Cuff, J; Curwen, V; Down, T; Durbin, R; Eyras, E; Gilbert, J; Hammond, M; Huminiecki, L; Kasprzyk, A; Lehvaslaiho, H; Lijnzaad, P; Melsopp, C; Mongin, E; Pettett, R; Pocock, M; Potter, S; Rust, A; Schmidt, E; Searle, S; Slater, G; Smith, J; Spooner, W; Stabenau, A; Stalker, J; Stupka, E; Ureta-Vidal, A; Vastrik, I; Clamp, M

    2002-01-01

    The Ensembl (http://www.ensembl.org/) database project provides a bioinformatics framework to organise biology around the sequences of large genomes. It is a comprehensive source of stable automatic annotation of the human genome sequence, with confirmed gene predictions that have been integrated with external data sources, and is available as either an interactive web site or as flat files. It is also an open source software engineering project to develop a portable system able to handle very large genomes and associated requirements from sequence analysis to data storage and visualisation. The Ensembl site is one of the leading sources of human genome sequence annotation and provided much of the analysis for publication by the international human genome project of the draft genome. The Ensembl system is being installed around the world in both companies and academic sites on machines ranging from supercomputers to laptops.

  5. The role of economic evaluation in the decision-making process of family physicians: design and methods of a qualitative embedded multiple-case study

    PubMed Central

    Lessard, Chantale; Contandriopoulos, André-Pierre; Beaulieu, Marie-Dominique

    2009-01-01

    Background A considerable number of resource allocation decisions are made daily at the point of the clinical encounter, especially in primary care, where 80 percent of health problems are managed. Ignoring economic evaluation evidence in individual clinical decision-making may have a broad impact on the efficiency of health services. To date, almost all studies on the use of economic evaluation in decision-making have used a quantitative approach, and few have investigated decision-making at the clinical level. An important question is whether economic evaluations affect clinical practice. The project is an intervention research study designed to understand the role of economic evaluation in the decision-making process of family physicians (FPs). The project's contributions will be framed from the perspective of Pierre Bourdieu's sociological theory. Methods/design A qualitative research strategy is proposed. We will conduct an embedded multiple-case study design. Ten case studies will be performed. The FPs will be the unit of analysis. The sampling strategies will be directed towards theoretical generalization. The 10 selected cases will be intended to reflect a diversity of FPs. There will be two embedded units of analysis: FPs (micro-level of analysis) and the field of family medicine (macro-level of analysis). The division of the determinants of practice/behaviour into two groups, corresponding to the macro-structural level and the micro-individual level, is the basis for Bourdieu's mode of analysis. The sources of data collection for the micro-level analysis will be 10 life history interviews with FPs, documents and observational evidence. The sources of data collection for the macro-level analysis will be documents and 9 open-ended, focused interviews with key informants from medical associations and academic institutions. The analytic induction approach to data analysis will be used.
A list of codes will be generated based on both the original framework and new themes introduced by the participants. We will conduct within-case and cross-case analyses of the data. Discussion The question of the role of economic evaluation in FPs' decision-making is of great interest to scientists, health care practitioners, managers and policy-makers, as well as to consultants, industry, and society. It is believed that the proposed research approach will make an original contribution to the development of knowledge, both empirical and theoretical. PMID:19210787

  6. Open-source hardware for medical devices

    PubMed Central

    2016-01-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device. PMID:27158528

  7. Open-source hardware for medical devices.

    PubMed

    Niezen, Gerrit; Eslambolchilar, Parisa; Thimbleby, Harold

    2016-04-01

    Open-source hardware is hardware whose design is made publicly available so anyone can study, modify, distribute, make and sell the design or the hardware based on that design. Some open-source hardware projects can potentially be used as active medical devices. The open-source approach offers a unique combination of advantages, including reduced costs and faster innovation. This article compares 10 open-source healthcare projects in terms of how easy it is to obtain the required components and build the device.

  8. IEA Wind Task 26 - Multi-national Case Study of the Financial Cost of Wind Energy; Work Package 1 Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwabe, P.; Lensink, S.; Hand, M.

    2011-03-01

    The lifetime cost of wind energy comprises a number of components, including the investment cost, operation and maintenance costs, financing costs, and annual energy production. Accurate representation of these cost streams is critical in estimating a wind plant's cost of energy. Some of these cost streams will vary over the life of a given project. From the outset of project development, investors in wind energy have relatively certain knowledge of the plant's lifetime cost of wind energy. This is because a wind energy project's installed costs and mean wind speed are known early on, and wind generation generally has low variable operation and maintenance costs, zero fuel cost, and no carbon emissions cost. Despite these inherent characteristics, there are wide variations in the cost of wind energy internationally, which is the focus of this report. Using a multinational case-study approach, this work seeks to understand the sources of wind energy cost differences among seven countries under International Energy Agency (IEA) Wind Task 26 - Cost of Wind Energy. The participating countries in this study include Denmark, Germany, the Netherlands, Spain, Sweden, Switzerland, and the United States. Due to data availability, onshore wind energy is the primary focus of this study, though a small sample of reported offshore cost data is also included.

  9. The CASE Project: Evaluation of Case-Based Approaches to Learning and Teaching in Statistics Service Courses

    ERIC Educational Resources Information Center

    Fawcett, Lee

    2017-01-01

    The CASE project (Case-based Approaches to Statistics Education; see www.mas.ncl.ac.uk/~nlf8/innovation) was established to investigate how the use of real-life, discipline-specific case study material in Statistics service courses could improve student engagement, motivation, and confidence. Ultimately, the project aims to promote deep learning…

  10. Considerations Related To Human Intrusion In The Context Of Disposal Of Radioactive Waste-The IAEA HIDRA Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seitz, Roger; Kumano, Yumiko; Bailey, Lucy

    2014-01-09

    The principal approaches for management of radioactive waste are commonly termed 'delay and decay', 'concentrate and contain' and 'dilute and disperse'. Containing the waste and isolating it from the human environment, by burying it, is considered to increase safety and is generally accepted as the preferred approach for managing radioactive waste. However, this approach results in concentrated sources of radioactive waste contained in one location, which can pose hazards should the facility be disrupted by human action in the future. The International Commission on Radiological Protection (ICRP), International Atomic Energy Agency (IAEA), and Organization for Economic Cooperation and Development/Nuclear Energy Agency (OECD/NEA) agree that some form of inadvertent human intrusion (HI) needs to be considered to address the potential consequences in the case of loss of institutional control and loss of memory of the disposal facility. Requirements are reflected in national regulations governing radioactive waste disposal. However, in practice, these requirements often differ from country to country, which is then reflected in the actual implementation of HI as part of a safety case. The IAEA project on HI in the context of Disposal of RadioActive waste (HIDRA) has been started to identify potential areas for improved consistency in the consideration of HI. The expected outcome is to provide recommendations on how to address human actions in the safety case in the future, and how the safety case may be used to demonstrate robustness and optimize siting, design and waste acceptance criteria.

  11. SAGE as a Source for Undergraduate Research Projects

    ERIC Educational Resources Information Center

    Hutz, Benjamin

    2017-01-01

    This article examines the use of the computer algebra system SAGE for undergraduate student research projects. After reading this article, the reader should understand the benefits of using SAGE as a source of research projects and how to commence working with SAGE. The author proposes a tiered working group model to allow maximum benefit to the…

  12. Electric energy savings from new technologies. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrer, B.J.; Kellogg, M.A.; Lyke, A.J.

    1986-09-01

    The purpose of the report is to provide information about the electricity-saving potential of new technologies that OCEP can use in developing alternative long-term projections of US electricity consumption. Low-, base-, and high-case scenarios of the electricity savings for 10 technologies were prepared. The total projected annual savings for the year 2000 for all 10 technologies were 137 billion kilowatt hours (BkWh), 279 BkWh, and 470 BkWh, respectively, for the three cases. The magnitude of these savings projections can be gauged by comparing them to the Department's reference case projection for the 1985 National Energy Policy Plan. In the Department's reference case, total consumption in 2000 is projected to be 3319 BkWh. Because approximately 75% of the base-case estimate of savings is already incorporated into the reference projection, only 25% of the savings estimated here should be subtracted from the reference projection for analysis purposes.
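The adjustment described in the final sentences can be reproduced with a few lines of arithmetic; the figures are taken from the abstract, and the variable names are ours.

```python
# Projected annual electricity savings in 2000 (billion kWh), per the abstract.
savings_low, savings_base, savings_high = 137, 279, 470
reference_consumption = 3319   # Department's reference-case projection for 2000

# About 75% of the base-case savings are already built into the reference
# projection, so only the remaining 25% should be subtracted from it.
incremental = 0.25 * savings_base
adjusted = reference_consumption - incremental
print(incremental, adjusted)   # -> 69.75 3249.25
```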

  13. Acoustical source reconstruction from non-synchronous sequential measurements by Fast Iterative Shrinkage Thresholding Algorithm

    NASA Astrophysics Data System (ADS)

    Yu, Liang; Antoni, Jerome; Leclere, Quentin; Jiang, Weikang

    2017-11-01

    Acoustical source reconstruction is a typical inverse problem, whose minimum reconstruction frequency hinges on the size of the array and whose maximum frequency depends on the spacing between the microphones. To extend the reconstruction frequency range and reduce the cost of the acquisition system, Cyclic Projection (CP), a method of sequential measurements without reference, was recently investigated (JSV, 2016, 372:31-49). In this paper, the Propagation-based Fast Iterative Shrinkage Thresholding Algorithm (Propagation-FISTA) is introduced, which improves CP in two aspects: (1) the number of acoustic sources is no longer needed, the only assumption being that of a "weakly sparse" eigenvalue spectrum; (2) the construction of the spatial basis is much easier and adaptive to practical acoustical measurement scenarios, benefiting from the introduction of a propagation-based spatial basis. The proposed Propagation-FISTA is first investigated with different simulations and experimental setups and is next illustrated with an industrial case.
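The abstract names FISTA but does not spell out the iteration. As general background only, the sketch below is the standard FISTA for an l1-regularized least-squares problem, not the propagation-based variant of the paper, and the toy sparse-recovery data are our own assumption.

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t*||x||_1 (elementwise shrinkage)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, n_iter=500):
    """Minimize 0.5*||A x - b||^2 + lam*||x||_1 with FISTA."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1]); y = x.copy(); t = 1.0
    for _ in range(n_iter):
        # Gradient step on the smooth part, then shrinkage (proximal step).
        x_new = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        # Nesterov momentum update that gives FISTA its fast convergence.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Toy sparse-recovery check: 2 active "sources" out of 10 candidates.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10); x_true[[2, 7]] = [1.0, -0.5]
x_hat = fista(A, A @ x_true, lam=0.01)
```

The "weakly sparse" assumption of the paper plays the role that the l1 penalty plays here: only a few coefficients are expected to be significantly nonzero.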

  14. Simulation of the hybrid and steady state advanced operating modes in ITER

    NASA Astrophysics Data System (ADS)

    Kessel, C. E.; Giruzzi, G.; Sips, A. C. C.; Budny, R. V.; Artaud, J. F.; Basiuk, V.; Imbeaux, F.; Joffrin, E.; Schneider, M.; Murakami, M.; Luce, T.; St. John, Holger; Oikawa, T.; Hayashi, N.; Takizuka, T.; Ozeki, T.; Na, Y.-S.; Park, J. M.; Garcia, J.; Tucillo, A. A.

    2007-09-01

    Integrated simulations are performed to establish a physics basis, in conjunction with present tokamak experiments, for the operating modes in the International Thermonuclear Experimental Reactor (ITER). Simulations of the hybrid mode are done using both fixed and free-boundary 1.5D transport evolution codes including CRONOS, ONETWO, TSC/TRANSP, TOPICS and ASTRA. The hybrid operating mode is simulated using the GLF23 and CDBM05 energy transport models. The injected powers are limited to the negative ion neutral beam, ion cyclotron and electron cyclotron heating systems. Several plasma parameters and source parameters are specified for the hybrid cases to provide a comparison of 1.5D core transport modelling assumptions, source physics modelling assumptions, as well as numerous peripheral physics modelling. Initial results indicate that very strict guidelines will need to be imposed on the application of GLF23, for example, to make useful comparisons. Some of the variations among the simulations are due to source models which vary widely among the codes used. In addition, there are a number of peripheral physics models that should be examined, some of which include fusion power production, bootstrap current, treatment of fast particles and treatment of impurities. The hybrid simulations project to fusion gains of 5.6-8.3, βN values of 2.1-2.6 and fusion powers ranging from 350 to 500 MW, under the assumptions outlined in section 3. Simulations of the steady state operating mode are done with the same 1.5D transport evolution codes cited above, except the ASTRA code. In these cases the energy transport model is more difficult to prescribe, so that energy confinement models will range from theory based to empirically based. The injected powers include the same sources as used for the hybrid with the possible addition of lower hybrid. 
The simulations of the steady state mode project to fusion gains of 3.5-7, βN values of 2.3-3.0 and fusion powers of 290 to 415 MW, under the assumptions described in section 4. These simulations will be presented and compared with particular focus on the resulting temperature profiles, source profiles and peripheral physics profiles. The steady state simulations are at an early stage and are focused on developing a range of safety factor profiles with 100% non-inductive current.

  15. Evidence from Social Service Enhancement Projects: Selected Cases from Norway's HUSK Project.

    PubMed

    Johannessen, Asbjorn; Eide, Solveig Botnen

    2015-01-01

    Through this article the authors describe the social service context of the HUSK (The University Research Program to Support Selected Municipal Social Service Offices) projects and briefly describe 10 of the 50 projects funded throughout the country. The welfare state context for the cases and the criteria for case selection are also provided. The 10 cases are organized into three categories that feature the role of dialogue, educational innovation, and service innovation. These cases provide the foundation for the analysis and implications located in the subsequent articles of the special issue.

  16. Case Study for the ARRA-Funded Ground Source Heat Pump Demonstration at Ball State University

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Im, Piljae; Liu, Xiaobing; Henderson, Jr., Hugh

    With funding provided by the American Recovery and Reinvestment Act (ARRA), 26 ground-source heat pump (GSHP) projects were competitively selected in 2009 to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. One of the selected demonstration projects is a district central GSHP system installed at Ball State University (BSU) in Muncie, IN. Prior to implementing the district GSHP system, 47 major buildings in BSU were served by a central steam plant with four coal-fired and three natural-gas-fired steam boilers. Cooling was provided by five water-cooled centrifugal chillers at the District Energy Station South (DESS). The new district GSHP system replaced the existing coal-fired steam boilers and conventional water-cooled chillers. It uses ground-coupled heat recovery (HR) chillers to meet the simultaneous heating and cooling demands of the campus. The actual performance of the GSHP system was analyzed based on available measured data from August 2015 through July 2016, construction drawings, maintenance records, personal communications, and construction costs. Since Phase 1 was funded in part by the ARRA grant, it is the focus of this case study. The annual energy consumption of the GSHP system was calculated based on the available measured data and other related information. It was compared with the performance of a baseline scenario: a conventional water-cooled chiller and natural-gas-fired boiler system, both of which meet the minimum energy efficiencies allowed by the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE 90.1-2013). The comparison was made to determine source energy savings, energy cost savings, and CO2 emission reductions achieved by the GSHP system. A cost analysis was performed to evaluate the simple payback of the GSHP system.
    The following sections summarize the results of the analysis, the lessons learned, and recommendations for improvement in the operation of this district GSHP system.

  17. Prediction Interval Development for Wind-Tunnel Balance Check-Loading

    NASA Technical Reports Server (NTRS)

    Landman, Drew; Toro, Kenneth G.; Commo, Sean A.; Lynn, Keith C.

    2014-01-01

    Results from the Facility Analysis Verification and Operational Reliability project revealed a critical gap in capability in ground-based aeronautics research applications. Without a standardized process for check-loading the wind-tunnel balance or the model system, the quality of the aerodynamic force data collected varied significantly between facilities. A prediction interval is required in order to confirm a check-loading. The prediction interval provides expected upper and lower bounds on the balance load prediction at a given confidence level. A method has been developed which accounts for sources of variability due to calibration and check-load application. The method for calculating the prediction interval is described, and a case study demonstrating its use is provided. Validation of the method is demonstrated for the case study based on the probability of capture of confirmation points.
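    The prediction interval described in this abstract is standard regression machinery. The sketch below is a minimal illustration only, assuming a single-load, simple-linear-regression calibration rather than the paper's multi-component balance model; the calibration data are invented and the 95% t-quantile for 8 degrees of freedom is hard-coded.

```python
import math

# Hypothetical calibration data: applied load (x) vs. balance reading (y).
x = [0, 10, 20, 30, 40, 50, 60, 70, 80, 90]
y = [0.1, 10.2, 19.8, 30.3, 39.9, 50.2, 59.7, 70.1, 80.4, 89.8]

n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
intercept = ybar - slope * xbar

# Residual standard error with n - 2 degrees of freedom.
sse = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
s = math.sqrt(sse / (n - 2))

def prediction_interval(x0, t=2.306):  # t_{0.975, df=8} for a 95% interval
    """Expected bounds on a single new check-load reading at load x0."""
    yhat = intercept + slope * x0
    half = t * s * math.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / sxx)
    return yhat - half, yhat + half

lo, hi = prediction_interval(45)
print(lo, hi)  # a confirmation point at x0 = 45 should fall inside [lo, hi]
```

    A check-loading is then "confirmed" when the measured reading falls inside the interval; the paper's validation step amounts to checking how often confirmation points are captured at the stated confidence level.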

  18. Total variation-based neutron computed tomography

    NASA Astrophysics Data System (ADS)

    Barnard, Richard C.; Bilheux, Hassina; Toops, Todd; Nafziger, Eric; Finney, Charles; Splitter, Derek; Archibald, Rick

    2018-05-01

    We formulate the neutron computed tomography reconstruction problem as an inverse problem with a total variation penalty. In the case of highly under-resolved angular measurements, the total variation penalty suppresses the high-frequency artifacts that appear in filtered back projections. To compute solutions efficiently, we implement a variation of the split Bregman algorithm; owing to the error-forgetting nature of the algorithm, the computational cost of each update can be significantly reduced by using very inexact approximate linear solvers. We demonstrate the effectiveness of the algorithm in severely angle-limited settings using synthetic test problems as well as data obtained from a high-flux neutron source. The algorithm removes artifacts and can even roughly capture small features when an extremely small number of angles is used.
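    The effect of a total variation penalty can be seen in a much smaller setting than tomography. The sketch below is not the paper's split Bregman algorithm: it denoises a 1D piecewise-constant signal by plain gradient descent on a smoothed TV term, with invented signal, weight, and step-size values, purely to show how TV suppresses high-frequency oscillation while preserving a jump.

```python
import math

# Piecewise-constant ground truth with a deterministic alternating "noise".
clean = [0.0] * 20 + [1.0] * 20
noisy = [c + (0.2 if i % 2 == 0 else -0.2) for i, c in enumerate(clean)]

def tv_denoise(f, lam=0.3, eps=0.01, steps=3000, lr=0.05):
    """Gradient descent on 0.5*||u - f||^2 + lam * sum_i sqrt((u[i+1]-u[i])^2 + eps)."""
    u = list(f)
    n = len(u)
    for _ in range(steps):
        g = [u[i] - f[i] for i in range(n)]          # data-fidelity gradient
        for i in range(n - 1):                       # smoothed-TV gradient
            d = u[i + 1] - u[i]
            w = lam * d / math.sqrt(d * d + eps)
            g[i] -= w
            g[i + 1] += w
        for i in range(n):
            u[i] -= lr * g[i]
    return u

u = tv_denoise(noisy)
err_noisy = sum((a - b) ** 2 for a, b in zip(noisy, clean))
err_tv = sum((a - b) ** 2 for a, b in zip(u, clean))
print(err_tv < err_noisy)  # TV removes the oscillation while keeping the jump
```

    Split Bregman attacks the same objective but splits the TV term off via auxiliary variables, which is what allows the inexact inner linear solves mentioned in the abstract.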

  19. The Financing of Media Projects for Development.

    ERIC Educational Resources Information Center

    Spain, Peter L.

    1978-01-01

    Discusses the financing of Third World media projects that are designed for development, and reports on five main sources of funding--government sources, international agencies, advertising sales, private local support, and self-support. (Author/JEG)

  20. Rethink Disposable: Packaging Waste Source Reduction Pilot Project

    EPA Pesticide Factsheets

    Information about the SFBWQP Rethink Disposable: Packaging Waste Source Reduction Pilot Project, part of an EPA competitive grant program to improve SF Bay water quality focused on restoring impaired waters and enhancing aquatic resources.

  1. ITS Projects In The Northeast Corridor

    DOT National Transportation Integrated Search

    1996-01-01

    This document contains brief descriptions of ITS projects recently completed, underway, or planned within the Northeast Corridor. The basic source of this information is the sponsoring agencies, although other secondary sources have also been used...

  2. The Woodworker's Website: A Project Management Case Study

    ERIC Educational Resources Information Center

    Jance, Marsha

    2014-01-01

    A case study that focuses on building a website for a woodworking business is discussed. Project management and linear programming techniques can be used to determine the time required to complete the website project discussed in the case. This case can be assigned to students in an undergraduate or graduate decision modeling or management science…

  3. Implementation of CUAHSI-HIS Community Project Components in a Local Observatory

    NASA Astrophysics Data System (ADS)

    Muste, M.; Arnold, N.; Kim, D.

    2008-12-01

    The deployment of the eleven WATERS Network local observatories using CUAHSI-HIS project products showed that water observations data collected by academic investigators could be stored, published on the Internet, federated with water observations data published by water agencies, and searched using a concept framework that connects with variables in each individual data source. For many within the water resources community, the CUAHSI-HIS community project represents a new opportunity to approach the management, publication, and analysis of their data systematically - i.e., moving from collections of ASCII text or spreadsheet files to relational data models. This research describes the initial efforts carried out by a University of Iowa research group during the component implementation of a hydrologic community project in a local CI-based digital watershed (DW). The goal was to test what types of data queries the DW can handle and see how it performs in use cases where data streams are coupled with models for continuous forecasting. This paper also discusses the general context for the DW development and summarizes the lessons learned by the group during this initial developmental stage. Given the uniform and scalable nature of the community project components, it is expected that the workflows presented herein are transferable to other users and other watersheds.

  4. Creating a Project on Difference Equations with Primary Sources: Challenges and Opportunities

    ERIC Educational Resources Information Center

    Ruch, David

    2014-01-01

    This article discusses the creation of a student project about linear difference equations using primary sources. Early 18th-century developments in the area are outlined, focusing on efforts by Abraham De Moivre (1667-1754) and Daniel Bernoulli (1700-1782). It is explained how primary sources from these authors can be used to cover material…
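    The characteristic-root technique that De Moivre and Bernoulli developed for linear difference equations is easy to demonstrate on the Fibonacci recurrence, the classic example from that period. The sketch below solves u_{n+1} = u_n + u_{n-1} via its characteristic equation x^2 = x + 1 and checks the closed form against direct iteration.

```python
import math

# Roots of the characteristic equation x^2 = x + 1.
phi = (1 + math.sqrt(5)) / 2   # dominant root
psi = (1 - math.sqrt(5)) / 2   # second root

def fib_closed(n):
    """Closed-form (Binet-type) solution of the difference equation."""
    return round((phi ** n - psi ** n) / math.sqrt(5))

def fib_recur(n):
    """Direct iteration of the recurrence, for comparison."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print([fib_closed(n) for n in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```

    The same recipe (find the roots of the characteristic polynomial, then fit the coefficients of the general solution to the initial conditions) covers any homogeneous linear difference equation with constant coefficients and distinct roots.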

  5. Modelling as a means to promote water diplomacy in Southern Africa: the Stampriet Transboundary Aquifer System case study

    NASA Astrophysics Data System (ADS)

    De Filippis, Giovanna; Carvalho Resende, Tales; Filali-Meknassi, Youssef; Puri, Shaminder; Kenabatho, Piet; Amakali, Maria; Majola, Kwazikwakhe; Rossetto, Rudy

    2017-04-01

    Within the framework of the "Governance of Groundwater Resources in Transboundary Aquifers" (GGRETA) project, funded by the Swiss Agency for Development and Cooperation (SDC), the Governments of Botswana, Namibia and South Africa, jointly with the UNESCO International Hydrological Programme (UNESCO-IHP) are undertaking an assessment of the Stampriet Transboundary Aquifer System (STAS). The importance of the STAS to the region draws from the fact that it is the only permanent and dependable water resource in the area, which covers 87000 km2 from Central Namibia into Western Botswana and South Africa's Northern Cape Province. The first phase of the project (2013-2015) focused on an assessment of the STAS which allowed establishing a shared science based understanding of the resource. The activities of the second phase of the project (2016-2018) will consolidate the technical results achieved and the tools developed in the first phase, and will strengthen capacity on groundwater governance at the national and transboundary levels to support the process of establishment of a multi-country cooperation mechanism (MCCM). The establishment of the STAS MCCM would be the first example of a mechanism for the management and governance of a transboundary aquifer in Southern Africa. The joint development of a numerical model is crucial to foster such cooperation as it provides a baseline for the formulation of sound policies for the governance of the STAS. The model is being developed through the application of the FREEWAT platform (within the H2020 FREEWAT project - FREE and open source software tools for WATer resource management; Rossetto et al., 2015), an open source and public domain GIS-integrated modelling environment for the simulation of the hydrological cycle. 
    The FREEWAT project aims to improve water resource management by simplifying the application of water-related regulations through the use of modeling environments and GIS tools for the storage, management, and visualization of large spatial datasets; this is demonstrated by running fourteen case studies on the FREEWAT platform. Among these, the STAS is a particularly representative case study, linking science-based analysis with stakeholder participation in support of sound transboundary management policies. Due to the scarcity of surface water, water-demanding activities in the study area rely solely on groundwater. The first version of the model was developed by adapting an existing model of the Namibian part of the aquifer: so far, the groundwater body is discretized using rectangular cells about 40 km2 in area, as a stack of three aquifers separated by three aquitards with variable thicknesses and heterogeneous hydraulic properties. The model setup is then revised to integrate outcomes from the GGRETA project and extended to the limits of the groundwater body. Boundary conditions and hydrologic stresses (i.e., rainfall infiltration and abstraction for irrigation) were also redefined according to maps and datasets available from the GGRETA project. The involvement of UNESCO-IHP within the FREEWAT Consortium supports the coordination and integration of previous research outcomes (e.g., from the GGRETA project) with the model development, to achieve a full characterization of the STAS's current and forecast dynamics and possibly to highlight any existing knowledge gaps. This will be

  6. Promoting North-South partnership in space data use and applications: Case study - East African countries space programs/projects - new concepts in document management

    NASA Astrophysics Data System (ADS)

    Mlimandago, S.

    This paper presents several simple new concepts in document management for space projects and programs that can be applied in both developing and developed countries. These concepts have been applied in Tanzania, Kenya, and Uganda, where they produced very good results using simple procedures. The overall project based its document management approach from the outset on electronic document sharing and archiving. The main objective of the new concepts was to provide faster and wider availability of the most current space information to all parties, rather than to create a paperless office. Implementing the approach required capturing documents in an appropriate, simple electronic format at the source, establishing new procedures for project-wide information sharing, and deploying a new generation of simple web-based tools. Key success factors were the early adoption of Internet technologies and simple procedures for improved information flow, concepts that can be applied anywhere in both developed and developing countries.

  7. Air quality impact assessment of at-berth ship emissions: Case-study for the project of a new freight port.

    PubMed

    Lonati, Giovanni; Cernuschi, Stefano; Sidi, Shelina

    2010-12-01

    This work assesses the impact on local air quality of atmospheric emissions from port area activities for a new port planned in the Mediterranean Sea. The sources of air pollutants in the harbour area are the auxiliary engines used by ships at berth during loading/offloading operations. A fleet activity-based methodology is first applied to evaluate annual pollutant emissions (NO(X), SO(X), PM, CO and VOC) based on vessel traffic data, ship tonnage, and in-port hotelling time for loading/offloading operations. The 3-dimensional Calpuff transport and dispersion model is then applied for the subsequent assessment of the ground-level spatial distribution of atmospheric pollutants for both long-term and short-term averaging times. Compliance with current air quality standards in the port area is finally evaluated and indications for port operation are provided. Some methodological aspects of the impact assessment procedure, namely the definition of emission scenarios and the set-up of model simulations at the project stage, are specifically addressed, suggesting a pragmatic approach for similar evaluations of other small planned ports.
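    The fleet activity-based methodology amounts to summing, over ship calls, auxiliary engine power × load factor × hotelling hours × emission factor. A toy sketch of that arithmetic follows; the fleet data and emission factors below are invented for illustration and are not the paper's values.

```python
# E_pollutant = sum over calls of P_aux [kW] * load_factor * hours * EF [g/kWh]

EF_NOX = 13.9   # g/kWh, assumed auxiliary-engine emission factor
EF_SOX = 11.0   # g/kWh, assumed (depends on fuel sulphur content)

calls = [
    # (auxiliary power kW, load factor at berth, hotelling hours, calls/year)
    (1500, 0.4, 12, 80),
    (2500, 0.4, 18, 40),
    (800,  0.5, 8, 120),
]

def annual_emissions(ef_g_per_kwh):
    """Annual at-berth emissions, in tonnes, for one pollutant."""
    grams = sum(p * lf * h * n * ef_g_per_kwh for p, lf, h, n in calls)
    return grams / 1e6  # g -> tonnes

print(round(annual_emissions(EF_NOX), 1), "t NOx/yr")
```

    The resulting annual totals are what feed the dispersion model as source terms; short-term peaks require the same arithmetic resolved at an hourly level.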

  8. Thermal Performance Analysis of a Geologic Borehole Repository

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reagin, Lauren

    2016-08-16

    The Brazilian Nuclear Research Institute (IPEN) proposed a design for the disposal of Disused Sealed Radioactive Sources (DSRS) based on the IAEA Borehole Disposal of Sealed Radioactive Sources (BOSS) design that would allow the entirety of Brazil’s inventory of DSRS to be disposed in a single borehole. The proposed IPEN design allows for 170 waste packages (WPs) containing DSRS (such as Co-60 and Cs-137) to be stacked on top of each other inside the borehole. The primary objective of this work was to evaluate the thermal performance of a conservative approach to the IPEN proposal with the equivalent of two WPs and two different inside configurations using Co-60 as the radioactive heat source. The current WP configuration (heterogeneous) for the IPEN proposal has 60% of the WP volume occupied by a nuclear radioactive heat source and the remaining 40% as vacant space. The second configuration (homogeneous) considered for this project was a homogeneous case where 100% of the WP volume was occupied by a nuclear radioactive heat source. The computational models for the thermal analyses of the WP configurations with the Co-60 heat source considered three different cooling mechanisms (conduction, radiation, and convection) and the effect of mesh size on the results of the thermal analysis. The results of the analyses yielded maximum temperatures inside the WPs for both WP configurations and various mesh sizes. The heterogeneous WP considered the cooling mechanisms of conduction, convection, and radiation. The temperature results from the heterogeneous WP analysis suggest that the model is cooled predominantly by conduction, with the effects of radiation and natural convection on cooling being negligible. From the thermal analysis comparing the two WP configurations, the results suggest that either WP configuration could be used for the design. 
The mesh sensitivity results verify the meshes used, and the results obtained from the thermal analyses were close to being independent of mesh size. The results from the computational case and the analytically calculated case for the homogeneous WP in benchmarking were almost identical, which indicates that the computational approach used here was successfully verified by the analytical solution.

  9. Freeing Worldview's development process: Open source everything!

    NASA Astrophysics Data System (ADS)

    Gunnoe, T.

    2016-12-01

    Freeing your code and your project are important steps for creating an inviting environment for collaboration, with the added side effect of keeping a good relationship with your users. NASA Worldview's codebase was released with the open source NOSA (NASA Open Source Agreement) license in 2014, but this is only the first step. We also have to free our ideas, empower our users by involving them in the development process, and open channels that lead to the creation of a community project. There are many highly successful examples of Free and Open Source Software (FOSS) projects of which we can take note: the Linux kernel, Debian, GNOME, etc. These projects owe much of their success to having a passionate mix of developers/users with a great community and a common goal in mind. This presentation will describe the scope of this openness and how Worldview plans to move forward with a more community-inclusive approach.

  10. ILC TARGET WHEEL RIM FRAGMENT/GUARD PLATE IMPACT ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagler, L

    2008-07-17

    A positron source component is needed for the International Linear Collider Project. The leading design concept for this source is a rotating titanium alloy wheel whose spokes rotate through an intense localized magnetic field. The system is composed of an electric motor, flexible motor/drive-shaft coupling, stainless steel drive-shaft, two Plumber's Block tapered roller bearings, a titanium alloy target wheel, and an electromagnet. Surrounding the target wheel and magnet is a steel frame with steel guarding plates intended to contain shrapnel in case of catastrophic wheel failure. Figure 1 is a layout of this system (guard plates not shown for clarity). This report documents the FEA analyses that were performed at LLNL to help determine, on a preliminary basis, the required guard plate thickness for three potential plate steels.

  11. Hydroelectric System Response to Part Load Vortex Rope Excitation

    NASA Astrophysics Data System (ADS)

    Alligné, S.; Nicolet, C.; Bégum, A.; Landry, C.; Gomes, J.; Avellan, F.

    2016-11-01

    The prediction of pressure and output power fluctuation amplitudes on a Francis turbine prototype is a challenge for the hydro-equipment industry, since such predictions are subject to guarantees intended to ensure smooth and reliable operation of the hydro units. The European FP7 research project Hyperbole aims to set up a methodology to transpose the pressure fluctuations induced by the cavitation vortex rope on the reduced scale model to the prototype generating units. A Francis turbine unit of 444 MW with a specific speed value of v = 0.29 is considered as a case study. A SIMSEN model of the power station, including the electrical system, controllers, rotating train, and hydraulic system with transposed draft tube excitation sources, is set up. Based on this model, a frequency analysis of the hydroelectric system is performed to analyse potential interactions between hydraulic excitation sources and electrical components.

  12. Conditions for Productive Learning in Networked Learning Environments: A Case Study from the VO@NET Project

    ERIC Educational Resources Information Center

    Ryberg, Thomas; Koottatep, Suporn; Pengchai, Petch; Dirckinck-Holmfeld, Lone

    2006-01-01

    In this article we bring together experiences from two international research projects: the Kaleidoscope ERT research collaboration and the VO@NET project. We do this by using a shared framework identified for cross-case analyses within the Kaleidoscope ERT to analyse a particular case in the VO@NET project, a training course called "Green…

  13. Enabling Automated Graph-based Search for the Identification and Characterization of Mesoscale Convective Complexes in Satellite Datasets through Integration with the Apache Open Climate Workbench

    NASA Astrophysics Data System (ADS)

    McGibbney, L. J.; Whitehall, K. D.; Mattmann, C. A.; Goodale, C. E.; Joyce, M.; Ramirez, P.; Zimdars, P.

    2014-12-01

    We detail how the Apache Open Climate Workbench (OCW) (recently open sourced by NASA JPL) was adapted to facilitate an ongoing study of Mesoscale Convective Complexes (MCCs) in West Africa and their contributions within the weather-climate continuum as it relates to climate variability. More than 400 MCCs occur annually over various locations on the globe. In West Africa, approximately one-fifth of that total occur during the summer months (June-November) alone and are estimated to contribute more than 50% of the seasonal rainfall amounts. Furthermore, the geospatial distribution of these features generally coincides with locations that are densely populated today or projected to become so. As such, the convective nature of MCCs raises questions regarding their seasonal variability and frequency in current and future climates, amongst others. However, although formal observational criteria for these features were established in 1980, these questions have remained comprehensively unanswered because the methods available for identifying and characterizing MCCs were slow and subjective, owing to data-handling limitations. The main outcome of this work therefore documents how a graph-based search algorithm was implemented on top of the OCW stack with the ultimate goal of improving fully automated end-to-end identification and characterization of MCCs in high resolution observational datasets. Apache OCW as an open source project was demonstrated from inception, and we display how it was again utilized to advance understanding and knowledge within the above domain. The project was born out of refactored code donated by NASA JPL from the Earth science community's Regional Climate Model Evaluation System (RCMES), a joint project between the Joint Institute for Regional Earth System Science and Engineering (JIFRESSE), and a scientific collaboration between the University of California at Los Angeles (UCLA) and NASA JPL. 
The Apache OCW project was then integrated back into the donor code with the aim of powering that project more efficiently. Nevertheless, the object-oriented approach of building Apache OCW as a core set of libraries has scaled the usability of the project beyond climate model evaluation, as demonstrated in the MCC use case detailed here.
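    The abstract does not spell out the graph-based search, but the general idea of such trackers can be sketched: cloud elements detected in successive satellite frames become nodes, elements in consecutive frames whose pixel footprints overlap are linked by edges, and connected paths through the graph are candidate MCC life cycles. The frame data and overlap rule below are invented for illustration.

```python
# Toy frames: element name -> set of pixel indices it covers.
frames = [
    {"a1": {1, 2, 3}, "a2": {10, 11}},   # frame 0
    {"b1": {2, 3, 4}, "b2": {20, 21}},   # frame 1
    {"c1": {3, 4, 5}},                   # frame 2
]

# Link elements in consecutive frames whose footprints overlap.
edges = {}
for t in range(len(frames) - 1):
    for name, pix in frames[t].items():
        edges[name] = [n for n, p in frames[t + 1].items() if pix & p]

def tracks(start):
    """Depth-first enumeration of element paths beginning at `start`."""
    paths = []
    stack = [[start]]
    while stack:
        path = stack.pop()
        nxt = edges.get(path[-1], [])
        if not nxt:
            paths.append(path)   # path ends: no successor in the next frame
        else:
            stack.extend(path + [n] for n in nxt)
    return paths

print(tracks("a1"))  # [['a1', 'b1', 'c1']]
```

    Real implementations then filter the candidate paths against the formal MCC criteria (size, eccentricity, duration) to decide which graph paths count as complexes.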

  14. Project Career: A qualitative examination of five college students with traumatic brain injuries.

    PubMed

    Nardone, Amanda; Sampson, Elaine; Stauffer, Callista; Leopold, Anne; Jacobs, Karen; Hendricks, Deborah J; Elias, Eileen; Chen, Hui; Rumrill, Phillip

    2015-01-01

    Project Career is an interprofessional five-year development project designed to improve the employment success of undergraduate college and university students with traumatic brain injury (TBI). The case study information was collected and synthesized by the project's Technology and Employment Coordinators (TECs) at each of the project's three university sites. The project's evaluation is occurring independently through JBS International, Inc. Five case studies are presented to provide an understanding of student participants' experiences within Project Career. Each case study includes background on the student, engagement with technology, vocational supports, and interactions with his/her respective TEC. A qualitative analysis from the student's case notes is provided within each case study, along with a discussion of the overall qualitative analysis. Across all five students, the theme Positive Outcomes was mentioned most often in the case notes. Of all the different types of challenges, Cognitive Challenges were most often mentioned during meetings with the TECs, followed by Psychological Challenges, Physical Challenges, Other Challenges, and Academic Challenges, respectively. Project Career is providing academic enrichment and career enhancement that may substantially improve the unsatisfactory employment outcomes that presently await students with TBI following graduation.

  15. Self-Tracking: Reflections from the BodyTrack Project.

    PubMed

    Wright, Anne

    2016-07-06

    Based on the author's experiences, the practice of self-tracking can empower individuals to explore and address issues in their lives. This work is inspired by examples of people who have reclaimed their wellness through an iterative process of noticing patterns of ups and downs, trying out new ideas and strategies, and observing the results. In some cases, individuals have realized that certain foods, environmental exposures, or practices have unexpected effects for them, and that adopting custom strategies can greatly improve quality of life, overcoming chronic problems. Importantly, adopting the role of investigator of their own situation appears to be transformative: people who embarked on this path changed their relationship to their health situation even before making discoveries that helped lead to symptom improvement. The author co-founded the BodyTrack project in 2010 with the goal of empowering a broader set of people to embrace this investigator role in their own lives and better address their health and wellness concerns, particularly those with complex environmental or behavioral components. The core of the BodyTrack system is an open source web service called Fluxtream ( https://fluxtream.org ) that allows users to aggregate, visualize, and reflect on data from myriad sources on a common timeline. The project is also working to develop and spread peer coaching practices to help transfer the culture and skills of self-tracking while mentoring individuals in how to self-assess their own situation and guide the process for themselves.

  16. A line-source method for aligning on-board and other pinhole SPECT systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yan, Susu; Bowsher, James; Yin, Fang-Fang

    2013-12-15

    Purpose: In order to achieve functional and molecular imaging as patients are in position for radiation therapy, a robotic multipinhole SPECT system is being developed. Alignment of the SPECT system—to the linear accelerator (LINAC) coordinate frame and to the coordinate frames of other on-board imaging systems such as cone-beam CT (CBCT)—is essential for target localization and image reconstruction. An alignment method that utilizes line sources and one pinhole projection is proposed and investigated to achieve this goal. Potentially, this method could also be applied to the calibration of the other pinhole SPECT systems.Methods: An alignment model consisting of multiple alignment parameters was developed which maps line sources in three-dimensional (3D) space to their two-dimensional (2D) projections on the SPECT detector. In a computer-simulation study, 3D coordinates of line-sources were defined in a reference room coordinate frame, such as the LINAC coordinate frame. Corresponding 2D line-source projections were generated by computer simulation that included SPECT blurring and noise effects. The Radon transform was utilized to detect angles (α) and offsets (ρ) of the line-source projections. Alignment parameters were then estimated by a nonlinear least squares method, based on the α and ρ values and the alignment model. Alignment performance was evaluated as a function of number of line sources, Radon transform accuracy, finite line-source width, intrinsic camera resolution, Poisson noise, and acquisition geometry. Experimental evaluations were performed using a physical line-source phantom and a pinhole-collimated gamma camera attached to a robot.Results: In computer-simulation studies, when there was no error in determining angles (α) and offsets (ρ) of the measured projections, six alignment parameters (three translational and three rotational) were estimated perfectly using three line sources. 
When angles (α) and offsets (ρ) were provided by the Radon transform, estimation accuracy was reduced. The estimation error was associated with rounding errors of Radon transform, finite line-source width, Poisson noise, number of line sources, intrinsic camera resolution, and detector acquisition geometry. Statistically, the estimation accuracy was significantly improved by using four line sources rather than three and by thinner line-source projections (obtained by better intrinsic detector resolution). With five line sources, median errors were 0.2 mm for the detector translations, 0.7 mm for the detector radius of rotation, and less than 0.5° for detector rotation, tilt, and twist. In experimental evaluations, average errors relative to a different, independent registration technique were about 1.8 mm for detector translations, 1.1 mm for the detector radius of rotation (ROR), 0.5° and 0.4° for detector rotation and tilt, respectively, and 1.2° for detector twist.Conclusions: Alignment parameters can be estimated using one pinhole projection of line sources. Alignment errors are largely associated with limited accuracy of the Radon transform in determining angles (α) and offsets (ρ) of the line-source projections. This alignment method may be important for multipinhole SPECT, where relative pinhole alignment may vary during rotation. For pinhole and multipinhole SPECT imaging on-board radiation therapy machines, the method could provide alignment of SPECT coordinates with those of CBCT and the LINAC.
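    The estimation step can be illustrated in a drastically reduced setting. Assume, purely for illustration, that the full six-parameter pinhole model collapses to an in-plane detector shift (tx, ty): each measured line offset ρ then changes by tx·cos α + ty·sin α, and the least-squares fit becomes a 2×2 linear system. All line angles, offsets, and the "true" shift below are invented.

```python
import math

# Known line directions (alpha) and reference offsets (rho) in the detector frame,
# for lines satisfying x*cos(alpha) + y*sin(alpha) = rho.
alphas = [0.0, 0.7, 1.3, 2.0, 2.6]          # radians
rho_ref = [10.0, 4.0, -3.0, 6.0, -8.0]      # mm

true_shift = (2.5, -1.2)  # unknown in-plane detector shift (tx, ty), mm

# A shift (tx, ty) perturbs each measured offset by tx*cos(a) + ty*sin(a).
rho_meas = [r + true_shift[0] * math.cos(a) + true_shift[1] * math.sin(a)
            for a, r in zip(alphas, rho_ref)]

# Least-squares estimate of (tx, ty) from the offset residuals:
# solve the 2x2 normal equations [Scc Scs; Scs Sss] [tx; ty] = [bc; bs].
Scc = sum(math.cos(a) ** 2 for a in alphas)
Sss = sum(math.sin(a) ** 2 for a in alphas)
Scs = sum(math.sin(a) * math.cos(a) for a in alphas)
bc = sum((m - r) * math.cos(a) for a, r, m in zip(alphas, rho_ref, rho_meas))
bs = sum((m - r) * math.sin(a) for a, r, m in zip(alphas, rho_ref, rho_meas))

det = Scc * Sss - Scs ** 2
tx = (bc * Sss - bs * Scs) / det
ty = (bs * Scc - bc * Scs) / det
print(round(tx, 3), round(ty, 3))  # recovers the shift in this noise-free toy
```

    The paper's actual problem is nonlinear because the six pose parameters enter through the pinhole projection model, which is why it uses iterative nonlinear least squares rather than a closed-form solve; the residuals being fit, however, are the same (α, ρ) line parameters shown here.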


  18. A line-source method for aligning on-board and other pinhole SPECT systems.

    PubMed

    Yan, Susu; Bowsher, James; Yin, Fang-Fang

    2013-12-01

    In order to achieve functional and molecular imaging as patients are in position for radiation therapy, a robotic multipinhole SPECT system is being developed. Alignment of the SPECT system, both to the linear accelerator (LINAC) coordinate frame and to the coordinate frames of other on-board imaging systems such as cone-beam CT (CBCT), is essential for target localization and image reconstruction. An alignment method that utilizes line sources and one pinhole projection is proposed and investigated to achieve this goal. Potentially, this method could also be applied to the calibration of other pinhole SPECT systems. An alignment model consisting of multiple alignment parameters was developed which maps line sources in three-dimensional (3D) space to their two-dimensional (2D) projections on the SPECT detector. In a computer-simulation study, 3D coordinates of line sources were defined in a reference room coordinate frame, such as the LINAC coordinate frame. Corresponding 2D line-source projections were generated by computer simulation that included SPECT blurring and noise effects. The Radon transform was utilized to detect angles (α) and offsets (ρ) of the line-source projections. Alignment parameters were then estimated by a nonlinear least squares method, based on the α and ρ values and the alignment model. Alignment performance was evaluated as a function of the number of line sources, Radon transform accuracy, finite line-source width, intrinsic camera resolution, Poisson noise, and acquisition geometry. Experimental evaluations were performed using a physical line-source phantom and a pinhole-collimated gamma camera attached to a robot. In computer-simulation studies, when there was no error in determining angles (α) and offsets (ρ) of the measured projections, six alignment parameters (three translational and three rotational) were estimated perfectly using three line sources.
When angles (α) and offsets (ρ) were provided by the Radon transform, estimation accuracy was reduced. The estimation error was associated with rounding errors of the Radon transform, finite line-source width, Poisson noise, the number of line sources, intrinsic camera resolution, and detector acquisition geometry. Statistically, the estimation accuracy was significantly improved by using four line sources rather than three and by thinner line-source projections (obtained with better intrinsic detector resolution). With five line sources, median errors were 0.2 mm for the detector translations, 0.7 mm for the detector radius of rotation, and less than 0.5° for detector rotation, tilt, and twist. In experimental evaluations, average errors relative to a different, independent registration technique were about 1.8 mm for detector translations, 1.1 mm for the detector radius of rotation (ROR), 0.5° and 0.4° for detector rotation and tilt, respectively, and 1.2° for detector twist. Alignment parameters can be estimated using one pinhole projection of line sources. Alignment errors are largely associated with the limited accuracy of the Radon transform in determining angles (α) and offsets (ρ) of the line-source projections. This alignment method may be important for multipinhole SPECT, where relative pinhole alignment may vary during rotation. For pinhole and multipinhole SPECT imaging on-board radiation therapy machines, the method could provide alignment of SPECT coordinates with those of CBCT and the LINAC.
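The estimation step, fitting alignment parameters to the measured line-projection angles (α) and offsets (ρ), can be illustrated with a deliberately reduced toy model. The sketch below estimates only an in-plane detector translation by linear least squares; the paper's actual model has six parameters (three translational, three rotational) and is solved by nonlinear least squares. All numbers are illustrative:

```python
import numpy as np

def estimate_detector_shift(alphas, rho_nominal, rho_measured):
    """Least-squares estimate of an in-plane detector translation (tx, ty).

    A detector line satisfies x*cos(a) + y*sin(a) = rho. Translating the
    detector by (tx, ty) shifts each observed offset to
    rho + tx*cos(a) + ty*sin(a), which is linear in (tx, ty).
    """
    A = np.column_stack([np.cos(alphas), np.sin(alphas)])   # design matrix
    b = np.asarray(rho_measured) - np.asarray(rho_nominal)  # offset residuals
    shift, *_ = np.linalg.lstsq(A, b, rcond=None)
    return shift

# Three simulated line-source projections seen by a detector shifted by (1.5, -0.8) mm
alphas = np.array([0.2, 1.1, 2.3])   # Radon-transform angles (rad)
rho0 = np.array([5.0, -3.0, 7.5])    # nominal offsets (mm)
true_shift = np.array([1.5, -0.8])
rho_meas = rho0 + np.cos(alphas) * true_shift[0] + np.sin(alphas) * true_shift[1]
est = estimate_detector_shift(alphas, rho0, rho_meas)
```

With noisy (α, ρ) measurements, using more lines than unknowns improves the fit, consistent with the paper's finding that four or five line sources outperform three.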

  19. Scientist-Practitioner Engagement to Inform Regional Hydroclimate Model Evaluation

    NASA Astrophysics Data System (ADS)

    Jones, A. D.; Jagannathan, K. A.; Ullrich, P. A.

    2017-12-01

    Water managers face significant challenges in planning for the coming decades as previously stationary aspects of the regional hydroclimate shift in response to global climate change. Providing scientific insights that enable appropriate use of regional hydroclimate projections for planning is a non-trivial problem. The system of data, models, and methods used to produce regional hydroclimate projections is subject to multiple interacting uncertainties and biases, including uncertainties that arise from general circulation models, re-analysis data products, regional climate models, hydrologic models, and statistical downscaling methods. Moreover, many components of this system were not designed with the information needs of water managers in mind. To address this problem and provide actionable insights into the sources of uncertainty present in regional hydroclimate data products, Project Hyperion has undertaken a stakeholder engagement process in four case study water basins across the US. Teams of water managers and scientists are interacting in a structured manner to identify decision-relevant metrics of model performance. These metrics are in turn being used to drive scientific investigations to uncover the sources of uncertainty in these quantities. Thus far, we have found that identification of climate phenomena of interest to stakeholders is relatively easy, but translating these into specific quantifiable metrics and prioritizing metrics is more challenging. Iterative feedback among scientists and stakeholders has proven critical in resolving these challenges, as have the roles played by boundary spanners who understand and can speak to the perspectives of multiple professional communities. Here we describe the structured format of our engagement process and the lessons learned so far, as we aim to improve the decision-relevance of hydroclimate projections through a collaborative process.

  20. Selection of radio sources for Venus balloon-Pathfinder Delta-DOR navigation at 1.7 GHz

    NASA Technical Reports Server (NTRS)

    Liewer, K. M.

    1986-01-01

    In order to increase the success rate of the Delta-DOR (Delta-Differential One-way Range) VLBI navigational support for the French-Soviet Venus Balloon and Halley Pathfinder projects, forty-four extragalactic radio sources were observed in advance of these projects to determine which were suitable for use as reference sources. Of these forty-four radio sources taken from the existing JPL radio source catalogue, thirty-six were determined to be of sufficient strength for use in Delta-DOR VLBI navigation.

  1. Particle-in-cell code library for numerical simulation of the ECR source plasma

    NASA Astrophysics Data System (ADS)

    Shirkov, G.; Alexandrov, V.; Preisendorf, V.; Shevtsov, V.; Filippov, A.; Komissarov, R.; Mironov, V.; Shirkova, E.; Strekalovsky, O.; Tokareva, N.; Tuzikov, A.; Vatulin, V.; Vasina, E.; Fomin, V.; Anisimov, A.; Veselov, R.; Golubev, A.; Grushin, S.; Povyshev, V.; Sadovoi, A.; Donskoi, E.; Nakagawa, T.; Yano, Y.

    2003-05-01

    The project "Numerical simulation and optimization of ion accumulation and production in multicharged ion sources" is funded by the International Science and Technology Center (ISTC). A summary of recent project development and the first version of a computer code library for simulation of electron-cyclotron resonance (ECR) source plasmas based on the particle-in-cell method are presented.

  2. 6. Photographic copy of photograph (Source: Salt River Project Archives, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    6. Photographic copy of photograph (Source: Salt River Project Archives, Tempe, Lubken collection, #R-295) Transformer house under construction. View looking north. October 5, 1908. - Theodore Roosevelt Dam, Transformer House, Salt River, Tortilla Flat, Maricopa County, AZ

  3. 5. Photographic copy of photograph (Source: Salt River Project Archives, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    5. Photographic copy of photograph (Source: Salt River Project Archives, Tempe, Lubken collection, #R-273) Transformer house under construction. View looking north. July 1, 1908. - Theodore Roosevelt Dam, Transformer House, Salt River, Tortilla Flat, Maricopa County, AZ

  4. 8. Photographic copy of photograph (Source: Salt River Project Archives, ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    8. Photographic copy of photograph (Source: Salt River Project Archives, Tempe, Box 8040, File 29) View of transformer house looking north. No date. CA. 1920. - Theodore Roosevelt Dam, Transformer House, Salt River, Tortilla Flat, Maricopa County, AZ

  5. Near-source air quality assessment: challenges and collaboration

    EPA Science Inventory

    This presentation is to give a general overview of near-source air pollution concerns and recent EPA projects (near-road, near-rail, near-port), as well as explaining how these projects were implemented through collaboration internally and externally.

  6. Mid- and long-term debris environment projections using the EVOLVE and CHAIN models

    NASA Astrophysics Data System (ADS)

    Eichler, Peter; Reynolds, Robert C.

    1995-06-01

    Results of debris environment projections are of great importance for the evaluation of the necessity and effectiveness of debris mitigation measures. EVOLVE and CHAIN are two models for debris environment projections that have been developed independently using different conceptual approaches. A comparison of results from these two models therefore provides a means of validating the debris environment projections they produce. EVOLVE is a model that requires mission model projections to describe future space operations; these projections include launch date, mission orbit altitude and inclination, mission duration, vehicle size and mass, and classification as an object capable of experiencing breakup from on-board stored energy. EVOLVE describes the orbital debris environment by the orbital elements of the objects in the environment. CHAIN is an analytic model that bins the debris environment in size and altitude rather than following the orbit evolution of individual debris fragments. The altitude/size bins are coupled by the initial spreading of fragments by collisions and the subsequent orbital decay behavior. A set of test cases covering a variety of space usage scenarios has been defined for the two models. In this paper, a comparison of the results is presented and sources of disagreement are identified and discussed. One major finding is that despite differences in the results of the two models, the basic tendencies of the environment projections are independent of modeled uncertainties, leading to the demand for debris mitigation measures: explosion suppression and de-orbiting of rocket bodies and payloads after mission completion.
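CHAIN's bookkeeping, counts binned by altitude (and size) and coupled by orbital decay into lower shells, can be sketched with a toy one-dimensional version. The shell counts and per-step decay fractions below are invented for illustration, not values from either model:

```python
def decay_step(bins, decay_frac):
    """One step of a toy altitude-binned debris model (CHAIN-style bookkeeping).

    bins: object counts per altitude shell, lowest shell first.
    decay_frac: fraction of each shell that decays into the shell below per step.
    Objects decaying out of the lowest shell reenter and leave the environment.
    """
    new = list(bins)
    for i, (n, f) in enumerate(zip(bins, decay_frac)):
        moved = n * f
        new[i] -= moved
        if i > 0:
            new[i - 1] += moved   # decay feeds the next shell down
        # i == 0: atmospheric reentry removes these objects entirely
    return new

pop = [100.0, 200.0, 300.0]               # three altitude shells, low to high
pop1 = decay_step(pop, [0.5, 0.1, 0.01])  # 50 objects reenter from the lowest shell
```

A full model would add source terms (launches, breakups, collisions) to the same bins; the comparison in the paper concerns how such terms and the decay coupling are represented.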

  7. Mapping the literature of case management nursing.

    PubMed

    White, Pamela; Hall, Marilyn E

    2006-04-01

    Nursing case management provides a continuum of health care services for defined groups of patients. Its literature is multidisciplinary, emphasizing clinical specialties, case management methodology, and the health care system. This study is part of a project to map the literature of nursing, sponsored by the Nursing and Allied Health Resources Section of the Medical Library Association. The study identifies core journals cited in case management literature and indexing services that access those journals. Three source journals were identified based on established criteria, and cited references from each article published from 1997 to 1999 were analyzed. Nearly two-thirds of the cited references were from journals; others were from books, monographs, reports, government documents, and the Internet. Cited journal references were ranked in descending order, and Bradford's Law of Scattering was applied. The many journals constituting the top two zones reflect the diversity of this field. Zone 1 included journals from nursing administration, case management, general medicine, medical specialties, and social work. Two databases, PubMed/MEDLINE and OCLC ArticleFirst, provided the best indexing coverage. Collections that support case management require a relatively small group of core journals. Students and health care professionals will need to search across disciplines to identify appropriate literature.
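Bradford's Law of Scattering, as applied here, partitions ranked journal citation counts into zones that each contain roughly one third of all citations; the number of journals per zone then grows sharply from zone to zone. A minimal sketch with made-up citation counts:

```python
def bradford_zones(counts, n_zones=3):
    """Journals per Bradford zone: sort citation counts in descending order and
    cut into zones each holding roughly 1/n_zones of the total citations."""
    counts = sorted(counts, reverse=True)
    target = sum(counts) / n_zones
    sizes, current, acc = [], 0, 0.0
    for c in counts:
        current += 1
        acc += c
        if acc >= target and len(sizes) < n_zones - 1:
            sizes.append(current)
            current, acc = 0, 0.0
    sizes.append(current)
    return sizes

# hypothetical citation counts for 12 journals
zone_sizes = bradford_zones([30, 30, 20, 10, 10, 10, 5, 5, 5, 5, 5, 5])
```

A few core journals carry the first third of the citations, while the later zones need progressively more journals, which is the pattern the study reports for case management literature.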

  8. The AlpArray-CASE project: temporary broadband seismic network deployment and characterization

    NASA Astrophysics Data System (ADS)

    Dasović, Iva; Molinari, Irene; Stipčević, Josip; Šipka, Vesna; Salimbeni, Simone; Jarić, Dejan; Prevolnik, Snježan; Kissling, Eduard; Clinton, John; Giardini, Domenico

    2017-04-01

    While the northern part of the Adriatic microplate will be accurately imaged within the AlpArray project, its central and southern parts deserve detailed studies to obtain a complete picture of its structure and evolution. The Adriatic microplate forms the upper plate in the Western and Central Alps, whereas it forms the lower plate in the Apennines and the Dinarides. However, the tectonics of the Adriatic microplate is not well constrained and remains controversial, especially with regard to its contact with the Dinarides. The primary goal of the Central Adriatic Seismic Experiment (CASE) is to provide high-quality seismological data and to shed light on the seismicity and 3D lithospheric structure of the central Adriatic microplate and its boundaries. The CASE project is an international AlpArray Complementary Experiment carried out by four institutions: the Department of Earth Sciences and Swiss Seismological Service of ETH Zürich (CH), the Department of Geophysics and Croatian Seismological Service of the Faculty of Science at the University of Zagreb (HR), the Republic Hydrometeorological Service of Republic of Srpska (BIH), and the Istituto Nazionale di Geofisica e Vulcanologia (I). It establishes a temporary seismic network, expected to be operational for at least one year, composed of existing permanent and temporary seismic stations operated by the institutions involved and newly deployed temporary seismic stations, installed in November and December 2016, provided by ETH Zürich and INGV: five in Croatia, four in Bosnia and Herzegovina, and two in Italy. In this work, we present station sites and settings and discuss their characteristics in terms of site effects and the noise level of each station. In particular, we analyse the power spectral density estimates in order to investigate major sources of noise and background noise.

  9. Prevalence of Primary Sjögren's Syndrome in a US Population-Based Cohort.

    PubMed

    Maciel, Gabriel; Crowson, Cynthia S; Matteson, Eric L; Cornec, Divi

    2017-10-01

    To report the point prevalence of primary Sjögren's syndrome (SS) in the first US population-based study. Cases of all potential primary SS patients living in Olmsted County, Minnesota, on January 1, 2015, were retrieved using Rochester Epidemiology Project resources, and ascertained by manual medical records review. Primary SS cases were defined according to physician diagnosis. The use of diagnostic tests was assessed and the performance of classification criteria was evaluated. The number of prevalent cases in 2015 was also projected based on 1976-2005 incidence data from the same source population. A total of 106 patients with primary SS were included in the study: 86% were female, with a mean ± SD age of 64.6 ± 15.2 years, and a mean ± SD disease duration of 10.5 ± 8.4 years. A majority were anti-SSA positive (75%) and/or anti-SSB positive (58%), but only 22% met American-European Consensus Group or American College of Rheumatology criteria, because the other tests required for disease classification (ocular dryness objective assessment, salivary gland functional or morphologic tests, or salivary gland biopsy) were rarely performed in clinical practice. According to the physician diagnosis, the age- and sex-adjusted prevalence of primary SS was 10.3 per 10,000 inhabitants, but according to classification criteria, this prevalence was only 2.2 per 10,000. The analysis based on previous incidence data projected a similar 2015 prevalence rate of 11.0 per 10,000. The prevalence of primary SS in this geographically well-defined population was estimated to be between 2 and 10 per 10,000 inhabitants. Physicians rarely used tests included in the classification criteria to diagnose the disease in this community setting. © 2016, American College of Rheumatology.
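The prevalence figures quoted are point prevalences per 10,000 inhabitants. A minimal sketch of the arithmetic (the county population denominator below is a hypothetical number, not taken from the abstract, and the calculation is crude rather than age- and sex-adjusted):

```python
def prevalence_per_10k(cases, population):
    """Crude point prevalence per 10,000 inhabitants."""
    return 10000.0 * cases / population

# 106 physician-diagnosed cases over a hypothetical county population
p = prevalence_per_10k(106, 103000)   # roughly 10 per 10,000
```

The gap between the physician-diagnosis rate (10.3) and the classification-criteria rate (2.2) comes from the numerator, i.e. how cases are defined, not from this arithmetic.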

  10. Comparative Analysis of the Long-term Trends of the Surface Ozone Concentration at Elevated Sites in the Alps and in Caucasus Region

    NASA Astrophysics Data System (ADS)

    Tarasova, O. A.; Staehelin, J.; Prevot, A. S.; Senik, I. A.; Sosonkin, M. G.; Cui, J.

    2007-12-01

    Analysis of the long-term surface ozone records of two mountain sites, namely the Kislovodsk High Mountain Station (KHMS) in the Caucasus, Russia (43.7°N, 42.7°E, 2070 m asl) and Jungfraujoch (JFJ) in Switzerland (46.5°N, 7.9°E, 3580 m asl), will be presented. A strong increase in ozone concentration (up to +0.46 ± 0.11 ppb/year) was found at JFJ, while ozone significantly decreased at KHMS (-0.65 ± 0.09 ppb/year) during 1990-2005. We will compare trend values for earlier years (1990-2001) and for later ones (1993-2005). Among the possible reasons for the difference in trends, the impact of atmospheric transport is studied. Both vertical and horizontal components are considered in connection with ozone concentration trends. Transport analysis is based on 3D trajectories using LAGRANTO. There was no substantial difference in trends detected for different PV-levels or PBL-filtered cases, while the main difference was found in the source areas of the air masses at the two locations and inside different advection sectors at each particular site. Trends will be compared (for the two receptor sites and two periods) for filtered subsets of upper tropospheric/stratospheric cases (based on PV and trajectory altitude), cases impacted by the planetary boundary layer (based on PBL height), and in different horizontal advection clusters. The work is financially supported by the Swiss National Science Foundation (JRP IB7320-110831), the European Commission (Marie Curie IIF project N 039905 - FP6-2005-Mobility-7), and the Russian Foundation for Basic Research (projects 06-05-64427 and 06-05-65308), and contributes to the ACCENT T&TP project.
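A trend quoted in ppb/year is the slope of an ordinary least-squares fit of ozone concentration against time. A minimal sketch on a synthetic, noise-free series constructed to decline at the KHMS rate (the concentration values themselves are invented):

```python
import numpy as np

def ozone_trend(years, ozone_ppb):
    """OLS trend in ppb/year: slope of a degree-1 polynomial fit."""
    slope, _intercept = np.polyfit(years, ozone_ppb, 1)
    return slope

years = np.arange(1990, 2006)
series = 45.0 - 0.65 * (years - 1990)  # synthetic series declining at -0.65 ppb/year
trend = ozone_trend(years, series)
```

On real records, the fit would be applied to annual means after the PV/PBL filtering described, and the quoted ± values are the standard errors of the slope.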

  11. Near-IR trigonometric parallaxes of nearby stars in the Galactic plane using the VVV survey

    NASA Astrophysics Data System (ADS)

    Beamín, J. C.; Mendez, R. A.; Smart, R. L.; Jara, R.; Kurtev, R.; Gromadzki, M.; Villanueva, V.; Minniti, D.; Smith, L. C.; Lucas, P. W.

    2017-07-01

    We use multi-epoch KS-band observations covering a ~5 year baseline to obtain milliarcsecond and sub-milliarcsecond precision astrometry for a sample of eighteen previously known high proper motion sources, including precise parallaxes for these sources for the first time. In this pioneering study we show the capability of the VVV project to measure high-precision trigonometric parallaxes for very low mass stars (VLMS) out to distances of ~400 pc, reaching farther than most other ground-based surveys or space missions for these types of stars. Two stars in our sample are low-mass companions to sources in the TGAS catalog; the VVV astrometry of the fainter source is consistent within 1σ with the astrometry for the primary source in the TGAS catalog, confirming the excellent astrometric quality of the VVV data even near saturated sources, as in these cases. Additionally, we used spectral energy distributions to search for evidence of unresolved binary systems and cool subdwarfs. We detected five systems that are most likely VLMS belonging to the Galactic halo based on their tangential velocities, and four objects within 60 pc that are likely members of the thick disk. A more comprehensive study of high proper motion sources and parallaxes of VLMS and brown dwarfs with the VVV is ongoing, including thousands of newly discovered objects (Kurtev et al. 2016).
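The link between parallax precision and the quoted ~400 pc reach is the standard inversion d[pc] = 1/p[arcsec]: at 400 pc the parallax is only 2.5 mas, so sub-milliarcsecond astrometry is needed for a meaningful measurement. A one-function sketch:

```python
def parallax_to_distance_pc(parallax_mas):
    """Trigonometric parallax to distance: d [pc] = 1000 / parallax [mas]."""
    return 1000.0 / parallax_mas

d = parallax_to_distance_pc(2.5)   # a 2.5 mas parallax corresponds to 400 pc
```

Note that this simple inversion is only reliable when the fractional parallax error is small, which is why the survey's sub-mas precision sets its distance limit.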

  12. Global crowd data to understand risk taking behavior: Understanding the costs of crowd sourcing

    NASA Astrophysics Data System (ADS)

    Hendrikx, J.; Johnson, J.

    2016-12-01

    Crowd sourcing is an increasingly common approach to collecting data from a large number of places, people, or both, for a given phenomenon or observation. It is often thought of as a very cost-effective approach to collecting data from large spatial domains, from difficult-to-reach areas, or for spatially discrete observations. While crowd-sourced data can provide a wealth of data beyond that which most research teams can collect themselves, there are many associated, and sometimes unexpected, costs to this approach. We present a case study of a crowd-sourced data collection campaign to collect GPS tracks of backcountry recreationalists in avalanche terrain. We ask the volunteers to track their outings using the GPS on their smartphone with a free application and, on the completion of their trip, email us their track. On receipt of this track we automatically reply with a link to a decision-making survey. In this way we collect data on both the physical attributes of their trip and the social, psychological, and demographic data about the person. While this approach has been very successful, it has come at a high cost time-wise. Much like the role of an online course instructor, researcher presence is essential. Replying to emails, updating webpages, posting on social media, and connecting with your volunteer data collectors can become a full-time job, and that is before you start the data analysis. We encourage future researchers to plan ahead for this when starting a crowd-sourcing project involving the general public, and to seek advice and training in social media, website development, and communication techniques like semi-automated email.

  13. A case study of autonomy and motivation in a student-led game development project

    NASA Astrophysics Data System (ADS)

    Prigmore, M.; Taylor, R.; De Luca, D.

    2016-07-01

    This paper presents the findings of an exploratory case study into the relationship between student autonomy and motivation in project-based learning, using Self-Determination Theory (SDT) to frame the investigation. The case study explores how different forms of motivation affect the students' response to challenges and their intention to complete the project. Earlier studies have made little explicit use of theoretical perspectives on student autonomy and motivation, a weakness this study attempts to address. As an exploratory case study seeking to evaluate the suitability of a particular theoretical framework, we chose a small case: three students on a one-term computer games development project. Given the small scale, the approach is necessarily qualitative, drawing on project documentation and one-to-one interviews with the students. Our conclusion is that the concepts of SDT provide a useful framework for analysing students' motivations to undertake project work, and its predictions can offer useful guidance on how to initiate and supervise such projects.

  14. The Magsat precision vector magnetometer

    NASA Technical Reports Server (NTRS)

    Acuna, M. H.

    1980-01-01

    This paper examines the Magsat precision vector magnetometer, which is designed to measure projections of the ambient field in three orthogonal directions. The system contains a highly stable and linear triaxial fluxgate magnetometer with a dynamic range of ±2000 nT (1 nT = 10⁻⁹ weber per square meter). The magnetometer electronics, analog-to-digital converter, and digitally controlled current sources are implemented with redundant designs to avoid a loss of data in case of failures. Measurements are carried out with an accuracy of ±1 part in 64,000 in magnitude and 5 arcsec in orientation (1 arcsec = 0.00028 deg).
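A triaxial design measures three orthogonal projections of the field, from which the magnitude follows by the Euclidean norm; a ±2000 nT range resolved to 1 part in 64,000 implies a quantization step of 4000/64,000 = 0.0625 nT. A sketch of that arithmetic (the example field vector is illustrative, not flight data):

```python
import math

def field_magnitude(bx, by, bz):
    """|B| in nT from three orthogonal fluxgate projections."""
    return math.sqrt(bx * bx + by * by + bz * bz)

# full span of 4000 nT resolved into 64,000 counts
COUNT_NT = 4000.0 / 64000.0   # quantization step per count, in nT

b = field_magnitude(300.0, 400.0, 0.0)   # example vector with |B| = 500 nT
```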

  15. Internet and social network recruitment: two case studies.

    PubMed

    Johnson, Kathy A; Peace, Jane

    2012-01-01

    The recruitment of study participants is a significant research challenge. The Internet, with its ability to reach large numbers of people in networks connected by email, Facebook and other social networking mechanisms, appears to offer new avenues for recruitment. This paper reports recruitment experiences from two research projects that engaged the Internet and social networks in different ways for study recruitment. Drawing from the non-Internet recruitment literature, we speculate that the relationship with the source of the research and the purpose of the engaged social network should be a consideration in Internet or social network recruitment strategies.

  16. Evaluation of CNT Energy Savers Retrofit Packages Implemented in Multifamily Buildings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farley, Jenne; Ruch, Russell

    This evaluation explored the feasibility of designing prescriptive retrofit measure packages for typical Chicago region multifamily buildings in order to achieve 25%-30% source energy savings through the study of three case studies. There is an urgent need to scale up energy efficiency retrofitting of Chicago's multifamily buildings in order to address rising energy costs and a rapidly depleting rental stock. Aimed at retrofit program administrators and building science professionals, this research project investigates the possibility of using prescriptive retrofit packages as a time- and resource-effective approach to the process of retrofitting multifamily buildings.

  18. National Register of Research Projects, 1986/87. Part 2A: Natural sciences. Physical, engineering and related sciences (modified projects)

    NASA Astrophysics Data System (ADS)

    1988-08-01

    This Register is intended to serve as a source of information on research being conducted in all fields (both natural and human sciences) in the Republic of South Africa. New and current research projects that were commenced or modified during 1986 and 1987, for which information had been received by the compilers by January 1988, are included, with the exception of confidential projects. Project titles and keywords are presented in the language as supplied, and the classifications are based on those provided by the primary sources.

  19. Evaluation of DICOM viewer software for workflow integration in clinical trials

    NASA Astrophysics Data System (ADS)

    Haak, Daniel; Page, Charles E.; Kabino, Klaus; Deserno, Thomas M.

    2015-03-01

    The digital imaging and communications in medicine (DICOM) protocol is nowadays the leading standard for the capture, exchange, and storage of image data in medical applications. A broad range of commercial, free, and open source software tools supporting a variety of DICOM functionality exists. However, unlike in hospital patient care, DICOM has not yet arrived in electronic data capture systems (EDCS) for clinical trials. Due to missing integration, even just the visualization of patient image data in electronic case report forms (eCRFs) is impossible. Four increasing levels of integration of DICOM components into EDCS are conceivable, each raising functionality but also demands on interfaces. Hence, in this paper, a comprehensive evaluation of 27 DICOM viewer software projects is performed, investigating viewing functionality as well as interfaces for integration. Concerning general, integration, and viewing requirements, the survey involves the criteria (i) license, (ii) support, (iii) platform, (iv) interfaces, (v) two-dimensional (2D) and (vi) three-dimensional (3D) image viewing functionality. Optimal viewers are suggested for applications in clinical trials for 3D imaging, hospital communication, and workflow. Focusing on open source solutions, the viewers ImageJ and MicroView are superior for 3D visualization, whereas GingkoCADx is advantageous for hospital integration. Concerning workflow optimization in multi-centered clinical trials, we suggest the open source viewer Weasis. Covering most use cases, an EDCS and PACS interconnection with Weasis is suggested.
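One small but concrete integration detail an EDCS must handle when accepting image uploads is recognizing DICOM Part 10 files: per the standard, they begin with a 128-byte preamble followed by the magic bytes "DICM". A minimal check (the synthetic byte string is fabricated for illustration):

```python
def is_dicom_part10(data: bytes) -> bool:
    """DICOM Part 10 files start with a 128-byte preamble, then b'DICM'."""
    return len(data) >= 132 and data[128:132] == b"DICM"

fake_file = bytes(128) + b"DICM" + b"\x02\x00\x00\x00"  # synthetic header bytes
```

A real integration would then hand the file to a full DICOM parser or viewer; this check only gates what gets that far.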

  20. Augmenting Traditional Static Analysis With Commonly Available Metadata

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cook, Devin

    Developers and security analysts have long used static analysis to analyze programs for defects and vulnerabilities, with some success. Generally a static analysis tool is run on the source code for a given program, flagging areas of code that need to be further inspected by a human analyst. These areas may be obvious bugs like potential buffer overflows, information leakage flaws, or the use of uninitialized variables. These tools tend to work fairly well; every year they find many important bugs. These tools are all the more impressive considering that they only examine the source code, which may be very complex. Now consider the amount of data available that these tools do not analyze. There are many pieces of information that would prove invaluable for finding bugs in code, such as a history of bug reports, a history of all changes to the code, information about committers, etc. By leveraging all this additional data, it is possible to find more bugs with less user interaction, as well as track useful metrics such as the number and type of defects injected by each committer. This dissertation provides a method for leveraging development metadata to find bugs that would otherwise be difficult to find using standard static analysis tools. We showcase two case studies that demonstrate the ability to find 0-day vulnerabilities in large and small software projects by finding new vulnerabilities in the cpython and Roundup open source projects.
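The core idea, weighting static-analysis output with repository metadata, can be sketched as a toy triage prioritizer: files with a history of bug-fix commits push their warnings up the queue. The weighting scheme and file names below are invented for illustration, not the dissertation's actual method:

```python
def rank_files(static_warnings, bugfix_commits):
    """Rank files for review: warning count scaled by (1 + historical bug fixes).

    static_warnings: {filename: number of static-analysis findings}
    bugfix_commits:  {filename: number of past bug-fix commits touching it}
    """
    scores = {f: w * (1 + bugfix_commits.get(f, 0))
              for f, w in static_warnings.items()}
    return sorted(scores, key=scores.get, reverse=True)

# util.c has more raw warnings, but parser.c's bug-fix history ranks it first
ranked = rank_files({"parser.c": 2, "util.c": 5},
                    {"parser.c": 9, "util.c": 0})
```

The bug-fix counts would in practice come from mining the version-control log and issue tracker, exactly the metadata the abstract argues is usually ignored.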

  1. Remote Monitoring of Soil Water Content, Temperature, and Heat Flow Using Low-Cost Cellular (3G) IoT Technology

    NASA Astrophysics Data System (ADS)

    Ham, J. M.

    2016-12-01

    New microprocessor boards, open-source sensors, and cloud infrastructure developed for the Internet of Things (IoT) can be used to create low-cost monitoring systems for environmental research. This project describes two applications in soil science and hydrology: 1) remote monitoring of the soil temperature regime near oil and gas operations to detect the thermal signature associated with the natural source zone degradation of hydrocarbon contaminants in the vadose zone, and 2) remote monitoring of soil water content near the surface as part of a global citizen science network. In both cases, prototype data collection systems were built around the cellular (2G/3G) "Electron" microcontroller (www.particle.io). This device allows connectivity to the cloud using a low-cost global SIM and data plan. The systems have cellular connectivity in over 100 countries and data can be logged to the cloud for storage. Users can view data in real time over any internet connection or via their smartphone. For both projects, data logging, storage, and visualization were done using IoT services like ThingSpeak (thingspeak.com). The soil thermal monitoring system was tested on experimental plots in Colorado, USA, to evaluate the accuracy and reliability of different temperature sensors and 3D-printed housings. The soil water experiment included comparing open-source capacitance-based sensors to commercial versions. Results demonstrate the power of leveraging IoT technology for field research.
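Logging a reading to ThingSpeak amounts to a single HTTP request against its channel-update endpoint, carrying a write API key and up to eight numbered fields. The sketch below only constructs the request URL (no network call); the key is a placeholder and the field assignments are illustrative:

```python
from urllib.parse import urlencode

def thingspeak_update_url(api_key, **fields):
    """Build a ThingSpeak channel-update URL (fields map to field1..field8)."""
    query = urlencode({"api_key": api_key, **fields})
    return "https://api.thingspeak.com/update?" + query

# e.g. field1 = soil temperature (deg C), field2 = volumetric water content
url = thingspeak_update_url("YOUR_WRITE_KEY", field1=23.4, field2=0.18)
```

On a deployed logger, the microcontroller would issue this request over its cellular connection at each sampling interval.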

  2. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 36: Technical uncertainty as a correlate of information use by US industry-affiliated aerospace engineers and scientists

    NASA Technical Reports Server (NTRS)

    Pinelli, Thomas E.; Glassman, Nanci A.; Affelder, Linda O.; Hecht, Laura M.; Kennedy, John M.; Barclay, Rebecca O.

    1994-01-01

    This paper reports the results of an exploratory study that investigated the influence of technical uncertainty on the use of information and information sources by U.S. industry-affiliated aerospace engineers and scientists in completing or solving a project, task, or problem. Data were collected through a self-administered questionnaire. Survey participants were U.S. aerospace engineers and scientists whose names appeared on the Society of Automotive Engineers (SAE) mailing list. The results support the findings of previous research and the following study assumptions. Information and information-source use differ for projects, problems, and tasks with high and low technical uncertainty. As technical uncertainty increases, information-source use changes from internal to external and from informal to formal sources. As technical uncertainty increases, so too does the use of federally funded aerospace research and development (R&D). The use of formal information sources to learn about federally funded aerospace R&D differs for projects, problems, and tasks with high and low technical uncertainty.

  3. Evaluating the completeness of the national ALS registry, United States.

    PubMed

    Kaye, Wendy E; Wagner, Laurie; Wu, Ruoming; Mehta, Paul

    2018-02-01

    Our objective was to evaluate the completeness of the United States National ALS Registry (Registry). We compared persons with ALS who were passively identified by the Registry with those actively identified in the State and Metropolitan Area ALS Surveillance project. Cases in the two projects were matched using a combination of identifiers, including partial Social Security number, name, date of birth, and sex. The distributions of cases from the two projects that matched and did not match were compared, and chi-square tests were conducted to determine statistical significance. There were 5883 ALS cases identified by the surveillance project. Of these, 1116 died before the Registry started, leaving 4767 cases. We matched 2720 cases from the surveillance project to those in the Registry. The cases identified by the surveillance project that did not match cases in the Registry were more likely to be non-white, Hispanic, less than 65 years of age, and from western states. The methods used by the Registry to identify ALS cases, i.e. national administrative data and self-registration, worked well but missed cases. These findings suggest that developing strategies to identify and promote the Registry to those who were more likely to be missed, e.g. non-white and Hispanic persons, could be beneficial to improving the completeness of the Registry.
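The matching step can be illustrated with a small sketch, assuming records keyed on a combination of identifiers. The field names and sample records below are invented; the actual study's matching procedure and data are not public in this abstract.

```python
def match_key(record):
    """Composite key from partial SSN, name, date of birth, and sex."""
    return (record["ssn_last4"], record["name"].lower(),
            record["dob"], record["sex"])

def match_cases(surveillance, registry):
    """Split surveillance cases into matched / unmatched against the registry."""
    registry_keys = {match_key(r) for r in registry}
    matched = [r for r in surveillance if match_key(r) in registry_keys]
    unmatched = [r for r in surveillance if match_key(r) not in registry_keys]
    return matched, unmatched

# Hypothetical records for illustration only
surveillance = [
    {"ssn_last4": "1234", "name": "Doe", "dob": "1950-01-02", "sex": "M"},
    {"ssn_last4": "5678", "name": "Roe", "dob": "1948-07-30", "sex": "F"},
]
registry = [
    {"ssn_last4": "1234", "name": "Doe", "dob": "1950-01-02", "sex": "M"},
]
matched, unmatched = match_cases(surveillance, registry)
print(len(matched), len(unmatched))  # 1 1
```

Comparing the demographic distributions of the matched and unmatched groups (as the study does with chi-square tests) then shows who the passive registry is missing.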

  4. Teaching Introductory Oceanography through Case Studies: Project based approach for general education students

    NASA Astrophysics Data System (ADS)

    Farnsworth, K. L.; House, M.; Hovan, S. A.

    2013-12-01

    A recent workshop sponsored by SERC-On the Cutting Edge brought together science educators from a range of schools across the country to discuss new approaches in teaching oceanography. In discussing student interest in our classes, we were struck by the fact that students are drawn to emotional or controversial topics such as whale hunting and tsunami hazard, and that these kinds of topics are a great vehicle for introducing more complex concepts such as wave propagation, ocean upwelling, and marine chemistry. Thus, we have developed an approach to introductory oceanography that presents students with real-world issues in the ocean sciences and requires them to explore the science behind them, in order to improve overall ocean science literacy among non-majors and majors at 2- and 4-year colleges. We have designed a project-based curriculum built around topics that include, but are not limited to: tsunami hazard, whale migration, ocean fertilization, ocean territorial claims, rapid climate change, the Pacific trash patch, overfishing, and ocean acidification. Each case study or project consists of three weeks of class time and is structured around three elements: 1) a media analysis; 2) the role of ocean science in addressing the issue; 3) human impact/response. Content resources range from textbook readings, popular or current print news, documentary film and television, and data available on the World Wide Web from a range of sources. We employ a variety of formative assessments for each case study in order to monitor student access and understanding of content, and include a significant component of in-class student discussion and brainstorming guided by faculty input to develop the case study. Each study culminates in summative assessments ranging from exams to student posters to presentations, depending on the class size and environment. 
We envision this approach for a range of classroom environments including large group face-to-face instruction as well as hybrid and fully online courses.

  5. Piecewise polynomial representations of genomic tracks.

    PubMed

    Tarabichi, Maxime; Detours, Vincent; Konopka, Tomasz

    2012-01-01

    Genomic data from microarray and sequencing projects consist of associations of measured values to chromosomal coordinates. These associations can be thought of as functions in one dimension and can thus be stored, analyzed, and interpreted as piecewise-polynomial curves. We present a general framework for building piecewise polynomial representations of genome-scale signals and illustrate some of its applications via examples. We show that piecewise constant segmentation, a typical step in copy-number analyses, can be carried out within this framework for both array and (DNA) sequencing data, offering advantages over existing methods in each case. Higher-order polynomial curves can be used, for example, to detect trends and/or discontinuities in transcription levels from RNA-seq data. We give a concrete application of piecewise linear functions to diagnose and quantify alignment quality at exon borders (splice sites). Our software (source and object code) for building piecewise polynomial models is available at http://sourceforge.net/projects/locsmoc/.
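A minimal illustration of the piecewise-constant representation (a sketch, not the authors' software): compressing a per-coordinate signal into (start, end, value) segments by merging runs of equal values. Real copy-number segmentation additionally handles noise; this run-length merge only shows the data structure.

```python
def to_segments(values):
    """Represent a 1-D signal as piecewise-constant (start, end, value) runs.

    Coordinates are 0-based and end-exclusive. Each run of identical
    adjacent values collapses into a single segment.
    """
    segments = []
    for i, v in enumerate(values):
        if segments and segments[-1][2] == v:
            start, _, val = segments[-1]
            segments[-1] = (start, i + 1, val)  # extend the current run
        else:
            segments.append((i, i + 1, v))      # start a new run
    return segments

# Copy-number-like track with a gain between coordinates 3 and 6
print(to_segments([2, 2, 2, 3, 3, 3, 2, 2]))
# [(0, 3, 2), (3, 6, 3), (6, 8, 2)]
```

The segment list is much smaller than the raw per-base track, which is what makes piecewise models attractive for genome-scale storage and analysis.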

  6. Lessons Learned in Student Venture Creation

    NASA Astrophysics Data System (ADS)

    Caner, Edward

    The Physics Entrepreneurship Master's Program (PEP) at Case Western Reserve University is now in its 15th year of operation. PEP is a 27 credit-hour Master of Science in Physics, Entrepreneurship Track. The curriculum can be tailored to the needs of each student. Coursework consists of graduate-level classes in science, business, intellectual property law, and innovation. A master's thesis is required that is based on a real-world project in innovation or entrepreneurship within an existing company or startup (possibly the student's). PEP faculty help students connect with mentors, advisors, partners, funding sources, and job opportunities. In this talk I will chronicle several pitfalls that we have encountered with our "real world" student projects and start-up businesses, several of which met their complete demise despite showing great promise for success. I will discuss how we have learned to avoid most of these pitfalls by taking surprisingly simple actions.

  7. THE CHALLENGING ROLE OF A READING COACH, A CAUTIONARY TALE.

    PubMed

    Al Otaiba, Stephanie; Hosp, John L; Smartt, Susan; Dole, Janice A

    2008-04-01

    The purpose of this case study is to describe the challenges one coach faced during the initial implementation of a coaching initiative involving 33 teachers in an urban, high-poverty elementary school. Reading coaches are increasingly expected to play a key role in professional development efforts to improve reading instruction and thereby improve reading achievement for struggling readers. Data sources included initial reading scores for kindergartners and first-graders, pretest and posttest scores of teachers' knowledge, a teacher survey, focus group interviews, project documents, and field notes. Data were analyzed using a mixed-methods approach. Findings revealed several challenges that have important implications for research and practice: teachers encountered new information about teaching early reading that conflicted with their current knowledge; the new information conflicted with their core reading program; teachers had differing perceptions of the role of the reading coach, which affected their feelings about the project; and reform efforts are time-intensive.

  8. Quantifying and minimizing entropy generation in AMTEC cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hendricks, T.J.; Huang, C.

    1997-12-31

    Entropy generation in an AMTEC cell represents inherent power loss to the cell. Minimizing cell entropy generation directly maximizes cell power generation and efficiency. An ongoing internal project at AMPS aims to identify, quantify, and minimize entropy generation mechanisms within an AMTEC cell, with the goal of determining cost-effective design approaches for maximizing AMTEC cell power generation. Various entropy generation mechanisms have been identified and quantified. The project has investigated several cell design techniques in a solar-driven AMTEC system to minimize cell entropy generation and produce maximum-power cell designs. In many cases, various sources of entropy generation are interrelated, such that minimizing entropy generation requires cell and system design optimization. Some of the tradeoffs between various entropy generation mechanisms are quantified and explained, and their implications for cell design are discussed. The relationship between AMTEC cell power and efficiency and entropy generation is presented and discussed.

  9. Enlisting User Community Perspectives to Inform Development of a Semantic Web Application for Discovery of Cross-Institutional Research Information and Data

    NASA Astrophysics Data System (ADS)

    Johns, E. M.; Mayernik, M. S.; Boler, F. M.; Corson-Rikert, J.; Daniels, M. D.; Gross, M. B.; Khan, H.; Maull, K. E.; Rowan, L. R.; Stott, D.; Williams, S.; Krafft, D. B.

    2015-12-01

    Researchers seek information and data through a variety of avenues: published literature, search engines, repositories, colleagues, etc. In order to build a web application that leverages linked open data to enable multiple paths for information discovery, the EarthCollab project has surveyed two geoscience user communities to consider how researchers find and share scholarly output. EarthCollab, a cross-institutional, EarthCube-funded project partnering UCAR, Cornell University, and UNAVCO, is employing the open-source semantic web software VIVO as the underlying technology to connect the people and resources of virtual research communities. This study presents an analysis of survey responses from members of the two case study communities: (1) the Bering Sea Project, an interdisciplinary field program whose data archive is hosted by NCAR's Earth Observing Laboratory (EOL), and (2) UNAVCO, a geodetic facility and consortium that supports diverse research projects informed by geodesy. The survey results illustrate the types of research products that respondents indicate should be discoverable within a digital platform and the current methods used to find publications, data, personnel, tools, and instrumentation. The responses showed that scientists rely heavily on general-purpose search engines, such as Google, to find information, but that data center websites and the published literature were also critical sources for finding collaborators, data, and research tools. The survey participants also identified additional features of interest for an information platform, such as search engine indexing, connection to institutional web pages, generation of bibliographies and CVs, and outward linking to social media. Through the survey, the user communities prioritized the type of information that is most important to display and describe their work within a research profile. 
The analysis of this survey will inform our further development of a platform that will facilitate different types of information discovery strategies, and help researchers to find and use the associated resources of a research project.

  10. Domain Specific Language Support for Exascale. Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baden, Scott

    The project developed a domain-specific translator to enable legacy MPI source code to tolerate communication delays, which are increasing over time due to technological factors. The translator performs source-to-source translation that incorporates semantic information into the translation process. The output of the translator is a C program that runs in a data-driven fashion and uses an existing runtime to overlap communication automatically.

  11. Anatomy of BioJS, an open source community for the life sciences.

    PubMed

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-07-08

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.

  12. Cf-252 Characterization Documents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feldman, Alexander

    2014-03-14

    Six documents were written by Vance and Associates under contract to the Off-Site Source Recovery Project of Los Alamos National Laboratory. These six documents provided the basis for characterization of Californium-252 sealed sources and for the packaging and manifesting of this material for disposal at the Waste Isolation Pilot Plant (WIPP). The six documents are: 1. VA-OSR-10, Development of radionuclide distributions for Cf-252 sealed sources. 2. VA-OSR-11, Uncertainty analysis for Cf-252 sealed sources. 3. VA-OSR-12, To determine the radionuclides in the waste drums containing Cf-252 sealed source waste that are required to be reported under the requirements of the WIPP WAC and the TRAMPAC. 4. VA-OSR-13, Development of the spreadsheet for the radiological calculations for the characterization of Cf-252 sources. 5. VA-OSR-14, Relative importance of neutron-induced fission in Cf-252 sources. 6. VA-OSR-15, Determine upper bound of decay product inventories from a drum of Cf-252 sources. These six documents provide the technical basis for the characterization of Cf-252 sources and will be part of the AK documentation required for submittal to the Central Characterization Project (CCP) of WIPP.

  13. Use and Optimisation of Paid Crowdsourcing for the Collection of Geodata

    NASA Astrophysics Data System (ADS)

    Walter, V.; Laupheimer, D.; Fritsch, D.

    2016-06-01

    Crowdsourcing is a new technology and a new business model that will change the way we work in many fields in the future. Employers divide their work and source it out to a large number of anonymous workers on the Internet. The division and outsourcing is not a trivial process but requires the definition of completely new workflows, from the definition of subtasks to execution and quality control. A popular crowdsourcing project in the field of geodata collection is OpenStreetMap, which is based on the work of unpaid volunteers. Crowdsourcing projects that rely on unpaid volunteers need an active community whose members are convinced of the importance of the project and enjoy collaborating. This can only be realized for some tasks. In the field of geodata collection many other tasks exist that can in principle be solved with crowdsourcing but for which it is difficult to find a sufficiently large number of volunteers. In these cases other incentives must be provided, such as monetary payment.

  14. Design of a decentralized reusable research database architecture to support data acquisition in large research projects.

    PubMed

    Iavindrasana, Jimison; Depeursinge, Adrien; Ruch, Patrick; Spahni, Stéphane; Geissbuhler, Antoine; Müller, Henning

    2007-01-01

    The diagnostic and therapeutic processes, as well as the development of new treatments, are hindered by the fragmentation of the information that underlies them. In a multi-institutional research study database, the clinical information system (CIS) contains the primary data input. A substantial share of the budget of large-scale clinical studies is often spent on data creation and maintenance. The objective of this work is to design a decentralized, scalable, reusable database architecture with lower maintenance costs for managing and integrating the distributed heterogeneous data required as the basis for a large-scale research project. Technical and legal aspects are taken into account based on various use case scenarios. The architecture contains four layers: data storage and access, decentralized at the production source; a connector acting as a proxy between the CIS and the external world; an information mediator serving as the data access point; and the client side. The proposed design will be implemented at six clinical centers participating in the @neurIST project as part of a larger system for data integration and reuse in aneurysm treatment.
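The connector/mediator layers described above follow a common federation pattern, which can be sketched as follows. The class and method names are invented for illustration; the abstract does not specify the actual interfaces.

```python
class Connector:
    """Proxy between a local clinical information system and the outside world."""
    def __init__(self, local_records):
        self._records = local_records  # data stays at its production source

    def query(self, patient_id):
        return [r for r in self._records if r["patient"] == patient_id]

class Mediator:
    """Single data access point that fans a query out to all site connectors."""
    def __init__(self, connectors):
        self.connectors = connectors

    def query(self, patient_id):
        results = []
        for c in self.connectors:
            results.extend(c.query(patient_id))
        return results

# Two hypothetical clinical sites, each keeping its own data
site_a = Connector([{"patient": "p1", "value": 120}])
site_b = Connector([{"patient": "p1", "value": 95},
                    {"patient": "p2", "value": 80}])
client = Mediator([site_a, site_b])
print(len(client.query("p1")))  # 2
```

The point of the pattern is that the client never touches a site's store directly: access control, anonymization, and legal constraints can be enforced in each connector.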

  15. Effort to Accelerate MBSE Adoption and Usage at JSC

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Izygon, Michel; Okron, Shira; Garner, Larry; Wagner, Howard

    2016-01-01

    This paper describes the authors' experience in adopting Model-Based Systems Engineering (MBSE) at the NASA Johnson Space Center (JSC). Since 2009, NASA/JSC has been applying MBSE, using the Systems Modeling Language (SysML), on a number of advanced projects. Models integrate views of the system from multiple perspectives, capturing the system design information for multiple stakeholders. This method has allowed engineers to better control changes, improve traceability from requirements to design, and manage the numerous interactions between components. As a project progresses, the models become the official source of information used by multiple stakeholders. Three major types of challenges that hamper the adoption of MBSE are described. These challenges are addressed by a multipronged approach that includes educating the main stakeholders, implementing an organizational infrastructure that supports the adoption effort, defining a set of modeling guidelines to help engineers in their modeling effort, providing a toolset that supports the generation of valuable products, and providing a library of reusable models. JSC project case studies are presented to illustrate how the proposed approach has been successfully applied.

  16. Interactive bibliographical database on color

    NASA Astrophysics Data System (ADS)

    Caivano, Jose L.

    2002-06-01

    The paper describes the methodology and results of a project under development aimed at the elaboration of an interactive bibliographical database on color in all fields of application: philosophy, psychology, semiotics, education, anthropology, physical and natural sciences, biology, medicine, technology, industry, architecture and design, arts, linguistics, geography, and history. The project is initially based upon an already developed bibliography, published in different journals, updated on various occasions, and now available on the Internet, with more than 2,000 entries. The interactive database will expand that bibliography, incorporating hyperlinks and contents (indexes, abstracts, keywords, introductions, or eventually the complete document), and devising mechanisms for information retrieval. The sources to be included are books, doctoral dissertations, multimedia publications, and reference works. The main arrangement will be chronological, but the design of the database will allow rearrangements or selections by different fields: subject, Decimal Classification System, author, language, country, publisher, etc. A further project is to develop another database covering color-specialized journals or newsletters and articles on color published in international journals, arranged in this case by journal name and date of publication, but also allowing rearrangements or selections by author, subject, and keywords.

  17. Combining Adaptive Hypermedia with Project and Case-Based Learning

    ERIC Educational Resources Information Center

    Papanikolaou, Kyparisia; Grigoriadou, Maria

    2009-01-01

    In this article we investigate the design of educational hypermedia based on constructivist learning theories. Following the principles of project- and case-based learning, we present the design rationale of an Adaptive Educational Hypermedia system prototype named MyProject; learners working with MyProject undertake a project and the system…

  18. GPU-accelerated iterative reconstruction for limited-data tomography in CBCT systems.

    PubMed

    de Molina, Claudia; Serrano, Estefania; Garcia-Blas, Javier; Carretero, Jesus; Desco, Manuel; Abella, Monica

    2018-05-15

    Standard cone-beam computed tomography (CBCT) involves the acquisition of at least 360 projections rotating through 360 degrees. Nevertheless, there are cases in which only a few projections can be taken in a limited angular span, such as during surgery, where rotation of the source-detector pair is limited to less than 180 degrees. Reconstruction of limited data with the conventional method proposed by Feldkamp, Davis and Kress (FDK) results in severe artifacts. Iterative methods may compensate for the lack of data by including additional prior information, although they imply a high computational burden and memory consumption. We present an accelerated implementation of an iterative method for CBCT following the Split Bregman formulation, which reduces computational time through GPU-accelerated kernels. The implementation enables the reconstruction of large volumes (>1024³ pixels) using partitioning strategies in forward- and back-projection operations. We evaluated the algorithm on small-animal data for different scenarios with different numbers of projections, angular spans, and projection sizes. Reconstruction time varied linearly with the number of projections and quadratically with projection size, but remained almost unchanged with angular span. Forward- and back-projection operations represent 60% of the total computational burden. Efficient implementation using parallel processing and large-memory management strategies together with GPU kernels enables the use of advanced reconstruction approaches that are needed in limited-data scenarios. Our GPU implementation showed a significant time reduction (up to 48×) compared to a CPU-only implementation, cutting total reconstruction time from several hours to a few minutes.
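The role of repeated forward- and back-projection in iterative reconstruction can be shown with a deliberately tiny sketch: two orthogonal "views" (row and column sums) of a 2x2 image, refined with a SIRT-like additive update. This illustrates the general iterative idea only; it is not the Split Bregman formulation or the GPU kernels of the paper.

```python
def forward(img):
    """Forward projection: row sums and column sums of a square image."""
    rows = [sum(r) for r in img]
    cols = [sum(r[j] for r in img) for j in range(len(img[0]))]
    return rows, cols

def sirt_step(img, rows_meas, cols_meas, relax=0.25):
    """One additive update distributing the projection residuals back.

    This is back-projection of the residual: each pixel receives a share
    of the mismatch in its row sum and in its column sum.
    """
    rows, cols = forward(img)
    n = len(img)
    return [[img[i][j]
             + relax * (rows_meas[i] - rows[i]) / n
             + relax * (cols_meas[j] - cols[j]) / n
             for j in range(n)] for i in range(n)]

truth = [[1.0, 2.0], [3.0, 4.0]]
rows_meas, cols_meas = forward(truth)   # the "measured" limited data
img = [[0.0, 0.0], [0.0, 0.0]]          # start from an empty volume
for _ in range(200):
    img = sirt_step(img, rows_meas, cols_meas)
rows, cols = forward(img)
print(rows, rows_meas)  # reconstructed projections approach the data
```

Even in this cartoon, each iteration costs one forward and one back-projection, which is why those two operators dominate runtime and are the natural targets for GPU acceleration.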

  19. Scientific Data Purchase Project Overview Presentation

    NASA Technical Reports Server (NTRS)

    Holekamp, Kara; Fletcher, Rose

    2001-01-01

    The Scientific Data Purchase (SDP) project acquires science data from commercial sources. It is a demonstration project to test a new way of doing business, tap new sources of data, support Earth science research, and support the commercial remote sensing industry. Phase I of the project reviews simulated/prototypical data sets from 10 companies. Phase II of the project is a 3 year purchase/distribution of select data from 5 companies. The status of several SDP projects is reviewed in this viewgraph presentation, as is the SDP process of tasking, verification, validation, and data archiving. The presentation also lists SDP results for turnaround time, metrics, customers, data use, science research, applications research, and user feedback.

  20. Fine sediment sources in conservation effects assessment project watersheds

    USDA-ARS?s Scientific Manuscript database

    Two naturally occurring radionuclides, 7Be and 210Pbxs , were used as tracers to discriminate eroded surface soils from channel-derived sediments in the fine suspended sediment loads of eight Conservation Effects Assessment Project (CEAP) benchmark watersheds. Precipitation, source soils, and suspe...

  1. Student ownership of projects in an upper-division optics laboratory course: A multiple case study of successful experiences

    NASA Astrophysics Data System (ADS)

    Dounas-Frazer, Dimitri R.; Stanley, Jacob T.; Lewandowski, H. J.

    2017-12-01

    We investigate students' sense of ownership of multiweek final projects in an upper-division optics lab course. Using a multiple case study approach, we describe three student projects in detail. Within-case analyses focused on identifying key issues in each project, and constructing chronological descriptions of those events. Cross-case analysis focused on identifying emergent themes with respect to five dimensions of project ownership: student agency, instructor mentorship, peer collaboration, interest and value, and affective responses. Our within- and cross-case analyses yielded three major findings. First, coupling division of labor with collective brainstorming can help balance student agency, instructor mentorship, and peer collaboration. Second, students' interest in the project and perceptions of its value can increase over time; initial student interest in the project topic is not a necessary condition for student ownership of the project. Third, student ownership is characterized by a wide range of emotions that fluctuate as students alternate between extended periods of struggle and moments of success while working on their projects. These findings not only extend the literature on student ownership into a new educational domain—namely, upper-division physics labs—they also have concrete implications for the design of experimental physics projects in courses for which student ownership is a desired learning outcome. We describe the course and projects in sufficient detail that others can adapt our results to their particular contexts.

  2. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389

  3. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. 
BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.

  4. Detection of cerebrospinal fluid leakage: initial experience with three-dimensional fast spin-echo magnetic resonance myelography.

    PubMed

    Tomoda, Y; Korogi, Y; Aoki, T; Morioka, T; Takahashi, H; Ohno, M; Takeshita, I

    2008-03-01

    The pathogenesis of cerebrospinal fluid (CSF) hypovolemia is thought to involve CSF leakage through small dural defects. Our purpose was to compare source three-dimensional (3D) fast spin-echo (FSE) images from magnetic resonance (MR) myelography with radionuclide cisternography findings, and to evaluate the feasibility of MR myelography for the detection of CSF leakage. A total of 67 patients clinically suspected of CSF hypovolemia underwent indium-111 radionuclide cisternography, and the 27 who had direct findings of CSF leakage were selected for evaluation. MR myelography with 3D FSE sequences (TR/TE 6000/203 ms) was performed at the lumbar spine in all patients. We evaluated source images and maximum intensity projection (MIP) images of MR myelography, and the findings were correlated with those of radionuclide cisternography. MR myelography of five healthy volunteers served as a reference. The MR visibility of CSF leakage was graded as definite (leakage clearly visible), possible (leakage poorly seen), or absent (not shown). CSF leakage was identified on source 3D FSE images in 22 (81.5%) of 27 patients. Of these 22 patients, 16 were graded as definite and six as possible. In the definite cases, 3D FSE images clearly showed the extent of the leaked CSF in the paraspinal structures. In the remaining five patients with absent findings, radionuclide cisternography showed only slight radionuclide activity outside the arachnoid space. Source 3D FSE images of MR myelography appear useful for the detection of CSF leakage. Invasive radionuclide cisternography may be reserved for equivocal cases only.

  5. Low cost monitoring from space using Landsat TM time series and open source technologies: the case study of Iguazu park

    NASA Astrophysics Data System (ADS)

    Nole, Gabriele; Lasaponara, Rosa

    2015-04-01

    Satellite data have become increasingly available in recent years, offering a low-cost, or even free-of-charge, tool with great potential for operational monitoring of vegetation cover, quantitative assessment of urban expansion and urban sprawl, and monitoring of land-use change and soil consumption. This growing observational capacity has also highlighted the need for research into data processing methods and algorithms that can exploit this invaluable space-based data source as fully as possible. The work presented here is an application study on the monitoring of vegetation cover and urban sprawl using Landsat TM satellite data. The selected test site is the Iguazu park, one of the most threatened global conservation priorities (http://whc.unesco.org/en/list/303/). To produce synthetic maps of the investigated areas and monitor the status of vegetation and ongoing subtle changes, Landsat TM images were classified using two automatic classifiers, Maximum Likelihood (MLC) and Support Vector Machines (SVMs), applied with varying setting parameters in order to compare their performance in terms of robustness, speed and accuracy. All processing steps were developed by integrating Geographical Information Systems and Remote Sensing, and adopting free and open source software. Results showed that the SVM classifier with an RBF kernel was generally the best choice (with accuracy higher than 90%) among all the configurations compared, and that the use of multiple bands generally improves classification. One critical element in monitoring urban expansion is the presence of urban gardens mixed with urban fabric. The use of different configurations for the SVMs, i.e. different kernels and values of the setting parameters, allowed us to calibrate the classifier to a specific need, in our case a reliable discrimination of urban from non-urban areas. Acknowledgement: This research was performed within the framework of the Great Relevance project "Smart management of cultural heritage sites in Italy and Argentina: Earth Observation and pilot projects", funded by the Ministero degli Affari Esteri e della Cooperazione Internazionale (MAE), 17/04/2014, Prot. nr. 0090692, 2014-2016.
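
    As an illustrative aside (not code from the study above): a pixel-wise RBF-kernel SVM over stacked band values can be sketched with scikit-learn. The band statistics, class labels, and parameter values below are entirely synthetic stand-ins for real labeled Landsat TM imagery.

```python
# Hypothetical sketch of RBF-SVM pixel classification over stacked bands.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic training pixels: 6 reflective TM bands, 3 cover classes
# (e.g. urban / vegetation / water); real work would use labeled imagery.
n_per_class = 200
centers = rng.uniform(0.1, 0.9, size=(3, 6))
X = np.vstack([c + 0.05 * rng.standard_normal((n_per_class, 6)) for c in centers])
y = np.repeat([0, 1, 2], n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# RBF kernel; C and gamma are the "setting parameters" one would vary
# when comparing configurations for robustness, speed and accuracy.
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"overall accuracy: {acc:.2f}")
```

    In practice the comparison against MLC and the tuning of C and gamma would be driven by ground-truth polygons and a proper accuracy assessment (confusion matrix, per-class accuracies), not a random split of synthetic points.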

  6. Open-Source as a strategy for operational software - the case of Enki

    NASA Astrophysics Data System (ADS)

    Kolberg, Sjur; Bruland, Oddbjørn

    2014-05-01

    Since 2002, SINTEF Energy has been developing what is now known as the Enki modelling system. This development has been financed by Norway's largest hydropower producer, Statkraft, motivated by a desire for distributed hydrological models in operational use. As the owner of the source code, Statkraft has recently decided on Open Source as a strategy for further development, and for migration from an R&D context to operational use. A cooperation project is currently being carried out between SINTEF Energy, 7 large Norwegian hydropower producers including Statkraft, three universities and one software company. Of course, the most immediate task is that of software maturing. A more important challenge, however, is one of gaining experience within the operational hydropower industry. A transition from lumped to distributed models is likely to also require revision of measurement programs, calibration strategies, and the use of GIS and modern data sources like weather radar and satellite imagery. On the other hand, map-based visualisations enable a richer information exchange between hydrologic forecasters and power market traders. The operating context of a distributed hydrology model within hydropower planning is far from settled. Being both a modelling framework and a library of plug-in routines to build models from, Enki supports the flexibility needed in this situation. Recent development has separated the core from the user interface, paving the way for a scripting API, cross-platform compilation, and front-end programs serving different degrees of flexibility, robustness and security. The open source strategy invites anyone to use Enki and to develop and contribute new modules. Once tested, the same modules are available for the operational versions of the program. A core challenge is to offer rigid testing procedures and mechanisms to reject routines in an operational setting, without limiting experimentation with new modules. 
The Open Source strategy also has implications for building and maintaining competence around the source code and the advanced hydrological and statistical routines in Enki. Originally developed by hydrologists, the Enki code is now approaching a state where maintenance requires a background in professional software development. Without the advantage of proprietary source code, both hydrologic improvements and software maintenance depend on donations or development support on a case-by-case basis, a situation well known within the open source community. It remains to be seen whether these mechanisms suffice to keep Enki at the maintenance level required by the hydropower sector. ENKI is available from www.opensource-enki.org.

  7. Legal and financial methods for reducing low emission sources: Options for incentives

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samitowski, W.

    1995-12-31

    There are two types of so-called low emission sources in Cracow: over 1,000 local boiler houses and several thousand solid fuel-fired stoves. The accomplishment of each of the 5 sub-projects offered under the American-Polish program entails solving technical, financial, legal and public relations problems. The elimination of low emission sources therefore requires a joint effort of the following parties: (a) local authorities, (b) investors, (c) owners and users of low emission sources, and (d) inhabitants involved in particular projects. The results of the studies developed by POLINVEST indicate that the accomplishment of the projects for the elimination of low emission sources will require financial incentives. Bearing in mind today's resources available from the community budget, this process may last as long as a dozen or so years. The task of the authorities of Cracow City is to develop a long-range operational strategy enabling the reduction of low emission sources in Cracow.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brauner, Edwin Jr.; Carlson, Daniel C.

    The Geysers steamfields in northern Sonoma County have produced reliable ''green'' power for many years. An impediment to long-term continued production has been the ability to provide a reliable source of injection water to replace water extracted and lost in the form of steam. The steamfield operators have historically used cooling towers to recycle a small portion of the steam and have collected water during the winter months using stream extraction. These two sources, however, could not by themselves sustain the steamfield in the long term. The Lake County Reclaimed Water Project (SEGEP) was initiated in 1997 and provides another source of steamfield replenishment water. The Santa Rosa Geysers Recharge Project provides another significant step in replenishing the steamfield. In addition, the Santa Rosa Geysers Recharge Project has been built with the capacity to potentially meet virtually all injection water requirements when combined with these other sources. Figure 2.1 graphically depicts the combination of injection sources.

  9. Minimal Polynomial Method for Estimating Parameters of Signals Received by an Antenna Array

    NASA Astrophysics Data System (ADS)

    Ermolaev, V. T.; Flaksman, A. G.; Elokhin, A. V.; Kuptsov, V. V.

    2018-01-01

    The effectiveness of the projection minimal polynomial method for solving the problem of determining the number of sources of signals acting on an antenna array (AA) with an arbitrary configuration and their angular directions has been studied. The method proposes estimating the degree of the minimal polynomial of the correlation matrix (CM) of the input process in the AA on the basis of a statistically validated root-mean-square criterion. Special attention is paid to the case of the ultrashort sample of the input process when the number of samples is considerably smaller than the number of AA elements, which is important for multielement AAs. It is shown that the proposed method is more effective in this case than methods based on the AIC (Akaike's Information Criterion) or minimum description length (MDL) criterion.
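
    As a hedged illustration of the classical baseline mentioned above (the MDL criterion, not the authors' projection minimal polynomial method): the number of sources can be estimated from the eigenvalues of the sample correlation matrix. The array geometry, source directions, and noise level below are invented for the demonstration.

```python
# Sketch of the MDL (minimum description length) source-count estimate
# from sample-correlation-matrix eigenvalues (Wax-Kailath style).
import numpy as np

def mdl_num_sources(R_hat, n_snapshots):
    """Return the MDL estimate of the number of sources from a sample CM."""
    lam = np.sort(np.linalg.eigvalsh(R_hat))[::-1]  # eigenvalues, descending
    M = len(lam)
    costs = []
    for k in range(M):
        tail = lam[k:]                    # presumed noise eigenvalues
        geo = np.exp(np.mean(np.log(tail)))
        ari = np.mean(tail)
        costs.append(-n_snapshots * (M - k) * np.log(geo / ari)
                     + 0.5 * k * (2 * M - k) * np.log(n_snapshots))
    return int(np.argmin(costs))

# Synthetic check: M=8 sensors, 2 uncorrelated plane-wave sources in noise.
rng = np.random.default_rng(1)
M, N, d = 8, 200, 2
A = np.exp(1j * np.pi * np.outer(np.arange(M), np.sin([0.3, -0.5])))
S = rng.standard_normal((d, N)) + 1j * rng.standard_normal((d, N))
X = A @ S + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
R_hat = X @ X.conj().T / N
print(mdl_num_sources(R_hat, N))
```

    The regime the paper targets, where the snapshot count is much smaller than the number of array elements, is precisely where this eigenvalue-based baseline degrades and the minimal polynomial approach is argued to do better.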

  10. Wind for Schools: A Wind Powering America Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2007-12-01

    This brochure serves as an introduction to Wind Powering America's Wind for Schools Project, including a description of the project, the participants, funding sources, and the basic configurations of the project.

  11. European monitoring systems and data for assessing environmental and climate impacts on human infectious diseases.

    PubMed

    Nichols, Gordon L; Andersson, Yvonne; Lindgren, Elisabet; Devaux, Isabelle; Semenza, Jan C

    2014-04-09

    Surveillance is critical to understanding the epidemiology and control of infectious diseases. The growing concern over climate and other drivers that may increase infectious disease threats to future generations has stimulated a review of the surveillance systems and environmental data sources that might be used to assess future health impacts from climate change in Europe. We present an overview of organizations, agencies and institutions that are responsible for infectious disease surveillance in Europe. We describe the surveillance systems, tracking tools, communication channels, information exchange and outputs in light of environmental and climatic drivers of infectious diseases. We discuss environmental and climatic data sets that lend themselves to epidemiological analysis. Many of the environmental data sets have a relatively uniform quality across EU Member States because they are based on satellite measurements or EU funded FP6 or FP7 projects with full EU coverage. Case-reporting systems for surveillance of infectious diseases should include clear and consistent case definitions and reporting formats that are geo-located at an appropriate resolution. This will allow linkage to environmental, social and climatic sources that will enable risk assessments, future threat evaluations, outbreak management and interventions to reduce disease burden.

  12. European Monitoring Systems and Data for Assessing Environmental and Climate Impacts on Human Infectious Diseases

    PubMed Central

    Nichols, Gordon L.; Andersson, Yvonne; Lindgren, Elisabet; Devaux, Isabelle; Semenza, Jan C.

    2014-01-01

    Surveillance is critical to understanding the epidemiology and control of infectious diseases. The growing concern over climate and other drivers that may increase infectious disease threats to future generations has stimulated a review of the surveillance systems and environmental data sources that might be used to assess future health impacts from climate change in Europe. We present an overview of organizations, agencies and institutions that are responsible for infectious disease surveillance in Europe. We describe the surveillance systems, tracking tools, communication channels, information exchange and outputs in light of environmental and climatic drivers of infectious diseases. We discuss environmental and climatic data sets that lend themselves to epidemiological analysis. Many of the environmental data sets have a relatively uniform quality across EU Member States because they are based on satellite measurements or EU funded FP6 or FP7 projects with full EU coverage. Case-reporting systems for surveillance of infectious diseases should include clear and consistent case definitions and reporting formats that are geo-located at an appropriate resolution. This will allow linkage to environmental, social and climatic sources that will enable risk assessments, future threat evaluations, outbreak management and interventions to reduce disease burden. PMID:24722542

  13. The Holocaust in Hungary and Poland: Case Studies of Response to Genocide. Curriculum Project. Fulbright-Hays Summer Seminars Abroad Program, 1998 (Hungary/Poland).

    ERIC Educational Resources Information Center

    Hartley, William L.

    This curriculum project was designed primarily to be incorporated into a larger world history unit on the Holocaust and World War II. The project can be adapted for a lesson on 'situational ethics' for use in a philosophy class. The lesson requires students to examine a historical case and to write and discuss that particular case. The project's…

  14. Sharing Health Big Data for Research - A Design by Use Cases: The INSHARE Platform Approach.

    PubMed

    Bouzillé, Guillaume; Westerlynck, Richard; Defossez, Gautier; Bouslimi, Dalel; Bayat, Sahar; Riou, Christine; Busnel, Yann; Le Guillou, Clara; Cauvin, Jean-Michel; Jacquelinet, Christian; Pladys, Patrick; Oger, Emmanuel; Stindel, Eric; Ingrand, Pierre; Coatrieux, Gouenou; Cuggia, Marc

    2017-01-01

    Sharing and exploiting Health Big Data (HBD) means tackling two challenges: data protection and governance, taking into account legal, ethical, and deontological aspects, to enable a trusted, transparent and win-win relationship between researchers, citizens, and data providers; and the lack of interoperability between compartmentalized and syntactically/semantically heterogeneous data. The INSHARE project explores, through an experimental proof of concept, how recent technologies can overcome such issues. With 6 data providers, the platform was designed in 3 steps: (1) analyze use cases, needs, and requirements; (2) define the data sharing governance and secure access to the platform; and (3) define the platform specifications. Three use cases, drawing on 5 studies and 11 data sources, were analyzed to design the platform. The governance was derived from the SCANNER model and adapted to data sharing. The platform architecture integrates: data repository and hosting, semantic integration services, data processing, aggregate computing, data quality and integrity monitoring, Id linking, a multi-source query builder, visualization and data export services, data governance, a study management service, and security including data watermarking.

  15. Nodding syndrome in Kitgum District, Uganda: association with conflict and internal displacement

    PubMed Central

    Landis, Jesa L; Palmer, Valerie S; Spencer, Peter S

    2014-01-01

    Objectives To test for any temporal association of Nodding syndrome with wartime conflict, casualties and household displacement in Kitgum District, northern Uganda. Methods Data were obtained from publicly available information reported by the Ugandan Ministry of Health (MOH), the Armed Conflict Location & Event Data (ACLED) Project of the University of Sussex in the UK, peer-reviewed publications in professional journals and other sources. Results Reports of Nodding syndrome began to appear in 1997, with the first recorded cases in Kitgum District in 1998. Cases rapidly increased annually beginning in 2001, with peaks in 2003–2005 and 2008, 5–6 years after peaks in the number of wartime conflicts and deaths. Additionally, peaks of Nodding syndrome cases followed peak influxes 5–7 years earlier of households into internal displacement camps. Conclusions Peaks of Nodding syndrome reported by the MOH are associated with, but temporally displaced from, peaks of wartime conflicts, deaths and household internment, where infectious disease was rampant and food insecurity rife. PMID:25371417

  16. Fugitive emission source characterization using a gradient-based optimization scheme and scalar transport adjoint

    NASA Astrophysics Data System (ADS)

    Brereton, Carol A.; Joynes, Ian M.; Campbell, Lucy J.; Johnson, Matthew R.

    2018-05-01

    Fugitive emissions are important sources of greenhouse gases and lost product in the energy sector that can be difficult to detect, but are often easily mitigated once they are known, located, and quantified. In this paper, a scalar transport adjoint-based optimization method is presented to locate and quantify unknown emission sources from downstream measurements. This emission characterization approach correctly predicted locations to within 5 m and magnitudes to within 13% of experimental release data from Project Prairie Grass. The method was further demonstrated on simulated simultaneous releases in a complex 3-D geometry based on an Alberta gas plant. Reconstructions were performed using both the complex 3-D transient wind field used to generate the simulated release data and using a sequential series of steady-state RANS wind simulations (SSWS) representing 30 s intervals of physical time. Both the detailed transient and the simplified wind field series could be used to correctly locate major sources and predict their emission rates within 10%, while predicting total emission rates from all sources within 24%. This SSWS case would be much easier to implement in a real-world application, and gives rise to the possibility of developing pre-computed databases of both wind and scalar transport adjoints to reduce computational time.
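
    A toy sketch of the underlying idea, not the paper's adjoint solver: if downstream sensor readings depend linearly on candidate source emission rates, c = A q, the rates can be recovered by projected gradient descent on the least-squares misfit, with A^T playing the role the scalar-transport adjoint plays in the paper. The sensitivity matrix and rates below are synthetic.

```python
# Gradient-based recovery of nonnegative source rates from downstream data.
import numpy as np

rng = np.random.default_rng(2)
n_sensors, n_candidates = 40, 10

A = rng.random((n_sensors, n_candidates))   # sensitivity matrix (forward model)
q_true = np.zeros(n_candidates)
q_true[[2, 7]] = [1.5, 0.8]                 # two active fugitive sources
c_obs = A @ q_true                          # noiseless downstream measurements

q = np.zeros(n_candidates)
step = 1.0 / np.linalg.norm(A.T @ A, 2)     # stable step size (1/Lipschitz)
for _ in range(5000):
    grad = A.T @ (A @ q - c_obs)            # adjoint applied to the residual
    q = np.maximum(q - step * grad, 0.0)    # project onto q >= 0

print(np.round(q, 3))
```

    In the paper the forward model is a scalar transport simulation in a 3-D wind field rather than an explicit matrix, and the adjoint supplies the gradient without ever forming A; the optimization structure, however, is the same.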

  17. Computer Forensics Education - the Open Source Approach

    NASA Astrophysics Data System (ADS)

    Huebner, Ewa; Bem, Derek; Cheung, Hon

    In this chapter we discuss the application of open source software tools in computer forensics education at the tertiary level. We argue that open source tools are more suitable than commercial tools, as they provide the opportunity for students to gain an in-depth understanding and appreciation of the computer forensic process, as opposed to familiarity with one software product, however complex and multi-functional. With access to all source programs, the students become more than just consumers of the tools as future forensic investigators. They can also examine the code, understand the relationship between the binary images and relevant data structures, and in the process gain the necessary background to become future creators of new and improved forensic software tools. As a case study we present an advanced subject, Computer Forensics Workshop, which we designed for the Bachelor's degree in computer science at the University of Western Sydney. We based all laboratory work and the main take-home project in this subject on open source software tools. We found that, without exception, more than one suitable tool could be found to cover each topic in the curriculum adequately. We argue that this approach prepares students better for forensic field work, as they gain the confidence to use a variety of tools, not just a single product they are familiar with.

  18. YSOVAR: MID-INFRARED VARIABILITY OF YOUNG STELLAR OBJECTS AND THEIR DISKS IN THE CLUSTER IRAS 20050+2720

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poppenhaeger, K.; Wolk, S. J.; Hora, J. L.

    2015-10-15

    We present a time-variability study of young stellar objects (YSOs) in the cluster IRAS 20050+2720, performed at 3.6 and 4.5 μm with the Spitzer Space Telescope; this study is part of the Young Stellar Object VARiability (YSOVAR) project. We have collected light curves for 181 cluster members over 60 days. We find a high variability fraction among embedded cluster members of ca. 70%, whereas young stars without a detectable disk display variability less often (in ca. 50% of the cases) and with lower amplitudes. We detect periodic variability for 33 sources with periods primarily in the range of 2–6 days. Practically all embedded periodic sources display additional variability on top of their periodicity. Furthermore, we analyze the slopes of the tracks that our sources span in the color–magnitude diagram (CMD). We find that sources with long variability time scales tend to display CMD slopes that are at least partially influenced by accretion processes, while sources with short variability timescales tend to display extinction-dominated slopes. We find a tentative trend of X-ray detected cluster members to vary on longer timescales than the X-ray undetected members.

  19. Gravity and Extreme Magnetism SMEX

    NASA Technical Reports Server (NTRS)

    Swank, Jean; Kallman, Timothy R.; Jahoda, Keith M.

    2008-01-01

    Gas accreting onto black holes and neutron stars forms dynamic systems that generate X-rays with spectroscopic signatures, varying on time scales determined by the system. The radiation from various parts of these systems is surely polarized, and compact sources have been calculated to give rise to net polarization from the unresolved sum of the radiation from the system. Polarization has long been looked to as also bearing the imprint of strong gravity and providing complementary information that could resolve ambiguities between the physical models that can give rise to the observed frequencies, time delays, and spectra. In the cases of both stellar black holes and supermassive black holes, the net polarizations predicted for probable disk and corona models are less than 10%, so sensitivity at that level is needed. This sensitivity can be achieved, even for sources as faint as 1 milliCrab, in the Gravity and Extreme Magnetism SMEX (GEMS) mission, which uses foil mirrors and Time Projection Chamber detectors. Similarities have been pointed out between the timing and spectral characteristics of low mass X-ray binaries and stellar black hole sources. Polarization measurements of these sources could play a role in determining the configuration of the disk and the neutron star.

  20. Influence of ion source configuration and its operation parameters on the target sputtering and implantation process.

    PubMed

    Shalnov, K V; Kukhta, V R; Uemura, K; Ito, Y

    2012-06-01

    In this work, an investigation of the features and operating regimes of a sputter-enhanced ion-plasma source is presented. The source is based on target sputtering by the dense plasma formed in crossed electric and magnetic fields. It allows operation with noble or reactive gases in low-pressure discharge regimes, and the resulting ion beam is a mixture of ions from the working gas and the sputtering target. Any conductive material, such as metals, alloys, or compounds, can be used as the sputtering target. The effectiveness of the target sputtering process was investigated as a function of gun geometry, plasma parameters, and target bias voltage. With an applied accelerating voltage from 0 to 20 kV, the source can be operated in regimes of thin film deposition, ion-beam mixing, and ion implantation. Multi-component ion beam implantation was applied to α-Fe, increasing the surface hardness from 2 GPa in the initial condition up to 3.5 GPa in the case of combined N(2)-C implantation. The projected range of the implanted elements, obtained by XPS depth profiling, is up to 20 nm at an implantation energy of 20 keV.

  1. 18 CFR 4.106 - Standard terms and conditions of case-specific exemption from licensing.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... LICENSES, PERMITS, EXEMPTIONS, AND DETERMINATION OF PROJECT COSTS Exemption of Small Hydroelectric Power Projects of 5 Megawatts or Less § 4.106 Standard terms and conditions of case-specific exemption from licensing. Any case-specific exemption from licensing granted for a small hydroelectric power project is...

  2. 18 CFR 4.106 - Standard terms and conditions of case-specific exemption from licensing.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... LICENSES, PERMITS, EXEMPTIONS, AND DETERMINATION OF PROJECT COSTS Exemption of Small Hydroelectric Power Projects of 5 Megawatts or Less § 4.106 Standard terms and conditions of case-specific exemption from licensing. Any case-specific exemption from licensing granted for a small hydroelectric power project is...

  3. 18 CFR 4.106 - Standard terms and conditions of case-specific exemption from licensing.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... LICENSES, PERMITS, EXEMPTIONS, AND DETERMINATION OF PROJECT COSTS Exemption of Small Hydroelectric Power Projects of 5 Megawatts or Less § 4.106 Standard terms and conditions of case-specific exemption from licensing. Any case-specific exemption from licensing granted for a small hydroelectric power project is...

  4. Teaching Case: Enterprise Architecture Specification Case Study

    ERIC Educational Resources Information Center

    Steenkamp, Annette Lerine; Alawdah, Amal; Almasri, Osama; Gai, Keke; Khattab, Nidal; Swaby, Carval; Abaas, Ramy

    2013-01-01

    A graduate course in enterprise architecture had a team project component in which a real-world business case, provided by an industry sponsor, formed the basis of the project charter and the architecture statement of work. The paper aims to share the team project experience on developing the architecture specifications based on the business case…

  5. A PDF projection method: A pressure algorithm for stand-alone transported PDFs

    NASA Astrophysics Data System (ADS)

    Ghorbani, Asghar; Steinhilber, Gerd; Markus, Detlev; Maas, Ulrich

    2015-03-01

    In this paper, a new formulation of the projection approach is introduced for stand-alone probability density function (PDF) methods. The method is suitable for applications in low-Mach number transient turbulent reacting flows. The method is based on a fractional step method in which first the advection-diffusion-reaction equations are modelled and solved within a particle-based PDF method to predict an intermediate velocity field. Then the mean velocity field is projected onto a space where the continuity for the mean velocity is satisfied. In this approach, a Poisson equation is solved on the Eulerian grid to obtain the mean pressure field. Then the mean pressure is interpolated at the location of each stochastic Lagrangian particle. The formulation of the Poisson equation avoids the time derivatives of the density (due to convection) as well as second-order spatial derivatives. This in turn eliminates the major sources of instability in the presence of stochastic noise that are inherent in particle-based PDF methods. The convergence of the algorithm (in the non-turbulent case) is investigated first by the method of manufactured solutions. Then the algorithm is applied to a one-dimensional turbulent premixed flame in order to assess the accuracy and convergence of the method in the case of turbulent combustion. As a part of this work, we also apply the algorithm to a more realistic flow, namely a transient turbulent reacting jet, in order to assess the performance of the method.
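
    The projection step itself can be sketched in its simplest constant-density form, assuming a 2-D periodic grid and a spectral Poisson solve; the paper's particle-based, variable-density Lagrangian setting is considerably richer, but the divergence-removal mechanics are the same.

```python
# Minimal sketch of the fractional projection step: an intermediate velocity
# field u* is made divergence-free by solving a Poisson equation for the
# pressure and subtracting its gradient. Here on a 2-D periodic grid via FFT.
import numpy as np

n = 64
k = np.fft.fftfreq(n) * n * 2 * np.pi     # wavenumbers for a unit-length domain
kx, ky = np.meshgrid(k, k, indexing="ij")
k2 = kx**2 + ky**2
k2[0, 0] = 1.0                             # avoid division by zero (mean mode)

rng = np.random.default_rng(3)
u_star = rng.standard_normal((n, n))       # intermediate velocity components
v_star = rng.standard_normal((n, n))

u_hat, v_hat = np.fft.fft2(u_star), np.fft.fft2(v_star)
div_hat = 1j * kx * u_hat + 1j * ky * v_hat   # divergence of u*
p_hat = div_hat / (-k2)                        # solve -k^2 p_hat = div_hat
u_hat -= 1j * kx * p_hat                       # correction: u = u* - grad p
v_hat -= 1j * ky * p_hat

div = np.fft.ifft2(1j * kx * u_hat + 1j * ky * v_hat).real
print(f"max |div u| after projection: {np.abs(div).max():.2e}")
```

    The paper's contribution lies in how the Poisson right-hand side is formed from noisy particle fields without density time derivatives or second-order spatial derivatives; the spectral solve above sidesteps those issues only because the toy problem is constant-density and periodic.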

  6. Method for decreasing CT simulation time of complex phantoms and systems through separation of material specific projection data

    NASA Astrophysics Data System (ADS)

    Divel, Sarah E.; Christensen, Soren; Wintermark, Max; Lansberg, Maarten G.; Pelc, Norbert J.

    2017-03-01

    Computer simulation is a powerful tool in CT; however, long simulation times of complex phantoms and systems, especially when modeling many physical aspects (e.g., spectrum, finite detector and source size), hinder the ability to realistically and efficiently evaluate and optimize CT techniques. Long simulation times primarily result from the tracing of hundreds of line integrals through each of the hundreds of geometrical shapes defined within the phantom. However, when the goal is to perform dynamic simulations or test many scan protocols using a particular phantom, traditional simulation methods inefficiently and repeatedly calculate line integrals through the same set of structures although only a few parameters change in each new case. In this work, we have developed a new simulation framework that overcomes such inefficiencies by dividing the phantom into material-specific regions with the same time attenuation profiles, acquiring and storing monoenergetic projections of the regions, and subsequently scaling and combining the projections to create equivalent polyenergetic sinograms. The simulation framework is especially efficient for the validation and optimization of CT perfusion, which requires analysis of many stroke cases and testing hundreds of scan protocols on a realistic and complex numerical brain phantom. Using this updated framework to conduct a 31-time-point simulation with 80 mm of z-coverage of a brain phantom on two 16-core Linux servers, we have reduced the simulation time from 62 hours to under 2.6 hours, a 95% reduction.
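
    The scale-and-combine idea can be sketched as follows, with invented attenuation values and spectrum weights: per-material line integrals are computed once per phantom, then reused to synthesize a polyenergetic sinogram for any spectrum via Beer-Lambert weighting.

```python
# Hedged sketch: reuse per-material line integrals L (computed once) to
# build a polyenergetic sinogram for an arbitrary spectrum. All numbers
# are made up; a real phantom would supply L and tabulated mu values.
import numpy as np

rng = np.random.default_rng(4)
n_rays, n_materials, n_energies = 1000, 3, 5

# Precomputed once per phantom: path length of each ray through each material.
L = rng.random((n_rays, n_materials)) * 2.0      # [cm]

# Per-case inputs: normalized spectrum weights and attenuation mu_m(E) [1/cm].
w = np.array([0.1, 0.3, 0.3, 0.2, 0.1])
mu = rng.random((n_materials, n_energies)) * 0.5

# Polyenergetic sinogram: p = -ln( sum_E w_E * exp(-sum_m mu_m(E) * L_m) )
transmission = np.exp(-L @ mu) @ w               # shape (n_rays,)
sinogram = -np.log(transmission)
print(sinogram.shape)
```

    Changing the spectrum, time point, or contrast-enhancement curve then only rescales and recombines the stored projections, which is where the reported 95% runtime reduction plausibly comes from: the expensive ray tracing is never repeated.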

  7. Case Study of The ARRA-Funded GSHP Demonstration at the Natural Resources Building, Montana Tech

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malhotra, Mini; Liu, Xiaobing

    Under the American Recovery and Reinvestment Act (ARRA), 26 ground source heat pump (GSHP) projects were competitively selected in 2009 to demonstrate the benefits of GSHP systems and innovative technologies for cost reduction and/or performance improvement. One of the selected demonstration projects was proposed by Montana Tech of the University of Montana for a 56,000 sq ft, newly constructed, on-campus research facility – the Natural Resources Building (NRB) located in Butte, Montana. The demonstrated GSHP system consists of a 50 ton water-to-water heat pump and a closed-loop ground heat exchanger with two redundant 7.5 hp constant-speed pumps that use water in the nearby flooded mines as a heat source or heat sink. It works in conjunction with the originally installed steam HX and an air-cooled chiller to provide space heating and cooling. It is coupled with the existing hot water and chilled water piping in the building and operates in the heating or cooling mode based on the outdoor air temperature. The ground loop pumps operate in conjunction with the existing pumps in the building hot and chilled water loops for the operation of the heat pump unit. The goal of this demonstration project is to validate the technical and economic feasibility of the demonstrated commercial-scale GSHP system in the region and to illustrate the feasibility of using mine waters as the heat sink and source for GSHP systems. Should the demonstration prove satisfactory and feasible, it will encourage similar GSHP applications using mine water, thus helping to save energy and reduce carbon emissions. The actual performance of the system is analyzed with available measured data for January through July 2014. The annual energy performance is predicted and compared with a baseline scenario in which heating and cooling are provided by the originally designed systems. The comparison is made in terms of energy savings, operating cost savings, cost-effectiveness, and environmental benefits. Finally, limitations in conducting the analysis are identified and recommendations for improvement in the control and operation of such systems are made.
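    The record compares the GSHP scenario with a baseline in terms of operating cost savings and cost-effectiveness. A minimal Python sketch of that kind of comparison, using hypothetical figures rather than the project's measured data:

```python
def operating_cost_savings(baseline_kwh, gshp_kwh, price_per_kwh):
    """Annual operating cost savings of the GSHP scenario vs. the baseline."""
    return (baseline_kwh - gshp_kwh) * price_per_kwh

def simple_payback(extra_capital_cost, annual_savings):
    """Years to recover the GSHP capital premium from operating savings."""
    return extra_capital_cost / annual_savings

# Hypothetical figures, not the demonstration project's measured data.
savings = operating_cost_savings(420_000, 300_000, 0.10)  # ≈ $12,000/yr
payback = simple_payback(60_000, savings)                 # ≈ 5 years
```

    A fuller analysis would also convert the kWh difference into avoided carbon emissions, the project's stated environmental-benefit metric.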

  8. Medical education and the quality improvement spiral: A case study from Mpumalanga, South Africa

    PubMed Central

    Bergh, Anne-Marie; Etsane, Mama E.; Hugo, Jannie

    2015-01-01

    Background: The short timeframe of medical students’ rotations is not always conducive to successful, in-depth quality-improvement projects requiring a more longitudinal approach. Aim: To describe the process of inducting students into a longitudinal quality-improvement project, using the topic of the Mother- and Baby-Friendly Initiative as a case study; and to explore the possible contribution of a quality-improvement project to the development of student competencies. Setting: Mpumalanga clinical learning centres, where University of Pretoria medical students did their district health rotations. Method: Consecutive student groups had to engage with a hospital's compliance with specific steps of the Ten Steps to Successful Breastfeeding that form the standards for the Mother- and Baby-Friendly Initiative. Primary data sources included an on-site PowerPoint group presentation (n = 42), a written group report (n = 42) and notes of individual interviews in an end-of-rotation objectively structured clinical examination station (n = 139). Results: Activities in each rotation varied according to the needs identified through the application of the quality-improvement cycle in consultation with the local health team. The development of student competencies is described according to the roles of a medical expert in the CanMEDS framework: collaborator, health advocate, scholar, communicator, manager and professional. The exposure to the real-life situation in South African public hospitals had a great influence on many students, who also acted as catalysts for transforming practice. Conclusion: Service learning and quality-improvement projects can be successfully integrated in one rotation and can contribute to the development of the different roles of a medical expert. More studies could provide insight into the potential of this approach in transforming institutions and student learning. PMID:26245606

  9. Risk-based water resources planning: Incorporating probabilistic nonstationary climate uncertainties

    NASA Astrophysics Data System (ADS)

    Borgomeo, Edoardo; Hall, Jim W.; Fung, Fai; Watts, Glenn; Colquhoun, Keith; Lambert, Chris

    2014-08-01

    We present a risk-based approach for incorporating nonstationary probabilistic climate projections into long-term water resources planning. The proposed methodology uses nonstationary synthetic time series of future climates obtained via a stochastic weather generator based on the UK Climate Projections (UKCP09) to construct a probability distribution of the frequency of water shortages in the future. The UKCP09 projections extend well beyond the range of current hydrological variability, providing the basis for testing the robustness of water resources management plans to future climate-related uncertainties. The nonstationary nature of the projections combined with the stochastic simulation approach allows for extensive sampling of climatic variability conditioned on climate model outputs. The probability of exceeding planned frequencies of water shortages of varying severity (defined as Levels of Service for the water supply utility company) is used as a risk metric for water resources planning. Different sources of uncertainty, including demand-side uncertainties, are considered simultaneously and their impact on the risk metric is evaluated. Supply-side and demand-side management strategies can be compared based on how cost-effective they are at reducing risks to acceptable levels. A case study based on a water supply system in London (UK) is presented to illustrate the methodology. Results indicate an increase in the probability of exceeding the planned Levels of Service across the planning horizon. Under a 1% per annum population growth scenario, the probability of exceeding the planned Levels of Service is as high as 0.5 by 2040. The case study also illustrates how a combination of supply and demand management options may be required to reduce the risk of water shortages.
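    The risk metric described in this record, the probability of exceeding a planned frequency of water shortages, can be estimated by Monte Carlo sampling of a stochastic simulator. A minimal Python sketch, with a hypothetical toy simulator standing in for the weather-generator-driven water resource model:

```python
import random

def prob_exceeding_los(simulate_shortage_freq, planned_freq, n_runs=10_000, seed=0):
    """Monte Carlo estimate of the probability that the simulated shortage
    frequency exceeds the planned Level of Service."""
    rng = random.Random(seed)
    exceed = sum(simulate_shortage_freq(rng) > planned_freq for _ in range(n_runs))
    return exceed / n_runs

# Hypothetical stand-in for the weather-generator-driven simulation:
# shortage events per decade, normally distributed.
def toy_simulation(rng):
    return rng.gauss(1.8, 0.9)

risk = prob_exceeding_los(toy_simulation, planned_freq=2.0)
```

    In the paper's setting, each draw would be one nonstationary synthetic climate realization run through the supply-demand model rather than a single random number.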

  10. Conducting a FERC environmental assessment: a case study and recommendations from the Terror Lake Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olive, S.W.; Lamb, B.L.

    This paper is an account of the process that evolved during acquisition of the license to operate the Terror Lake hydroelectric power project under the auspices of the Federal Energy Regulatory Commission (FERC). The Commission is responsible for granting these licenses under the Federal Power Act (16 U.S.C. 792 et seq.). This act provides, in part, that FERC may condition a license to protect the public interest. The public interest in these cases has come to include both instream and terrestrial values. The Terror River is located on Kodiak Island in Alaska. The river is within the Kodiak National Wildlife Refuge; it supports excellent runs of several species of Pacific Salmon which are both commercially important and a prime source of nutrition for the Kodiak brown bear. The river is also a prime resource for generating electric power. One major concern in the negotiations was the impact of land disturbance and management practices on brown bear habitat - i.e., protection of the brown bear. Maintenance of the bears' habitat is the main purpose of the Kodiak National Wildlife Refuge. But, like many other projects, resolving the instream flow issue was of major importance in the issuance of the FERC license. This paper discusses the fish and wildlife questions, but concentrates on instream uses and how protection of these uses was decided. With this as a focus, the paper explains the FERC process, gives a history of the Terror Lake Project, and, ultimately, makes recommendations for improved management of controversies within the context of the FERC licensing procedures. 65 references.

  11. Mitigating artifacts in back-projection source imaging with implications for frequency-dependent properties of the Tohoku-Oki earthquake

    NASA Astrophysics Data System (ADS)

    Meng, Lingsen; Ampuero, Jean-Paul; Luo, Yingdi; Wu, Wenbo; Ni, Sidao

    2012-12-01

    Comparing teleseismic array back-projection source images of the 2011 Tohoku-Oki earthquake with results from static and kinematic finite source inversions has revealed little overlap between the regions of high- and low-frequency slip. Motivated by this interesting observation, back-projection studies extended to intermediate frequencies, down to about 0.1 Hz, have suggested that a progressive transition of rupture properties as a function of frequency is observable. Here, by adapting the concept of array response function to non-stationary signals, we demonstrate that the "swimming artifact", a systematic drift resulting from signal non-stationarity, induces significant bias on beamforming back-projection at low frequencies. We introduce a "reference window strategy" into the multitaper-MUSIC back-projection technique and significantly mitigate the "swimming artifact" at high frequencies (1 s to 4 s). At lower frequencies, this modification yields notable, but significantly smaller, artifacts than time-domain stacking. We perform extensive synthetic tests that include a 3D regional velocity model for Japan. We analyze the recordings of the Tohoku-Oki earthquake at the USArray and at the European array at periods from 1 s to 16 s. The migration of the source location as a function of period, regardless of the back-projection methods, has characteristics that are consistent with the expected effect of the "swimming artifact". In particular, the apparent up-dip migration as a function of frequency obtained with the USArray can be explained by the "swimming artifact". This indicates that the most substantial frequency-dependence of the Tohoku-Oki earthquake source occurs at periods longer than 16 s. Thus, low-frequency back-projection needs to be further tested and validated in order to contribute to the characterization of frequency-dependent rupture properties.
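    As background for readers unfamiliar with the technique, here is a minimal Python sketch of plain time-domain stack back-projection, the simplest of the methods compared in this record (the multitaper-MUSIC variant and the "reference window strategy" are not shown). All inputs are toy values:

```python
import numpy as np

def backproject(waveforms, travel_times, dt):
    """Time-domain stack back-projection: delay each station's record by its
    predicted travel time to a candidate source point, stack the records,
    and report the peak stack amplitude for that point."""
    n_sta, n_samp = waveforms.shape
    power = np.empty(len(travel_times))
    for i, tt in enumerate(travel_times):       # one travel-time row per grid point
        shifts = np.round(np.asarray(tt) / dt).astype(int)
        stack = np.zeros(n_samp)
        for trace, s in zip(waveforms, shifts):
            stack[: n_samp - s] += trace[s:]    # undo the propagation delay
        power[i] = np.max(np.abs(stack))
    return power

# Toy data: impulses consistent with grid point 0 (travel times 3 s and 5 s).
dt = 1.0
waveforms = np.zeros((2, 10))
waveforms[0, 3] = waveforms[1, 5] = 1.0
power = backproject(waveforms, [[3.0, 5.0], [1.0, 1.0]], dt)  # peak at point 0
```

    The "swimming artifact" discussed above arises because real signals are non-stationary, so the stack's peak can drift systematically even for a fixed source, an effect this stationary toy example does not reproduce.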

  12. Innovative Approaches to Collaborative Groundwater Governance in the United States: Case Studies from Three High-Growth Regions in the Sun Belt.

    PubMed

    Megdal, Sharon B; Gerlak, Andrea K; Huang, Ling-Yee; Delano, Nathaniel; Varady, Robert G; Petersen-Perlman, Jacob D

    2017-05-01

    Groundwater is an increasingly important source of freshwater, especially where surface water resources are fully or over-allocated or becoming less reliable due to climate change. Groundwater reliance has created new challenges for sustainable management. This article examines how regional groundwater users coordinate and collaborate to manage shared groundwater resources, including attention to what drives collaboration. To identify and illustrate these facets, this article examines three geographically diverse cases of groundwater governance and management from the United States Sun Belt: Orange County Water District in southern California; Prescott Active Management Area in north-central Arizona; and the Central Florida Water Initiative in central Florida. These regions have different surface water laws, groundwater allocation and management laws and regulations, demographics, economics, topographies, and climate. These cases were selected because the Sun Belt faces similar pressures on groundwater due to historical and projected population growth and limited availability of usable surface water supplies. Collectively, they demonstrate groundwater governance trends in the United States, and illustrate distinctive features of regional groundwater management strategies. Our research shows how geophysical realities and state-level legislation have enabled and/or stimulated regions to develop groundwater management plans and strategies to address the specific issues associated with their groundwater resources. We find that litigation involvement and avoidance, along with the need to finance projects, are additional drivers of regional collaboration to manage groundwater. This case study underscores the importance of regionally coordinated and sustained efforts to address serious groundwater utilization challenges faced by the regions studied and around the world.

  13. Innovative Approaches to Collaborative Groundwater Governance in the United States: Case Studies from Three High-Growth Regions in the Sun Belt

    NASA Astrophysics Data System (ADS)

    Megdal, Sharon B.; Gerlak, Andrea K.; Huang, Ling-Yee; Delano, Nathaniel; Varady, Robert G.; Petersen-Perlman, Jacob D.

    2017-05-01

    Groundwater is an increasingly important source of freshwater, especially where surface water resources are fully or over-allocated or becoming less reliable due to climate change. Groundwater reliance has created new challenges for sustainable management. This article examines how regional groundwater users coordinate and collaborate to manage shared groundwater resources, including attention to what drives collaboration. To identify and illustrate these facets, this article examines three geographically diverse cases of groundwater governance and management from the United States Sun Belt: Orange County Water District in southern California; Prescott Active Management Area in north-central Arizona; and the Central Florida Water Initiative in central Florida. These regions have different surface water laws, groundwater allocation and management laws and regulations, demographics, economics, topographies, and climate. These cases were selected because the Sun Belt faces similar pressures on groundwater due to historical and projected population growth and limited availability of usable surface water supplies. Collectively, they demonstrate groundwater governance trends in the United States, and illustrate distinctive features of regional groundwater management strategies. Our research shows how geophysical realities and state-level legislation have enabled and/or stimulated regions to develop groundwater management plans and strategies to address the specific issues associated with their groundwater resources. We find that litigation involvement and avoidance, along with the need to finance projects, are additional drivers of regional collaboration to manage groundwater. This case study underscores the importance of regionally coordinated and sustained efforts to address serious groundwater utilization challenges faced by the regions studied and around the world.

  14. Continental-scale assessment of long-term trends in wet deposition trajectories: Role of anthropogenic and hydro-climatic drivers

    NASA Astrophysics Data System (ADS)

    Park, J.; Gall, H. E.; Niyogi, D.; Rao, S.

    2012-12-01

    The global trend of increased urbanization, and associated increased intensity of energy and material consumption and waste emissions, has contributed to shifts in the trajectories of aquatic, terrestrial, and atmospheric environments. Here, we focus on continental-scale spatiotemporal patterns in two atmospheric constituents (nitrate and sulfate), whose global biogeochemical cycles have been dramatically altered by emissions from mobile and fixed sources in urbanized and industrialized regions. The observed patterns in wet deposition fluxes of nitrate and sulfate are controlled by (1) natural hydro-climatic forcing, and (2) anthropogenic forcing (emissions and regulatory control), both of which are characterized by stochasticity and non-stationarity. We examine long-term wet deposition records in the U.S., Europe, and East Asia to evaluate how anthropogenic and natural forcing factors jointly contributed to the shifting temporal patterns of wet deposition fluxes at continental scales. These data offer clear evidence that successful implementation of regulatory controls and widespread adoption of technologies contributed to improving water quality and mitigating adverse ecological impacts. We developed a stochastic model to project the future trajectories of wet deposition fluxes in emerging countries with fast-growing urban areas. The model generates ellipses within which projected wet deposition flux trajectories are inscribed, similar to the trends in observational data. The shape of the ellipses provides information regarding the relative dominance of anthropogenic (e.g., industrial and urban emissions) versus hydro-climatic drivers (e.g., rainfall patterns, aridity index). Our analysis facilitates projections of the trajectory shift as a result of urbanization and other land-use changes, climate change, and regulatory enforcement.
We use these observed data and the model to project likely trajectories for rapidly developing countries (BRIC), with a particular emphasis on various approaches to sustainable economic development. Brazil represents the case of a shift to alternative energy sources (bioethanol and hydroelectric power), while India and China are on the fossil-fuel-dependent trajectories that North America and Europe followed. The rapid increase in population, urbanization, and economic development of African cities presents an interesting case study for the choices available for sustainable development, closer to the path of Brazil than to that followed by India and China. Coordinated air quality monitoring at urban and reference sites needs to be established to follow these fast-changing conditions.

  15. Comparing the use of health information/advice in Birmingham and Hull: a case study of digital health information delivered via the television.

    PubMed

    Nicholas, David; Huntington, Paul; Gunter, Barrie; Williams, Peter

    2003-01-01

    Postal questionnaire surveys were carried out with users of two digital interactive television (DiTV) providers of health content to investigate the use made of each service and the users' reactions to service content and its usefulness to them. The research indicated that health information on DiTV was used and, on the whole, rated favourably. There was some evidence also that such information might be used by some people as a substitute for going to the doctor, though information from their general practitioner (GP) or practice nurse still carries more weight for most people than any other health information source. This study forms part of an ongoing research project which has, as part of its aim, the task of identifying particular users with the information sources that may be most appropriate for them.

  16. Integration of Multiple Data Sources to Simulate the Dynamics of Land Systems

    PubMed Central

    Deng, Xiangzheng; Su, Hongbo; Zhan, Jinyan

    2008-01-01

    In this paper we present and develop a new model, which we have called Dynamics of Land Systems (DLS). The DLS model is capable of integrating multiple data sources to simulate the dynamics of a land system. Three main modules are incorporated in DLS: a spatial regression module, to explore the relationship between land uses and influencing factors, a scenario analysis module of the land uses of a region during the simulation period and a spatial disaggregation module, to allocate land use changes from a regional level to disaggregated grid cells. A case study on Taips County in North China is incorporated in this paper to test the functionality of DLS. The simulation results under the baseline, economic priority and environmental scenarios help to understand the land system dynamics and project near future land-use trajectories of a region, in order to focus management decisions on land uses and land use planning. PMID:27879726
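    The spatial disaggregation module described above allocates regional land-use change to grid cells. A minimal Python sketch of proportional allocation by a per-cell suitability weight (hypothetical weights; DLS derives its allocation from the spatial regression module rather than fixed weights):

```python
def disaggregate(regional_change, suitability):
    """Allocate a regional land-use change total to grid cells in
    proportion to each cell's suitability weight."""
    total = sum(suitability)
    return [regional_change * s / total for s in suitability]

# Four hypothetical grid cells sharing 100 km^2 of projected change.
alloc = disaggregate(100.0, [0.1, 0.2, 0.3, 0.4])  # ≈ [10, 20, 30, 40]
```

    The allocation conserves the regional total by construction, which is the property that links the scenario module's regional figures to the grid-cell level.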

  17. Towards an optimal adaptation of exposure to NOAA assessment methodology in Multi-Source Industrial Scenarios (MSIS): the challenges and the decision-making process

    NASA Astrophysics Data System (ADS)

    López de Ipiña, JM; Vaquero, C.; Gutierrez-Cañas, C.

    2017-06-01

    A progressive increase is expected in the number of industrial processes manufacturing intermediate products (iNEPs) and end products (eNEPs) that incorporate ENMs to achieve improved properties. The assessment of occupational exposure to airborne NOAA will therefore migrate from the simple and well-controlled exposure scenarios found in research laboratories and ENM production plants using innovative production technologies to much more complex exposure scenarios around eNEP manufacturing processes, which in many cases will be modified conventional production processes. We discuss some of the typical challenges arising in the risk assessment of inhalation exposure to NOAA in Multi-Source Industrial Scenarios (MSIS), based on lessons learned when confronting such scenarios in several European and Spanish research projects.

  18. Detecting Shielded Special Nuclear Materials Using Multi-Dimensional Neutron Source and Detector Geometries

    NASA Astrophysics Data System (ADS)

    Santarius, John; Navarro, Marcos; Michalak, Matthew; Fancher, Aaron; Kulcinski, Gerald; Bonomo, Richard

    2016-10-01

    A newly initiated research project will be described that investigates methods for detecting shielded special nuclear materials by combining multi-dimensional neutron sources, forward/adjoint calculations modeling neutron and gamma transport, and sparse data analysis of detector signals. The key tasks for this project are: (1) developing a radiation transport capability for use in optimizing adaptive-geometry, inertial-electrostatic confinement (IEC) neutron source/detector configurations for neutron pulses distributed in space and/or phased in time; (2) creating distributed-geometry, gas-target, IEC fusion neutron sources; (3) applying sparse data and noise reduction algorithms, such as principal component analysis (PCA) and wavelet transform analysis, to enhance detection fidelity; and (4) educating graduate and undergraduate students. Funded by DHS DNDO Project 2015-DN-077-ARI095.

  19. Comparison of Different EO Sensors for Mapping Tree Species- A Case Study in Southwest Germany

    NASA Astrophysics Data System (ADS)

    Enßle, Fabian; Kattenborn, Teja; Koch, Barbara

    2014-11-01

    The variety of remote sensing sensors, and thus of available data specifications, is increasing continuously. In particular, differences in the geometric, radiometric and temporal resolutions of different platforms affect their suitability for mapping forests. These differences hinder the comparability of different remotely sensed data and the application of uniform methods across the same region of interest. The quality and quantity of retrieved forest parameters is directly dependent on the data source, and therefore the objective of this project is to analyse the relationship between the data source and its derived parameters. A comparison of different optical EO data (e.g. the spatial resolution and spectral resolution of specific bands) will help to define the optimum data sets for a reproducible method providing additional inputs to the Dragon cooperative project, specifically to method development for woody biomass estimation and biodiversity assessment services. This poster presents the first results on tree species mapping in a mixed temperate forest using satellite imagery from four different sensors. Tree species addressed in this pilot study are Scots pine (Pinus sylvestris), sessile oak (Quercus petraea) and red oak (Quercus rubra). The spatial resolution varies from 2 m to 30 m and the spectral resolution ranges from 8 bands up to 155 bands.

  1. The puzzle of the 1996 Bárdarbunga, Iceland, earthquake: no volumetric component in the source mechanism

    USGS Publications Warehouse

    Tkalcic, Hrvoje; Dreger, Douglas S.; Foulger, Gillian R.; Julian, Bruce R.

    2009-01-01

    A volcanic earthquake with Mw 5.6 occurred beneath the Bárdarbunga caldera in Iceland on 29 September 1996. This earthquake is one of a decade-long sequence of events at Bárdarbunga with non-double-couple mechanisms in the Global Centroid Moment Tensor catalog. Fortunately, it was recorded well by the regional-scale Iceland Hotspot Project seismic experiment. We investigated the event with a complete moment tensor inversion method using regional long-period seismic waveforms and a composite structural model. The moment tensor inversion using data from stations of the Iceland Hotspot Project yields a non-double-couple solution with a 67% vertically oriented compensated linear vector dipole component, a 32% double-couple component, and a statistically insignificant (2%) volumetric (isotropic) contraction. This indicates the absence of a net volumetric component, which is puzzling in the case of a large volcanic earthquake that apparently is not explained by shear slip on a planar fault. A possible volcanic mechanism that can produce an earthquake without a volumetric component involves two offset sources with similar but opposite volume changes. We show that although such a model cannot be ruled out, the circumstances under which it could happen are rare.

  2. Drivers of seabird population recovery on New Zealand islands after predator eradication.

    PubMed

    Buxton, Rachel T; Jones, Christopher; Moller, Henrik; Towns, David R

    2014-04-01

    Eradication of introduced mammalian predators from islands has become increasingly common, with over 800 successful projects around the world. Historically, introduced predators extirpated or reduced the size of many seabird populations, changing the dynamics of entire island ecosystems. Although the primary outcome of many eradication projects is the restoration of affected seabird populations, natural population responses are rarely documented and mechanisms are poorly understood. We used a generic model of seabird colony growth to identify key predictor variables relevant to recovery or recolonization. We used generalized linear mixed models to test the importance of these variables in driving seabird population responses after predator eradication on islands around New Zealand. The most influential variable affecting recolonization of seabirds around New Zealand was the distance to a source population, with few cases of recolonization without a source population ≤25 km away. Colony growth was most affected by metapopulation status; there was little colony growth in species with a declining status. These characteristics may facilitate the prioritization of newly predator-free islands for active management. Although we found some evidence documenting natural recovery, generally this topic was understudied. Our results suggest that in order to guide management strategies, more effort should be allocated to monitoring wildlife response after eradication. © 2014 Society for Conservation Biology.

  3. Accuracy of Dual-Energy Virtual Monochromatic CT Numbers: Comparison between the Single-Source Projection-Based and Dual-Source Image-Based Methods.

    PubMed

    Ueguchi, Takashi; Ogihara, Ryota; Yamada, Sachiko

    2018-03-21

    To investigate the accuracy of dual-energy virtual monochromatic computed tomography (CT) numbers obtained by two typical hardware and software implementations: the single-source projection-based method and the dual-source image-based method. A phantom with different tissue equivalent inserts was scanned with both single-source and dual-source scanners. A fast kVp-switching feature was used on the single-source scanner, whereas a tin filter was used on the dual-source scanner. Virtual monochromatic CT images of the phantom at energy levels of 60, 100, and 140 keV were obtained by both projection-based (on the single-source scanner) and image-based (on the dual-source scanner) methods. The accuracy of virtual monochromatic CT numbers for all inserts was assessed by comparing measured values to their corresponding true values. Linear regression analysis was performed to evaluate the dependency of measured CT numbers on tissue attenuation, method, and their interaction. Root mean square values of systematic error over all inserts at 60, 100, and 140 keV were approximately 53, 21, and 29 Hounsfield units (HU) with the single-source projection-based method, and 46, 7, and 6 HU with the dual-source image-based method, respectively. Linear regression analysis revealed that the interaction between the attenuation and the method had a statistically significant effect on the measured CT numbers at 100 and 140 keV. There were attenuation-, method-, and energy level-dependent systematic errors in the measured virtual monochromatic CT numbers. CT number reproducibility was comparable between the two scanners, and CT numbers had better accuracy with the dual-source image-based method at 100 and 140 keV. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
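    The root-mean-square systematic error reported above is straightforward to compute. A minimal Python sketch, with hypothetical HU values in place of the study's data:

```python
import math

def rms_systematic_error(measured, true_values):
    """Root mean square of (measured - true) CT numbers over the inserts."""
    errs = [m - t for m, t in zip(measured, true_values)]
    return math.sqrt(sum(e * e for e in errs) / len(errs))

# Hypothetical HU values for four tissue-equivalent inserts at one
# energy level (not the paper's measurements).
true_hu     = [-100.0, 0.0, 55.0, 900.0]
measured_hu = [ -95.0, 4.0, 60.0, 912.0]
rms = rms_systematic_error(measured_hu, true_hu)  # ≈ 7.25 HU
```

    Squaring before averaging means a single badly biased insert (here the 900 HU one) dominates the reported error, which is why per-insert inspection accompanies the summary statistic.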

  4. Do transportation subsidies and living allowances improve tuberculosis control outcomes among internal migrants in urban Shanghai, China?

    PubMed

    Lu, Hui; Yan, Fei; Wang, Wei; Wu, Laiwa; Ma, Weiping; Chen, Jing; Shen, Xin; Mei, Jian

    2013-01-01

    Tuberculosis (TB) in internal migrants is one of three threats for TB control in China. To address this threat, a project was launched in eight of the 19 districts of Shanghai in 2007 to provide transportation subsidies and living allowances for all migrant TB cases. This study aims to determine if this project contributed to improved TB control outcomes among migrants in urban Shanghai. This was a community intervention study. The data were derived from the TB Management Information System in three project districts and three non-project districts in Shanghai between 2006 and 2010. The impact of the project was estimated in a difference-in-difference (DID) analysis framework, and a multivariable binary logistic regression analysis. A total of 1872 pulmonary TB (PTB) cases in internal migrants were included in the study. The treatment success rate (TSR) for migrant smear-positive cases in project districts increased from 59.9% in 2006 to 87.6% in 2010 (P < 0.001). The crude DID improvement of TSR was 18.9%. There was an increased probability of TSR in the project group before and after the project intervention period (coefficient = 1.156, odds ratio = 3.178, 95% confidence interval: 1.305-7.736, P = 0.011). The study showed the project could improve treatment success in migrant PTB cases. This was a short-term programme using special financial subsidies for all migrant PTB cases. It is recommended that project funds be continuously invested by governments with particular focus on the more vulnerable PTB cases among migrants.
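    The crude difference-in-differences (DID) estimate used in this study can be sketched in a few lines of Python. The project-district treatment success rates are taken from the abstract (59.9% to 87.6%); the control-district rates below are hypothetical, chosen only so the toy example reproduces the reported 18.9% crude DID improvement:

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Crude difference-in-differences: the before/after change in the
    project (treated) group minus the change in the control group."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Treatment success rates (%): project districts from the abstract,
# non-project (control) districts hypothetical.
did = did_estimate(59.9, 87.6, 70.0, 78.8)  # ≈ 18.9 percentage points
```

    Subtracting the control group's change nets out secular improvements in TB care that would have occurred without the subsidies, which is the point of the DID framework.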

  5. WATERPROTECT: Innovative tools enabling drinking water protection in rural and urban environments

    NASA Astrophysics Data System (ADS)

    Seuntjens, Piet; Campling, Paul; Joris, Ingeborg; Wauters, Erwin; Lopez de Alda, Miren; Kuczynska, Anna; Lajer Hojberg, Anker; Capri, Ettore; Brabyn, Cristina; Boeckaert, Charlotte; Mellander, Per Erik; Pauwelyn, Ellen; Pop, Edit

    2017-04-01

    High-quality, safe, and sufficient drinking water is essential for life: we use it for drinking, food preparation and cleaning. Agriculture is the biggest source of pesticides and nitrate pollution in European fresh waters. The overarching objective of the recently approved H2020 project WATERPROTECT is to contribute to effective uptake and realisation of management practices and mitigation measures to protect drinking water resources. Therefore WATERPROTECT will create an integrative multi-actor participatory framework including innovative instruments that enable actors to monitor, to finance and to effectively implement management practices and measures for the protection of water sources. We propose seven case studies involving multiple actors in implementing good practices (land management, farming, product stewardship, point source pollution prevention) to ensure safe drinking water supply. The seven case studies cover different pedo-climatic conditions, different types of farming systems, different legal frameworks, larger and smaller water collection areas across the EU. In close cooperation with actors in the field in the case studies (farmers associations, local authorities, water producing companies, private water companies, consumer organisations) and other stakeholders (fertilizer and plant protection industry, environment agencies, nature conservation agencies, agricultural administrations) at local and EU level, WATERPROTECT will develop innovative water governance models investigating alternative pathways from focusing on the 'costs of water treatment' to 'rewarding water quality delivering farming systems'. Water governance structures will be built upon cost-efficiency analysis related to mitigation and cost-benefit analysis for society, and will be supported by spatially explicit GIS analyses and predictive models that account for temporal and spatial scaling issues. 
The outcome will be improved participatory methods and public policy instruments to protect drinking water resources.

  6. Testing Murphy's Law: Urban Myths as a Source of School Science Projects.

    ERIC Educational Resources Information Center

    Matthews, Robert A. J.

    2001-01-01

    Discusses the urban myth that "If toast can land butter-side down, it will" as an example of a source of projects demonstrating the use of the scientific method beyond its usual settings. Other urban myths suitable for investigation are discussed. (Author/MM)

  7. ProFound: Source Extraction and Application to Modern Survey Data

    NASA Astrophysics Data System (ADS)

    Robotham, A. S. G.; Davies, L. J. M.; Driver, S. P.; Koushan, S.; Taranu, D. S.; Casura, S.; Liske, J.

    2018-05-01

    We introduce PROFOUND, a source-finding and image-analysis package. PROFOUND provides methods to detect sources in noisy images, generate segmentation maps identifying the pixels belonging to each source, and measure statistics such as flux, size, and ellipticity. These inputs are key requirements of PROFIT, our recently released galaxy profiling package; the design aim is for the two packages to be used in unison to semi-automatically profile large samples of galaxies. The key novel feature of PROFOUND is that all photometry is executed on dilated segmentation maps that fully contain the identifiable flux, rather than on more traditional circular or elliptical apertures. In addition, to reduce sensitivity to pathological segmentation issues, de-blending is performed across saddle points in flux. We apply PROFOUND to a number of simulated and real-world cases and demonstrate that it behaves reasonably given its stated design goals. In particular, it offers good initial parameter estimation for PROFIT, as well as segmentation maps that follow the sometimes complex geometry of resolved sources whilst capturing nearly all of the flux. A number of bulge-disc decomposition projects are already making use of the PROFOUND and PROFIT pipeline, and adoption is being encouraged by publicly releasing the software for the open-source R data analysis platform under an LGPL-3 license on GitHub (github.com/asgr/ProFound).
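The dilated-segmentation photometry idea can be illustrated with a toy sketch in pure Python (the image values, threshold, and stopping tolerance below are invented for illustration; ProFound itself is an R package with a far more sophisticated implementation): grow the detected segment outwards until one more dilation no longer adds appreciable flux, then sum flux over the final segment instead of a fixed aperture.

```python
# Toy 9x9 image: a bright 3x3 core (1.0) whose faint wings (0.1) extend
# over a 5x5 region; everything outside is empty sky. Values are invented.
H = W = 9
img = [[0.0] * W for _ in range(H)]
for r in range(2, 7):
    for c in range(2, 7):
        img[r][c] = 0.1
for r in range(3, 6):
    for c in range(3, 6):
        img[r][c] = 1.0

def flux(seg):
    """Sum the image over a set of (row, col) pixels."""
    return sum(img[r][c] for r, c in seg)

def dilate(seg):
    """Grow a segment by its 4-connected neighbours."""
    out = set(seg)
    for r, c in seg:
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= r + dr < H and 0 <= c + dc < W:
                out.add((r + dr, c + dc))
    return out

# Initial segment: pixels detected above a high threshold (the core only).
seg = {(r, c) for r in range(H) for c in range(W) if img[r][c] > 0.5}
core_flux = flux(seg)

# Dilate until the flux gained by one more dilation is negligible.
while True:
    grown = dilate(seg)
    if flux(grown) - flux(seg) < 0.05:
        break
    seg = grown

print(core_flux, flux(seg))
```

The threshold segment misses the faint wings; the dilated segment captures them, which is the motivation for dilated-segment rather than fixed-aperture photometry.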

  8. Beyond User Acceptance: A Legitimacy Framework for Potable Water Reuse in California.

    PubMed

    Harris-Lovett, Sasha R; Binz, Christian; Sedlak, David L; Kiparsky, Michael; Truffer, Bernhard

    2015-07-07

    Water resource managers often tout the potential of potable water reuse to provide a reliable, local source of drinking water in water-scarce regions. Despite data documenting the ability of advanced treatment technologies to treat municipal wastewater effluent to meet existing drinking water quality standards, many utilities face skepticism from the public about potable water reuse. Prior research on this topic has mainly focused on marketing strategies for garnering public acceptance of the process. This study takes a broader perspective on the adoption of potable water reuse based on concepts of societal legitimacy, which is the generalized perception or assumption that a technology is desirable or appropriate within its social context. To assess why some potable reuse projects were successfully implemented while others faced fierce public opposition, we performed a series of 20 expert interviews and reviewed in-depth case studies from potable reuse projects in California. Results show that proponents of a legitimated potable water reuse project in Orange County, California engaged in a portfolio of strategies that addressed three main dimensions of legitimacy. In contrast, other proposed projects that faced extensive public opposition relied on a smaller set of legitimation strategies that focused near-exclusively on the development of robust water treatment technology. Widespread legitimation of potable water reuse projects, including direct potable water reuse, may require the establishment of a portfolio of standards, procedures, and possibly new institutions.

  9. Automated source classification of new transient sources

    NASA Astrophysics Data System (ADS)

    Oertel, M.; Kreikenbohm, A.; Wilms, J.; DeLuca, A.

    2017-10-01

    The EXTraS project harvests the hitherto unexplored temporal-domain information buried in the serendipitous data collected by the European Photon Imaging Camera (EPIC) onboard the ESA XMM-Newton mission since its launch. This includes a search for fast transients missed by standard image analysis, and a search for and characterization of variability in hundreds of thousands of sources. We present an automated classification scheme for new transient sources in the EXTraS project. Classification features of a training sample are used to train machine-learning algorithms (random forests in supervised mode, implemented in R with the randomForest package; Breiman, 2001), which are then tested on a sample of known source classes and used for classification.
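The supervised workflow described here — train on labelled features, validate on sources of known class, then classify new detections — can be sketched in miniature. The features, class labels, and the decision-stump "forest" below are invented stand-ins (the EXTraS pipeline uses the R randomForest package on real X-ray timing features):

```python
import random
from collections import Counter

# Invented training features: (hardness, variability index) -> class label.
TRAIN = [
    ((0.80, 0.90), "AGN"), ((0.70, 0.85), "AGN"), ((0.90, 0.70), "AGN"),
    ((0.20, 0.10), "star"), ((0.10, 0.20), "star"), ((0.30, 0.15), "star"),
]

def fit_stump(sample):
    """Best single-feature threshold split, scored by training errors."""
    best = None
    for f in range(2):
        for x, _ in sample:
            t = x[f]
            left = [lab for x2, lab in sample if x2[f] <= t]
            right = [lab for x2, lab in sample if x2[f] > t]
            if not left or not right:
                continue
            lmaj = Counter(left).most_common(1)[0][0]
            rmaj = Counter(right).most_common(1)[0][0]
            err = sum(l != lmaj for l in left) + sum(r != rmaj for r in right)
            if best is None or err < best[0]:
                best = (err, f, t, lmaj, rmaj)
    if best is None:  # degenerate bootstrap sample: fall back to majority class
        lab = Counter(lab for _, lab in sample).most_common(1)[0][0]
        return (0, float("inf"), lab, lab)
    return best[1:]

def fit_forest(train, n_trees=25, seed=1):
    """Each 'tree' (a stump here) sees a bootstrap resample of the training set."""
    rng = random.Random(seed)
    return [fit_stump([rng.choice(train) for _ in train]) for _ in range(n_trees)]

def predict(forest, x):
    """Majority vote over the stumps."""
    votes = Counter(l if x[f] <= t else r for f, t, l, r in forest)
    return votes.most_common(1)[0][0]

forest = fit_forest(TRAIN)
print(predict(forest, (0.85, 0.75)))  # hard and variable
print(predict(forest, (0.15, 0.10)))  # soft and quiet
```

Real random forests use full decision trees with random feature subsets at each split; the bootstrap-plus-majority-vote structure is the same.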

  10. Comparison of Different Approach of Back Projection Method in Retrieving the Rupture Process of Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Tan, F.; Wang, G.; Chen, C.; Ge, Z.

    2016-12-01

    Back-projection of teleseismic P waves [Ishii et al., 2005] has been widely used to image the rupture of earthquakes. In addition to conventional narrowband beamforming in the time domain, frequency-domain approaches such as MUSIC back projection (Meng, 2011) and compressive sensing (Yao et al., 2011) have been proposed to improve resolution. Each method has advantages and disadvantages and is appropriate in different cases, so a thorough comparison and testing of these methods is needed. We wrote a GUI program that integrates the three methods, so that the same data can be conveniently processed with each method and the results compared. We then applied all the methods to several earthquakes, including the 2008 Wenchuan Mw 7.9 and 2011 Tohoku-Oki Mw 9.0 events, and to synthetic seismograms of both simple sources and complex ruptures. Our results show differences in efficiency, accuracy, and stability among the methods. Quantitative and qualitative analyses measure their dependence on data and parameters such as station number, station distribution, grid size, and calculation window length. In general, back projection can produce a good result very quickly from fewer than 20 high-quality records with a proper station distribution, but the swimming artifact can be significant; measures such as combining global seismic data can help ameliorate it. MUSIC back projection needs relatively more data to obtain a better and more stable result, and therefore takes much longer, since its runtime grows with station number markedly faster than that of conventional back projection. Compressive sensing deals more effectively with multiple sources in the same time window, but costs the most time because it repeatedly solves matrix equations. The resolution of all the methods is complicated and depends on many factors. 
An important one is the grid size, which in turn influences runtime significantly. The detailed results of this study may help users choose appropriate data, methods, and parameters.
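The core delay-and-sum idea behind time-domain back projection can be sketched in one dimension (pure Python; the station geometry, wave speed, and wavelet are invented): for each candidate source position, undo the predicted moveout at every station, stack the traces, and keep the position where the stack is most coherent.

```python
import math

dt = 0.01                                  # sampling interval (s)
v = 5.0                                    # wave speed (km/s)
stations = [0.0, 3.0, 6.0, 9.0, 12.0]      # station positions (km)
true_src = 5.0                             # source position to recover (km)
n = 400                                    # samples per trace (4 s)

def wavelet(t):
    """Bell-shaped pulse emitted at t = 1 s."""
    return math.exp(-((t - 1.0) / 0.05) ** 2)

# Synthetic traces: the pulse arrives later at more distant stations.
traces = [[wavelet(i * dt - abs(xs - true_src) / v) for i in range(n)]
          for xs in stations]

def beam_power(x):
    """Delay-and-sum: align traces for candidate x, stack, sum squared stack."""
    power = 0.0
    for i in range(n):
        stack = 0.0
        for trace, xs in zip(traces, stations):
            j = i + int(round(abs(xs - x) / v / dt))  # undo predicted moveout
            if j < n:
                stack += trace[j]
        power += stack ** 2
    return power

grid = [float(g) for g in range(13)]       # 1-km candidate grid
best = max(grid, key=beam_power)
print(best)
```

Only at the true position do all five shifted traces align, so the stacked energy peaks there; MUSIC and compressive-sensing variants replace this stacking with subspace or sparse inversions over the same grid.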

  11. Systems Engineering Technical Leadership Development Program

    DTIC Science & Technology

    2012-02-01

    leading others in creative problem solving, complexity, and why projects fail. These topics were additionally supported by case studies designed to...Your Core Values Dominick Wed 12:30-1:30 Lunch Wed 1:30-2:45 Case Study: Why Projects Fail Pennotti Wed 2:45-3:00 Break Wed 3:00-4:30 Project...Case Study: When Good Wasn’t Good Enough 11. Technical Value-5: Group Project: AR2D2 RFP 12. Customer Expectation-1: Lecture: Why Systems Fail

  12. Assessment of online public opinions on large infrastructure projects: A case study of the Three Gorges Project in China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Hanchen, E-mail: jhc13@mails.tsinghua.edu.cn; Qiang, Maoshan, E-mail: qiangms@tsinghua.edu.cn; Lin, Peng, E-mail: celinpe@mail.tsinghua.edu.cn

    Public opinion becomes increasingly salient in the ex post evaluation stage of large infrastructure projects, which have significant impacts on the environment and society. However, traditional survey methods are inefficient for collecting and assessing public opinion because of its large quantity and diversity. Recently, social media platforms have come to provide a rich data source for monitoring and assessing public opinion on controversial infrastructure projects. This paper proposes an assessment framework to transform unstructured online public opinions on large infrastructure projects into sentimental and topical indicators for enhancing practices of ex post evaluation and public participation. The framework uses web crawlers to collect online comments related to a large infrastructure project and employs two natural language processing technologies, sentiment analysis and topic modeling, together with spatio-temporal analysis, to transform these comments into indicators for assessing online public opinion on the project. Based on the framework, we investigate the online public opinion of the Three Gorges Project on China's largest microblogging site, Weibo. Assessment results present spatial-temporal distributions of post intensity and sentiment polarity, reveal major topics with different sentiments, and summarize managerial implications for ex post evaluation of the world's largest hydropower project. The proposed assessment framework is expected to be widely applied as a methodological strategy to assess public opinion in the ex post evaluation stage of large infrastructure projects. - Highlights: • We developed a framework to assess online public opinion on large infrastructure projects with environmental impacts. • Indicators were built to assess post intensity, sentiment polarity and major topics of the public opinion. • We took the Three Gorges Project (TGP) as an example to demonstrate the effectiveness of the proposed framework. 
• We revealed spatial-temporal patterns of post intensity and sentiment polarity on the TGP. • We drew implications for a more in-depth understanding of the public opinion on large infrastructure projects.
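The indicator-building step can be sketched with a minimal lexicon-based sentiment score aggregated by month (pure Python; the word lists, comments, and dates are invented, and the actual framework uses trained sentiment and topic models rather than a fixed lexicon):

```python
from collections import defaultdict

POS = {"great", "benefit", "support", "good"}   # invented positive lexicon
NEG = {"bad", "risk", "oppose", "flood"}        # invented negative lexicon

# (month, comment) pairs standing in for crawled microblog posts.
comments = [
    ("2015-06", "great project, clear benefit"),
    ("2015-06", "flood risk is bad"),
    ("2015-07", "i support it, good for power"),
]

def polarity(text):
    """Positive minus negative lexicon hits; a stand-in for a trained model."""
    words = [w.strip(",.!?") for w in text.lower().split()]
    return sum(w in POS for w in words) - sum(w in NEG for w in words)

# Aggregate into the two monthly indicators: post intensity and mean polarity.
scores = defaultdict(list)
for month, text in comments:
    scores[month].append(polarity(text))

indicators = {m: {"intensity": len(v), "polarity": sum(v) / len(v)}
              for m, v in scores.items()}
print(indicators)
```

Replacing the month key with a (month, region) pair would give the spatial-temporal distributions the abstract describes.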

  13. Galactic cold cores. IV. Cold submillimetre sources: catalogue and statistical analysis

    NASA Astrophysics Data System (ADS)

    Montillaud, J.; Juvela, M.; Rivera-Ingraham, A.; Malinen, J.; Pelkonen, V.-M.; Ristorcelli, I.; Montier, L.; Marshall, D. J.; Marton, G.; Pagani, L.; Toth, L. V.; Zahorecz, S.; Ysard, N.; McGehee, P.; Paladini, R.; Falgarone, E.; Bernard, J.-P.; Motte, F.; Zavagno, A.; Doi, Y.

    2015-12-01

    Context. For the project Galactic cold cores, Herschel photometric observations were carried out as a follow-up of cold regions of interstellar clouds previously identified with the Planck satellite. The aim of the project is to derive the physical properties of the population of cold sources and to study its connection to ongoing and future star formation. Aims: We build a catalogue of cold sources within the clouds in 116 fields observed with the Herschel PACS and SPIRE instruments. We wish to determine the general physical characteristics of the cold sources and to examine the correlations with their host cloud properties. Methods: From Herschel data, we computed colour temperature and column density maps of the fields. We estimated the distance to the target clouds and provide both uncertainties and reliability flags for the distances. The getsources multiwavelength source extraction algorithm was employed to build a catalogue of several thousand cold sources. Mid-infrared data were used, along with colour and position criteria, to separate starless and protostellar sources. We also propose another classification method based on submillimetre temperature profiles. We analysed the statistical distributions of the physical properties of the source samples. Results: We provide a catalogue of ~4000 cold sources within or near star forming clouds, most of which are located either in nearby molecular complexes (≲1 kpc) or in star forming regions of the nearby galactic arms (~2 kpc). About 70% of the sources have a size compatible with an individual core, and 35% of those sources are likely to be gravitationally bound. Significant statistical differences in physical properties are found between starless and protostellar sources, in column density versus dust temperature, mass versus size, and mass versus dust temperature diagrams. The core mass functions are very similar to those previously reported for other regions. 
On statistical grounds we find that gravitationally bound sources have higher background column densities (median Nbg(H2) ~ 5 × 10^21 cm^-2) than unbound sources (median Nbg(H2) ~ 3 × 10^21 cm^-2). These values of Nbg(H2) are higher for higher dust temperatures of the external layers of the parent cloud. However, only in a few cases do we find clear Nbg(H2) thresholds for the presence of cores. The dust temperatures of cloud external layers show clear variations with galactic location, as may the source temperatures. Conclusions: Our data support a more complex view of star formation than the simple idea of a column density threshold suggests. They show a clear influence of the surrounding UV-visible radiation on how cores are distributed in their host clouds, with possible variations on the Galactic scale. Planck (http://www.esa.int/Planck) is a project of the European Space Agency - ESA - with instruments provided by two scientific consortia funded by ESA member states (in particular the lead countries: France and Italy) with contributions from NASA (USA), and telescope reflectors provided in a collaboration between ESA and a scientific consortium led and funded by Denmark. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA. Full Table B.1 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/584/A92

  14. The Dynamics of Project-Based Learning Extension Courses: The "Laboratory of Social Projects" Case Study

    ERIC Educational Resources Information Center

    Arantes do Amaral, Joao Alberto

    2017-01-01

    In this case study we discuss the dynamics that drive a free-of-charge project-based learning extension course. We discuss the lessons learned in the course, "Laboratory of Social Projects." The course aimed to teach project management skills to the participants. It was conducted from August to November of 2015, at Federal University of…

  15. International energy outlook 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1996-05-01

    This International Energy Outlook presents historical data from 1970 to 1993 and EIA's projections of energy consumption and carbon emissions through 2015 for 6 country groups. Prospects for individual fuels are discussed. Summary tables of the IEO96 world energy consumption, oil production, and carbon emissions projections are provided in Appendix A. The reference case projections of total foreign energy consumption and of natural gas, coal, and renewable energy were prepared using EIA's World Energy Projection System (WEPS) model. Reference case projections of foreign oil production and consumption were prepared using the International Energy Module of the National Energy Modeling System (NEMS). Nuclear consumption projections were derived from the International Nuclear Model, PC Version (PC-INM). Alternatively, nuclear capacity projections were developed using two methods: the lower reference case projections were based on analysts' knowledge of the nuclear programs in different countries; the upper reference case was generated by the World Integrated Nuclear Evaluation System (WINES)--a demand-driven model. In addition, the NEMS Coal Export Submodule (CES) was used to derive flows in international coal trade. As noted above, foreign projections of electricity demand are now projected as part of the WEPS. 64 figs., 62 tabs.

  16. A Case Study of Teaching Marketing Research Using Client-Sponsored Projects: Method, Challenges, and Benefits

    ERIC Educational Resources Information Center

    Bove, Liliana L.; Davies, W. Martin

    2009-01-01

    This case study outlines the use of client-sponsored research projects in a quantitative postgraduate marketing research subject conducted in a 12-week semester in a research-intensive Australian university. The case study attempts to address the dearth of recent literature on client-sponsored research projects in the discipline of marketing.…

  17. Airpower Projection in the Anti-Access/Area Denial Environment: Dispersed Operations

    DTIC Science & Technology

    2015-02-01

    Raptor Case Study ... Risks to Dispersed Operations...project airpower, this paper breaks down a case study of the Rapid Raptor concept. The risks with executing a dispersed model are analyzed and mitigation...will force leaders to look at alternative ways to project power. Alternative Option: Rapid Raptor Case Study The ability to defend forward operating

  18. Catchment Models and Management Tools for diffuse Contaminants (Sediment, Phosphorus and Pesticides): DIFFUSE Project

    NASA Astrophysics Data System (ADS)

    Mockler, Eva; Reaney, Simeon; Mellander, Per-Erik; Wade, Andrew; Collins, Adrian; Arheimer, Berit; Bruen, Michael

    2017-04-01

    The agricultural sector is the most common suspected source of nutrient pollution in Irish rivers. However, it is also often the most difficult source to characterise due to its predominantly diffuse nature. Particulate phosphorus in surface water and dissolved phosphorus in groundwater are of particular concern in Irish water bodies. Hence the further development of models and indices to assess diffuse sources of contaminants is required for use by the Irish Environmental Protection Agency (EPA) to provide support for river basin planning. Understanding connectivity in the landscape is a vital component of characterising the source-pathway-receptor relationships for water-borne contaminants, and hence is a priority in this research. The DIFFUSE Project will focus on connectivity modelling and incorporation of connectivity into sediment, nutrient and pesticide risk mapping. The Irish approach to understanding and managing natural water bodies has developed substantially in recent years, assisted by outputs from multiple research projects, including modelling and analysis tools developed during the Pathways and CatchmentTools projects. These include the Pollution Impact Potential (PIP) maps, which are an example of research output that is used by the EPA to support catchment management. The PIP maps integrate an understanding of the pollution pressures and mobilisation pathways and, using the source-pathways-receptor model, provide a scientific basis for evaluation of mitigation measures. These maps indicate the potential risk posed by nitrate and phosphate from diffuse agricultural sources to surface and groundwater receptors and delineate critical source areas (CSAs) as a means of facilitating the targeting of mitigation measures. Building on this previous research, the DIFFUSE Project will develop revised and new catchment management tools focused on connectivity, sediment, phosphorus and pesticides. 
The DIFFUSE project will strive to identify the state-of-the-art methods and models that are most applicable to Irish conditions and management challenges. All styles of modelling considered useful for water resources management are relevant to this project and a balance of technical sophistication, data availability and operational practicalities is the ultimate goal. Achievement of this objective will be measured by comparing the performance of the new models developed in the project with models used in other countries. The models and tools developed in the course of the project will be evaluated by comparison with Irish catchment data and with other state-of-the-art models in a model-inter-comparison workshop which will be open to other models and the wider research community.

  19. 40 CFR 63.55 - Maximum achievable control technology (MACT) determinations for affected sources subject to case...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (MACT) determinations for affected sources subject to case-by-case determination of equivalent emission... sources subject to case-by-case determination of equivalent emission limitations. (a) Requirements for... hazardous air pollutant emissions limitations equivalent to the limitations that would apply if an emission...

  20. Breaking paradigms in severe epistaxis: the importance of looking for the S-point.

    PubMed

    Kosugi, Eduardo Macoto; Balsalobre, Leonardo; Mangussi-Gomes, João; Tepedino, Miguel Soares; San-da-Silva, Daniel Marcus; Cabernite, Erika Mucciolo; Hermann, Diego; Stamm, Aldo Cassol

    Since the introduction of nasal endoscopy into the field of Otorhinolaryngology, the treatment paradigm for cases of severe epistaxis has shifted toward early and precise identification of the bleeding site. Although severe epistaxis is usually considered to arise from posterior bleeding, an arterial vascular pedicle in the superior portion of the nasal septum, around the axilla projection of the middle turbinate and posterior to the septal body, has frequently been observed. That vascular pedicle was named the Stamm's S-point. The aim of this study was to describe the S-point and report cases of severe epistaxis originating from it. A retrospective case series study was conducted. Nine patients with spontaneous severe epistaxis, in whom the S-point was identified as the source of bleeding, were treated between March 2016 and March 2017. There was a male predominance (77.8%), with a mean age of 59.3 years. Most patients had comorbidities (88.9%), and most were not taking acetylsalicylic acid (66.7%). Left-sided involvement predominated (55.6%), and anteroposterior bleeding was the principal initial presentation (77.8%). Six patients (66.7%) presented with hemoglobin levels below 10 g/dL, and four (44.4%) required blood transfusion. Cauterization of the S-point was performed in all patients, with complete resolution of bleeding. No patient experienced recurrence of severe epistaxis. The Stamm's S-point, a novel source of spontaneous severe epistaxis, is reported, and its cauterization was effective and safe. Otolaryngologists must actively seek this site of bleeding in cases of severe epistaxis. Copyright © 2018 Associação Brasileira de Otorrinolaringologia e Cirurgia Cérvico-Facial. Published by Elsevier Editora Ltda. All rights reserved.

  1. EMIRA: Ecologic Malaria Reduction for Africa--innovative tools for integrated malaria control.

    PubMed

    Dambach, Peter; Traoré, Issouf; Becker, Norbert; Kaiser, Achim; Sié, Ali; Sauerborn, Rainer

    2014-01-01

    Malaria control is based on early treatment of cases and on vector control. The current measures for malaria vector control in Africa are based mainly on long-lasting insecticide treated nets (LLINs) and, to a much smaller extent, on indoor residual spraying (IRS). A third pillar in the fight against the malaria vector, larval source management (LSM), has virtually not been used in Africa since the ban of DDT in the 1960s. In light of recent WHO recommendations for Bacillus thuringiensis israelensis (Bti) use against malaria and other vector species, larval source management could see a revival in the coming years. In this project we analyze the ecologic and health impacts as well as the cost effectiveness of larval source management under different larviciding scenarios in a health district in Burkina Faso. The project is designed as a prospective intervention study with a duration of three years (2013-2015). Its spatial scale includes three arms of intervention and control, comprising a total of 127 villages and the district capital Nouna in the extended HDSS (Health Demographic Surveillance System) of the Kossi province. Baseline data on mosquito abundance, parasitemia in U5 children, and malaria related morbidity and mortality are gathered over the project duration. Besides the outcomes on ecologic and health parameters, the economic costs are recorded and weighed against the achieved health benefits. Risk-map-guided larvicide application might further decrease the economic cost of LSM and facilitate its faster incorporation into integrated malaria control programs. Given the limited resources in many malaria endemic countries, it is of utmost importance to relate the costs of novel strategies for malaria prevention to their effect on the burden of the disease. Costs and impacts on the health situation will be made comparable to other, existing intervention strategies, supporting decision making by stakeholders and policymakers.

  2. Digital-simulation and projection of water-level declines in basalt aquifers of the Odessa-Lind area, east-central Washington

    USGS Publications Warehouse

    Luzier, J.E.; Skrivan, James A.

    1975-01-01

    A digital computer program using finite-difference techniques simulates an intensively pumped, multilayered basalt-aquifer system near Odessa. The aquifers now developed are in the upper 1,000 feet of a regionally extensive series of southwesterly dipping basalt flows of the Columbia River Group. Most of the aquifers are confined. Those in the depth range of about 500 to 1,000 feet are the chief source of ground water pumped from irrigation wells. Transmissivity of these aquifers ranges from less than 2,700 feet squared per day to more than 40,000 feet squared per day, and storage coefficients range from 0.0015 to 0.006. Shallower aquifers are generally much less permeable, but they are a source of recharge to deeper aquifers with lower artesian heads; vertical leakage occurs along joints in the basalt and down uncased wells, which short circuit the aquifer system. For model analysis, the deeper, pumped aquifers were grouped and treated as a single layer with drawdown-dependent leakage from an overlying confining layer. Verification of the model was achieved primarily by closely matching observed pumpage-related head declines ranging from about 10 feet to more than 40 feet over the 4-year period from March 1967 to March 1971. Projected average annual rates of decline in the Odessa-Lind area during the 14-year period from March 1967 to March 1981 are: from 1 to 9 feet per year if pumpage is maintained at the 1970 rate of 117,000 acre-feet per year; or, from 3 to 33 feet per year if 1970 pumpage is increased to 233,000 acre-feet per year, which includes 116,000 acre-feet per year covered by water-right applications held in abeyance. In each case, projected drawdown on the northeast side of a major ground-water barrier is about double that on the southwest side because of differences in transmissivity and storage coefficient and in sources of recharge.
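The flavour of such a finite-difference simulation can be sketched in one dimension (all parameter values below are invented and only loosely inspired by the ranges in the abstract; the real model is two-dimensional with calibrated, spatially varying properties): a confined aquifer with fixed-head boundaries, drawdown-dependent leakage from an overlying layer, and pumping at one cell.

```python
nx, dx, dt = 51, 5280.0, 1.0      # cells, cell size (ft), time step (days)
T, S = 20000.0, 0.003             # transmissivity (ft^2/day), storage coefficient
leak = 1e-4                       # leakage coefficient K'/b' (1/day)
h_top = 100.0                     # head in the overlying leaky layer (ft)
Q = 0.01                          # pumping per unit area at the well cell (ft/day)
well = nx // 2

h = [100.0] * nx                  # initial head everywhere (ft)

for _ in range(1000):             # explicit time stepping, one day per step
    new = h[:]
    for i in range(1, nx - 1):    # end cells are fixed-head boundaries
        curvature = (h[i - 1] - 2 * h[i] + h[i + 1]) / dx ** 2
        leakage = leak * (h_top - h[i])      # grows as drawdown grows
        pumping = Q if i == well else 0.0
        new[i] = h[i] + dt * (T * curvature + leakage - pumping) / S
    h = new

drawdown = 100.0 - h[well]
print(round(drawdown, 2))
```

The time step respects the explicit stability limit (T·dt/(S·dx²) ≈ 0.24 < 0.5), and the leakage term is what lets the cone of depression approach a steady state instead of deepening indefinitely.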

  3. Investigating the unification of LOFAR-detected powerful AGN in the Boötes field

    NASA Astrophysics Data System (ADS)

    Morabito, Leah K.; Williams, W. L.; Duncan, Kenneth J.; Röttgering, H. J. A.; Miley, George; Saxena, Aayush; Barthel, Peter; Best, P. N.; Bruggen, M.; Brunetti, G.; Chyży, K. T.; Engels, D.; Hardcastle, M. J.; Harwood, J. J.; Jarvis, Matt J.; Mahony, E. K.; Prandoni, I.; Shimwell, T. W.; Shulevski, A.; Tasse, C.

    2017-08-01

    Low radio frequency surveys are important for testing unified models of radio-loud quasars and radio galaxies. Intrinsically similar sources that are randomly oriented on the sky will have different projected linear sizes. Measuring the projected linear sizes of these sources provides an indication of their orientation. Steep-spectrum isotropic radio emission allows for orientation-free sample selection at low radio frequencies. We use a new radio survey of the Boötes field at 150 MHz made with the Low-Frequency Array (LOFAR) to select a sample of radio sources. We identify 60 radio sources with powers P > 10^25.5 W Hz^-1 at 150 MHz using cross-matched multiwavelength information from the AGN and Galaxy Evolution Survey, which provides spectroscopic redshifts and photometric identification of 16 quasars and 44 radio galaxies. When considering the radio spectral slope only, we find that radio sources with steep spectra have projected linear sizes that are on average 4.4 ± 1.4 times larger than those with flat spectra. The projected linear sizes of radio galaxies are on average 3.1 ± 1.0 times larger than those of quasars (2.0 ± 0.3 after correcting for redshift evolution). Combining these results with three previous surveys, we find that the projected linear sizes of radio galaxies and quasars depend on redshift but not on power. The projected linear size ratio does not correlate with either parameter. The LOFAR data are consistent within the uncertainties with theoretical predictions of the correlation between the quasar fraction and linear size ratio, based on an orientation-based unification scheme.
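The orientation argument can be checked with a small Monte Carlo sketch (Python; the 45° dividing angle is the standard unified-scheme assumption, not a value measured in this survey): if quasars are sources viewed within 45° of the jet axis and radio galaxies are the rest, the expected ratio of mean projected sizes comes out close to the ~2 quoted above.

```python
import math
import random

rng = random.Random(42)

def mean_projected(cos_lo, cos_hi, trials=200_000):
    """Mean of sin(theta) for orientations uniform in cos(theta) on [cos_lo, cos_hi].

    A jet of true length L at angle theta to the line of sight projects
    to L*sin(theta) on the sky; uniform cos(theta) means random orientation.
    """
    total = 0.0
    for _ in range(trials):
        c = rng.uniform(cos_lo, cos_hi)
        total += math.sqrt(1.0 - c * c)
    return total / trials

cos45 = math.cos(math.radians(45.0))
quasars = mean_projected(cos45, 1.0)    # jet within 45 deg of the line of sight
galaxies = mean_projected(0.0, cos45)   # jet closer to the plane of the sky
print(round(galaxies / quasars, 2))     # analytic value is about 1.9
```

The analytic ratio for a 45° dividing angle is ≈1.87, consistent with the redshift-corrected measurement of 2.0 ± 0.3.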

  4. Beamspace dual signal space projection (bDSSP): a method for selective detection of deep sources in MEG measurements.

    PubMed

    Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K; Cai, Chang; Nagarajan, Srikantan S

    2018-06-01

    Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.
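The projection step at the heart of such methods can be sketched with plain vectors (pure Python; the sinusoidal signals are invented, and the actual bDSSP algorithm estimates the superficial-source subspace from the data rather than assuming it is known): the sensor data are projected onto the direction orthogonal to the interference time course, removing the superficial signal while leaving a temporally dissimilar deep signal intact.

```python
import math

n = 200
t = [i / n for i in range(n)]
deep = [math.sin(2 * math.pi * 7 * x) for x in t]                 # deep source
interference = [5.0 * math.sin(2 * math.pi * 23 * x) for x in t]  # superficial, 5x stronger

sensor = [d + s for d, s in zip(deep, interference)]              # measured channel

def project_out(x, b):
    """Remove the component of x along basis vector b (signal space projection)."""
    coef = sum(u * v for u, v in zip(x, b)) / sum(v * v for v in b)
    return [u - coef * v for u, v in zip(x, b)]

cleaned = project_out(sensor, interference)

residual = max(abs(c - d) for c, d in zip(cleaned, deep))
print(residual)  # tiny: the deep signal survives the projection
```

Because the two invented time courses are orthogonal, the projection removes the interference exactly here; in practice the deep signal loses whatever component it shares with the estimated interference subspace, which is why accurate estimation of that subspace matters.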

  5. Beamspace dual signal space projection (bDSSP): a method for selective detection of deep sources in MEG measurements

    NASA Astrophysics Data System (ADS)

    Sekihara, Kensuke; Adachi, Yoshiaki; Kubota, Hiroshi K.; Cai, Chang; Nagarajan, Srikantan S.

    2018-06-01

    Objective. Magnetoencephalography (MEG) has a well-recognized weakness at detecting deeper brain activities. This paper proposes a novel algorithm for selective detection of deep sources by suppressing interference signals from superficial sources in MEG measurements. Approach. The proposed algorithm combines the beamspace preprocessing method with the dual signal space projection (DSSP) interference suppression method. A prerequisite of the proposed algorithm is prior knowledge of the location of the deep sources. The proposed algorithm first derives the basis vectors that span a local region just covering the locations of the deep sources. It then estimates the time-domain signal subspace of the superficial sources by using the projector composed of these basis vectors. Signals from the deep sources are extracted by projecting the row space of the data matrix onto the direction orthogonal to the signal subspace of the superficial sources. Main results. Compared with the previously proposed beamspace signal space separation (SSS) method, the proposed algorithm is capable of suppressing much stronger interference from superficial sources. This capability is demonstrated in our computer simulation as well as experiments using phantom data. Significance. The proposed bDSSP algorithm can be a powerful tool in studies of physiological functions of midbrain and deep brain structures.

  6. The Use of Open Source Software in the Global Land Ice Measurements From Space (GLIMS) Project, and the Relevance to Institutional Cooperation

    Treesearch

    Christopher W. Helm

    2006-01-01

    GLIMS is a NASA funded project that utilizes Open-Source Software to achieve its goal of creating a globally complete inventory of glaciers. The participation of many international institutions and the development of on-line mapping applications to provide access to glacial data have both been enhanced by Open-Source GIS capabilities and play a crucial role in the...

  7. 23 CFR 661.43 - Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Can other sources of funds be used to finance a queued project in advance of receipt of IRRBP funds? 661.43 Section 661.43 Highways FEDERAL HIGHWAY ADMINISTRATION, DEPARTMENT OF TRANSPORTATION ENGINEERING AND TRAFFIC OPERATIONS INDIAN RESERVATION ROAD BRIDGE PROGRAM § 661.43 Can other sources of funds be...

  8. An Investigation of an Open-Source Software Development Environment in a Software Engineering Graduate Course

    ERIC Educational Resources Information Center

    Ge, Xun; Huang, Kun; Dong, Yifei

    2010-01-01

    A semester-long ethnography study was carried out to investigate project-based learning in a graduate software engineering course through the implementation of an Open-Source Software Development (OSSD) learning environment, which featured authentic projects, learning community, cognitive apprenticeship, and technology affordances. The study…

  9. Navigating the Net for Grant Money.

    ERIC Educational Resources Information Center

    Schnitzer, Denise K.

    1996-01-01

    The Internet offers educators a wealth of grant resources and information on securing funds for projects. The first step is finding a funding source whose goals match those of the desired project. Certain Net search engines have excellent capabilities. Grantsweb has accessible, organized links to federal and nonfederal grant sources. Other…

  10. Occurrence and ecological risk assessment of organic micropollutants in the lower reaches of the Yangtze River, China: A case study of water diversion.

    PubMed

    Yan, Zhenhua; Yang, Haohan; Dong, Huike; Ma, Binni; Sun, Hongwei; Pan, Ting; Jiang, Runren; Zhou, Ranran; Shen, Jie; Liu, Jianchao; Lu, Guanghua

    2018-08-01

    Water diversion has been increasingly applied to improve water quality in many water bodies. However, little is known regarding pollution by organic micropollutants (OMPs) in water diversion projects, especially at the supplier, and this pollution may threaten the quality of transferred water. In the present study, a total of 110 OMPs belonging to seven classes were investigated in water and sediment collected from a supplier of the Yangtze River within four water diversion projects. A total of 69 and 58 target OMPs were detected in water and sediment, respectively, at total concentrations reaching 1041.78 ng/L and 5942.24 ng/g dry weight (dw). Polycyclic aromatic hydrocarbons (PAHs) and pharmaceuticals were the predominant pollutants identified. When preliminarily compared with the pollution in the receiving waters, the Yangtze River generally exhibited mild OMP pollution and good water quality parameters, implying a clean water source for the water diversion projects. However, in Zongyang and Fenghuangjing, PAH pollution was more abundant than that in the corresponding receiving water in Chaohu Lake. Ammonia nitrogen pollution in the Wangyu River was comparable to that in Taihu Lake. These findings imply that water diversion may threaten receiving waters in some cases. In addition, the risks of all detected pollutants in both water and sediment were assessed. PAHs in water, especially phenanthrene and high-molecular-weight PAHs, posed high risks to invertebrates, followed by risks to fish and algae. Pharmaceuticals, such as antibiotics and antidepressants, may also pose risks to algae and fish at a number of locations. To the best of our knowledge, this report is the first to describe OMP pollution in water diversion projects, and the results provide a new perspective regarding the security of water diversion projects. Copyright © 2018 Elsevier Ltd. All rights reserved.
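    The abstract does not state which risk-assessment method was used, but ecological risk screening of micropollutants is commonly done with risk quotients, RQ = MEC/PNEC (measured environmental concentration divided by predicted no-effect concentration), with RQ ≥ 1 flagged as high risk. The sketch below illustrates that conventional screening approach; phenanthrene is named in the abstract, but the second compound and all concentration values are hypothetical, not the paper's data.

```python
# Risk-quotient screening sketch: RQ = MEC / PNEC, RQ >= 1 means high risk.
# All concentrations below are invented for illustration (ng/L);
# they are NOT the values measured in the study.
measured = {"phenanthrene": 120.0, "carbamazepine": 15.0}  # hypothetical MECs
pnec = {"phenanthrene": 100.0, "carbamazepine": 500.0}     # hypothetical PNECs

risk = {c: measured[c] / pnec[c] for c in measured}
high_risk = [c for c, rq in risk.items() if rq >= 1.0]
```

With these invented numbers, only phenanthrene exceeds its PNEC and would be flagged for the kind of high-risk finding the abstract reports.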

  11. Easy and effective--web-based information systems designed and maintained by physicians: experience with two gynecological projects.

    PubMed

    Kupka, M S; Dorn, C; Richter, O; van der Ven, H; Baur, M

    2003-08-01

    It is well established that medical information sources are shifting continuously from printed media to digital online sources. To demonstrate the effectiveness and feasibility of web-based information sources designed and maintained in a decentralized fashion by health professionals, two projects are described: the information platform of the German Working Group for Information Technologies in Gynecology and Obstetrics (AIG) and the information source for the German Registry for in vitro fertilization (DIR). Both were implemented using ordinary software and standard computer equipment. Only minimal resources and training were needed to operate safe and reliable web-based information sources that were effective in terms of both cost and time.

  12. Complex source mechanisms of mining-induced seismic events - implications for surface effects

    NASA Astrophysics Data System (ADS)

    Orlecka-Sikora, B.; Cesca, S.; Lasocki, S.; Rudzinski, L.; Lizurek, L.; Wiejacz, P.; Urban, P.; Kozlowska, M.

    2012-04-01

    The seismicity of the Legnica-Głogów Copper District (LGCD) is induced by mining activities in three mines: Lubin, Rudna and Polkowice-Sieroszowice. Ground motion caused by strong tremors might affect local infrastructure. The "Żelazny Most" tailings pond, the biggest structure of this type in Europe, is of special concern here. To protect surface objects, Rudna Mine has been running ground motion monitoring for several years. From June 2010 to June 2011, unusually strong and extensive surface impact was observed for 6 mining tremors induced in one of Rudna's mining sections. The observed peak ground acceleration (PGA) for both the horizontal and vertical components was at or even beyond the 99% confidence interval for prediction. The aim of this paper is to analyze the reasons for such unusual ground motion. On the basis of recordings from the Rudna Mine seismological network and from the Polish Seismological Network operated by the Institute of Geophysics Polish Academy of Sciences (IGF PAN), the source mechanisms of these 6 tremors were calculated using a time-domain moment tensor inversion. Furthermore, a kinematic analysis of the seismic source was performed in order to determine the rupture plane orientations and rupture directions. These results showed that, in the case of the investigated tremors, the point source models and shear fault mechanisms most often assumed in mining seismology are invalid. All analyzed events indicate extended sources with non-shear mechanisms. The rupture planes have small dip angles, and the rupture starts at the tremor hypocenter and propagates in the direction opposite to the plane dip. The tensional component also plays a significant role. These source mechanisms explain the unusually strong observed ground motion well, and the calculated synthetic PGA values correlate well with the observed ones. The relationships between the mining tremors were also investigated.
All subsequent tremors occurred in the area of increased stress due to stress transfer caused by previous tremors. This indicates that preceding tremors contributed to the occurrence of later ones in the area. This work was prepared partially within the framework of the research projects No. N N307234937 and 3935/B/T02/2010/39 financed by the Ministry of Education and Science of Poland during the period 2009 to 2011 and 2010 to 2012, respectively, and the project MINE, financed by the German Ministry of Education and Research (BMBF), R&D Programme Geotechnologien, Grant of project BMBF03G0737.

  13. Probing interferometric parallax with interplanetary spacecraft

    NASA Astrophysics Data System (ADS)

    Rodeghiero, G.; Gini, F.; Marchili, N.; Jain, P.; Ralston, J. P.; Dallacasa, D.; Naletto, G.; Possenti, A.; Barbieri, C.; Franceschini, A.; Zampieri, L.

    2017-07-01

    We describe an experimental scenario for testing a novel method to measure distance and proper motion of astronomical sources. The method is based on multi-epoch observations of amplitude or intensity correlations between separate receiving systems. This technique is called Interferometric Parallax, and efficiently exploits phase information that has traditionally been overlooked. The test case we discuss combines amplitude correlations of signals from deep space interplanetary spacecraft with those from distant galactic and extragalactic radio sources with the goal of estimating the interplanetary spacecraft distance. Interferometric parallax relies on the detection of wavefront curvature effects in signals collected by pairs of separate receiving systems. The method shows promising potentialities over current techniques when the target is unresolved from the background reference sources. Developments in this field might lead to the construction of an independent, geometrical cosmic distance ladder using a dedicated project and future generation instruments. We present a conceptual overview supported by numerical estimates of its performances applied to a spacecraft orbiting the Solar System. Simulations support the feasibility of measurements with a simple and time-saving observational scheme using current facilities.

  14. Integrated modelling of nitrate loads to coastal waters and land rent applied to catchment-scale water management.

    PubMed

    Refsgaard, A; Jacobsen, T; Jacobsen, B; Ørum, J-E

    2007-01-01

    The EU Water Framework Directive (WFD) requires an integrated approach to river basin management in order to meet environmental and ecological objectives. This paper presents the concepts and a full-scale application of an integrated modelling framework. The Ringkoebing Fjord basin is characterized by intensive agricultural production, and leakage of nitrate constitutes a major pollution problem with respect to groundwater aquifers (drinking water), fresh surface water systems (water quality of lakes) and coastal receiving waters (eutrophication). The case study presented illustrates an advanced modelling approach applied in river basin management. Point sources (e.g. sewage treatment plant discharges) and distributed diffuse sources (nitrate leakage) are included to provide a modelling tool capable of simulating pollution transport from source to recipient, in order to analyse the effects of specific, localized basin water management plans. The paper also includes a land rent modelling approach which can be used to choose the most cost-effective measures and their locations. As a forerunner to the use of basin-scale models in WFD basin water management plans, this project demonstrates the potential and limitations of comprehensive, integrated modelling tools.

  15. Permeable Surface Corrections for Ffowcs Williams and Hawkings Integrals

    NASA Technical Reports Server (NTRS)

    Lockard, David P.; Casper, Jay H.

    2005-01-01

    The acoustic prediction methodology discussed herein applies an acoustic analogy to calculate the sound generated by sources in an aerodynamic simulation. Sound is propagated from the computed flow field by integrating the Ffowcs Williams and Hawkings equation on a suitable control surface. Previous research suggests that, for some applications, the integration surface must be placed away from the solid surface to incorporate source contributions from within the flow volume. As such, the fluid mechanisms in the input flow field that contribute to the far-field noise are accounted for by their mathematical projection as a distribution of source terms on a permeable surface. The passage of nonacoustic disturbances through such an integration surface can result in significant error in an acoustic calculation. A correction for the error is derived in the frequency domain using a frozen gust assumption. The correction is found to work reasonably well in several test cases where the error is a small fraction of the actual radiated noise. However, satisfactory agreement has not been obtained between noise predictions using the solution from a three-dimensional, detached-eddy simulation of flow over a cylinder.

  16. The ImageJ ecosystem: an open platform for biomedical image analysis

    PubMed Central

    Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available – from commercial to academic, special-purpose to Swiss army knife, small to large–but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts life science, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368

  17. CERTS Microgrid Laboratory Test Bed - PIER Final Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eto, Joseph H.; Eto, Joseph H.; Lasseter, Robert

    2008-07-25

    The objective of the CERTS Microgrid Laboratory Test Bed project was to enhance the ease of integrating small energy sources into a microgrid. The project accomplished this objective by developing and demonstrating three advanced techniques, collectively referred to as the CERTS Microgrid concept, that significantly reduce the level of custom field engineering needed to operate microgrids consisting of small generating sources. The techniques comprising the CERTS Microgrid concept are: 1) a method for effecting automatic and seamless transitions between grid-connected and islanded modes of operation; 2) an approach to electrical protection within the microgrid that does not depend on high fault currents; and 3) a method for microgrid control that achieves voltage and frequency stability under islanded conditions without requiring high-speed communications. The techniques were demonstrated at a full-scale test bed built near Columbus, Ohio and operated by American Electric Power. The testing fully confirmed earlier research that had been conducted initially through analytical simulations, then through laboratory emulations, and finally through factory acceptance testing of individual microgrid components. The islanding and resynchronization method met all Institute of Electrical and Electronics Engineers 1547 and power quality requirements. The electrical protection system was able to distinguish between normal and faulted operation. The controls were found to be robust under all conditions, including difficult motor starts. The results from these tests are expected to lead to additional testing of enhancements to the basic techniques at the test bed to improve the business case for microgrid technologies, as well as to field demonstrations involving microgrids that incorporate one or more of the CERTS Microgrid concepts.

  18. GABBs: Cyberinfrastructure for Self-Service Geospatial Data Exploration, Computation, and Sharing

    NASA Astrophysics Data System (ADS)

    Song, C. X.; Zhao, L.; Biehl, L. L.; Merwade, V.; Villoria, N.

    2016-12-01

    Geospatial data are present everywhere today with the proliferation of location-aware computing devices. This is especially true in the scientific community, where large amounts of data are driving research and education activities in many domains. Collaboration over geospatial data, for example in modeling, data analysis and visualization, must still overcome the barriers of specialized software and expertise, among other challenges. In addressing these needs, the Geospatial data Analysis Building Blocks (GABBs) project aims at building geospatial modeling, data analysis and visualization capabilities in an open source web platform, HUBzero. Funded by NSF's Data Infrastructure Building Blocks initiative, GABBs is creating a geospatial data architecture that integrates spatial data management, mapping and visualization, and interfaces in the HUBzero platform for scientific collaborations. The geo-rendering enabled Rappture toolkit, a generic Python mapping library, geospatial data exploration and publication tools, and an integrated online geospatial data management solution are among the software building blocks from the project. The GABBs software will be available through Amazon AWS Marketplace VM images and as open source; hosting services are also available to the user community. The outcome of the project will enable researchers and educators to self-manage their scientific data, rapidly create GIS-enabled tools, share geospatial data and tools on the web, and build dynamic workflows connecting data and tools, all without requiring significant software development skills, GIS expertise or IT administrative privileges. This presentation will describe the GABBs architecture, toolkits and libraries, and showcase scientific use cases that utilize GABBs capabilities, as well as the challenges and solutions for GABBs to interoperate with other cyberinfrastructure platforms.

  19. The ImageJ ecosystem: An open platform for biomedical image analysis.

    PubMed

    Schindelin, Johannes; Rueden, Curtis T; Hiner, Mark C; Eliceiri, Kevin W

    2015-01-01

    Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available-from commercial to academic, special-purpose to Swiss army knife, small to large-but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. © 2015 Wiley Periodicals, Inc.

  20. Study of the Acoustic Effects of Hydrokinetic Tidal Turbines in Admiralty Inlet, Puget Sound

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brian Polagye; Jim Thomson; Chris Bassett

    2012-03-30

    Hydrokinetic turbines will be a source of noise in the marine environment - both during operation and during installation/removal. High intensity sound can cause injury or behavioral changes in marine mammals and may also affect fish and invertebrates. These noise effects are, however, highly dependent on the individual marine animals; the intensity, frequency, and duration of the sound; and the context in which the sound is received. In other words, production of sound is a necessary, but not sufficient, condition for an environmental impact. At a workshop on the environmental effects of tidal energy development, experts identified sound produced by turbines as an area of potentially significant impact, but also high uncertainty. The overall objectives of this project are to improve our understanding of the potential acoustic effects of tidal turbines by: (1) Characterizing sources of existing underwater noise; (2) Assessing the effectiveness of monitoring technologies to characterize underwater noise and marine mammal responsiveness to noise; (3) Evaluating the sound profile of an operating tidal turbine; and (4) Studying the effect of turbine sound on surrogate species in a laboratory environment. This study focuses on a specific case study for tidal energy development in Admiralty Inlet, Puget Sound, Washington (USA), but the methodologies and results are applicable to other turbine technologies and geographic locations. The project succeeded in achieving the above objectives and, in doing so, substantially contributed to the body of knowledge around the acoustic effects of tidal energy development in several ways: (1) Through collection of data from Admiralty Inlet, established the sources of sound generated by strong currents (mobilizations of sediment and gravel) and determined that low-frequency sound recorded during periods of strong currents is non-propagating pseudo-sound. This helped to advance the debate within the marine and hydrokinetics acoustic community as to whether strong currents produce propagating sound. (2) Analyzed data collected from a tidal turbine operating at the European Marine Energy Center to develop a profile of turbine sound and developed a framework to evaluate the acoustic effects of deploying similar devices in other locations. This framework has been applied to Public Utility District No. 1 of Snohomish County's demonstration project in Admiralty Inlet to inform post-installation acoustic and marine mammal monitoring plans. (3) Demonstrated passive acoustic techniques to characterize the ambient noise environment at tidal energy sites (fixed, long-term observations recommended) and to characterize the sound from anthropogenic sources (drifting, short-term observations recommended). (4) Demonstrated the utility and limitations of instrumentation, including bottom-mounted instrumentation packages, infrared cameras, and vessel monitoring systems. In doing so, also demonstrated how this type of comprehensive information is needed to interpret observations from each instrument (e.g., hydrophone data can be combined with vessel tracking data to evaluate the contribution of vessel sound to ambient noise). (5) Conducted a study that suggests harbor porpoise in Admiralty Inlet may be habituated to high levels of ambient noise due to omnipresent vessel traffic. The inability to detect behavioral changes associated with a high-intensity source of opportunity (a passenger ferry) has informed the approach for post-installation marine mammal monitoring. (6) Conducted laboratory exposure experiments on juvenile Chinook salmon and showed that exposure to a worse-than-worst-case acoustic dose of turbine sound does not result in changes to hearing thresholds or biologically significant tissue damage. Collectively, this means that Chinook salmon may be at a relatively low risk of injury from sound produced by tidal turbines located in or near their migration path. In achieving these accomplishments, the project has significantly advanced the District's goals of developing a demonstration-scale tidal energy project in Admiralty Inlet. Pilot demonstrations of this type are an essential step in the development of commercial-scale tidal energy in the United States. This is a renewable resource capable of producing electricity in a highly predictable manner.
